The Past, Present and Possible Future of Hyperlocal Media

Posted on: June 2, 2021

John Zaffuto provides a brief history of convergence and hyperlocality, and a possible glimpse into their future


“Hyperlocality” - as it is known within the select community of persons devoted to the study of media and its effects on human behavior - denotes a peculiar variety of indigenous content designed to appeal to a distinct, yet relatively compact, geographic area.  Those unaware of the term's academic significance may refer to hyperlocal outlets simply as “things that were once other things,” such as a website bearing the name of a long-lost (or at least greatly diminished) local newspaper.  It is a blunt, but fairly accurate, description of what may be the last great hope for independent journalism at the rural and municipal level.

Although the term “hyperlocal” is generally associated with media targeting suburbs and smaller DMAs (designated market areas), it is also quite useful when contemplating the evolution of legacy media within the American cultural experience.  Early newspapers were necessarily “hyperlocal,” as they were primarily concerned with the political and social interactions observed within their specific communities.  While stories of wider import were often included on their front pages, those items were vulnerable to both factual errors and the ideological tendencies of the publisher.  Thus, local coverage of matters such as legal proceedings and sporting events was of far more interest to the average reader because of its direct impact on daily life.

(FIGURE 1)
An early example of “hyperlocal” media
Source: www.flickr.com/photos/peagreenchick/384744367/

The catalyst for the current manifestation of hyperlocal media activity can arguably be traced to the Telecommunications Act of 1996.  Primarily known for enacting a framework for the eventual adoption of high-definition digital broadcast standards, the legislation also dramatically relaxed media ownership regulations, leading to a frenzied period of mergers, layoffs and buyouts within the communications industry.  In many cities, multiple radio and television broadcasts began to emanate from a single facility, managed and maintained by a single staff - including on-air talent.  In pursuit of an even more substantial profit margin, some owners even attempted to operate properties remotely from a central control room located a considerable distance from the DMA of origination.

These developments coincided with the general acceptance of the internet as a viable source of information and entertainment.  With elements that could be read, passively watched, or interacted with according to the whims of the consumer, “convergent” online multimedia content was seen by many, particularly those belonging to younger and more affluent demographic groups, as a desirable alternative to newspapers and spot-heavy affiliate newscasts.  Within a remarkably short period of time, all that was needed to create a legitimate source of local news, sports and entertainment was online access and relatively inexpensive web design software, thus initiating the contemporary era of convergent and hyperlocal media.  Today's “hyperlocal” content concedes the ability of larger, group-held outlets to quickly cover national and statewide events and instead attempts to provide detailed information catering to an extremely concentrated demographic group.  Typical hyperlocal offerings include coverage of town hall meetings, school events and unedited interviews with local politicians.


(FIGURES 2 and 3)
Landing pages of two hyperlocal sites serving communities in Southeastern Louisiana.

Although the eventual widespread adoption of convergent and hyperlocal strategies is considered somewhat inevitable within the communications industry, there still seems to be considerable indecision concerning the correct implementation of such structures.  Evidence suggests that convergent and hyperlocal efforts should both address a recognized informational void within a particular geographic area and incorporate a great deal of input from the residents of that area.  Furthermore, if an online hyperlocal presence is linked to a traditional media outlet (such as a magazine, newspaper or radio station), care should be taken to cultivate a distinct digital identity in which accessibility and interaction are priorities.  There should also be a realization among hyperlocal staff and management about the unusual relationship between such media and user expectations.  In other words, if an online resource brands itself around a zealous commitment to the specific concerns of a localized audience, it must be prepared to both receive and accommodate an equally enthusiastic volume of input and opinion from that same audience.

The hyperlocal/convergent entity is an unusual beast: it currently exists as a medium in which mobility and versatility are valued, and it cannot be positioned as simply an “internet newspaper with video” to be economically leveraged according to older, obsolete business models.  Consequently, as of this writing at least, the paths to convergent failure are a bit clearer than those leading to success and sustainability.  Nonetheless, there are still those who value convergence and hyperlocality primarily as cost-cutting maneuvers rather than as organic approaches to multimedia integration.

In October 2019, the Seattle-based digital platform company Maven gained control of Sports Illustrated magazine through an agreement with Meredith, then the owners of the iconic publication.  The move was accompanied by a promise to “refocus” the brand amid layoffs and a stated goal to cultivate an inexpensive network of freelance content providers in lieu of full-time writers and creative professionals.  As reported by Deadspin, employment under the new arrangement requires at least three video updates per day “as well as hundreds of posts per month.”  Additionally, “prospective Maven ‘partners’ were told by company execs that if they had trouble creating enough content, they should go to the nearest college and find eager young students who would write for free.”  At this early juncture, it is unclear how these changes will be received by the most important Sports Illustrated stakeholders - the readers and users of Sports Illustrated content.  Despite statements from Maven management pledging to continue a tradition of “unparalleled journalism and powerful storytelling,” some may suspect that such actions serve mainly to conceal the final economic exploitation of a well-known media entity.

One development that could significantly alter the future of convergent and hyperlocal media is the adoption of the ATSC (Advanced Television Systems Committee) 3.0 broadcast standard.  Approved by the FCC in November 2017, the protocol is intended as a significant upgrade from the current ATSC 1.0 standard which, when introduced in the 1990s, allowed for the introduction of digital and high-definition broadcast transmissions (a planned but quickly outdated 2.0 standard was abandoned, with some of its features integrated into the new ATSC 3.0 structure).  Despite being the first major overhaul of the American broadcasting standard in more than two decades, the upgrade has thus far received little attention aside from promises to deliver free-to-air 4K and HDR images and enhanced audio.  The fact that ATSC 3.0 will require consumers to purchase new televisions or conversion equipment is a significant concern, although provisions have been made requiring legacy ATSC 1.0 signals to be simulcast for five years following conversion.

Further scrutiny of the 3.0 parameters indicates that the standard is in fact an IP-enabled technology allowing for advanced levels of interactivity and convergence.  While ATSC 3.0-enabled televisions will be able to receive a “base layer” broadcast signal, homes equipped with a hybrid receiver or “gateway” will enjoy “enhanced layer” reception, which may include features such as on-demand programming and targeted, interactive advertising.  Since use of the hybrid receiver will also require some degree of online access (in some cases just a wi-fi router), the ATSC 3.0 standard is seen as an advance that could retrofit the typically staid and analog-minded network affiliate into an efficient and affordable alternative to its OTT rivals.
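
To make the hybrid “base layer / enhanced layer” arrangement a bit more concrete, the short Python sketch below models how such a gateway might behave.  Everything in it - the class names, fields and program titles - is a hypothetical illustration of the concept described above, not an actual ATSC 3.0 API.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Program:
    """A single item of content available to the viewer."""
    title: str
    source: str  # "base layer (OTA)" or "enhanced layer (broadband)"


@dataclass
class HybridGateway:
    """Toy model of an ATSC 3.0 'gateway' receiver (hypothetical).

    The base layer arrives over the air and is always available;
    enhanced-layer extras (on-demand titles, targeted ads) are modeled
    as reachable only when the receiver has broadband access.
    """
    broadband_connected: bool
    ota_schedule: List[str] = field(default_factory=list)
    on_demand_catalog: List[str] = field(default_factory=list)

    def available_programs(self) -> List[Program]:
        # The OTA base layer works with or without an internet connection.
        programs = [Program(t, "base layer (OTA)") for t in self.ota_schedule]
        # Enhanced-layer extras require the broadband return path.
        if self.broadband_connected:
            programs += [Program(t, "enhanced layer (broadband)")
                         for t in self.on_demand_catalog]
        return programs


# The same hypothetical receiver with and without a home internet connection.
for connected in (False, True):
    gateway = HybridGateway(
        broadband_connected=connected,
        ota_schedule=["6pm Local News", "Town Hall Live"],
        on_demand_catalog=["School Board Replay", "Targeted Weather Update"],
    )
    print(f"broadband={connected}:")
    for program in gateway.available_programs():
        print(f"  {program.title} <- {program.source}")
```

Run without broadband, the sketch lists only the over-the-air schedule; with broadband, the on-demand extras appear alongside it - which is essentially the value proposition of the hybrid gateway.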

Unlike the older standard, dual-layer ATSC 3.0 functions as a two-way system, with the return path supplied by the receiver's broadband connection.  For the consumer, this could mean access to additional targeted content from the same broadcaster, some of which could be aimed at specific areas within a single metropolitan DMA.  In turn, broadcasters could use IP-based delivery to tailor ads demographically as well as glean an impressive amount of information about the viewers themselves.  ATSC 3.0 will also allow broadcasters to extend their OTA presence to mobile devices, including automobiles, giving consumers access to various apps and services outside of conventional wireless data plans.  In addition to programming and analytic enhancements, ATSC 3.0 will implement the Advanced Warning and Response Network (AWARN), a revamp of the current Emergency Alert System capable of providing additional public safety information such as photos, surveillance video, storm tracks and evacuation routes.  Through the AWARN system, authorities could potentially power on televisions (and presumably other ATSC-enabled devices) remotely in order to deliver emergency information.
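
Because both programming and emergency alerts can in principle be addressed to areas smaller than a full DMA, a rough sketch may help illustrate the idea.  The Python example below filters richer, attachment-bearing alerts down to a viewer's sub-market zone; the zone names, fields and data are invented for illustration and are not drawn from the ATSC 3.0 or AWARN specifications.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Alert:
    """Toy emergency alert carrying rich-media attachments (AWARN-style)."""
    headline: str
    zones: List[str]        # hypothetical sub-DMA zone codes
    attachments: List[str]  # e.g. storm-track image, evacuation map


def alerts_for_zone(zone: str, alerts: List[Alert]) -> List[Alert]:
    """Return only the alerts addressed to a viewer's zone."""
    return [alert for alert in alerts if zone in alert.zones]


# One hypothetical DMA split into three invented zones.
active_alerts = [
    Alert("Flash flood warning", ["NORTHSHORE", "RIVER_PARISHES"],
          ["storm_track.png", "evacuation_routes.pdf"]),
    Alert("Boil water advisory", ["METRO_EAST"], ["affected_area_map.png"]),
]

for zone in ("NORTHSHORE", "METRO_EAST", "METRO_WEST"):
    matched = alerts_for_zone(zone, active_alerts)
    print(zone, "->", [a.headline for a in matched] or "no active alerts")
```

The point of the sketch is simply that an IP-addressable broadcast layer lets the same station deliver different, richer payloads to different neighborhoods within one market - something the one-way ATSC 1.0 signal cannot do.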

(FIGURE 4)
Map of current and future deployments of ATSC 3.0 within American DMAs.
Source: atsc.org

Current promotional material seems to characterize ATSC 3.0 as a significant advancement in broadcast image and audio quality that also has the potential to act as a sort of “free” OTA wi-fi service supplying only content provided by traditional broadcast stations.  While such a service could potentially prove a threat to internet-based hyperlocal outlets, it is still unclear when and how extensively ATSC 3.0 will affect the communications industry.  As of this writing, conversion to ATSC 3.0 remains a voluntary proposition for American broadcasters and requires significant investment in broadcast engineering infrastructure (although current plans allow stations in a single market to “share” ATSC 3.0 broadcast capabilities, possibly allowing for further facility consolidation).  Consequently, ATSC 3.0 signals are currently available only in test markets, and consumer-ready conversion devices remain relatively expensive and difficult to obtain.  These factors, coupled with the lack of a comprehensive promotional campaign introducing ATSC 3.0 to the general public, make it difficult to predict how the standard will be received if and when widespread adoption occurs.  The ability to use a “wi-fi-like” content service free of charge could be attractive to some consumers, although the prospect of purchasing additional devices could offset this enthusiasm.  The similarity of the ATSC 3.0 interface to existing OTA applications could further confuse some consumers, who may consider the standard redundant and not worth further investment.  It is also worth noting that these questions persist in an environment where the average consumer is more aware of the transformative powers of expanded 5G coverage than of any advances in the delivery of free OTA services, no matter how innovative or revolutionary.

It would seem, then, that convergence and hyperlocality exist as a “moving target” within the contemporary media environment.  However, while prospects of success (or at least sustainability) may differ according to location and demographics, a few consistencies are apparent when considering hyperlocal structure and intent.  As evidenced within existing examples, convergence and hyperlocality are workable strategies for success and relevance if implemented within a simple yet specific paradigm:

- identify gaps within the local informational spectrum, in terms of both available content and underserved target audiences.

- acknowledge that experimentation, along with extensive consumer interaction, may naturally dictate a “bottom-up” approach to content creation.  This approach requires content specialists to “trust” their audience's interests and concerns and to treat audience members as “partners” in the effort to establish an authentic local source of news and information.

- realize that utilizing convergent strategies as a cost-cutting measure – often in the service of a renewed “hyperlocal” focus – is a far riskier approach.  Aggressive “top-down” applications of convergent methodologies are generally implemented without much user feedback and can result in a product for which there was no existing demand.

Ironically enough, the very efficiencies that entice larger corporations to transform some properties into convergent “content generators” may in fact inspire additional grassroots competition from disgruntled former employees and underserved consumers.  Indeed, in such circumstances, the old adage “if you build it, they will come” would be effectively replaced by “if you break it, we will build our own.”

Special thanks to Dr. Amber Narro for assistance in compiling this material.