Evolution of Media Monitoring and Innovations in PR Measurement

When you sit down and log on to your media monitoring platform, you usually know what to expect: up-to-the-minute results based on your keywords—adjusted to reduce the likelihood of false positives—drawn from publications around the globe. Social media mentions might be included too, along with some automated analysis such as sentiment. Charts and graphs are generated automatically for some content, letting you see results at a glance.
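The keyword adjustment mentioned above can be sketched in a few lines. This is a toy illustration, not any vendor's actual matching logic; the brand name and exclusion phrases are hypothetical:

```python
import re

# Hypothetical monitoring rule: match a brand keyword but ignore
# known false-positive contexts (all terms here are illustrative).
KEYWORD = re.compile(r"\bacme\b", re.IGNORECASE)
EXCLUSIONS = re.compile(r"\b(acme street|acme county)\b", re.IGNORECASE)

def is_relevant(article_text: str) -> bool:
    """Flag an article only if the keyword appears outside excluded phrases."""
    if not KEYWORD.search(article_text):
        return False
    # Strip excluded phrases first, then re-check for a remaining match.
    cleaned = EXCLUSIONS.sub("", article_text)
    return bool(KEYWORD.search(cleaned))

print(is_relevant("Acme launches new product"))      # True
print(is_relevant("Traffic delays on Acme Street"))  # False
```

Real platforms layer far more sophistication on top of this, but the principle—broad keywords narrowed by exclusion rules—is the same.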

It was not always this easy, nor this quick, to collect and assess coverage.

Media monitoring has evolved considerably in the more than a century of its existence. And monitoring continues to change and adjust to advances in technology and customer demand.

Clipping Services

Media monitoring has a long history, starting with clipping services. The first of these was established in London by Henry Romeike, who moved to the US in 1887 and established a clipping service there called Henry Romeike Incorporated of New York. Other, similar services popped up during the same period, including Burrelle’s Press Clipping Bureau, established in 1888.

Clipping services involved individuals physically paging through newspapers and periodicals, hunting for keywords and phrases, and then cutting out the articles. This labour- and time-intensive process was the norm for decades.

Broadcast Monitoring

Television and radio monitoring was another innovation, enabled by advancements in audio and videotape technologies. Again, this was a largely physical process, which depended on someone recording the content and segmenting off the relevant portion to distribute to the client for review.

The internet expands what we find

Online clipping services were the next big innovation. As use of the internet grew, so did our understanding of what was available to search.

In addition to vastly increasing the access to online publications, electronic searching—rather than wading through physical publications—meant faster results. You no longer had to wait to receive copies of printed clippings sent in the mail.

Computerised results were more comprehensive and quicker to receive, but presented a challenge for those individuals and agencies who were accustomed to proving results through the thickness of binders full of clips.

Links to online video also meant that broadcast content was easier to locate and share, as it was no longer tied to the exchange of bulky video and audio cassettes.

The internet changed not only how PR professionals found media monitoring mentions—it also required that we re-think what those results meant.

Social media enters the chat

Although some companies initially dismissed the potential impact of user-generated posts, it quickly became clear that such content could be wildly influential.

In 2004, a video posted on a blog showed how quickly one could open a Kryptonite bicycle lock using an inexpensive ballpoint pen casing. The post generated a fair amount of discussion online, which was then picked up by the New York Times. In addition to upsetting cyclists—who found it troubling that an expensive lock was so easily circumvented—this early social media crisis resulted in negative press for the company.

Businesses learned that blogs had the potential to share good news or bad—and quickly spread that around the world. The monitoring of this content alongside traditional media sources followed.

As other social media platforms—such as Twitter (now X) and Facebook—emerged, it was not long before requests to monitor those outlets were added to the overall list.

Platform technologies varied, presenting a challenge for monitoring firms. In some cases, highly specialised tools emerged that would monitor the content from a single platform. However, this presented a problem for companies struggling to stay on top of all of the results coming in: who has time to monitor five types of media by using five different monitoring tools?

Adding context: natural language processing

With the amount of content growing dramatically, monitoring providers recognised the need to layer in context, surfacing the items with the greatest potential impact.

Automatically generated charts show daily, weekly, and monthly volumes. Word clouds highlight prominent terms within collected coverage. Charting content over time shows when coverage rises and falls.
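Both of these views boil down to counting. As a minimal sketch—with entirely made-up clippings—daily volumes and word-cloud term frequencies can be derived like this:

```python
from collections import Counter
from datetime import date

# Hypothetical clippings: (publication date, headline) pairs for illustration.
clippings = [
    (date(2024, 5, 1), "Brand praised for recall response"),
    (date(2024, 5, 1), "Recall prompts questions for brand"),
    (date(2024, 5, 2), "Analysts weigh brand recall fallout"),
]

# Daily volume: the basis of a coverage-over-time chart.
daily_volume = Counter(day for day, _ in clippings)

# Term frequency: the basis of a word cloud (common stop words removed).
STOP_WORDS = {"for", "the", "a", "of"}
terms = Counter(
    word
    for _, headline in clippings
    for word in headline.lower().split()
    if word not in STOP_WORDS
)

print(daily_volume[date(2024, 5, 1)])  # 2 mentions on 1 May
print(terms.most_common(2))            # [('brand', 3), ('recall', 3)]
```

Commercial tools add normalisation, deduplication, and richer stop-word handling, but the underlying aggregation is this simple.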

Early attempts at automated sentiment analysis fell short, as sarcasm and humour slipped through the cracks, but this has steadily improved over time.

Incorporating natural language processing into analysis programs has improved sentiment analysis, whilst also aiding areas such as content categorisation and keyword analysis.
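To see why early sentiment analysis fell short, consider the crudest possible approach: a lexicon-based scorer. This toy sketch (with a made-up word list) counts positive and negative terms, and it is easy to see how sarcasm or humour would defeat it:

```python
# Toy lexicon-based sentiment scorer (illustrative only; production tools
# use trained NLP models, and sarcasm still needs careful handling).
POSITIVE = {"praised", "innovative", "growth", "success"}
NEGATIVE = {"recall", "crisis", "failure", "breach"}

def sentiment(text: str) -> str:
    """Score text by counting lexicon hits; sign of the score decides."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("Brand praised for innovative growth"))  # positive
print(sentiment("Lock failure sparks crisis"))           # negative
```

A sarcastic headline like "Another 'success' for the brand" would score positive here, which is exactly the failure mode NLP-based models were brought in to address.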

Artificial Intelligence

Generative AI has demonstrated an adeptness at summarising large amounts of content, an attribute that is being deployed to assess news coverage.

Media monitoring tools are using AI to streamline labour-intensive processes and make sense of large sets of data. AI’s best use right now is alleviating repetitive tasks and assisting with quality checks. As generative AI becomes more sophisticated, new uses are on the horizon.
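Before generative models, summarisation was typically extractive: pick the sentences whose words are most frequent across the coverage. The sketch below shows that classic heuristic—far simpler than what modern generative AI does, and included only to make the idea concrete; the sample text is invented:

```python
from collections import Counter
import re

def extractive_summary(text: str, n: int = 1) -> str:
    """Return the n sentences whose words are most frequent overall —
    a classic extractive heuristic, not generative summarisation."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(w for s in sentences for w in re.findall(r"\w+", s.lower()))
    ranked = sorted(
        sentences,
        key=lambda s: -sum(freq[w] for w in re.findall(r"\w+", s.lower())),
    )
    return " ".join(ranked[:n])

coverage = (
    "The recall dominated coverage. "
    "Coverage of the recall spread quickly. "
    "Analysts noted the recall."
)
print(extractive_summary(coverage))  # "Coverage of the recall spread quickly."
```

Generative models instead produce new text conditioned on the input, which is why they handle paraphrase and synthesis across many articles far better than frequency counting can.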

What is next?

The dramatic rise in the use of AI tools brings with it both promise and problems. In addition to enabling faster, better processing of online information, generative AI can be used to create content that is misleading, deceptive, or skirting the line of plagiarism and copyright violation.

To address this darker side of AI, we may need to look to AI to show us what is “real” content and what has been artificially generated. Filtering content that has been computer-generated so that it may be reviewed for accuracy by humans could become an important part of the analysis process.

Conclusion

The evolution of media monitoring tools has largely been driven by changing technology, and this shows no sign of slowing down. Moving from the physical collection and dissemination of clips and broadcast to online was a major leap for clients of media monitoring products. Now that most content is online, advances are more likely to come in the areas of collection, assessment, and analysis. In the future, there may be a need to deploy AI to quality-check, to ensure that the content is accurate, valid, and reliable.

The Measurement Standard
