How Redskins PR Generated $442,126 of Negative Publicity


Using Bad Measurement Can Cut Both Ways

It’s admittedly silly to get worked up about PR measurement, but today I am.  I see the goal of measurement as falling into one of two camps: Scoreboards or Roadmaps.  The Scoreboard measures how you’re doing. The Roadmap looks at the data and provides insights into how you can do better.

Then there’s the PR measurement done by the Washington Redskins, which uses very fuzzy math to argue for an exact economic benefit from PR activity.  A couple of years ago the Redskins moved their training camp to Richmond, and they recently put together a report estimating that the PR value to the City of Richmond was just a touch over $76 million. Seventy-six million dollars. This was based on an estimate that 7.8 billion people were exposed to their messaging.  Put in perspective, that is roughly half a billion more people than are currently on Earth.

For a brand that already suffers from bad publicity, having a PR measurement report trigger another negative news cycle is not exactly ideal.  If I were the person responsible – here’s how I would measure the impact.

Find Your Clips

First, find as many relevant mentions as possible.   A quick search led me to the following:

Just looking at these headlines must cause a bit of discomfort for anyone who works in PR.

Get Your Data

We use our Report Mule app to quickly format reports and gather impressions and social data.  I dropped in all the links and exported to Excel to make this pivot table.

Pivot Table with impressions and Social Data for Redskins coverage
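
If you don’t have a tool handy, the same rollup is easy to reproduce by hand. Here’s a minimal sketch in Python, assuming a hypothetical CSV export with one row per article and columns for outlet, monthly impressions, and Twitter/Facebook shares (the file name and column names are illustrative, not the actual export format):

```python
import pandas as pd

# "clips.csv" and these column names are illustrative placeholders,
# not the actual Report Mule export format.
clips = pd.read_csv("clips.csv")  # columns: outlet, impressions, twitter_shares, facebook_shares

clips["total_shares"] = clips["twitter_shares"] + clips["facebook_shares"]

# Roll up by outlet, the same shape as the pivot table above.
summary = clips.pivot_table(index="outlet",
                            values=["impressions", "total_shares"],
                            aggfunc="sum")

print(summary)
print("Total impressions:", clips["impressions"].sum())
print("Total shares:", clips["total_shares"].sum())
```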

Working backwards from the original numbers the Redskins used, if you multiply total impressions by 0.0097 you can put an exact dollar value on those impressions.  In this case the 45,580,000 impressions would be equivalent to $442,126.  Whoops.
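
To make that back-of-the-envelope math explicit, here’s a quick sketch. The per-impression rate is just the value implied by the Redskins’ own report (their dollar figure divided by their impression count), using the numbers quoted above:

```python
# Back out the dollars-per-impression rate implied by the Redskins' report,
# then apply it to the impressions generated by the negative coverage.
REPORTED_VALUE = 76_000_000           # "a touch over" $76M claimed for Richmond
REPORTED_IMPRESSIONS = 7_800_000_000  # 7.8 billion claimed impressions

implied_rate = round(REPORTED_VALUE / REPORTED_IMPRESSIONS, 4)
print(f"Implied value per impression: ${implied_rate}")   # $0.0097

negative_impressions = 45_580_000     # total impressions from the pivot table
negative_value = negative_impressions * implied_rate
print(f"'Value' of the negative coverage: ${negative_value:,.0f}")  # $442,126
```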

The point of this is not to beat up on someone (or several people) who are probably having a rough enough time as it is.  PR measurement is tricky and far from an exact science.  However, a little bit of common sense can go a long way toward avoiding problems like this.

Impression numbers are based on the total number of unique visitors to each web property per month. For more detail on this, FiveThirtyEight recently had an amazing write-up on the difficulty of measuring web traffic that I would highly recommend reading.  In this chart we can see that the estimate for the Washington Post is almost 14 million.  There’s no way to know exactly how many of those 14 million potential readers actually saw this one article.

One proxy that we like to use for this is social sharing.   In this case, between Twitter and Facebook, the article was shared 449 times.    When you have good publicity, you may want to argue that each one of those shares then exposes the article to all of that person’s friends and followers. However, estimates show that less than 5% of tweets are ever seen.  Many social monitoring tools will quickly tally up the amplification effect of social sharing by assuming that every tweet is seen by every follower.  This is a slippery slope that should be avoided.
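
To see why the “every share reaches every follower” assumption matters, here’s a small sketch. The 449 shares and the under-5% visibility figure come from the discussion above; the average audience per sharer is a made-up number for illustration:

```python
# Compare a naive amplification tally against a heavily discounted one.
shares = 449          # Twitter + Facebook shares of the Washington Post article
avg_followers = 500   # hypothetical average audience per sharer (made up)
seen_rate = 0.05      # generous upper bound: fewer than 5% of tweets are ever seen

naive_reach = shares * avg_followers
discounted_reach = naive_reach * seen_rate

print(f"Naive amplification:      {naive_reach:,} extra potential views")          # 224,500
print(f"Discounted amplification: {discounted_reach:,.0f} extra potential views")  # 11,225
```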

So how many people saw this negative Washington Post article? Was it 14 million?  Was it 449? While we can’t know for sure, I would argue that the number is much closer to 449 than 14 million.

Since all of these numbers are made up, let’s say that the actual number of impressions is 10x the total number of social shares across all of the articles.   In this case that would be 151,890.   Multiply that by 0.0097 and your net negative effect from this bad PR cycle is $1,473.33.  Not awesome, but a lot better than the high-end estimate.
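
In code, the same conservative estimate looks like this (the 10x readers-per-share factor is, as noted, an arbitrary assumption, and the share total is the one implied by the 151,890 figure):

```python
# A more conservative (still made-up) estimate: assume actual readership is
# roughly 10x the number of social shares, then apply the same multiplier.
total_shares = 15_189        # implied by the 151,890 figure above (151,890 / 10)
readers_per_share = 10       # arbitrary assumption
implied_rate = 0.0097        # same per-impression value as before

conservative_impressions = total_shares * readers_per_share   # 151,890
conservative_value = conservative_impressions * implied_rate

print(f"Conservative 'value' of the coverage: ${conservative_value:,.2f}")  # $1,473.33
```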

For the record – I don’t think adding a dollar value to unreliable source data is doing anyone any favors in good times or bad.

The point here is that there’s a reason to gather data, but it’s important to take a step back and really understand what the data can communicate (and how easily it can be manipulated). For professional communicators it’s worth the effort to understand these numbers and translate them into more realistic reports for your stakeholders.

How to Handle Data

– If you include impression numbers and economic benefit data in a report, very clearly explain that these numbers do not come close to reflecting the actual number of people who saw your messaging or the actual economic benefit.  The real numbers cannot be determined with any accuracy.

– Do continue to gather data so that you can understand the potential reach of each publication.  Supplement this data with as much other relevant information as you can find.  We think social sharing is very useful to consider.

– Compare results over different time periods.   The Redskins would have been much better served by comparing the impressions from 2013 against those from 2014 and showing an increase from one year to the next; a quick sketch of that kind of comparison follows this list.  (There are other potentially gaping problems with this process, but it’s still more responsible than what they chose to do.)

– Use common sense.   Reputation is important, so consider the big picture.   Look at your results through the lens of the general public and think about how they might react.   (Go back and look at the headlines above for reference.)

– Consider context.  None of these source articles were about the City of Richmond.   They were about the Redskins.   Don’t try to argue that the benefit from media exposure of these stories is equivalent for both the city and the team.
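
And here’s the kind of year-over-year comparison mentioned above, sketched in Python with purely invented totals:

```python
# Year-over-year comparison of training-camp coverage. Every number here is an
# invented placeholder; only the shape of the comparison matters.
impressions_by_year = {2013: 30_000_000, 2014: 36_000_000}
shares_by_year = {2013: 10_000, 2014: 13_500}

def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100

print(f"Impressions:   {pct_change(impressions_by_year[2013], impressions_by_year[2014]):+.1f}% year over year")
print(f"Social shares: {pct_change(shares_by_year[2013], shares_by_year[2014]):+.1f}% year over year")
```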

My gut feeling is that there probably is a nice economic benefit to the City of Richmond from hosting the Redskins’ training camp, and PR exposure is certainly part of it.   If the Redskins’ PR team had approached the situation with a more realistic presentation, they would have had a much more compelling argument and avoided an embarrassing news cycle.  Win-win.