I want to preface this by saying that this is quite possibly the most pompous thing I've ever done on this blog: quote myself. As I've often explained to clients, students, co-workers, friends and family who are unaware of WebTrends, Clicktracks, comScore, Nielsen, log files, clickthroughs, unique visitors and hits vs. impressions:
"The greatest part about the internet is that you can measure everything. Getting everyone to agree on the numbers is another issue." - Me
[Author's note: Excuse me, I need you to make room for my head on the page while you read.]
Another story is out today about how a web property (MySpace) was getting jobbed on its uniques. To me, the case is closed. The IAB (Interactive Advertising Bureau) has demanded an audit, due by the end of the year, of comScore and Nielsen, both of whom essentially sample a percentage of internet users and extrapolate that sample to represent all internet use (you can see how completely fucked that model is online).
Throwing another monkey wrench into the works is how antiquated algorithms account for newer technologies like AJAX. This has proven tricky for both of the major players in this space:
But then there are also wild variations between comScore and Nielsen Online's panel data, which further muddies the waters.
Sucks for advertisers. And for publishers. Interesting to note that "deleting cookies" and "work/home internet access" are listed as reasons the big players don't trust publishers' log files, claiming that they inflate the numbers. While that may be true, it doesn't explain why neither is able to track specific access to sites, like Facebook, or the significant drop-off in September when kids go back to school. In fact, it takes several months for their "ratings" to adjust and account for college students.
There are several players trying to fill this gap. Quantcast offers what it calls an "Open Internet Ratings Service," using a hybrid approach that pulls in numbers from log files and from a panel of one million folks. Compete is also offered as an alternative in this space, but at this point it's not a true measure of log files.
What's the solution? Well, for starters, let's agree to share access to our log files. That's a great first step, and it completely kills the notion that a sampling of internet users determines all internet usage. Not gonna happen. A second approach would be agreeing on which stats matter. Is it impressions? Unique monthly visitors? Time spent on a site? Conversion rate? These are all based on mass. The greater the mass, the higher the value. Right, Britney?
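If publishers did open up their log files, the stats in question are easy enough to compute; the real fights are over definitions, not math. Here's a minimal sketch, assuming a simplified, hypothetical log line format of "IP timestamp path" (real server logs carry more fields, and counting distinct IPs is only a rough stand-in for unique visitors):

```python
from collections import defaultdict

def summarize(log_lines):
    """Tally impressions (total requests), unique visitors (distinct IPs),
    and per-page hits from simplified "ip timestamp path" log lines."""
    impressions = 0
    visitors = set()
    page_hits = defaultdict(int)
    for line in log_lines:
        ip, _, path = line.split(maxsplit=2)
        impressions += 1
        visitors.add(ip)
        page_hits[path] += 1
    return {
        "impressions": impressions,
        "uniques": len(visitors),
        "pages": dict(page_hits),
    }

sample = [
    "10.0.0.1 2007-11-01T09:00 /home",
    "10.0.0.1 2007-11-01T09:01 /profile",
    "10.0.0.2 2007-11-01T09:05 /home",
]
stats = summarize(sample)
# 3 impressions, 2 uniques
```

Of course, this is exactly where the cookie-deleting and work/home-access objections bite: one person on two machines looks like two uniques, and a shared IP looks like one. The numbers are measurable; agreeing on what they mean is the hard part.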
When it comes to measurement, however, I don't think measuring the masses is the answer. I've said it before: the question, while easy to dismiss if you're loading up the buckshot and pulling the shotgun's trigger, isn't how big the audience is, but how good it is.
Is quantity a measure of success? Sure, if the only thing that matters is eyeballs. But the low clickthrough rates of banner ads suggest the offline measurement model advertising relies on doesn't work much better online. Even Google PPC ads that yield a 5% clickthrough are considered a success -- and the last time I looked, 5% was a failing grade.
Let me put this into perspective for you with yet another anecdote:
A project I worked on while at Blue Cross Blue Shield of Michigan was a basic event invitation form. Success for this campaign was determined by the number of RSVPs by a certain date. We sent out email invites to 50 physicians, and all 50 RSVP'ed via our online form. That's a 100% success rate. Would advertisers care about it? No, but to individual businesses riding the long tail, it matters.
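In code, the contrast is trivial arithmetic. A quick sketch using the numbers from the anecdote above and the 5% PPC benchmark (the 100-impression figure is just illustrative):

```python
def response_rate(responses, reached):
    """Success as a share of the audience you actually reached,
    not as raw volume."""
    return responses / reached

# The BCBSM invite: 50 invites sent, 50 RSVPs back
print(response_rate(50, 50))   # 1.0, i.e. 100%

# A "successful" PPC campaign: 5 clicks per 100 impressions
print(response_rate(5, 100))   # 0.05, i.e. 5%
```

Same formula either way; the difference is whether you define the denominator as a targeted list of 50 or an undifferentiated mass.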
The point is that while the web may be measurable, the metrics discussion will never end, partly because the goals set up front are meaningless. 5%? I'll take 100% any day.
Update: Proof positive: via Valleywag, "ComScore backtracks on numbers that tanked Google's shares". Nice.