Monitoring social media is a tricky one. A lot of favour has been given to the metrics provided by URL shorteners like bit.ly, which offer inbuilt statistics on click-throughs. But are these accurate?

I had a tweetup with @bfurby_digital on the subject. We're certain that shortened URLs are available to the whole web, and that presents some technical challenges. On one side of the web are the users whose movement we want to track. On the other side are the bots, spiders and automated scripts. These roam around digital content looking for links to explore, whether in the hope of finding fresh content to index or virgin pastures to spam. Does their movement register in the statistics? And what does that mean for the validity of our click-through rates and estimates of user participation?

A big step forward from Y2K-era web statistics packages came when analytics software started to either segment out or remove robot traffic from the data. Typically you'll see a big downshift in overall traffic, especially on frequently updated sites. Overall this may decrease a page's (or link's) apparent popularity yet increase its conversion rate, since bots inflate visit counts but never convert. Googlebot and others learn the typical rate at which a site releases new content and revisit accordingly. Unless special technical measures are implemented to throttle this spidering, it can consume a significant portion of overall page views.
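
To illustrate the kind of segmentation analytics tools perform, here's a minimal sketch in Python. It assumes a combined-format access log called access.log and a hand-picked list of bot user-agent substrings; both are illustrative stand-ins, not an exhaustive filter:

    import re

    # Substrings commonly seen in crawler user-agents (illustrative, far from exhaustive).
    BOT_MARKERS = ["googlebot", "bingbot", "slurp", "spider", "crawler", "bot"]

    def is_bot(user_agent):
        ua = user_agent.lower()
        return any(marker in ua for marker in BOT_MARKERS)

    # Match the request and the trailing user-agent field of a combined-format log line.
    LOG_PATTERN = re.compile(r'"(?:GET|POST) \S+ [^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"')

    human_hits, bot_hits = 0, 0
    with open("access.log") as log:
        for line in log:
            match = LOG_PATTERN.search(line)
            if not match:
                continue
            if is_bot(match.group("ua")):
                bot_hits += 1
            else:
                human_hits += 1

    print(f"Human page views: {human_hits}; bot page views: {bot_hits}")

Real analytics packages use maintained bot lists and behavioural signals rather than a crude substring match, but the principle of splitting the two populations is the same.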

The advent of mash-up and dynamic content further exacerbates this issue. A page may remain fundamentally the same, yet a small box showing links to the latest updates may constantly change, fooling the search spider into believing the page has a high update rate. Again, the net result is inflated page views, which will skew any analysis of effectiveness.

Pages featuring shortened URLs are often exactly these mash-up and/or dynamic pages!

Here's a way of testing the accuracy of your URL shortener's inbuilt statistics. Use a tracking code in the URL you shorten. These are simple parameters tagged onto the end of a URL. To a visitor they do nothing, but your analytics package will log them separately. Let's look at an example:

  1. http://www.myblog.com/hits.htm <– your normal URL
  2. http://www.myblog.com/hits.htm?src=bitlycampaign1 <– the same URL prepared with a tracking code so we can identify precisely which visits came through it.

The prepared URL can then be input into bit.ly or your URL shortener of choice. The majority of web analytics software will identify the unique parameter passed on the URL and log it separately. We've now succeeded in making two URLs for one page, which means we can distinguish direct hits on the page (visits to the untagged URL) from visits via our shortened URL campaign. That gives us a way of verifying the shortener's inbuilt statistics: its click count should roughly match the visits your analytics records against the tagged URL.
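
If you're tagging lots of URLs, a small helper saves fiddly string work. Here's a minimal sketch in Python; the parameter name src and the campaign value are just the examples from above, so substitute whatever naming convention your analytics package expects:

    from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

    def tag_url(url, campaign, param="src"):
        """Append a tracking parameter, preserving any existing query string."""
        parts = urlparse(url)
        query = parse_qsl(parts.query)
        query.append((param, campaign))
        return urlunparse(parts._replace(query=urlencode(query)))

    # Prints: http://www.myblog.com/hits.htm?src=bitlycampaign1
    print(tag_url("http://www.myblog.com/hits.htm", "bitlycampaign1"))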

For cleanliness, the SEO buffs amongst you may want to place a canonical URL tag on the destination page so that Google, whilst acknowledging there are two URLs at play, treats them as one link juice-wise. We don't want to start splitting PageRank over multiple URLs, since this is an analytics hack rather than new content.
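
The tag sits in the destination page's head and points both the tagged and untagged URLs back at the one address you want credited; here it uses the example URL from above:

    <link rel="canonical" href="http://www.myblog.com/hits.htm" />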

My immediate thought is that most URL shorteners are simply reporting unsegmented click-through rates. Using them on Twitter, which has an ever-growing number of friendly, neutral and malicious bots in constant operation, will therefore make your campaigns look far buzzier than they really are! As serious marketers, we want to look at honest rates so we can start to determine true KPIs and thereby provide clients with great-value campaigns backed up with measurable results. This is a useful hack for self-auditing. Please let me know your feedback on how accurate (or not) your system is, and I'll update as I go.
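
To make the self-audit concrete, the arithmetic looks like this. The figures below are purely hypothetical stand-ins; a real audit would take the click count from the shortener's dashboard and the tagged-visit count from your analytics package:

    # Hypothetical figures: substitute your own from the two dashboards.
    shortener_clicks = 500   # clicks reported by bit.ly for the shortened link
    tagged_visits = 320      # visits logged against ?src=bitlycampaign1 in analytics

    # Any sizeable gap suggests bot clicks inflating the shortener's count.
    inflation = (shortener_clicks - tagged_visits) / shortener_clicks
    print(f"Apparent inflation: {inflation:.0%}")   # -> Apparent inflation: 36%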
