Whenever you measure an online campaign end to end (SEA, banners, emailing), you usually get data from your beloved Web Analytics tool (what I call internal or onsite data) and data from your advertising agency (what I call external or offsite data). A common reflex is to put both side by side and try to reconcile them… But they won't match, and (almost) never will. How is that possible?
“There must be something wrong!”
I have stopped counting the number of times I have been asked this question: "I got figures from my AdWords/banner campaign, but when I look at our WebTrends data, I don't get the same numbers. Why are there discrepancies? Which data are correct?". Be prepared – if you start tagging your campaigns with your own tool (and God knows you should if you don't) and if you get advertising data – you will be asked such questions a lot! It is also a recurring question on the Yahoo WA forum.
“Welcome to the real world!”
You will have to accept that "external" data and "internal" data will never match (or you are very lucky if they do), with external figures (from banners/AdWords) usually being higher. The difference can vary widely: from a few percent (very good) to more than 25% (not unusual). Is this abnormal? Not necessarily.
In a perfect world, one would expect the figures to match. After all, isn't the Internet highly measurable? Isn't Web Analytics a science? Well, the world is not perfect – far from it. The same goes for the Internet and Web Analytics.
Why they will never match
Business stakeholders want an answer. Something other than "because that's just how it is" (it hardly works with my 3-year-old son, so…). You need to provide explanations. Here are some of the reasons why these damned figures don't match (they're evil!)
- Clicks are NOT visits: Most of the time, banner/ad performance is measured in impressions and clicks, while your tool reports visitors and visits. These are totally different measures & concepts. Comparing them is like comparing apples with er… well, something else :-) Again, in an ideal world one would imagine that a single click converts into a single visit. But not in the real world. Why? For example, a visitor who clicks on an ad, bounces back to the original page and clicks again will generate two clicks but one visit. If you really want to compare clicks to something else, the measure that comes closest to a click is a page view.
- Different tools, different results: Add to the unit difference the fact that two different tools will give different numbers, because each has its own way of collecting data and its own secret algorithms & measure definitions. The final results will differ (though the gap should remain limited).
- Link tagging issues: Most Web Analytics tools rely on specific parameters added to the URLs to identify campaign traffic (for example utm_campaign in Google Analytics, WT.mc_id in WebTrends). If these are not present, the clicks will still be counted by the ad server, but the resulting visits will be lumped in with "normal" traffic instead of campaign traffic. This is a very frequent source of discrepancies, so make sure that all your campaign links get correctly tagged – what you need are good guidelines, standards & processes (see the "Working with Web Agencies - Take control" post).
- Landing page tagging issues: If the landing page or microsite is not tagged (or is tagged incorrectly), no visit will be counted while ad clicks will be – leading to discrepancies. Always double-check (quality process)!
- The use of first-party cookies: In many tools, campaign traffic measurement is based on a cookie (usually a first-party cookie). Not all users will accept it, and those who refuse will not be counted as campaign traffic – introducing differences in visit/visitor counts (while ad clicks are based on hits recorded at the ad server level).
- Tag position: It is usually recommended to place your WA tag at the end of the HTML page – giving priority to content (and therefore to the user experience) over measurement. However, with heavy content (often the case with Flashy campaign microsites), the page can be slow to load. This gives users enough time to bail out (you have no idea how quick they can be) before the tag is triggered. The click is counted, not the visit. Or they may have time to move to another page – in which case the visit will be counted but the campaign referrer will be lost.
- Referrers get lost in between: If the ad points to a URL that redirects to the actual landing page, the referrer parameters will be lost (unless a special mechanism is in place) and the visit will be counted as "normal" traffic. It may seem obvious, but it happens more frequently than you think.
- Technical issues or bad configuration: If the page fails to load (it happens) or if ad links are not set up correctly, users will never reach the pages while clicks still get recorded.
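To make the clicks-vs-visits gap concrete, here is a minimal sketch (in Python, with made-up visitor IDs and timestamps – not from any real tool) that groups raw ad clicks from the same visitor into visits using a 30-minute inactivity window, which is the kind of sessionization most Web Analytics tools apply:

```python
from datetime import datetime, timedelta

# Hypothetical raw click log from an ad server: (visitor_id, click_time)
clicks = [
    ("v1", datetime(2009, 1, 5, 10, 0)),
    ("v1", datetime(2009, 1, 5, 10, 2)),   # bounced back and clicked again
    ("v2", datetime(2009, 1, 5, 11, 0)),
]

SESSION_TIMEOUT = timedelta(minutes=30)

def count_visits(clicks):
    """Group clicks per visitor into visits using an inactivity timeout."""
    visits = 0
    last_seen = {}
    for visitor, ts in sorted(clicks):
        prev = last_seen.get(visitor)
        if prev is None or ts - prev > SESSION_TIMEOUT:
            visits += 1  # a new visit starts for this visitor
        last_seen[visitor] = ts
    return visits

print(len(clicks))          # 3 clicks reported on the ad side
print(count_visits(clicks))  # 2 visits reported on the analytics side
```

Visitor v1's two clicks fall within the same 30-minute window, so the ad server reports three clicks while the analytics tool reports only two visits – a built-in discrepancy before any tracking issue even comes into play.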
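Since missing campaign parameters are such a frequent culprit, the check can even be scripted. This is a rough sketch (Python; the parameter name and URLs are illustrative – swap in whatever your own tool expects) that flags campaign links lacking the tracking parameter:

```python
from urllib.parse import urlparse, parse_qs

# The tracking parameter your tool expects (e.g. utm_campaign for Google
# Analytics, WT.mc_id for WebTrends) -- adjust to your setup.
CAMPAIGN_PARAM = "utm_campaign"

def is_tagged(url):
    """Return True if the URL carries the expected campaign parameter."""
    query = parse_qs(urlparse(url).query)
    return CAMPAIGN_PARAM in query

# Hypothetical list of landing URLs used in the banner ads
landing_urls = [
    "http://example.com/promo?utm_campaign=spring09",
    "http://example.com/promo",   # untagged: will show up as "normal" traffic
]

untagged = [u for u in landing_urls if not is_tagged(u)]
for u in untagged:
    print("Missing campaign tag:", u)
```

Running something like this over the agency's media plan before launch is far cheaper than explaining a 25% gap afterwards.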
“Ok but what can I do?”
The first thing is to understand and accept that there will always be some discrepancies between external & internal data. Have a look at the difference between the various sources and check that it remains more or less constant over time. If not, something may be going wrong.
Also, if you see a major difference – say, above 30% – it's time to put on your Sherlock Holmes hat and start investigating!
- First, check that URLs are correctly tagged & configured (and that no redirect strips the parameters).
- Inspect your landing page and make sure it is correctly tagged and that the tag fires quickly enough (use WASP or any HTTP sniffer). Also, if your campaign measurement relies on a first-party cookie, is it set up correctly?
- Check server monitoring logs to see if there was any performance issue or downtime during the campaign.
- If you are using page tagging, a good way to investigate is to take the Web server logs and analyze them with a log analyzer. Are the results closer to your data or to the external ones?
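If you go the server-log route, even a rough count of campaign-tagged requests tells you whether ad clicks are actually reaching your server. Here is a minimal sketch (Python, assuming a common combined log format and the utm_campaign parameter – both are illustrative, so adapt them to your setup):

```python
import re

# Two (simplified) combined-format access log lines -- illustrative only.
sample_log = """\
203.0.113.7 - - [05/Jan/2009:10:00:02 +0100] "GET /promo?utm_campaign=spring09 HTTP/1.1" 200 5120
203.0.113.8 - - [05/Jan/2009:10:00:05 +0100] "GET /promo HTTP/1.1" 200 5120
"""

# Extract the requested URL from the "METHOD url HTTP/x.x" request field
REQUEST_RE = re.compile(r'"(?:GET|POST) (\S+) HTTP')

def count_campaign_hits(log_text, param="utm_campaign"):
    """Count requests whose URL carries the campaign parameter."""
    hits = 0
    for line in log_text.splitlines():
        m = REQUEST_RE.search(line)
        if m and param + "=" in m.group(1):
            hits += 1
    return hits

print(count_campaign_hits(sample_log))  # 1 campaign-tagged request out of 2
```

If this server-side count sits close to your page-tag figures but far below the agency's click count, the clicks are being lost before they reach you – exactly the situation in the case study below.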
One more piece of advice: define a quality process & checkpoints to ensure that tagging is done correctly and fully tested before going live. Again, it seems so obvious, but it is still skipped too often.
To conclude, here is an example from my personal experience:
Once, one of our local markets ran a major banner campaign for our luxury brand. They decided to have it tagged with our central solution (WebTrends) in addition to the usual advertising reporting they got from their local agency. After the campaign, they came back to me because the difference between the figures reported by their agency and by WebTrends was above 50%!
The first thing to check, for me, was whether the ad requests had reached our server at all. We took the Web server logs and analyzed them with our WebTrends software installation. The campaign traffic figures were quite close to the ones from the page-tag data. So the problem occurred upstream. The agency investigated and discovered that a technical issue prevented some of the banners from working correctly, so clicks were recorded but visitors never reached the site. Case solved!
What do you think? Does it make sense? Any other sources of discrepancies? How do you tackle such situations?
Related posts & resources:
- "Good practices: Working with Web agencies - take control!" - September 2008
- "Yahoo ad click-thru rate and Webtrends referral visits" thread on Yahoo WA forum
- "Discrepancies between WebTrends and Google Adwords" from WebTrends Outsider
- "Why Don’t the Numbers Match?!?" by Judah Philips from WebAnalyticsDemystified