As a marketer, it can be incredibly frustrating when you are trying to match up numbers between two different services for the same link, campaign, ad, web page or any other online event. We feel your pain!
From our experience auditing and reconciling reporting between services, we have found these differences usually come down to one or more of the reasons discussed below. While everyone’s tolerance for “close enough” is different, we’ve found that +/- 10% is usually a reasonable target.
Comparing Apples to Oranges
Because there was initially no centralized source defining what was and wasn’t a bot, spider, crawler, etc., most major platforms now maintain their own proprietary lists and methods for separating organic from inorganic traffic. On top of this, they may also filter out clicks, events, actions, etc. for other reasons (fraud detection, a whitelist of users, etc.). The result is one service filtering traffic one way and another service filtering the exact same traffic another way. Of course, the numbers will never match up!
Discrepancies can also come down to what exactly you are counting. Total clicks or unique clicks? Or something in between? What constitutes a unique click? Does a unique click require a different IP address, or a completely different device fingerprint? Is the process for identifying a device’s fingerprint different between services?
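To make this concrete, here’s a minimal Python sketch (the log entries and field names are hypothetical) showing how the exact same raw click log produces three different “click counts” depending on the deduplication key a service uses:

```python
# The same four raw clicks, counted three different ways.
clicks = [
    {"ip": "203.0.113.5", "ua": "iPhone Safari"},
    {"ip": "203.0.113.5", "ua": "iPad Safari"},    # same IP, different device
    {"ip": "203.0.113.5", "ua": "iPhone Safari"},  # a true repeat click
    {"ip": "198.51.100.7", "ua": "Chrome"},
]

total = len(clicks)                                          # every click counts
by_ip = len({c["ip"] for c in clicks})                       # unique = unique IP
by_fingerprint = len({(c["ip"], c["ua"]) for c in clicks})   # unique = IP + device

print(total, by_ip, by_fingerprint)  # 4 2 3
```

Three services applying these three definitions to identical traffic would report 4, 2, and 3 clicks, and none of them would be “wrong.”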
Unfortunately, time can cause discrepancies between reporting among services in a few different ways:
First is the definition of a day. Yes, 24 hours make up a day, but when you start and stop that day makes a huge difference. For example, if your traffic spikes at a certain time, that spike may be counted in one day when the day starts and stops at midnight UTC, but in the previous or following day when measured against Pacific Time (US West Coast), as that is a seven- or eight-hour difference depending on daylight saving time.
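A quick Python sketch illustrates this: the exact same click lands in two different calendar days depending on the time zone used to bucket it (the Pacific offset is hard-coded here purely for illustration):

```python
from datetime import datetime, timezone, timedelta

# A click recorded at 03:00 UTC on June 2nd.
click = datetime(2023, 6, 2, 3, 0, tzinfo=timezone.utc)

utc_day = click.date()
pacific = timezone(timedelta(hours=-7))  # PDT; would be -8 (PST) in winter
pacific_day = click.astimezone(pacific).date()

print(utc_day)      # 2023-06-02
print(pacific_day)  # 2023-06-01 -- same click, attributed to the previous day
```

A service reporting in UTC and a service reporting in Pacific Time will put this click in different days, so their daily totals diverge even when their lifetime totals agree.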
The longer the time period you are comparing between services the more likely a difference in time zones will start to normalize out, but it’s always something to be cognizant of for comparisons during shorter time spans.
Some services won’t count a click/pageview/event if the time spent at the final destination was less than a certain period. Often called a “bounce,” this behavior might be reflected in some reports but not others.
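As an illustration (the dwell-time field and the five-second threshold are hypothetical; each service picks its own), here’s how a bounce filter changes the count:

```python
# Four visits to the same destination, with how long each visitor stayed.
visits = [
    {"dwell_seconds": 2},
    {"dwell_seconds": 45},
    {"dwell_seconds": 1},
    {"dwell_seconds": 120},
]

BOUNCE_THRESHOLD = 5  # seconds; an arbitrary cutoff for this sketch

all_visits = len(visits)
non_bounces = len([v for v in visits if v["dwell_seconds"] >= BOUNCE_THRESHOLD])

print(all_visits, non_bounces)  # 4 2
```

A service that counts every arrival reports 4; a service that discards bounces reports 2 for the exact same traffic.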
On a similar note, some services tie their metrics to a “cookie” or “conversion” window: where one service might count in one-hour increments, another might use 24 hours, and a third might use 30 days. This is one of the reasons Facebook gives for why its reports differ from Google Analytics.
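Here’s a small sketch (with made-up timestamps) showing how the same conversion is counted or dropped purely based on the window length:

```python
from datetime import datetime, timedelta

# A click, and a conversion roughly 47 hours later (hypothetical times).
click_at = datetime(2023, 6, 1, 10, 0)
converted_at = datetime(2023, 6, 3, 9, 0)

def counts_conversion(click, conversion, window):
    """True if the conversion happened within `window` after the click."""
    return timedelta(0) <= (conversion - click) <= window

print(counts_conversion(click_at, converted_at, timedelta(hours=1)))   # False
print(counts_conversion(click_at, converted_at, timedelta(hours=24)))  # False
print(counts_conversion(click_at, converted_at, timedelta(days=30)))   # True
```

One service credits this conversion to the click; the other two never see it, and their conversion totals disagree accordingly.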
A reporting delay can also be a factor in comparing numbers. Some services report in real time or “near” real time (anywhere from 1 to 10 minutes delayed), while other services may only update their reporting on a daily basis or with a 24-hour lag.
In a world of multiple redirects (used to leverage multiple unique services and programs) and slow internet connections, there is often some level of drop-off at each step as a link or page resolves. In theory, the first redirect should see the most traffic, and every additional redirect should see some percentage less. With this in mind, the upstream service (earliest tracker) should report a higher number than the downstream or final destination (tracker).
This problem is often exacerbated by slow or spotty mobile connections and older devices.
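As an illustration, here’s a sketch of how even a small per-hop drop rate (the 2% here is purely illustrative, not a measured figure) compounds across a redirect chain:

```python
# Expected traffic at each hop of a redirect chain, assuming a constant
# per-hop drop rate from abandoned or failed requests.
clicks_at_first_hop = 10_000
drop_per_hop = 0.02  # 2% of users lost at each additional redirect

counts = [clicks_at_first_hop]
for hop in range(3):  # three downstream redirects
    counts.append(int(counts[-1] * (1 - drop_per_hop)))

print(counts)  # [10000, 9800, 9604, 9411]
```

So a tracker at the first hop and a tracker at the final destination can both be counting honestly and still disagree by several percent.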
While there are a lot of reasons why the numbers in reports from different services won’t match exactly, they should usually be close (we consider +/- 10% to be close). If they are not, it’s definitely worth investigating, as this can be a good indicator of an issue. There is nothing worse than running a campaign and then feeling like nothing matches and you can’t trust any of your data.
If you run into this problem please let us know, we’d love to help!
Please note that Geniuslink is completely revamping how we report on “junk” traffic (bots, pre-fetch and duplicate clicks) which will go live Jan. 1, 2018. Read more about the rollout in our blog Better Click Counts, Purer Data or hear the story of how Geniuslink has dealt with bad clicks in the past in our article Tracking Down Bad Clicks.