“Argh! My Attribution Analytics Don’t Match!” (Part 1 of 2)
Attribution analytics can be frustrating. Different tools often report wildly different conversion, revenue, and attribution numbers. But this isn’t always a problem. In fact, different attribution models can be an asset — if you know how to use them.
eCommerce marketers want to understand the return on their brand’s investment in paid ads, social media, search optimization, email marketing, and other channels. They want robust analytics so that they can optimize campaigns within a single channel — and prioritize spend across channels.
It’s almost 2023. This should be easy, right?
Well, no… Attribution analytics can still be quite confusing. It’s common for DTC brands — even enterprise eCommerce brands with dedicated data and analytics teams — to struggle to understand differences in the reporting they find in Google Analytics, Google Ads, Facebook Ads, and so forth.
In fact, an entire product category of multi-touch attribution analytics tools has grown out of these challenges, led by products like Triple Whale and Northbeam. But even these tools can be confusing and problematic (as described below).
In this two-part Medium series, I will cover common measurement and modeling issues that can cause these reporting deviations. In Part 1 (this post):
- I’ll talk through the best way to troubleshoot pesky data anomalies.
- I’ll describe things you can do to collect more complete, more reliable data across your measurement platforms.
Then, in Part 2:
- I’ll explain why it’s sometimes a good thing that different tools are reporting different attribution numbers.
The first step in understanding differences in attribution analytics reports is to understand the data you are collecting, and how it’s being utilized by different analytics platforms.
Step 1: Differentiating Between MEASUREMENT and REPORTING
My company, Fueled, provides a free Google Analytics integration for BigCommerce and Shopify. This app was released to help DTC eCommerce merchants migrate from Google’s Universal Analytics (which is sunsetting in July 2023) to Google Analytics 4 (GA4).
Quite frequently, new customers will ask us why they are seeing different numbers when comparing UA reports to GA4 reports.
Moreover, customers will often log into BigCommerce or Shopify to compare the order count and revenue reported within those applications to what is being reported in Google Analytics.
When these reports don’t match, it’s natural to assume that our app might be the culprit. The vast majority of the time, however, the variation in attribution analytics is due to differences in how various reporting mechanisms and tools are set up.
Google Analytics, Facebook Pixel, Fueled, Segment, mParticle — there are many applications that can be used to measure (or collect) attribution data. But these are separate from the tools used to report on this data.
Example: Google’s Universal Analytics and Google Analytics 4 offer drastically different reporting features. Many marketers switching to GA4 are disappointed in its current limitations. GA4 can actually collect more complete event data than UA — but GA4 doesn’t yet provide tools for analyzing and operationalizing this additional data.
Step 2: Understanding How Attribution Events Are Being Captured
Event-Based Analytics vs. API-Driven Analytics There are two ways that attribution analytics reports are generated. First, tools like Google Analytics can capture customer events, like purchases, in real time, as those events occur. Alternatively, reporting tools can poll APIs to aggregate data from various sources.
Sometimes, merchants are confused when they log into Shopify and see 20 sales for a given period, while Google Analytics is reporting, say, 21 or 22 sales. Often, this is because Shopify deducts cancelled orders for that period. Since Google Analytics only counts orders as they come in, and doesn’t know when an order is cancelled, Google Analytics can overreport these sales.
This can happen the other way too: If Google were to capture a $100 sale, and then the customer were to email the merchant to add an additional $50 product to the order, Shopify would report $150 in sales, whereas Google would only report on the initial $100 order.
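The two scenarios above can be sketched in a few lines of Python. The order data here is entirely hypothetical — the point is that an event-based tool records each order’s value at purchase time, while a platform like Shopify reports the order’s current state:

```python
# Sketch: why event-time totals (e.g., Google Analytics) and current-state
# totals (e.g., the Shopify admin) diverge. Order data is hypothetical.

orders = [
    {"id": 1001, "value_at_purchase": 100, "current_value": 150, "cancelled": False},  # edited up by $50
    {"id": 1002, "value_at_purchase": 80,  "current_value": 80,  "cancelled": True},   # cancelled later
    {"id": 1003, "value_at_purchase": 60,  "current_value": 60,  "cancelled": False},
]

# An event-based tool counts every purchase event as it fires, and never
# learns about later cancellations or edits.
ga_orders = len(orders)
ga_revenue = sum(o["value_at_purchase"] for o in orders)

# Shopify reports the *current* state: cancelled orders deducted, edits included.
active = [o for o in orders if not o["cancelled"]]
shopify_orders = len(active)
shopify_revenue = sum(o["current_value"] for o in active)

print(ga_orders, ga_revenue)            # 3 240
print(shopify_orders, shopify_revenue)  # 2 210
```

Neither tool is “wrong” here — they simply answer different questions about the same three orders.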
What Is the Value of a Purchase?
One of the most common causes of differences in attribution reports that I see comes down to how the value of a purchase event is calculated on each attribution platform. Does the value of a purchase event being sent to Google include taxes and shipping (which is what Shopify would call the total_order_value)? Or is the value of a purchase simply the product total minus discounts (which is what Shopify would call the subtotal of the order)?
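Here is a minimal sketch of those two calculations. The field names loosely mirror Shopify’s order structure but are simplified and hypothetical:

```python
# Sketch: two common ways attribution platforms compute a purchase "value".
# Field names are simplified/hypothetical stand-ins for Shopify order fields.

order = {
    "line_items_total": 120.00,  # sum of product prices
    "discounts": 20.00,
    "shipping": 10.00,
    "taxes": 8.00,
}

# Option 1: subtotal — products minus discounts, before shipping and taxes
subtotal = order["line_items_total"] - order["discounts"]

# Option 2: total order value — subtotal plus shipping and taxes
total_order_value = subtotal + order["shipping"] + order["taxes"]

print(subtotal)           # 100.0
print(total_order_value)  # 118.0
```

If one platform sends the subtotal and another sends the total order value, your revenue reports will diverge on every single order — even when order counts match perfectly.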
Client-Side vs. Server-Side Event Capture
Recently, a new breed of attribution tracking solutions has emerged (Fueled being one of them) that tracks these events on the server.
With server-side tracking, events are sent from an eCommerce or marketing application to Google or Facebook or another destination directly, through a “server-to-server” connection. This guarantees a higher percentage of tracked events, getting around ad blockers or other restrictions in the browser.
However, if one attribution tool is leveraging server-side events and another is leveraging client-side events, they might receive slightly discrepant data.
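As an illustration of what a server-to-server purchase event might look like, here is a sketch based on GA4’s Measurement Protocol. The measurement ID and API secret are placeholders, and the exact payload your platform sends may differ:

```python
import json

# Sketch: a server-side "purchase" event shaped for GA4's Measurement
# Protocol. MEASUREMENT_ID and API_SECRET are placeholders, not real values.

MEASUREMENT_ID = "G-XXXXXXX"    # placeholder
API_SECRET = "your-api-secret"  # placeholder

payload = {
    "client_id": "555.12345",  # ties the server event back to a browser session
    "events": [{
        "name": "purchase",
        "params": {
            "transaction_id": "1001",
            "currency": "USD",
            "value": 118.00,  # note: is this the subtotal or the total order value?
            "tax": 8.00,
            "shipping": 10.00,
        },
    }],
}

body = json.dumps(payload)

# A real integration would POST this body from the server, bypassing
# browser-based ad blockers entirely, e.g. (using the requests library):
# requests.post(
#     "https://www.google-analytics.com/mp/collect"
#     f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}",
#     data=body,
# )
print(payload["events"][0]["name"])  # purchase
```

Because this request never touches the shopper’s browser, an ad blocker has nothing to intercept — which is exactly why server-side counts tend to run higher than client-side counts.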
Differences in Tracking Script Setups
As mentioned above, client-side tracking scripts can be configured to send different values and event parameters to platforms like Google and Facebook.
Moreover, different scripts could be implemented in different “places” in the customer journey (i.e., on different pages), leading to different results.
For example, Shopify provides a built-in Facebook Pixel integration for capturing purchase events immediately upon a customer hitting “submit” on the checkout screen. Google Ads is usually implemented on Shopify’s Order Confirmation Page.
Understanding Ad Blockers
As mentioned above, ad blockers are becoming more and more prevalent on the Internet. For example, in the U.S., it’s estimated that up to 14% of attribution tags loaded by Google Tag Manager are blocked.
That’s 14% of your orders and customer data that you can’t see in your analytics platform!
Ad blockers don’t block all client-side attribution snippets equally. Fueled’s attribution scripts, for example, are blocked less frequently than Google Tag Manager because these ad blockers don’t yet recognize our scripts.
In short, if you’re seeing slight differences in order counts between two attribution scripts, it could be due to ad blockers.
Step 3: Understanding How Reporting Tools Work
Data Lags and Time Zone Differences Whenever a merchant on Google Analytics comes to us to say that the data reported in GA differs from what they see in Shopify or BigCommerce, the first question we ask is: What reporting window are you using?
Google documents that Google Analytics can have a 24–48 hour delay in processing event data that it receives from a website. Frustratingly, this “data lag” isn’t consistent. Sometimes Google Analytics reports are updated within an hour of an event occurring; sometimes the data is present, but incomplete, for 48 hours. (Other analytics platforms, like Mixpanel, don’t face this challenge.)
Similarly, different reporting tools will use different cutoff times and time zones when tracking daily order counts and revenue figures.
The time zones used for Shopify reporting and Google Analytics reporting are both configurable in those applications’ settings. Other reporting tools default to midnight UTC, and still others use the time zone of the user generating the report.
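The time-zone effect is easy to demonstrate: the same order can land on different reporting days in different tools. The timestamp below is hypothetical:

```python
from datetime import datetime, timezone, timedelta

# Sketch: one (hypothetical) order, two reporting days, depending on the
# time zone each reporting tool uses for its daily cutoff.

order_time_utc = datetime(2022, 12, 1, 2, 30, tzinfo=timezone.utc)

# A tool reporting in UTC puts this order on December 1.
utc_day = order_time_utc.date().isoformat()

# A tool reporting in US Pacific time (UTC-8 in December) puts it on
# November 30, since 02:30 UTC is 18:30 the previous day in that zone.
pacific = timezone(timedelta(hours=-8))
pacific_day = order_time_utc.astimezone(pacific).date().isoformat()

print(utc_day)      # 2022-12-01
print(pacific_day)  # 2022-11-30
```

Multiply this across every late-night or early-morning order and two tools with different cutoffs will disagree on daily totals even when their underlying data is identical.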
“Data stitching” refers to how an attribution tracking solution compares/combines user events to connect the effects associated with a single shopper visiting your website.
Interestingly (frustratingly??), Google’s Universal Analytics and Google Analytics 4 use different methodologies to define user sessions.
In fact, one of the value propositions of Google Analytics 4 is that it leverages machine learning to stitch together events and identify users across browsers and devices.
While differences in data stitching approaches shouldn’t throw off order counts and total revenue, they might result in different user counts that skew per-user calculations between applications.
Analytics Troubleshooting Hit List
Consolidating the insights shared above, if you’re seeing different attribution data in different reporting tools, consider troubleshooting as follows:
Common Measurement Issues:
- If order counts match but your revenue reporting is off, review how the order value is being calculated.
- If order counts match and order values are being calculated the same way, check if the reporting tool is deducting the value of cancelled, refunded, or modified orders.
- Are ad blockers preventing the tracking of some orders?
Common Reporting Issues:
- Check if your reporting tool has a documented data lag, as does Google Analytics.
- If your reports are just slightly off, make sure your cutoff windows are using the same time zone settings.
- If channel attribution doesn’t match between tools, are the tools using different attribution models? Are they using different multi-touch attribution windows? (More on different models in Part 2.)
Up Next In Part 2
In the next part of this series, we will talk through common attribution modeling techniques, and why different reporting tools leverage different models. I will then explain why it’s sometimes a good thing that different tools are reporting different attribution numbers. More soon! Thanks!!!
If this post has been helpful, or if it’s just opened up more questions for you about eCommerce attribution analytics, please reach out! I’d love to learn more about the challenges you’re facing or solutions you’ve put together for event tracking and attribution reporting.
If you’re interested in a free, incredibly robust solution for Google Analytics 4 event tracking, check out our Shopify App:
Finally, if you’re looking for a more robust customer data platform for eCommerce, let’s talk Fueled!!