Accurate data and reporting can mean the difference between a profitable business and one that's...well, not so profitable. We take a look at some of the common mistakes we see digital marketers making, and how you can improve your reporting, analysis, and tracking.
More About Vanity Metrics:
https://youtu.be/-0QL3hzGoJk
More About Marketing Attribution:
https://youtu.be/sEXmGb9wCgw
Hey guys, Roy here from Apotheca Marketing. You know, data is incredibly important to understanding how your marketing programs are working, right? Whether they're profitable, whether they're targeting the audiences that you want. It will tell you how people are using your website, but it's not the sexiest topic, right?
But it's one of the areas we see a lot of mistakes that are being made by digital marketers all the time. We want to talk about some of the things that you can do to make sure that you have the data that you need to be able to measure your programs, measure your users, and so you can make the most money profitably that you can.
So let's take a look. All right. Probably the biggest area that we see where people make mistakes with their data and leads them to making poor decisions about their marketing programs, is how their analytics is implemented. Again, this isn't the sexiest area. It's very technical, but doing a good implementation takes somebody who really knows what they're doing.
[00:01:06] Admittedly, there are content management systems and website platforms that allow you to just put your Google ID in there and it'll set up basic tracking for you. Usually, for a lot of sites, that's okay, but what it doesn't have is information about events and conversions and stuff like that out of the box.
[00:01:26] Now, the other thing to keep in mind is that that doesn't really work with Google Analytics 4, which is taking the place of Universal Analytics in July of next year. So, you know, just popping some code in there and hoping that it works isn't going to do you any good, and then you're not going to get the data that you need.
[00:01:42] So you need to make sure, from the get-go, that your Google Analytics is implemented correctly, or your Adobe Analytics or whatever platform you're using: that it's on all your pages, that it's tracking correctly, that your conversions are firing, that you're tracking the events that you want to track.
[00:02:00] Once you have that set up, you need to test it and make sure that your data is coming through correctly. Because if you don't, and you find out a month or two from now that you're missing some key metrics like shopping cart activity or something like that, you don't get it back. Once you lose that data, it's gone, and now you have a gap in your record, and that makes it increasingly hard to do year-over-year comparisons or anything like that if those key metrics are missing. Even if you're sure that the implementation is correct, what we recommend is doing a regular audit; anytime someone touches the website, anytime something changes, go through and perform an audit. And again, make sure that all your pages are tagged correctly, that you're tracking the right metrics, that those conversions are coming through, and that all of the data is coming through correctly.
[00:02:56] The other thing that we see quite often, and we'll talk about it a little bit more later, is this idea that there are too many hands in the cookie jar, right? A lot of times we'll see somebody put that base code onto the site, and then all of a sudden their numbers are looking odd or their traffic has doubled, and it's because somebody else has come along and, not realizing it or not paying attention, implemented the code again, either directly or through something like Google Tag Manager. Now what you have is the same Google account tracking twice, and it's going to duplicate information and data. You're going to have suspiciously large traffic numbers and incorrect data because, again, it's double counting. You don't get the real data back.
[00:03:45] You just have to annotate that and realize that the data was incorrect going forward. That's why we recommend one implementation. We recommend using a tag management product like Google Tag Manager, so that you are sure that there's one instance that you're controlling and that it's the right instance.
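To make that concrete, here's a rough Python sketch, not anything from the video, and with a made-up page source and measurement ID, of the kind of check a duplicate-tag audit does: count how many times each measurement ID is passed to `gtag('config', ...)`, since the same ID configured twice on one page is the classic sign of a double install.

```python
import re
from collections import Counter

def count_ga_configs(html: str) -> Counter:
    """Count how many times each measurement ID is passed to
    gtag('config', ...). The same ID configured more than once
    usually means a duplicate install and double-counted traffic."""
    pattern = r"gtag\(\s*['\"]config['\"]\s*,\s*['\"]([^'\"]+)['\"]"
    return Counter(re.findall(pattern, html))

# Hypothetical page where two people each pasted the GA snippet:
html = """
<script>gtag('config', 'G-ABC123XYZ');</script>
<!-- ...added later by someone else, unaware it was already there... -->
<script>gtag('config', 'G-ABC123XYZ');</script>
"""

for measurement_id, n in count_ga_configs(html).items():
    if n > 1:
        print(f"{measurement_id} is configured {n} times - traffic will be double counted")
```

This regex is deliberately simplistic; in practice a browser extension is more reliable, since tags injected through Google Tag Manager won't appear in the static HTML at all.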
[00:04:04] Now, there are some easy tools out there to make sure. One of the things that we use is the Google Tag Manager and Google Analytics Chrome extensions, which will actually record all of the Google Analytics instances on your page. They'll show you whether you have GTM, whether you have Universal Analytics or GA4, and they'll show you duplicates. So they'll say, this is on here a couple of times. It's an invaluable tool just to make sure, you know, that you're checking pretty regularly that somebody didn't inadvertently go and put the code on the site twice. Another thing to double-check with your implementation is how things are categorized, for instance, your marketing programs.
[00:04:47] Make sure that there's a consistent hierarchy to your UTM codes in Google, for instance: that the campaigns are called the same thing, that your ad groups are the same thing, that the sources are the same thing. Create a document, create a structure that outlines how you're going to talk about your marketing programs and how you're going to track them, because that way, when you go into your reporting, you don't just have a helter-skelter assortment of different campaigns. Was that a Facebook ad? Was it organic Facebook? What campaign did that fall under? Making sure that you have a taxonomy for your UTM tags and your marketing programs is super important, especially so that if somebody else comes into your reporting who doesn't know anything about it, they can understand the structure.
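One way to enforce a taxonomy like that is to generate tagged links from a small script instead of typing them by hand. This is just an illustrative sketch; the allowed sources and mediums below are hypothetical placeholders for whatever your own taxonomy document defines.

```python
from urllib.parse import urlencode

# Controlled vocabulary - hypothetical values; substitute your documented taxonomy.
ALLOWED_SOURCES = {"google", "facebook", "newsletter", "bing"}
ALLOWED_MEDIUMS = {"cpc", "paid-social", "email", "organic-social"}

def build_utm_url(base_url: str, source: str, medium: str, campaign: str) -> str:
    """Build a tagged URL, rejecting values outside the agreed taxonomy
    so 'Facebook', 'fb', and 'facebook' can't all end up in your reports."""
    source, medium, campaign = source.lower(), medium.lower(), campaign.lower()
    if source not in ALLOWED_SOURCES:
        raise ValueError(f"unknown utm_source: {source}")
    if medium not in ALLOWED_MEDIUMS:
        raise ValueError(f"unknown utm_medium: {medium}")
    params = urlencode({"utm_source": source, "utm_medium": medium,
                        "utm_campaign": campaign})
    return f"{base_url}?{params}"

print(build_utm_url("https://example.com/sale", "Facebook", "paid-social",
                    "Spring-2023"))
# https://example.com/sale?utm_source=facebook&utm_medium=paid-social&utm_campaign=spring-2023
```

Because the script lowercases everything and rejects unknown values, the campaign report stays consistent no matter who builds the link.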
[00:05:31] It also helps to make sure that that data is consistent over time. That you know what marketing programs you're talking about in your reporting. Same thing for products. Making sure that the products are categorized correctly, making sure that the product names are coming through correctly. That's really going to help your merchandising reporting if you're a retail company to make sure that you know you're reporting on all of those metrics consistently.
[00:05:58] So that you understand what products and categories are selling. The same thing with page names and titles and that type of stuff. Make sure that it's clear and concise so that you can report on those easily, and that it's not a jumbled mass like we do see with some clients, unfortunately. The other thing in line with the implementations that is critical, especially for larger companies, is to make sure that you're filtering your employees out of your data. I know I've worked at companies in the past where, when you log onto your computer at the office, it automatically goes to the website, and you may have a call center or a customer service center that uses the website to actually walk people through trying to find stuff, to help them, or in some cases even place orders.
[00:06:44] So that's your own traffic, and you're not interested in your own traffic. You're interested in what your customers are doing, so make sure that you've set up the correct filter to block your internal IP addresses so that you're not reporting on your inside data, your employee data, which will inflate your user numbers.
[00:07:09] Sometimes it'll inflate bad things, like it'll lower your conversion rate, for instance, because they're not necessarily buying on the site, they're just landing on the pages and looking at them or whatnot. Now, the thing to be careful about here is making sure that you don't set up the wrong IP addresses.
[00:07:25] This is where it gets tricky, because sometimes, and we've seen this, people think that they're setting up a filter for their company's IP address when actually their network is using an array of IP addresses, and you might be inadvertently blocking part of Spectrum or part of a local distribution company or something.
[00:07:52] The problem there is that you're now blocking a whole bunch of people that could be the potential customers that aren't your employees. So it's something that you want to be careful with. I would err on the side of not filtering your employees if you think there's a chance that you're going to filter real customers because you don't want to lose that data, especially if you don't have a call center.
[00:08:15] If employees aren't visiting the site all the time, it may not be problematic; that's a decision that you'll have to weigh. But just realize that on the one side, it can throw off your data by showing too many visitors to the site if a whole bunch of employees are coming. Conversely, you could be eliminating a whole bunch of customers when leaving that internal traffic unfiltered wasn't that big a deal in the first place.
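Before applying an IP filter, it's worth checking exactly which addresses the range you're about to block covers. Python's standard `ipaddress` module makes that easy to sanity-check; the office network below is a hypothetical example range (from the TEST-NET-3 documentation block), not a real one, so confirm the actual range with your IT team.

```python
import ipaddress

# Hypothetical office network - verify with IT before filtering in your analytics tool.
office_network = ipaddress.ip_network("203.0.113.0/28")

def is_internal(ip: str) -> bool:
    """True if a visitor IP falls inside the range the filter would block."""
    return ipaddress.ip_address(ip) in office_network

print(f"This filter blocks {office_network.num_addresses} addresses")  # 16
print(is_internal("203.0.113.5"))    # True  - inside the /28, an employee
print(is_internal("203.0.113.200"))  # False - outside, a real customer
```

If that address count is far larger than the number of machines your company actually has, the range is too broad and you're probably about to filter out real customers.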
[00:08:37] Another thing that we see a lot is where disparate groups within a company will use different data sources and or different tools to report. It's extremely important that at the end of the day you have, especially if you're reporting, a bunch of you are consolidating reports to report to an executive team or the board or a client, that you have a single source of truth.
[00:09:04] What that means is that it's a consistent platform. It's a single platform that is measuring things the same way. For instance, you may have Google Analytics installed on your website. You may also have Adobe Analytics installed, and you do not want to use one of those and then have somebody use the other, because those metrics are never going to match.
[00:09:28] Two implementations of Google Analytics on the same site, under two different accounts, will give you different metrics even between the same tool, because we've tested this: it depends on how they're implemented, where on the page they are, all of that stuff. So that is problematic. The other is that, for instance, if you're reporting clicks from Google Ads or you're reporting clicks from Facebook, those are not going to match up with visits to the site.
[00:09:56] They're not going to match up to users on the site because there's inevitably going to be more clicks in that third-party system than people that actually made it and got recorded on the site. So trying to match up those numbers is never going to quite work. What you want to make sure that you're doing is that you're looking at one source of data.
[00:10:17] In line with that is having the same consistent data definitions. When people talk about traffic to the site, for instance, are they talking about users or are they talking about sessions? It's consistent definitions, and setting criteria for those, to make sure that if you're working with a third-party agency or with other teams internally, you're not cross-reporting. That you're using the same definitions and the same metrics so that your reporting matches up. This becomes particularly important when you're doing things like conversion rates. If you're calculating conversion rates based on users, your conversion rate is going to look larger than if you're doing it by sessions.
[00:11:03] You have a lot more sessions than you have unique users and so you know that's going to throw off that data. Making sure that everybody on your team is using consistent definitions about what conversion rate is and what metrics they're using for traffic is going to make sure that you're all reporting apples to apples.
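A quick illustration with made-up numbers shows how much the choice of denominator matters:

```python
# Illustrative numbers - one hypothetical month of traffic.
users = 8_000
sessions = 12_000   # more sessions than users, because people come back
orders = 240

cr_by_sessions = orders / sessions
cr_by_users = orders / users

print(f"Conversion rate by sessions: {cr_by_sessions:.1%}")  # 2.0%
print(f"Conversion rate by users:    {cr_by_users:.1%}")     # 3.0%
```

Same site, same orders, but one team reports 2.0% and the other reports 3.0% unless everyone agrees on the denominator up front.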
[00:11:21] The other thing to make sure that you're doing, depending on your marketing program, is reporting on the same attribution models. What that means, and we have a video that we'll link somewhere, is that you want to look at how your marketing programs are getting credit for conversions.
[00:11:41] Attribution looks at different models, like last click, or in GA4, where the default is now an AI data-driven model that Google uses to determine which marketing program gets credit for that customer and for that sale or that lead. Your attribution windows can differ, too. For instance, some affiliate programs will use a 90-day attribution window to give credit to an affiliate, because that makes the program look better, and affiliates are more likely to get paid with a 90-day window. Most marketing programs, by default, especially in Google Analytics, use 30 days. If one person is reporting on a 90-day window and another is reporting on 30, they're essentially going to be stealing credit from each other for those programs, and it's not malicious, I know. Well, maybe it is, but the numbers aren't going to match up, and you need to make sure that those are the same when you're deciding which marketing program is working correctly. That you're looking at those different attribution models to see whether one program is working better than another and where you should allocate your funds.
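Here's a small sketch, with invented customer journeys, of how the lookback window alone changes how many conversions a channel gets credit for:

```python
from datetime import date, timedelta

# Hypothetical journeys: (last affiliate click, purchase date)
journeys = [
    (date(2023, 1, 10), date(2023, 1, 25)),   # 15 days  - inside both windows
    (date(2023, 1, 1),  date(2023, 2, 20)),   # 50 days  - 90-day window only
    (date(2023, 1, 5),  date(2023, 4, 20)),   # 105 days - credited by neither
]

def credited(journeys, window_days):
    """Count conversions the channel gets credit for under a lookback window."""
    return sum(1 for click, sale in journeys
               if (sale - click) <= timedelta(days=window_days))

print("30-day window credits:", credited(journeys, 30))  # 1
print("90-day window credits:", credited(journeys, 90))  # 2
```

Same customers, same clicks, but the affiliate report claims twice as many conversions as the Google Analytics report purely because of the window setting.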
[00:13:05] Seasonality is something that sometimes people forget about and they're reporting particularly if their data is new or if they don't have, for instance, Google Analytics four that hasn't been implemented for more than a year or something like that. People tend to focus on immediate data.
So this week's performance. What we see a lot of times is that you'll compare this week's performance with the prior week's performance, when in fact you should also be looking at the prior year, because that's going to take seasonality into consideration. You may have a drop in traffic, and people might be freaking out.
[00:13:44] You may have a drop in sales, but is it a drop in sales that typically happens around this time of year? Is it something that is seasonal? Is it because of an event like a holiday or a Super Bowl, or you name it? By looking year over year, you're going to be able to understand those trends better than this myopic view of just that week that you're looking at the week prior.
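With made-up weekly numbers, you can see how the two comparisons tell different stories:

```python
# Hypothetical weekly session counts.
this_week = 9_500
last_week = 11_000
same_week_last_year = 9_200

wow = (this_week - last_week) / last_week                      # week over week
yoy = (this_week - same_week_last_year) / same_week_last_year  # year over year

print(f"Week over week: {wow:+.1%}")   # -13.6%, looks alarming
print(f"Year over year: {yoy:+.1%}")   # +3.3%, a normal seasonal dip, actually up
```

The week-over-week view screams emergency; the year-over-year view says this dip happens every year at this time and you're actually trending up.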
[00:14:11] In line with that, you also want to make sure that you're extending your data far enough. You want to look at something more than just this week, because a drop in traffic this week, when you're only comparing that short period of time, may look dramatic but if you actually extend that data out and look at the larger view of the data, you may see that you're trending in the right direction.
[00:14:33] This is just a tiny little blip, that in a larger scheme of things doesn't even register as a drop in traffic. I've seen that where people kind of start panicking. They'll be like, oh my gosh sales are down, let's send an email. Let's do something to the marketing program and if you actually take a step back, it's like, no, this is just a normal little variation.
[00:14:54] That could be caused by weather, it could be caused by anything. That is not as dramatic as it seems when you're super focused on just that weekly or daily data. We see clients sometimes that will have an hourly chart of traffic and they'll get worked up about it. Sometimes that real-time traffic, it can be exciting but you know, if you take that step back, it may not be as big a deal as you think. We've talked about it in other videos, but reporting on vanity metrics is also something that we see a lot. Getting caught up on stuff that's not actually actionable. It could be something like, for instance, Facebook likes. Did that post get a lot of response?
[00:15:39] You know, is it trending? Is it whatever? Right. That can be great from a brand awareness perspective, but is it really actionable? You want to get your dashboards and your reporting to focus on stuff that is more actionable. What can you make decisions about? About your budget and about things that you change on the site or what you can test and not get caught up in those vanity metrics.
[00:16:04] We did a video about vanity metrics where we talked about it in a little bit more detail, so we'll put a link for that below as well. But just keep in mind that you want to try to focus on data that you can do something about. Because if you can't do anything about it, it's only interesting, it's not actionable.
[00:16:24] Analysis paralysis is something that we see with a lot of companies where you have just tons of data. You're collecting data about everything and you're tracking every possible event on a site, every click, everything that a person does, and it's just a heap of data. When you have too much data, you're not looking at what is actually important for making decisions. You tend not to make decisions because you don't know where to start, and that's the paralysis part. You have so much data and so many reports that people don't know where to look. They're not actually getting an analysis of what is important.
[00:17:12] They're not seeing key metrics that are actionable. It's just a sea of data about interesting stuff. Hey, people are clicking here on the page. That's great. What does it tell me? What can I get from it? What can I glean from that to either make changes to the website or to the marketing programs? Just where they're clicking is not in and of itself important. There are lots of different metrics like that. For instance, you may have a lot of demographic detail. That's great if you know how to pull it out of the reporting and out of the data and actually do something with it to analyze your audiences and make decisions based on that.
[00:17:58] Otherwise, that data's just sitting there, and nobody knows where to look. I've seen this where, on a weekly basis, companies are producing these Excel spreadsheets that are just tab after tab after tab, a sea of data with lines and lines, rows and columns of data.
[00:18:15] You may have some charts and stuff thrown in, but it is overwhelming. What happens is people's eyes glaze over. They get that report, they don't really look at it, they don't know how to react to it, and so they start ignoring it, and you have data paralysis. This is where you need to have somebody who can take that, distill it to what is important, and make arguments for why you don't necessarily need to spend all of your time doing reporting that nobody's looking at.
[00:18:49] Instead, spend that time doing a deep dive into the analysis to make decisions about your programs. That leads into another issue that we see, related to data paralysis, which is just people not using their data. I can't tell you how many clients we have that think they have Google Analytics installed, or they have another tool installed.
[00:19:13] We've had customers who have Adobe Analytics on their site and if you're familiar with Adobe Analytics, it is not an inexpensive tool. It's actually a very expensive tool and it's a great analytics platform but nobody in the company knows how to use it because maybe somebody that had championed it before had left the company or through layoffs or something like that.
[00:19:39] They don't have a team that knows how to use that tool. What happens is somebody comes in and puts Google Analytics on the site instead, because that's what they're used to. Now they have all of this old stuff hanging out there that nobody ever looks at, and really only one tool on the site that they use; they can kind of look at traffic and other stuff, but they don't really know what to make of it.
[00:20:04] This is where it would really help to have someone come in and create usable dashboards for you that you're going to look at on a regular basis. This is one of the reasons why we've started using, for instance, what was previously called Google Data Studio, which is now Looker Studio.
[00:20:23] We use that to create dashboards for business leads and executives so that they don't have to learn a tool. They don't have to go in and learn Google Analytics and try to find the data that they're looking for and poke around because maybe the tool changed. This is going to become increasingly a problem if you look at Google Analytics 4, which does not have a user-friendly interface. You have to really dig around to find the data that you're looking for, and there are a lot of executives that just won't take the time to do that. So they're not going to look at the data.
[00:21:03] I'm sure you've experienced it too, where even if you send data to an executive, did they even look at it? They maybe glanced at it, but did they actually take time to digest it? That's usually because nobody bothered to actually analyze it. So just sending a dashboard is probably going to be overlooked. If you send a dashboard with an analysis of action items and things that you're seeing and why it's important that they look at it, you're going to get more engagement. They're going to end up using that data more and actually start a conversation about what you should be doing to make those changes. One of the things that we always like to do is not just lob a dashboard at somebody.
[00:21:50] We create dashboards for clients all the time in Looker Studio that are available to them 24 hours a day, and they never, ever access them unless we make an analysis point about that data and reference it, and then maybe they'll go look. That's the way you get the conversation going and get people using the data: by pulling findings from it that are useful to them.
[00:22:17] This is just scratching the surface of some of the stuff that can go wrong, or mistakes that people are making in their data and reporting. I'd love to hear from you guys. What are some of the things that you run into that are challenges for you from a data perspective, in your reporting, or in dealing with executive teams or marketing teams, and what have you done to solve them?
[00:22:37] There are lots of data tools out there, lots of reporting tools and platforms, and all of them tend to have some of the same problems with audience usage, implementation, and consistency in definitions. Hey, if this was helpful, give us a thumbs up, and let us know in the comments if there's stuff that you are seeing or that you would like to see more of, and hopefully we'll talk to you soon. Thanks.
info@apothecadigital.com