We recently consulted with a long-time client who was concerned that his conversion rates had slipped considerably and his overall bounce rate had climbed sharply in recent months. To begin looking for a cause, we took a deep dive into his Google Analytics data.
We discovered that the client’s traffic from visitors using Internet Explorer 7.0 had soared by over 2,600% from July through November 2014 compared to the same period the previous year. The bounce rate for that browser had also jumped from 34% to 98%. We would have expected increases in the use of the later versions of Internet Explorer released during that time, but not such a dramatic jump in the number of visitors reportedly using a badly outdated version of the browser, one known for its security holes.
Some quick research and conversations with other clients told us that this was not an isolated problem. In fact, the issue had been noticed across the web and was – at least initially – blamed on a wave of new bots visiting sites in the AdRoll network using IE7. The thinking was that the ad networks registered the bot visits as legitimate users; the bots then clicked on ads and followed them back to the advertisers’ sites, driving up impressions and – consequently – inflating ad prices and click counts.
Of course, the understandable reaction would be to leave the AdRoll network. But once it discovered the problem, AdRoll stopped serving ad impressions to IE7 browsers, which it estimated to make up just one percent of its network traffic. Compare that to our client’s Analytics, which showed 45% of web traffic from July through November of this year coming from IE7, up from only three percent the year before – the difference is astounding.
While the AdRoll switch helped a bit, we still suspected that the bulk of the IE7 traffic was coming from bots.
Finding the Right Solution
We presented a few options to the client on how best to filter out the bot traffic that was skewing his performance results.
- We could redirect IE7 traffic to a different domain. Any legitimate IE7 customers would be lost.
- We could redirect IE7 traffic to a new page on the existing domain encouraging IE7 users to upgrade their browser, providing a link to download the latest IE version. This option would then require setting up a filter in Google Analytics to exclude visits to that page.
- We could create a separate view in Google Analytics using Google’s known bot filter.
The client’s audience is older than average and has a tendency to use Internet Explorer over other browsers, often ignoring upgrades. Losing potential sales to legitimate customers isn’t desirable for anyone, so option 1 was out of the question. Option 2 might have worked – at least for a while – but it wouldn’t eliminate bot traffic that was not coming from IE7. Option 3 was the best choice for this client, and it was very simple to implement. Since Google is constantly updating its database of known bots as it finds them, most future issues with bot traffic will also be addressed without any further involvement on our part or our client’s.
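For context, option 2 would have hinged on reliably identifying IE7 from each request’s user-agent string. Here is a minimal sketch of that check in Python – the helper name and sample strings are our own illustration, not code from the client’s site:

```python
def is_ie7(user_agent: str) -> bool:
    """Return True for user agents that claim to be genuine IE7.

    Note: IE8 and later running in Compatibility View also report
    "MSIE 7.0" but add a "Trident/x.y" token, so we exclude those
    to avoid redirecting visitors on newer browsers.
    """
    return "MSIE 7.0" in user_agent and "Trident" not in user_agent


# Typical user-agent strings for comparison:
genuine_ie7 = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)"
ie8_compat_view = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; Trident/4.0)"

print(is_ie7(genuine_ie7))      # True
print(is_ie7(ie8_compat_view))  # False
```

A server could use a check like this to issue the redirect in option 2 – though of course it cannot distinguish a bot that truthfully reports an IE7 user agent from a human visitor, which is part of why the option fell short.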
Setting Up a Bot-Filtered View in Google Analytics
We recommend creating a separate bot-filtered view – rather than enabling the filter on your existing view – because a new view reports no traffic data from before the date you create it. By keeping the original view, you can simply switch between views to access your historical account data. The instructions that follow assume that you have sufficient privileges in the Analytics account in which you want to create the additional view.
To set up a new, bot-filtered view in Google Analytics, just log into your Analytics account:
- Click Admin at the top of Analytics, then View Settings at the top of the View (right) column of the Admin section.
- Click Copy View (so you can create a copy while still retaining the original view):
- Give the new view a name that’s easy to recognize and then click Copy View. We used Exclude Bots.
- Analytics then displays the Admin area for the new view, so you’ll need to click View Settings again.
- Scroll down the page and check the box under Bot Filtering (labeled Exclude all hits from known bots and spiders):
- Click Save to save the new view.
Then, whenever you use Analytics, you can switch between views: go back to the Admin section, choose the new or old view in the View column, and click the Reporting link to see that view’s data.
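For teams managing many properties, the same toggle can also be set programmatically: the Google Analytics Management API’s view (profile) resource exposes a botFilteringEnabled field. Here is a hedged sketch in Python – the account and property IDs are placeholders, and the API call itself is shown commented out since it requires authorized credentials:

```python
def bot_filtered_view_body(view_name="Exclude Bots"):
    """Build the request body for a new Analytics view (profile)
    with Google's known-bot filter switched on."""
    return {
        "name": view_name,
        "botFilteringEnabled": True,  # same setting as the Bot Filtering checkbox
    }


# With an authorized Management API v3 client (google-api-python-client),
# the view could then be created roughly like this:
#
#   service.management().profiles().insert(
#       accountId="12345678",            # placeholder account ID
#       webPropertyId="UA-12345678-1",   # placeholder property ID
#       body=bot_filtered_view_body(),
#   ).execute()
```

The UI steps above remain the simplest route for a single account; the API route only pays off at scale.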
The End Result
The end result is a more accurate picture of the traffic visiting your website. This was the best solution for this client. Admittedly, it’s not perfect. Any bots not yet identified by Google are not going to be filtered out. But the more accurate the data, the more useful it is for drawing a clear picture of what’s happening with your website. And trying the simplest fix first makes a whole lot of sense.
As Director of Paid Search, Leslie Lewis leads the agency’s efforts in all online marketing, overseeing campaign management and strategy for search, social and mobile media buys.