Google Analytics Can Now Exclude Traffic From Known Bots And Spiders

Google made a small but important update to Google Analytics today that finally makes it easy to exclude bots and spiders from your user stats. Traffic from search engine crawlers and other web spiders can easily skew your data.

Unfortunately, Google is only filtering out traffic from known bots and spiders, even though generating fake traffic through bot networks is big business and, according to some reports, accounts for almost a third of all traffic to many sites. It’s using the IAB’s “International Spiders & Bots List” for this, which is updated monthly. If you want to know which bots are on it, though, you will have to pay somewhere between $4,000 and $14,000 for an annual subscription, depending on whether you are an IAB member.

Once you have opted in to excluding this kind of traffic, Analytics will automatically start filtering your data by comparing the user agent of each hit to your site against the known user agents on the list. Until now, filtering this kind of traffic out was mostly a manual and highly imprecise job. All it takes now is a trip into Analytics’ reporting view settings to enable this feature and you’re good to go.
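For a rough sense of what this kind of user-agent matching amounts to, here is a minimal sketch in Python. The bot list and sample hits below are made up for illustration; the real IAB list is a paid product whose contents aren’t public, and Google’s actual matching logic isn’t documented here.

```python
# Hypothetical list of known bot/spider user-agent substrings.
# The real IAB "International Spiders & Bots List" is paid and not public.
KNOWN_BOT_USER_AGENTS = [
    "Googlebot",
    "bingbot",
    "AhrefsBot",
]


def is_known_bot(user_agent: str) -> bool:
    """Return True if the hit's user-agent string matches a known bot or spider."""
    ua = user_agent.lower()
    return any(bot.lower() in ua for bot in KNOWN_BOT_USER_AGENTS)


# Hits from known bots are dropped before they reach the reports.
hits = [
    {"user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
    {"user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
]
human_hits = [h for h in hits if not is_known_bot(h["user_agent"])]
print(len(human_hits))  # 1 -- only the non-bot hit remains
```

The same idea, applied against a much larger and regularly updated list, is what the new setting does for you automatically.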

Depending on your site, you may see some of your traffic numbers drop a bit. That’s to be expected, though, and the new numbers should be somewhat closer to reality than your previous ones. Chances are good that they’ll still include fake traffic, but at least they won’t count hits to your site from friendly bots.