Most web sites flooded by alien visitors

Study shows 'good' and 'bad' bots combine to make up 51% of typical web traffic

Incapsula's report claims most site owners don't know how much of their traffic is non-human because Google Analytics, the most common traffic monitor for small- and mid-sized sites, doesn't distinguish between 'bot and human visitors.

The truth is a little more subtle.

Google Analytics collects visitor data using JavaScript, which most web crawlers and other "good" search-engine 'bots don't execute. As a result, Google Analytics typically doesn't count visits from Googlebot or other search-engine crawlers at all, according to web-performance service Yottaa.
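For illustration, here is a minimal, TypeScript-flavored sketch of the kind of page tag Google Analytics depends on. The property ID is a placeholder, and the point is simply that a crawler that never executes the page's JavaScript never triggers the call that records the visit.

```typescript
// Sketch of a Google Analytics page tag (analytics.js style).
// "UA-XXXXX-Y" is a placeholder property ID, not a real account.
declare function ga(command: string, ...fields: unknown[]): void; // global created by analytics.js

ga("create", "UA-XXXXX-Y", "auto"); // initialize the tracker for this property
ga("send", "pageview");             // record the visit; a crawler that skips JavaScript never runs this line
```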

Plenty of "bad" 'bots are designed to run JavaScript on and accept cookies from the pages they visit, making them harder to identify and allowing them to be counted under Google Analytics' default settings.

It is possible to filter out traffic from bad 'bots using the scripting and filtering tools built into Google Analytics, the same tools routinely used to separate visits from a site's own developers and staff from external "real" traffic.

Filtering 'bots is tricky because of the variation among them, not because the filters themselves are hard to create. Yottaa, Google and other web-management-advice sites offer plenty of tips and guidelines, however.
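As a rough illustration (not Incapsula's or Google's own method), one client-side approach is to skip sending the Google Analytics pageview when the visitor's user agent matches a known crawler pattern. The pattern below is an assumed, deliberately simple example; it only catches 'bots honest enough to identify themselves, and it complements rather than replaces Google Analytics' built-in view filters.

```typescript
// A minimal sketch: suppress the pageview for self-identified crawlers.
// The regular expression is an illustrative assumption, not a complete list.
declare function ga(command: string, ...fields: unknown[]): void; // global provided by analytics.js

const KNOWN_BOT_PATTERN = /bot|crawler|spider|slurp/i; // assumed, illustrative pattern

function sendPageviewIfHuman(): void {
  if (KNOWN_BOT_PATTERN.test(navigator.userAgent)) {
    return; // looks like a crawler that does execute JavaScript; don't record the hit
  }
  ga("send", "pageview"); // otherwise record the visit as usual
}

sendPageviewIfHuman();
```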

Keeping ahead of the 'bots poses the same problem as keeping ahead of spammers, who change their packages and payloads to avoid filters just as quickly as new filters are created to stop them.

There are plenty of black-hat SEO tools, service companies, bogus-traffic generators and hacking tools available that let even relatively non-technical competitors create malicious 'bot traffic if they choose.
