Webmasters howl over Google's antispam campaign

Searchzilla's latest tinkering with its super-secret algorithm seems to have bugs that make poor judgment calls.

By John P. Mello Jr., PC World

When Google sneezes, the Internet catches a cold. Or so it seems with Searchzilla's latest tinkering with its super-secret algorithm to squeeze spam out of its search results. The company announced two weeks ago that it would be launching a new offensive on spammers, and this week it made good on its promise. There's a question, though, of whether innocent webmasters might become collateral damage in Google's quest to quell spam.

The launch of the anti-spam campaign this week was a targeted one, Google web spam boss Matt Cutts wrote today in his personal blog. Slightly more than two percent of queries were changed in some way, but less than half a percent of search results changed enough for someone to notice, he explained. "The net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site's content," he wrote.

While the changes may have a minimal impact on the eyeballs of searchers, that's not the case for some webmasters. "We saw a huge 10-20 [percent] drop in almost all positions for our biggest and oldest site," reported "drall" at the WebmasterWorld forum.

"It doesnt matter if it is all 100% unique with tons of backlinks and really well laid out or simply an image," drall continued. "Everything got whacked."

Ironically, drall reported that his out-of-date, ad-filled sites were "humming along fine."

"So whats the message here Google?" drall asked. "Write an in-depth article that takes 3 days to complete and is linked to by hundreds of companies and gov agencies and loose all positions site wide while our out of date half-baked and useless content does fine?" (sic)

Another forum member, ziajunu, said they lost 40-60 percent of the traffic to their websites after the apparent Google change.

If Google is trying to push original content higher in its search results, another forum contributor said that they hadn't seen it yet. "I wrote about our problems a few months ago with regard to original content," explained "Spencer." "We had stopped writing new onsite content because our rate of spidering by Google was far lower than it was by scrapers. The result was that anything we wrote would end up getting top listings on a scraper site and our pages would get nowhere fast."


Originally published on PC World.