Depending on the severity of the problem and the value you place on stopping it, you can segment your traffic, create honeypots to trap bogus visitors, and add other security-enhancing hoops through which 'bots must jump before they're either counted as legitimate traffic or given access to potentially sensitive data.
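As a concrete illustration of the honeypot idea, here is a minimal sketch using Flask (the route and field names are hypothetical, not from any particular site). The trick is a form field that's hidden from humans by CSS; naive form-filling 'bots complete it anyway, so any submission that fills it in can be logged as 'bot traffic instead of processed.

```python
# A minimal honeypot sketch, assuming Flask. Route and field names here are
# hypothetical; the point is the hidden-field technique, not this exact form.
from flask import Flask, request, abort

app = Flask(__name__)

SIGNUP_FORM = """
<form method="post" action="/signup">
  <input name="email" type="email" placeholder="Email">
  <!-- Hidden from humans via CSS; naive form-filling 'bots fill it anyway -->
  <input name="website" type="text" style="display:none" tabindex="-1"
         autocomplete="off">
  <button type="submit">Sign up</button>
</form>
"""

@app.route("/signup", methods=["GET", "POST"])
def signup():
    if request.method == "GET":
        return SIGNUP_FORM
    # A human never sees the 'website' field, so it should arrive empty.
    if request.form.get("website"):
        abort(403)  # count or log this as 'bot traffic rather than process it
    return "Thanks, you're signed up."
```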
Before you go to the trouble, weigh the risk involved against the cost of fixing it. Whether perfect 'bot filtration is worth the effort is up for debate, even among the black hats who build or use the tools.
Half of a typical site's web traffic may well come from non-human sources, but a security manager unfamiliar with Google Analytics filters or 'bot signatures could just as easily put aggressive 'bot blockers in place that stop human visitors while letting the non-humans through.
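The sketch below shows why signature-based blocking cuts both ways. The signature list is illustrative only, not a vetted blocklist, and the code is an assumption about how such a filter might look rather than what Google Analytics does internally: match too broadly and you block humans with odd User-Agent strings, while sophisticated 'bots spoof legitimate ones and sail through.

```python
# A sketch of signature-based 'bot detection. The signatures below are
# illustrative assumptions, not a production blocklist.
import re

KNOWN_BOT_SIGNATURES = [
    r"Googlebot",        # legitimate crawler, better segmented than blocked
    r"bingbot",
    r"python-requests",  # common default for scraping scripts
    r"curl/",
]

BOT_PATTERN = re.compile("|".join(KNOWN_BOT_SIGNATURES), re.IGNORECASE)

def looks_like_bot(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known 'bot signature.

    An empty User-Agent is suspicious but not proof: some privacy tools
    strip the header, so treating it as a 'bot risks blocking humans.
    """
    if not user_agent:
        return True  # the aggressive choice; see the caveat above
    return bool(BOT_PATTERN.search(user_agent))

# Usage: segment traffic into buckets rather than blocking outright.
for ua in ["Mozilla/5.0 (Windows NT 10.0; Win64; x64)", "python-requests/2.31"]:
    bucket = "bot" if looks_like_bot(ua) else "human"
    print(f"{bucket}: {ua}")
```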
The key is not to conceal a site from 'bots or to block every HTTP request that hasn't proven it came from a human. The key is to know the difference between 'bot traffic and human traffic, and to know how much effort is justified in blocking either.