Incapsula's report claims most site owners don't know how much of their traffic is non-human because Google Analytics, the most common traffic monitor for small- and mid-sized sites, doesn't distinguish between 'bot and human visitors.
The truth is a little more subtle.
It is possible to filter out traffic from bad 'bots using the scripting and filtering tools built into Google Analytics – the same tools routinely used to exclude visits from a site's own developers or staff so they don't pollute the "real" external traffic numbers.
Filtering 'bots is tricky – not because the filters themselves are hard to create, but because of the sheer variation among the 'bots. Yottaa and other web-management-advice sites offer plenty of tips and guidelines, as does Google itself.
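The basic idea – excluding likely non-human visitors by pattern, much as staff traffic is excluded by IP – can be sketched in a few lines. The user-agent substrings below are illustrative assumptions only, not a real filter list; production bot detection relies on many more signals than the User-Agent header:

```python
import re

# Illustrative only: a few substrings commonly seen in automated clients'
# User-Agent strings. Real 'bots frequently spoof browser UAs, so this
# catches only the polite ones.
BOT_PATTERNS = re.compile(r"bot|crawler|spider|scraper|curl|wget", re.IGNORECASE)

def is_probable_bot(user_agent: str) -> bool:
    """Flag a request as likely non-human based on its User-Agent string."""
    return bool(BOT_PATTERNS.search(user_agent))

# Hypothetical sample of logged User-Agent strings.
requests = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "curl/7.88.1",
]

# Keep only traffic not flagged as a 'bot.
human_traffic = [ua for ua in requests if not is_probable_bot(ua)]
print(len(human_traffic))
```

Run against the three sample requests above, only the plain browser User-Agent survives the filter, which is exactly why the approach breaks down against 'bots that deliberately masquerade as browsers.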
Keeping ahead of the 'bots poses the same problem as keeping ahead of spammers, who change their packages and payloads to avoid filters just as quickly as new filters are created to stop them.
There are plenty of Black Hat SEO tools and service companies, bogus-traffic generators and hacking tools that let even relatively non-technical competitors create malicious 'bot traffic if they choose.