This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
Between 30% and 70% of the traffic to most websites comes from bots, meaning it is non-human traffic. And while many assume bot traffic should be blocked, that is a black-and-white approach to a problem that is very much grey. The reality is that some bots are good, some bots are bad, and most fall somewhere in between. What you need is a bot management tool that lets you apply a range of management actions based on your website's business model.
Consider the airline industry. A typical airline bookings site might see 50% of its traffic coming from human visitors, with the other 50% coming from bots. While the first thought is often to block the bot traffic, a more effective approach is to understand why the bots are here and what the impact is on the business.
The airline industry is highly competitive, with airlines competing on schedules and prices across thousands of routes every day. Flight schedules and prices vary widely from one day to the next and fluctuate rapidly with supply and demand: prices go up as any given plane fills, and vice versa. Consumers are trying to get the best price, competitors are trying to win your customers, and partners are trying to help customers find the best deals, and all of them depend on continuous access to the latest data.
Providing this data access comes with a cost. Most airlines don’t actually store their flight schedules and prices in house. They use a global distribution system (GDS), like Amadeus or Sabre. Every time someone looks up flight data, that entails a call to the GDS. And every call to the GDS, whether it’s from a human or a bot, incurs a cost, which can be substantial over large volumes – on the order of hundreds of thousands of dollars every month. With consumers and partners, the GDS cost is built into the cost of sales. With competitors, however, it’s just a lost cost without any potential benefit.
All of this is business context that has to be considered in a conversation about bots, because it helps us determine the technical requirements of a bot management solution. First, we know that blocking bots won't work. The airline industry is a $700 billion global industry. Both competing airlines and the competitive intelligence services that sell to them have a tremendous financial incentive to get the flight data and capture a larger percentage of that revenue. Blocking their bots doesn't make them go away. Blocking just lets them know that you've detected their bot, and that simply triggers them to evolve the bot to evade your detections.
Second, distinguishing between good and bad bots can be extremely difficult, if not impossible. While you can always whitelist bots from a small number of authorized partners, the travel industry has many smaller search engines and travel sites. These smaller sites fall into a grey area of unauthorized partners that, while they may not have a formal relationship with airlines, still serve as a potential channel for selling to customers. This means that blocking unknown bots will likely catch these unauthorized partners and have an unexpected impact on sales.
So how do you solve the business problem in a meaningful way while remaining cognizant of the fact that many unknown bots may contribute to sales? Bot management needs to move beyond blocking and enable organizations to apply a wider range of management actions or bot responses.
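To make this concrete, a graduated policy can be sketched as a simple mapping from a bot classification to a management action. The categories and action names below are purely illustrative assumptions, not the API of any real bot management product:

```python
# Hypothetical policy table: classification -> management action.
# Both the categories and the actions are illustrative stand-ins.
POLICY = {
    "verified_partner": "allow",        # whitelisted, full live access
    "known_good_crawler": "allow",      # e.g. major search engines
    "grey_unknown": "serve_cached",     # possible unauthorized partner
    "aggressive_scraper": "throttle",   # slow down without revealing detection
    "hostile": "block",                 # clearly malicious automation
}

def choose_action(classification):
    """Default to the cautious middle ground for anything unrecognized."""
    return POLICY.get(classification, "serve_cached")

print(choose_action("grey_unknown"))       # serve_cached
print(choose_action("never_seen_before"))  # serve_cached
print(choose_action("hostile"))            # block
```

The key design point is the default: an unknown bot falls into the grey middle rather than being blocked outright, so unauthorized partners keep contributing to sales.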
For example, in the airline industry, the bot traffic itself is not the primary problem. The primary problem is the high cost of GDS lookups made by bots visiting the bookings site. You can alleviate this problem without trying to block bots en masse by using data stored in cache to respond to bot requests for flight schedules and prices. By serving bots with cached data we can dramatically reduce the number of calls being made to the GDS without alerting the bot operator that we’ve detected its bot. This helps minimize the high business costs associated with bots, while slowing down the rate of bot evolution and without shutting the door on incremental customers and sales revenue through unauthorized (and unknown) partners.
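A minimal sketch of this caching approach might look like the following. The classifier, the GDS client, and the 10-minute freshness window are all stand-in assumptions for illustration, not real airline or GDS APIs:

```python
from time import time

CACHE_TTL_SECONDS = 600   # assumption: bots can tolerate slightly stale fares
_fare_cache = {}          # route -> (timestamp, fare_data)
gds_calls = 0             # counts the expensive lookups we want to minimize

def gds_lookup(route):
    """Stand-in stub for a paid call to a GDS such as Amadeus or Sabre."""
    global gds_calls
    gds_calls += 1
    return {"route": route, "price": 199.00}

def get_fares(route, visitor_type):
    """Serve humans live data; serve suspected bots from cache when possible."""
    if visitor_type == "suspected_bot":
        cached = _fare_cache.get(route)
        if cached and time() - cached[0] < CACHE_TTL_SECONDS:
            return cached[1]      # cache hit: no GDS call, no incremental cost
    fare = gds_lookup(route)      # live lookup for humans (and cache misses)
    _fare_cache[route] = (time(), fare)
    return fare

# One human request primes the cache; a burst of bot requests then costs nothing.
get_fares("JFK-LHR", "human")
for _ in range(100):
    get_fares("JFK-LHR", "suspected_bot")
print(gds_calls)  # 1 GDS call instead of 101
```

Crucially, the bot still receives a plausible response, so the operator has no signal that it has been detected and no reason to evolve its evasion techniques.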
The situation above is by no means limited to the airline industry. The line between good and bad bots is increasingly blurred in most industries, making it difficult for any organization to employ a black-and-white approach like blocking.
Online fashion retail provides another fascinating example. Most fashion retailers today sell directly to consumers through their e-commerce sites, in addition to brick-and-mortar stores. With a B2C model, it is always tempting to think you could just block non-human visitors to your e-commerce site, under the assumption they don’t contribute to sales.
What this assumption misses is that consumer behavior is constantly changing, and how those consumers interact with your online brand and website is constantly changing as well. Consumers today no longer buy the majority of their clothing from a single brand, but instead cross-shop multiple brands. In addition, many are looking for an introduction to new and different brands. This means consumers are actively going to third parties for new ideas and recommendations.
Consider Fashiola. Fashiola is one of a new breed of specialized search engines focusing on a niche of the search market – in this case, online fashion retailers. While likely not yet on the radar of major brands – especially outside of Europe – Fashiola helps connect millions of users with over 1,800 different brands, representing a potential channel to reach consumers. If you were in the market for a pair of blue suede shoes, for example, you could just go to Fashiola, type in “blue suede shoes” and view the relevant offerings from over 1,800 brands.
Just as you would expect, Fashiola keeps up with the changing products and prices across the brands they index using automated bots. Fashiola, and other sites like it, prove that employing a simple black-and-white approach to bots is no longer possible. The traditional approach of blocking bots by default, while whitelisting a few known good bots, assumes that you know all the bots that are good. But when the number and diversity of good bots are changing daily, organizations can’t possibly keep track of them all. A default posture of blocking bots risks missing rapidly changing consumer behavior and not participating in new business models, thereby shutting the door on incremental sales.
When it comes to bots, context matters. Consumers, partners, competitors, and others are interacting with you and your website in new and different ways than before. Because of that, blocking bots as a default strategy is more likely than ever to hurt the business – and in unpredictable ways.
Good, bad, or somewhere in the grey, managing bots should start with the business context in which they operate and the role they play in your online strategy. By deploying more appropriate management actions, you can harness the positive ways in which bots can help your business, while minimizing the negative ones.
This story, "Your strategy for dealing with web bots has to take into account business context" was originally published by Network World.