There is a ruthless, ongoing arms race for data taking place out there, and it is affecting competition in business in a huge way, particularly in relation to pricing. Your average bookshop might adjust its prices every week or two – often much less frequently on the bulk of its titles. For the big online retailers, pricing is an entirely different process, operating more like an ecosystem than a stocktake, with some prices updated several times a day. Bots are used not only to constantly adjust prices across a business’s own site, but also to monitor competitors’ prices. In an environment where often the only difference between one seller and another is price, even a few cents can make the difference between making a sale and not. One of the biggest challenges facing publishing today is race-to-the-bottom discounting, but to understand what’s driving it, you have to understand the online market and the value of up-to-the-minute information.
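The core logic of such a repricing bot can be surprisingly simple. The sketch below is purely illustrative – the function name, the one-cent undercut and the price floor are my own assumptions, not any retailer's actual rules – but it captures the basic move: watch a competitor's price and respond within whatever margin you can tolerate.

```python
# Illustrative sketch of a repricing rule; all names and numbers
# are hypothetical, not any real retailer's system.

def reprice(our_price: float, competitor_price: float,
            floor: float, undercut: float = 0.01) -> float:
    """Undercut the competitor by a small margin, never dropping below our floor."""
    target = round(competitor_price - undercut, 2)
    return max(target, floor)

# A competitor lists the same title at $12.99; our cost floor is $10.50.
print(reprice(13.50, 12.99, 10.50))  # drops our price to 12.98
print(reprice(13.50, 10.40, 10.50))  # competitor goes lower; floor holds at 10.50
```

Run continuously against scraped competitor data, a rule this crude is enough to trigger the cascading price drops described above: two sellers each undercutting the other by a cent will race each other straight down to their floors.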
Fending off unwanted bots trying to hoover up your site’s data is a serious matter. Those ‘CAPTCHA’ tests (an acronym for ‘Completely Automated Public Turing test to tell Computers and Humans Apart’), which use pictures of wonky type, are designed for just this purpose – most bots can’t read that stuff.
The book industry’s, and indeed the world’s, biggest online retailer illustrates this information arms race well: Reuters recently reported that the bots Walmart was using to track Amazon’s prices found themselves blocked from Amazon’s site. This was a major problem for Walmart, and a demonstration of just how sophisticated Amazon has become at both deploying these software agents and defending against them. The company has to be; about a third of the traffic on most online retail sites is composed of bots – on Amazon, it can be close to 80%.