As bad bots take up an ever-greater share of Internet traffic, are data centres providing the roads to ruin?
Bad bots accounted for 20 percent of all web traffic last year, according to new research from Distil Networks published today. With bots as a whole making up 40 percent of web traffic, the finding that half of them were ‘bad’ is a cause for concern. That concern heightens when you discover the same research found 75 percent of bad bots to be advanced enough to load JavaScript and other external resources, hold onto cookies, and persist by randomising their IP addresses, headers and user agents.
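The persistence trick mentioned above, rotating headers and user agents between requests so each one looks like a fresh visitor, can be sketched in a few lines of Python. This is an illustrative sketch only; the user-agent strings and header choices below are hypothetical examples, not drawn from the Distil Networks research:

```python
import random

# Hypothetical pool of browser user-agent strings a bot might cycle through.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/58.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12) Safari/602.1",
    "Mozilla/5.0 (X11; Linux x86_64) Firefox/53.0",
]

ACCEPT_LANGUAGES = ["en-GB,en;q=0.9", "en-US,en;q=0.8", "fr-FR,fr;q=0.7"]


def randomised_headers():
    """Build a fresh header set for each request, so successive requests
    no longer share an obvious fingerprint."""
    return {
        "User-Agent": random.choice(USER_AGENTS),
        "Accept-Language": random.choice(ACCEPT_LANGUAGES),
    }


if __name__ == "__main__":
    # Two consecutive "requests" may present entirely different identities.
    print(randomised_headers())
    print(randomised_headers())
```

Pair this with a rotating pool of proxy IP addresses and the bot becomes far harder to block with simple per-client rate limits, which is why detection vendors increasingly rely on behavioural signals instead of header matching.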
Distil Networks researchers also reveal that 60 percent of these bad bots originated from data centres, with Amazon topping the bad bot market share for the third year in a row, accounting for 16 percent of all bad bot traffic.
We know that bots include benign actors such as search engine crawlers, whose indexing processes are necessary to keep product, service and site details up to date. But what, exactly, does a bad bot do?