Most Websites Are No Match for a Bot Attack: Study

June 21, 2017 Marketing GrafWebCUSO

A new study developed by Distil Networks and the Online Trust Alliance revealed that 95% of top websites across different verticals, including financial services, are no match for advanced, persistent and damaging bots.

The survey found that while an average of 16% of websites across all industries can thwart simple bot attacks, only 5% properly protect against sophisticated attacks. In fact, the report found that of the websites tested, only 4% could detect evasive or advanced bots.

The ninth annual audit, powered in part by San Francisco-based bot detection and mitigation firm Distil, evaluated the top 1,000 websites in retail, banking, consumer services, government, news media, internet service providers and OTA members.

According to Distil, bots, used by competitors, hackers and fraudsters, are key culprits behind web scraping, account takeover, competitive data mining, online fraud, data theft, unauthorized vulnerability scans, spam, digital ad fraud, and downtime. Bots vary in volume and sophistication, but all place an increasing burden on IT security and global web infrastructure teams, wreaking havoc across operations big and small.

“While top websites do a better job protecting against simple bots, they continue to miss the mark on more sophisticated bots that can mimic human behavior,” said Rami Essaid, CEO and co-founder of Distil Networks. “Our annual Bad Bot Report found that 75% of today’s bad bots are advanced persistent bots that can either load JavaScript, hold onto cookies, and load up external resources, or randomize their IP address, headers and user agents.”

Distil tested each website included in the OTA Online Trust Audit on their ability to defend against bot attacks of different sophistication levels, including:

  • Sophisticated Bots – “Low-and-slow” bots coming in from dozens of IP addresses, using browser automation tools that hold cookies and maintain state.
  • Moderate Bots – Contain normal browser user agents and headers, coming in slowly from one IP.
  • Simple Bots – Non-browser user agents and headers, coming in fast from one IP.
  • Crude Bots – Basic script that behaves like a bot, coming in fast from one IP address.
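To make the tiers concrete, here is a minimal sketch of how traffic features like those above might map to the four sophistication levels. This is a hypothetical, rule-based toy classifier for illustration only — the feature names are assumptions, and it does not reflect Distil’s actual detection logic.

```python
# Toy rule-based classifier mapping observed traffic features to the
# audit's four bot sophistication tiers. Purely illustrative; real
# bot mitigation relies on far richer behavioral and fingerprint signals.

from dataclasses import dataclass


@dataclass
class TrafficSample:
    browser_user_agent: bool   # UA and headers look like a real browser
    holds_cookies: bool        # maintains session state across requests
    requests_per_minute: int   # request rate observed for this actor
    distinct_ips: int          # number of source IP addresses observed


def classify_bot_tier(s: TrafficSample) -> str:
    """Map a traffic sample to one of the study's sophistication tiers."""
    if s.browser_user_agent and s.holds_cookies and s.distinct_ips > 1:
        # "Low-and-slow" from many IPs, browser automation holding state
        return "sophisticated"
    if s.browser_user_agent and s.requests_per_minute < 10:
        # Normal browser UA/headers, arriving slowly from one IP
        return "moderate"
    if not s.browser_user_agent and s.requests_per_minute >= 10:
        # Non-browser UA/headers, arriving fast from one IP
        return "simple"
    # Basic script behavior, fast from one IP
    return "crude"
```

For example, a stateful actor spread across two dozen IPs with a browser-like user agent would land in the “sophisticated” tier, while a fast, single-IP scraper with a non-browser user agent would be flagged as “simple.”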

The findings show that while most industries tested can adequately protect against crude bots, they struggle to effectively block simple, moderate, and sophisticated bots. Financial institutions are 85% successful against crude bots, 14% against simple, and 7% each against moderate and sophisticated bots.

Federal websites block 22% of simple bots, but protect against only 1% of sophisticated bots, performing worse than any other industry tested.

Despite the poor overall performance, this year’s findings reveal a noticeable improvement over Distil’s 2016 study, which found that the websites tested could protect against only 0.7% of sophisticated bots. That improvement reflects a gradual movement toward greater awareness and adoption of more advanced bot detection and mitigation solutions.