How Scraping Attacks Can Compromise Web Security and Impact Business Continuity

Radware Bot Manager
5 min read · Aug 27, 2020


We often see businesses devising ingenious ways to pull ahead of competitors in the hyper-competitive online business industry. From tiny startups to giants such as Amazon and Walmart, businesses maintain dedicated teams to track competitors’ pricing strategies and product catalogs. Many online businesses either employ an in-house team or leverage the expertise of professional web scrapers to gain competitive intelligence.

These attacks are planned in stages to exploit the weaknesses of existing security systems such as WAFs, Intrusion Detection/Prevention Systems (IDS/IPS), and other in-house measures, which lack the historical look-back, deep learning capabilities, and ability to detect automated behavior hidden in syntactically correct HTTP requests.

Scrapers use exploit kits that comprise a combination of tools (such as proxy IPs, multiple UAs, programmatic/sequential requests) to intrude into web applications, mobile apps, and APIs. Website security and business continuity can be grossly undermined due to such attacks. For instance, an attack that we monitored on a popular e-commerce portal involved multiple stages to bypass prevailing security measures.

Let’s take a closer look at what happened during this attack:

A popular e-commerce portal was inundated with scraping attacks and faced hundreds of thousands of hits on its category and product pages during a 15-day period. Attackers deployed a purpose-built scraper engine to execute the attacks. They used an ‘exploit kit’ with different ready-to-use combinations of hardware and software to bypass web defense systems. The attack was executed in three stages: (1) Fake Account Creation, (2) Scraping of Product Categories, and (3) Price and Product Information Scraping.

Stages of the Attack

Fake Account Creation

Attackers targeted the sign-up page using different attack vectors. They created multiple fake UIDs (User IDs) to register bots as legitimate users on the website. They used these fake accounts in combination with different device IDs, cookies, and UAs to masquerade as genuine users and generate perfectly-valid HTTP requests to easily circumvent rule-based conventional security measures.
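Because each request in this stage is individually valid, per-request rules miss the pattern; the signal only emerges when sign-ups are correlated across requests. The sketch below (illustrative only, not Radware's actual detection logic; all names and the threshold are assumptions) flags devices that register an unusually large number of distinct user IDs:

```python
from collections import defaultdict

def flag_suspicious_devices(signups, threshold=5):
    """Group sign-up events by device ID and flag devices that
    register more distinct user IDs than `threshold`.

    `signups` is an iterable of (device_id, user_id) tuples.
    """
    uids_per_device = defaultdict(set)
    for device_id, user_id in signups:
        uids_per_device[device_id].add(user_id)
    return {dev for dev, uids in uids_per_device.items() if len(uids) > threshold}

# One device mass-registering accounts vs. a normal single sign-up
events = [("dev-A", f"user{i}") for i in range(8)] + [("dev-B", "alice")]
print(flag_suspicious_devices(events))  # {'dev-A'}
```

In practice attackers also rotate device IDs and cookies, so this heuristic is only one input among many; the point is that detection must aggregate across requests rather than judge each request alone.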

Scraping of Product Categories

Using fake UIDs, attackers logged into the website and made hundreds of thousands of hits on category pages to scrape content from category results.

Price and Product Information Scraping

After scraping the category pages, attackers carried out hundreds of thousands of hits on specific product pages and managed to store the prices and product details of targeted products in their own database.

The attackers maintained a real-time repository of the entire product catalog on the e-commerce portal. They also regularly tracked the price changes to keep their database updated with the latest pricing information.
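Catalog-wide crawls like this tend to walk product pages in an order no human shopper produces. A minimal sketch of one such signal (a hypothetical heuristic, with assumed thresholds, not a described feature of any product) checks whether a session's product-ID sequence is almost monotonically increasing:

```python
def looks_sequential(product_ids, min_len=10, tolerance=0.9):
    """Return True if a session's product-page hits mostly step through
    IDs in increasing order -- a crawl pattern, not human browsing."""
    if len(product_ids) < min_len:
        return False
    increasing = sum(1 for a, b in zip(product_ids, product_ids[1:]) if b > a)
    return increasing / (len(product_ids) - 1) >= tolerance

print(looks_sequential(list(range(1000, 1025))))  # True: monotonic crawl
print(looks_sequential([42, 7, 981, 42, 13, 7, 55, 3, 12, 9, 40]))  # False
```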

The Topology of the Attack: How the Three Stages Work in Unison

All three stages were part of a single large-scale scraping attack and worked in unison to perform real-time monitoring of product pages.

During the attack, we observed that rule-based systems were incapable of stopping exploitation at the application layer. Other automated attacks such as account takeover, API abuse, application DDoS, and carding also operate at the application layer, and failure to detect such sophisticated attacks can heavily impact business continuity.


Measures that E-commerce Firms can Take to Combat Scraping

All large e-commerce platforms see sophisticated bot activity on their websites, mobile apps, and APIs that can expose them to scraping and loss of Gross Merchandise Value (GMV). E-tailers must be diligent in their approach to finding and mitigating malicious sources of bot activity.

Spot highly active new or existing user accounts that don’t buy

E-commerce portals must track old or newly-created accounts that are highly active on the platform but haven’t made any purchase in a long time. Such accounts may be handled by bots that mimic real users to scrape product details and pricing information.
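This check reduces to a simple query over account activity. As a sketch (the data layout, field names, and the view threshold are all illustrative assumptions), flag accounts that browse heavily but never convert:

```python
def zero_purchase_scrapers(accounts, min_page_views=1000):
    """Flag accounts with heavy browsing activity and no purchases.

    `accounts` maps account_id -> (page_views, purchases).
    """
    return [acc for acc, (views, buys) in accounts.items()
            if views >= min_page_views and buys == 0]

stats = {
    "acct-1": (15_000, 0),  # heavy browsing, never buys: suspicious
    "acct-2": (300, 2),     # typical shopper
    "acct-3": (4_000, 1),   # power user who does buy
}
print(zero_purchase_scrapers(stats))  # ['acct-1']
```

A flagged account is not proof of scraping on its own; it is a candidate for closer inspection alongside other signals such as device reuse and request timing.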

Don’t overlook unusual traffic on selected product pages

E-tailers should monitor unusual spikes in page views of certain products. These spikes can be periodic in nature. A sudden surge in engagement on selected product pages can be a symptom of non-human activity on your website.
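One simple way to operationalize this monitoring (a sketch under assumed parameters, not a prescribed method) is a z-score test: compare today's page views for a product against its own recent history and flag days that sit far above the mean:

```python
import statistics

def is_view_spike(history, today, z_threshold=3.0):
    """Flag today's page views as anomalous if they sit more than
    `z_threshold` standard deviations above the historical mean."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history) or 1.0  # guard against zero variance
    return (today - mean) / stdev > z_threshold

baseline = [110, 95, 102, 98, 105, 99, 101]  # typical daily views
print(is_view_spike(baseline, 4500))  # True: a surge worth investigating
print(is_view_spike(baseline, 108))   # False: within normal variation
```

Because scraping spikes can be periodic, running this per weekday (comparing each Monday against past Mondays, and so on) avoids flagging ordinary weekly traffic patterns.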

Watch out for competitive price tracking and monitoring

Many e-commerce firms deploy bots or hire professionals to scrape product details and pricing information from rival portals. Regularly track competitors for signs of price and product catalog matching, which can indicate that your own portal is being scraped.

Build capabilities to identify automated activity in seemingly legitimate user behaviors

Sophisticated bots simulate mouse movements, perform random clicks, and navigate pages in a human-like manner. Preventing such attacks requires deep behavioral models, device/browser fingerprinting, and closed-loop feedback systems to ensure that you don’t block genuine users. Purpose-built bot mitigation solutions can identify such sophisticated automated activities and can help you take action against them. In comparison, traditional solutions such as WAFs are limited to tracking spoofed cookies, user agents, and IP reputation.
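Even bots that fake mouse movements often betray themselves in request timing: automated clients tend to fire requests at near-constant intervals, while human browsing is bursty. The sketch below (an illustrative heuristic with assumed thresholds, not a description of any vendor's behavioral model) flags sessions whose inter-request gaps are suspiciously regular:

```python
import statistics

def timing_looks_automated(timestamps, cv_threshold=0.1):
    """Flag a session whose inter-request gaps have a very low
    coefficient of variation (stdev / mean) -- i.e., near-constant pacing."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if len(gaps) < 5:
        return False  # too few requests to judge
    mean = statistics.mean(gaps)
    return statistics.stdev(gaps) / mean < cv_threshold

bot_like = [i * 2.0 for i in range(20)]  # a hit every 2 s, exactly
human_like = [0, 3.1, 9.8, 11.2, 30.5, 33.0, 48.7, 52.1]
print(timing_looks_automated(bot_like))    # True
print(timing_looks_automated(human_like))  # False
```

A production system would combine many such behavioral features with device/browser fingerprints and feed blocking outcomes back into the models, which is the closed-loop approach described above.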

Originally published on July 30, 2019.



Radware Bot Manager

Radware Bot Manager (formerly ShieldSquare) is a non-intrusive, API-based bot management solution for managing bot traffic from websites and apps.