
Akamai Announces Content Protector to Stop Scraping Attacks

Akamai Technologies

The first scraper-specific product tailored to the unique characteristics of scraping attacks

Akamai Technologies, Inc. announced the availability of Content Protector, a product that stops scraping attacks without blocking the good traffic that companies need to enhance their business.

Scraper bots play a vital role in commerce by finding new content, comparing products, and updating information for customers. However, they’re also misused for harmful activities like undercutting competitors, surveillance for inventory hoarding attacks, and counterfeiting. Scrapers can degrade site performance by constantly pinging sites, leading to consumer frustration and abandoned visits. Moreover, they’ve become increasingly evasive and sophisticated over recent years.


Akamai Content Protector helps detect and mitigate evasive scrapers that steal content for malicious purposes. It facilitates significantly better detections and fewer false negatives without increasing the rate of false positives. The product is designed for companies that need to protect their intellectual property, reputation, and revenue potential. It offers tailored detections that include:

  • Protocol-level assessment: Protocol fingerprinting evaluates how the client establishes the connection with the server at the different layers of the Open Systems Interconnection (OSI) model — verifying that the negotiated parameters align with those expected from the most common web browsers and mobile applications.
  • Application-level assessment: Evaluates whether the client can run business logic written in JavaScript. When the client runs JavaScript, device and browser characteristics and user preferences are collected. These data points are compared and cross-checked against the protocol-level data to verify consistency.
  • User interaction: Analyses human interaction with the client through standard peripherals like a touch screen, keyboard, and mouse. Lack of interaction or abnormal interaction is typically associated with bot traffic.
  • User behaviour: Monitors the user journey through the website. Botnets typically go after specific content, resulting in significantly different behaviour than legitimate traffic.
  • Risk classification: Provides a deterministic and actionable low-, medium-, or high-risk classification of the traffic, based on the anomalies found during the evaluation.
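The press release does not describe how the risk classification is computed; a minimal sketch of one plausible approach — combining per-layer anomaly flags from the detection steps above into a deterministic low/medium/high label — might look like the following. The signal names and thresholds here are illustrative assumptions, not Akamai's actual API or logic.

```python
# Hypothetical sketch: combine anomaly signals from the detection layers
# described above into a deterministic low/medium/high risk label.
# Layer names and thresholds are assumptions for illustration only.

DETECTION_LAYERS = ["protocol", "application", "interaction", "behaviour"]

def classify_risk(anomalies):
    """Map per-layer anomaly flags to a risk label.

    anomalies: dict mapping a layer name to True if an anomaly was
    found during that layer's evaluation.
    """
    score = sum(1 for layer in DETECTION_LAYERS if anomalies.get(layer, False))
    if score == 0:
        return "low"       # all layers consistent with a real browser/user
    if score == 1:
        return "medium"    # a single anomaly may be a false signal
    return "high"          # multiple independent anomalies suggest a bot
```

For example, a client whose TLS handshake and JavaScript-collected characteristics disagree, and which shows no peripheral interaction, would accumulate multiple anomalies and be labeled high risk, while a client with a single unusual signal would only be flagged medium.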

“This includes competitors undercutting your offers, slower sites that lead customers to get frustrated and leave, and brand damage from counterfeiters passing off subpar goods as your legitimate merchandise,” said Rupesh Chokshi, Senior Vice President and General Manager, Application Security, at Akamai. “Content Protector helps demonstrate the direct business value of security while enabling business leaders to grow their digital businesses.”
