The DeepScan technology is a DOM parser based on an improved Chromium engine. This engine enables DeepScan to emulate the way a user interacts with the browser, including virtual mouse movements and mouse clicks.

  • DeepScan crawls HTML5 websites, including single-page applications (SPAs), and executes JavaScript just like a real browser would (a minimal illustration follows this list).
  • You can thoroughly analyze web applications developed in Node.js, Ruby on Rails, and Java frameworks, including JavaServer Faces (JSF), Spring, and Struts.
  • DeepScan can also detect the most popular JavaScript frameworks: Angular, Vue, and React. When it recognizes a framework, it adjusts the crawl to that framework's specific structure, improving crawl efficiency and effectiveness.
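To see why executing JavaScript matters, consider a hypothetical hash-routed SPA (illustrative code only, not Acunetix internals). A crawler that merely fetches the raw HTML finds an almost empty page; the views and the links between them only exist after the client-side code runs, so a DOM-based crawler must execute the script and interact with the generated elements:

```typescript
// Hypothetical SPA fragment: views and links are generated at runtime.
// A plain HTTP crawler fetching this page's HTML sees no <a> tags at all.
const routes: Record<string, string> = {
  "#/home": "<h1>Home</h1>",
  "#/reports": "<h1>Reports</h1><a href='#/reports/42'>Report 42</a>",
};

function render(): void {
  const view = document.getElementById("app");
  if (view) {
    view.innerHTML = routes[location.hash] ?? routes["#/home"];
  }
}

// Navigation happens through hash changes triggered by clicks,
// not through server round-trips.
window.addEventListener("hashchange", render);
window.addEventListener("DOMContentLoaded", () => {
  render();
  // The link into the Reports view is itself created by JavaScript.
  const nav = document.createElement("a");
  nav.href = "#/reports";
  nav.textContent = "Reports";
  document.body.appendChild(nav);
});
```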

To crawl and scan areas of the web application that require authentication, the scanner needs credentials and must know how to log in. To make this possible, Acunetix uses the Login Sequence Recorder (LSR). With the LSR, you can quickly and easily record a series of actions and/or restrictions that the scanner replays to authenticate itself during a crawl and a scan (see the sketch after the list below). The Acunetix LSR supports a large number of authentication mechanisms, including:

  • Multi-step/custom authentication schemes
  • Single Sign-On authentication
  • CAPTCHAs and multi-factor authentication
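The LSR records login sequences interactively rather than in code. Purely as an illustration of the kind of multi-step action sequence the scanner replays, here is a sketch written with Puppeteer; the URL, selectors, and credentials are hypothetical placeholders:

```typescript
import puppeteer from "puppeteer";

// A recorded two-step login sequence, expressed as code for illustration.
async function login(): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Step 1: submit the username on the first screen.
  await page.goto("https://example.com/login");
  await page.type("#username", "scanner-user");
  await page.click("#next");

  // Step 2: the password field is rendered on a second screen.
  await page.waitForSelector("#password");
  await page.type("#password", "scanner-pass");
  await Promise.all([
    page.waitForNavigation(),
    page.click("#submit"),
  ]);

  // A "restriction" in LSR terms would be an action the scanner must
  // never take during the scan, e.g. following the /logout link.
  await browser.close();
}

login().catch(console.error);
```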

Most modern web applications are built on top of APIs. The same APIs may also be accessed by mobile applications or used directly by third parties. If a web application accesses an API, DeepScan helps Acunetix map the API's endpoint structure.

  • The crawler interacts with AJAX, SOAP/WSDL, SOAP/WCF, REST/WADL, XML, JSON, Google Web Toolkit (GWT), and CRUD operations.
  • Although Swagger, WADL, and WSDL files can give the scanner a head start, the crawler can automatically build the structure of endpoints and available calls with no need for additional information (a sketch of this head start follows).
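To see why a definition file gives the crawler a head start, consider this sketch (a hypothetical API fragment, not Acunetix internals): a Swagger/OpenAPI document already enumerates every path and HTTP method, so an endpoint map can be derived directly from it instead of being discovered call by call during the crawl.

```typescript
// A Swagger/OpenAPI document maps each path to its available operations.
// The document below is a hypothetical fragment for illustration.
interface OpenApiDoc {
  paths: Record<string, Record<string, { summary?: string }>>;
}

const doc: OpenApiDoc = {
  paths: {
    "/users": { get: { summary: "List users" }, post: { summary: "Create user" } },
    "/users/{id}": { get: {}, put: {}, delete: {} },
  },
};

// Flatten the definition into (method, path) pairs a scanner could probe.
function endpoints(spec: OpenApiDoc): Array<{ method: string; path: string }> {
  const out: Array<{ method: string; path: string }> = [];
  for (const [path, ops] of Object.entries(spec.paths)) {
    for (const method of Object.keys(ops)) {
      out.push({ method: method.toUpperCase(), path });
    }
  }
  return out;
}

console.log(endpoints(doc));
// => GET /users, POST /users, GET /users/{id}, PUT /users/{id}, DELETE /users/{id}
```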