What can I scan using Detectify?

Designed to be used in combination, Surface Monitoring and Application Scanning provide continuous monitoring and assessment of your attack surface. They offer extensive coverage by detecting and flagging misconfigurations, security policy breaches, vulnerabilities, and other DNS and application-level anomalies.


Surface Monitoring runs at the DNS level to discover assets across your external attack surface. It continuously monitors all identified Internet-facing domains, subdomains, IPs, ports, and technologies, running payload-based tests that detect exposed files, misconfigurations, and the latest vulnerabilities, including 0-days.


We recommend enabling Surface Monitoring at the apex level (example.com) rather than at the subdomain level (blog.example.com) so that all associated and underlying assets are monitored. If you are responsible for only a smaller part of an application, enable Surface Monitoring on the highest subdomain in the hierarchy to get the most value out of the product.


Application Scanning complements Surface Monitoring by running more in-depth testing on essential web applications. It uses a web crawler to explore which parts of your web application should be included in security testing, so it needs a web interface to interact with. Consider the following aspects when determining which endpoints to set up Application Scanning for:


Are you responsible for securing a custom-built application?


Custom-built applications rely on dozens of smaller components, not least JavaScript frameworks like React, AngularJS, or Vue. Adding a scan profile ensures vulnerabilities associated with these technologies are discovered so they can be quickly remediated.


Are you scanning a domain with a login?


Applications behind a login often contain PII, which makes them valuable targets for malicious hackers. This data can be used to move laterally through an organization or even be sold on the darknet. Setting up a scan profile ensures Detectify can assess web apps in both unauthenticated and authenticated states through crawling and fuzzing.


Are you using tools to track user behavior or A/B test?


Today’s web apps commonly incorporate tools that teams use to evaluate and improve the user experience, and these valuable interactive components should be tested regularly. Since such web apps are externally facing and heavily optimized to meet users’ needs, we encourage you to use Detectify to perform deep crawling and fuzzing of their functionality.


Be mindful of redirects


If your domain redirects us, ensure that the redirect points Detectify to the same domain, such as “www.example.com” redirecting to “www.example.com/en/”. If the redirect takes us to a different domain, it breaks the scope of the scan profile, and we will not be able to crawl it successfully.
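
If you want to confirm where a redirect lands before setting up a scan profile, checking the Location header is enough. The snippet below is a minimal sketch using Python's requests library with a placeholder URL; it is not part of Detectify, just a quick way to verify that a redirect stays on the same domain.

```python
# Check whether a domain's first redirect stays on the same host (illustrative sketch).
from urllib.parse import urlparse

import requests


def redirect_stays_on_domain(url: str) -> bool:
    """Fetch the URL without following redirects and compare hostnames."""
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location")
    if location is None:
        return True  # No redirect at all, so the scan scope is unaffected.
    original_host = urlparse(url).hostname
    # Relative redirects (e.g. "/en/") stay on the same host by definition.
    target_host = urlparse(location).hostname or original_host
    return target_host == original_host


if __name__ == "__main__":
    # Placeholder domain; replace with the asset you plan to scan.
    print(redirect_stays_on_domain("https://www.example.com/"))
```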


If scanning a login page, have you set up Recorded Login?


Detectify Recorded Login ensures we can rigorously test pages that require an authenticated state for crawling and fuzzing. You can set this up under Application Scanning settings -> Scan behind login. Read more about the authenticated scanning functionality in our Knowledge Base article. We also support authenticated scanning via basic auth credentials or a session cookie.
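
Before configuring authenticated scanning, it can help to confirm that the credentials or session cookie you plan to supply actually grant access. The sketch below uses Python's requests library; the URL, cookie name, and credentials are placeholders, not Detectify settings.

```python
# Sanity-check a session cookie and basic auth credentials before handing them
# to an authenticated scan (illustrative sketch with placeholder values).
import requests

PROTECTED_URL = "https://www.example.com/account"  # A page that requires login.

# Option 1: a session cookie copied from an authenticated browser session.
cookie_response = requests.get(
    PROTECTED_URL,
    cookies={"sessionid": "replace-with-a-valid-session-value"},
    allow_redirects=False,
    timeout=10,
)

# Option 2: basic auth credentials.
basic_response = requests.get(
    PROTECTED_URL,
    auth=("username", "password"),
    allow_redirects=False,
    timeout=10,
)

# A 200 response suggests the authenticated state works; a redirect back to the
# login page or a 401/403 means a scanner would only see unauthenticated content.
for label, response in [("cookie", cookie_response), ("basic auth", basic_response)]:
    print(f"{label}: HTTP {response.status_code}")
```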


Be mindful of API endpoints


We don't encourage users to actively scan API endpoints, since our crawler commonly cannot click its way through the application to gather the information needed for security testing. However, if a web application utilizes APIs, we may detect these endpoints incidentally during our crawling and fuzzing processes. Consider the domain “https://api.example.com/”: there is no direct interaction or crawling potential, and Recorded Login is not feasible because there is no HTML to interact with, just plain JSON. Yet if you authenticate to “https://www.example.com/”, and the application's functionality relies heavily on “api.example.com”, our crawling process can pick up on this connection, allowing for audit capabilities.
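
To see why a bare API host gives a crawler nothing to work with, you can compare the content type it returns against that of the main web application. The snippet below is a minimal sketch using Python's requests library and the same placeholder domains as above.

```python
# Compare what a crawler would see on an API host versus the main web app
# (illustrative sketch using placeholder domains).
import requests

for url in ("https://www.example.com/", "https://api.example.com/"):
    try:
        response = requests.get(url, timeout=10)
    except requests.RequestException as error:
        print(f"{url}: request failed ({error})")
        continue
    content_type = response.headers.get("Content-Type", "unknown")
    # HTML responses give a crawler links and forms to follow; JSON does not.
    crawlable = "text/html" in content_type
    print(f"{url}: {content_type} -> {'crawlable' if crawlable else 'no HTML to crawl'}")
```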