Eager to get your results and start improving your site’s security? We get that! Here are a couple of ways to help speed up the scan.
Avoid pages that share the same backend code
Most websites use the same backend code for many different pages. Take a web shop, for example: the code that displays product information is most likely the same regardless of the product. The difference lies in the content, such as the text and pictures.
Our tool does its best to detect similar pages and pages that use the same backend code, so it should not need to crawl every page or product on a web shop. This detection is not foolproof, however, and we would rather scan duplicates than skip legitimate pages, so we might still find a lot of pages on a big site.
1. Whitelist/blacklist pages and paths
To avoid testing pages that share the same backend code, you can use the Paths feature. You can find it by clicking on your scan profile and then selecting Deep Scan Settings. You can learn how to use it by reading our article on how to allow and disallow paths.
By using Paths, you can allow (whitelist) and disallow (blacklist) specific paths and pages. For example:
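To skip a whole product category while keeping one specific product page, the rules could look roughly like this (the exact syntax in the Deep Scan Settings may differ from this sketch):

```
Disallow: /shirts/
Allow:    /shirts/blue-shirt-340
```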
This tells our crawler to ignore /shirts/ and anything under it, so it would not crawl http://example.com/shirts/green-large-50, for example. However, since we whitelisted http://example.com/shirts/blue-shirt-340, it would still crawl that particular product page. The same logic can be applied when scanning blogs and blog posts.
Note: The more you restrict the scope, the greater the risk of accidentally disallowing something of interest, so be careful!
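To make the precedence concrete, here is a minimal Python sketch of how such rules could be evaluated, where an explicit allow rule wins over a broader disallow rule. This is only an illustration of the idea, not how our crawler is actually implemented; the function name and rule format are hypothetical.

```python
from urllib.parse import urlparse

def should_crawl(url, allowed, disallowed):
    """Decide whether to visit a URL given allow/disallow path rules.

    An allow rule takes precedence over a disallow rule, so a single
    whitelisted page survives even if its parent path is blacklisted.
    """
    path = urlparse(url).path
    # Explicitly allowed paths are always crawled.
    if any(path.startswith(rule) for rule in allowed):
        return True
    # Otherwise, anything under a disallowed path is skipped.
    if any(path.startswith(rule) for rule in disallowed):
        return False
    # Paths matching no rule are crawled by default.
    return True
```

With `allowed = ["/shirts/blue-shirt-340"]` and `disallowed = ["/shirts/"]`, the crawler would skip other product pages under /shirts/ but still visit the whitelisted one.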
2. Scan a specific URL
If changes have been made to a specific page on a website, you can scan only that page.
By using our Paths feature in Deep Scan Settings, it is possible to blacklist/disallow everything on the website and only whitelist a specific path. For example:
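To disallow the whole site and whitelist a single page, the rules could look roughly like this (again, the exact syntax in the Deep Scan Settings may differ from this sketch):

```
Disallow: /
Allow:    /about/contact
```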
In this case, only http://example.com/about/contact would be scanned.