We use heuristics during filtering to automatically detect when two pages are the same or are generated by the same backend code. The goal is to avoid redundant crawling and testing of pages or products that share the same code.
This often occurs on e-commerce sites, where the code that renders product information is usually the same regardless of the product; only the content, such as text and pictures, differs. If there is an XSS vulnerability on the product page and we identify the pages as sharing the same code, it is enough to warn about it on one product page instead of creating a new XSS finding for every single product.
However, sometimes this detection fails. To ensure no pages are removed erroneously, the scanner treats a page as unique whenever it cannot say for certain that it is a duplicate of an already scanned page. When this happens, the scanner may create two, or on large sites many more, findings of seemingly the same vulnerability, because it has determined that they are different pages.
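The exact heuristics the scanner uses are not described here, but one common way to detect pages built from the same backend code is to fingerprint a page's HTML structure while ignoring its text content, so two product pages rendered from the same template produce the same fingerprint. A minimal sketch in Python, purely illustrative (the class and function names are not part of the scanner):

```python
import hashlib
from html.parser import HTMLParser

class StructureFingerprint(HTMLParser):
    """Collects the tag skeleton of a page, ignoring text content and
    attribute values, so pages built from the same template look alike."""
    def __init__(self):
        super().__init__()
        self.skeleton = []

    def handle_starttag(self, tag, attrs):
        # Keep tag names and attribute *names* only; attribute values
        # (product names, ids, hrefs) vary from page to page.
        attr_names = ",".join(sorted(name for name, _ in attrs))
        self.skeleton.append(f"<{tag} {attr_names}>")

    def handle_endtag(self, tag):
        self.skeleton.append(f"</{tag}>")

def fingerprint(html: str) -> str:
    parser = StructureFingerprint()
    parser.feed(html)
    return hashlib.sha256("".join(parser.skeleton).encode()).hexdigest()

# Two product pages from the same template: same structure, different content.
page_a = '<div class="product"><h1>Red Shoe</h1><p>$10</p></div>'
page_b = '<div class="product"><h1>Blue Hat</h1><p>$25</p></div>'
page_c = '<article><h1>Blog post</h1></article>'

assert fingerprint(page_a) == fingerprint(page_b)  # treated as duplicates
assert fingerprint(page_a) != fingerprint(page_c)  # treated as unique
```

A heuristic like this can fail in the ways described above: small per-page template variations (a "sale" badge, a reviews widget) change the skeleton, and the scanner then errs on the side of treating the page as unique.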
Use Paths to reduce duplicates:
You can use the Paths feature if you would like to reduce duplicate findings by avoiding testing all pages that use the same backend code. This feature can be found under your Deep Scan Settings. Learn more about Paths in this article on how to allow and disallow paths.