InfiSecure Knowledge Hub
Over-Crawling Prevention

What is Over-Crawling

Crawlers are used for various purposes, the most significant of which is search engine indexing. Sometimes a crawler fetches a site's pages far more often than necessary, resulting in over-crawling of the website.

Industries Affected

Content websites, e-commerce, travel and OTAs, social websites

What happens when your website is over-crawled

Every hit made by a crawler consumes server capacity and adds to your cost. When over-crawling occurs, your server costs climb while the capacity left for real users shrinks. This excess traffic could account for more than 50% of the server cost you are currently paying.

How to stop over-crawling on your website

Almost all legitimate search engine crawlers follow robots.txt rules, so a good first step against over-crawling is to add or tighten the rules in your robots.txt file. InfiSecure identifies the crawlers that access your website, tracks their hit rates, and alerts you when over-crawling is detected. It also helps you block unwanted crawlers that consume your server resources. In addition, the dashboard gives you a detailed breakdown of the crawlers accessing your website, so you can understand the impact each one has.
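As an illustration, a robots.txt along the following lines can slow down or exclude crawlers. The bot names here are placeholders, and note that support for the Crawl-delay directive varies: Bingbot honors it, while Googlebot ignores it and uses Search Console crawl settings instead.

```
# Ask a specific crawler to wait 10 seconds between requests
# (Crawl-delay is honored by some crawlers, e.g. Bingbot)
User-agent: Bingbot
Crawl-delay: 10

# Block an unwanted crawler entirely (hypothetical bot name)
User-agent: ExampleBot
Disallow: /

# Keep all crawlers out of URL spaces that generate unbounded page counts
User-agent: *
Disallow: /search
Disallow: /cart
```

Well-behaved crawlers fetch /robots.txt before crawling and apply the first User-agent group that matches them; abusive bots may ignore the file, which is where server-side blocking comes in.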
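The crawler identification described above can be sketched from your own access logs. This is a minimal illustration, not InfiSecure's implementation: it assumes the combined log format (user-agent as the last quoted field), and the threshold value is an arbitrary placeholder you would tune for your traffic.

```python
import re
from collections import Counter

# Matches the referrer and user-agent fields at the end of a combined-format
# access-log line; the user-agent is captured as the final quoted string.
# Adjust the pattern if your server uses a different log format.
UA_PATTERN = re.compile(r'"[^"]*" "(?P<ua>[^"]*)"$')

def crawler_hit_counts(log_lines):
    """Count hits per user-agent string across raw access-log lines."""
    counts = Counter()
    for line in log_lines:
        match = UA_PATTERN.search(line)
        if match:
            counts[match.group("ua")] += 1
    return counts

def flag_over_crawlers(counts, threshold):
    """Return the user-agents whose hit count exceeds a chosen threshold."""
    return {ua: n for ua, n in counts.items() if n > threshold}

if __name__ == "__main__":
    sample = [
        '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
        '1.2.3.4 - - [10/Oct/2023:13:55:37 +0000] "GET /a HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
        '1.2.3.4 - - [10/Oct/2023:13:55:38 +0000] "GET /b HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
        '5.6.7.8 - - [10/Oct/2023:13:55:39 +0000] "GET / HTTP/1.1" 200 512 "-" "ExampleBot/1.0"',
    ]
    counts = crawler_hit_counts(sample)
    print(flag_over_crawlers(counts, threshold=2))
```

In practice you would bucket counts per time window (hits per minute or hour) rather than over the whole log, since over-crawling is about rate, not lifetime totals.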