Deny crawling of API via robots.txt

Answered

Hi there,

My API endpoints got crawled by public crawlers, which also resulted in high CPU usage.

Is it possible to prevent that, e.g. via a robots.txt file hosted at the root of our hosts?
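For reference, a minimal robots.txt along those lines, assuming the API lives under a path like `/api/` (a hypothetical prefix; adjust to your actual routes), could look like this:

```
# Served at https://example.com/robots.txt
User-agent: *
Disallow: /api/
```

Note this is advisory only: well-behaved crawlers honor it, but misbehaving bots ignore it, so it helps with search-engine traffic but is not an access control.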