Hi there,
my API endpoints are being crawled by public crawlers, which is also causing high CPU usage.
Is it possible to prevent that, e.g. via a robots.txt file hosted at the root of our hosts?
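For context, a minimal robots.txt that asks all crawlers to skip the API paths might look like this (assuming the endpoints live under a path such as /api/, which is just an illustration; adjust to your actual routes). Note that robots.txt is purely advisory: well-behaved crawlers honor it, but it cannot enforce anything against bots that ignore the Robots Exclusion Protocol.

```
# Served at https://<your-host>/robots.txt
# Applies to all user agents; disallows crawling under /api/ (hypothetical path)
User-agent: *
Disallow: /api/
```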