Check robots.txt rules through an API. This is our solution for keeping web crawlers polite toward our friends the webmasters. You can try our first alpha release for free!
Who is it for?
People building web crawlers or SEO tools who don't want to manage the complex set of rules that webmasters use on their websites.
Integrating our service into your product is simple: make a network request with the data you want to check, and we take care of the complicated work.
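As a sketch of what such an integration might look like, the snippet below builds a JSON check request and sends it over HTTP. The endpoint URL, payload fields, and response shape are all hypothetical placeholders, not the service's actual API.

```python
import json
from urllib import request

# Hypothetical endpoint -- the real API URL and schema may differ.
API_URL = "https://api.example.com/v1/allowed"

def build_check(url: str, user_agent: str) -> bytes:
    """Encode one robots.txt check (URL + crawler user agent) as JSON."""
    return json.dumps({"url": url, "user_agent": user_agent}).encode("utf-8")

def is_allowed(url: str, user_agent: str) -> bool:
    """Ask the service whether `user_agent` may fetch `url` (assumed response
    shape: {"allowed": true/false})."""
    req = request.Request(
        API_URL,
        data=build_check(url, user_agent),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["allowed"]

# Inspect the payload that would be sent, without hitting the network.
payload = json.loads(build_check("https://example.com/private/", "MyCrawler"))
```

Your crawler would call something like `is_allowed(page_url, "MyCrawler")` before fetching each page.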
We have implemented a rich feature set to make the service flexible, following the directives proposed by the most recent internet draft for the Robots Exclusion Protocol (REP).