grobotstxt is a native Go port of Google's robots.txt parser and matcher library.
Generate an XML sitemap for a GitHub Pages site using GitHub Actions
Manage the robots.txt from the Kirby config file
Alternative robots parser module for Python
Lightweight R wrapper around rep-cpp for robots.txt (Robots Exclusion Protocol) parsing and path testing
Robots Exclusion Standard/Protocol Parser for Web Crawling/Scraping
Robots for Kirby CMS
RFC 9309-compliant robots.txt builder and parser. 🦾 No dependencies, fully typed.
User-agent parser for robots.txt, X-Robots-Tag, and robots meta tag rule sets
PowerShell module for reading robots.txt files
Python bindings for Google's robots.txt parser C++ library
TypeScript robots.txt parser with support for wildcard (*) matching.
This repository contains a Google-based robots.txt parser and matcher as a C++ library (C++17 compliant).
Parse robots.txt and sitemaps using dotnet
Parsers for robots.txt (aka Robots Exclusion Standard / Robots Exclusion Protocol), Robots Meta Tag, and X-Robots-Tag
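Most of the parsers listed above implement the matching rules described in RFC 9309: rules from the group addressed to the crawler's user-agent are applied to the request path, the longest matching Allow/Disallow path wins, and Allow takes precedence on a tie. The following is a minimal, hypothetical Go sketch of that longest-match logic, not the API of any library listed here; it deliberately simplifies group selection (groups are merged rather than chosen by specificity) and omits `*` wildcard and `$` end-anchor handling.

```go
// Illustrative sketch of Robots Exclusion Protocol path matching
// (longest match wins, Allow preferred on ties). Not the API of any
// library on this page; user-agent group handling is simplified.
package main

import (
	"fmt"
	"strings"
)

type rule struct {
	allow bool
	path  string
}

// parseRules collects Allow/Disallow rules from groups addressed to the
// given user-agent token (case-insensitive) or to "*".
func parseRules(robotsTxt, agent string) []rule {
	var rules []rule
	applies := false
	for _, line := range strings.Split(robotsTxt, "\n") {
		if i := strings.Index(line, "#"); i >= 0 {
			line = line[:i] // strip comments
		}
		key, val, ok := strings.Cut(line, ":")
		if !ok {
			continue
		}
		key = strings.ToLower(strings.TrimSpace(key))
		val = strings.TrimSpace(val)
		switch key {
		case "user-agent":
			applies = val == "*" || strings.EqualFold(val, agent)
		case "allow", "disallow":
			if applies && val != "" {
				rules = append(rules, rule{allow: key == "allow", path: val})
			}
		}
	}
	return rules
}

// allowed applies the longest-match rule: the most specific matching
// path prefix wins; on a tie, Allow takes precedence. With no matching
// rule, the path is allowed.
func allowed(rules []rule, path string) bool {
	best, bestLen := true, -1
	for _, r := range rules {
		if strings.HasPrefix(path, r.path) {
			l := len(r.path)
			if l > bestLen || (l == bestLen && r.allow) {
				best, bestLen = r.allow, l
			}
		}
	}
	return best
}

func main() {
	robots := "User-agent: *\nDisallow: /private/\nAllow: /private/status\n"
	rules := parseRules(robots, "ExampleBot")
	fmt.Println(allowed(rules, "/private/secret")) // false
	fmt.Println(allowed(rules, "/private/status")) // true
	fmt.Println(allowed(rules, "/public/index"))   // true
}
```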