BotDetector is a Go library that detects bots, spiders, and crawlers from user agents.
```sh
go get -u github.com/logocomune/botdetector/v2
```
```go
userAgent := req.Header.Get("User-Agent")

detector, _ := botdetector.New()
isBot := detector.IsBot(userAgent)

if isBot {
	log.Println("Bot, Spider or Crawler detected")
}
```
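For context, here is a self-contained sketch that wires the snippet above into an HTTP server; the route, port, and response body are illustrative choices, not part of the library.

```go
package main

import (
	"log"
	"net/http"

	"github.com/logocomune/botdetector/v2"
)

func main() {
	detector, err := botdetector.New()
	if err != nil {
		log.Fatal(err)
	}

	http.HandleFunc("/", func(w http.ResponseWriter, req *http.Request) {
		// Check the incoming User-Agent against the built-in rules.
		if detector.IsBot(req.Header.Get("User-Agent")) {
			log.Println("Bot, Spider or Crawler detected")
		}
		_, _ = w.Write([]byte("ok")) // illustrative response
	})

	log.Fatal(http.ListenAndServe(":8080", nil)) // port chosen for the example
}
```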
You can add custom detection rules with the `WithRules` option. For example:
```go
userAgent := req.Header.Get("User-Agent")

detector, _ := botdetector.New(botdetector.WithRules([]string{"my rule", "^test"}))
isBot := detector.IsBot(userAgent)

if isBot {
	log.Println("Bot, Spider or Crawler detected")
}
```
Custom Rule Patterns:

| Pattern | Description |
|---|---|
| `"..."` | Checks if the user agent contains the specified pattern. |
| `"^..."` | Checks if the user agent starts with the specified pattern. |
| `"...$"` | Checks if the user agent ends with the specified pattern. |
| `"^...$"` | Checks if the user agent exactly matches the entire pattern. |
In this example, the custom rules `"my rule"` (a contains match) and `"^test"` (a prefix match) are added on top of the built-in detection rules.
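To make the four pattern forms concrete, the sketch below adds one rule of each kind; the rules `"bot$"` and `"^exact$"` and all of the user agents are made up for illustration.

```go
detector, _ := botdetector.New(botdetector.WithRules([]string{
	"my rule", // contains: any user agent containing "my rule"
	"^test",   // prefix: user agents starting with "test"
	"bot$",    // suffix: user agents ending with "bot" (illustrative rule)
	"^exact$", // exact: only the user agent "exact" (illustrative rule)
}))

detector.IsBot("client with my rule inside") // true: contains match
detector.IsBot("test/1.0")                   // true: prefix match
detector.IsBot("curl-like mybot")            // true: suffix match
detector.IsBot("exact")                      // true: exact match
```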
You can enable an LRU cache for detection results with the `WithCache` option. For example:
```go
userAgent := req.Header.Get("User-Agent")

detector, _ := botdetector.New(botdetector.WithCache(1000))
isBot := detector.IsBot(userAgent)

if isBot {
	log.Println("Bot, Spider or Crawler detected")
}
```
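Since `New` takes zero options in one example and one in another, it presumably accepts any combination; a sketch pairing the cache with a custom rule (the rule and user agent are hypothetical):

```go
detector, err := botdetector.New(
	botdetector.WithCache(1000),                   // LRU cache for up to 1000 distinct user agents
	botdetector.WithRules([]string{"^internal-"}), // hypothetical custom rule
)
if err != nil {
	log.Fatal(err)
}

// Repeated lookups of the same user agent can be answered from the cache
// instead of re-running every pattern check.
for i := 0; i < 3; i++ {
	_ = detector.IsBot("internal-monitor/1.0") // hypothetical user agent
}
```

On servers where a handful of user agents dominates traffic, this avoids re-scanning the full rule set on every request.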
BotDetector is inspired by CrawlerDetect, an awesome PHP project.