Documentation
Overview
Package crawlerdetect provides a Go version of the https://github.com/JayBizzle/Crawler-Detect PHP library. It can be used to detect crawlers based on their HTTP User-Agent header.
Here is a simple example:

    uastring := "curl/7.54.0"
    if crawlerdetect.IsCrawler(uastring) {
        fmt.Println("Found a crawler")
    }
To use custom patterns, a dedicated instance can be created:

    uastring := "curl/7.54.0"
    crawlerDetect := crawlerdetect.New()
    crawlerDetect.SetCrawlers([]string{`curl`})
    if crawlerDetect.IsCrawler(uastring) {
        fmt.Println("Found a crawler")
    }
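To make the pattern-matching idea concrete as a complete program, here is a minimal standalone sketch using only the standard library. The `Detector` type here is hypothetical, not the package's internals: it joins the crawler patterns into a single case-insensitive regular expression and matches it against the User-Agent string.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// Detector is a hypothetical stand-in for a crawler detector: it holds one
// compiled regular expression built from a list of crawler patterns.
type Detector struct {
	re *regexp.Regexp
}

// NewDetector joins the patterns into a single case-insensitive alternation
// so every pattern can be checked in one pass over the User-Agent string.
func NewDetector(patterns []string) *Detector {
	joined := "(?i)(" + strings.Join(patterns, "|") + ")"
	return &Detector{re: regexp.MustCompile(joined)}
}

// IsCrawler reports whether the User-Agent string matches any pattern.
func (d *Detector) IsCrawler(ua string) bool {
	return d.re.MatchString(ua)
}

func main() {
	d := NewDetector([]string{`curl`, `Googlebot`})
	fmt.Println(d.IsCrawler("curl/7.54.0"))                   // true
	fmt.Println(d.IsCrawler("Mozilla/5.0 (Windows NT 10.0)")) // false
}
```

Combining the patterns into one alternation avoids compiling and running hundreds of small regexps per request, which matters when the detector sits in a hot HTTP-handler path.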
Index

Constants

This section is empty.

Variables

This section is empty.

Functions

Types
type CrawlerDetect (added in v0.2.0)

type CrawlerDetect struct {
// contains filtered or unexported fields
}

CrawlerDetect contains the patterns and exclusions for detecting crawlers.
func New (added in v0.2.0)

func New() *CrawlerDetect

New creates a new CrawlerDetect instance with the default patterns.
func (*CrawlerDetect) IsCrawler (added in v0.2.0)

func (c *CrawlerDetect) IsCrawler(input string) bool

IsCrawler reports whether the given user agent string belongs to a crawler.
func (*CrawlerDetect) SetCrawlers (added in v0.2.0)

func (c *CrawlerDetect) SetCrawlers(crawlers []string)

SetCrawlers sets a custom list of crawler patterns.
func (*CrawlerDetect) SetExclusions (added in v0.2.0)

func (c *CrawlerDetect) SetExclusions(exclusions []string)

SetExclusions sets a custom list of exclusion patterns.