crawler

package module
v0.2.5
Published: May 16, 2019 License: MIT Imports: 8 Imported by: 1

README

crawler

Web crawler that accepts a set of URLs and tries to fetch them.

Style

Please take a look at the style guidelines if you'd like to make a pull request.

Sponsors

Scott Rayapoullé Eduard Urbach

Want to see your own name here?

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Crawler

type Crawler struct {
	// contains filtered or unexported fields
}

Crawler is a web crawler that accepts URLs and tries to fetch them.

func New

func New(headers client.Headers, delayBetweenRequests time.Duration, tasksBufferLength int) *Crawler

New creates a new crawler.
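
For illustration, a minimal sketch of constructing a crawler. It assumes the module lives at github.com/aerogo/crawler and that client.Headers is the map-of-header-names-to-values type from github.com/aerogo/http/client; the delay and buffer size are arbitrary choices.

package main

import (
	"time"

	"github.com/aerogo/crawler"
	"github.com/aerogo/http/client" // assumed import path for client.Headers
)

func main() {
	// Request headers sent with every fetch (assumed map type).
	headers := client.Headers{
		"User-Agent": "mycrawler/1.0",
	}

	// Wait 500 ms between requests, buffer up to 1024 queued tasks.
	c := crawler.New(headers, 500*time.Millisecond, 1024)
	_ = c // tasks would be queued here; see Queue below
}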

func (*Crawler) Queue

func (crawler *Crawler) Queue(task *Task) error

Queue queues up a task.

func (*Crawler) Wait

func (crawler *Crawler) Wait()

Wait waits until all tasks have been completed.
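
A sketch of the typical Queue/Wait workflow: queue one task per URL, then block until every fetch has completed. The import paths follow the assumptions stated above, and Destination is assumed here to be the local file the response is written to.

package main

import (
	"log"
	"path"
	"time"

	"github.com/aerogo/crawler"
	"github.com/aerogo/http/client" // assumed import path
)

func main() {
	c := crawler.New(client.Headers{"User-Agent": "mycrawler/1.0"}, 250*time.Millisecond, 64)

	for _, url := range []string{
		"https://example.com/a",
		"https://example.com/b",
	} {
		task := &crawler.Task{
			URL:         url,
			Destination: "downloads/" + path.Base(url), // assumed: local file for the response
		}
		// Queue reports an error if the task cannot be queued.
		if err := c.Queue(task); err != nil {
			log.Println("queue:", err)
		}
	}

	// Wait blocks until all queued tasks have completed.
	c.Wait()
}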

type Task

type Task struct {
	URL         string
	Destination string
	Raw         bool
}

Task represents a single URL fetch task.
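
The field semantics beyond URL are not documented on this page, so the comments in the following sketch are assumptions: Destination looks like a local file path for the fetched content, and Raw presumably means the response body is stored exactly as received.

package main

import "github.com/aerogo/crawler"

func main() {
	// Only URL's meaning is clear from the docs; the other comments are assumptions.
	task := &crawler.Task{
		URL:         "https://example.com/robots.txt", // address to fetch
		Destination: "data/robots.txt",                // assumed: local file the response is saved to
		Raw:         true,                             // assumed: store the body exactly as received
	}
	_ = task // pass to (*Crawler).Queue
}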
