crawler

package
Version: v0.0.0-...-091bbc5
Published: Sep 19, 2019 License: MIT Imports: 6 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Page

type Page struct {
	// contains filtered or unexported fields
}

Page represents a webpage

func NewPage

func NewPage(url string) *Page

NewPage returns a Page object given a URL

type Site

type Site struct {
	// contains filtered or unexported fields
}

Site represents the site to be crawled. It uses a concurrent map so that pages can be downloaded in parallel and each goroutine can safely check whether a link has already been visited.

func NewSite

func NewSite(url string) *Site

NewSite returns a Site object given a URL

func (*Site) Crawl

func (s *Site) Crawl(url string)

Crawl crawls the site starting from the given URL

func (*Site) ShowMap

func (s *Site) ShowMap()

ShowMap prints the site map

Source Files
