package crawl

v0.8.1
Warning

This package is not in the latest version of its module.

Published: Apr 20, 2016 License: MIT Imports: 12 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func Exists

func Exists(client *github.Client, user string, reponame string,
	path string, opts *github.RepositoryContentGetOptions) (file bool, dir bool)
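Exists reports, in a single call, whether the given repository path exists as a file and/or as a directory. The real function queries the GitHub API via go-github; the sketch below is a local-filesystem analogue (a hypothetical `existsLocal` helper, not part of this package) that illustrates the same `(file, dir)` return convention:

```go
package main

import (
	"fmt"
	"os"
)

// existsLocal mirrors the (file, dir) return convention of crawl.Exists,
// but checks the local filesystem instead of a GitHub repository.
// It is a stand-in for illustration only.
func existsLocal(path string) (file bool, dir bool) {
	info, err := os.Stat(path)
	if err != nil {
		return false, false // path does not exist (or is unreadable)
	}
	return !info.IsDir(), info.IsDir()
}

func main() {
	// "." always exists and is a directory.
	file, dir := existsLocal(".")
	fmt.Println(file, dir)
}
```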

func ExtractInfo

func ExtractInfo(client *github.Client, user string, altname string, repo github.Repository,
	sha string, versionString string, verbose bool, logger *log.Logger) dirinfo.DirectoryInfo

The goal of this function is to construct a DirectoryInfo object. It first reads whatever directory information it can find in impact.json, then tries to "infer" the rest using some heuristics (to lower the burden on library developers).
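The "read, then infer" approach described above can be sketched as follows. The `dirInfo` type, its fields, and the specific heuristics here are simplified assumptions for illustration; the real dirinfo.DirectoryInfo lives elsewhere in the impact codebase and carries more metadata:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// dirInfo is a simplified stand-in for dirinfo.DirectoryInfo.
type dirInfo struct {
	Name  string `json:"name"`
	Owner string `json:"owner"`
}

// extractInfo sketches the approach described above: start from whatever
// impact.json provides, then fill the gaps with heuristics (here, the
// library name is inferred from the repository name and the owner from
// the GitHub user). Field names and heuristics are illustrative.
func extractInfo(impactJSON []byte, user, repoName string) dirInfo {
	var di dirInfo
	_ = json.Unmarshal(impactJSON, &di) // tolerate missing or partial metadata
	if di.Name == "" {
		di.Name = strings.TrimSuffix(repoName, ".git")
	}
	if di.Owner == "" {
		di.Owner = user
	}
	return di
}

func main() {
	// impact.json names an owner but no library name, so the name
	// is inferred from the repository name.
	di := extractInfo([]byte(`{"owner":"xogeny"}`), "xogeny", "impact-demo")
	fmt.Printf("%+v\n", di)
}
```

Explicit metadata in impact.json always wins; the heuristics only fill fields the developer left out.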

Types

type Crawler

type Crawler interface {
	Crawl(r recorder.Recorder, verbose bool, logger *log.Logger) error
	String() string
}

type GitHubCrawler

type GitHubCrawler struct {
	// contains filtered or unexported fields
}

func MakeGitHubCrawler

func MakeGitHubCrawler(user string, pattern string, token string) (GitHubCrawler, error)

func (GitHubCrawler) Crawl

func (c GitHubCrawler) Crawl(r recorder.Recorder, verbose bool, logger *log.Logger) error

func (GitHubCrawler) String added in v0.8.0

func (c GitHubCrawler) String() string
