crawler

package
v1.21.1
Published: Mar 12, 2024 License: BSD-3-Clause Imports: 16 Imported by: 2

Documentation

Index

Constants

const (
	// Name of this module
	Name = "crawler"

	// Description of this module
	Description = "Crawls the target domain in order to retrieve most of the target external origins"

	// Author of this module
	Author = "Muraena Team"
)

Variables

This section is empty.

Functions

func Contains

func Contains(slice *[]string, find string) bool
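Contains has no doc comment; from the signature it appears to report whether find is already present in *slice. A minimal usage sketch under that assumption, also assuming the import path github.com/muraenateam/muraena/module/crawler:

package main

import (
	"fmt"

	// Assumed import path; adjust to the actual module layout.
	"github.com/muraenateam/muraena/module/crawler"
)

func main() {
	origins := []string{"cdn.example.com", "static.example.com"}

	// Assumption: Contains reports whether find is present in *slice.
	if !crawler.Contains(&origins, "api.example.com") {
		origins = append(origins, "api.example.com")
	}

	fmt.Println(origins)
}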

func IsSubdomain

func IsSubdomain(ref string, toCheck string) bool
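IsSubdomain is also undocumented; the name and signature suggest it reports whether toCheck is a subdomain of ref. An illustrative sketch under that assumption, with the same assumed import path:

package main

import (
	"fmt"

	// Assumed import path; adjust to the actual module layout.
	"github.com/muraenateam/muraena/module/crawler"
)

func main() {
	// Assumption: IsSubdomain(ref, toCheck) is true when toCheck is a subdomain of ref.
	fmt.Println(crawler.IsSubdomain("example.com", "login.example.com"))
	fmt.Println(crawler.IsSubdomain("example.com", "example.org"))
}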

Types

type Crawler

type Crawler struct {
	session.SessionModule

	Enabled bool
	Depth   int
	UpTo    int

	Domains []string
}

Crawler module

func Load

func Load(s *session.Session) (m *Crawler, err error)

Load configures the module by initializing its main structure and variables
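A sketch of wiring the module into an existing *session.Session. The import paths and the logging of the exported Crawler fields are assumptions; how the session itself is created is out of scope here.

// Package example shows one way the crawler module might be loaded.
package example

import (
	"log"

	// Assumed import paths; adjust to the actual module layout.
	"github.com/muraenateam/muraena/module/crawler"
	"github.com/muraenateam/muraena/session"
)

// LoadCrawler loads the crawler module against an already initialized session
// and logs the exported fields of the returned Crawler struct.
func LoadCrawler(s *session.Session) (*crawler.Crawler, error) {
	m, err := crawler.Load(s)
	if err != nil {
		return nil, err
	}

	log.Printf("module %s enabled=%t depth=%d upTo=%d", m.Name(), m.Enabled, m.Depth, m.UpTo)
	return m, nil
}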

func (*Crawler) Author

func (module *Crawler) Author() string

Author returns the module author

func (*Crawler) Description

func (module *Crawler) Description() string

Description returns the module description

func (*Crawler) Name

func (module *Crawler) Name() string

Name returns the module name

func (*Crawler) Prompt added in v0.1.2

func (module *Crawler) Prompt()

Prompt prints the current module status

func (*Crawler) SimplifyDomains added in v1.21.1

func (module *Crawler) SimplifyDomains()

SimplifyDomains simplifies the Domains slice by grouping third- and fourth-level subdomains as *.<domain>; an illustrative sketch follows.
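The standalone sketch below approximates the grouping just described. It is not the module's implementation, only an illustration of collapsing three- and four-label hosts to *.<second-level domain> and deduplicating the result:

package main

import (
	"fmt"
	"strings"
)

// simplify approximates the documented behavior: hosts with three or four
// labels are collapsed to *.<second-level domain>, and duplicates are dropped.
// This is an illustrative sketch, not the module's actual implementation.
func simplify(domains []string) []string {
	seen := map[string]bool{}
	var out []string
	for _, d := range domains {
		labels := strings.Split(d, ".")
		if n := len(labels); n == 3 || n == 4 {
			d = "*." + strings.Join(labels[n-2:], ".")
		}
		if !seen[d] {
			seen[d] = true
			out = append(out, d)
		}
	}
	return out
}

func main() {
	domains := []string{"example.com", "cdn.example.com", "img.cdn.example.com", "static.example.net"}
	fmt.Println(simplify(domains))
	// Prints: [example.com *.example.com *.example.net]
}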
