emailscraper

package module v1.1.5
Published: Mar 22, 2022 License: MIT Imports: 16 Imported by: 0

README


Minimalistic library to scrape emails from websites.

Requires chromium or google-chrome to be available in the environment for JavaScript rendering.

Installation

go get github.com/alixleger/emailscraper

Usage

package main

import (
	"fmt"
	
	"github.com/alixleger/emailscraper"
)

func main() {
	s := emailscraper.New(emailscraper.DefaultConfig())

	extractedEmails, err := s.Scrape("https://lawzava.com")
	if err != nil {
		panic(err)
	}
	
	fmt.Println(extractedEmails)
}

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Config

type Config struct {
	MaxDepth int
	Timeout  int

	Recursively         bool
	Async               bool
	EnableJavascript    bool
	FollowExternalLinks bool
	Debug               bool
}

Config for the scraper.

func DefaultConfig

func DefaultConfig() Config

DefaultConfig returns a Config with sane defaults for most use cases.
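A common pattern is to start from the default configuration and override individual fields before constructing the scraper. The snippet below redeclares the Config struct shown above locally so it compiles standalone; with the library installed you would write `cfg := emailscraper.DefaultConfig()` instead, and the zero values here are placeholders rather than the library's actual defaults:

```go
package main

import "fmt"

// Config mirrors the library's Config struct shown above,
// redeclared locally so this sketch runs without the dependency.
type Config struct {
	MaxDepth int
	Timeout  int

	Recursively         bool
	Async               bool
	EnableJavascript    bool
	FollowExternalLinks bool
	Debug               bool
}

func main() {
	// With the library: cfg := emailscraper.DefaultConfig()
	var cfg Config

	// Override individual fields, e.g. for a shallow crawl
	// that stays on the starting domain.
	cfg.MaxDepth = 1
	cfg.FollowExternalLinks = false

	fmt.Printf("%+v\n", cfg)
}
```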

type Scraper

type Scraper struct {
	// contains filtered or unexported fields
}

Scraper is the scraping entity, configured via Config.

func New

func New(cfg Config) *Scraper

New creates a new Scraper with the given configuration.

func (*Scraper) Scrape

func (s *Scraper) Scrape(url string) ([]string, error)

Scrape crawls the given URL and returns the email addresses it finds.
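Scrape returns a flat []string, and when crawling recursively the same address can appear on several pages. A small post-processing helper to normalize, deduplicate, and sort the results is often handy; this is a generic sketch, not part of the emailscraper API:

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// dedupeEmails lowercases, trims, deduplicates, and sorts a slice
// of scraped email addresses.
func dedupeEmails(emails []string) []string {
	seen := make(map[string]struct{}, len(emails))
	out := make([]string, 0, len(emails))
	for _, e := range emails {
		e = strings.ToLower(strings.TrimSpace(e))
		if e == "" {
			continue
		}
		if _, ok := seen[e]; ok {
			continue
		}
		seen[e] = struct{}{}
		out = append(out, e)
	}
	sort.Strings(out)
	return out
}

func main() {
	// In practice this slice would come from s.Scrape(url).
	scraped := []string{"Hello@lawzava.com", "hello@lawzava.com", "team@lawzava.com"}
	fmt.Println(dedupeEmails(scraped))
	// → [hello@lawzava.com team@lawzava.com]
}
```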
