gospider
Fast web spider written in Go
Name: gospider
Category: Discovery
Publisher: trickest
Created: 6/23/2021
Container: quay.io/trickest/gospider:a4244c8
Output Type:
License: Unknown
Parameters
--raw  Turn on raw
-p  Proxy (Ex: http://127.0.0.1:8080)
--length  Turn on length
--verbose  Turn on verbose
--debug  Turn on debug mode
-u  User Agent to use (web: random web user-agent, mobi: random mobile user-agent)
--json  Enable JSON output
-K  RandomDelay is the extra randomized duration to wait, added to Delay, before creating a new request (second)
-a  Find URLs from 3rd parties (Archive.org, CommonCrawl.org, VirusTotal.com, AlienVault.com)
--cookie  Cookie to use (testA=a; testB=b)
--filter-length  Turn on length filter
--header  Header to use (use the flag multiple times to set multiple headers)
-k  Delay is the duration to wait before creating a new request to the matching domains (second)
-s  Site to crawl
--robots  Try to crawl robots.txt (default true)
-m  Request timeout (second) (default 10)
--no-redirect  Disable redirect
--whitelist-domain  Whitelist domain
--sitemap  Try to crawl sitemap.xml
--base  Disable all and only use HTML content
-t  Number of threads (run sites in parallel) (default 1)
--js  Enable linkfinder in JavaScript files (default true)
--subs  Include subdomains
--blacklist  Blacklist URL regex
-d  MaxDepth limits the recursion depth of visited URLs (set it to 0 for infinite recursion) (default 1)
-S  Site list to crawl
--whitelist  Whitelist URL regex
-r  Also include other sources' URLs (still crawl and request)
-c  The maximum number of allowed concurrent requests to the matching domains (default 5)
-w  Include subdomains crawled from 3rd parties (default is main domain only)
--burp  Load headers and cookies from a Burp raw HTTP request
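A minimal sketch of driving gospider from Go via os/exec, using a few of the flags listed above (-s, -d, -c, --subs, --json). It assumes the gospider binary is on PATH (for example, inside the quay.io/trickest/gospider container); the JSON field names ("input", "source", "type", "output") are assumptions about the --json line format, not guaranteed by this page.

package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"log"
	"os/exec"
)

// result models one line of --json output; field names are assumed.
type result struct {
	Input  string `json:"input"`
	Source string `json:"source"`
	Type   string `json:"type"`
	Output string `json:"output"`
}

func main() {
	// Crawl a single site two levels deep with five concurrent requests
	// per domain, include subdomains, and emit JSON lines on stdout.
	cmd := exec.Command("gospider",
		"-s", "https://example.com", // Site to crawl
		"-d", "2", // MaxDepth
		"-c", "5", // Max concurrent requests to the matching domains
		"--subs", // Include subdomains
		"--json", // Enable JSON output
	)

	stdout, err := cmd.StdoutPipe()
	if err != nil {
		log.Fatal(err)
	}
	if err := cmd.Start(); err != nil {
		log.Fatal(err)
	}

	// Read results line by line; skip lines that are not valid JSON.
	sc := bufio.NewScanner(stdout)
	for sc.Scan() {
		var r result
		if err := json.Unmarshal(sc.Bytes(), &r); err != nil {
			continue
		}
		fmt.Println(r.Output)
	}
	if err := cmd.Wait(); err != nil {
		log.Fatal(err)
	}
}

The same flags can of course be passed directly on the command line; wrapping the tool in code is only useful when its output feeds another step in a pipeline.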