Details

Category: Discovery

Publisher: trickest

Created Date: 6/23/2021

Container: quay.io/trickest/gospider:a4244c8

Source URL: https://github.com/jaeles-project/gospider

Parameters

raw
boolean
Command: --raw - Enable raw output
proxy
string
Command: -p - Proxy (Ex: http://127.0.0.1:8080)
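
Example: route the crawl through a local intercepting proxy (target URL is a placeholder):
# Crawl via a proxy listening on 127.0.0.1:8080
gospider -s "https://example.com/" -p "http://127.0.0.1:8080"
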
length
boolean
Command: --length - Include response length in the output
verbose
boolean
Command: --verbose - Enable verbose output
debug-mode
boolean
Command: --debug - Turn on debug mode
user-agent
string
Command: -u - User Agent to use (web: random web user-agent, mobi: random mobile user-agent)
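
Example: crawl with a randomized user agent (target URL is a placeholder):
# "mobi" picks a random mobile user agent; "web" picks a random web one
gospider -s "https://example.com/" -u mobi
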
json-output
boolean
Command: --json - Enable JSON output
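
Example: combine the output toggles above (target URL is a placeholder):
# Verbose JSON output that includes response lengths
gospider -s "https://example.com/" --json --verbose --length
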
random-delay
string
Command: -K - Extra randomized delay in seconds, added to the base delay before creating a new request
3rd-party-url
boolean
Command: -a - Find URLs from 3rd party (Archive.org, CommonCrawl.org, VirusTotal.com, AlienVault.com)
cookie-to-use
string
Command: --cookie - Cookie to use (e.g., testA=a; testB=b)
filter-length
boolean
Command: --filter-length - Enable the response length filter
header-to-use
string
Command: --header - Header to use (use the flag multiple times to set multiple headers)
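
Example: an authenticated crawl using the cookie and header flags (cookie and token values are placeholders):
# Repeat --header to send more than one custom header
gospider -s "https://example.com/" --cookie "session=abc123" --header "Authorization: Bearer TOKEN" --header "X-Custom: 1"
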
request-delay
string
Command: -k - Delay in seconds to wait before creating a new request to the matching domains
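
Example: the base delay (-k) and random extra delay (-K) combine to throttle requests (values illustrative):
# Wait 2 seconds between requests, plus up to 3 seconds of random extra delay
gospider -s "https://example.com/" -k 2 -K 3
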
site-to-crawl
string
required
Command: -s - Site to crawl
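
Example: minimal invocation with only the required site flag (target URL is a placeholder):
# Crawl with defaults: depth 1, 1 thread, 5 concurrent requests, 10s timeout
gospider -s "https://example.com/"
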
crawl-robot-txt
boolean
Command: --robots - Try to crawl robots.txt (default true)
request-timeout
string
Command: -m - Request timeout in seconds (default 10)
disable-redirect
boolean
Command: --no-redirect - Do not follow redirects
whitelist-domain
string
Command: --whitelist-domain - Whitelist Domain
crawl-sitemap-xml
boolean
Command: --sitemap - Try to crawl sitemap.xml
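
Example: seed the crawl from robots.txt and sitemap.xml (target URL is a placeholder):
# robots.txt is crawled by default; --sitemap adds sitemap.xml as a seed
gospider -s "https://example.com/" --robots --sitemap
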
html-only-content
boolean
Command: --base - Disable all other sources and use only HTML content
number-of-threads
string
Command: -t - Number of threads (Run sites in parallel) (default 1)
enable-link-finder
boolean
Command: --js - Enable link finder in JavaScript files (default true)
include-subdomains
boolean
Command: --subs - Include subdomains
blacklist-url-regex
string
Command: --blacklist - Blacklist URL Regex
max-recursion-depth
string
Command: -d - Maximum recursion depth of visited URLs; set to 0 for infinite recursion (default 1)
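
Example: deepen the crawl beyond the default depth of 1 (value illustrative):
# Follow links up to three levels deep; -d 0 removes the limit
gospider -s "https://example.com/" -d 3
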
sites-list-to-crawl
file
required
Command: -S - Site list to crawl
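
Example: crawl a list of sites, assuming a plain-text file with one URL per line (sites.txt is a placeholder name):
# Crawl every site listed in sites.txt
gospider -S sites.txt
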
whitelist-url-regex
string
Command: --whitelist - Whitelist URL Regex
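
Example: scope the crawl with the regex filters (patterns illustrative):
# Skip static assets; only follow URLs matching the whitelist pattern
gospider -s "https://example.com/" --blacklist "\.(jpg|png|gif|css)$" --whitelist "example\.com"
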
include-3rd-party-urls
boolean
Command: -r - Also include URLs from other sources (still crawl and request them)
number-of-concurrent-req
string
Command: -c - Maximum number of concurrent requests allowed for the matching domains (default 5)
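
Example: tune throughput with threads and per-domain concurrency (values illustrative; sites.txt is a placeholder):
# Run 5 sites in parallel, with at most 10 concurrent requests per domain
gospider -S sites.txt -t 5 -c 10
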
include-3rd-party-subdomains
boolean
Command: -w - Include subdomains crawled from 3rd-party sources (default: main domain only)
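
Example: combine the third-party source flags for wider coverage (target URL is a placeholder):
# Pull URLs from the 3rd-party sources (-a), include their subdomains (-w), and crawl those URLs too (-r)
gospider -s "https://example.com/" -a -w -r
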
load-headers-and-strings-from-burp
string
Command: --burp - Load headers and cookies from a raw Burp HTTP request
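
Example: reuse a session captured in Burp Suite, assuming request.txt holds the raw HTTP request copied from Burp (placeholder filename):
# Headers and cookies are read from the saved raw request
gospider -s "https://example.com/" --burp request.txt
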