Name: gospider
Category: Discovery
Publisher: trickest
Created: 6/23/2021
Container: quay.io/trickest/gospider:a4244c8
Output Type:
License: Unknown
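For context, a single-site crawl using the published container might look like the sketch below. The target URL is a placeholder, and the flags are taken from the parameter table that follows:

```shell
# Minimal sketch: crawl one site to depth 2 with 5 parallel threads,
# emitting JSON output. "https://example.com" is a placeholder target.
docker run --rm quay.io/trickest/gospider:a4244c8 \
  -s "https://example.com" -d 2 -t 5 --json
```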

Parameters

raw
boolean
--raw: Turn on raw
proxy
string
-p: Proxy (Ex: http://127.0.0.1:8080)
length
boolean
--length: Turn on length
verbose
boolean
--verbose: Turn on verbose
debug-mode
boolean
--debug: Turn on debug mode
user-agent
string
-u: User-Agent to use (web: random web user-agent, mobi: random mobile user-agent)
json-output
boolean
--json: Enable JSON output
random-delay
string
-K: RandomDelay is the extra randomized duration to wait, added to Delay, before creating a new request (seconds)
3rd-party-url
boolean
-a: Find URLs from 3rd party (Archive.org, CommonCrawl.org, VirusTotal.com, AlienVault.com)
cookie
string
--cookie: Cookie to use (testA=a; testB=b)
filter-length
boolean
--filter-length: Turn on length filter
header-to-use
string
--header: Header to use (use the flag multiple times to set multiple headers)
request-delay
string
-k: Delay is the duration to wait before creating a new request to the matching domains (seconds)
site-to-crawl
string
required
-s: Site to crawl
crawl-robot-txt
boolean
--robots: Try to crawl robots.txt (default true)
request-timeout
string
-m: Request timeout (seconds) (default 10)
disable-redirect
boolean
--no-redirect: Disable redirect
whitelist-domain
string
--whitelist-domain: Whitelist Domain
crawl-sitemap-xml
boolean
--sitemap: Try to crawl sitemap.xml
html-only-content
boolean
--base: Disable all other sources and use only HTML content
number-of-threads
string
-t: Number of threads (run sites in parallel) (default 1)
--js: Enable linkfinder in JavaScript files (default true)
include-subdomains
boolean
--subs: Include subdomains
blacklist-url-regex
string
--blacklist: Blacklist URL Regex
max-recursion-depth
string
-d: MaxDepth limits the recursion depth of visited URLs (set it to 0 for infinite recursion) (default 1)
sites-list-to-crawl
file
required
-S: Site list to crawl
whitelist-url-regex
string
--whitelist: Whitelist URL Regex
include-3rd-party-urls
boolean
-r: Also include URLs from other sources (still crawl and request them)
number-of-concurent-req
string
-c: The maximum number of allowed concurrent requests to the matching domains (default 5)
include-3rd-party-subdomains
boolean
-w: Include subdomains crawled from 3rd-party sources (default: main domain only)
load-headers-and-strings-from-burp
string
--burp: Load headers and cookies from a Burp raw HTTP request
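Putting several of the flags above together, a broader crawl might combine 3rd-party URL sources, subdomain inclusion, and request throttling. This is a hypothetical sketch; the target URL, proxy address, and file name are placeholders:

```shell
# Single site: pull extra URLs from Archive.org/CommonCrawl/VirusTotal/
# AlienVault (-a), include subdomains (--subs), throttle to 2 concurrent
# requests per domain (-c) with a 1s base delay (-k) plus up to 2s of
# random extra delay (-K), and route traffic through a local proxy (-p).
gospider -s "https://example.com" -a --subs --json \
  -c 2 -k 1 -K 2 -p "http://127.0.0.1:8080"

# Multiple sites: read targets from a file (one URL per line, -S),
# run 3 sites in parallel (-t), with a 15s request timeout (-m).
gospider -S sites.txt -t 3 -m 15 --json
```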