Details

Category: Discovery

Publisher: trickest-mhmdiaa

Created Date: 12/26/2022

Container: quay.io/trickest/crawlergo:v0.4.4-patch-2

Source URL: https://github.com/Qianlitp/crawlergo

Parameters

url
string
required
Command: (positional) - URLs to scan, separated by spaces (must be on the same host)
fuzz-path
boolean
Command: --fuzz-path - whether to fuzz the target with common paths. (default: false)
log-level
string
Command: --log-level - log level; options: debug, info, warn, error, fatal. (default: info)
post-data
string
Command: --post-data - POST data to send to the target; the request will use the POST method.
encode-url
boolean
Command: --encode-url - whether to encode the URL with the detected charset. (default: false)
filter-mode
string
Command: --filter-mode - filtering mode for collected requests. Allowed modes: simple, smart, or strict. (default: smart)
form-values
string
Command: --form-values - custom fill text for each form type, e.g. -fv username=crawlergo_nice -fv password=admin123
output-mode
string
Command: --output-mode - how results are printed: to the console or serialized. Allowed modes: console, json, or none. (default: console)
robots-path
boolean
Command: --robots-path - whether to resolve paths from /robots.txt. (default: false)
max-tab-count
string
Command: --max-tab-count - maximum number of tabs allowed. (default: 8)
push-pool-max
string
Command: --push-pool-max - maximum concurrency when pushing results to the proxy. (default: 10)
push-to-proxy
string
Command: --push-to-proxy - proxy address; every request in 'req_list' will be pushed to it.
request-proxy
string
Command: --request-proxy - proxy server through which all requests are sent.
custom-headers
string
Command: --custom-headers - additional headers to add to each request; the value is parsed with json.Unmarshal, so it must be a valid JSON object. (default: {"Spider-Name": "crawlergo", "User-Agent": "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.0 Safari/537.36"})
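Because the --custom-headers value goes through json.Unmarshal, malformed JSON will be rejected. A minimal sketch of validating the value before use (the header names and cookie value here are illustrative, not defaults):

```shell
# Illustrative headers; any valid JSON object of string pairs works.
CUSTOM_HEADERS='{"User-Agent": "Mozilla/5.0", "Cookie": "session=abc123"}'

# Fail fast if the JSON is malformed before handing it to crawlergo.
echo "$CUSTOM_HEADERS" | python3 -m json.tool > /dev/null && echo "headers OK"
```

The validated string can then be passed as-is to --custom-headers.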
fuzz-path-dict
file
Command: --fuzz-path-dict - dictionary file used for path fuzzing.
tab-run-timeout
string
Command: --tab-run-timeout - timeout for a single tab task. (default: 20s)
before-exit-delay
string
Command: --before-exit-delay - time to wait before the crawler exits. (default: 1s)
max-crawled-count
string
Command: --max-crawled-count - maximum number of URLs the crawler will visit in this task. (default: 200)
event-trigger-mode
string
Command: --event-trigger-mode - determines how the crawler automatically triggers events. Allowed modes: async or sync. (default: async)
form-keyword-values
string
Command: --form-keyword-values - custom fill text, fuzzy-matched by keyword, e.g. -fkv user=crawlergo_nice -fkv pass=admin123
ignore-url-keywords
string
Command: --ignore-url-keywords - crawlergo will not crawl URLs matching these keywords. (default: logout quit exit)
event-trigger-interval
string
Command: --event-trigger-interval - interval between triggering each event. (default: 100ms)
wait-dom-content-loaded-timeout
string
Command: --wait-dom-content-loaded-timeout - timeout for waiting for a page's DOM to become ready. (default: 5s)
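For reference, an invocation sketch combining several of the flags documented above. It uses only flags listed on this page; the bare binary name and the target URL are assumptions, and the actual runtime setup (e.g. the bundled Chromium in the container) may differ:

```shell
# Sketch only: binary name and target URL are placeholders.
crawlergo \
  --filter-mode smart \
  --output-mode json \
  --max-crawled-count 200 \
  --custom-headers '{"Cookie": "session=abc123"}' \
  https://example.com/
```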