cariddi
Take a list of domains, crawl URLs, and scan for endpoints, secrets, API keys, file extensions, tokens, and more…
Details
Category: Discovery
Publisher: trickest-mhmdiaa
Created Date: 2/5/2022
Container: quay.io/trickest/cariddi:v1.3.1
Source URL: https://github.com/edoardottt/cariddi
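Since the tool ships as a container image, one way to run it is via Docker. A minimal sketch, assuming Docker is available and that cariddi reads its target list from stdin; the target file name is hypothetical:

```shell
# Hypothetical target list (assumption: cariddi reads targets from stdin).
printf 'example.com\n' > targets.txt

# Run the published image against the list; skipped if Docker is not installed.
if command -v docker >/dev/null 2>&1; then
  docker run --rm -i quay.io/trickest/cariddi:v1.3.1 < targets.txt
fi
```

The `-i` flag on `docker run` keeps stdin open so the piped domain list reaches the tool inside the container.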
Parameters
-err
- Hunt for errors in websites.

-json
- Print the output as JSON to stdout.

-debug
- Print debug information while crawling.

-plain
- Print only the results.

-proxy
- Set a proxy to use (HTTP and SOCKS5 supported).

Input
- List of domains to scan.

-headers
- Use custom headers for each request, e.g. Cookie: auth=yes;;Client: type=2.

-t
- Set the timeout for requests (default 10).

-info
- Hunt for useful information in websites.

-intensive
- Crawl searching for resources matching the 2nd-level domain.

-ua
- Use a custom User-Agent.

-headersfile
- Read custom headers from an external file (same format as the -headers flag).

-s
- Hunt for secrets.

-rua
- Use a random browser user agent on every request.

-ef
- Use an external file (txt, one per line) of custom parameters for endpoint hunting.

-e
- Hunt for juicy endpoints.

-ext
- Hunt for juicy file extensions. Integer from 1 (juicy) to 7 (not juicy).

-c
- Concurrency level (default 20).

-i
- Ignore URLs containing at least one of the elements of this array.

-d
- Delay between crawling one page and the next.

-sf
- Use an external file (txt, one per line) of custom regexes for secrets hunting.

-it
- Ignore URLs containing at least one of the lines of this file.
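The flags above can be combined in a single run. A sketch, assuming a local `cariddi` binary on PATH (the invocation is guarded so it is skipped otherwise) and a hypothetical target file:

```shell
# Hypothetical target list for illustration.
printf 'example.com\nexample.org\n' > targets.txt

# Hunt for secrets (-s), juicy endpoints (-e), errors (-err), and useful
# information (-info), emitting JSON with 40 concurrent requests and a
# 15-second timeout. Skipped if cariddi is not installed.
if command -v cariddi >/dev/null 2>&1; then
  cat targets.txt | cariddi -s -e -err -info -json -c 40 -t 15 > results.json
fi
```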