Creates a runbook for each GuardDuty finding type listed on the GuardDuty docs website, using the information documented for that finding.
This project is a kick-start to generate a base set of runbooks when GuardDuty is enabled in an organization.
Runbooks will need to be customized to fit organizational incident response procedures and add contextual information.
This tool can be installed from PyPI:
pip install guardduty-runbooks
It can also be installed locally. After cloning the repository, run this in its root folder:
pip install .
The tool can then be run with optional flags:
guardduty-runbooks [--outdir outdir] [--overwrite]
If --outdir is not specified, all runbooks are written to the current working directory.
guardduty-runbooks --outdir ./my-runbook-directory
This tool can be run multiple times to create runbooks for newly introduced finding types. Run it again over the directory where your runbooks are stored and it will write only the new runbooks, unless --overwrite is specified. Overwrite is destructive and will erase any customizations made to existing runbooks.
guardduty-runbooks --outdir ./my-runbook-directory --overwrite
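Conceptually, the skip-unless-overwrite behavior works like the following Python sketch (an illustration only, not the tool's actual code; the function name and the `.md` extension are assumptions):

```python
from pathlib import Path

def write_runbook(outdir: Path, slug: str, body: str, overwrite: bool = False) -> bool:
    """Write one runbook, skipping existing files unless overwrite is requested."""
    path = outdir / f"{slug}.md"  # file extension assumed for illustration
    if path.exists() and not overwrite:
        return False  # keep any local customizations intact
    path.write_text(body)
    return True
```

Running the generator this way leaves previously customized runbooks untouched on subsequent runs.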
Runbook filenames are written using the “finding type” specified by GuardDuty.
Because finding types include non-alphanumeric characters such as `:`, `/`, `!`, and `.`, those characters are replaced with dashes (`-`) and all remaining characters are lowercased.
This makes it easy to programmatically locate runbooks from tools like Panther and Matano.
For example: CryptoCurrency:EC2/BitcoinTool.B!DNS becomes cryptocurrency-ec2-bitcointool-b-dns
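The naming rule itself is simple enough to sketch in a few lines of Python (a minimal illustration of the transformation described above, not the tool's actual implementation):

```python
import re

def finding_type_to_slug(finding_type: str) -> str:
    """Replace every non-alphanumeric character with a dash and lowercase the result."""
    return re.sub(r"[^A-Za-z0-9]", "-", finding_type).lower()

print(finding_type_to_slug("CryptoCurrency:EC2/BitcoinTool.B!DNS"))
# -> cryptocurrency-ec2-bitcointool-b-dns
```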