CloudGrappler is a purpose-built tool for querying high-fidelity, single-event detections related to well-known threat actors in popular cloud environments such as AWS and Azure.
To get the most out of CloudGrappler, we recommend using shorter time ranges when querying for results. This improves efficiency and speeds up the retrieval of information.
Install the Python dependencies:
pip3 install -r requirements.txt
To clone the cloudgrep repository locally, run the clone.sh script. Alternatively, you can manually clone cloudgrep into the same directory where CloudGrappler was cloned.
chmod +x clone.sh
./clone.sh
CloudGrappler is operated entirely through a command-line interface (CLI); its usage is reviewed below.
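The full list of supported flags can be printed with the standard help switch (assuming the CLI is argparse-based, as is typical for Python tools):
python3 main.py -h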
Define the scanning scope in the data_sources.json file based on your cloud infrastructure configuration. The following example shows a structured data_sources.json file covering both AWS and Azure environments:
Setting the source inside the queries.json file to the wildcard character (*) will run the corresponding query across both AWS and Azure environments.
{
  "AWS": [
    {
      "bucket": "cloudtrail-logs-00000000-ffffff",
      "prefix": [
        "testTrails/AWSLogs/00000000/CloudTrail/eu-east-1/2024/03/03",
        "testTrails/AWSLogs/00000000/CloudTrail/us-west-1/2024/03/04"
      ]
    },
    {
      "bucket": "aws-kosova-us-east-1-00000000"
    }
  ],
  "AZURE": [
    {
      "accountname": "logs",
      "container": [
        "cloudgrappler"
      ]
    }
  ]
}
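After editing data_sources.json, you can optionally confirm the file is still valid JSON with Python's built-in json.tool module, which pretty-prints valid JSON and reports a parse error otherwise:
python3 -m json.tool data_sources.json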
Run command:
python3 main.py
python3 main.py -p
Example output:
[+] Running GetFileDownloadUrls.*secrets_ for AWS
[+] Threat Actor: LUCR3
[+] Severity: MEDIUM
[+] Description: Review use of CloudShell. Permiso seldom witnesses use of CloudShell outside of known attackers. This however may be a part of your normal business use case.
Save the results as JSON reports with the -jo flag; they are written under a reports/json directory tree:
python3 main.py -p -jo
reports
└── json
    ├── AWS
    │   └── 2024-03-04 01:01 AM
    │       └── cloudtrail-logs-00000000-ffffff--
    │           └── testTrails/AWSLogs/00000000/CloudTrail/eu-east-1/2024/03/03
    │               └── GetFileDownloadUrls.*secrets_.json
    └── AZURE
        └── 2024-03-04 01:01 AM
            └── logs
                └── cloudgrappler
                    └── okta_key.json
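Because each run is nested under a timestamped directory, a quick way to list every report produced so far is a standard find command:
find reports/json -type f -name '*.json'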
Scan a custom time frame by passing a start and end date:
python3 main.py -p -sd 2024-02-15 -ed 2024-02-16
Run one or more specific queries, using '*' as the source to cover both AWS and Azure:
python3 main.py -q "GetFileDownloadUrls.*secret", "UpdateAccessKey" -s '*'
Run queries from a custom query file:
python3 main.py -f new_file.json
Your system will need access to the S3 bucket. For example, if you are running on your laptop, you will need to configure the AWS CLI. If you are running on an EC2 instance, an Instance Profile is likely the best choice.
If you run on an EC2 instance in the same region as the S3 bucket, with a VPC endpoint for S3, you can avoid egress charges. You can authenticate in a number of ways.
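For example, on a workstation you might set up a named profile with the AWS CLI and expose it through the standard AWS_PROFILE environment variable, which boto3-based tools typically honor (the profile name below is just a placeholder):
# "cloudgrappler" is a placeholder profile name; use any profile you like
aws configure --profile cloudgrappler
export AWS_PROFILE=cloudgrappler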
The simplest way to authenticate with Azure is to first run:
az login
This will open a browser window and prompt you to log in to Azure.
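If a browser is not available (for example, on a headless server), the Azure CLI also supports non-interactive sign-in with a service principal; the identifiers below are placeholders:
# placeholders: supply your own application ID, client secret, and tenant ID
az login --service-principal -u <app-id> -p <client-secret> --tenant <tenant-id>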