
A common challenge for technical teams (e.g. penetration testers) is the centralized deployment and pipelined execution of security tools. At some point you may have thought about customising several tools, buying their commercial licenses, and letting a number of people run those tools from AWS.

The problem is that this also means dealing with a bunch of tedious tasks: giving your team access to the EC2 instances, managing the IAM users, updating the OS to protect against privilege escalation, protecting tool licenses, and powering the EC2 instances on and off as required.

Let’s imagine that we want to define a pipeline that is executed continuously (e.g. a CI/CD pipeline). Given a range of IP addresses, it scans the UDP ports with Nmap, launches Nessus Pro to analyse the available ports for vulnerabilities and also runs ScoutSuite to evaluate an AWS account. Let’s further imagine that we want all this traffic to originate from a specific pool of AWS IP addresses, that the pipeline tools should be executed in a distributed manner and, while we’re at it, that the user gets a web interface that abstracts away all the infrastructure running underneath.

CowCloud is a serverless solution for distributing workloads in AWS that can execute these pipelines. To get started, spin up an EC2 instance, access it, install Nmap and Nessus, and register your Nessus Pro license. Then download the ec2py/template.py file from the CowCloud repository and customise it to run both tools against one target and save the output in the temporary folder `tmp_folder`.
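
As a rough sketch, a customised template.py could look something like the snippet below. The entry-point name, the folder handling and the way the target is passed in are assumptions for illustration; the real file in the repository defines its own interface, so adapt the idea rather than copying it verbatim.

```python
import os
import subprocess

# Illustrative sketch only: the real ec2py/template.py in the CowCloud
# repository defines its own entry point; the function and folder names here
# are assumptions based on the description above, not the actual interface.
TMP_FOLDER = "tmp_folder"


def run_nmap_udp(target: str) -> None:
    """Scan the most common UDP ports of a single target and store the
    reports (normal, grepable and XML) in the temporary folder.

    Note that UDP scans generally require root privileges."""
    os.makedirs(TMP_FOLDER, exist_ok=True)
    subprocess.run(
        [
            "nmap", "-sU", "--top-ports", "100",
            "-oA", os.path.join(TMP_FOLDER, "nmap_udp"),
            target,
        ],
        check=True,
    )


def run_nessus(target: str) -> None:
    """Placeholder for the Nessus Pro scan.

    Nessus is usually driven through its local REST API with your own API
    keys; that part depends on your licence and configuration, so it is left
    out of this sketch rather than guessed at."""


if __name__ == "__main__":
    example_target = "192.0.2.10"  # documentation address, replace with yours
    run_nmap_udp(example_target)
    run_nessus(example_target)
```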

Once you confirm that template.py works, create an image (AMI) of the EC2 instance and save its AMI ID.
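
If you prefer to script that step, the image can also be created with boto3, as in the hedged sketch below; the region, instance ID and image name are placeholders for the instance you prepared by hand.

```python
import boto3

# Assumptions: the region, instance ID and image name below are placeholders
# for the EC2 instance you prepared in the previous step.
ec2 = boto3.client("ec2", region_name="eu-west-1")

response = ec2.create_image(
    InstanceId="i-0123456789abcdef0",
    Name="cowcloud-nmap-nessus",
    Description="Nmap + Nessus Pro image for CowCloud workers",
    NoReboot=True,
)

# Paste this value into the AMI variable in Terraform/variables.tf later on.
print("AMI ID:", response["ImageId"])
```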

Next, clone the repository locally, open the Terraform/variables.tf file, update the AMI variable with your AMI ID, and then follow the rest of the installation steps in the repository’s Readme.md.

At the end of the CowCloud deployment, access the URL shown in the Terraform output, log into the website, and queue a new task. The queued tasks are then consumed by the ec2py tool, which runs on an EC2 instance using your AMI as the base image. The resulting output and reports are compressed, encrypted and uploaded to an S3 bucket so that the user can download the results of the Nmap and Nessus scans.
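
As a hedged example of that last step, the report archive can be fetched with boto3 once you know which bucket and key the deployment used; the bucket name and object key below are placeholders rather than values CowCloud is guaranteed to use.

```python
import boto3

# Assumptions: the bucket name and object key are placeholders; in a real
# deployment the CowCloud web interface (or the Terraform output) tells you
# where the encrypted report archive for your task ended up.
s3 = boto3.client("s3")

s3.download_file(
    Bucket="cowcloud-results-example",
    Key="results/<task-id>.zip",
    Filename="results.zip",
)

# The archive is compressed and encrypted by CowCloud, so decrypt it with the
# corresponding key before extracting the Nmap and Nessus reports.
```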

That’s all there is to it!

This solution is ideal for cases where you want to maintain an AMI with up-to-date commercial and open source tools and custom configurations for your pentests. With CowCloud, you can abstract users from the hurdles of maintaining and managing the infrastructure so that they only have to worry about the target. All they have to do is send a small amount of required information to the tools that run on the EC2 instances.

CowCloud can be used for a whole range of purposes – you may already have thought of some use cases yourself – but some of the more common ones are detailed below:

  • Baselining security testing. Use CowCloud to launch the set of tools you consider a baseline every time you do an external pentest (or participate in a bug bounty), with the traffic originating from a pool of EIPs from which the client expects to receive attacks.
  • Centralized tool access and management. Add API keys and commercial licenses to your AMI so you can provide your teams with the best and most relevant capability, while responsibly managing your licenses.
  • Distributed password cracking in AWS. Update the `instance_type` in the variables.tf file with one suitable for cracking passwords.

Check out the CowCloud tool here: https://github.com/nccgroup/cowcloud