UPDATE: Aug 2021 - We do not currently support Local Grapl; please visit our documentation to learn how to set up a Grapl instance.
To make Grapl even easier to get started with we’ve released a version that can run locally on your system! This post will take you through the process of setting up a local Grapl environment and performing a basic engagement with some demo attacker data.
To run Grapl you must have the following software installed:
Docker and docker-compose
A Python 3 environment (used to run the upload scripts below)
Grapl has been tested primarily on Linux systems, where Docker support is best. If you’re working with another OS your experience may vary. If you run into any problems please file an issue or let us know in our Slack channel.
Setting up Grapl to run locally on your system is a quick and simple process.
First, clone the Grapl repository. Then, in the directory where Grapl was cloned, run the command docker-compose up. You may see warnings in your terminal as services boot up.
Eventually the build process will reach a steady state; this shouldn't take more than a few minutes.
git clone https://github.com/insanitybit/grapl.git
TAG=v0.2.0-latest docker-compose up
Note: Make sure the TAG argument for docker-compose up is using the latest version of Grapl. To find out the current version of Grapl, please check our releases.
Uploading Your Analyzer
Next, we’ll upload a basic Analyzer (Grapl’s attack signatures), which searches for processes named "svchost" without a whitelisted parent process. We've provided a demo Analyzer in the Grapl repository. If you're interested in the code, see our Analyzer docs.
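To make the signature concrete, here is a self-contained sketch of the detection logic only. This is not the actual Analyzer code, which is written against grapl_analyzerlib (see the Analyzer docs), and the whitelist below is hypothetical.

```python
# Sketch of the demo Analyzer's logic: flag any process named "svchost"
# whose parent is not on a whitelist. The whitelist here is hypothetical;
# the real Analyzer lives in the Grapl repository.

WHITELISTED_PARENTS = {"services.exe"}  # hypothetical whitelist


def is_suspicious(process_name: str, parent_name: str) -> bool:
    """Flag processes named svchost without a whitelisted parent."""
    return "svchost" in process_name and parent_name not in WHITELISTED_PARENTS


# A legitimate svchost is normally spawned by services.exe; one spawned
# by cmd.exe is suspicious.
print(is_suspicious("svchost.exe", "services.exe"))  # False
print(is_suspicious("svchost.exe", "cmd.exe"))       # True
```

In the real Analyzer this predicate is expressed as a graph query, so the match arrives with its surrounding process tree already attached.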
To upload the Analyzer to Grapl, navigate to the root of the cloned grapl repository and run the following command:
Grapl may take a couple of minutes to start up, so if you get an error similar to "could not connect to the endpoint URL", give Grapl a few more minutes to finish provisioning.
Adding Data to Grapl
Once the Analyzer is uploaded, run the following command in the root of the cloned repository to get data into Grapl:
python3 ./etc/local_grapl/bin/upload-sysmon-logs.py --bucket_prefix=local-grapl --logfile=eventlog.xml
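For context on what that script is uploading: Sysmon event logs are XML records. A minimal sketch of pulling the process and parent-process paths out of a process-creation event (EventID 1) is below; the "Image" and "ParentImage" field names come from the standard Sysmon schema, and the record here is a simplified, namespace-free stand-in for a real event.

```python
# Sketch: extracting fields from a simplified Sysmon process-creation
# (EventID 1) record. Real Sysmon XML carries namespaces and many more
# fields; this toy record keeps only what the example needs.
import xml.etree.ElementTree as ET

EVENT = """
<Event>
  <System><EventID>1</EventID></System>
  <EventData>
    <Data Name="Image">C:\\Windows\\System32\\svchost.exe</Data>
    <Data Name="ParentImage">C:\\Windows\\System32\\cmd.exe</Data>
  </EventData>
</Event>
"""

root = ET.fromstring(EVENT)
# Map each <Data Name="..."> element to its text content.
fields = {d.get("Name"): d.text for d in root.iter("Data")}
print(fields["Image"])        # C:\Windows\System32\svchost.exe
print(fields["ParentImage"])  # C:\Windows\System32\cmd.exe
```

Grapl parses records like this into process nodes and parent/child edges, which is what the Analyzer's graph query runs against.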
Working With Grapl Data
To analyze our demo data using Grapl, open two browser windows in Google Chrome.
In the first window, navigate to Grapl's Jupyter Notebook at localhost:8888. The 'Grapl Notebook' is where we'll interact with engagements using Python.
Log in with the password "graplpassword". Once logged in you'll see a directory with files that will be used later in the tutorial.
In the other window navigate to localhost:1234 to connect to the Engagement UX. The Engagement UX displays risks in our environment. Credentials are not needed when running Grapl locally, just click the ‘submit’ button to get started.
After logging in, you’ll be redirected to the Grapl UI. The Lenses section will show one lens which associates a risk with some kind of correlation point - in this case, an asset.
To examine the graph of suspicious nodes and edges relating to our asset lens, click on the lens name, in this case ‘DESKTOP-FVSHABR0’.
After clicking the lens name, a graph will appear in the right panel. The graph has two nodes, "cmd.exe" and "svchost.exe", with an edge between them.
Click the node labeled ‘cmd.exe’ and copy the value of its node_key.
Back in the Jupyter window, open the Demo_Engagement notebook. This notebook creates a new engagement, which shows up on the ‘Lenses’ page.
In the notebook, replace "<put cmd node_key here>" with the node_key you copied, as a string.
Click the first block of code, then click the ‘Run’ button four times. A new lens will appear in the ‘Lenses’ list. This is our Engagement.
As you continue to click the ‘Run’ button in your Jupyter Notebook the graph will update with new nodes and edges that get pulled into the Engagement graph.
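Conceptually, each ‘Run’ expands the engagement by pulling the neighbors of nodes already in it into the subgraph. A toy sketch of that expansion is below; the adjacency list and the node names "dropper.exe" and "payload.exe" are made up for illustration, not real Grapl data.

```python
# Conceptual sketch of engagement expansion: each step pulls every
# neighbor of the current engagement nodes into the subgraph.
# The graph below is a toy adjacency list, not real Grapl data.
from typing import Dict, List, Set

GRAPH: Dict[str, List[str]] = {
    "cmd.exe": ["svchost.exe", "dropper.exe"],  # hypothetical neighbors
    "svchost.exe": [],
    "dropper.exe": ["payload.exe"],
    "payload.exe": [],
}


def expand(engagement: Set[str]) -> Set[str]:
    """One 'Run': add every neighbor of the current engagement nodes."""
    added = {n for node in engagement for n in GRAPH.get(node, [])}
    return engagement | added


engagement = {"cmd.exe"}
engagement = expand(engagement)  # pulls in svchost.exe and dropper.exe
engagement = expand(engagement)  # pulls in payload.exe
print(sorted(engagement))  # ['cmd.exe', 'dropper.exe', 'payload.exe', 'svchost.exe']
```

In Grapl the "neighbors" are whatever the notebook's queries pivot to (children, parent processes, connections, and so on), but the expand-until-done loop is the same idea.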
As we pivot off of the data we have, our graph expands to visually display a ‘dropper’ behavior.
There's a bit more to this attack, so go ahead and keep expanding out the Engagement to see what else the attacker did.
Check out our docs to see other ways to interact with your data.
Grapl is improving rapidly in many ways. We have recently rewritten our front-end experience from scratch, we're actively working to support more data sources, and we continue to improve our documentation.
We've hired several new engineers, who have either started or will soon start full-time with Grapl. With these additions to the team we'll be keeping up with demand and shipping improvements at a fast pace.