BoxBoat Blog

Service updates, customer stories, and tips and tricks for effective DevOps



Integrating GitHub Advanced Security with Splunk

by Scott Jameson | Wednesday, Mar 15, 2023 | GitHub Security Splunk


Did you know you can send alerts from GitHub to Splunk? You might be thinking, why would I bother? One of the main points of Splunk is to centralize your logs and alerts. If you’re busy hopping between AWS, Azure, GitHub, GitLab and more, you’ve likely got alerts and logs coming from them all, vying for your attention and precious time. Rather than manually going into each and every monitoring tool you have in your enterprise, why not simply have them send their information to one convenient location?

That is the wonder of Splunk: a specialized digital assistant checking many input sources so you can focus on more important tasks. Add to that Splunk’s customizability, which lets you filter the alerts and consolidate all that information into a dashboard. Easy to consume, at a glance. Perfect.

Here’s how to set up Splunk to work for you, using GitHub’s Advanced Security alerts as an example:

Prerequisites

  • Make sure you have already installed Splunk and have appropriate access.
  • Make sure you have appropriate access to either the GitHub Org or the GitHub repo, depending on whether your goal is to apply this to a whole org or just select repositories.
  • Make sure GitHub Advanced Security is enabled for the desired repository, org, or enterprise.
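
If you want to double-check that last point programmatically, here is a rough Python sketch against the GitHub REST API. The OWNER, REPO and GITHUB_TOKEN values are placeholders, and the security_and_analysis block is only returned when your token has sufficient (admin) access to the repo:

# Rough sketch: check whether Advanced Security is enabled on a repo.
# OWNER, REPO and the GITHUB_TOKEN environment variable are placeholders.
import os
import requests

OWNER = "my-org"                    # placeholder
REPO = "my-repo"                    # placeholder
TOKEN = os.environ["GITHUB_TOKEN"]  # token with admin access to the repo

resp = requests.get(
    f"https://api.github.com/repos/{OWNER}/{REPO}",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/vnd.github+json",
    },
)
resp.raise_for_status()

# The security_and_analysis object only appears when you have admin access.
security = resp.json().get("security_and_analysis", {})
print(security.get("advanced_security", {}).get("status", "unknown"))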

Setting up Splunk for GitHub

This section covers installing the apps needed as well as a little config so that Splunk can accept data from GitHub.

  • Head to the Splunk interface.

  • After logging in, there should be a list of apps in the left-hand pane. Click the “Find more Apps” button.

  • Search for “GitHub” and you should see results similar to the image below this text.

    splunk-github-apps.png

  • Install both: “GitHub App for Splunk” and “GitHub Audit Log Monitoring Add-on”

  • The next step is to set up the “HTTP Event Collector” in Splunk. This needs to be enabled so it can accept the information from GitHub.

  • Head to “Settings” at the very top of the screen and from that dropdown, choose “Data inputs”

    data-inputs.png

  • You should see a setting that says “HTTP Event Collector” and to the right of that there is “Add New”. Click that and it should bring you to a form like this:

http_event_collector.png

  • Follow the prompts to create the “HTTP Event Collector”

    • For the “Source Type” make sure to select “github_json”

    • Create a new index called “github” and leave the details as default

    • Copy the token at the end into a notepad or similar, and your form should look similar to the image below:

      http_event_form.png

  • Once done, return to the “Data inputs” screen → “HTTP Event Collector”. It’s usually set as disabled. That’s no good; we need to enable it, otherwise it’s useless.

  • Click on “Global Settings” just above the newly created “HTTP Event Collector” and toggle the “All Tokens” to “Enabled” and hit “Save”.

    global-settings-splunk.png

  • Next, we need to adjust a macro within the Splunk UI. Click on “Settings” → “Advanced Search” → “Search macros” and locate the macro called “github_webhooks”. Click on it, edit the index to match the name of the index you created for the “HTTP Event Collector”, then hit save. If you don’t remember the index name, there is another macro in there called “github_json” which should also reference it via index=.
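
Before heading over to GitHub, it is worth confirming the collector actually accepts events. Here is a minimal Python sketch to send a test event; SPLUNK_URL and HEC_TOKEN are placeholders for your own server and the token you just created, and verify=False is only there for lab setups with a self-signed certificate:

# Minimal sketch: post a test event to the HTTP Event Collector.
# SPLUNK_URL and HEC_TOKEN are placeholders for your own values.
import requests

SPLUNK_URL = "https://splunk.example.com:8088"  # placeholder
HEC_TOKEN = "YOUR-HEC-TOKEN"                    # the token from the previous step

resp = requests.post(
    f"{SPLUNK_URL}/services/collector/raw",
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    json={"test": "hello from the HEC setup"},
    verify=False,  # only for self-signed certs in a lab setup
)
print(resp.status_code, resp.text)  # expect 200 and a "Success" body

If that comes back with a 200, the event should land in the “github” index you created.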

GitHub Setup

Note: From this point, we will be enabling the webhook on a particular repo. You should be able to do much the same from the Org settings if your intention is to get alerts from every repo in the org. Alternatively, if you want to apply this to a select number of repos but there are too many to do manually, you can leverage the GitHub API to apply the steps below; a rough sketch of that approach is included at the end of this section.

  • Head over to GitHub and Sign in.
  • Head to your chosen repo and go to “Settings” → “Webhooks” → “Add Webhook”
  • Payload URL should be in this format: https://SPLUNK_URL:8088/services/collector/raw?token=TOKEN_FROM_PREVIOUS_STEP
  • Content type should be “application/json”
  • Secret can be left blank.
  • Choose the “Let me select the individual events” and choose the following events:
    • Code scanning alerts
    • Repositories
    • Repository vulnerability alerts
    • Secret scanning alerts
    • Security and analysis
  • Once done, hit “Add webhook”
  • Now, before we proceed, it is worth confirming the following: by default the webhook will use SSL verification. If your server is not set up for this, you need to choose “Disable” under SSL verification on the webhook (this option does not show until after you have created the webhook). Disabling it is not recommended, however.
    • As an additional point of security, in Azure or similar you can restrict the source IPs for port 8088 to GitHub’s IPs.
  • Alright, you’re all set up and ready to start pushing events to Splunk… or are you?
  • There is just one small problem: if you click edit on your webhook and choose the “Recent Deliveries” tab, you will likely see an error. If you open up the recent delivery and check the response, it says 400 Query string authorization is not enabled
  • What this means is that Splunk is not configured to accept the request with the auth token passed as a query string. So, you can either enable this in Splunk by editing its inputs.conf file, or simply switch our authentication to Basic Auth by changing the webhook payload URL to the below:
  • https://xxxxx:THETOKENFROMABOVE@SPLUNK_URL:8088/services/collector/raw
    • Don’t worry about the xxxxx - since the username does not matter, we can just put that there as we are using a token.
  • Alright, once that is changed you can just hit “redeliver” where you saw the error from earlier and it should come back with a green tick.
  • Awesome. Now, let’s go check out Splunk and see if we got anything!
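
One quick aside before we move on: as mentioned in the note at the start of this section, if you have too many repos to click through, you can create the same webhook with the GitHub API instead. Here is a rough Python sketch of that approach; ORG, REPOS, SPLUNK_URL and the token values are placeholders, the event names mirror the list above, and the payload URL uses the Basic Auth format we just switched to:

# Rough sketch: create the Splunk webhook on several repos via the GitHub API.
# ORG, REPOS, SPLUNK_URL, HEC_TOKEN and GITHUB_TOKEN are placeholders.
import os
import requests

ORG = "my-org"                      # placeholder
REPOS = ["repo-one", "repo-two"]    # placeholder list of repos
TOKEN = os.environ["GITHUB_TOKEN"]  # token with admin access to the repos
SPLUNK_URL = "splunk.example.com"   # placeholder, host only
HEC_TOKEN = "YOUR-HEC-TOKEN"        # the HEC token from earlier

# Basic Auth style payload URL, matching the workaround above.
payload_url = f"https://xxxxx:{HEC_TOKEN}@{SPLUNK_URL}:8088/services/collector/raw"

for repo in REPOS:
    resp = requests.post(
        f"https://api.github.com/repos/{ORG}/{repo}/hooks",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Accept": "application/vnd.github+json",
        },
        json={
            "name": "web",
            "active": True,
            "events": [
                "code_scanning_alert",
                "repository",
                "repository_vulnerability_alert",
                "secret_scanning_alert",
                "security_and_analysis",
            ],
            "config": {"url": payload_url, "content_type": "json"},
        },
    )
    print(repo, resp.status_code)  # 201 means the webhook was created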

Splunk UI

Now, if you haven’t triggered any alerts after setting that webhook, then you won’t have any data in Splunk. If this is an active codebase, you can just do an empty commit (git commit --allow-empty) to trigger the scans.

Alternatively, if this is just a proof-of-concept (POC) then we can just add some known bad data to trigger the scans like so:

  • Create a requirements.txt and add the following:
requests==2.1.2
flask==1.8.0
django==2.2.0
  • Create a main.py file and add this to the contents:

# Super secret example key (stored as a string so the file still parses)
AWS_KEY = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"

from flask import Flask
app = Flask(__name__)

import traceback

def do_something():
    raise Exception("I tried to do something!")

# No good: returning the stack trace to the client.
@app.route('/bad_url')
def server_bad():
    try:
        do_something()
    except Exception as e:
        return traceback.format_exc()
  • Commit and push these files, and that should be enough to trigger a bunch of alerts.
  • Now, in the Splunk UI you should see an “Advanced Security” tab in the menu bar. Click on that and then “Advanced Security Overview” in the sub-menu. You should then see something like the below:

splunk-example-dashboard.png

That's it. Now you have GitHub alerts integrated with Splunk.

What next?

Now you can edit that dashboard to show the data that you want to see by simply hitting “Edit” in the upper right. You can also export this data by hitting “Export”, also in the upper right.

  • Some panels require a search string before they will populate with data.

  • You can amend the way the data is currently presented.

    • Change the colors and graph style

    • Move the graphs around, delete ones that are not relevant

      changed_graph.png

  • There should be filter boxes already visible, so you can drill down to, say, a specific repo and only see alerts for that repo.

  • Change the alert severity filter to show only “high” and “medium”, for example.

Try it out and customize to your preference.

Thanks for checking out this blog post!