Connect Snyk to Splunk using the Splunk HTTP Event Collector and visualize your vulnerabilities in Splunk.
I recommend the following resources:
- Forward Snyk Vulnerability data to Splunk Observability Cloud (Harry Kimpel)
- Snyk webhook subscription (Harry Kimpel)
- Using Snyk Webhooks to connect Snyk to Slack with AWS Lambda (Fredrik Klasén, Eric Fernandez)
- An AWS account with access to:
- Create new roles (or use an existing one)
- Modify Lambda functions
- Modify API Gateway
- Snyk account with Organization Admin access
- Splunk Cloud account - note: not the same as Splunk Observability Cloud!
1. Splunk Setup
2. AWS Setup
3. Snyk Webhook Setup
4. Get and display code issues
To start with, we will set up and configure the Splunk HTTP Event Collector and test our connection before moving to AWS Lambda.
Our intention is to get data into Splunk Cloud via monitoring. We'll leverage the Splunk HTTP Event Collector, which is an endpoint that lets developers send application events directly to the Splunk platform via HTTP or HTTPS using a token-based authentication model.
It's a handy solution, because we can use the Splunk .NET and Java logging libraries or any standard HTTP Client that lets us send data in JavaScript Object Notation (JSON) format.
The HTTP Event Collector receives data over HTTPS on TCP port 8088 by default. We can change this port, as well as disable HTTPS.
🛠️ Implementation steps
1. Log in to your Splunk Cloud account (you receive the login information, such as the Splunk Cloud Platform URL, the username, and a temporary password, via email)

2. After a successful login, navigate to Settings in the top menu bar and select the Add Data icon!

3. In the Or get data in with the following methods section, choose Monitor
4. Among the many options, choose HTTP Event Collector
5. Give your token a name and make sure that the option Enable indexer acknowledgement is selected!

6. On the Input Settings page, the source type should be automatic, and we can allow the main index (the Splunk platform stores incoming data as events in the selected index):

7. After reviewing all the information, we're done. You should see the generated Token Value (in this setup, it is the value called SPLUNK_HEC_TOKEN on AWS Lambda):

Sending events to HEC with indexer acknowledgment active is similar to sending them with the setting off. There is one crucial difference: when you have indexer acknowledgment turned on, you must specify a channel when you send events.
The concept of a channel was introduced in HEC primarily to prevent a fast client from impeding the performance of a slow client. When you assign one channel per client, because channels are treated equally on Splunk Enterprise, one client can't affect another.
We need a unique channel identifier, which we can generate, for example, with an online GUID generator. Using a globally unique channel identifier keeps our client's communication with the collector distinct from everyone else's.
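If you prefer to generate the channel identifier locally instead of using an online generator, Python's standard library can produce a random version 4 GUID (this is just one convenient option, not a requirement of HEC):

```python
import uuid

# Generate a random (version 4) GUID to use as the
# X-Splunk-Request-Channel value when sending events to HEC.
channel_id = str(uuid.uuid4())
print(channel_id)
```

`uuidgen` on Linux/macOS or `[guid]::NewGuid()` in PowerShell work just as well.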
To test our Splunk connection, we will use Postman this time (feel free to use your own API platform to interact with Splunk).
I recommend creating a new collection in Postman and putting all the requests there.
🛠️ Test steps
Parameters of the POST request:
- URL: `https://prd-p-2mqiy.splunkcloud.com:8088/services/collector`
- Authorization type: No Auth
- Headers:
  - `Content-Type: application/json`
  - `Authorization: Splunk <your-HEC-token>`
  - `X-Splunk-Request-Channel: <your-channel-GUID>`
- Body: let's just use a short sentence as an httpevent, like:
{
"event": "Let's ping Splunk",
"sourcetype": "httpevent"
}
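If you'd rather script this check than click through Postman, here is a minimal sketch using only Python's standard library. The URL matches the setup above; the token and channel values are placeholders you must replace with your own:

```python
import json
import urllib.request

# Placeholders -- substitute your own HEC token and channel GUID.
HEC_URL = "https://prd-p-2mqiy.splunkcloud.com:8088/services/collector"
HEC_TOKEN = "<your-hec-token>"
CHANNEL = "<your-channel-guid>"

# Same body and headers as the Postman request above.
payload = json.dumps({"event": "Let's ping Splunk", "sourcetype": "httpevent"}).encode("utf-8")
request = urllib.request.Request(
    HEC_URL,
    data=payload,
    headers={
        "Authorization": f"Splunk {HEC_TOKEN}",
        "X-Splunk-Request-Channel": CHANNEL,
        "Content-Type": "application/json",
    },
    method="POST",
)

# Uncomment to actually send the event once real values are in place:
# with urllib.request.urlopen(request) as response:
#     print(response.read().decode("utf-8"))
```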
After sending the POST request, we should see in Postman:
We need to navigate to the Search & Reporting site in Splunk Cloud (right menu pane):
Start a new search: in the search field, enter source="http:<name-of-your-http-event-token>" (index="main").
Fortunately, we turned on the indexing option when setting up the HTTP Event Collector, so it's now easy to find our messages.
As we can see, Splunk successfully received our message. Now we can set up and configure our AWS Lambda function.
In this section, I'll show you how to configure AWS in order to send data towards Splunk, as well as the background of the 5 implementation steps.
Note: The AWS Lambda function and the API Gateway have to be configured in the same region.
We are going to use AWS Lambda, because it's a relatively cost-effective and efficient way to run code on events, for example when there is a new Snyk vulnerability.
To start with, we need to create an IAM role that we can assign to the AWS Lambda function. We need to provide basic execution roles and permissions to invoke an API Gateway which we'll be interacting with. If you're interested in the implementation, click below.
🛠️ Implementation steps
You will see something like:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": "logs:CreateLogGroup",
"Resource": "arn:aws:logs:us-west-2:880724394176:*"
},
{
"Effect": "Allow",
"Action": [
"logs:CreateLogStream",
"logs:PutLogEvents"
],
"Resource": [
"arn:aws:logs:us-west-2:880724394176:log-group:/aws/lambda/Splunk:*"
]
}
]
}
You can check that your policies look like these (AWS built-in policies):
//AWSLambdaBasicExecutionRole
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"logs:CreateLogGroup",
"logs:CreateLogStream",
"logs:PutLogEvents"
],
"Resource": "*"
}
]
}
//AmazonAPIGatewayInvokeFullAccess
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"execute-api:Invoke",
"execute-api:ManageConnections"
],
"Resource": "arn:aws:execute-api:*:*:*"
}
]
}
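Besides the permission policies above, the role also needs a trust policy that lets the Lambda service assume it. The Lambda console adds this automatically when you create the role there; it is shown here only for completeness, in case you create the role in IAM yourself:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```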
🧞 The fastest and most convenient way is to go to Splunk's development site and create a Lambda function using a Splunk blueprint: select the "splunk-logging" blueprint option, or click here to jump straight into AWS Lambda
Alternatively, of course, we can write our own JavaScript code as described below.
🛠️ Implementation steps
1. Go to the AWS Console
2. Navigate to Lambda
3. Click on Create function
4. Choose Node.js 16.x for the Runtime
5. Choose x86_64 for the architecture
6. Attach the previously created role ("Use an existing role") to the Lambda function (you can also create a new role, but make sure that you attach the AmazonAPIGatewayInvokeFullAccess policy to it in IAM afterwards)
7. Click on "Create function"
8. From the official Splunk Devtools site you can choose a language and find a logging script among the officially maintained scripts. You can find an official example script (not maintained by Snyk) here (last checked: 28.09.2022): the "splunk-logging.js" file. It is generated automatically when you use the official Splunk blueprint.
In order to interact with Splunk and the Splunk HTTP event collector, we need to set two environment variables in AWS Lambda:
SPLUNK_HEC_URL: URL address for your Splunk HTTP event collector endpoint.
Default port for event collector is 8088.
An example can be:
https://host.com:8088/services/collector
SPLUNK_HEC_TOKEN: Token for your Splunk HTTP event collector.
To create a new token for this Lambda function, refer to the Splunk Docs.
🛠️ Implementation steps
1. Go to Configuration in your AWS Lambda
2. Click on Environment variables
3. Add new environment variables (if you created the Lambda function on your own and didn't use the Splunk blueprint):
SPLUNK_HEC_TOKEN and SPLUNK_HEC_URL.
- We have already generated our Splunk token, which we can use now.
- When configuring the URL, we need to pay attention to the following.
In our setup, the HEC URL is going to look like:
https://prd-p-2mqiy.splunkcloud.com:8088/services/collector?channel=2b5fcd04-f37e-4484-9610-8ea31cb510ef
You might ask why we need the ?channel=2b5fcd04-f37e-4484-9610-8ea31cb510ef
part in the URL; you can find an explanation here and here.
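The Splunk blueprint script is JavaScript, but the forwarding logic itself is small. The following is a hypothetical Python sketch, not the blueprint code, illustrating what the function does with the two environment variables configured above (SPLUNK_HEC_URL, including the channel query string, and SPLUNK_HEC_TOKEN):

```python
import json
import os
import urllib.request

def build_hec_request(hec_url, hec_token, event_body):
    """Build the HTTP request that forwards an incoming event to the Splunk HEC."""
    data = json.dumps({"event": event_body, "sourcetype": "httpevent"}).encode("utf-8")
    return urllib.request.Request(
        hec_url,
        data=data,
        headers={
            "Authorization": f"Splunk {hec_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def handler(event, context):
    # SPLUNK_HEC_URL already carries the ?channel=<guid> query string.
    req = build_hec_request(
        os.environ["SPLUNK_HEC_URL"],
        os.environ["SPLUNK_HEC_TOKEN"],
        event,
    )
    with urllib.request.urlopen(req) as resp:
        return {"statusCode": resp.status}
```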
Our goal is to have the Lambda function triggered by a Snyk webhook. To do this we are going to use the API Gateway provided by AWS to trigger the Lambda every time a new event is received.
🛠️ Implementation steps
The payload we are going to receive contains a message, so we want to create a POST method that receives the message, verifies that it is valid, and then sends it onwards to the Lambda.
🛠️ Implementation steps
🛠️ Mapping template code
To the template add the following code:
{
"method": "$context.httpMethod",
"body" : $input.json('$'),
"headers": {
#foreach($param in $input.params().header.keySet())
"$param": "$util.escapeJavaScript($input.params().header.get($param))"
#if($foreach.hasNext),#end
#end
}
}
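For a request like our earlier Postman test, this template would hand the Lambda a document roughly like the following (the header set is abbreviated here; the exact headers depend on the client):

```json
{
  "method": "POST",
  "body": {
    "event": "Let's ping Splunk",
    "sourcetype": "httpevent"
  },
  "headers": {
    "Content-Type": "application/json"
  }
}
```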
With the POST method configured, we now want to deploy these changes so our Lambda can start receiving the information.
🛠️ Implementation steps
Steps to deploy the POST method:
1. Go to Resources
2. Click on POST
3. Then on Actions click on Deploy API

4. Then select the Deployment stage to deploy the new API to, in this case we can use the default stage

5. We have to navigate back to AWS Lambda
6. In the Lambda trigger configuration, you should see a new API endpoint. Copy this endpoint as we will need it when setting up the Snyk webhook

With the API endpoint saved, we can now set up the Snyk webhook.
To test our AWS Lambda function, we will again use Postman (feel free to use your own API platform to interact with Splunk and AWS Lambda).
It is really easy to test our AWS Lambda endpoint. Since we have already configured Splunk and established a connection between AWS and Splunk, if we trigger our AWS Lambda, it will also appear in Splunk.
🛠️ Test steps
As a POST request, we can send a short message to AWS.
- AWS Lambda endpoint: we have already configured it, [see instructions here](#252-deploying-the-post-method)
- Headers: Content-Type: application/json; charset=utf-8
- Body:
{
"event": "Snyk is great! Test message from Postman -> Lambda",
"sourcetype": "httpevent"
}
What we expect is the callback message from AWS Lambda:
callback("Snyk is great! Test message from AWS Lambda -> Postman", event.key1);
Let's check Splunk (Search & Reporting):
If we open the body and headers fields in the message:
To set up the Snyk webhook, we are going to use the Snyk API v1 and the built-in console of Apiary to make this request. With this request done, your connection from Snyk to Splunk will be complete, and every time there is a new vulnerability you will get a new notification!
Follow the instructions to set up a Snyk Webhook!
Let's test the connection: let's retest a project in your selected Snyk org!
If we go in Splunk to Search & Reporting >> Dashboards, we can check if we receive the new vulnerabilities (raw data and number of H severity vulnerabilities):
At the moment it only works in two steps:
- We have to pull the code issues of a given Snyk Project via the Snyk REST API. Make sure to use the right version of the API (2022-04-06~experimental):
curl --request GET "https://api.snyk.io/rest/orgs/{orgID}/issues?project_id={projID}&severity=high&type=code&version=2022-04-06%7Eexperimental" \
--header "Accept: application/vnd.api+json" \
--header "Authorization: Token {your Snyk Token}" | tee code_results.json
- Then we have to create a POST request to send the pulled data towards AWS Lambda:
curl --location --request POST 'your AWS Lambda endpoint' \
--header 'Content-Type: application/json' \
--data @code_results.json
Feel free to use the scripts rest-get-code-issues.sh and rest-get-code-issues.py
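The two curl calls above can also be combined into one small script. This is a rough sketch using only Python's standard library, with the org ID, project ID, Snyk token, and Lambda endpoint as placeholders; the rest-get-code-issues.py script mentioned above is the maintained version:

```python
import json
import urllib.request

def build_issues_request(org_id, project_id, snyk_token):
    """GET request for high-severity code issues, mirroring the curl call above."""
    url = (
        f"https://api.snyk.io/rest/orgs/{org_id}/issues"
        f"?project_id={project_id}&severity=high&type=code"
        "&version=2022-04-06%7Eexperimental"
    )
    return urllib.request.Request(url, headers={
        "Accept": "application/vnd.api+json",
        "Authorization": f"Token {snyk_token}",
    })

def forward_to_lambda(lambda_endpoint, issues):
    """POST the pulled issues on to the AWS Lambda endpoint."""
    req = urllib.request.Request(
        lambda_endpoint,
        data=json.dumps(issues).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example wiring (placeholders -- substitute real values before running):
# issues = json.load(urllib.request.urlopen(build_issues_request(ORG_ID, PROJ_ID, SNYK_TOKEN)))
# forward_to_lambda(LAMBDA_ENDPOINT, issues)
```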