# Splunk integration

## Introduction

This article is meant to be used in conjunction with the [Splunk Integration Guide for Vectra AI](https://docs.vectra.ai/configuration/response/siem/splunk-siem-vectra-integration-guide-start-here-for-qux) (for the Quadrant UX) or the [Splunk Integration Guide for Vectra XDR](https://docs.vectra.ai/configuration/response/siem/splunk-siem-vectra-integration-guide-start-here-for-rux) (for the Respond UX). If you are unsure of which UX you are using, please see [Vectra Analyst User Experiences (Respond vs Quadrant)](https://docs.vectra.ai/deployment/getting-started/analyst-ux-options-rux-vs-qux). Either article will provide the following:

* An overview of Vectra's data sources that can feed Splunk.
* A listing of Vectra's Splunk Add-ons and Apps.
  * Links are provided to the Add-ons and Apps in Splunkbase.
* An installation matrix showing what needs to be installed where in your Splunk environment.
* Common prerequisites for any Vectra / Splunk integration.

As of Vectra software versions 8.1 and above, Vectra supports the use of Splunk's HTTP Event Collector (HEC) as the recommended publisher option for Stream metadata transport to Splunk.

## Overview of Steps

{% stepper %}
{% step %}
[**Read integration guide for RUX or QUX**](#introduction)

Links are in the [introduction](#introduction). Keep it handy for links to Splunk apps and for guidance to go along with this article.
{% endstep %}

{% step %}
[**Create a Splunk index**](#id-2.-create-a-splunk-index-for-vectra-stream-metadata)

This index will be used for Vectra Stream metadata; follow the instructions below.
{% endstep %}

{% step %}
[**Install the Add-on using the instructions below**](#id-3.-stream-add-on-installation)
{% endstep %}

{% step %}
**Create Splunk Data input**

Choose option [4A](#id-4a-configure-a-splunk-hec-data-input-token) or [4B](#id-4b-install-syslog-server-universal-forwarder-network-data-input):

[4A](#id-4a-configure-a-splunk-hec-data-input-token) - Configure a Splunk HEC Data Input / Token (recommended in Vectra v8.1 and higher)

[4B](#id-4b-install-syslog-server-universal-forwarder-network-data-input) - Install Syslog Server / Universal Forwarder / Network Data input.
{% endstep %}

{% step %}
[**Install the App using the instructions below**](#id-5.-stream-app-installation)
{% endstep %}

{% step %}
[**Configure Stream to send metadata to Splunk using either**](#id-6.-configuring-vectra-stream-to-send-metadata-to-splunk)

* Publisher of "Splunk HEC" with the Splunk HEC URI Format matching "Splunk Enterprise" or "Splunk Cloud Platform", pointing to the server IP/hostname and port number of your choice where Splunk will be listening. You will also need to input an HEC token from your Splunk deployment. This is the recommended method in Vectra software versions 8.1 and above.
* Publisher of "Raw JSON" using Protocol of "TCP" to the server IP/Hostname and port number of your choice where Splunk will be listening.
{% endstep %}
{% endstepper %}

## 2. Create a Splunk Index for Vectra Stream Metadata

* Create a new index in Splunk; you can select "Search & Reporting" as the App.
* Configure other options as desired.

## 3. Stream Add-on Installation

* The Add-on must be installed on Search Heads.
* For standalone deployments:
  * Install the Add-on
* For distributed deployments:
  * If data is collected through Intermediate Heavy Forwarders, the Add-on must be installed on Heavy Forwarders and should not be installed on indexers.
  * If you do not have Heavy Forwarders, the Add-on must be installed on the indexers.
* The Add-on expects an initial source type named `vectra:stream:json`; its transform rules then convert events into more specific source types (see the source type list below).
* **Please Note!! The Add-on is installed by default with Global permissions.**

### Source types list

The initial source type must be set to `vectra:stream:json`; a set of transform rules included in the Add-on then rewrites the source type based on the type of event received:

* vectra\_isession
* vectra\_ssl
* vectra\_x509
* vectra\_dns
* vectra\_beacon
* vectra\_http
* vectra\_dhcp
* vectra\_radius
* vectra\_smbfiles
* vectra\_smbmapping
* vectra\_kerberos
* vectra\_ntlm
* vectra\_dcerpc
* vectra\_ldap
* vectra\_ssh
* vectra\_smtp
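Under the hood, this renaming uses Splunk's standard props/transforms mechanism. The sketch below is purely illustrative (it is not the Add-on's actual shipped configuration, and the `metadata_type` field name is an assumption); it shows how one such rewrite rule could be expressed:

```ini
# props.conf -- apply a transform to events arriving with the initial source type
[vectra:stream:json]
TRANSFORMS-vectra_sourcetype = vectra_set_dns

# transforms.conf -- rewrite the source type when the event is DNS metadata
[vectra_set_dns]
REGEX = "metadata_type"\s*:\s*"dns"
FORMAT = sourcetype::vectra_dns
DEST_KEY = MetaData:Sourcetype
```

The real Add-on ships one rule per metadata type; you should not need to create or edit any of these yourself.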

{% hint style="info" %}
**Tip:**

To validate that the Add-on is working as expected, filter on the index where the Vectra Stream metadata events are located and look at the source types list. It must contain a subset of the above list. If only `vectra:stream:json` is shown, the Add-on is not working as expected.
{% endhint %}
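A quick way to run that check, assuming your Stream metadata index is named `vectra_stream` (substitute your own index name):

```
index=vectra_stream | stats count by sourcetype
```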

### Supported CIM Data models

The following data models are supported:

* Network Traffic (isession metadata)
* Network Resolution (dns metadata)
* Email (smtp metadata)
* Network Sessions (dhcp metadata)
* Web (http metadata)

More information on [CIM Data models](https://docs.splunk.com/Documentation/CIM/5.0.1/User/Overview).
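Once the CIM mapping is in place, data model searches apply to the Vectra metadata. For example, a `tstats` query over the Network Resolution data model (field names follow Splunk's CIM; the `vectra_dns` filter assumes the Add-on's transforms are working) might look like:

```
| tstats count from datamodel=Network_Resolution where sourcetype=vectra_dns by DNS.src, DNS.query
```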

## 4A - Configure a Splunk HEC Data Input / Token

{% hint style="info" %}
This is the recommended option in Vectra software versions 8.1 and higher.
{% endhint %}

Details below are for Splunk Enterprise. If using Splunk Cloud the procedure is similar. Please see <https://docs.splunk.com/Documentation/Splunk/9.1.2/Data/UsetheHTTPEventCollector> for details from Splunk.

* Navigate in Splunk to *Settings > Data Inputs > HTTP Event Collector.*
* Click on "Global Settings" in the top right.
  * Ensure that the "Enable SSL" checkbox is checked to allow Vectra to communicate with your Splunk instance and save your settings.
* Click on "New Token" in the top right and step through the process of creating a new HEC token.
  * On the 1st screen:
    * Give your token a name.
    * DO NOT enable "Enable Indexer Acknowledgement"; it is not supported, and if selected, data will not flow from Vectra to Splunk.
  * On the 2nd screen:
    * Select a source type of `vectra:stream:json`.
    * Select the "Allowed Index" you previously created for Stream metadata.
  * On the 3rd screen:
    * Review your configuration and then click "Submit".
  * On the 4th screen:
    * Copy your token for later input to Vectra.

After completing the above steps, you should have a new token with a configuration similar to this (click on the token name to compare your token to this sample):

![](https://4227135129-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHJ1ltuWFvsArFWtevnRn%2Fuploads%2Fgit-blob-f75532118bec61b147e40eca44b1f15eaacab808%2F69caf46617bec49af1b58347c096c4af3becf321fc7ccd4e2f7c2f350016370b.jpg?alt=media)
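Before pointing Vectra at the token, it can help to confirm what an HEC request for this integration looks like. The sketch below (Python standard library only; the URL, token, and index name are placeholder assumptions you must replace) builds the headers and JSON body an HEC client would POST to Splunk's `/services/collector/event` endpoint. The same request can be sent with `curl`, as described in Splunk's HEC documentation linked above.

```python
import json

# Placeholder values -- substitute your own Splunk host, port (8088 is the
# HEC default), HEC token, and the index you created for Stream metadata.
SPLUNK_HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def build_hec_request(event):
    """Build the headers and JSON body an HEC client would POST to Splunk."""
    headers = {
        "Authorization": "Splunk " + HEC_TOKEN,  # HEC auth header format
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "event": event,
        "sourcetype": "vectra:stream:json",  # initial source type the Add-on expects
        "index": "vectra_stream",            # the "Allowed Index" chosen for the token
    })
    return headers, body

headers, body = build_hec_request({"message": "HEC connectivity test"})
print(body)
```

If a test event posted this way does not appear in your index, re-check the token, the "Enable SSL" setting, and that indexer acknowledgement is disabled.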

## 4B - Install Syslog Server / Universal Forwarder / Network Data Input

{% hint style="warning" %}
**Only do this if using this type of Publisher in Vectra for Stream metadata. As per the "Overview of Steps" above, Splunk HEC is the recommended Publisher option for Splunk integration in v8.1 and higher of Vectra software. A syslog server must be installed to receive logs from Vectra Detect (outside of the scope of this guide).**
{% endhint %}

* Install the [Splunk Universal Forwarder](https://www.splunk.com/en_us/download/universal-forwarder.html) on the syslog server.
* Configure Splunk inputs.conf and Network Input (see example below):

```ini
## inputs.conf example. The stanza should be copied e.g. to $SPLUNK_HOME/etc/system/local/inputs.conf

# Example input when syslog-ng/rsyslog is used

[monitor:///var/log/vectra/*.log*]
index = <your destination index goes here>
sourcetype = vectra:stream:json
```

### Creating a Splunk Network Input

1. On the Splunk dashboard, click Settings > Data Inputs
2. Click “Add new” under TCP or UDP\
   a. Select TCP or UDP\
   b. Add a port number\
   c. As *Source type*, select `vectra:stream:json` from the list\
   d. As *Index*, create a new one\
   e. Fill *Index Name*, e.g. `vectra_stream`
3. Save
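If you prefer to manage configuration as files, the UI steps above correspond to an inputs.conf stanza like the following (the port number is an arbitrary example; use whichever port you configured Vectra to send to):

```ini
# TCP network input equivalent to the UI steps above (example port)
[tcp://6515]
index = vectra_stream
sourcetype = vectra:stream:json
```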

## 5. Stream App Installation

The App must be installed on Search Heads:

* On the main Splunk dashboard, click the "+ Find More Apps" sign to open the app browser (or Manage > Find more Apps).
* Search the app store for Vectra Cognito Stream.
  * Click Install.
* Return to the main dashboard.

Once the installation is completed, the configuration of two macros needs to be updated:

* Navigate to *Settings > Advanced search.*
* Click on *Search macros.*
* In the App dropdown list, select Vectra Cognito Stream.
* A macro named `cognito_stream_index` must be listed.
  * Click on the name to edit it.
* Update the definition to match the name of the index where **Vectra Stream** events are located.
* A macro named `vectra_cognito_index` must be listed.
  * Click on the name to edit it.
* Update the definition to match the name of the index where **Vectra NDR/Detect** events are located.
* Save.
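To sanity-check a macro from the search bar once data is flowing, you can expand it directly in a search; for example (using the `cognito_stream_index` macro named above):

```
`cognito_stream_index` sourcetype=vectra_dns | head 10
```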

**Please Note!! 99% of the visualizations in the App use the Vectra Stream data source. Only 3 visualizations (on the Host View dashboard) use data from Vectra NDR/Detect.**

## 6. Configuring Vectra Stream to Send Metadata to Splunk

### Option 4A Example - Configure a Splunk HEC Data Input / Token

In your Vectra UI, navigate to *Settings > Stream* and set your Splunk HEC Publisher options similarly to this example:

![](https://4227135129-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHJ1ltuWFvsArFWtevnRn%2Fuploads%2Fgit-blob-f157f2fe957e8df0d8556471e0fb4e861cc0017f%2F869d7baa92028bfe9cc1fdf735ddcbee73707a1cdede680299f17d3098b11dea.jpg?alt=media)

### Option 4B Example - Install Syslog Server / Universal Forwarder / Network Data Input

In your Vectra UI, navigate to *Settings > Stream* and set your destination as shown below, using whatever port you have configured your Splunk deployment to listen on:

<figure><img src="https://4227135129-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHJ1ltuWFvsArFWtevnRn%2Fuploads%2Fgit-blob-23d4c3ca2306a856150f696f874944fd284b34af%2F4e299bd79b3e6e1c1f89a527af06bad1d016a27b015a4b1574ee101e20fde312.jpg?alt=media" alt=""><figcaption></figcaption></figure>
