Elastic, Syslog, and Kafka

Details for Elastic, Syslog, and Kafka Stream publishers.

Elastic

Vectra Stream creates new indices in your Elasticsearch cluster so that it does not conflict with existing indices. It creates daily indices for each metadata stream, making it easy to delete old indices from Elasticsearch based on your retention needs. The index names are as follows, where the date is in the format year.month.day (e.g. 2018.02.19).

| Metadata Type | Index Name |
| --- | --- |
| DCE/RPC | metadata_dcerpc-&lt;date&gt; |
| DHCP | metadata_dhcp-&lt;date&gt; |
| DNS | metadata_dns-&lt;date&gt; |
| HTTP | metadata_httpsessioninfo-&lt;date&gt; |
| iSession | metadata_isession-&lt;date&gt; |
| Kerberos | metadata_kerberos_txn-&lt;date&gt; |
| LDAP | metadata_ldap-&lt;date&gt; |
| NTLM | metadata_ntlm-&lt;date&gt; |
| RDP | metadata_rdp-&lt;date&gt; |
| SMB Files | metadata_smbfiles-&lt;date&gt; |
| SMB Mapping | metadata_smbmapping-&lt;date&gt; |
| SSL/TLS | metadata_ssl-&lt;date&gt; |
| X509 | metadata_x509-&lt;date&gt; |
| Beacon | metadata_beacon-&lt;date&gt; |
| SSH | metadata_ssh-&lt;date&gt; |
| SMTP | metadata_smtp-&lt;date&gt; |

It is important that you enable automatic index creation in your Elasticsearch cluster; otherwise Elasticsearch will fail to create the new indices as it receives the metadata. You can enable automatic index creation by modifying the action.auto_create_index setting in Elasticsearch. This can be enabled globally or restricted to a specific index pattern, as shown below:

"action.auto_create_index": "+metadata_*"
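One way to apply this setting is a cluster settings update via the REST API (for example, from Kibana Dev Tools). The `persistent` scope used here is a choice, not a requirement; `transient` also works but does not survive a full cluster restart:

```json
PUT _cluster/settings
{
  "persistent": {
    "action.auto_create_index": "+metadata_*"
  }
}
```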

Once automatic index creation is enabled, you can create index patterns in the Kibana UI if you are accessing the data through Kibana.

Example Elasticsearch document
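A document in one of these indices is a flat JSON object containing the attributes of that metadata type. The shape below is a hypothetical illustration for a metadata_dns document; the field names and values are placeholders, not the actual schema:

```json
{
  "timestamp": "2018-02-19T10:00:00Z",
  "orig_ip": "10.1.2.3",
  "resp_ip": "10.4.5.6",
  "query": "example.com",
  "answers": ["93.184.216.34"]
}
```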

Syslog

Vectra Stream sends metadata over syslog (TCP or SSL). On the receiver (e.g. Splunk), create a syslog input on a specific port and protocol (TCP or SSL). Once it is created, enter the IP address and port of the syslog receiver in the Cognito Stream settings in the UI.
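On the receiving side, each syslog line carries the metadata as a JSON payload after the syslog header. A minimal parsing sketch is shown below; the sample line and the assumption that the payload is a single JSON object appended to the header are illustrative, so verify the framing against your receiver's actual output:

```python
import json

def parse_stream_line(line):
    """Split a syslog-framed line into its header and JSON payload.

    Assumes the payload is a JSON object appended after the syslog
    header (hypothetical framing; check against real Stream output).
    """
    idx = line.find("{")
    if idx == -1:
        raise ValueError("no JSON payload found")
    return line[:idx].rstrip(), json.loads(line[idx:])

# Hypothetical sample line for illustration only.
header, doc = parse_stream_line(
    '<14>Feb 19 10:00:00 sensor vectra_stream: '
    '{"metadata_type": "dns", "query": "example.com"}'
)
```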

Example Syslog Output

Kafka

Vectra Stream publishes each metadata type to its own topic on the Kafka broker. Each message contains key-value pairs for all the attributes of that metadata type. Users can optionally specify a user-defined string (max 32 characters) that is prepended to all Kafka topic names. This allows for easy identification of topics downstream.
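As a sketch of how the optional prefix composes with the fixed topic names (direct concatenation is an assumption here; confirm the joining convention the appliance actually uses):

```python
def prefixed_topic(prefix, topic):
    """Prepend an optional user-defined string to a Stream topic name.

    The 32-character limit comes from the Stream UI; plain
    concatenation is an assumption about how the prefix is applied.
    """
    if len(prefix) > 32:
        raise ValueError("prefix must be at most 32 characters")
    return prefix + topic

print(prefixed_topic("siteA_", "metadata_dns"))  # siteA_metadata_dns
```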

The following are the names of the topics per metadata type:

| Metadata Type | Topic Name |
| --- | --- |
| DCE/RPC | metadata_dcerpc |
| DHCP | metadata_dhcp |
| DNS | metadata_dns |
| HTTP | metadata_httpsessioninfo |
| iSession | metadata_isession |
| Kerberos | metadata_kerberos_txn |
| LDAP | metadata_ldap |
| NTLM | metadata_ntlm |
| RDP | metadata_rdp |
| SMB Files | metadata_smbfiles |
| SMB Mapping | metadata_smbmapping |
| SSL/TLS | metadata_ssl |
| X509 | metadata_x509 |
| Beacon | metadata_beacon |
| SSH | metadata_ssh |
| SMTP | metadata_smtp |


You must either have automatic topic creation enabled in Kafka, or manually create these topics before enabling forwarding to Kafka. To enable automatic topic creation, set the auto.create.topics.enable property to true in the Kafka broker configuration file.
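For example, the broker-side setting lives in the Kafka properties file (typically server.properties on each broker; the file path varies by installation):

```properties
# server.properties on each Kafka broker
auto.create.topics.enable=true
```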

Example Kafka Message

The Kafka publisher supports SASL authentication and TCP or SSL transport.
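As a sketch of the matching consumer-side settings, using the kafka-python client: the broker address, credentials, and SASL mechanism below are placeholders, so match them to how your broker is actually configured:

```python
# Hypothetical broker address and credentials -- replace with your own.
consumer_config = {
    "bootstrap_servers": "broker.example.com:9093",
    "security_protocol": "SASL_SSL",  # or SSL / PLAINTEXT, per your broker
    "sasl_mechanism": "PLAIN",
    "sasl_plain_username": "stream",
    "sasl_plain_password": "changeme",
}

# With kafka-python installed, the config unpacks directly:
#   from kafka import KafkaConsumer
#   consumer = KafkaConsumer("metadata_dns", **consumer_config)
```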
