
Forwarding to Target Destinations

From Ascent, data can be filtered and forwarded to many different monitoring tools, resource management tools, and custom endpoints.

Here is our complete list of available forwarders.

The process of configuring a forwarder is simple: navigate to the Forwarders page and select your preferred forwarder.

Configuring a Forwarder (Example)

Below is an example of configuring a Splunk HTTP Event Collector forwarder:

  1. Creating an HTTP Event Collector Data Input key from Splunk

    • Navigate to your Splunk Environment

    • Locate the Settings menu

    • Locate the Data Inputs sub-menu

    • Click on the New Token option, which is located on the top banner

    • Enter a token name, skip to the last page, and click Done

    • Use the generated HTTP Event Collector key in Ascent

  2. Creating a Splunk HTTP Event Collector on Apica

    • Navigate to the Create Forwarders page

    • Click on Forwarders

    • Click on the Splunk HTTP Event Collector

Create a Splunk Forwarder
  • Fill in the fields below and click Create:

    • buffer_size: The Buffer size for logs

    • host: Splunk Endpoint

    • password: Data Input Key created in step 1

    • port: Splunk server receiving port (default 8088)

    • type: log format (default _json, or set to _metric to send to a metric index)

    • user: UI username of Splunk Endpoint

    • name: Name of the forwarder

That's it. You've successfully created the Splunk HTTP Event Collector forwarder. You can now navigate to the Explore page and run Mapping or Replay operations.
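If you want to confirm that the host, port, and Data Input key you entered are valid, you can post a test event directly to Splunk's HTTP Event Collector, independently of Ascent. The sketch below is illustrative only; the host name and token are placeholders, and it assumes the default HEC endpoint /services/collector/event on port 8088.

```python
import requests

# Values you also enter in the Ascent forwarder form.
# SPLUNK_HOST and HEC_TOKEN are placeholders -- substitute your own.
SPLUNK_HOST = "splunk.example.com"                   # host field
SPLUNK_PORT = 8088                                   # port field (HEC default)
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"   # password field (Data Input key from step 1)

url = f"https://{SPLUNK_HOST}:{SPLUNK_PORT}/services/collector/event"
headers = {"Authorization": f"Splunk {HEC_TOKEN}"}
payload = {"event": "Ascent forwarder connectivity test", "sourcetype": "_json"}

# verify=False is only needed if the HEC endpoint uses a self-signed certificate.
response = requests.post(url, headers=headers, json=payload, verify=False, timeout=10)
print(response.status_code, response.text)  # expect 200 and {"text":"Success","code":0}
```

A 200 response with "Success" confirms the endpoint, port, and key are working; the same values go into the host, port, and password fields of the forwarder.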

Currently, Ascent supports the following target destinations (among others):

| Target | Type | Description |
| --- | --- | --- |
| ArcSight | Syslog, TCP, CEF | Forward ArcSight CEF frames over TCP |
| Syslog | Syslog, TCP | Forward syslog frames over TCP |
| Datadog | JSON | Batched JSON forward to Datadog |
| Dynatrace | JSON | Batched JSON forward to Dynatrace |
| Elasticsearch | JSON | Send data to an Elastic index |
| RSA Netwitness | TCP | Syslog forwarder for RSA Netwitness |
| RSA Netwitness (CEF) | TCP, CEF | Syslog CEF forwarder for RSA Netwitness |
| New Relic | JSON | Batched JSON forward to NewRelic |
| Splunk HTTP Event Collector | JSON | Batched JSON forward to Splunk |
| Splunk (Syslog) | Syslog, TCP | Syslog forwarder for Splunk |
| Splunk (CEF) | Syslog, TCP, CEF | Syslog CEF forwarder for Splunk |
| Apica | S2S | Forward data to Apica in Cooked mode |
| Object store | S3 Compatible | AWS S3, CEPH, Minio, GCP Cloud Storage, OCI Buckets |
| Azure Blob Storage | Object Store (Azure Blob) | Native support for Azure Blob Storage APIs |
| OpenTelemetry | OpenTelemetry | Forward logs, metrics, and traces to OpenTelemetry-compatible destinations |
