# Forwarding to Target Destinations

From Ascent, data can be filtered and forwarded to many different monitoring tools, resource management tools, and custom endpoints.

Here is our [complete list of available forwarders](https://docs.apica.io/flow/list-of-forwarders).

Configuring a forwarder is straightforward: **navigate to the** [**Forwarders page**](https://docs.apica.io/flow/list-of-forwarders) and select your preferred forwarder.

#### Configuring a Forwarder (Example)

Below is an example of configuring a Splunk HTTP Event Collector forwarder:

1. **Creating an HTTP Event Collector Data Input key from Splunk**
   * Navigate to your Splunk Environment
   * Locate the Settings menu
   * Locate the Data Inputs sub-menu
   * Click on the New Token option, which is located on the top banner
   * Enter a Token name, skip to the last page, and click Done
   * Use the generated **HTTP Event Collector** key in Ascent
2. **Creating a Splunk HTTP Event Collector on Apica**
   * Navigate to the Create Forwarders page
   * Click on Forwarders
   * Click on the Splunk HTTP Event Collector

<figure><img src="https://2948796384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LmzGprckLqwd5v6bs6m%2Fuploads%2FScx6nyDdLNYwgGPTRY9J%2Fimage.png?alt=media&#x26;token=44d3964d-d2ad-4b14-9c4a-8ebf93bf5d74" alt=""><figcaption><p>Create a Splunk Forwarder</p></figcaption></figure>

* Fill out the fields below and click Create
  * **buffer\_size**: Buffer size for logs
  * **host**: Splunk endpoint
  * **password**: Data Input key created in step 1
  * **port**: Splunk server receiving port (default 8088)
  * **type**: Log format (default `_json`; set to `_metric` to send to a metric index)
  * **user**: UI username of the Splunk endpoint
  * **name**: Name of the forwarder
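To make the **buffer\_size** and **type** fields concrete, here is an illustrative sketch of how a batch of `_json` events could be serialized into Splunk HEC event envelopes, with the buffer size bounding each batch. The envelope shape (`{"event": ...}`) follows Splunk's documented HEC format; the batching logic itself is a simplified assumption, not Ascent's exact wire behavior.

```python
import json

def batch_events(events, max_bytes=1048576):
    """Concatenate HEC event envelopes into batches, flushing a batch
    before it would exceed max_bytes (analogous to buffer_size).
    Illustrative sketch only -- not Ascent's actual implementation."""
    batch, size = [], 0
    for e in events:
        chunk = json.dumps({"event": e})
        # Flush the current batch if adding this event would overflow it
        if size + len(chunk) > max_bytes and batch:
            yield "".join(batch)
            batch, size = [], 0
        batch.append(chunk)
        size += len(chunk)
    if batch:
        yield "".join(batch)

# Small buffer forces one event per batch in this toy example
batches = list(batch_events([{"n": i} for i in range(3)], max_bytes=30))
```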

That's it! You've successfully created a Splunk HTTP Event Collector forwarder. You can now navigate to the Explore page and start Mapping or Replay operations.
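If events don't arrive in Splunk, a quick way to isolate the problem is to send a test event directly to the same HEC endpoint with the **host**, **port**, and **password** (token) values used above. The request below targets Splunk's standard `/services/collector/event` endpoint with the `Authorization: Splunk <token>` header; the host and token shown are placeholders, not real credentials.

```python
import json
import urllib.request

def hec_test_request(host: str, port: int, token: str, event: dict) -> urllib.request.Request:
    """Build a POST against Splunk's standard HEC event endpoint,
    using the same host/port/token the forwarder is configured with."""
    url = f"https://{host}:{port}/services/collector/event"
    body = json.dumps({"event": event}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Splunk {token}",  # HEC token from step 1
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder values -- substitute your own Splunk host and token.
req = hec_test_request(
    "splunk.example.com", 8088, "YOUR-HEC-TOKEN",
    {"message": "hello from Ascent"},
)
# Sending it with urllib.request.urlopen(req) should return
# {"text": "Success", "code": 0} when the token and port are correct.
```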

Currently, **Ascent** supports the following target destinations (among others):

| Target                                                                                                                                   | Type               | Description                                                             |
| ---------------------------------------------------------------------------------------------------------------------------------------- | ------------------ | ----------------------------------------------------------------------- |
| [ArcSight Logger](https://docs.apica.io/flow/list-of-forwarders/arc-sight)                                                               | Syslog, TCP        | Forward syslog frames over TCP                                          |
| [ArcSight Logger](https://docs.apica.io/flow/list-of-forwarders/arc-sight)                                                               | Syslog, TCP, CEF   | Forward ArcSight CEF frames over TCP                                    |
| [Datadog](https://docs.apica.io/flow/list-of-forwarders/datadog-forwarding)                                                              | JSON               | Batched JSON forward to Datadog                                         |
| [Dynatrace HTTP Event Collector](https://docs.apica.io/flow/list-of-forwarders/dynatrace-forwarding)                                     | JSON               | Batched JSON forward to Dynatrace                                       |
| [Elastic Compatible](https://docs.apica.io/flow/list-of-forwarders/elasticsearch-forwarding)                                             | JSON               | Send data to an Elastic index                                           |
| [RSA NetWitness Syslog](https://docs.apica.io/flow/list-of-forwarders/rsa-new-witness)                                                   | TCP                | Syslog forwarder for RSA Netwitness                                     |
| [RSA NetWitness Syslog (CEF)](https://docs.apica.io/flow/list-of-forwarders/rsa-new-witness)                                             | TCP, CEF           | Syslog CEF forwarder for RSA Netwitness                                 |
| [NewRelic HTTP Event Collector](https://docs.apica.io/flow/list-of-forwarders/new-relic-forwarding)                                      | JSON               | Batched JSON forward to NewRelic                                        |
| [Splunk HTTP Event Collector](https://docs.apica.io/flow/list-of-forwarders/splunk-forwarding/splunk-http-event-collector-hec-forwarder) | JSON               | Batched JSON forward to Splunk                                          |
| [Splunk Universal / Heavy Forwarder](https://docs.apica.io/integrations/list-of-integrations/splunk-universal-forwarder)                 | Syslog, TCP        | Syslog forwarder for Splunk                                             |
| [Splunk Universal CEF Forwarder](https://docs.apica.io/integrations/list-of-integrations/splunk-universal-forwarder)                     | Syslog, TCP, CEF   | Syslog CEF forwarder for Splunk                                         |
| [Splunk Universal Forwarder / Heavy Forwarder](https://docs.apica.io/integrations/list-of-integrations/splunk-heavy-forwarder)           | S2S                | Forward data to Apica in Cooked mode                                    |
| [Object Store](https://docs.apica.io/flow/list-of-forwarders/s3-compatible) (S3)                                                         | S3 Compatible      | AWS S3, CEPH, Minio, GCP Cloud Storage, OCI Buckets                     |
| [Object Store](https://docs.apica.io/flow/object-store-forwarding) (Azure Blob)                                                          | Azure Blob Storage | Native support for Azure Blob Storage APIs                              |
| [OpenTelemetry](https://docs.apica.io/flow/broken-reference)                                                                             | OpenTelemetry      | Forward logs, metrics, and traces to OpenTelemetry-compatible destinations |
