Getting Started with Flow
This quick-start guide walks you through configuring data pipelines with Ascent's Flow solution. You will learn how to create a processing pipeline that filters unnecessary data from your logs, reducing storage costs, and how to route that streamlined data to a downstream observability platform, such as Datadog.
For a full video walkthrough, please refer to our video guide:
**EMBED VIDEO LINK HERE**
Let's begin.
Prerequisite: Make sure you have logs ingested into the Ascent platform before getting started.
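If you still need sample data, one common way to ship logs is over OTLP/HTTP in JSON. The sketch below shows a minimal payload of that shape; the endpoint URL is a placeholder (use the ingestion URL from your Ascent account), and the service name and attributes are made up for this demo.

```python
import json
import time
import urllib.request

# Assumption: your ingestion endpoint accepts OTLP/HTTP (JSON).
# Replace OTLP_ENDPOINT with the URL from your Ascent account.
OTLP_ENDPOINT = "http://localhost:4318/v1/logs"  # hypothetical endpoint

# A minimal OTLP/HTTP JSON payload carrying one log record.
payload = {
    "resourceLogs": [{
        "resource": {
            "attributes": [
                {"key": "service.name", "value": {"stringValue": "otel-demo"}}
            ]
        },
        "scopeLogs": [{
            "logRecords": [{
                "timeUnixNano": str(time.time_ns()),
                "severityText": "INFO",
                "body": {"stringValue": "sample log line for the Flow walkthrough"},
                "attributes": [
                    {"key": "env", "value": {"stringValue": "demo"}}
                ]
            }]
        }]
    }]
}

req = urllib.request.Request(
    OTLP_ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 200 indicates the collector accepted the batch
```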
Step 1: Create A New Pipeline:
Go to Explore -> Pipeline
Click Actions -> Create Pipeline


Enter a name for the new Pipeline and press "Create"
Step 2: Create A Filter Rule:
Click the 3-dot menu for the Pipeline you created
Click on "Configure Pipeline"

Click on "Add Rule" -> FILTER

Enter the mandatory fields:
Name / Group

Click on "Drop Labels"
Then, click on "Data Flows"

Next, select the labels you want to drop.

Enter the labels you want to drop on the left-hand side as shown below:
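Conceptually, a Drop Labels FILTER removes the named label keys from each log record before it continues down the pipeline. The sketch below illustrates that transformation in plain Python; it is not Flow's implementation, and the label names and record are placeholders.

```python
from typing import Any, Dict, Set

# Labels to drop -- placeholders; use the label keys you entered in the rule.
LABELS_TO_DROP = {"k8s.pod.uid", "container.id", "host.arch"}

def drop_labels(record: Dict[str, Any], labels: Set[str]) -> Dict[str, Any]:
    """Return a copy of the log record without the dropped label keys."""
    kept = {k: v for k, v in record.get("attributes", {}).items()
            if k not in labels}
    return {**record, "attributes": kept}

record = {
    "body": "GET /cart HTTP/1.1 200",
    "attributes": {
        "service.name": "frontend",       # kept
        "http.status_code": 200,          # kept
        "k8s.pod.uid": "placeholder-uid", # dropped
        "container.id": "placeholder-id", # dropped
    },
}
print(drop_labels(record, LABELS_TO_DROP))
# {'body': 'GET /cart HTTP/1.1 200',
#  'attributes': {'service.name': 'frontend', 'http.status_code': 200}}
```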

To preview the changes, go to the right-hand side and click "Pipeline Preview" -> "Run Pipeline"


Click "Save Pipeline"
Next, "Apply Pipeline" by clicking on the 3 dot menuand clicking "Apply Pipeline"

Then, select the namespace and logs you want to apply the new FILTER rule to (in this case, we are applying it to our "OtelDemo" logs).

Click "Apply"

Step 3: Create A Forwarder (Datadog in this example):
Create a Forwarder to push our filtered OTel logs downstream to another observability platform.
Click on "Integrations" -> "Forwarders"

Click on "Add Forwarder" and select your destination (Datadog in our example)

Then, copy over the Datadog (JSON) config values as shown below:
Buffer_size: 16000
Host: app.datadog.com
Tags: logs
Type: JSON
Name: Datadog Forwarder

Click "Create"
Step 4: Assign the Forwarder to the Logs:
Next, go back to "Pipelines" and click on "Map Forwarder" from the 3-dot menu:

Select the "DataDog Forwarder" that you created and click OK:

Step 5: Verify Data in Destination (Datadog in this example):
Go to your Datadog dashboard and verify that data is coming in as expected:
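Besides eyeballing the dashboard, you can confirm arrival programmatically with Datadog's Logs Search API (v2). The sketch below is a minimal query; the API/application keys and the search query are placeholders you should adapt.

```python
import json
import urllib.request

# Placeholders: create these keys in your Datadog organization.
DD_API_KEY = "<your-datadog-api-key>"
DD_APP_KEY = "<your-datadog-application-key>"

# Datadog's documented Logs Search endpoint (v2).
SEARCH_URL = "https://api.datadoghq.com/api/v2/logs/events/search"

query = {
    "filter": {
        "query": "source:ascent-flow",  # placeholder query
        "from": "now-15m",
        "to": "now",
    },
    "page": {"limit": 5},
}

req = urllib.request.Request(
    SEARCH_URL,
    data=json.dumps(query).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "DD-API-KEY": DD_API_KEY,
        "DD-APPLICATION-KEY": DD_APP_KEY,
    },
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    events = json.loads(resp.read())["data"]
    for event in events:
        print(event["attributes"]["message"])
```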


As you can see, data ingestion has decreased after the FILTER rule was applied:

Conclusion:
By following this guide, you have learned how to use Flow to manage and optimize your telemetry data. You now know how to build a data pipeline that drops unneeded labels from your logs and forwards the leaner, cost-effective data to a downstream platform like Datadog. Applying these techniques can significantly reduce observability costs while keeping your data pipelines cleaner and more efficient.