Using the OpenTelemetry Demo


What is the OpenTelemetry (OTel) Demo?

The OTel Demo is a microservices-based application created by the OpenTelemetry community to demonstrate its capabilities in a realistic, distributed system environment. The demo application, known as the OpenTelemetry Astronomy Shop, simulates an e-commerce website composed of more than ten interconnected microservices, written in multiple languages, that communicate over HTTP and gRPC. Each service is fully instrumented with OTel and generates comprehensive traces, metrics, and logs.

The demo is an invaluable resource for understanding how to implement and use OpenTelemetry in real-world applications. Using the Ascent platform with the OTel Demo enables you to converge all of your IT data, manage your telemetry data, and monitor and troubleshoot your operational data in real time. The following steps guide you through using the OTel Demo application with Ascent.

Quick Start Process for Using the OTel Demo app with Ascent

All users who want to get started with Ascent should follow these five simple steps:

  1. Deploy the OTEL Demo App

  2. Setup and Configure Pipeline

  3. Design Queries

  4. Create Dashboards

  5. Setup Alerts and Workflow

In this guide, we cover the key goals and related activities of each step to ensure a quick and easy setup of Ascent.

How to Deploy the OTEL Demo App

The goal is to ingest telemetry data (logs, metrics, traces) from relevant systems.

Key actions include:

  • Access and deploy the public OpenTelemetry (OTEL) Demo App

  • Configure the data collection setup, frequency, and granularity

  • Ensure data normalization

This guide walks you through the steps required to deploy the OpenTelemetry Demo app and begin sending data to Ascent.

NOTE: We will deploy the OTEL demo app using Docker for this guide.

Prerequisites:

  • Docker

  • Docker Compose v2.0.0+

  • 6 GB of RAM for the application
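You can verify both prerequisites from a terminal before starting; Docker Compose must report v2.0.0 or later:

$ docker --version
$ docker compose version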

Step 1: Get and Clone the OTEL demo app repository:

$ git clone https://github.com/open-telemetry/opentelemetry-demo.git

Step 2: Go to the demo folder:

$ cd opentelemetry-demo/

Step 3: Start the demo app in Docker:

$ docker compose up --force-recreate --remove-orphans --detach

Step 4: (Optional) Enable API observability-driven testing:

$ make run-tracetesting

Step 5: Test and access the OTEL demo application:

Once the images are built and the containers have started, you can access the following OpenTelemetry components of the demo app web store:

  • Web store: http://localhost:8080/

  • Grafana: http://localhost:8080/grafana/

  • Load Generator UI: http://localhost:8080/loadgen/

  • Jaeger UI: http://localhost:8080/jaeger/ui/

  • Tracetest UI: http://localhost:11633/, only when using make run-tracetesting

  • Flagd configurator UI: http://localhost:8080/feature
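To confirm the demo is serving traffic, here is a minimal sanity check using curl (this assumes the default port 8080; see the next section if you change it):

$ curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080/
$ curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080/grafana/
$ curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080/jaeger/ui/

Each command should print 200 once the corresponding container is healthy.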

Optional: Changing the demo’s primary port number

By default, the demo application will start a proxy for all browser traffic bound to port 8080. To change the port number, set the ENVOY_PORT environment variable before starting the demo.

  • For example, to use port 8081:

$ ENVOY_PORT=8081 docker compose up --force-recreate --remove-orphans --detach

Step 6: Update the OTEL config file:

  • /src/otel-collector/otelcol-config-extras.yml

Paste the following into the config file, overwriting it completely:


    exporters:
      otlphttp/apicametrics:
        compression: gzip
        disable_keep_alives: true
        encoding: proto
        metrics_endpoint: "https://<YOUR-ASCENT-ENV>/v1/metrics"
        headers:
          Authorization: "Bearer <YOUR-INGEST-TOKEN>"
        tls:
          insecure: false
          insecure_skip_verify: true
      otlphttp/logs:
        logs_endpoint: https://<YOUR-ASCENT-ENV>/v1/json_batch/otlplogs?namespace=OtelDemo&application=DemoLogs
        encoding: json
        compression: gzip
        headers:
          Authorization: "Bearer <YOUR-INGEST-TOKEN>"
        tls:
          insecure: false
          insecure_skip_verify: true
    service:
      pipelines:
        metrics:
          exporters: [otlphttp/apicametrics]
        logs:
          exporters: [otlphttp/logs]
  1. Replace <YOUR-ASCENT-ENV> with your Ascent domain, e.g. company.apica.io

  2. Replace <YOUR-INGEST-TOKEN> with your Ascent Ingest Token, e.g. eyXXXXXXXXXXX... (see Step 7 below to get your ingest token)
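If you prefer not to hand-edit the file, here is a minimal sketch that fills in both placeholders in place (GNU sed syntax; on macOS use sed -i ''; the values shown are illustrative, not real credentials):

$ ASCENT_ENV="company.apica.io"        # your Ascent domain
$ INGEST_TOKEN="eyXXXXXXXXXXX..."      # your ingest token from Step 7
$ sed -i "s|<YOUR-ASCENT-ENV>|$ASCENT_ENV|g; s|<YOUR-INGEST-TOKEN>|$INGEST_TOKEN|g" src/otel-collector/otelcol-config-extras.yml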

Step 7: Get Your Ingest Token from Ascent:

To generate a secure ingest token, follow the guide at https://docs.apica.io/integrations/overview/generating-a-secure-ingest-token.
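If you want to sanity-check the token before restarting the collector, the sketch below POSTs a trivial record to the same logs endpoint used in the Step 6 config. The JSON payload shape is an assumption for illustration only; consult the ingest documentation for the exact accepted format:

$ curl -s -X POST \
    -H "Authorization: Bearer <YOUR-INGEST-TOKEN>" \
    -H "Content-Type: application/json" \
    -d '[{"message": "ingest token smoke test"}]' \
    "https://<YOUR-ASCENT-ENV>/v1/json_batch/otlplogs?namespace=OtelDemo&application=SmokeTest"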

Step 8: Get Data Flowing into Ascent Platform:

Restart the OpenTelemetry collector by running the following command:

$ docker compose restart
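Optionally, confirm the collector restarted cleanly and is not logging export errors. The service name otel-collector below is an assumption; run docker compose ps to confirm the exact name in your checkout:

$ docker compose ps
$ docker compose logs --tail=50 otel-collector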

Step 9: Verify data flow in Ascent:

  1. Log into Ascent

  2. Navigate to Explore -> Logs & Insights

  3. You should see the namespace "OtelDemo" and the application "DemoLogs"

  4. This confirms that data is flowing from the OpenTelemetry Demo Application. Feel free to click into the application "DemoLogs" to view all the logs being sent from the Demo App.

Now that data is flowing, follow the steps below to learn how to interact with, enhance, and visualize this data in Ascent.

Step 9 - FLOW (Cost Savings Use Case)

For a worked example of using FLOW to cut ingest costs on the demo's logs, see the FLOW - Cost Savings Use Case section below.

Step 10 - Setup and Configure Pipeline

The goal is to transport and process the collected data.

Key actions include:

  • Select or configure a data pipeline

  • Define data routing rules

  • Apply transformations, filtering, or enrichment if needed

Links to related docs include:

  • Configure pipelines

  • Visualize pipelines

  • Forwarding data

Step 11 - Design Queries

The goal is to enable insights by querying telemetry data.

Key actions include:

  • Understand the query language used

  • Create baseline queries for system health

  • Optimize queries for performance and cost

  • Validate query results

Links to related docs include:

  • Data explorer overview

  • Query builder

  • Widget

Step 12 - Create Dashboards

The goal is to visualize system performance and behavior in real time.

Key actions include:

  • Use visual components

  • Organize dashboards by domain

  • Incorporate filters

  • Enable drill-down for troubleshooting

Links to related docs include:

  • Dashboards overview

Step 13 - Setup Alerts and Workflow

The goal is to detect anomalies and automate response actions.

Key actions include:

  • Define alerting rules

  • Set up alert destinations

  • Establish escalation policies and on-call schedules

  • Integrate with incident management workflows and postmortem tools

Links to related docs include:

  • Alerts overview

  • Alerting on queries

  • Alerting on logs

FLOW - Cost Savings Use Case

FLOW allows you to filter unnecessary data out of your logs before it hits the data lake, which leads to significant cost savings. This guide walks you through dropping labels from the OTel Demo App logs; you can apply the same functionality to any other data source.

  1. Navigate to the Logs & Insights page.

  2. This view lists all of the data sources pushing data to Ascent. To access the logs, click into "DemoLogs".

  3. To view one of the logs, simply click it.

  4. We have one of our OTel logs here. In this example, we will drop "destination.address" and "event.name" from the logs.

  5. To drop these fields, navigate to the Pipeline tab and then click the + button.

  6. Create a new Pipeline.

  7. Add a new Filter Rule. If you're interested in the other rules, see https://docs.apica.io/flow/rules for a detailed guide.

  8. Enable Drop Labels by clicking the slider.

  9. On the right of the screen, preview the logs to see which labels to drop, then hit Preview in the top right.

  10. Here are the two labels we want to drop: destination.address and event.name.

  11. Select the keys in the dropdown by typing them out or clicking them.

  12. Click "Save" in the bottom right.

  13. Go back to the log view to verify the filter rule has been applied. Refresh the page and make sure you are verifying against a new log.

  14. As you can see, destination.address and event.name are no longer being ingested.

Dropping a few labels might not seem like a big deal at first, but extrapolated over tens or hundreds of thousands of logs, the cost savings add up quickly.

  15. To view savings and your configured pipelines, navigate to "Pipelines".

  16. View all your pipeline data along with savings.

For more information on pipelines, please see Setup and Configure Pipeline (Step 10 above).

Additional Resources

Here are helpful links to other "Getting Started" technical guides:

  • Getting Started with Metrics

  • Getting Started with Logs

  • Get acquainted with the Apica Ascent UI

  • Configure your data sources

  • Configure Apica Ascent to send alerts to your email server

  • Add and configure alert destinations like email, Slack, and PagerDuty

  • Configure SSO using SAML

  • Configure RBAC
