AWS CloudWatch
You can forward CloudWatch logs to Apica Ascent using two methods:
Apica Ascent CloudWatch exporter Lambda function
Run Logstash on a VM (or in Docker)
Apica Ascent CloudWatch exporter Lambda function
You can export AWS CloudWatch logs to Apica Ascent using an AWS Lambda function. The CloudWatch log stream acts as a trigger for the AWS Lambda function.
This guide explains the process for setting up an AWS Lambda function and configuring an AWS CloudWatch trigger to forward CloudWatch logs to Apica Ascent.

Creating a Lambda function
Apica Ascent provides CloudFormation templates to create the Apica Ascent CloudWatch exporter Lambda function.
Depending on the type of logs you'd like to export, use the appropriate CloudFormation template from the following list.
Exporting Lambda Function logs
Use the following CloudFormation template to export AWS Lambda function logs to Apica Ascent.
https://logiqcf.s3.amazonaws.com/cloudwatch-exporter/logiq-cloudwatch-lambda-logs-exporter.yaml
Exporting CloudTrail Logs
Use the following CloudFormation template to export CloudTrail logs to Apica Ascent.
https://logiqcf.s3.amazonaws.com/cloudwatch-exporter/logiq-cloudwatch-cloudtrail-exporter.yaml
Exporting AWS VPC Flow Logs
Use the following CloudFormation template to export VPC Flow Logs to Apica Ascent.
https://logiqcf.s3.amazonaws.com/cloudwatch-exporter/logiq-cloudwatch-flowlogs-exporter.yaml
Exporting CloudWatch logs from other services
Use the following CloudFormation template to export CloudWatch logs from other AWS services to Apica Ascent.
https://logiqcf.s3.amazonaws.com/cloudwatch-exporter/logiq-cloudwatch-exporter.yaml
This CloudFormation stack creates a Lambda function and the permissions it needs. You must configure the following parameters; a CLI deployment sketch follows the list.
APPNAME: Application name, a readable name for Apica Ascent to partition logs.
CLUSTERID: Cluster ID, a readable name for Apica Ascent to partition logs.
NAMESPACE: Namespace, a readable name for Apica Ascent to partition logs.
LOGIQHOST: IP address or hostname of the Apica Ascent server, without the http:// or https:// prefix.
INGESTTOKEN: JWT token to securely ingest logs. Refer here to generate an ingest token.
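As an illustration, the stack can also be created from the AWS CLI. The following is a sketch, not the only way to deploy: the stack name is arbitrary, the template shown is the generic exporter, and all parameter values are placeholders you must replace. CAPABILITY_IAM is required because the stack creates the function's permissions.

# Create the exporter stack from the generic CloudWatch exporter template
aws cloudformation create-stack \
  --stack-name logiq-cloudwatch-exporter \
  --template-url https://logiqcf.s3.amazonaws.com/cloudwatch-exporter/logiq-cloudwatch-exporter.yaml \
  --capabilities CAPABILITY_IAM \
  --parameters \
    ParameterKey=APPNAME,ParameterValue=<app-name> \
    ParameterKey=CLUSTERID,ParameterValue=<cluster-id> \
    ParameterKey=NAMESPACE,ParameterValue=<namespace> \
    ParameterKey=LOGIQHOST,ParameterValue=<ascent-host> \
    ParameterKey=INGESTTOKEN,ParameterValue=<ingest-token>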
Configuring the CloudWatch trigger
Once the CloudFormation stack is created, navigate to the AWS Lambda function (logiq-cloudwatch-exporter) and add a trigger.

On the Add trigger page, select CloudWatch Logs, and then select a CloudWatch Logs log group.

Once this configuration is complete, any new logs arriving in the configured CloudWatch log group will be streamed to the Apica Ascent cluster.
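If you prefer to wire up the trigger from the CLI instead of the console, the equivalent is a CloudWatch Logs subscription filter on the log group. Below is a sketch, assuming the logiq-cloudwatch-exporter function name created by the stack; the region, account ID, and log group name are placeholders.

# Allow CloudWatch Logs to invoke the exporter function
aws lambda add-permission \
  --function-name logiq-cloudwatch-exporter \
  --statement-id cloudwatch-logs-trigger \
  --principal logs.amazonaws.com \
  --action lambda:InvokeFunction \
  --source-arn "arn:aws:logs:<region>:<account-id>:log-group:<log-group-name>:*"

# Stream every new log event in the group to the function (an empty filter pattern matches all events)
aws logs put-subscription-filter \
  --log-group-name <log-group-name> \
  --filter-name logiq-exporter \
  --filter-pattern "" \
  --destination-arn "arn:aws:lambda:<region>:<account-id>:function:logiq-cloudwatch-exporter"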
Create the Logstash VM (or Docker)
CloudWatch logs can also be pulled using agents such as Logstash. If your team is familiar with Logstash and already has it in place, follow the instructions below to configure it to pull logs from CloudWatch.
Install Logstash on an Ubuntu virtual machine as shown below.
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list
sudo apt-get update
sudo apt-get install logstash
# Install the logstash-input-cloudwatch_logs plugin, which provides
# the cloudwatch_logs input used in the configuration below
cd /usr/share/logstash
sudo bin/logstash-plugin install logstash-input-cloudwatch_logs
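You can confirm the plugin was installed before moving on:

# List installed plugins and check that the CloudWatch Logs input is present
sudo bin/logstash-plugin list | grep cloudwatch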
Configure Logstash
Logstash comes with no default configuration. Create a new file /etc/logstash/conf.d/logstash.conf with these contents, modifying values as needed:
input {
  cloudwatch_logs {
    access_key_id => "<access-key>"
    secret_access_key => "<secret-access-key>"
    region => "<region>"
    log_group => ["<cloudwatch-log-group>"]
    log_group_prefix => true
    codec => plain
    start_position => end
    interval => 30
  }
}
filter {
  ruby {
    # flattens the nested cloudwatch_logs fields; supply the path to your own script
    path => "/home/<custom-path>/flattenJSON.rb"
    script_params => { "field" => "cloudwatch_logs" }
  }
  mutate {
    # make the log group a readable app name: replace "/" with "-" and strip the leading "-"
    gsub => [
      "cloudwatch_logs.log_group", "\/", "-",
      "cloudwatch_logs.log_group", "^-", ""
    ]
    add_field => { "namespace" => "<custom-namespace>" }
    add_field => { "cluster_id" => "<custom-cluster-id>" }
    add_field => { "app_name" => "%{[cloudwatch_logs.log_group]}" }
    add_field => { "proc_id" => "%{[cloudwatch_logs.log_stream]}" }
  }
}
output {
  http {
    url => "http://<logiq-endpoint>/v1/json_batch"
    headers => { "Authorization" => "Bearer <SECURE_INGEST_TOKEN>" }
    http_method => "post"
    format => "json_batch"
    content_type => "json_batch"
    pool_max => 2000
    pool_max_per_route => 100
    socket_timeout => 300
  }
}
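Once the configuration file is in place, you can validate it and start Logstash. This is a minimal sketch, assuming the systemd service installed by the Logstash package:

# Validate the pipeline configuration without starting the pipeline
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash \
  -f /etc/logstash/conf.d/logstash.conf --config.test_and_exit

# Start Logstash and enable it on boot
sudo systemctl start logstash
sudo systemctl enable logstash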
You can obtain an ingest token from the Apica Ascent UI as described here. You can customize the namespace and cluster_id in the Logstash configuration based on your needs.
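To sanity-check the endpoint and token before relying on Logstash, you can POST a small test batch with curl. This is only a sketch: the /v1/json_batch URL and the Bearer token header come from the configuration above, but the exact payload fields shown are an assumption, not a documented schema.

# Hypothetical minimal payload; field names mirror those set in the Logstash filter
curl -X POST "http://<logiq-endpoint>/v1/json_batch" \
  -H "Authorization: Bearer <SECURE_INGEST_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '[{"namespace":"<custom-namespace>","cluster_id":"<custom-cluster-id>","app_name":"curl-test","message":"hello from curl"}]'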
Your AWS CloudWatch logs will now be forwarded to your Apica Ascent instance. See the Explore Section to view the logs.