
AWS Lambda Check


1. Intro to Lambda Computing

This page is a step-by-step guide to using the AWS Lambda check type.

Use Case for an AWS Lambda Check

Powerful: AWS Lambda has many runtimes available.

Apica's Scripted Checks support:

  • Java

  • Python 3+

  • JavaScript

Amazon Web Services Lambda runtimes include:

  • Node.js

  • Python

  • Ruby

  • Java

  • Go

  • .NET Core

  • C#

  • PowerShell

With AWS Lambda, you could use any one of these runtimes to create your new ASM check.

Other Reasons to Use the AWS Lambda Check

Example: you have a locked-down Virtual Private Cloud where no inbound traffic is allowed, and you don't want to use a private Apicanet agent. You can run your Lambda execution inside your secure VPC and take full advantage of the AWS permissions model, for example granting your Lambda access to additional services such as databases or an AWS secret store.

AWS SDK for Python (Boto3)

In the next section, we'll step through writing a very simple Python Lambda, add it to our ASM account, and then run it and see the results.

Lambda Check Overview

  1. Prepare by importing the Libraries your script will need.

  2. Create the AWS Lambda Function, which the ASM Check will call for execution.

    1. Define the main function of the script.

    2. Create a New Lambda Function.

    3. Import Your Script into that Lambda Function.

  3. Create the Lambda Event that ASM will display.

  4. Create the AWS Lambda check in any ASM-supported AWS Region on the Apica Monitoring Platform.

  5. Collect, compare, and analyze the ASM check results, or send the results to integrated systems that use ASM as a data source.

2. Coding Lambda

For this AWS Lambda example, we will use Python to create the code for our cloud function. This example will be a simple Lambda; we will not be importing any libraries other than the default Python libraries.

Our goal is to create a check that returns a random value.

This is deliberately simplistic; the point is to provide a clear example of creating a Lambda.


Import Libraries

Import random because we need a random number.

Import time because we need to record the start and end times of the check execution.
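Under the assumptions of this walkthrough (standard library only), the import block is just:

```python
import random  # generates the demo measurement value
import time    # records the start and end times of the check execution
```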

Define the Main Function

Keep track of the name of the main function; you will need it later when setting the Lambda's handler. It must receive two things:

  1. An event

  2. A context.

These objects are passed from AWS Lambda when we configure it later.
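In Python, a Lambda handler with those two arguments looks like the sketch below (the name main is the one this guide uses; event and context are supplied by AWS at invocation time):

```python
def main(event, context):
    # event:   the (JSON) test or check data passed in by AWS Lambda
    # context: runtime information supplied by AWS Lambda
    # The returned dictionary becomes the check result.
    return {"returncode": 0}
```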

Scripting the Check

This script will generate a dictionary of values that we will measure for further analysis.

  • Generate a random value (rand_value) between 0 and 100.

  • Create json_value, a dictionary that will capture all the return values we want: Return Code, time

The expandable JSON result format lets us include a message with the generated random value, a unit of time measurement (milliseconds), and a lambda flag (set to True).

Important: if you've created a scripted check before, all you need to do is put the same thing into a Lambda.

rand_value = random.randint(0, 100)

The finished script
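Putting the pieces together, a minimal sketch of what the finished script might look like is shown below. The exact field names in json_value (message, value, unit, lambda, returncode, duration) are illustrative assumptions; match them to whatever your ASM check expects.

```python
import random
import time

def main(event, context):
    start_time = time.time()             # start of check execution
    rand_value = random.randint(0, 100)  # the value this demo measures

    json_value = {
        "returncode": 0,                 # 0 signals a successful check
        "message": "Generated random value: %d" % rand_value,
        "value": rand_value,
        "unit": "ms",                    # unit of time measurement
        "lambda": True,                  # flag marking this as a Lambda check
        "duration": int((time.time() - start_time) * 1000),
    }
    return json_value
```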

Running the check

Run on its own, the script finishes with an exit code of 0 and returns the dictionary, but produces nothing else. Once we upload it to Lambda, we'll show you how to test it and make sure that everything is okay.

The next step is uploading our script into AWS.

3. AWS Setup

Here, working inside the Amazon AWS UI, we will create the function that our check will call.


Get into the AWS Lambda Service

Log into your Amazon account and enter Lambda in the Search field. If you have visited Lambda before, there should be a shortcut.

Create New Lambda Function

Click Create Function.

Configure the Function

  • Name it "DemoFunction"

  • Select the Runtime, Python 3.9

Permissions

(Optional) Set your Lambda function to run with specific permissions by choosing an existing (or new) role that has the rights the function needs to run correctly.

For example, if the function needs to access a relational database or other Amazon services, this is where you would set that.

Create Function

Click Create Function.

Here's our DemoFunction with an AWS-created default/placeholder function called lambda_handler.

Import Your Script

Paste the script code into the DemoFunction code source section.

Note: if you press Test now, you'll get an error because we have not yet defined a test event.

Create Test Event

In the main definition, you have event and context. We have not defined a test event yet, so if you click Test, AWS will throw an error prompting you to define the event, opening an event template.

  • Enter your test event name

  • Click Create
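For reference, the event template AWS opens defaults to a simple hello-world payload like the following (any valid JSON works here, since our script does not yet read the event):

```json
{
  "key1": "value1",
  "key2": "value2",
  "key3": "value3"
}
```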

Deploy the AWS Changes, Test

  • Deploy your changes

  • Our pasted code names its handler main, while the Runtime Handler setting still points to lambda_handler, so a test will return an error. This error is not a problem because we can edit the Handler to match the code.

  • Note the test Status shows Failed.

Edit Runtime Handler

Find Runtime settings beneath the test results and note that the Handler is named lambda_handler.

Click Edit and change this to our handler name, main.

Click Save

Test

Click Test to repeat the test, now with the handler name updated to main.

We can see that the test Status shows Succeeded

The output we see here is what will become our check response.

That's how you create the Lambda function in Amazon, and it's ready for use. Before we set up the check in ASM, though, let's add a JSON event to our check.

4. Adding a JSON Event

To demonstrate some of the power of Lambda, we'll be editing our code directly in AWS Lambda.

We will add just one additional field to our results to show the power of our expandable JSON results.

Step

Screenshot

Add an Event Field

In AWS Lambda, we're going to add a field called event. It will contain the event that is passed in; i.e., data from the check itself. Adding it in this section lets us show that happening.
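Concretely, the change is a single extra key in the returned dictionary. Here is a sketch of the modified handler (the same script shown at the end of this page, with the new event field added):

```python
import random
import time

def main(event, context):
    rand_value = random.randint(0, 100)
    json_value = {
        "returncode": 0,
        "start_time": time.time(),
        "end_time": time.time(),
        "message": f'Random value: {rand_value}.',
        "unit": "ms",
        "value": rand_value,
        "lambda": True,
        "event": event,  # new field: echo the incoming event into the result
    }
    return json_value

# Locally, the echoed event is whatever we pass in:
result = main({"hello": "world"}, None)
print(result["event"])
```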

Click Deploy

Click Test to run a quick test; we'll see that the event contains the test event we created earlier. We'll see how this works when we run the ASM check.

Note the test Status now shows Succeeded.

Next, we'll be creating our check in the ASM UI. And we'll be running this and retrieving a result from it.

5. ASM Check Creation

We're going to create our check in ASM.

Step

Screenshot

Navigate to ASM, New Check+. Under the Scripted Check banner, click the Run AWS Lambda icon.

The check wizard will open. So let's create a new check.

Step 1: Name, Description, and Tags

Provide a name for your check. In this example, we will call it New Lambda Check. Add any Description and Tags to help organize this check.

Click Next

Step 2a: Command and location

AWS Region: This is the AWS region that your Lambda resides in. It may not default to your location, which you'll find in the upper right-hand corner of your Amazon UI.

In this example, it says Frankfurt, but in the browser URL you will see eu-central-1.console.aws.amazon.com/lambda/home?region=eu-central-1

So, select eu-central-1 for this example or whatever matches your AWS Region in the dropdown choices.

Function Name: Our example function name is simply DemoFunction; type it in. It also appears in the AWS UI, indicated to the right.

Step 2b: Command and location

AWS access key & AWS secret key: You will need these from AWS. You may need to ask your Operations team for an AWS Access Key and AWS Secret Key to execute this function. Enter them here.


Base64 Encoded JSON Payload: This JSON payload is the event that is passed to your Lambda. For this field, create a JSON object, Base64-encode it, and paste it in here.
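As a sketch, the encoding can also be done in a few lines of Python (the {"hello": "world"} payload matches the example used later on this page):

```python
import base64
import json

# Build the JSON payload and Base64-encode it for the check's payload field.
payload = {"hello": "world"}
encoded = base64.b64encode(json.dumps(payload).encode("utf-8")).decode("ascii")
print(encoded)  # paste this string into the Base64 Encoded JSON Payload field

# Sanity check: decoding recovers the original JSON.
decoded = json.loads(base64.b64decode(encoded))
```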


For the other fields, AWS Session Token, AWS Role ARN, and Role Session Name: if you provide these inputs, the script that runs your Lambda will assume a role before running it.

You may have a security scheme that uses this, but in this case we don't, so we'll leave these empty.

Location: For this example, select Montreal, then press Next.

Step 3: Interval, Thresholds, & Monitor Groups

Set this check to run manually, and select the Monitor Group(s) to organize the check under. In this screenshot, it was placed under the Driver Monitor Group. Click Next.

Create check

After acknowledging the confirmation page, press Create.

For Testing:

  • Disable failover

  • Set Maximum Attempts to One

  • Click Save

We've created our ASM check.

And in our next section, we will run the check and examine our results.

6. Run Result

We've created and configured our AWS Lambda check. We are ready to run it in ASM and view the results.

Note: the Base64 encoded JSON payload field we provided is entirely optional. If you leave it empty, no additional information is passed to your Lambda. We'll keep it for this example.

Step

Screenshot

Open the check details page and manually run the check.

We see the result value/response time of 47 milliseconds.

If you click into this data point, in the Result section you can see the lambda: true event as well as the hello: "world" object that we encoded in Base64 and put inside our check configuration.

You can provide almost any kind of data in this check type. As long as you have an Amazon account, the possibilities with this check are endless.

As mentioned before, you can use many different types of runtimes. If you're a Ruby person and cannot have an Apica Private Agent inside your firewall, you can use Lambda.

Using AWS Identity and Access Management, you can set very fine-grained permissions.


From What is AWS Lambda?:

Lambda is a compute service that lets you run code without provisioning or managing servers. Lambda runs your code on a high-availability compute infrastructure and performs all of the administration of the compute resources, including server and operating system maintenance, capacity provisioning and automatic scaling, code monitoring, and logging. You can run code for virtually any application or back-end service with Lambda. All you need to do is supply your code in one of the languages that Lambda supports.

AWS Lambda functionality adds flexibility to the Apica Scripted Checks. The Scripted Check options have a pre-selected set of Runtime languages. Select the Run AWS Lambda icon for additional languages beyond these, to develop checks from language resources you may already have.

Additionally, you can use Lambda container images to provide your Elastic Container, the runner for this check.

Get started quickly using AWS with Boto3, the AWS SDK for Python. Boto3 makes it easy to integrate your Python application, library, or script with AWS services, including Amazon S3, Amazon EC2, Amazon DynamoDB, and more.

Establish an AWS Lambda account.

Script the check. (Note: You could use a Scripted Check here.)

To add and use additional libraries in AWS Lambda, for example some additional Python packages, here's a link that may get you started. The AWS SDK for Python, Boto3, is already included with your Python Lambda.

You can use a Base64 encoding website such as base64encode.org. In this example, we'll create a very simple JSON by entering 'hello world', encode it, and paste the encoded string into the field.

import random
import time

def main(event, context):
    rand_value = random.randint(0, 100)
    json_value = {
        "returncode": 0,
        "start_time": time.time(),
        "end_time": time.time(),
        "message": f'Random value: {rand_value}.',
        "unit": "ms",
        "value": rand_value,
        "lambda": True
    }
    return json_value

if __name__ == "__main__":
    main('', '')