Postman Check


The following documentation provides details on setting up and configuring the Apica Postman Check.

Setting Up a Postman Check


Step 1: Name, Description, and Tags

Select a check type from the API Tools Check section; in this example, we use Postman as the check type. Give the check a name and a description, and assign Tags to it if desired.

Step 2a: Select/Add/Delete Repository Profile

This step directs you to select, add, or delete a Repository Profile. Select an existing Repository Profile; if none exists, press the green “+” icon to add a new Repository Profile (of any supported type). Click the edit icon to modify or delete an existing Repository Profile.

Step 2b: Specify Repository Paths

File path: Add the path and filename of the collection file, relative to the root repository address. For example, if the URL of your GitHub repository is https://github.com/user/newCheckTypes and the URL of your file is https://github.com/user/newCheckTypes/tree/main/subfolder/myCollection.json, enter /subfolder/myCollection.json as the File path.

Environment Variables: If your script uses environment variables, add them to the “Environment Variables” field as a comma-separated list in the form name=value. Specifying environment variables from the ASM GUI lets you easily change the variables used in the script. For example, by defining an environment variable called “ID”, you can clone the check after it has been created and produce multiple checks that differ only in their “ID” value, as illustrated below.
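As a minimal sketch (the variable names ID and baseUrl are hypothetical), the “Environment Variables” field could contain:

ID=1001,baseUrl=https://httpbin.org

A test or pre-request script in the collection can then typically read those values through the standard Postman environment API:

// Read values supplied via the ASM "Environment Variables" field.
// "ID" and "baseUrl" are hypothetical names used only for this sketch.
const id = pm.environment.get("ID");
const baseUrl = pm.environment.get("baseUrl");
pm.test("ID is set", function () {
    pm.expect(id).to.not.be.undefined;
});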

Decrypt Key (optional): A Decrypt Key can be specified if you wish to encrypt sensitive data within your script. Sensitive data should always be encrypted in a hosted repository, but whether to encrypt it is up to you. Encryption of sensitive data is done via the cryptify library. For directions on implementing encryption within your Postman check, refer to How to Mask Sensitive Data When Using Postman Checks.

Data Directory (optional): If your script relies on data stored in another file, that file can be placed in a data directory and referenced from the “Data Directory” field. For example, if your script uses a set of 500 key/value pairs, the data is best placed in a separate .json file inside a /data subdirectory of your base repository. When /data is specified as the Data Directory, that subfolder is available when the check runs.
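For instance, a repository laid out as follows (the file names are hypothetical) would use /subfolder/myCollection.json as the File path and /data as the Data Directory:

/subfolder/myCollection.json     <- the collection (File path)
/data/testdata.json              <- supporting data (Data Directory: /data)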

Step 3: Interval, Thresholds & Monitor Groups

This step is the same as for other ASM checks: set the frequencies needed, the desired thresholds, and the Monitoring Groups the check should be organized under.

The Postman Check Runner

Note: The latest version of the ASM Postman Check Runner documentation is maintained on the npm package page for @apica-io/asm-pm-runner (npmjs.com), which is authoritative.

1. Introduction

The asm-pm-runner runs Postman collections and exports results to Apica’s Check Results Service (CRS) format.

The command-line tool is based on the standard Newman utility (Newman is a command-line collection runner for Postman).

The primary enhancement over Newman is support for decrypting data files and certificates. Encryption and decryption are based on the cryptify npm module.

2. Why Use a Postman Check?

ASM users who have Postman collections can now run them, without conversion, as long-term monitoring checks from a global perspective. This lets you expand from testing in a single location to measuring and comparing performance results from any Apica agent over time.

3. The ASM Postman Check

Apica ASM can run Postman checks based on asm-pm-runner. This feature runs a Postman collection in Apica ASM and reports the results to Apica’s Check Reporting Service. The solution depends on Newman, which allows you to run your Postman collections from the command line using the Apica Synthetic Monitoring Postman Runner (ASM PM Runner). The runner is published on npm as @apica-io/asm-pm-runner (as of version 1.3.5).

Installing npm

At a Terminal window, type the following to install the latest version of npm:

npm install -g npm

Installing the ASM Postman Check Runner Solution

At a Terminal window, type the following to install the ASM Postman Runner:

npm install -g @apica-io/asm-pm-runner
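
To confirm the installation, you can print the installed version (the -V flag is listed under the command-line options below):

asm-pm-runner -V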

ASM PM Runner Syntax

asm-pm-runner [options] <collection>

To get the options available for your version of the ASM-PM-Runner, at the command line/Terminal, enter:

asm-pm-runner -h

This returns the following help output:

Usage: asm-pm-runner [options] <collection>

Options:
  -e, --environment <path>             Environment JSON file
  -dk, --decryptKey <decryptKey>       Decrypt key
  -ev, --envVars [envVars...]          Environment variables, in name=value format
  -v, --verbose                        Print collection information on stdout (default: false)
  -r, --resultDir <dir>                The result directory
  -dd, --dataDir <dir>                 The data directory for certificates and test data
  -l, --logLevel <logLevel>            Log level in log4js (default: "info")
  --sslClientCert <path>               Specify the path to a client certificate (PEM)
  --sslClientKey <path>                Specify the path to a client certificate private key
  --sslClientPassphrase <passphrase>   Specify the client certificate passphrase (for protected key)
  -V, --version                        Output the version number
  -h, --help                           Display help for command
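
As an illustration (the file and variable names are hypothetical), an invocation that supplies an environment file, two environment variables, and a result directory could look like:

asm-pm-runner -e env.json -ev apiKey=abc123 region=eu -r results/ my-collection.json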

Manage Repository Files

A Repository Profile must be set up in order to use URL-XI, Postman, or Scripted checks. These scripts and collections are not uploaded to the Apica Platform; they are managed through a repository. The first step is therefore to create a new Repository Profile in the ASM Portal. See the Manage Repository Files page for steps on setting this up.

4. Samples

You will find some runnable examples in the installation directory under samples; see the npm documentation on the folder structures used by npm for where the global npm directory is located. The package.json file contains scripts for running the samples with correct parameter settings.

Example 1 - Running some HTTP requests on http-bin

$ asm-pm-runner samples/HTTP-Bin-Requests.postman_collection.json -v -ev username=foo password=bar

Example 2 - Testing certificates on https://badssl.com

This sample uses encrypted certificates.

$ asm-pm-runner -l debug -dk Encrypt.4.ssl --sslClientCert file.cert.pem --sslClientKey file.key.pem --dataDir samples/ec_badssl_certs -r results/ -v samples/BadSSL.postman_collection.json


5. Result Reports

The JSON report is stored in the results directory. The format is compliant with the Check Result Service in Apica ASM.

Steps and requests in the report

The JSON report is divided into one or more steps.

  • A step contains requests from the Postman collection.

  • By default there is a single step, containing all requests.

  • If the collection contains folders, several steps are generated:

    • One step is generated for each folder.

    • The step name corresponds to the folder name.

  • Only one level of steps is supported:

    • Nested folders are named folder/subfolder, for example root folder/subfolder 1/subfolder 2.
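
As a sketch, a collection organized with two folders (the names are hypothetical):

MyCollection
├── Login       (3 requests)
└── Checkout    (5 requests)

would produce a report with two steps named Login and Checkout, containing 3 and 5 requests respectively.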

Returning a custom value as the result

asm-pm-runner can return a custom value as the result of a collection run. This functionality is often used in ASM when you want to return something other than the total response time as the main result of the check run. In the collection, use two special collection variables and assign them values in a JavaScript test:

Collection Variable    Description
_Apica_ReturnValue     A numeric value used as the check value (return value)
_Apica_ReturnUnit      The unit of the customized return value

pm.test("Status code is 200", function () { pm.response.to.have.status(200); }); let eventCount =pm.response.json().length; pm.collectionVariables.set("_Apica_ReturnValue",eventCount) pm.collectionVariables.set("_Apica_ReturnUnit","events")

Returning variables in the result

You can also return variable values in the result report. To suppress a variable from the result report, use a name starting with the "_" character. Collection and environment variables that are not changed during the run of the collection are typed as input variables.

let json = pm.response.json();
pm.collectionVariables.set("eventName", json.name);
pm.collectionVariables.set("category", json?.category?.description || "");
pm.collectionVariables.set("mediaLink", json?.mediaItem?.url || "");
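
As a minimal sketch of the "_" naming convention described above (the variable name is hypothetical), a variable set like this is suppressed from the result report:

// The leading "_" keeps this variable out of the result report.
pm.collectionVariables.set("_tempToken", "do-not-report");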

Key links: the npm package page includes a full generated JSON report and an example of how variables are reported in the JSON result report.


6. “cryptify” Package Details

To encrypt and decrypt files in a project, you can use the npm package cryptify, a simple file-based encryption (FBE) utility for Node.js. The following file types support encryption/decryption:

  • certificates

  • data files

  • shared variables

You must supply a decryption key with the -dk option. The file must be encrypted with the same key using the command-line version of cryptify.

Install cryptify

$ npm install -g cryptify

Encrypt with cryptify

$ cryptify

Usage: cryptify [options] [command]

Options:
  -v, --version    Display the current version
  -l, --list       List available ciphers
  -h, --help       Display help for the command

Commands:
  encrypt [options] <file...>   Encrypt file(s)
  decrypt [options] <file...>   Decrypt file(s)
  help <command>                Display help for the command

cryptify Syntax Example

$ cryptify encrypt file.txt -p 'Secret123!'

$ cryptify decrypt file.txt -p 'Secret123!'

Password Requirements:

1. Must contain at least 8 characters

2. Must contain at least 1 special character

3. Must contain at least 1 numeric character

4. Must contain a combination of uppercase and lowercase characters

Example: encrypting a file with cryptify:

$ cryptify encrypt common_vars.json --password URL-XI.4.data

Example: running a collection with an encrypted Postman environment file:

$ asm-pm-runner samples/HTTP-Bin-Requests.postman_collection.json -r results -l debug -v -e samples/ec_env/Http_bin_environment.postman_environment.json -dk Encrypt.4.ssl

