# AWS S3

## <mark style="background-color:blue;">Apica Ascent S3 Import App Extension (PULL)</mark>

Apica Ascent can ingest data directly from any S3-compatible object storage. Head over to App extensions to create an object importer app extension.

You can find App extensions under the **Explore** menu.

Once inside the App extensions menu, select **AWS S3/Compatible Object store**.

<figure><img src="https://2948796384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LmzGprckLqwd5v6bs6m%2Fuploads%2FLkNhFLA8516L7MmYnA30%2FScreen%20Shot%202023-01-02%20at%201.58.18%20PM.png?alt=media&#x26;token=350504d9-53c8-4776-8393-858da4082c42" alt=""><figcaption><p>AWS S3/Compatible object store</p></figcaption></figure>

Create the extension and provide the settings for accessing your object store bucket. The settings menu provides options for customizations specific to vendor object store implementations.

<figure><img src="https://2948796384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LmzGprckLqwd5v6bs6m%2Fuploads%2FyNwMQbxXtcUGwuc4n8kx%2FScreen%20Shot%202023-01-02%20at%201.58.53%20PM.png?alt=media&#x26;token=01a3f353-5843-4be8-bb36-118c69ecd8b3" alt=""><figcaption><p>Configuring access to the bucket</p></figcaption></figure>
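As an illustration, the kind of information the extension needs roughly corresponds to the sketch below. The field names and values here are illustrative placeholders, not the exact labels used by the Apica Ascent UI:

```python
# Illustrative settings for accessing an S3-compatible bucket.
# Field names are examples only; the extension UI labels may differ.
s3_import_settings = {
    "endpoint": "https://s3.us-east-1.amazonaws.com",  # or a vendor endpoint (e.g. MinIO)
    "region": "us-east-1",
    "bucket": "my-log-bucket",           # hypothetical bucket name
    "access_key_id": "<access-key>",     # credentials with read access to the bucket
    "secret_access_key": "<secret-key>",
    "prefix": "logs/",                   # optional: import only objects under this prefix
}

def missing_fields(settings: dict) -> list:
    """Return the names of required fields that are missing or empty."""
    required = ("endpoint", "bucket", "access_key_id", "secret_access_key")
    return [field for field in required if not settings.get(field)]

print(missing_fields(s3_import_settings))  # [] when all required fields are set
```

Whatever the exact field names, the credentials you supply must have read access to the bucket, and vendor implementations typically differ mainly in the endpoint URL.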

That is all you need. Your data from the object store bucket will show up as a flow in the Apica Ascent platform.

<figure><img src="https://2948796384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LmzGprckLqwd5v6bs6m%2Fuploads%2Ffu6P54yagkSsiYDyGqp7%2FScreen%20Shot%202023-01-02%20at%202.05.18%20PM.png?alt=media&#x26;token=c23f89f8-78a9-447f-92a0-aad27c42cc3f" alt=""><figcaption><p>Viewing the object store data import in "Explore" as a Flow</p></figcaption></figure>

## <mark style="background-color:blue;">Apica Ascent S3 exporter Lambda function (PUSH)</mark>

![](https://2948796384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LmzGprckLqwd5v6bs6m%2Fuploads%2F14tQu6HmMCBvaxIfCupw%2Fflash-high-level-s3.png?alt=media\&token=725a94a5-482d-41e2-8ad0-0e6448ed04c7)

### Creating a Lambda function

Apica Ascent provides a CloudFormation template to create the Apica Ascent S3 exporter Lambda function.

```
https://logiqcf.s3.amazonaws.com/s3-exporter/cf.yaml
```

{% hint style="info" %}
You can also download the CloudFormation template from our [client-integrations](https://github.com/ApicaSystem/client-integrations/tree/master/cloudwatch-exporter) GitHub repository.
{% endhint %}

This CloudFormation stack creates a Lambda function and its necessary permissions. You must configure the following parameters.

| Parameter     | Description                                                                                                                                                    |
| ------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `APPNAME`     | Application name - a readable name for Apica Ascent to partition logs.                                                                                         |
| `CLUSTERID`   | Cluster ID - a readable name for Apica Ascent to partition logs.                                                                                               |
| `NAMESPACE`   | Namespace - a readable name for Apica Ascent to partition logs.                                                                                                |
| `LOGIQHOST`   | IP address or hostname of the Apica Ascent server.                                                                                                             |
| `INGESTTOKEN` | JWT token to securely ingest logs. Refer [here](https://docs.apica.io/overview/generating-a-secure-ingest-token#generating-using-ui) to generate an ingest token. |
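If you prefer to launch the stack programmatically rather than through the console, the parameters above can be passed in the standard `ParameterKey`/`ParameterValue` shape that the CloudFormation API expects. The sketch below builds that structure; all values are placeholders for your own environment, and the boto3 call is shown as a comment because it requires AWS credentials:

```python
# Build the Parameters list in the shape CloudFormation's create_stack expects.
# All values below are placeholders for your own environment.
stack_parameters = {
    "APPNAME": "my-app",                 # readable name used to partition logs
    "CLUSTERID": "prod-cluster",         # readable name used to partition logs
    "NAMESPACE": "default",              # readable name used to partition logs
    "LOGIQHOST": "ascent.example.com",   # hostname of your Apica Ascent server
    "INGESTTOKEN": "<jwt-ingest-token>", # generated from the Ascent UI
}

parameters = [
    {"ParameterKey": key, "ParameterValue": value}
    for key, value in stack_parameters.items()
]

# With boto3 installed and AWS credentials configured, the stack could be
# launched roughly like this (the template creates IAM permissions for the
# Lambda, hence CAPABILITY_IAM):
#
# import boto3
# cfn = boto3.client("cloudformation")
# cfn.create_stack(
#     StackName="apica-s3-exporter",
#     TemplateURL="https://logiqcf.s3.amazonaws.com/s3-exporter/cf.yaml",
#     Parameters=parameters,
#     Capabilities=["CAPABILITY_IAM"],
# )

print(parameters[0])  # {'ParameterKey': 'APPNAME', 'ParameterValue': 'my-app'}
```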

## Configuring S3 trigger

Once the CloudFormation stack is created, navigate to the AWS Lambda function (`logiq-s3-logs-exporter`) and add an S3 trigger.

![](https://2948796384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LmzGprckLqwd5v6bs6m%2Fuploads%2FUXDHXr3jeTldJKPqGI81%2FScreenshot%202021-11-09%20at%2021-47-48%20logiq-s3-logs-exporter%20-%20Lambda.png?alt=media\&token=eda66d01-ccd6-4b8b-8d14-474f52750b2f)

On the **Add trigger** page, select **S3**. Next, select the **Bucket** you'd like to forward logs from and add a **Prefix**.

![](https://2948796384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LmzGprckLqwd5v6bs6m%2Fuploads%2FCX51x97dbkSM3ryUMm2E%2FScreenshot%202021-11-09%20at%2021-46-40%20Lambda.png?alt=media\&token=e5bb7ef6-bd09-4c9c-b218-f47783d25509)

Once this configuration is complete, any new log files in the S3 bucket will be streamed to the Apica Ascent cluster.
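For reference, the S3 trigger invokes the Lambda function with the standard S3 event notification payload, which lists each new object in a `Records` array. The sketch below shows how such an event is parsed to recover the bucket and object key; this illustrates the standard AWS event shape, not the exporter's exact implementation (note that object keys arrive URL-encoded):

```python
# Extract (bucket, key) pairs from a standard S3 event notification,
# i.e. the payload an S3 trigger delivers to a Lambda function.
from urllib.parse import unquote_plus

def objects_from_s3_event(event: dict) -> list:
    """Return (bucket, key) for each record; keys are URL-decoded."""
    return [
        (
            record["s3"]["bucket"]["name"],
            unquote_plus(record["s3"]["object"]["key"]),
        )
        for record in event.get("Records", [])
    ]

# Trimmed-down example of the event S3 sends on object creation:
sample_event = {
    "Records": [
        {
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "my-log-bucket"},
                "object": {"key": "logs/app%2B2023/access.log"},
            },
        }
    ]
}

print(objects_from_s3_event(sample_event))
# [('my-log-bucket', 'logs/app+2023/access.log')]
```

Because the trigger is scoped by the **Bucket** and **Prefix** you selected, only objects written under that prefix generate invocations.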
