
Commit ed9a213

title casing + position of warning box fixed [netlify-build]
1 parent 181a592 commit ed9a213

File tree

1 file changed: +16 −16 lines changed
  • src/connections/sources/catalog/cloud-apps/amazon-s3


src/connections/sources/catalog/cloud-apps/amazon-s3/index.md

Lines changed: 16 additions & 16 deletions
@@ -17,10 +17,6 @@ The goal of this walkthrough is to make this process easier by providing an auto
 - an AWS IAM execution role that grants the permissions your Lambda function needs through the permissions policy associated with this role
 - an AWS S3 source bucket with a notification configuration that invokes the Lambda function

-> warning "CSV support recommendation"
->
-> Implementing a production-grade solution with this tutorial can be complex. Segment recommends that you submit feature requests for Segment reverse ETL for CSV support.
-
 ## Prerequisites

 This tutorial assumes that you have some basic understanding of S3, Lambda and the `aws cli` tool. If you haven't already, follow the instructions in [Getting Started with AWS Lambda](https://docs.aws.amazon.com/lambda/latest/dg/getting-started.html){:target="_blank"} to create your first Lambda function. If you're unfamiliar with `aws cli`, follow the instructions in [Setting up the AWS Command Line Interface](https://docs.aws.amazon.com/polly/latest/dg/setup-aws-cli.html){:target="_blank"} before you proceed.
@@ -31,13 +27,13 @@ On Linux and macOS, use your preferred shell and package manager. On macOS, you

 [Install NPM](https://www.npmjs.com/get-npm){:target="_blank"} to manage the function's dependencies.

-## Getting Started
+## Getting started

 ### 1. Create an S3 source in Segment

 Remember the write key for this source, you'll need it in a later step.

-### 2. Create the Execution Role
+### 2. Create the execution role

 Create the [execution role](https://docs.aws.amazon.com/lambda/latest/dg/lambda-intro-execution-role.html){:target="_blank"} that gives your function permission to access AWS resources.

@@ -57,7 +53,7 @@ Create the [execution role](https://docs.aws.amazon.com/lambda/latest/dg/lambda-

 The **AWSLambdaExecute** policy has the permissions that the function needs to manage objects in Amazon S3, and write logs to CloudWatch Logs.

-### 3. Create Local Files, an S3 Bucket and Upload a Sample Object
+### 3. Create local files, an S3 bucket and upload a sample object

 Follow these steps to create your local files, S3 bucket and upload an object.

@@ -77,7 +73,7 @@ Follow these steps to create your local files, S3 bucket and upload an object.
 3. Create your bucket. **Record your bucket name** - you'll need it later!
 4. In the source bucket, upload `track_1.csv`.

-### 4. Create the Function
+### 4. Create the function

 Next, create the Lambda function, install dependencies, and zip everything up so it can be deployed to AWS.

@@ -264,11 +260,11 @@ The command above sets a 90-second timeout value as the function configuration.
 S3-Lambda-Segment$ aws lambda update-function-configuration --function-name <!Your Lambda Name!> --timeout 180
 ```

-### 5. Test the Lambda Function
+### 5. Test the lambda function

 In this step, you invoke the Lambda function manually using sample Amazon S3 event data.

-**To test the Lambda function**
+**To test the lambda function**

 1. Create an empty file named `output.txt` in the `S3-Lambda-Segment` folder - the aws cli complains if it's not there.
 ```bash
@@ -285,7 +281,7 @@ In this step, you invoke the Lambda function manually using sample Amazon S3 eve

 **Note**: Calls to Segment's Object API don't show up in the Segment debugger.

-### Configure Amazon S3 to Publish Events
+### Configure Amazon S3 to publish events

 In this step, you add the remaining configuration so that Amazon S3 can publish object-created events to AWS Lambda and invoke your Lambda function.
 You'll do the following:
@@ -352,11 +348,15 @@ Last, test your system to make sure it's working as expected:
 ### Timestamps
 This script automatically transforms all CSV timestamp columns named `createdAt` and `timestamp` to timestamp objects, regardless of nesting, in preparation for Segment ingestion. If your timestamps have a different name, search the example `index.js` code for the `colParser` function, and add your column names there for automatic transformation. If you make this modification, re-zip the package (using `zip -r function.zip .`) and upload the new zip to Lambda.

-## CSV Formats
+## CSV formats

 Define your CSV file structure based on the method you want to execute.

-#### Identify Structure
+> warning "CSV support recommendation"
+>
+> Implementing a production-grade solution with this tutorial can be complex. Segment recommends that you submit feature requests for Segment reverse ETL for CSV support.
+
+#### Identify structure

 An `identify_XXXXX` .csv file uses the following field names:

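The `colParser` transformation described in the Timestamps section above can be sketched in plain JavaScript. This is a minimal sketch assuming a csvtojson-style per-column parser map; the helper `transformRow` is hypothetical (the tutorial's actual `index.js` is not shown here) and handles only flat columns, not the nested case the tutorial also covers:

```javascript
// Per-column parsers in the style of csvtojson's colParser option:
// each key names a CSV column, each value converts that column's raw
// string into a richer value before Segment ingestion.
// The column names (createdAt, timestamp) come from the tutorial text;
// add your own names here if your CSVs use different timestamp columns.
const colParser = {
  createdAt: (item) => new Date(item),
  timestamp: (item) => new Date(item),
};

// Hypothetical stand-in for the per-row transformation step:
// apply a matching parser if one exists, otherwise pass the value through.
function transformRow(row) {
  const out = {};
  for (const [key, value] of Object.entries(row)) {
    out[key] = colParser[key] ? colParser[key](value) : value;
  }
  return out;
}

const row = transformRow({ userId: 'u1', createdAt: '2024-01-15T00:00:00Z' });
console.log(row.createdAt instanceof Date); // true
```

Adding a new column name is then a one-line change to the `colParser` map, which matches the tutorial's instruction to re-zip and redeploy after editing it.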
@@ -371,7 +371,7 @@ An `identify_XXXXX` .csv file uses the following field names:
 In the above structure, the `userId` is required, but all other items are optional. Start all traits with `traits.` and then the trait name, for example `traits.account_type`. Similarly, start context fields with `context.` followed by the canonical structure. The same structure applies to `integrations.` too.


-#### Page/Screen Structure
+#### Page/Screen structure

 For example a `screen_XXXXX` or `page_YYYY` file has the following field names:

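The dot-notation convention described above (`traits.`, `context.`, and `integrations.` column prefixes expanding into nested JSON) can be sketched as follows. This is a hypothetical helper for illustration, not the tutorial's actual `index.js` code:

```javascript
// Expand dot-notation CSV column names into nested objects, so a column
// like "traits.account_type" becomes { traits: { account_type: ... } }.
function unflatten(row) {
  const out = {};
  for (const [key, value] of Object.entries(row)) {
    const parts = key.split('.');
    let node = out;
    // Walk/create intermediate objects for all but the last path segment.
    for (let i = 0; i < parts.length - 1; i++) {
      node = node[parts[i]] = node[parts[i]] || {};
    }
    node[parts[parts.length - 1]] = value;
  }
  return out;
}

// A flat CSV row becomes a nested identify-style payload:
// { userId: 'u1', traits: { account_type: 'pro' } }
const payload = unflatten({ userId: 'u1', 'traits.account_type': 'pro' });
```

The same helper covers `context.` and `integrations.` columns, since it splits on every dot rather than special-casing prefixes.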
@@ -384,7 +384,7 @@ For example a `screen_XXXXX` or `page_YYYY` file has the following field names:
 7. `timestamp` (Unix time) - Optional
 8. `integrations.<integration>` - Optional

-#### Track Structure
+#### Track structure

 For example a `track_XXXXX` file has the following field names:

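Based on the field conventions shown above, a minimal `track_XXXXX` file might look like the following. The specific columns are illustrative assumptions patterned after the identify and page/screen structures (an `event` column and `properties.` prefix follow the shape of Segment track calls), not taken verbatim from the tutorial's field list:

```csv
userId,event,properties.price,timestamp
u123,Order Completed,49.99,1719878400
```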
@@ -413,7 +413,7 @@ For any of these methods, you might need to pass nested JSON to the tracking or 

 The example `index.js` sample code above does not support ingestion of arrays. If you need this functionality you can modify the sample code as needed.

-#### Object Structure
+#### Object structure

 There are cases when Segment's tracking API is not suitable for datasets that you might want to move to a warehouse. This could be e-commerce product data, media content metadata, campaign performance, and so on.
