
Commit 89ecdcb

Commit message: more of these
1 parent b2b6abc commit 89ecdcb

29 files changed (+69, -71 lines)

src/connections/storage/catalog/data-lakes/index.md
Lines changed: 3 additions & 3 deletions

@@ -11,7 +11,7 @@ Segment supports two type of data-lakes:
  - [AWS Data Lakes](/docs/connections/storage/catalog/data-lakes/#set-up-segment-data-lakes)
  - [Segment Data Lakes (Azure)](/docs/connections/storage/catalog/data-lakes/#set-up-segment-data-lakes-azure)

- > note "Lake Formation"
+ > success ""
  > You can also set up your Segment Data Lakes using [Lake Formation](/docs/connections/storage/data-lakes/lake-formation/), a fully managed service built on top of the AWS Glue Data Catalog.

  ## Set up Segment Data Lakes (AWS)

@@ -167,7 +167,7 @@ Before you can configure your Azure resources, you must complete the following p

  ### Step 4 - Set up Databricks

- > note "Databricks pricing tier"
+ > info "Databricks pricing tier"
  > If you create a Databricks instance only for Segment Data Lakes (Azure) usage, only the standard pricing tier is required. However, if you use your Databricks instance for other applications, you may require premium pricing.

  1. From the [home page of your Azure portal](https://portal.azure.com/#home){:target="_blank”}, select **Create a resource**.

@@ -346,7 +346,7 @@ After you set up the necessary resources in Azure, the next step is to set up th

  Instead of manually configuring your Data Lake, you can create it using the script in the [`terraform-segment-data-lakes`](https://github.com/segmentio/terraform-segment-data-lakes){:target="_blank”} GitHub repository.

- > note " "
+ > warning ""
  > This script requires Terraform versions 0.12+.

  Before you can run the Terraform script, create a Databricks workspace in the Azure UI using the instructions in [Step 4 - Set up Databricks](#step-4---set-up-databricks). Note the **Workspace URL**, as you will need it to run the script.

src/connections/storage/data-lakes/data-lakes-manual-setup.md
Lines changed: 3 additions & 3 deletions

@@ -79,7 +79,7 @@ Segment requires access to an EMR cluster to perform necessary data processing.
  14. Expand the EC2 security groups section and select the appropriate security groups for the Master and Core & Task types.
  15. Select **Create cluster**.

- > note ""
+ > info ""
  > If you update the EMR cluster of existing Data Lakes instance, take note of the EMR cluster ID on the confirmation page.

  ## Step 3 - Create an Access Management role and policy

@@ -119,7 +119,7 @@ Attach the following trust relationship document to the role to create a `segmen
  }
  ```

- > note ""
+ > info ""
  > Replace the `ExternalID` list with the Segment `WorkspaceID` that contains the sources to sync to the Data Lake.

  ### IAM policy
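
The trust relationship document itself is truncated in the hunk above (only its closing braces survive). Its general shape is the standard AWS `sts:AssumeRole` trust policy with an `sts:ExternalId` condition, which is where the `ExternalID` note applies. A hypothetical sketch — the principal account ARN and workspace ID below are placeholders, not Segment's real values:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123456789012:root"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": ["YOUR_SEGMENT_WORKSPACE_ID"]
        }
      }
    }
  ]
}
```

Per the note in the diff, the `sts:ExternalId` list would carry the Segment `WorkspaceID` that contains the sources to sync; consult the linked docs page for the authoritative document.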
@@ -210,7 +210,7 @@ Add a policy to the role created above to give Segment access to the relevant Gl
  }
  ```

- > note ""
+ > warning ""
  > The policy above grants full access to Athena, but the individual Glue and S3 policies determine which table is queried. Segment queries for debugging purposes, and notifies you before running any queries.

  ## Debugging

src/connections/storage/data-lakes/lake-formation.md
Lines changed: 1 addition & 1 deletion

@@ -46,7 +46,7 @@ To verify that you've configured Lake Formation, open the [AWS Lake Formation se

  ### Configure Lake Formation using IAM policies

- > note "Granting Super permission to IAM roles"
+ > info "Granting Super permission to IAM roles"
  > If you manually configured your database, assign the `EMR_EC2_DefaultRole` Super permissions in step 8. If you configured your database using Terraform, assign the `segment_emr_instance_profile` Super permissions in step 8.

  #### Existing databases

src/connections/storage/warehouses/health.md
Lines changed: 2 additions & 2 deletions

@@ -11,8 +11,8 @@ You can use this feature to answer questions such as:
  - *Anomaly detection* - How much data is being synced on a daily basis? Have there been anomalous spikes or dips that may indicate sudden changes in event volume, sync failures, or something else?
  - *Data composition* - Which sources are contributing the most (or least) amount of data in my warehouse? Which collections make up the majority of data within a source?

- > note ""
- > **Note**: Warehouse Health is available for all Warehouse customers.
+ > success ""
+ > Warehouse Health is available for all Warehouse customers.


  The Warehouse Health dashboards are available at both the [warehouse level](#warehouse-dashboards), and at the [warehouse-source connection level](#warehouse-source-dashboards), explained below.

src/connections/storage/warehouses/redshift-useful-sql.md
Lines changed: 1 addition & 1 deletion

@@ -19,7 +19,7 @@ You can use SQL queries for the following tasks:
  - [Historical Traits](#historical-traits-1)
  - [Converting the Groups Table into an Organizations Table](#converting-the-groups-table-into-an-organizations-table)

- > note " "
+ > success " "
  > If you're looking for SQL queries for warehouses other than Redshift, check out some of Segment's [Analyzing with SQL guides](/docs/connections/storage/warehouses#analyzing-with-sql).

  ## Tracking events

src/connections/storage/warehouses/schema.md
Lines changed: 3 additions & 2 deletions

@@ -5,8 +5,9 @@ title: Warehouse Schemas
  A **schema** describes the way that the data in a warehouse is organized. Segment stores data in relational schemas, which organize data into the following template:
  `<source>.<collection>.<property>`, for example `segment_engineering.tracks.user_id`, where source refers to the source or project name (segment_engineering), collection refers to the event (tracks), and the property refers to the data being collected (user_id). All schemas convert collection and property names from `CamelCase` to `snake_case` using the [go-snakecase](https://github.com/segmentio/go-snakecase) package.

- > note "Warehouse column creation"
- > **Note:** Segment creates tables for each of your custom events in your warehouse, with columns for each event's custom properties. Segment does not allow unbounded `event` or `property` spaces in your data. Instead of recording events like "Ordered Product 15", use a single property of "Product Number" or similar.
+ > info "Warehouse column creation"
+ > Segment creates tables for each of your custom events in your warehouse, with columns for each event's custom properties. Segment does not allow unbounded `event` or `property` spaces in your data. Instead of recording events like "Ordered Product 15", use a single property of "Product Number" or similar.
+ >
  > Segment creates and populates a column only when it receives a non-null value from the source.

  ### How warehouse tables handle nested objects and arrays
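
The `CamelCase` to `snake_case` conversion described in this file can be sketched in a few lines. This is a rough approximation of the behavior, not the actual [go-snakecase](https://github.com/segmentio/go-snakecase) implementation, which handles more edge cases (acronyms, digit runs):

```javascript
// Rough approximation of how Segment converts collection and property
// names for warehouse schemas. The real conversion uses the go-snakecase
// Go package and may differ on edge cases like acronyms.
function toSnakeCase(name) {
  return name
    .replace(/([a-z0-9])([A-Z])/g, '$1_$2') // split lower/Upper boundaries
    .toLowerCase();
}

console.log(toSnakeCase('OrderedProduct')); // 'ordered_product'
console.log(toSnakeCase('userId'));         // 'user_id'
```

So a track event named `OrderedProduct` from a source called `segment_engineering` would land in a table reachable as `segment_engineering.ordered_product`.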

src/connections/storage/warehouses/warehouse-syncs.md
Lines changed: 4 additions & 4 deletions

@@ -23,8 +23,8 @@ Your plan determines how frequently data is synced to your warehouse.

  *If you're a Business plan member and would like to adjust your sync frequency, you can do so using the Selective Sync feature. To enable Selective Sync, please go to **Warehouse** > **Settings** > **Sync Schedule**.

- > note "Why can't I sync more than 24 times per day?"
- > We do not set syncs to happen more than once per hour (24 times per day). The warehouse product is not designed for real-time data, so more frequent syncs would not necessarily be helpful.
+ > info "Why can't I sync more than 24 times per day?"
+ > Segment does not set syncs to happen more than once per hour (24 times per day). The warehouse product is not designed for real-time data, so more frequent syncs would not necessarily be helpful.

  ## Sync History
  You can use the Sync History page to see the status and history of data updates in your warehouse. The Sync History page is available for every source connected to each warehouse. This page helps you answer questions like, “Has the data from a specific source been updated recently?” “Did a sync completely fail, or only partially fail?” and “Why wasn't this sync successful?”

@@ -61,8 +61,8 @@ Warehouse Selective Sync allows you to manage the data that you send to your war

  With Selective Sync, you can customize which collections and properties from a source are sent to each warehouse. This helps you control the data that is sent to each warehouse, allowing you to sync different sets of data from the same source to different warehouses.

- > note ""
- > **NOTE:** This feature only affects [warehouses](/docs/connections/storage/warehouses/), and doesn't prevent data from going to any other [destinations](/docs/connections/destinations/).
+ > info ""
+ > This feature only affects [warehouses](/docs/connections/storage/warehouses/), and doesn't prevent data from going to any other [destinations](/docs/connections/destinations/).

  When you disable a source, collection or property, Segment no longer syncs data from that source. Segment won't delete any historical data from your warehouse. When you re-enable a source, Segment syncs all events since the last sync. This doesn't apply when a collection or property is re-enabled. Only new data generated after re-enabling a collection or property will sync to your warehouse.
6868

src/engage/content/email/template.md
Lines changed: 1 addition & 1 deletion

@@ -29,7 +29,7 @@ To configure an email template, click **Create Template**.

  1. Select **Email**, and click **Configure**.

- > note ""
+ > info ""
  > You must first connect a [SendGrid subuser account](https://docs.sendgrid.com/ui/account-and-settings/subusers#create-a-subuser){:target="blank"} to your Segment space to build email templates in Engage. Visit the [onboarding steps](/docs/engage/onboarding/) for more information.

  2. Configure the email template.

src/engage/journeys/step-types.md
Lines changed: 2 additions & 2 deletions

@@ -111,7 +111,7 @@ The **Send an email**, **Send an SMS**, and **Send a WhatsApp** steps are only a

  Use Twilio Engage to send email as a step in a Journey.

- > note ""
+ > info ""
  > To send email in Engage, you must connect a [SendGrid subuser account](https://docs.sendgrid.com/ui/account-and-settings/subusers#create-a-subuser){:target="blank"} to your Segment space. Visit the [onboarding steps](/docs/engage/onboarding/) for more information.

  1. From the **Add step** window, **Send an email**.

@@ -132,7 +132,7 @@ Use Twilio Engage to send email as a step in a Journey.

  Use Twilio Engage to send an SMS message as a step in a Journey.

- > note ""
+ > info ""
  > To send SMS in Engage, you must connect a Twilio messaging service to your Segment workspace. Visit the [onboarding steps](/docs/engage/onboarding/) for more information.

  1. From the **Add step** window, click **Send an SMS**.

src/getting-started/02-simple-install.md
Lines changed: 9 additions & 11 deletions

@@ -70,12 +70,10 @@ Click a tab below to see the tutorial content for the specific library you chose

  ### Step 1: Copy the Snippet
  <br>
- Navigate **Connections > Sources > JavaScript** in the Segment app and copy the snippet from the JavaScript Source overview page and paste it into the `<head>` tag of your site.
+ Navigate to **Connections > Sources > JavaScript** in the Segment app, copy the snippet from the JavaScript Source overview page, and paste it into the `<head>` tag of your site.
  <br><br>
  That snippet loads Analytics.js onto the page _asynchronously_, so it won't affect your page load speed. Once the snippet runs on your site, you can turn on destinations from the destinations page in your workspace and data starts loading on your site automatically.
  <br><br>
- > note ""
- > **Note:** If you only want the most basic Google Analytics setup you can stop reading right now. You're done! Just toggle on Google Analytics from the Segment App.

  > info ""
  > The Segment snippet version history available on [GitHub](https://github.com/segmentio/snippet/blob/master/History.md){:target="_blank"}. Segment recommends that you use the latest snippet version whenever possible.

@@ -85,8 +83,8 @@ That snippet loads Analytics.js onto the page _asynchronously_, so it won't affe
  <br>
  The `identify` method is how you tell Segment who the current user is. It includes a unique User ID and any optional traits you know about them. You can read more about it in the [identify method reference](/docs/connections/sources/catalog/libraries/website/javascript#identify).
  <br><br>
- > note ""
- > **Note:** You don't need to call `identify` for anonymous visitors to your site. Segment automatically assigns them an `anonymousId`, so just calling `page` and `track` works just fine without `identify`.
+ > info "You don't need to call `identify` for anonymous visitors to your site"
+ > Segment automatically assigns them an `anonymousId`, so just calling `page` and `track` works just fine without `identify`.
  <br><br>
  Here's an example of what a basic call to `identify` might look like:

@@ -114,8 +112,8 @@ analytics.identify(' {{user.id}} ', {
  <br>
  With that call in your page footer, you successfully identify every user that visits your site.
  <br><br>
- > note ""
- > **Note:** If you only want to use a basic CRM set up, you can stop here. Just enable Salesforce, Intercom, or any other CRM system from your Segment workspace, and Segment starts sending all of your user data to it.
+ > info ""
+ > You've completed a basic CRM set up. Return to the Segment app to enable Salesforce, Intercom, or your CRM system of choice and Segment starts sending all of your user data to it.

  <br>
  ### Step 3: Track Actions
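
For reference, the shape of the `identify` call discussed in the hunks above can be sketched with a hypothetical stub in place of the real Analytics.js client. The user ID `f4ca124298` comes from the surrounding text; the trait values and the stub itself are illustrative, not the library's implementation:

```javascript
// Hypothetical stub standing in for the real Analytics.js client, so the
// call shape can be shown outside a browser. On a real page the `analytics`
// object is created by the Segment snippet in the <head> tag.
const analytics = {
  identify(userId, traits) {
    // The real client enqueues this call and delivers it to Segment's API.
    return { userId, traits };
  },
};

// Identify a signed-in user by the ID your database knows them by,
// plus any optional traits (values here are illustrative).
const call = analytics.identify('f4ca124298', {
  name: 'Michael',
  email: 'michael@example.com',
});
```

In a real app, the hard-coded ID and traits would be replaced with variables describing the currently signed-in user, as the iOS note below repeats.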
@@ -209,8 +207,8 @@ Here's an example of what a basic call to `identify` might look like:
  <br>
  This call identifies Michael by his unique User ID (`f4ca124298`, which is the one you know him by in your database) and labels him with `name` and `email` traits.
  <br><br>
- > note ""
- > **Note:** When you put that code in your iOS app, you need to replace those hard-coded trait values with the variables that represent the details of the currently logged-in user.
+ > info ""
+ > When you put the above code in your iOS app, you would replace those hard-coded trait values with variables that represent the details of the user that's currently signed in.
  <br><br>
  ### Step 3: Track Actions
  <br>

@@ -288,8 +286,8 @@ Segment::init("YOUR_WRITE_KEY");
  You only need to call `init` once when your php file is requested. All of your files then have access to the same `Analytics` client.

- > note ""
- > **Note:** The default PHP consumer is the [libcurl consumer](/docs/connections/sources/catalog/libraries/server/php/#lib-curl-consumer). If this is not working well for you, or if you have a high-volume project, you might try one of Segment's other consumers like the [fork-curl consumer](/docs/connections/sources/catalog/libraries/server/php/#fork-curl-consumer).
+ > info ""
+ > Segment's default PHP consumer is the [libcurl consumer](/docs/connections/sources/catalog/libraries/server/php/#lib-curl-consumer). If this is not working well for you or if you have a high-volume project, you might try one of Segment's other consumers like the [fork-curl consumer](/docs/connections/sources/catalog/libraries/server/php/#fork-curl-consumer).

  <br>
  ### Step 2: Identify Users
