@@ -16,10 +16,6 @@ process to pull raw events and objects and load them into your BigQuery cluster.
Using BigQuery through Segment means you'll get a fully managed data pipeline
loaded into one of the most powerful and cost-effective data warehouses today.

- If you notice any gaps,
- out-dated information or want to leave some feedback to help us improve
- our documentation, [let us know](https://segment.com/help/contact)!
-
## Getting Started

First, you'll want to enable BigQuery for your Google Cloud project. Then, you
@@ -32,7 +28,7 @@ warehouse in Segment.
2. Configure [Cloud Platform](https://console.cloud.google.com/):
  - If you don't have a project already, [create one](https://support.google.com/cloud/answer/6251787?hl=en&ref_topic=6158848).
  - If you have an existing project, you will need to [enable the BigQuery API](https://cloud.google.com/bigquery/quickstart-web-ui).
-   Once you've done so, you should see BigQuery in the ["Resources" section](https://cl.ly/0W2i2I2B2R0M) of Cloud Platform.
+   Once you've done so, you should see BigQuery in the "Resources" section of Cloud Platform.
  - **Note:** make sure [billing is enabled](https://support.google.com/cloud/answer/6293499#enable-billing) on your project,
    otherwise Segment will not be able to write into the cluster.
3. Copy your project ID, as you will need it later.
@@ -56,11 +52,11 @@ The downloaded file will be used to create your warehouse in the next section.
1. In Segment, go to **Workspace** > **Add destination** > Search for "BigQuery"
2. Select **BigQuery**
3. Enter your project ID in the **Project** field
- 4. Copy the contents of the credentials (the JSON key) into the **Credentials** field
- 5. (Optional) Enter a [region code](https://cloud.google.com/compute/docs/regions-zones/) in the **Location** field (the default will be "US")
+ 4. Copy the contents of the credentials (the JSON key) into the **Credentials** field <br />
+ **Optional:** Enter a [region code](https://cloud.google.com/compute/docs/regions-zones/) in the **Location** field (the default will be "US")
6. Click **Connect**
- 7. if Segment is able to successfully connect with the **Project ID** and **Credentials**,
- the warehouse will be created and your first sync should begin shortly
+ 7. If Segment is able to successfully connect with the provided **Project ID** and **Credentials**,
+ a warehouse will be created and your first sync should begin shortly

### Schema

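Once that first sync finishes, a quick way to confirm data is landing is to run a small query from the BigQuery console. This is only a sketch: the identifiers follow the `<project-id>.<source-name>.<collection-name>` pattern used elsewhere in these docs, and `pages` is just an example collection that your source may or may not send.

```sql
-- Sketch only: substitute your own project ID, source name, and collection name.
SELECT COUNT(*) AS row_count
FROM `<project-id>.<source-name>.pages`;
```
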
@@ -108,17 +104,17 @@ from <project-id>.<source-name>.<collection-name>_view

For early customers using BigQuery with Segment, rather than providing Segment
with credentials, access was granted to a shared Service Account
- (`[email protected]`). While convenient early
- adopters, this presents potential security risks that we would prefer to address
+ (`[email protected]`). While convenient for early
+ adopters, this presents potential security risks that Segment would prefer to address
proactively.

- Starting in **March 2019**, we're going to start requiring BigQuery customers to
- create their own Service Accounts and provide us with those credentials instead.
+ As of **March 2019**, Segment requires BigQuery customers to
+ create their own Service Accounts and provide the app with those credentials instead.
In addition, any attempts to update warehouse connection settings will also
require these credentials. This effectively deprecates the shared Service
- Account, and in the future it will be deactivated completely.
+ Account.

- In order to stay ahead of this, make sure to migrate your warehouse by following
+ In order to stay ahead of this change, make sure to migrate your warehouse by following
the instructions in the "Create a Service Account for Segment" section above.
Then, head to your warehouse's connection settings and update with the
**Credentials** you created along the way.
@@ -159,7 +155,7 @@ querying sub-sets of tables.
Absolutely! You will just need to modify one of the references to 60 in the view
definition to the number of days of your choosing.

- We chose 60 days as it suits the needs for most of our customers. However,
+ We chose 60 days as it suits the needs of most of our customers. However,
you're welcome to update the definition of the view as long as the name stays
the same.

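For reference, a rolling 60-day view of this shape could be written roughly as below in legacy SQL. This is a sketch under assumptions, not necessarily the exact definition Segment generates, so compare it against the definition in your own project before editing; the placeholders follow the `<project-id>.<source-name>.<collection-name>` pattern.

```sql
-- Sketch of a rolling 60-day view in legacy BigQuery SQL; your actual view
-- definition may differ. Placeholders: <project-id>, <source-name>, <collection-name>.
SELECT *
FROM TABLE_DATE_RANGE(
  [<project-id>:<source-name>.<collection-name>_],
  DATE_ADD(CURRENT_TIMESTAMP(), -60, 'DAY'),
  CURRENT_TIMESTAMP()
)
```
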
@@ -196,8 +192,8 @@ costs.
You can connect to BigQuery using a BI tool like Mode or Looker, or query
directly from the BigQuery console.

- BigQuery now supports standard SQL, which you can enable using their query UI.
- This does not work with views, or with a query that utilizes table range
+ BigQuery now supports standard SQL, which you can enable using their query UI.
+ This does not work with views, or with a query that uses table range
functions.
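
For example, with standard SQL enabled, a query against one of the raw tables might look like the following. This is an illustrative sketch: the project, dataset, and table names are placeholders, and it assumes a synced `tracks` collection with a `received_at` column. Because standard SQL does not work with the views, point this kind of query at the underlying tables instead.

```sql
-- Standard SQL sketch (run with legacy SQL disabled in the query settings).
-- Placeholder names; assumes a synced `tracks` table with a `received_at` column.
SELECT
  event,
  COUNT(*) AS event_count
FROM `<project-id>.<source-name>.tracks`
WHERE received_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY event
ORDER BY event_count DESC;
```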

### Does Segment support streaming inserts?