1. Go to Google Cloud Console -> Dataflow -> [Create job from template](https://console.cloud.google.com/dataflow/createjob)
2. Choose the name and region for the Dataflow job
3. Select **Custom Template**
4. As the Template path, use `better-stack-gcs-dataflow/pubsub-to-betterstack.json`
5. Set parameters based on your Google Cloud Pub/Sub Subscription and [Better Stack Telemetry source](https://telemetry.betterstack.com/team/260195/sources)
6. Click **Run job**
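
If you prefer to confirm the deployment from a terminal, a quick check is the sketch below, assuming the gcloud CLI is authenticated for the same project and you substitute the region chosen in step 2:

```bash
# List active Dataflow jobs in the region where the job was created
# (europe-west1 is only an example region)
gcloud dataflow jobs list --region=europe-west1 --status=active
```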

Alternatively, you can start the job from the command line:

1. Set parameters based on your Google Cloud Pub/Sub Subscription and [Better Stack Telemetry source](https://telemetry.betterstack.com/team/260195/sources)
2. Run the job with `gcloud dataflow flex-template run "pubsub-to-betterstack-$(date +%Y%m%d-%H%M%S)" ...`, filling in the template location and parameters as sketched below
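
A full invocation might look like the following sketch. The job name, template file, and optional tuning parameters come from this README; the region and the required parameter names (`input_subscription`, `better_stack_source_token`) are placeholders rather than confirmed names, and the `gs://` location assumes the `better-stack-gcs-dataflow` path referenced in the Console steps above, so check the template's parameter list for the exact values:

```bash
# Hypothetical sketch - substitute your own region, Pub/Sub subscription,
# and Better Stack source token. The parameter names used here are
# placeholders and may differ from the template's actual parameter names.
gcloud dataflow flex-template run "pubsub-to-betterstack-$(date +%Y%m%d-%H%M%S)" \
  --template-file-gcs-location="gs://better-stack-gcs-dataflow/pubsub-to-betterstack.json" \
  --region="europe-west1" \
  --parameters input_subscription="projects/PROJECT_ID/subscriptions/SUBSCRIPTION_NAME",better_stack_source_token="SOURCE_TOKEN"
```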
The template supports the following optional parameters:

- `batch_size` - Number of messages to batch before sending to Better Stack. Default: 100
- `window_size` - Window size in seconds for batching messages. Default: 10
- `max_retries` - Maximum number of retry attempts for failed requests. Default: 3
- `initial_retry_delay` - Initial delay between retries in seconds. Default: 1

You can include these parameters in your Dataflow job by adding them to the run command, e.g. `gcloud dataflow flex-template run ... --parameters window_size=30`.
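
For instance, building on the hypothetical run command sketched earlier (region, template location, and required parameters as before), several optional parameters can be combined in a single `--parameters` flag; the values below are purely illustrative:

```bash
# Illustrative tuning values only - any required template parameters
# (e.g. the subscription and source token from the sketch above) must still be supplied.
gcloud dataflow flex-template run "pubsub-to-betterstack-$(date +%Y%m%d-%H%M%S)" \
  --template-file-gcs-location="gs://better-stack-gcs-dataflow/pubsub-to-betterstack.json" \
  --region="europe-west1" \
  --parameters batch_size=200,window_size=30,max_retries=5,initial_retry_delay=2
```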