
Commit de5465a

[Logs] BigQuery info for logpush (#1285)
* adding draft bigquery content
* kody edits
1 parent b007a7e commit de5465a

File tree

1 file changed: +14 -0 lines changed
  • products/logs/src/content/get-started/enable-destinations/bigquery

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
---
order: 61
pcx-content-type: concept
---

# Enable BigQuery
Configure Logpush to send batches of Cloudflare logs to BigQuery.
BigQuery supports up to 1,500 load jobs per table per day (including failures), with up to 10 million files in each load. That means you can load into BigQuery once per minute (1,440 loads per day, within the limit) and include up to 10 million files in each load. See BigQuery's quotas for load jobs for more information.
Logpush delivers batches of logs as soon as possible, which means you could receive more than one batch of files per minute. Ensure your BigQuery load job is configured to ingest files on a fixed time interval, such as every minute, rather than whenever files are received. Ingesting files into BigQuery as each Logpush file arrives could exhaust your BigQuery load-job quota quickly.
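A minimal sketch of an interval-based load, assuming the `google-cloud-bigquery` Python client and hypothetical names for the Cloud Storage bucket, file prefix, and destination table; an external scheduler such as cron or Cloud Scheduler would run it once per minute, and a real setup would also track or relocate files that have already been loaded so each run only picks up new ones:

```python
# Load the Logpush files currently under a Cloud Storage prefix into BigQuery
# as a single batch load job. Run this on a fixed schedule (for example once
# per minute) rather than once per delivered file, to stay within BigQuery's
# 1,500 load jobs per table per day quota.
# The bucket, prefix, and table names below are placeholders.
from google.cloud import bigquery

GCS_URI = "gs://my-logpush-bucket/http_requests/*.log.gz"  # hypothetical bucket/prefix
TABLE_ID = "my-project.cloudflare_logs.http_requests"      # hypothetical destination table


def load_logpush_batch() -> None:
    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        # Logpush delivers gzipped newline-delimited JSON files.
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,  # or supply an explicit schema for the dataset fields
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # One load job per run, covering every file that matches the wildcard.
    load_job = client.load_table_from_uri(GCS_URI, TABLE_ID, job_config=job_config)
    load_job.result()  # block until the load job finishes, or raise on error
    print(f"Loaded {load_job.output_rows} rows into {TABLE_ID}")


if __name__ == "__main__":
    load_logpush_batch()
```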
For an example of how to set up a scheduled load job with BigQuery, see the [Cloudflare + Google Cloud | Integrations repository](https://github.com/cloudflare/cloudflare-gcp/tree/master/logpush-to-bigquery).
