Connector Configuration
A list of Kafka topics to read from
- Type: List of comma-delimited strings
- Required (no default)
The BigQuery project to write to
- Type: String
- Required (no default)
Names for the datasets Kafka topics will write to (form of <topic regex>=<dataset>)
- Type: List of comma-delimited strings
- Required (no default)
The file containing a JSON key with BigQuery service account credentials
- Type: String
- Default: ""
The base URL of the Schema Registry instance to use
- Type: String
- Default: ""
- Required if and only if autoCreateTables or autoUpdateSchemas is enabled
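A minimal connector properties sketch tying the settings above together. The property keys used here (`topics`, `project`, `datasets`, `keyfile`, `schemaRegistryLocation`) and all values are illustrative assumptions; consult the connector source for the authoritative names.

```properties
# Required: topics to read, target BigQuery project, and topic-to-dataset mappings
topics=kcbq-test
project=my-gcp-project
datasets=.*=kcbq_dataset

# Optional: service account credentials file and Schema Registry location
keyfile=/path/to/credentials.json
schemaRegistryLocation=http://localhost:8081
```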
Whether to automatically sanitize topic names before using them as table names; if not enabled, topic names will be used directly as table names
- Type: boolean
- Default: false
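Since BigQuery table names only permit letters, digits, and underscores, sanitization amounts to replacing every other character. The sketch below is an assumed illustration of that idea, not the connector's actual implementation:

```java
// Hypothetical topic-name sanitizer: BigQuery table names may only contain
// letters, digits, and underscores, so every other character is replaced
// with an underscore. Not the connector's exact code.
public class TopicSanitizer {
    public static String sanitize(String topicName) {
        return topicName.replaceAll("[^a-zA-Z0-9_]", "_");
    }

    public static void main(String[] args) {
        // Dots and dashes, common in topic names, become underscores.
        System.out.println(sanitize("my-topic.v1")); // my_topic_v1
    }
}
```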
Whether to include an extra block containing the Kafka source topic, offset, and partition information in the resulting BigQuery rows
- Type: boolean
- Default: false
The size of the cache to use when converting schemas from Avro to Kafka Connect
- Type: int
- Default: 100
Automatically create BigQuery tables if they don't already exist
- Type: boolean
- Default: false
Whether to automatically update BigQuery schemas
- Type: boolean
- Default: false
The maximum number of records to buffer per table before temporarily halting the flow of new records, or -1 for unlimited buffering
- Type: long
- Default: 100000
The number of retry attempts that will be made per BigQuery request that fails with a backend error
- Type: int
- Default: 0
The amount of time, in milliseconds, to wait between BigQuery backend error retries
- Type: long
- Default: 1000
A list of mappings from topic regexes to table names. Note that each regex must include capture groups that are referenced in the format string using placeholders (e.g., $1) (form of <topic regex>=<format string>).
- Type: List of comma-delimited strings
- Default: null
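To illustrate how such a mapping could resolve a topic name, the sketch below applies a single regex-with-capture-group mapping using standard Java regex classes. The example mapping `kafka_(.*)=kcbq_$1` is made up for illustration, and this is an assumed sketch of the behavior, not the connector's code:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Resolves a topic name against one <topic regex>=<format string> mapping.
// Capture groups in the regex are substituted into the format string via
// placeholders such as $1.
public class TopicToTable {
    public static String resolve(String topic, String regex, String format) {
        Matcher m = Pattern.compile(regex).matcher(topic);
        if (m.matches()) {
            return m.replaceAll(format);
        }
        return topic; // no mapping matched; fall back to the topic name itself
    }

    public static void main(String[] args) {
        // "kafka_users" matches "kafka_(.*)" with $1 = "users".
        System.out.println(resolve("kafka_users", "kafka_(.*)", "kcbq_$1")); // kcbq_users
    }
}
```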
The batch writer class to be used. At the moment there are only two options:
- com.wepay.kafka.connect.bigquery.write.batch.DynamicBatchWriter
- com.wepay.kafka.connect.bigquery.write.batch.SingleBatchWriter
See these classes for documentation.
- Type: String
- Default: com.wepay.kafka.connect.bigquery.write.batch.DynamicBatchWriter
The size of the BigQuery write thread pool. This establishes the maximum number of concurrent writes to BigQuery.
- Type: int
- Default: 10
The maximum size (or -1 for no maximum) of the worker queue for BigQuery write requests before all topics are paused. This is a soft limit; the queue size may exceed it before topics are paused. All topics are resumed once a flush is requested or the queue size drops below half of the maximum.
- Type: int
- Default: -1
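The pause/resume behavior described above is a simple hysteresis: pause when the queue size exceeds the maximum, resume once it drops below half of it. A minimal sketch of that logic, assuming this interpretation (it is not the connector's actual implementation):

```java
// Soft-limit queue throttle: pause once size exceeds maxSize, resume once
// size falls below maxSize / 2. A maxSize of -1 means unlimited (never pause).
public class QueueThrottle {
    private final int maxSize;
    private boolean paused = false;

    public QueueThrottle(int maxSize) {
        this.maxSize = maxSize;
    }

    // Returns true if topics should currently be paused for this queue size.
    public boolean update(int queueSize) {
        if (maxSize < 0) {
            return false; // unlimited buffering: never pause
        }
        if (!paused && queueSize > maxSize) {
            paused = true;           // soft limit exceeded: pause all topics
        } else if (paused && queueSize < maxSize / 2) {
            paused = false;          // drained below half: resume all topics
        }
        return paused;
    }
}
```

Note the gap between the pause threshold and the resume threshold prevents rapid pause/resume flapping when the queue hovers near the limit.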