---
page_title: "airbyte_source_airtable Resource - terraform-provider-airbyte"
subcategory: ""
description: |-
  SourceAirtable Resource
---

# airbyte_source_airtable (Resource)

SourceAirtable Resource

## Example Usage
```terraform
resource "airbyte_source_airtable" "my_source_airtable" {
  configuration = {
    add_base_id_to_stream_name = false
    additional_properties      = "{ \"see\": \"documentation\" }"
    credentials = {
      o_auth20 = {
        access_token      = "...my_access_token..."
        client_id         = "...my_client_id..."
        client_secret     = "...my_client_secret..."
        refresh_token     = "...my_refresh_token..."
        token_expiry_date = "2022-01-14T11:50:58.504Z"
      }
    }
    num_workers = 37
  }
  definition_id = "fdd0b7d7-bc62-4e45-9809-2513b5f90d61"
  name          = "...my_name..."
  secret_id     = "...my_secret_id..."
  workspace_id  = "110737e7-1846-4cca-8ebc-d0f82e4b8ffb"
}
```

## Schema

### Required

- `configuration` (Attributes) The values required to configure the source. The schema for this must match the schema returned by source_definition_specifications/get for the source. (see below for nested schema)
- `name` (String) Name of the source, e.g. dev-mysql-instance.
- `workspace_id` (String)
### Optional

- `definition_id` (String) The UUID of the connector definition. One of configuration.sourceType or definitionId must be provided. Default: "14c6e7ea-97ed-4f5e-a7b5-25e9a80b8212"; Requires replacement if changed.
- `secret_id` (String) Optional secretID obtained through the public API OAuth redirect flow. Requires replacement if changed.
### Read-Only

- `created_at` (Number)
- `resource_allocation` (Attributes) Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition; they are overridden by the job-type-specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level. (see below for nested schema)
- `source_id` (String)
- `source_type` (String)
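Once the source is created, the read-only attributes above can be referenced elsewhere in a configuration. A minimal sketch, assuming the resource from the example at the top of the page:

```terraform
# Expose the generated source ID so other modules or tooling can consume it.
output "airtable_source_id" {
  value = airbyte_source_airtable.my_source_airtable.source_id
}

# source_type is also computed by the provider and can be referenced the same way.
output "airtable_source_type" {
  value = airbyte_source_airtable.my_source_airtable.source_type
}
```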
### Nested Schema for `configuration`

Optional:

- `add_base_id_to_stream_name` (Boolean) When enabled, includes the base ID in stream names to ensure uniqueness. Use this if you have cloned Airtable bases with duplicate table names. Note that enabling this will change stream names and require a full refresh. Default: false
- `additional_properties` (String) Parsed as JSON.
- `credentials` (Attributes) (see below for nested schema)
- `num_workers` (Number) Number of concurrent threads for syncing. Higher values can speed up syncs but may hit rate limits. Airtable limits to 5 requests per second per base. Default: 5
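As a sketch of how these tuning attributes might be combined for cloned bases (resource name, workspace ID, and credential values are illustrative placeholders):

```terraform
resource "airbyte_source_airtable" "cloned_bases_example" {
  configuration = {
    # Cloned bases often share table names; including the base ID keeps stream
    # names unique. Toggling this later renames streams and forces a full refresh.
    add_base_id_to_stream_name = true

    # Airtable allows 5 requests per second per base, so the default of 5
    # workers is usually the safe ceiling.
    num_workers = 5

    credentials = {
      o_auth20 = {
        client_id     = "...my_client_id..."
        client_secret = "...my_client_secret..."
        refresh_token = "...my_refresh_token..."
      }
    }
  }
  name         = "airtable-cloned-bases"
  workspace_id = "...my_workspace_id..."
}
```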
### Nested Schema for `configuration.credentials`

Optional:

- `o_auth20` (Attributes) (see below for nested schema)
- `personal_access_token` (Attributes) (see below for nested schema)
### Nested Schema for `configuration.credentials.o_auth20`

Required:

- `client_id` (String, Sensitive) The client ID of the Airtable developer application.
- `client_secret` (String, Sensitive) The client secret of the Airtable developer application.
- `refresh_token` (String, Sensitive) The key to refresh the expired access token.

Optional:

- `access_token` (String, Sensitive) Access Token for making authenticated requests.
- `token_expiry_date` (String) The date-time when the access token should be refreshed.
### Nested Schema for `configuration.credentials.personal_access_token`

Required:

- `api_key` (String, Sensitive) The Personal Access Token for the Airtable account. See the Support Guide for more information on how to obtain this token.
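The example at the top of the page authenticates with OAuth; a minimal sketch of the personal access token alternative (all values are placeholders):

```terraform
resource "airbyte_source_airtable" "pat_example" {
  configuration = {
    credentials = {
      # Personal access token auth only needs the api_key attribute.
      personal_access_token = {
        api_key = "...my_personal_access_token..."
      }
    }
  }
  name         = "airtable-with-pat"
  workspace_id = "...my_workspace_id..."
}
```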
### Nested Schema for `resource_allocation`

Read-Only:

- `default` (Attributes) Optional resource requirements to run workers (blank for unbounded allocations). (see below for nested schema)
- `job_specific` (Attributes List) (see below for nested schema)
### Nested Schema for `resource_allocation.default`

Read-Only:

- `cpu_limit` (String)
- `cpu_request` (String)
- `ephemeral_storage_limit` (String)
- `ephemeral_storage_request` (String)
- `memory_limit` (String)
- `memory_request` (String)
### Nested Schema for `resource_allocation.job_specific`

Read-Only:

- `job_type` (String) Enum that describes the different types of jobs that the platform runs.
- `resource_requirements` (Attributes) Optional resource requirements to run workers (blank for unbounded allocations). (see below for nested schema)
### Nested Schema for `resource_allocation.job_specific.resource_requirements`

Read-Only:

- `cpu_limit` (String)
- `cpu_request` (String)
- `ephemeral_storage_limit` (String)
- `ephemeral_storage_request` (String)
- `memory_limit` (String)
- `memory_request` (String)
## Import

Import is supported using the following syntax:

In Terraform v1.5.0 and later, the `import` block can be used with the `id` attribute, for example:

```terraform
import {
  to = airbyte_source_airtable.my_airbyte_source_airtable
  id = "..."
}
```

The `terraform import` command can be used, for example:

```shell
terraform import airbyte_source_airtable.my_airbyte_source_airtable "..."
```