Closed
Labels: area/connectors, autoteam, community, needs-triage, team/extensibility, team/use, type/bug
Description
Connector Name
source-postgres
Connector Version
3.7.0
What step the error happened?
During the sync
Relevant information
Environment:
- Airbyte OSS v2.0.1 (self-hosted, Kubernetes, Helm chart v2.0.19)
- PostgreSQL 15.12 (AWS RDS)
- Destination: Snowflake (destination-snowflake v4.0.31)
Table schema:
- id (String, primary key)
- json_data (JSONB) — avg 20KB, max 24KB+ per row
- created_at (Timestamp with Timezone)
- updated_at (Timestamp with Timezone)
- Total rows: ~5.4M
Sync config:
- Mode: Incremental | Append + Deduped
- Replication method: Xmin (not CDC)
- Initial sync uses ctid
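For reference, the initial load pages through the table with a ctid keyset query, visible verbatim in the logs below. A minimal sketch of that query shape and of how a ctid string orders as a (page, tuple) pair — helper names here are hypothetical, not part of the connector:

```python
def ctid_page_query(schema: str, table: str, columns: list[str]) -> str:
    """Build a ctid-based keyset query of the shape the source logs."""
    cols = ",".join(f'"{c}"' for c in columns)
    return (
        f'SELECT ctid::text, {cols} '
        f'FROM "{schema}"."{table}" WHERE ctid > ?::tid'
    )

def parse_ctid(ctid: str) -> tuple[int, int]:
    """Parse a ctid like "(1024,7)" into an orderable (page, tuple) pair."""
    page, tup = ctid.strip("()").split(",")
    return int(page), int(tup)
```

Running this against the schema above reproduces the exact query from the 17:18:36 log line, with the initial binding `(0,0)`.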
Behaviour:
- Source starts reading via ctid query
- Fetch size adapts from 10 → 3,740 → ~100,000 rows
- No further log output for ~68 minutes (17:18:39 → 18:26:40)
- Sync completes with 0 rows loaded to Snowflake
- No errors logged
Verified:
- No active queries on PostgreSQL source during the "silent" period
- No active queries on Snowflake destination
- Connection checks pass for both source and destination
- Other tables in the same connection sync successfully when not blocked by the problem table
- Fivetran is able to sync this table without issue
Resource allocation:
- Source: 1-2 CPU, 1-2GB memory
- Destination: 1-2 CPU, 2GB memory
- Orchestrator: 2 CPU, 2GB memory
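One hypothesis worth checking (an assumption, not a confirmed root cause): the `Max memory limit: 12197036032` log line implies the source JVM sizes its JDBC buffer against ~11.4GiB, not the 2Gi pod limit, so a single ~100k-row fetch of these wide JSONB rows could exceed the container's memory without any error reaching the logs. A back-of-envelope sketch using only numbers reported above:

```python
# Hypothesis check: compare the JVM's reported memory and one worst-case
# fetch of wide JSONB rows against the 2Gi source pod limit.
GIB = 2 ** 30

reported_max_memory = 12_197_036_032  # "Max memory limit" from the 17:18:37 log line
jdbc_buffer = 7_318_221_619           # "JDBC buffer size" from the same line
source_pod_limit = 2 * GIB            # source container limit per resource allocation

fetch_size = 98_502                   # last fetch size logged before the silence
max_row_bytes = 24 * 1024             # "max 24KB+ per row" from the schema

# Worst-case bytes held in memory by a single fetch:
in_flight = fetch_size * max_row_bytes

assert reported_max_memory > source_pod_limit  # JVM sees ~11.4GiB, pod has 2GiB
assert in_flight > source_pod_limit            # one fetch alone can exceed the pod limit
print(f"one fetch ~ {in_flight / GIB:.2f} GiB vs pod limit {source_pod_limit // GIB} GiB")
```

If this holds, the source pod may be OOM-killed mid-read, which would match the silent 68-minute gap followed by a "successful" zero-row sync.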
Relevant log output
# Source starts reading successfully
17:18:36 INFO Executing query for table simulation_data: SELECT ctid::text, "json_data","id","created_at","updated_at" FROM "public"."simulation_data" WHERE ctid > ?::tid with binding (0,0)
17:18:36 INFO Set initial fetch size: 10 rows
17:18:36 INFO Set new fetch size: 3740 rows
17:18:37 INFO Max memory limit: 12197036032, JDBC buffer size: 7318221619
17:18:39 INFO Set new fetch size: 103809 rows
17:18:39 INFO Set new fetch size: 102781 rows
17:18:39 INFO Set new fetch size: 98502 rows
# ~68 minutes of silence - no logs
# Sync "completes" with zero rows loaded
18:26:40 ----- START POST REPLICATION OPERATIONS -----
18:26:40 No post-replication operation(s) to perform.
18:26:40 ----- END POST REPLICATION OPERATIONS -----

Full logs:
>> ATTEMPT 1/1
2025-12-15 17:15:11 info APPLY Stage: BUILD — (workloadId=3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check)
2025-12-15 17:15:16 info APPLY Stage: CLAIM — (workloadId=3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check)
2025-12-15 17:15:16 info Claimed: true for workload 3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check via API in dataplane 2754cf6e-4514-47d1-81d0-c7e176dca14c (f29d321a-ed7a-4951-bdf3-b3f1983bad18)
2025-12-15 17:15:16 info APPLY Stage: LOAD_SHED — (workloadId=3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check)
2025-12-15 17:15:16 info APPLY Stage: CHECK_STATUS — (workloadId=3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check)
2025-12-15 17:15:16 info No pod found running for workload 3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check
2025-12-15 17:15:16 info APPLY Stage: MUTEX — (workloadId=3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check)
2025-12-15 17:15:16 info No mutex key specified for workload: 3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check. Continuing...
2025-12-15 17:15:16 info APPLY Stage: ARCHITECTURE — (workloadId=3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check)
2025-12-15 17:15:16 info APPLY Stage: LAUNCH — (workloadId=3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check)
2025-12-15 17:15:16 info [initContainer] image: 426105708615.dkr.ecr.eu-west-1.amazonaws.com/dockerhub/airbyte/workload-init-container:2.0.1 resources: ResourceRequirements(claims=[], limits={}, requests={}, additionalProperties={})
2025-12-15 17:15:37 info Attempting to update workload: 3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check to LAUNCHED.
2025-12-15 17:15:37 info Pipeline completed for workload: 3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check.
2025-12-15 17:15:47 info
2025-12-15 17:15:47 info Transitioning workload to running state
2025-12-15 17:15:47 info Connector exited, processing output
2025-12-15 17:15:50 info ----- START CHECK -----
2025-12-15 17:15:50 info
2025-12-15 17:15:50 info Output file jobOutput.json found
2025-12-15 17:15:50 info Connector exited with exit code 0
2025-12-15 17:15:50 info Reading messages from protocol version 0.2.0
2025-12-15 17:15:50 info INFO main i.a.i.s.p.PostgresSource(main):750 starting source: class io.airbyte.integrations.source.postgres.PostgresSource
2025-12-15 17:15:50 info INFO main i.a.c.i.b.IntegrationCliParser$Companion(parseOptions):144 integration args: {check=null, config=/config/connectionConfiguration.json}
2025-12-15 17:15:50 info INFO main i.a.c.i.b.IntegrationRunner(runInternal):130 Running integration: io.airbyte.cdk.integrations.base.ssh.SshWrappedSource
2025-12-15 17:15:50 info INFO main i.a.c.i.b.IntegrationRunner(runInternal):131 Command: CHECK
2025-12-15 17:15:50 info INFO main i.a.c.i.b.IntegrationRunner(runInternal):132 Integration config: IntegrationConfig{command=CHECK, configPath='/config/connectionConfiguration.json', catalogPath='null', statePath='null'}
2025-12-15 17:15:50 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:15:50 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword group - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:15:50 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:15:50 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword always_show - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:15:50 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword pattern_descriptor - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:15:50 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:15:50 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword display_type - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:15:50 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword min - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:15:50 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword max - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:15:50 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword groups - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:15:50 info INFO main i.a.c.i.b.s.SshTunnel$Companion(getInstance):424 Starting connection with method: NO_TUNNEL
2025-12-15 17:15:50 info INFO main i.a.i.s.p.PostgresUtils(isCdc):70 using CDC: false
2025-12-15 17:15:50 info INFO main i.a.i.s.p.PostgresSource(toSslJdbcParamInternal):976 REQUIRED toSslJdbcParam require
2025-12-15 17:15:50 info INFO main c.z.h.HikariDataSource(<init>):79 HikariPool-1 - Starting...
2025-12-15 17:15:50 info INFO main c.z.h.HikariDataSource(<init>):81 HikariPool-1 - Start completed.
2025-12-15 17:15:50 info INFO main i.a.i.s.p.PostgresUtils(isCdc):70 using CDC: false
2025-12-15 17:15:50 info INFO main i.a.i.s.p.PostgresUtils(isCdc):70 using CDC: false
2025-12-15 17:15:50 info INFO main i.a.i.s.p.PostgresUtils(isXmin):190 using Xmin: true
2025-12-15 17:15:50 info INFO main i.a.c.i.s.j.AbstractJdbcSource(getCheckOperations$lambda$6):343 Attempting to get metadata from the database to see if we can connect.
2025-12-15 17:15:50 info INFO main i.a.c.i.s.j.AbstractJdbcSource(checkUserHasPrivileges):307 Checking if the user can perform select to any table in schema: public
2025-12-15 17:15:50 info INFO main i.a.c.d.j.s.AdaptiveStreamingQueryConfig(initialize):24 Set initial fetch size: 10 rows
2025-12-15 17:15:50 info INFO main i.a.c.d.j.s.AdaptiveStreamingQueryConfig(accept):33 Set new fetch size: 1197005 rows
2025-12-15 17:15:50 info INFO main c.z.h.HikariDataSource(close):349 HikariPool-1 - Shutdown initiated...
2025-12-15 17:15:50 info INFO main c.z.h.HikariDataSource(close):351 HikariPool-1 - Shutdown completed.
2025-12-15 17:15:50 info INFO main i.a.c.i.b.IntegrationRunner(runInternal):224 Completed integration: io.airbyte.cdk.integrations.base.ssh.SshWrappedSource
2025-12-15 17:15:50 info INFO main i.a.i.s.p.PostgresSource(main):752 completed source: class io.airbyte.integrations.source.postgres.PostgresSource
2025-12-15 17:15:50 info Checking for optional control message...
2025-12-15 17:15:50 info Writing output of 3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check to the doc store
2025-12-15 17:15:51 info Workload successfully transitioned to running state
2025-12-15 17:15:51 info Marking workload 3f3b28ad-fba8-4971-92c1-8dc99c30f188_19_0_check as successful
2025-12-15 17:15:51 info
2025-12-15 17:15:51 info Deliberately exiting process with code 0.
2025-12-15 17:15:51 info ----- END CHECK -----
2025-12-15 17:15:51 info
2025-12-15 17:15:53 info APPLY Stage: BUILD — (workloadId=d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check)
2025-12-15 17:15:53 info APPLY Stage: CLAIM — (workloadId=d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check)
2025-12-15 17:15:53 info Claimed: true for workload d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check via API in dataplane 2754cf6e-4514-47d1-81d0-c7e176dca14c (f29d321a-ed7a-4951-bdf3-b3f1983bad18)
2025-12-15 17:15:53 info APPLY Stage: LOAD_SHED — (workloadId=d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check)
2025-12-15 17:15:53 info APPLY Stage: CHECK_STATUS — (workloadId=d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check)
2025-12-15 17:15:53 info No pod found running for workload d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check
2025-12-15 17:15:53 info APPLY Stage: MUTEX — (workloadId=d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check)
2025-12-15 17:15:53 info No mutex key specified for workload: d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check. Continuing...
2025-12-15 17:15:53 info APPLY Stage: ARCHITECTURE — (workloadId=d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check)
2025-12-15 17:15:53 info APPLY Stage: LAUNCH — (workloadId=d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check)
2025-12-15 17:15:53 info [initContainer] image: 426105708615.dkr.ecr.eu-west-1.amazonaws.com/dockerhub/airbyte/workload-init-container:2.0.1 resources: ResourceRequirements(claims=[], limits={}, requests={}, additionalProperties={})
2025-12-15 17:16:00 info Attempting to update workload: d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check to LAUNCHED.
2025-12-15 17:16:00 info Pipeline completed for workload: d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check.
2025-12-15 17:16:07 info
2025-12-15 17:16:07 info Transitioning workload to running state
2025-12-15 17:16:07 info Connector exited, processing output
2025-12-15 17:16:11 info Output file jobOutput.json found
2025-12-15 17:16:11 info ----- START CHECK -----
2025-12-15 17:16:11 info
2025-12-15 17:16:11 info Connector exited with exit code 0
2025-12-15 17:16:11 info Reading messages from protocol version 0.2.0
2025-12-15 17:16:11 info INFO main i.m.c.e.DefaultEnvironment(<init>):168 Established active environments: [k8s, cloud, cli, destination, connector]
2025-12-15 17:16:11 info INFO main i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 Setting log level 'ERROR' for logger: 'net.snowflake'
2025-12-15 17:16:11 info INFO main i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 Setting log level 'ERROR' for logger: 'com.zaxxer.hikari.pool'
2025-12-15 17:16:11 info INFO main i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 Setting log level 'ERROR' for logger: 'com.zaxxer.hikari'
2025-12-15 17:16:11 info INFO main i.a.c.AirbyteConnectorRunnable(run):35 Executing class io.airbyte.cdk.load.check.CheckOperationV2 operation.
2025-12-15 17:16:11 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 SELECT COUNT(*) > 0 AS SCHEMA_EXISTS
FROM "AIRBYTE_DB".INFORMATION_SCHEMA.SCHEMATA
WHERE SCHEMA_NAME = ?
2025-12-15 17:16:11 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 CREATE OR REPLACE TABLE "AIRBYTE_DB"."AIRBYTE_SCHEMA"."_AIRBYTE_CONNECTION_TEST_4ECB0BD679224DE59D03F69D0D58A11A" (
"_AIRBYTE_RAW_ID" VARCHAR NOT NULL,
"_AIRBYTE_EXTRACTED_AT" TIMESTAMP_TZ NOT NULL,
"_AIRBYTE_META" VARIANT NOT NULL,
"_AIRBYTE_GENERATION_ID" NUMBER(38,0),
"test_key" VARCHAR NOT NULL
)
2025-12-15 17:16:11 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 CREATE STAGE IF NOT EXISTS "AIRBYTE_DB"."AIRBYTE_SCHEMA"."airbyte_stage__AIRBYTE_CONNECTION_TEST_4ECB0BD679224DE59D03F69D0D58A11A"
2025-12-15 17:16:11 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 SHOW COLUMNS IN TABLE "AIRBYTE_DB"."AIRBYTE_SCHEMA"."_AIRBYTE_CONNECTION_TEST_4ECB0BD679224DE59D03F69D0D58A11A"
2025-12-15 17:16:11 info INFO main i.a.i.d.s.w.l.SnowflakeInsertBuffer(flush):86 Beginning insert into "AIRBYTE_SCHEMA"."_AIRBYTE_CONNECTION_TEST_4ECB0BD679224DE59D03F69D0D58A11A"...
2025-12-15 17:16:11 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 PUT 'file:///tmp/snowflake8114961985933247358.csv.gz' '@"AIRBYTE_DB"."AIRBYTE_SCHEMA"."airbyte_stage__AIRBYTE_CONNECTION_TEST_4ECB0BD679224DE59D03F69D0D58A11A"'
AUTO_COMPRESS = FALSE
SOURCE_COMPRESSION = GZIP
OVERWRITE = TRUE
2025-12-15 17:16:11 info INFO main i.a.i.d.s.w.l.SnowflakeInsertBuffer(flush):91 Copying staging data into "AIRBYTE_SCHEMA"."_AIRBYTE_CONNECTION_TEST_4ECB0BD679224DE59D03F69D0D58A11A"...
2025-12-15 17:16:11 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 COPY INTO "AIRBYTE_DB"."AIRBYTE_SCHEMA"."_AIRBYTE_CONNECTION_TEST_4ECB0BD679224DE59D03F69D0D58A11A"
FROM '@"AIRBYTE_DB"."AIRBYTE_SCHEMA"."airbyte_stage__AIRBYTE_CONNECTION_TEST_4ECB0BD679224DE59D03F69D0D58A11A"'
FILE_FORMAT = (
TYPE = 'CSV'
COMPRESSION = GZIP
FIELD_DELIMITER = ','
RECORD_DELIMITER = '
'
FIELD_OPTIONALLY_ENCLOSED_BY = '"'
TRIM_SPACE = TRUE
ERROR_ON_COLUMN_COUNT_MISMATCH = FALSE
REPLACE_INVALID_CHARACTERS = TRUE
ESCAPE = NONE
ESCAPE_UNENCLOSED_FIELD = NONE
)
ON_ERROR = 'ABORT_STATEMENT'
PURGE = TRUE
files = ('snowflake8114961985933247358.csv.gz')
2025-12-15 17:16:11 info INFO main i.a.i.d.s.w.l.SnowflakeInsertBuffer(flush):96 Finished insert of 1 row(s) into "AIRBYTE_SCHEMA"."_AIRBYTE_CONNECTION_TEST_4ECB0BD679224DE59D03F69D0D58A11A".
2025-12-15 17:16:11 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 SELECT COUNT(*) AS TOTAL FROM "AIRBYTE_DB"."AIRBYTE_SCHEMA"."_AIRBYTE_CONNECTION_TEST_4ECB0BD679224DE59D03F69D0D58A11A"
2025-12-15 17:16:11 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 DROP TABLE IF EXISTS "AIRBYTE_DB"."AIRBYTE_SCHEMA"."_AIRBYTE_CONNECTION_TEST_4ECB0BD679224DE59D03F69D0D58A11A"
2025-12-15 17:16:11 info INFO main i.a.c.AirbyteConnectorRunnable(run):47 Flushing output consumer prior to shutdown.
2025-12-15 17:16:11 info INFO main i.a.c.AirbyteConnectorRunnable(run):49 Completed integration: airbyte/destination-snowflake.
2025-12-15 17:16:11 info Checking for optional control message...
2025-12-15 17:16:11 info Writing output of d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check to the doc store
2025-12-15 17:16:11 info Workload successfully transitioned to running state
2025-12-15 17:16:11 info Marking workload d103f16e-b4b8-4f1f-9ade-d9a0026333f0_19_0_check as successful
2025-12-15 17:16:11 info
2025-12-15 17:16:11 info Deliberately exiting process with code 0.
2025-12-15 17:16:11 info ----- END CHECK -----
2025-12-15 17:16:11 info
2025-12-15 17:16:16 info APPLY Stage: BUILD — (workloadId=38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync)
2025-12-15 17:16:16 info APPLY Stage: CLAIM — (workloadId=38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync)
2025-12-15 17:16:16 info Claimed: true for workload 38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync via API in dataplane 2754cf6e-4514-47d1-81d0-c7e176dca14c (f29d321a-ed7a-4951-bdf3-b3f1983bad18)
2025-12-15 17:16:16 info APPLY Stage: LOAD_SHED — (workloadId=38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync)
2025-12-15 17:16:16 info APPLY Stage: CHECK_STATUS — (workloadId=38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync)
2025-12-15 17:16:16 info No pod found running for workload 38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync
2025-12-15 17:16:16 info APPLY Stage: MUTEX — (workloadId=38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync)
2025-12-15 17:16:16 info Mutex key: 38be01ef-65e3-4f5e-b459-5bd45a5717f7 specified for workload: 38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync. Attempting to delete existing pods...
2025-12-15 17:16:16 info Existing pods for mutex key: 38be01ef-65e3-4f5e-b459-5bd45a5717f7 deleted.
2025-12-15 17:16:16 info APPLY Stage: ARCHITECTURE — (workloadId=38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync)
2025-12-15 17:16:16 info APPLY Stage: LAUNCH — (workloadId=38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync)
2025-12-15 17:16:16 info [initContainer] image: 426105708615.dkr.ecr.eu-west-1.amazonaws.com/dockerhub/airbyte/workload-init-container:2.0.1 resources: ResourceRequirements(claims=[], limits={memory=2Gi, cpu=2}, requests={memory=2Gi, cpu=2}, additionalProperties={})
2025-12-15 17:16:16 info Launching replication pod: replication-job-19-attempt-0 (selectors = {}) with containers:
2025-12-15 17:16:16 info [source] image: 426105708615.dkr.ecr.eu-west-1.amazonaws.com/dockerhub/airbyte/source-postgres:3.7.0 resources: ResourceRequirements(claims=[], limits={memory=2Gi, cpu=2}, requests={memory=1Gi, cpu=1}, additionalProperties={})
2025-12-15 17:16:16 info [destination] image: 426105708615.dkr.ecr.eu-west-1.amazonaws.com/dockerhub/airbyte/destination-snowflake:4.0.31 resources: ResourceRequirements(claims=[], limits={memory=2Gi, cpu=2}, requests={memory=2Gi, cpu=1}, additionalProperties={})
2025-12-15 17:16:16 info [orchestrator] image: 426105708615.dkr.ecr.eu-west-1.amazonaws.com/dockerhub/airbyte/container-orchestrator:2.0.1 resources: ResourceRequirements(claims=[], limits={memory=2Gi, cpu=2}, requests={memory=2Gi, cpu=2}, additionalProperties={})
2025-12-15 17:18:03 info Attempting to update workload: 38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync to LAUNCHED.
2025-12-15 17:18:03 info Pipeline completed for workload: 38be01ef-65e3-4f5e-b459-5bd45a5717f7_19_0_sync.
2025-12-15 17:18:32 info Running replication worker...
2025-12-15 17:18:36 info Starting replication worker. job id: 19 attempt: 0
2025-12-15 17:18:36 info
2025-12-15 17:18:36 info ----- START REPLICATION -----
2025-12-15 17:18:36 info
2025-12-15 17:18:36 info Running destination...
2025-12-15 17:18:36 error SLF4J(W): Class path contains multiple SLF4J providers.
2025-12-15 17:18:36 error SLF4J(W): Found provider [org.apache.logging.slf4j.SLF4JServiceProvider@262b2c86]
2025-12-15 17:18:36 error SLF4J(W): Found provider [org.slf4j.reload4j.Reload4jServiceProvider@371a67ec]
2025-12-15 17:18:36 info Writing messages to protocol version 0.2.0
2025-12-15 17:18:36 error SLF4J(W): See https://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2025-12-15 17:18:36 error SLF4J(I): Actual provider is of type [org.apache.logging.slf4j.SLF4JServiceProvider@262b2c86]
2025-12-15 17:18:36 info Reading messages from protocol version 0.2.0
2025-12-15 17:18:36 info Reading messages from protocol version 0.2.0
2025-12-15 17:18:36 info SourceReader started.
2025-12-15 17:18:36 info MessageProcessor started.
2025-12-15 17:18:36 info DestinationWriter started.
2025-12-15 17:18:36 info Starting workload heartbeat (interval=10s; timeout=600s)
2025-12-15 17:18:36 info DestinationReader started.
2025-12-15 17:18:36 info Transitioning workload to running state
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresSource(main):750 starting source: class io.airbyte.integrations.source.postgres.PostgresSource
2025-12-15 17:18:36 info INFO main i.m.c.e.DefaultEnvironment(<init>):168 Established active environments: [k8s, cloud, cli, destination, connector]
2025-12-15 17:18:36 info INFO main i.a.c.i.b.IntegrationCliParser$Companion(parseOptions):144 integration args: {read=null, catalog=/source/catalog.json, config=/source/connectorConfig.json}
2025-12-15 17:18:36 info INFO main i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 Setting log level 'ERROR' for logger: 'net.snowflake'
2025-12-15 17:18:36 info INFO main i.a.c.i.b.IntegrationRunner(runInternal):130 Running integration: io.airbyte.cdk.integrations.base.ssh.SshWrappedSource
2025-12-15 17:18:36 info INFO main i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 Setting log level 'ERROR' for logger: 'com.zaxxer.hikari.pool'
2025-12-15 17:18:36 info INFO main i.a.c.i.b.IntegrationRunner(runInternal):131 Command: READ
2025-12-15 17:18:36 info INFO main i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):113 Setting log level 'ERROR' for logger: 'com.zaxxer.hikari'
2025-12-15 17:18:36 info INFO main i.a.c.i.b.IntegrationRunner(runInternal):132 Integration config: IntegrationConfig{command=READ, configPath='/source/connectorConfig.json', catalogPath='/source/catalog.json', statePath='null'}
2025-12-15 17:18:36 warn WARN main c.k.j.j.JsonSchemaGenerator$MyJsonFormatVisitorWrapper(expectAnyFormat):725 Not able to generate jsonSchema-info for type: [simple type, class com.fasterxml.jackson.databind.JsonNode] - probably using custom serializer which does not override acceptJsonFormatVisitor
2025-12-15 17:18:36 info INFO main i.a.c.l.c.DataChannelBeanFactory(dataChannelMedium):60 Using data channel medium STDIO
2025-12-15 17:18:36 info INFO main i.a.c.l.c.DataChannelBeanFactory(namespaceMapper):308 Going to use the given source value: SOURCE for namespace
2025-12-15 17:18:36 info INFO main i.a.c.l.c.DestinationCatalog(<init>):49 Destination catalog initialized: [DestinationStream(unmappedNamespace=pg_simulation, unmappedName=simulation_data, importType=Dedupe(primaryKey=[[id]], cursor=[updated_at]), schema=ObjectType(properties={id=FieldType(type=StringType, nullable=true), json_data=FieldType(type=StringType, nullable=true), created_at=FieldType(type=TimestampTypeWithTimezone, nullable=true), updated_at=FieldType(type=TimestampTypeWithTimezone, nullable=true)}, additionalProperties=true, required=[]), generationId=3, minimumGenerationId=3, syncId=19, isFileBased=false, includeFiles=false, destinationObjectName=null, matchingKey=null, namespaceMapper=io.airbyte.cdk.load.command.NamespaceMapper@22f12830)]
2025-12-15 17:18:36 info INFO main i.a.c.AirbyteConnectorRunnable(run):35 Executing class io.airbyte.integrations.destination.snowflake.cdk.WriteOperationV2 operation.
2025-12-15 17:18:36 info INFO main i.a.i.d.s.c.WriteOperationV2(execute):23 Running new pipe...
2025-12-15 17:18:36 info INFO main i.a.c.l.d.DestinationLifecycle$initializeDestination$1(invokeSuspend):51 Initializing the destination
2025-12-15 17:18:36 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 SELECT COUNT(*) > 0 AS SCHEMA_EXISTS
FROM "AIRBYTE_DB".INFORMATION_SCHEMA.SCHEMATA
WHERE SCHEMA_NAME = ?
2025-12-15 17:18:36 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 SELECT COUNT(*) > 0 AS SCHEMA_EXISTS
FROM "AIRBYTE_DB".INFORMATION_SCHEMA.SCHEMATA
WHERE SCHEMA_NAME = ?
2025-12-15 17:18:36 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 SELECT COUNT(*) AS TOTAL FROM "AIRBYTE_DB"."PG_SIMULATION"."SIMULATION_DATA"
2025-12-15 17:18:36 info INFO main i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 SELECT COUNT(*) AS TOTAL FROM "AIRBYTE_DB"."airbyte_internal"."PG_SIMULATIONSIMULATION_DATA9ec27b42ea0af17884797d01a31e0c97"
2025-12-15 17:18:36 info INFO main i.a.c.l.d.DestinationLifecycle$initializeDestination$1(invokeSuspend):53 Destination initialized
2025-12-15 17:18:36 info INFO DefaultDispatcher-worker-1 i.a.c.l.d.DestinationLifecycle$initializeIndividualStreams$1$result$1$1(invokeSuspend):64 Starting stream loader for stream pg_simulation:simulation_data
2025-12-15 17:18:36 info INFO DefaultDispatcher-worker-1 i.a.c.l.o.d.d.DirectLoadTableDedupTruncateStreamLoader(start):288 DedupTruncateStreamLoader starting for stream pg_simulation.simulation_data
2025-12-15 17:18:36 info INFO DefaultDispatcher-worker-1 i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 CREATE STAGE IF NOT EXISTS "AIRBYTE_DB"."airbyte_internal"."airbyte_stage_PG_SIMULATIONSIMULATION_DATA9ec27b42ea0af17884797d01a31e0c97"
2025-12-15 17:18:36 info INFO DefaultDispatcher-worker-1 i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 DESCRIBE TABLE "AIRBYTE_DB"."airbyte_internal"."PG_SIMULATIONSIMULATION_DATA9ec27b42ea0af17884797d01a31e0c97"
2025-12-15 17:18:36 info INFO DefaultDispatcher-worker-1 i.a.c.l.d.DestinationLifecycle$initializeIndividualStreams$1$result$1$1(invokeSuspend):69 Stream loader for stream pg_simulation:simulation_data started
2025-12-15 17:18:36 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:18:36 info INFO main i.a.c.l.d.p.PipelineRunner(run):40 Destination Pipeline Starting...
2025-12-15 17:18:36 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword group - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:18:36 info INFO main i.a.c.l.d.p.PipelineRunner(run):42 Starting state reconciler...
2025-12-15 17:18:36 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:18:36 info INFO main i.a.c.l.d.p.PipelineRunner(run):45 Starting 1 pipelines...
2025-12-15 17:18:36 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword always_show - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:18:36 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword pattern_descriptor - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:18:36 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword multiline - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:18:36 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword display_type - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:18:36 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword min - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:18:36 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword max - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:18:36 warn WARN main c.n.s.UnknownKeywordFactory(lambda$getKeyword$0):37 Unknown keyword groups - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword or if it should generate annotations AnnotationKeyword
2025-12-15 17:18:36 info INFO main i.a.c.i.b.s.SshTunnel$Companion(getInstance):424 Starting connection with method: NO_TUNNEL
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresUtils(isCdc):70 using CDC: false
2025-12-15 17:18:36 info INFO main i.a.c.i.s.r.s.StateManagerFactory(createStateManager):57 Stream state manager selected to manage state object with type STREAM.
2025-12-15 17:18:36 info INFO main i.a.c.i.s.r.s.CursorManager(createCursorInfoForStream$io_airbyte_airbyte_cdk_java_airbyte_cdk_airbyte_cdk_db_sources):221 No cursor field set in catalog but not present in state. Stream: public_simulation_data, New Cursor Field: updated_at. Resetting cursor value
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresSource(toSslJdbcParamInternal):976 REQUIRED toSslJdbcParam require
2025-12-15 17:18:36 info INFO main c.z.h.HikariDataSource(<init>):79 HikariPool-1 - Starting...
2025-12-15 17:18:36 info INFO main c.z.h.HikariDataSource(<init>):81 HikariPool-1 - Start completed.
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresUtils(isCdc):70 using CDC: false
2025-12-15 17:18:36 info INFO main i.a.c.i.s.j.AbstractJdbcSource(logPreSyncDebugData):804 Data source product recognized as PostgreSQL:15.12
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresSource(logPreSyncDebugData):306 Discovering indexes for schema "public", table "simulation_data"
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresSource(logPreSyncDebugData):308 Index name: simulation_data_pkey, Column: id, Unique: true
2025-12-15 17:18:36 info INFO main i.a.c.i.s.j.AbstractJdbcSource(discoverTable):563 Discover table: public.simulation_data
2025-12-15 17:18:36 info INFO main i.a.c.i.s.r.AbstractDbSource(discoverWithoutSystemTables):301 Discovered table: public.simulation_data: TableInfo(nameSpace=public, name=simulation_data, fields=[CommonField{name='json_data', type=OTHER, properties=null}, CommonField{name='id', type=OTHER, properties=null}, CommonField{name='created_at', type=TIMESTAMP_WITH_TIMEZONE, properties=null}, CommonField{name='updated_at', type=TIMESTAMP_WITH_TIMEZONE, properties=null}], primaryKeys=[], cursorFields=[created_at, updated_at])
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresUtils(isCdc):70 using CDC: false
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresUtils(isXmin):190 using Xmin: true
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresSource(recategoriseStreamsForXmin):809 Xmin Status : {Number of wraparounds: 0, Xmin Transaction Value: 13057926, Xmin Raw Value: 13057926
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresQueryUtils(fileNodeForIndividualStream):235 Relation filenode is for stream "public"."simulation_data" is 1563271
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresQueryUtils(fileNodeForIndividualStream):235 Relation filenode is for stream "public"."simulation_data" is 1563271
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresUtils(isCdc):70 using CDC: false
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresUtils(isXmin):190 using Xmin: true
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresQueryUtils(fileNodeForIndividualStream):235 Relation filenode is for stream "public"."simulation_data" is 1563271
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresQueryUtils(getTableBlockSizeForStream):299 Stream "public"."simulation_data" relation size is 457351168. block size 8192
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresSource(getIncrementalIterators):593 Streams to be synced via ctid : 1
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresSource(getIncrementalIterators):594 Streams: public.simulation_data
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresSource(getIncrementalIterators):605 No Streams will be synced via xmin.
2025-12-15 17:18:36 info INFO main i.a.i.s.p.c.PostgresCtidHandler(queryTableCtid):162 Queueing query for table: simulation_data
2025-12-15 17:18:36 info INFO main i.a.i.s.p.c.InitialSyncCtidIterator(isCdcSync):366 Not running a cdc sync
2025-12-15 17:18:36 info INFO main i.a.i.s.p.c.InitialSyncCtidIterator(ctidQueryPlan):243 Will read 131072 pages to get 1GB
2025-12-15 17:18:36 info INFO main i.a.i.s.p.PostgresQueryUtils(fileNodeForIndividualStream):235 Relation filenode is for stream "public"."simulation_data" is 1563271
2025-12-15 17:18:36 info INFO main i.a.i.s.p.c.InitialSyncCtidIterator(createCtidQueryStatement):300 Preparing query for table: simulation_data
2025-12-15 17:18:36 info INFO main i.a.i.s.p.c.InitialSyncCtidIterator(createCtidQueryStatement):315 Executing query for table simulation_data: SELECT ctid::text, "json_data","id","created_at","updated_at" FROM "public"."simulation_data" WHERE ctid > ?::tid with binding (0,0)
2025-12-15 17:18:36 info INFO main i.a.c.d.j.s.AdaptiveStreamingQueryConfig(initialize):24 Set initial fetch size: 10 rows
2025-12-15 17:18:36 info Stream status TRACE received of status: STARTED for stream public:simulation_data
2025-12-15 17:18:36 info Sending update for public:simulation_data - null -> RUNNING
2025-12-15 17:18:36 info Stream Status Update Received: public:simulation_data - RUNNING
2025-12-15 17:18:36 info Creating status: public:simulation_data - RUNNING
2025-12-15 17:18:36 info Workload successfully transitioned to running state
2025-12-15 17:18:36 info INFO main i.a.c.d.j.s.AdaptiveStreamingQueryConfig(accept):33 Set new fetch size: 3740 rows
2025-12-15 17:18:37 info INFO pool-3-thread-1 i.a.i.d.s.s.SnowflakeDirectLoadSqlGeneratorKt(andLog):33 SHOW COLUMNS IN TABLE "AIRBYTE_DB"."airbyte_internal"."PG_SIMULATIONSIMULATION_DATA9ec27b42ea0af17884797d01a31e0c97"
2025-12-15 17:18:37 info INFO main i.a.c.d.j.s.TwoStageSizeEstimator$Companion(getTargetBufferByteSize):80 Max memory limit: 12197036032, JDBC buffer size: 7318221619
2025-12-15 17:18:39 info INFO main i.a.c.d.j.s.AdaptiveStreamingQueryConfig(accept):33 Set new fetch size: 103809 rows
2025-12-15 17:18:39 info INFO main i.a.c.d.j.s.AdaptiveStreamingQueryConfig(accept):33 Set new fetch size: 102781 rows
2025-12-15 17:18:39 info INFO main i.a.c.d.j.s.AdaptiveStreamingQueryConfig(accept):33 Set new fetch size: 98502 rows
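(Roughly 68 minutes of silence follow the fetch-size line above before the next log entry.) For what it's worth, the sizing figures logged so far are internally consistent. A quick arithmetic check, using values copied verbatim from the log lines above — the ~60% heap-to-buffer ratio is an observation from these numbers, not a documented contract:

```python
# Values copied from the log lines above.
block_size = 8192            # "block size 8192"
pages_per_chunk = 131072     # "Will read 131072 pages to get 1GB"
max_memory = 12_197_036_032  # "Max memory limit"
jdbc_buffer = 7_318_221_619  # "JDBC buffer size"

# 131072 pages of 8 KiB each is exactly 1 GiB, matching the chunk-plan line.
assert pages_per_chunk * block_size == 1 << 30

# The JDBC buffer appears to be ~60% of the max memory limit.
ratio = jdbc_buffer / max_memory
print(round(ratio, 3))  # → 0.6
```

So the reader buffer and chunk plan look sane; the silent window and the 0-row result do not obviously follow from a sizing misconfiguration.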
2025-12-15 18:26:40 info
----- START POST REPLICATION OPERATIONS -----
2025-12-15 18:26:40 info No post-replication operation(s) to perform.
2025-12-15 18:26:40 info
----- END POST REPLICATION OPERATIONS -----
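For context on the ctid-based initial read seen in the logs (`WHERE ctid > ?::tid`, 1 GiB chunks): the general technique is to split the table into ranges of heap pages and query each range by `ctid` bound. A minimal sketch of that chunking arithmetic follows — this illustrates the general approach using the relation size and block size from the logs, and is not Airbyte's actual implementation:

```python
def ctid_chunks(relation_size_bytes: int,
                block_size: int = 8192,
                chunk_bytes: int = 1 << 30):
    """Split a table's heap pages into 1 GiB ctid ranges (illustrative only)."""
    total_pages = -(-relation_size_bytes // block_size)   # ceiling division
    pages_per_chunk = chunk_bytes // block_size           # 131072 for 8 KiB blocks
    bounds = []
    start = 0
    while start < total_pages:
        end = min(start + pages_per_chunk, total_pages)
        # Each chunk would be read as: WHERE ctid > '(start,0)' AND ctid <= '(end,0)'
        bounds.append((f"({start},0)", f"({end},0)"))
        start = end
    return bounds

# Relation size from the logs: 457351168 bytes -> 55829 pages, i.e. a single
# chunk well under the 131072-page (1 GiB) limit.
print(ctid_chunks(457_351_168))  # → [('(0,0)', '(55829,0)')]
```

Given that the whole table fits in one ctid chunk, the entire 5.4M-row read happens inside a single query/result-set iteration, which is consistent with the long window producing no per-chunk progress logs.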