Hello,
NOTE: Everything below was tried in "indirect" write mode.
I am trying to upgrade my Spark BigQuery connector from 0.23.0 to 0.41.1. As soon as I did, I started getting this error:
Invalid table ID "test_b_deals$202602051235232241136685". Table IDs must be alphanumeric (plus underscores) and must be at most 1024 characters long. Also, Table decorators cannot be used.
To work around this, I removed .option("datePartition", partition) from my writer:
val partition = partitionType match {
  case "HOUR"  => DateTimeUtil.DATE_FORMATTER_yyyyMMddHH.format(date)
  case "DAY"   => DateTimeUtil.DATE_FORMATTER_yyyyMMdd.format(date)
  case "MONTH" => DateTimeUtil.DATE_FORMATTER_yyyyMM.format(date)
}
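For context, DateTimeUtil is an internal helper; a minimal stand-in using java.time (an assumption, the real object may differ) that matches the case branches above would look like:

```scala
import java.time.format.DateTimeFormatter

// Hypothetical stand-in for the internal DateTimeUtil helper referenced above;
// the patterns correspond to the HOUR / DAY / MONTH branches of the match.
object DateTimeUtil {
  val DATE_FORMATTER_yyyyMMddHH: DateTimeFormatter = DateTimeFormatter.ofPattern("yyyyMMddHH")
  val DATE_FORMATTER_yyyyMMdd: DateTimeFormatter   = DateTimeFormatter.ofPattern("yyyyMMdd")
  val DATE_FORMATTER_yyyyMM: DateTimeFormatter     = DateTimeFormatter.ofPattern("yyyyMM")
}

// e.g. DateTimeUtil.DATE_FORMATTER_yyyyMMddHH.format(java.time.LocalDateTime.of(2026, 2, 5, 12, 35))
// yields "2026020512"
```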
writer
  .format("bigquery")
  .mode(SaveMode.Overwrite)
  .option("writeMethod", "indirect")
  .option("table", s"$dataset.$table") // dataset: b_aggregated, table: test_b_deals (passed from config)
  // .option("datePartition", partition)
  .option("allowFieldAddition", allowFieldAddition) // true
  .option("allowFieldRelaxation", allowFieldRelaxation) // true
  .option("partitionField", partitionField) // hour_timestamp
  .option("partitionType", partitionType) // HOUR
I have also already set dynamic partition overwrite: .set("spark.sql.sources.partitionOverwriteMode", "DYNAMIC").
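For completeness, this is roughly how the setting is applied when the session is built (a sketch; the builder details here are illustrative, not my exact job code):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Dynamic partition overwrite so SaveMode.Overwrite only replaces the
// partitions present in the incoming DataFrame, not the whole table.
val conf = new SparkConf()
  .set("spark.sql.sources.partitionOverwriteMode", "DYNAMIC")

val spark = SparkSession.builder()
  .config(conf)
  .getOrCreate()
```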
As soon as I tried this, I started getting another error:
[2026-02-09, 14:45:10 UTC] {pod_manager.py:226} INFO - Caused by: com.google.cloud.bigquery.BigQueryException: Schema update options should only be specified with WRITE_APPEND disposition, or with WRITE_TRUNCATE disposition on a table partition.
Even with DYNAMIC set via the proper config, I am stuck in a loop: adding datePartition fails with the first error, and removing it fails with the second.
My requirement is to use indirect write with the latest connector version that supports Spark 3.5 and Hadoop 3.4.2. I also tried version 0.34.0, but the result is the same and I remain stuck in the loop.
Could someone please help? Thanks!
Table details:
