Commit 461db7d

andmarios and claude committed
fix: Add decimal logical type handling in ToAvroDataConverter

Added a case for the "decimal" logical type in the convertFieldValue method to properly convert java.math.BigDecimal values to the ByteBuffer format expected by Avro. This fixes the failing AvroFormatWriterTest test "should write decimal data from the header".

(cherry-picked from c6de95d)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
1 parent 5df23bf commit 461db7d

File tree

1 file changed (+7 −1 lines)
  • kafka-connect-cloud-common/src/main/scala/io/lenses/streamreactor/connect/cloud/common/sink/conversion


kafka-connect-cloud-common/src/main/scala/io/lenses/streamreactor/connect/cloud/common/sink/conversion/ToAvroDataConverter.scala

Lines changed: 7 additions & 1 deletion
@@ -1,5 +1,5 @@
 /*
- * Copyright 2017-2025 Lenses.io Ltd
+ * Copyright 2017-2026 Lenses.io Ltd
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -145,6 +145,12 @@ object ToAvroDataConverter {
           case d: Date => d.getTime * 1000L
           case other   => other
         }
+      case Some("decimal") =>
+        value match {
+          case bd: java.math.BigDecimal =>
+            ByteBuffer.wrap(bd.unscaledValue().toByteArray)
+          case other => other
+        }
       case _ =>
         // No logical type or unhandled logical type - convert based on physical schema type
         convertBySchemaType(value, targetSchema)
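For context, the conversion the diff adds can be sketched as a standalone snippet. The Avro decimal logical type encodes only the unscaled value as big-endian two's-complement bytes; the scale is carried by the schema, not the payload. The helper name decimalToAvroBytes below is hypothetical, chosen for illustration; the real code lives inside convertFieldValue's pattern match.

```scala
import java.nio.ByteBuffer

object DecimalSketch {
  // Hypothetical standalone version of the new "decimal" case:
  // wrap the BigDecimal's unscaled value (two's-complement, big-endian
  // bytes, per the Avro spec) in a ByteBuffer; pass anything else through.
  def decimalToAvroBytes(value: Any): Any = value match {
    case bd: java.math.BigDecimal =>
      ByteBuffer.wrap(bd.unscaledValue().toByteArray)
    case other => other
  }

  def main(args: Array[String]): Unit = {
    val buf = decimalToAvroBytes(new java.math.BigDecimal("12.34"))
      .asInstanceOf[ByteBuffer]
    // The unscaled value of 12.34 (scale 2) is 1234.
    println(new java.math.BigInteger(buf.array()).intValue) // prints 1234

    // Non-decimal values fall through unchanged.
    println(decimalToAvroBytes("unchanged"))
  }
}
```

Note that a reader decoding such bytes must know the scale from the schema to reconstruct 12.34 from the unscaled 1234.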

0 commit comments
