Replies: 4 comments 2 replies
-
@chrigehr, that version for Trino doesn't look correct; can you verify?
-
In the description above I forgot to mention that the database creation, the table creation, and the insert into the table were all done via Spark; the read access via Trino and the Iceberg connector then fails as described. Today I ran another test in which the database creation, table creation, and insert were done using Trino itself, so essentially everything took place via Trino. To my surprise this worked as expected, with no problems or errors at all. I also repeated the test with Trino 410 instead of 409, but the behaviour was the same.
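For context, the Spark side was configured roughly as follows; this is a sketch of a typical Iceberg REST catalog setup in `spark-defaults.conf`, with the catalog name, endpoint, and warehouse path as placeholders from my local docker-compose environment:

```properties
# Sketch of the Spark catalog configuration (names/URIs are placeholders)
spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
spark.sql.catalog.rest=org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.rest.catalog-impl=org.apache.iceberg.rest.RESTCatalog
spark.sql.catalog.rest.uri=http://rest-catalog:8181
spark.sql.catalog.rest.io-impl=org.apache.iceberg.aws.s3.S3FileIO
spark.sql.catalog.rest.s3.endpoint=http://minio:9000
spark.sql.catalog.rest.warehouse=s3://warehouse/
```

With a configuration like this, Spark writes table data and metadata to MinIO via `S3FileIO`, and the table is registered in the REST catalog that Trino later reads from.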
-
Further details: I used Spark 3.3.2 and Iceberg 1.1.0. Trino log output:

```
trino-coordinator | 2023-03-10T15:48:10.331Z INFO Query-20230310_154810_00008_a7egi-336 org.apache.iceberg.CatalogUtil Loading custom FileIO implementation: org.apache.iceberg.aws.s3.S3FileIO
```
-
After some tests I could narrow the error down to Trino and the Iceberg connector in interaction with the RESTCatalog; Spark has nothing to do with it. Furthermore, the same setup works without any problems with the JDBCCatalog. I'm testing with Trino 410, the Iceberg connector + RESTCatalog, and MinIO. A specific sequence of steps always leads to the described error.
If I'm not mistaken, this looks like a bug in the RESTCatalog: creating a table in the Iceberg catalog seems to reload something (e.g. a Java class) or temporarily set some configuration on the catalog.
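For reference, my Trino catalog configuration (`etc/catalog/iceberg.properties`) looks roughly like this; the endpoint, credentials, and URIs are placeholders, and the `hive.s3.*` keys are the legacy S3 properties the Iceberg connector used around Trino 409/410:

```properties
# Sketch of the Trino Iceberg REST catalog configuration (values are placeholders)
connector.name=iceberg
iceberg.catalog.type=rest
iceberg.rest-catalog.uri=http://rest-catalog:8181
hive.s3.endpoint=http://minio:9000
hive.s3.aws-access-key=minioadmin
hive.s3.aws-secret-key=minioadmin
hive.s3.path-style-access=true
```

The same catalog definition is used for both the working (Trino-created tables) and failing (Spark-created tables) cases.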
-
Hi Trino Community.
I'm having a problem with a Trino (409) + Iceberg connector + RESTCatalog + files on S3/MinIO setup (a local docker/docker-compose setup on my workstation). Creating databases and tables works with this setup, but a SELECT operation on a table fails because the connector cannot load the class "org.apache.iceberg.aws.s3.S3FileIO". As a fallback the connector tries to load a Hadoop filesystem provider for "s3a", but this also fails. The interesting part seems to happen in the class "org.apache.iceberg.io.ResolvingFileIO".
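A minimal repro of the failing read looks like this (catalog, schema, and table names are placeholders from my local setup):

```sql
-- Schema and table were created earlier via Spark against the same REST catalog.
-- Metadata operations like this succeed:
SHOW TABLES FROM iceberg.demo;

-- But reading the data fails:
SELECT * FROM iceberg.demo.events;
-- Query fails: connector cannot load org.apache.iceberg.aws.s3.S3FileIO,
-- then the fallback lookup of a Hadoop filesystem for "s3a" also fails.
```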
Am I doing something wrong here? Are the iceberg-aws or hadoop-aws libraries missing for some reason?
I searched issues and pull requests for this, but so far have only found one possible connection, #16455.
Update: #16213 also seems very similar, but if I understand it correctly there are no concrete solutions in that ticket.
I'd be grateful for any hints.
Best regards,
Chris