
java.lang.IllegalAccessError: ... tried to access method 'org.apache.hadoop.thirdparty.protobuf.LazyStringArrayList #367

@allipatev

Description

A customer noticed that importing data from the Hadoop Distributed File System (HDFS) doesn't work.

It is reproducible with the latest version of the database (2025.2.0) and the latest Cloud Storage Extension (2.9.2):

CREATE or replace JAVA SET SCRIPT CLOUD_STORAGE_EXTENSION."IMPORT_PATH" (...) EMITS (...) AS
%scriptclass com.exasol.cloudetl.scriptclasses.FilesImportQueryGenerator;
  %jar /buckets/bfsdefault/jars/exasol-cloud-storage-extension-2.9.2.jar;
/
;

CREATE or replace JAVA SCALAR SCRIPT CLOUD_STORAGE_EXTENSION."IMPORT_METADATA" (...) EMITS ("FILENAME" VARCHAR(2000) UTF8, "PARTITION_INDEX" VARCHAR(100) UTF8, "START_INDEX" DECIMAL(36,0), "END_INDEX" DECIMAL(36,0)) AS
%scriptclass com.exasol.cloudetl.scriptclasses.FilesMetadataReader;
  %jar /buckets/bfsdefault/jars/exasol-cloud-storage-extension-2.9.2.jar;
/
;

CREATE or replace JAVA SET SCRIPT CLOUD_STORAGE_EXTENSION."IMPORT_FILES" (...) EMITS (...) AS
%scriptclass com.exasol.cloudetl.scriptclasses.FilesDataImporter;
  %jar /buckets/bfsdefault/jars/exasol-cloud-storage-extension-2.9.2.jar;
/
;

IMPORT INTO test.hdfs_test
FROM SCRIPT CLOUD_STORAGE_EXTENSION.IMPORT_PATH WITH
  BUCKET_PATH     = 'hdfs://exasol.com:443/data/*.parquet'
  DATA_FORMAT     = 'PARQUET'
;
SQL Error [22002]: VM error: F-UDF-CL-LIB-1127: F-UDF-CL-SL-JAVA-1002: F-UDF-CL-SL-JAVA-1013: 
com.exasol.ExaUDFException: F-UDF-CL-SL-JAVA-1080: Exception during run 
java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.protocol.proto.ErasureCodingProtos$GetECTopologyResultForPoliciesRequestProto tried to access method 'org.apache.hadoop.thirdparty.protobuf.LazyStringArrayList org.apache.hadoop.thirdparty.protobuf.LazyStringArrayList.emptyList()' (org.apache.hadoop.hdfs.protocol.proto.ErasureCodingProtos$GetECTopologyResultForPoliciesRequestProto and org.apache.hadoop.thirdparty.protobuf.LazyStringArrayList are in unnamed module of loader 'app')
org.apache.hadoop.hdfs.protocol.proto.ErasureCodingProtos$GetECTopologyResultForPoliciesRequestProto.<init>(ErasureCodingProtos.java:10445)
org.apache.hadoop.hdfs.protocol.proto.ErasureCodingProtos$GetECTopologyResultForPoliciesRequestProto.<clinit>(ErasureCodingProtos.java:10948)
java.base/java.lang.Class.forName0(Native Method)
java.base/java.lang.Class.forName(Class.java:375)
jdk.proxy2/jdk.proxy2.$Proxy11.<clinit>(Unknown Source)
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
java.base/java.lang.reflect.Proxy.newProxyInstance(Proxy.java:1053)
java.base/java.lang.reflect.Proxy.newProxyInstance(Proxy.java:1039)
org.apache.hadoop.ipc.ProtobufRpcEngine2.getProxy(ProtobufRpcEngine2.java:124)
org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:712)
org.apache.hadoop.hdfs.NameNodeProxiesClient.createProxyWithAlignmentContext(NameNodeProxiesClient.java:371)
org.apache.hadoop.hdfs.NameNodeProxiesClient.createNonHAProxyWithClientProtocol(NameNodeProxiesClient.java:343)
org.apache.hadoop.hdfs.NameNodeProxiesClient.createProxyWithClientProtocol(NameNodeProxiesClient.java:135)
org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:370)
org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:309)
org.apache.hadoop.hdfs.DistributedFileSystem.initDFSClient(DistributedFileSystem.java:203)
org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:188)
org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3615)
org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:172)
org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3716)
org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3667)
org.apache.hadoop.fs.FileSystem.get(FileSystem.java:557)
com.exasol.cloudetl.bucket.Bucket.fileSystem$lzycompute(Bucket.scala:70)
com.exasol.cloudetl.bucket.Bucket.fileSystem(Bucket.scala:69)
com.exasol.cloudetl.bucket.Bucket.getPaths(Bucket.scala:79)
com.exasol.cloudetl.emitter.FilesMetadataEmitter.<init>(FilesMetadataEmitter.scala:27)
com.exasol.cloudetl.scriptclasses.FilesMetadataReader$.run(FilesMetadataReader.scala:33)
com.exasol.cloudetl.scriptclasses.FilesMetadataReader.run(FilesMetadataReader.scala)
com.exasol.ExaWrapper.run(ExaWrapper.java:215)
 (Session: 1855640615434125312)

I presume this is a problem with one of the Cloud Storage Extension's dependencies, similar to scalapb/ScalaPB#1657. The `IllegalAccessError` on `LazyStringArrayList.emptyList()` suggests that the generated HDFS protocol classes were compiled against a newer relocated (`org.apache.hadoop.thirdparty`) protobuf than the copy that ends up on the UDF classpath.
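As a quick diagnostic one could check which relocated protobuf entries the fat jar actually bundles, e.g. whether two copies of `LazyStringArrayList` are present. This is my own sketch, not part of the extension; the class path fragment comes from the stack trace above, and the jar path is the one from the `CREATE SCRIPT` statements:

```python
import zipfile

def find_relocated_protobuf_entries(
    jar_path,
    needle="thirdparty/protobuf/LazyStringArrayList",
):
    """List jar entries whose path contains the relocated protobuf class.

    Two or more hits would point at duplicate bundled copies (a classpath
    conflict); a single hit still leaves a plain version mismatch possible.
    """
    with zipfile.ZipFile(jar_path) as jar:
        return [name for name in jar.namelist() if needle in name]

# Example call against the jar referenced in the scripts above:
# find_relocated_protobuf_entries(
#     "/buckets/bfsdefault/jars/exasol-cloud-storage-extension-2.9.2.jar")
```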
