
How can Spark access a Kerberos-enabled HBase in another cluster? #33

@ighack

Description


I'd like to ask: if I want to read data from cluster B's HBase into cluster A, how should I go about it?

What I have in mind is:

sparkSession.sparkContext.addFile("hdfs://nameservice1/krb5B.conf")
sparkSession.sparkContext.addFile("hdfs://nameservice/clusterB.keytab")

val krb5Path = SparkFiles.get("krb5B.conf")
val principal = config.getJSONObject("auth").getString("principal")
val keytab = SparkFiles.get("clusterB.keytab")

System.setProperty("java.security.krb5.conf", krb5Path);
Configuration conf = new Configuration();
conf.set("hadoop.security.authentication", "Kerberos");
UserGroupInformation.setConfiguration(conf);
UserGroupInformation.loginUserFromKeytab(principal, keytabPath)

Before reading from cluster B, I would call loginUserFromKeytab first, using cluster B's configuration.
Once the data has been read into a DataFrame, I would call loginUserFromKeytab again with cluster A's configuration.

I'm not sure whether this approach is actually feasible.
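
Roughly, the complete flow I am imagining looks something like the sketch below. This is only an illustration: the ZooKeeper quorum, the table name, the output path, and cluster A's principalA/keytabA variables are placeholders I made up, and the DataFrame here only keeps the row key to keep the example short.

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.util.Bytes

// 1. Log in with cluster B's krb5.conf and keytab, exactly as in the snippet above.

// 2. Read the HBase table on cluster B via TableInputFormat
val hbaseConf = HBaseConfiguration.create(conf)
hbaseConf.set("hbase.zookeeper.quorum", "zk-b1,zk-b2,zk-b3")   // placeholder: cluster B's ZooKeeper
hbaseConf.set(TableInputFormat.INPUT_TABLE, "my_table")        // placeholder table name

val hbaseRdd = sparkSession.sparkContext.newAPIHadoopRDD(
  hbaseConf,
  classOf[TableInputFormat],
  classOf[ImmutableBytesWritable],
  classOf[Result])

import sparkSession.implicits._
val df = hbaseRdd
  .map { case (key, _) => Bytes.toString(key.copyBytes()) }    // keep only the row key, just to illustrate
  .toDF("rowkey")
df.cache()
df.count()   // force the scan now, while the cluster B login is still active

// 3. Switch back to cluster A's identity and write the data out
UserGroupInformation.loginUserFromKeytab(principalA, keytabA)  // placeholder: cluster A's principal/keytab
df.write.parquet("hdfs://nameservice1/tmp/from_cluster_b")     // placeholder output path on cluster A

The count() in step 2 is only there to make sure the data is actually pulled from cluster B before switching identities, which is what the "read into a DataFrame first" step above is meant to guarantee.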
