I have stored the access key and secret key in a file in HDFS for accessing AWS:

hadoop credential create fs.s3a.access.key -provider jceks://hdfs/user/dev/keys.jceks -value ****************

hadoop credential create fs.s3a.secret.key -provider jceks://hdfs/user/dev/keys.jceks -value **********

I want to connect to an SQS queue and S3 from Java code using the jceks file.

Best answer

I was able to solve this with the following code:

Java code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.alias.CredentialProviderFactory;

// Read the AWS keys back through the Hadoop credential provider API
Configuration hadoopConfiguration = sparkSession.sparkContext().hadoopConfiguration();
log.info("CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH : " + hadoopConfiguration.get(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH));
String accessKey = new String(hadoopConfiguration.getPassword("fs.s3a.access.key"));
String secretKey = new String(hadoopConfiguration.getPassword("fs.s3a.secret.key"));
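Note that getPassword only resolves against the JCEKS store when hadoop.security.credential.provider.path is already set on the configuration (the Scala version below sets it explicitly). A minimal Java sketch of setting it, assuming the keys.jceks path from the question:

// Point the configuration at the JCEKS store created with "hadoop credential create"
// before calling getPassword(); otherwise the lookup falls back to plain config values.
hadoopConfiguration.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH,
        "jceks://hdfs/user/dev/keys.jceks");

The same property can also be supplied at submit time (for example --conf spark.hadoop.hadoop.security.credential.provider.path=jceks://hdfs/user/dev/keys.jceks), since Spark copies spark.hadoop.* settings into the Hadoop configuration.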

Scala code:

import org.apache.hadoop.security.alias.CredentialProviderFactory

val hadoopConfiguration = sparkSession.sparkContext.hadoopConfiguration
// Point the configuration at the JCEKS store, then read the keys back
hadoopConfiguration.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH, keyFileHdfsPath)
val access_Key = hadoopConfiguration.getPassword("fs.s3a.access.key").mkString
val secret_Key = hadoopConfiguration.getPassword("fs.s3a.secret.key").mkString
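To actually connect to S3 and SQS with the keys read from the jceks file, they can be handed to the AWS SDK. A minimal sketch, assuming the AWS SDK for Java v1 is on the classpath and using a placeholder region (both are assumptions, not part of the original answer):

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;

// Wrap the keys retrieved above in a static credentials provider
BasicAWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
AWSStaticCredentialsProvider credentialsProvider = new AWSStaticCredentialsProvider(credentials);

// Build the S3 and SQS clients; "us-east-1" is a placeholder region
AmazonS3 s3 = AmazonS3ClientBuilder.standard()
        .withCredentials(credentialsProvider)
        .withRegion("us-east-1")
        .build();

AmazonSQS sqs = AmazonSQSClientBuilder.standard()
        .withCredentials(credentialsProvider)
        .withRegion("us-east-1")
        .build();

From there, calls such as sqs.receiveMessage(queueUrl) or s3.getObject(bucket, key) work as usual; queueUrl, bucket, and key are placeholders.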

Regarding java - how to connect to an AWS SQS queue and S3 using a jceks file stored in HDFS, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/58516601/
