This article looks at how to run kinit from a Spark program when connecting to Hive; it should be a useful reference for anyone hitting the same problem.
Problem Description
I am trying to connect to Hive (the Hadoop cluster uses Kerberos authentication) from a standalone Spark deployment. Can someone let me know how to do kinit in a Spark program so that I can connect to Hive?
Update: My Spark cluster is separate from the Hadoop cluster.
Solution
Assuming you have a spark-shell open that you don't want to exit just to re-run kinit, you could do something like this:
import java.io.PrintWriter

// Reset your existing Kerberos login
val p1 = Runtime.getRuntime.exec("kdestroy")
p1.waitFor()

// Execute kinit and feed it the password on stdin
val p = Runtime.getRuntime.exec("kinit")
val stdin = p.getOutputStream
val pw = new PrintWriter(stdin)
val pwd = get_password() // get_password() is a function you supply to read your password from a file, or wherever
pw.println(pwd) // you could hard-code your password here instead, but plain-text passwords are generally frowned upon
pw.close()
p.waitFor()
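
The answer assumes a get_password() helper that you write yourself. A minimal sketch, assuming the password is kept in a file readable only by the current user (the path here is hypothetical):

import scala.io.Source

// Hypothetical helper: reads the password from a file that only the
// current user can read (e.g. chmod 600). The path is an assumption.
def get_password(): String = {
  val src = Source.fromFile("/home/myuser/.krb_pass")
  try src.getLines().next().trim
  finally src.close()
}

If a keytab is available, you can avoid password handling entirely by running kinit -k -t /path/to/user.keytab user@REALM instead (the keytab path and principal are placeholders); this is generally the preferred approach for long-running Spark jobs.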
That concludes this article on kinit and Spark when connecting to Hive; we hope the answer above is helpful, and thanks for your support!