一、Installing the Kerberos client on Windows
1、Official download
http://web.mit.edu/kerberos/dist/
2、Environment variable setup
Download and run the MSI installer; no reboot is required. In PATH, place the Kerberos bin directory before the JDK entry, as early as possible, because the JDK also ships its own kinit, klist, and related commands:
C:\Program Files\MIT\Kerberos\bin
3、krb5.ini configuration
File path: `C:\ProgramData\MIT\Kerberos5\krb5.ini` (the logging configuration is omitted here on the client):
[libdefaults]
dns_lookup_realm = false
dns_lookup_kdc = false
ticket_lifetime = 24h
renew_lifetime = 7d
forwardable = true
rdns = false
pkinit_anchors = FILE:/etc/pki/tls/certs/ca-bundle.crt
default_realm = EASTCOM.CN
# default_ccache_name = KEYRING:persistent:%{uid}
[realms]
EASTCOM.CN = {
  kdc = 192.168.83.128
  admin_server = 192.168.83.128
}
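Both the MIT tools and the JVM read this same file. As a sanity check on its structure, a key can be pulled out of a section with a few lines of code. A minimal sketch (the class and method names are my own, and the parser only handles the simple `key = value` lines used above):

```java
import java.util.List;

public class Krb5IniRealm {
    // Minimal reader: returns the value of `key` inside the [libdefaults] section,
    // or null if it is absent. Not a full ini parser; illustration only.
    static String libdefault(List<String> lines, String key) {
        boolean inSection = false;
        for (String raw : lines) {
            String line = raw.trim();
            if (line.startsWith("[")) {
                inSection = line.equals("[libdefaults]");
            } else if (inSection && line.startsWith(key)) {
                return line.substring(line.indexOf('=') + 1).trim();
            }
        }
        return null;
    }

    public static void main(String[] args) {
        // The same values as in the krb5.ini above
        List<String> ini = List.of(
            "[libdefaults]",
            "default_realm = EASTCOM.CN",
            "[realms]",
            "EASTCOM.CN = {",
            "kdc = 192.168.83.128",
            "}");
        System.out.println(libdefault(ini, "default_realm")); // prints EASTCOM.CN
    }
}
```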
4、Kerberos authentication
PS D:\Environment\jdk-11\jdk-11\11\bin> kdestroy
PS D:\Environment\jdk-11\jdk-11\11\bin> kinit.exe -k -t C:\ProgramData\MIT\Kerberos5\hive.keytab hive/bigdata32@HADOOP.COM
PS D:\Environment\jdk-11\jdk-11\11\bin> kinit hdfs
Password for hdfs@HADOOP.COM:
PS D:\Environment\jdk-11\jdk-11\11\bin> klist
Ticket cache: FILE:C:\temp\krb5.cache
Default principal: hdfs@HADOOP.COM
Valid starting Expires Service principal
04/28/24 09:37:20 04/29/24 09:37:20 krbtgt/HADOOP.COM@HADOOP.COM
PS D:\Environment\jdk-11\jdk-11\11\bin> kdestroy
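Note that the Expires column in the klist output is simply Valid starting plus the `ticket_lifetime = 24h` set in krb5.ini. A small sketch verifying that arithmetic (the date pattern matches this Windows klist build and may differ elsewhere):

```java
import java.time.Duration;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class TicketLifetime {
    // klist on this Windows build prints times as MM/dd/yy HH:mm:ss
    static final DateTimeFormatter FMT = DateTimeFormatter.ofPattern("MM/dd/yy HH:mm:ss");

    // Expiry = "Valid starting" plus the configured ticket lifetime
    static LocalDateTime expiry(String validStarting, Duration lifetime) {
        return LocalDateTime.parse(validStarting, FMT).plus(lifetime);
    }

    public static void main(String[] args) {
        // ticket_lifetime = 24h, matching the klist output above
        LocalDateTime exp = expiry("04/28/24 09:37:20", Duration.ofHours(24));
        System.out.println(exp.format(FMT)); // prints 04/29/24 09:37:20
    }
}
```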
5、Firefox configuration (for the YARN, Hive, and other web UIs)
Change the following settings in about:config:
network.negotiate-auth.trusted-uris : bigdata29,bigdata30,bigdata31,bigdata32
network.auth.use-sspi : false
二、DBeaver configuration for Hive
1、Driver JARs. In testing, each of the following supports Kerberos-authenticated login:
HiveJDBC4.jar、inceptor-sdk-transwarp-6.0.0-SNAPSHOT.jar、hive-jdbc-uber-2.6.3.0-235.jar
2、URL templates
jdbc:hive2://{host}[:{port}][/{database}];AuthMech=1;KrbRealm=HADOOP.COM;KrbHostFQDN={host};KrbServiceName=hive;KrbAuthType=2;principal=hive/bigdata32@HADOOP.COM
A simplified template:
jdbc:hive2://{host}[:{port}][/{database}];principal={user}/_HOST@HADOOP.COM
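The full template can be assembled from its parts. The helper below is a sketch (the class and parameter names are my own); it assumes the service principal has the `serviceName/host@realm` shape used in the examples above:

```java
public class HiveJdbcUrl {
    // Builds a Kerberos Hive JDBC URL in the same shape as the full template above.
    static String build(String host, int port, String database,
                        String realm, String serviceName) {
        return String.format(
            "jdbc:hive2://%s:%d/%s;AuthMech=1;KrbRealm=%s;KrbHostFQDN=%s;"
            + "KrbServiceName=%s;KrbAuthType=2;principal=%s/%s@%s",
            host, port, database, realm, host, serviceName, serviceName, host, realm);
    }

    public static void main(String[] args) {
        // Values used throughout this doc's examples
        System.out.println(build("bigdata32", 10000, "spark", "HADOOP.COM", "hive"));
    }
}
```

For `bigdata32:10000/spark` this yields exactly the URL quoted in the clock-skew error in section 3.2.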
三、Connection problems and fixes
1、Pin DBeaver to a specific JDK by adding the following to dbeaver.ini:
-vm
D:\Environment\db_tools\DBeaverEE\jdk-17\bin
2、First, point DBeaver at the Kerberos configuration file and turn on Kerberos debugging to see what is going wrong; add these JVM options:
-Djava.net.preferIPv4Stack=true
-Djavax.security.auth.useSubjectCredsOnly=false
-Djava.security.krb5.conf=C:\ProgramData\MIT\Kerberos5\krb5.ini
-Dsun.security.krb5.debug=true
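When reproducing the connection outside DBeaver (for example, in a small standalone JDBC test program), the same flags can be set programmatically instead, provided this runs before any GSS/JDBC classes load. A sketch (class and method names are my own):

```java
public class KerberosDebugProps {
    // Mirror the dbeaver.ini JVM flags above; must run before the
    // Kerberos/GSS classes are first initialized.
    static void apply(String krb5ConfPath) {
        System.setProperty("java.net.preferIPv4Stack", "true");
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");
        System.setProperty("java.security.krb5.conf", krb5ConfPath);
        System.setProperty("sun.security.krb5.debug", "true");
    }

    public static void main(String[] args) {
        apply("C:\\ProgramData\\MIT\\Kerberos5\\krb5.ini");
        System.out.println(System.getProperty("java.security.krb5.conf"));
    }
}
```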
3、Problems encountered
3.1、UDP timeouts against the KDC
>>> KDCCommunication: kdc=bigdata29 UDP:88, timeout=30000,Attempt =1, #bytes=643
SocketTimeOutException with attempt: 1
>>> KDCCommunication: kdc=bigdata29 UDP:88, timeout=30000,Attempt =2, #bytes=643
SocketTimeOutException with attempt: 2
>>> KDCCommunication: kdc=bigdata29 UDP:88, timeout=30000,Attempt =3, #bytes=643
SocketTimeOutException with attempt: 3
>>> KrbKdcReq send: error trying bigdata29
java.net.SocketTimeoutException: Receive timed out
Fix: add the following setting to krb5.ini:
udp_preference_limit = 1
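Per the MIT krb5 documentation, `udp_preference_limit` means: KDC requests larger than this many bytes try TCP before UDP. The failing request in the log above was 643 bytes, so a limit of 1 effectively forces TCP. The rule as a sketch (names are my own):

```java
public class KdcTransport {
    // udp_preference_limit: requests larger than this many bytes try TCP first.
    // A limit of 1 therefore forces TCP for any realistic KDC request.
    static String preferredTransport(int requestBytes, int udpPreferenceLimit) {
        return requestBytes > udpPreferenceLimit ? "TCP" : "UDP";
    }

    public static void main(String[] args) {
        // The request from the debug log above (#bytes=643)
        System.out.println(preferredTransport(643, 1)); // prints TCP
    }
}
```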
3.2、Clock skew
DBeaver reports:
Could not open client transport with JDBC Uri: jdbc:hive2://bigdata32:10000/spark;AuthMech=1;KrbRealm=HADOOP.COM;KrbHostFQDN=bigdata32;KrbServiceName=hive;KrbAuthType=2;principal=hive/bigdata32@HADOOP.COM: GSS initiate failed
GSS initiate failed
GSS initiate failed
The DBeaver log file shows:
KrbException: Clock skew too great (37) - PROCESS_TGS
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251)
Fix: check whether the development machine's clock is synchronized with the servers', and sync it to the server clock if not.
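MIT Kerberos rejects requests when the client and KDC clocks differ by more than the permitted skew (the `clockskew` setting, 300 seconds by default), which is exactly the check failing above. A sketch of that check (names are my own):

```java
import java.time.Duration;
import java.time.Instant;

public class ClockSkewCheck {
    // MIT krb5 default clockskew is 300 seconds
    static final Duration MAX_SKEW = Duration.ofSeconds(300);

    // True if the two clocks are within the permitted skew of each other
    static boolean withinSkew(Instant client, Instant kdc) {
        return Duration.between(client, kdc).abs().compareTo(MAX_SKEW) <= 0;
    }

    public static void main(String[] args) {
        Instant kdc = Instant.parse("2024-04-28T09:37:20Z");
        System.out.println(withinSkew(kdc.plusSeconds(120), kdc)); // prints true
        System.out.println(withinSkew(kdc.plusSeconds(600), kdc)); // prints false
    }
}
```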
3.3、Querying Spark-created Hive tables through Hive
The error reported:
SQL Error: java.io.IOException: java.lang.ArrayIndexOutOfBoundsException: 9
After several days of wrestling with this, I'm sharing a summary of these issues so others can avoid the same detours. Pay special attention to clock drift between the server and your local machine: it will keep DBeaver from connecting even though the Kerberos command-line tools authenticate without any problem.