
[Repost] Connecting to HiveServer2 from Java with Kerberos Authentication (JDBC and Thrift)


To connect from Java to Hadoop cluster services protected by Kerberos, the essential step is to perform Kerberos authentication in Java first; the remaining connection code is largely the same as against an unsecured cluster.

Without further ado, the code:

KerberosLogin: the program must call this method first to authenticate with Kerberos. It is the equivalent of running kinit in the operating system; this step effectively performs a kinit inside the JVM.

import java.io.IOException;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLogin {

    private Log logger = LogFactory.getLog(KerberosLogin.class);

    public void login() {
        String hiveUserName = "xxxxx@BCHKDC"; // Kerberos principal of the user
        String hiveKeytab = "F:/NL_MYECLIPSE2014_WORK/usrkrb5/conf/xxxxx.keytab"; // keytab file of the user
        String krbconf = "F:/NL_MYECLIPSE2014_WORK/usrkrb5/conf/krb5.conf"; // krb5 configuration file

        System.setProperty("java.security.krb5.conf", krbconf);
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "Kerberos");
        UserGroupInformation.setConfiguration(conf);
        try {
            UserGroupInformation.loginUserFromKeytab(hiveUserName, hiveKeytab);
        } catch (IOException e) {
            logger.error("Kerberos login fail.", e);
        }
    }
}
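
As a quick sanity check, the sketch below (LoginCheck is a name introduced here purely for illustration) calls login() and then prints the UGI state; after a successful loginUserFromKeytab the login user should show the Kerberos principal:

import java.io.IOException;

import org.apache.hadoop.security.UserGroupInformation;

public class LoginCheck {
    public static void main(String[] args) throws IOException {
        new KerberosLogin().login();
        // After loginUserFromKeytab succeeds, the UGI login user carries the Kerberos principal.
        System.out.println("Security enabled: " + UserGroupInformation.isSecurityEnabled());
        System.out.println("Login user: " + UserGroupInformation.getLoginUser());
    }
}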

Example of initializing the Thrift interface; first, the utility class definition:

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import javax.security.sasl.Sasl;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hive.service.cli.thrift.TCLIService;
import org.apache.hive.service.cli.thrift.TOpenSessionReq;
import org.apache.hive.service.cli.thrift.TOpenSessionResp;
import org.apache.thrift.TException;
import org.apache.thrift.transport.TSaslClientTransport;
import org.apache.thrift.transport.TSocket;
import org.apache.thrift.transport.TTransport;

public class QueryTool {

    private static Log logger = LogFactory.getLog(QueryTool.class);

    public static TTransport getSocketInstance(String host, int port) throws IOException {
        TTransport transport = new TSocket(host, port);
        Map<String, String> saslProperties = new HashMap<String, String>();
        saslProperties.put(Sasl.QOP, "auth");         // "javax.security.sasl.qop" - key parameter for Kerberos authentication
        saslProperties.put(Sasl.SERVER_AUTH, "true"); // "javax.security.sasl.server.authentication" - key parameter for Kerberos authentication

        logger.info("Security is enabled: " + UserGroupInformation.isSecurityEnabled());

        UserGroupInformation currentUser = UserGroupInformation.getCurrentUser();
        logger.info("Current user: " + currentUser);

        TSaslClientTransport saslTransport = new TSaslClientTransport(
                "GSSAPI",        // SASL mechanism: GSSAPI (Kerberos)
                null,            // authorization id - null
                "hive",          // Kerberos primary of the server - the "hive" in hive/myservername@MY.REALM
                "myservername",  // Kerberos instance of the server - the "myservername" in hive/myservername@MY.REALM
                saslProperties,  // SASL properties set above
                null,            // callback handler - null
                transport);      // underlying transport

        // Wrap the SASL transport so that it is opened inside the current user's UGI context.
        TUGIAssumingTransport ugiTransport = new TUGIAssumingTransport(saslTransport, currentUser);

        return ugiTransport;
    }

    /**
     * If the connection runs as the user passed to this method, HiveServer2 must have
     * impersonation enabled: hive.server2.enable.impersonation / hive.server2.enable.doAs = true
     * (i.e. the "HiveServer2 Default Group" box is checked). Returns the TOpenSessionResp.
     *
     * @return
     * @throws TException
     */
    public static TOpenSessionResp openSession(TCLIService.Client client, String user, String pwd) throws TException {
        TOpenSessionReq openSessionReq = new TOpenSessionReq();
        openSessionReq.setUsername(user);
        openSessionReq.setPassword(pwd);
        openSessionReq.setUsernameIsSet(true);

        return client.OpenSession(openSessionReq);
    }

    public static TOpenSessionResp openSession(TCLIService.Client client) throws TException {
        TOpenSessionReq openSessionReq = new TOpenSessionReq();
        return client.OpenSession(openSessionReq);
    }
}

 


Beyond this, usage is no different from working with the Thrift interface normally; the definitions above are the critical part.
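
To make the flow concrete, here is a minimal end-to-end sketch that combines KerberosLogin and QueryTool, assuming the same Hive 1.x Thrift bindings imported above (in Hive 2.x these classes live under org.apache.hive.service.rpc.thrift instead); ThriftQueryExample, the host "myservername", port 10000 and the "show tables" statement are illustrative placeholders:

import org.apache.hive.service.cli.thrift.TCLIService;
import org.apache.hive.service.cli.thrift.TExecuteStatementReq;
import org.apache.hive.service.cli.thrift.TOpenSessionResp;
import org.apache.thrift.protocol.TBinaryProtocol;
import org.apache.thrift.transport.TTransport;

public class ThriftQueryExample {
    public static void main(String[] args) throws Exception {
        new KerberosLogin().login();                                   // kinit inside the JVM first
        TTransport transport = QueryTool.getSocketInstance("myservername", 10000);
        transport.open();                                              // SASL/GSSAPI handshake runs inside the UGI context

        TCLIService.Client client = new TCLIService.Client(new TBinaryProtocol(transport));
        TOpenSessionResp sessionResp = QueryTool.openSession(client);  // session without impersonation

        TExecuteStatementReq execReq =
                new TExecuteStatementReq(sessionResp.getSessionHandle(), "show tables");
        client.ExecuteStatement(execReq);                              // submit a statement

        transport.close();
    }
}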

For the JDBC approach, Kerberos authentication is likewise performed once at program startup; after that, the code is identical to the non-Kerberos case.

// Requires java.sql.Connection and java.sql.DriverManager.
public Connection getConnetionHive() {
    Connection conn = null;
    try {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // The principal in the JDBC URL is the HiveServer2 service principal, not the client's.
        conn = DriverManager.getConnection(
                "jdbc:hive2://192.168.215.26:10000/mydatabase;principal=hive/myservername@BCHKDC");
        System.out.println("Connected to mydatabase");
    } catch (Exception e) {
        try {
            System.out.println("Trying the standby connection");
            conn = DriverManager.getConnection(
                    "jdbc:hive2://192.168.215.27:10000/mydatabase;principal=hive/myservername@BCHKDC");
            System.out.println("Standby connection to mydatabase succeeded");
        } catch (Exception e1) {
            System.out.println("Both primary and standby connections failed");
            e1.printStackTrace();
        }
    }
    return conn;
}
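
A minimal end-to-end sketch of the JDBC path, assuming getConnetionHive() above lives in a class named HiveJdbcClient (that class name and the "show tables" query are placeholders introduced here for illustration):

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcExample {
    public static void main(String[] args) throws Exception {
        new KerberosLogin().login();                 // authenticate before any JDBC call
        Connection conn = new HiveJdbcClient().getConnetionHive();
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("show tables")) {
            while (rs.next()) {
                System.out.println(rs.getString(1)); // first column: table name
            }
        }
        conn.close();
    }
}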


Original article: https://blog.csdn.net/supperman_009/article/details/88366269

Source: https://www.cnblogs.com/zhangrui153169/p/13129620.html