
Connection to Spark SQL JDBC through Java Application fails

I have written a Java application to perform SQL queries on a Hive-based database, using Spark SQL. However, connecting to Spark SQL through JDBC from my Java application gives me a connection refused error. The piece of code that I have written is below:

import java.sql.Connection;
import java.sql.DriverManager;

public static void main(String[] args) throws Exception {
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    Connection conn = DriverManager.getConnection(
            "jdbc:hive2://<ip>:10015/default", "", "");
    System.out.println("Connected!");
}

I am using HDP 2.5 on CentOS 7 inside a VMware Workstation VM on Windows 7. The code is being run from Eclipse Neon with JDK 8.

I am able to connect to port 10000 (for Hive) on the same IP from the code. Also, I have tried to make JDBC connections from Beeline to both 10000 and 10015, and both of them work properly. The Beeline syntax is:

!connect jdbc:hive2://127.0.0.1:10015

The stack trace from the Java code is:

1300 [main] INFO  org.apache.hive.jdbc.HiveConnection  - Could not open client transport with JDBC Uri: jdbc:hive2://192.168.225.128:10015/default
java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.225.128:10015/default: java.net.ConnectException: Connection refused: connect
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:215)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:163)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)

Have you stored the Hive metadata in MySQL? Maybe the MySQL server binds only to localhost. Check /etc/mysql/mysql.conf.d/mysqld.cnf and look for this row:

bind-address = 127.0.0.1

Remove it, then restart the MySQL server.

I have managed to work it out, and the solution is fairly simple: you need to start the Spark Thrift Server to make this work.

Starting the Spark Thrift Server using the Ambari Web UI:

Just go to Ambari -> Spark. In the top-right corner you can see the options; click there and start the Spark Thrift Server.

In case you do not see the Spark Thrift Server in the Web UI, go to SPARK_HOME/sbin and run:

./start-thriftserver.sh --hiveconf hive.server2.thrift.port=10015

Wait a minute or two after the server has started, and then you can make JDBC connections successfully.
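A "Connection refused" error almost always means nothing is listening on the target port yet. As a quick sanity check before retrying JDBC, you can probe the Thrift port with a plain TCP connect from Java (a minimal sketch; the host and timeout below are illustrative, taken from the question's setup):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Returns true if a TCP listener accepts a connection on host:port
    // within timeoutMs milliseconds; false otherwise.
    static boolean isListening(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Host and port from the question; adjust to your environment.
        String host = "192.168.225.128";
        int port = 10015;
        System.out.println(host + ":" + port + " listening? "
                + isListening(host, port, 2000));
    }
}
```

If this prints `false`, the Thrift Server is not up (or a firewall is blocking the port), and no JDBC settings will help until that is fixed.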
