# Connecting to Apache Hadoop Hive via JDBC

Reference: https://intl.cloud.tencent.com/document/product/1026/31147

> pom.xml

```xml=
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com</groupId>
  <artifactId>hive_example</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>

  <name>hive_example</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-jdbc</artifactId>
      <version>1.1.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.6.0</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
          <encoding>utf-8</encoding>
        </configuration>
      </plugin>
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <configuration>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
```

Check which Hive and Hadoop versions your cluster is running:

```shell=
hive --version
hadoop version
```

The hive-jdbc and hadoop-common dependency versions must match the versions installed on your cluster, otherwise you will hit this error:

```shell=
Required field 'client_protocol' is unset!
```

---

> App.java

```java=
package com.hive_example;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class App {
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";

    public static void main(String[] args) throws SQLException {
        // Load the HiveServer2 JDBC driver
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }

        // Connect to HiveServer2 (empty user name and password)
        Connection con = DriverManager
                .getConnection("jdbc:hive2://[your uri]", "", "");
        Statement stmt = con.createStatement();

        String sql = "[your sql code]";
        System.out.println("Running: " + sql);

        // Run the query once and print the first column of each row
        ResultSet res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1));
        }

        res.close();
        stmt.close();
        con.close();
    }
}
```

At first I went into the hive shell and ran `set hive.metastore.uris;` to look up the URI, which returned:

```
hive.metastore.uris=thrift://cluster_____:9083
```

So I used that URI to build the JDBC connection string, but kept getting:

```java=
java.sql.SQLException: Could not open client transport with JDBC Uri
```

It turned out the port was wrong: that setting gives the metastore's default port, and the metastore does not expose the same Thrift API as HiveServer. The answer I found was:

> [You need to specify the host + port of hiveserver(1 or 2). The default port for it is 10000.](https://stackoverflow.com/questions/17027676/invalid-method-name-execute-error-with-hive-client-in-java)

After switching to port 10000 the problem went away.

Another small mistake along the way: adding a `;` at the end of the HiveQL statement in the Java code.

> You CANNOT use ';' at the end of the SQL statement when you're using Java:
> ```java=
> LOAD DATA LOCAL INPATH '/tmp/sql.txt' INTO TABLE files
> ```
> otherwise you get a ParseException about EOF in Hive.

In Eclipse: Run As > Maven Build > Goals: `package`.
On the console, use the jar whose file name contains `-with-dependencies`.

Final Result:
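As a side note, this is roughly what the same package-and-run flow looks like from the command line instead of Eclipse. This is only a sketch: it assumes the artifact coordinates from the pom.xml above (artifactId `hive_example`, version `0.0.1-SNAPSHOT`) and that `mvn` and `java` are on the PATH.

```shell=
# Build the fat jar; the assembly plugin writes it to
# target/hive_example-0.0.1-SNAPSHOT-jar-with-dependencies.jar
mvn clean package

# Run the example class against the jar that bundles all dependencies
java -cp target/hive_example-0.0.1-SNAPSHOT-jar-with-dependencies.jar com.hive_example.App
```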