Setting up HBase with Hive

For reference only. After testing this myself, I don't think it is necessary to force SQL (Hive) and NoSQL (HBase), two very different data models, together.

Configure Hive

Copy the HBase jars from the hadoop3 server into the Hive lib directory on the hadoop1 server

$ scp ~/hbase/lib/hbase-protocol-2.1.0.jar hadoop1:~/hive/lib/
$ scp ~/hbase/lib/hbase-server-2.1.0.jar hadoop1:~/hive/lib/
$ scp ~/hbase/lib/hbase-client-2.1.0.jar hadoop1:~/hive/lib/
$ scp ~/hbase/lib/hbase-common-2.1.0.jar hadoop1:~/hive/lib/
$ scp ~/hbase/lib/hbase-common-2.1.0-tests.jar hadoop1:~/hive/lib/

Configure hive-site.xml

$ vi ~/hive/conf/hive-site.xml 

Configuration contents

<configuration>
  <!--jdbc-->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.cj.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>shark</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>shark</value>
  </property>
  <property>
    <name>datanucleus.schema.autoCreateAll</name>
    <value>true</value>
  </property>
  <!--spark engine -->
  <property>
    <name>hive.execution.engine</name>
    <value>spark</value>
  </property>
  <property>
    <name>hive.enable.spark.execution.engine</name>
    <value>true</value>
  </property>
  <!--spark context -->
  <property>
    <name>spark.master</name>
    <value>yarn-cluster</value>
  </property>
  <property>
    <name>spark.serializer</name>
    <value>org.apache.spark.serializer.KryoSerializer</value>
  </property>
  <!--hbase-->
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>hadoop5:2181,hadoop6:2181,hadoop7:2181</value>
  </property>
  <property>
    <name>hive.aux.jars.path</name>
    <value>file://~/hive/lib/hive-hbase-handler-3.1.0.jar,file://~/hive/lib/hbase-protocol-2.1.0.jar,file://~/hive/lib/hbase-server-2.1.0.jar,file://~/hive/lib/hbase-client-2.1.0.jar,file://~/hive/lib/hbase-common-2.1.0.jar,file://~/hive/lib/hbase-common-2.1.0-tests.jar,file://~/hive/lib/zookeeper-3.4.6.jar,file://~/hive/lib/guava-19.0.jar</value>
  </property>
</configuration>

Testing

Use Hive on hadoop1 to create an HBase-backed table, for example:
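
A minimal sketch of the DDL, run in the Hive CLI on hadoop1; the table name hbase_test, the column family cf1, and the column mapping are illustrative assumptions, not from the original setup.

$ ~/hive/bin/hive
-- Create a Hive table stored in HBase via the HBase storage handler
-- ":key" maps the Hive key column to the HBase row key, "cf1:val" to column val in family cf1
CREATE TABLE hbase_test(key INT, value STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
TBLPROPERTIES ("hbase.table.name" = "hbase_test");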

Use the HBase shell on hadoop3 to check whether the table was created successfully, for example:
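
A quick check from the HBase shell, assuming the hbase_test table name used in the previous step and an ~/hbase install directory as in the scp commands above.

$ ~/hbase/bin/hbase shell
# List all tables; hbase_test should appear
list
# Show the table definition, including the cf1 column family
describe 'hbase_test'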
