Setting up HBase with Hive
For reference only: after testing this myself, I don't think it is worth forcing SQL (Hive) and NoSQL (HBase), two very different data models, together.
Configure Hive
Copy the HBase jars from the hadoop3 server into the Hive lib directory on the hadoop1 server:
$ scp ~/hbase/lib/hbase-protocol-2.1.0.jar hadoop1:~/hive/lib/
$ scp ~/hbase/lib/hbase-server-2.1.0.jar hadoop1:~/hive/lib/
$ scp ~/hbase/lib/hbase-client-2.1.0.jar hadoop1:~/hive/lib/
$ scp ~/hbase/lib/hbase-common-2.1.0.jar hadoop1:~/hive/lib/
$ scp ~/hbase/lib/hbase-common-2.1.0-tests.jar hadoop1:~/hive/lib/
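The five copies above can be sketched as a single loop. This is a hypothetical helper using the same paths as the commands above; with DRY_RUN=1 (the default here) it only prints each scp command so you can review it before running anything:

```shell
# Copy the HBase jars Hive needs from this host to hadoop1 in one loop.
# DRY_RUN=1 only prints each scp command instead of executing it.
DRY_RUN=${DRY_RUN:-1}
for jar in hbase-protocol-2.1.0.jar hbase-server-2.1.0.jar \
           hbase-client-2.1.0.jar hbase-common-2.1.0.jar \
           hbase-common-2.1.0-tests.jar; do
  cmd="scp $HOME/hbase/lib/$jar hadoop1:~/hive/lib/"
  if [ "$DRY_RUN" = "1" ]; then
    echo "$cmd"
  else
    $cmd
  fi
done
```

Set DRY_RUN=0 once the printed commands look right.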
Configure hive-site.xml:
$ vi ~/hive/conf/hive-site.xml
Configuration contents:
<configuration>
  <!-- JDBC connection to the MySQL metastore -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.cj.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>shark</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>shark</value>
  </property>
  <property>
    <name>datanucleus.schema.autoCreateAll</name>
    <value>true</value>
  </property>
  <!-- Spark execution engine -->
  <property>
    <name>hive.execution.engine</name>
    <value>spark</value>
  </property>
  <property>
    <name>hive.enable.spark.execution.engine</name>
    <value>true</value>
  </property>
  <!-- Spark context -->
  <property>
    <name>spark.master</name>
    <value>yarn-cluster</value>
  </property>
  <property>
    <name>spark.serializer</name>
    <value>org.apache.spark.serializer.KryoSerializer</value>
  </property>
  <!-- HBase -->
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>hadoop5:2181,hadoop6:2181,hadoop7:2181</value>
  </property>
  <property>
    <name>hive.aux.jars.path</name>
    <value>file:///home/hadoop/hive/lib/hive-hbase-handler-3.1.0.jar,file:///home/hadoop/hive/lib/hbase-protocol-2.1.0.jar,file:///home/hadoop/hive/lib/hbase-server-2.1.0.jar,file:///home/hadoop/hive/lib/hbase-client-2.1.0.jar,file:///home/hadoop/hive/lib/hbase-common-2.1.0.jar,file:///home/hadoop/hive/lib/hbase-common-2.1.0-tests.jar,file:///home/hadoop/hive/lib/zookeeper-3.4.6.jar,file:///home/hadoop/hive/lib/guava-19.0.jar</value>
  </property>
</configuration>
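Before launching Hive, it is worth confirming that the edited file actually contains the properties this setup depends on. A minimal sketch; the `check_hive_site` helper is hypothetical:

```shell
# Hypothetical sanity check: grep a hive-site.xml for the properties this
# setup depends on, reporting each one as ok or MISSING.
check_hive_site() {
  local conf="$1"
  local prop
  for prop in hive.execution.engine spark.master \
              hbase.zookeeper.quorum hive.aux.jars.path; do
    if grep -q "<name>$prop</name>" "$conf"; then
      echo "$prop: ok"
    else
      echo "$prop: MISSING"
    fi
  done
}
# Usage: check_hive_site ~/hive/conf/hive-site.xml
```

Any MISSING line means the property never made it into the file and Hive will fall back to its defaults silently.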
Testing
Create an HBase-backed table from Hive on hadoop1:
$ ~/hive/bin/hive -hiveconf hbase.master=hadoop3:16000
which: no hbase in (/home/hadoop/spark/bin:/home/hadoop/scala/bin:/home/hadoop/hive/bin:/usr/local/src/apache-maven/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/home/hadoop/hadoop/sbin:/home/hadoop/hadoop/bin:/home/hadoop/.local/bin:/home/hadoop/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = e31e3db2-dc40-482f-8c34-9bf1f4329e18
Logging initialized using configuration in jar:file:/home/hadoop/hive/lib/hive-common-3.1.0.jar!/hive-log4j2.properties Async: true
Hive Session ID = e7646241-e029-4f17-b7e3-cdfc138f558a
hive> create table hive_hbase(key int,value string) stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' with serdeproperties('hbase.columns.mapping'=':key,f:value') tblproperties('hbase.table.name'='hive_hbase');
OK
Time taken: 5.419 seconds
Check from the HBase shell on hadoop3 that the table was created successfully:
$ ~/hbase/bin/hbase shell
OpenJDK 64-Bit Server VM warning: If the number of processors is expected to increase from one, then you should configure the number of parallel GC threads appropriately using -XX:ParallelGCThreads=N
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hbase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase Shell
Use "help" to get list of supported commands.
Use "exit" to quit this interactive shell.
Version 2.1.0, re1673bb0bbfea21d6e5dba73e013b09b8b49b89b, Tue Jul 10 17:26:48 CST 2018
Took 0.0029 seconds
hbase(main):001:0> list;
TABLE
Member
hive_hbase
2 row(s)
Took 0.8008 seconds
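Seeing the table in `list` only proves the metadata mapping exists. To confirm data flows both ways, write a row through Hive and read it back through the HBase shell. A sketch of that round trip; the row values and the RUN switch are hypothetical, and RUN=0 (the default) only prints the commands instead of running them against the cluster:

```shell
# Round-trip check: insert one row via Hive, then fetch the same row key
# from HBase. RUN=0 (the default) just prints the commands for review.
RUN=${RUN:-0}
HIVE_SQL="insert into table hive_hbase values (1, 'hello');"
HBASE_CMD="get 'hive_hbase', '1'"
if [ "$RUN" = "1" ]; then
  ~/hive/bin/hive -e "$HIVE_SQL"
  echo "$HBASE_CMD" | ~/hbase/bin/hbase shell
else
  printf 'hive -e "%s"\n' "$HIVE_SQL"
  printf 'echo "%s" | hbase shell\n' "$HBASE_CMD"
fi
```

If the mapping works, the HBase `get` should show the value under column `f:value`, matching the `hbase.columns.mapping` of `:key,f:value` declared when the table was created.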