
Running the HBase shell produces an error

My local environment: OS X 10.9.2, HBase 0.98.0, Java 1.6

conf/hbase-site.xml

 <property>
     <name>hbase.rootdir</name>
     <!--<value>hdfs://127.0.0.1:9000/hbase</value> need to run dfs -->
     <value>file:///Users/Apple/Documents/tools/hbase-rootdir/hbase</value>
 </property>

 <property>
     <name>hbase.zookeeper.property.dataDir</name>
     <value>/Users/Apple/Documents/tools/hbase-zookeeper/zookeeper</value>
 </property>
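For reference, a file:// value for hbase.rootdir means standalone mode, in which HBase starts an embedded ZooKeeper (default port 2181) by itself; nothing else has to be running. The fragment below is not from the original post, it only makes that assumption explicit: hbase.cluster.distributed must be false (which is also the default).

```xml
 <!-- Standalone mode: HBase manages its own embedded ZooKeeper.
      false is the default; shown here only for clarity. -->
 <property>
     <name>hbase.cluster.distributed</name>
     <value>false</value>
 </property>
```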

conf/hbase-env.sh

export JAVA_HOME=$(/usr/libexec/java_home -d 64 -v 1.6)
export HBASE_OPTS="-XX:+UseConcMarkSweepGC"

And when I ran

> list

in the HBase shell, I got the following error:

2014-03-29 10:25:53.412 java[2434:1003] Unable to load realm info from SCDynamicStore
2014-03-29 10:25:53,416 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-03-29 10:26:14,470 ERROR [main] zookeeper.RecoverableZooKeeper: ZooKeeper exists failed after 4 attempts
2014-03-29 10:26:14,471 WARN  [main] zookeeper.ZKUtil: hconnection-0x5e15e68d, quorum=localhost:2181, baseZNode=/hbase Unable to set watcher on znode (/hbase/hbaseid)
org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
    at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1041)
    at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:199)
    at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
    at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
    at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:857)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:662)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:414)
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:393)
    at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:274)
    at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:183)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.jruby.javasupport.JavaConstructor.newInstanceDirect(JavaConstructor.java:275)
    at org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:91)
    at org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:178)
    at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
    at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
    at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:182)
    at org.jruby.java.proxies.ConcreteJavaProxy$2.call(ConcreteJavaProxy.java:48)
    at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
    at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
    at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:182)
    at org.jruby.RubyClass.newInstance(RubyClass.java:829)
    ...
    at Users.Apple.Documents.tools.hbase_minus_0_dot_98_dot_0_minus_hadoop2.bin.hirb.block_2$Ruby$start(/Users/Apple/Documents/tools/hbase-0.98.0-hadoop2/bin/hirb.rb:185)
    at Users$Apple$Documents$tools$hbase_minus_0_dot_98_dot_0_minus_hadoop2$bin$hirb$block_2$Ruby$start.call(Users$Apple$Documents$tools$hbase_minus_0_dot_98_dot_0_minus_hadoop2$bin$hirb$block_2$Ruby$start:65535)
    at org.jruby.runtime.CompiledBlock.yield(CompiledBlock.java:112)
    at org.jruby.runtime.CompiledBlock.yield(CompiledBlock.java:95)
    at org.jruby.runtime.Block.yield(Block.java:130)
    at org.jruby.RubyContinuation.enter(RubyContinuation.java:106)
    at org.jruby.RubyKernel.rbCatch(RubyKernel.java:1212)
    at org.jruby.RubyKernel$s$1$0$rbCatch.call(RubyKernel$s$1$0$rbCatch.gen:65535)
    at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
    at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
    at org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:187)
    at Users.Apple.Documents.tools.hbase_minus_0_dot_98_dot_0_minus_hadoop2.bin.hirb.method__5$Ruby$start(/Users/Apple/Documents/tools/hbase-0.98.0-hadoop2/bin/hirb.rb:184)
    at Users$Apple$Documents$tools$hbase_minus_0_dot_98_dot_0_minus_hadoop2$bin$hirb$method__5$Ruby$start.call(Users$Apple$Documents$tools$hbase_minus_0_dot_98_dot_0_minus_hadoop2$bin$hirb$method__5$Ruby$start:65535)
    at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:203)
    at org.jruby.internal.runtime.methods.CompiledMethod.call(CompiledMethod.java:255)
    at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:292)
    at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:135)
    at Users.Apple.Documents.tools.hbase_minus_0_dot_98_dot_0_minus_hadoop2.bin.hirb.__file__(/Users/Apple/Documents/tools/hbase-0.98.0-hadoop2/bin/hirb.rb:190)
    at Users.Apple.Documents.tools.hbase_minus_0_dot_98_dot_0_minus_hadoop2.bin.hirb.load(/Users/Apple/Documents/tools/hbase-0.98.0-hadoop2/bin/hirb.rb)
    at org.jruby.Ruby.runScript(Ruby.java:697)
    at org.jruby.Ruby.runScript(Ruby.java:690)
    at org.jruby.Ruby.runNormally(Ruby.java:597)
    at org.jruby.Ruby.runFromMain(Ruby.java:446)
    at org.jruby.Main.doRunFromMain(Main.java:369)
    at org.jruby.Main.internalRun(Main.java:258)
    at org.jruby.Main.run(Main.java:224)
    at org.jruby.Main.run(Main.java:208)
    at org.jruby.Main.main(Main.java:188)
2014-03-29 10:28:21,137 ERROR [main] client.HConnectionManager$HConnectionImplementation: Can't get connection to ZooKeeper: KeeperErrorCode = ConnectionLoss for /hbase

ERROR: KeeperErrorCode = ConnectionLoss for /hbase
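The whole trace boils down to its first ConnectionLoss line: the shell's client cannot reach ZooKeeper on localhost:2181 (the quorum shown in the WARN line), which in standalone mode means HBase itself is not running. One quick way to confirm that nothing is listening on that port is a plain TCP probe; `zk_port_open` below is a hypothetical helper for illustration, not part of HBase:

```python
import socket

def zk_port_open(host="localhost", port=2181, timeout=2.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        # create_connection does a full TCP connect, so a refused or
        # timed-out attempt (no ZooKeeper listening) raises OSError.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if zk_port_open():
        print("port 2181 is reachable; the problem is elsewhere")
    else:
        print("nothing is listening on localhost:2181 - HBase/ZooKeeper is not up")
```

If the probe fails, no amount of client-side configuration will help; the server side has to be started first, as the answers below describe.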

And my /etc/hosts looks correct:

127.0.0.1   localhost
255.255.255.255 broadcasthost
::1             localhost 
fe80::1%lo0 localhost
127.0.0.1 activate.adobe.com
127.0.0.1 practivate.adobe.com
127.0.0.1 ereg.adobe.com
127.0.0.1 activate.wip3.adobe.com
127.0.0.1 wip3.adobe.com
127.0.0.1 3dns-3.adobe.com
127.0.0.1 3dns-2.adobe.com
127.0.0.1 adobe-dns.adobe.com
127.0.0.1 adobe-dns-2.adobe.com
127.0.0.1 adobe-dns-3.adobe.com
127.0.0.1 ereg.wip3.adobe.com
127.0.0.1 activate-sea.adobe.com
127.0.0.1 wwis-dubc1-vip60.adobe.com
127.0.0.1 activate-sjc0.adobe.com
127.0.0.1 adobe.activate.com
127.0.0.1 209.34.83.73:443
127.0.0.1 209.34.83.73:43
127.0.0.1 209.34.83.73
127.0.0.1 209.34.83.67:443
127.0.0.1 209.34.83.67:43
127.0.0.1 209.34.83.67
127.0.0.1 ood.opsource.net
127.0.0.1 CRL.VERISIGN.NET
127.0.0.1 199.7.52.190:80
127.0.0.1 199.7.52.190
127.0.0.1 adobeereg.com
127.0.0.1 OCSP.SPO1.VERISIGN.COM
127.0.0.1 199.7.54.72:80
127.0.0.1 199.7.54.72
7
Rickie Lau

I faced the same problem and struggled with it for a long time. Following the explanation here, you need to run ./bin/start-hbase.sh first, before running the ./bin/hbase shell command. After that, my problem was solved.
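The order this answer describes can be sketched as a checklist; `plan_startup` below is a hypothetical helper that only prints the steps (paths are relative to the HBase install directory from the question, e.g. hbase-0.98.0-hadoop2):

```shell
# In standalone mode the shell only works once the master (and its
# embedded ZooKeeper) is up, hence the order below.
plan_startup() {
    echo "step 1: ./bin/start-hbase.sh   # start HBase (and embedded ZooKeeper) first"
    echo "step 2: ./bin/hbase shell      # then open the shell and run 'list'"
    echo "step 3: ./bin/stop-hbase.sh    # shut down cleanly when done"
}
plan_startup
```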

12
Kehe CAI

As my hbase-site.xml shows, I had earlier tried to run HBase on HDFS as well, but this time I was trying to run it on the local file system.
Solution: run hadoop.x.x.x/bin/start-dfs.sh first, then hbase.x.x.x/bin/start-hbase.sh. It now runs on the local file system as expected.

3
Chandra kant

I was also facing this problem.

If you are trying standalone mode, use only the HBase libraries: remove the Hadoop jars from your library path and use the Hadoop libraries that ship with HBase.

0
Mark

I faced this problem when I had not added my hostname to the /etc/hosts file.

For example, my hostname is node1, so I added this line to /etc/hosts:

127.0.0.1 node1
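A minimal check of what this fix addresses: HBase resolves the machine's hostname at startup, and a name that is in neither /etc/hosts nor DNS makes that fail. The helper below is illustrative only; `node1` is the example hostname from the answer.

```python
import socket

def resolves(name):
    """Return True if `name` resolves to an IP address on this machine."""
    try:
        socket.gethostbyname(name)
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    # Check both the loopback name and this machine's own hostname.
    for name in ("localhost", socket.gethostname()):
        status = "ok" if resolves(name) else "NOT resolvable - add it to /etc/hosts"
        print(f"{name}: {status}")
```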
0
user1531214