HBase Bulk Load Failure


This article walks through the fix for an HBase bulk load failure; it should be a useful reference for anyone who runs into the same problem.

Problem Description

I'm able to generate HFiles using a Java program, but whenever I try to import them into my HBase table I get the attached error. I get the same error when, instead of using my Java program, I use completebulkload.
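
For reference, both a Java program and the completebulkload tool end up in the same LoadIncrementalHFiles.doBulkLoad() code path that appears in the stack trace below. A minimal sketch of that call against the 0.94-era API might look as follows; the class name is illustrative, and the table name "test" and the HFile directory are taken from the log output below, so they are only assumptions about the actual setup.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

    public class BulkLoadSketch {
        public static void main(String[] args) throws Exception {
            // Reads hbase-site.xml from the classpath.
            Configuration conf = HBaseConfiguration.create();

            // The target table must already exist; the HFile directory is the
            // output of the HFile-generating job (one subdirectory per column family).
            HTable table = new HTable(conf, "test");
            LoadIncrementalHFiles loader = new LoadIncrementalHFiles(conf);
            loader.doBulkLoad(
                    new Path("hdfs://localhost:9000/hadoopdir/user/data/output/hfiles/test"),
                    table);
            table.close();
        }
    }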



It would be a great help if someone could help me out here. I've been stuck on this for some days now and it's starting to get really frustrating.



Kind regards,
Pieterjan



Exception

12/12/14 17:46:23 WARN mapreduce.LoadIncrementalHFiles: Skipping non-directory hdfs://localhost:9000/hadoopdir/user/data/output/hfiles/test/_SUCCESS
12/12/14 17:46:23 INFO hfile.CacheConfig: Allocating LruBlockCache with maximum size 241.7m
12/12/14 17:46:23 INFO util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
12/12/14 17:46:23 INFO util.ChecksumType: org.apache.hadoop.util.PureJavaCrc32C not available.
12/12/14 17:46:23 ERROR mapreduce.LoadIncrementalHFiles: Unexpected execution exception during splitting
java.util.concurrent.ExecutionException: java.lang.IllegalStateException: The value of the hbase.metrics.showTableName conf option has not been specified in SchemaMetrics
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:252)
at java.util.concurrent.FutureTask.get(FutureTask.java:111)
at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplitPhase(LoadIncrementalHFiles.java:333)
at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:232)
at POC.HBaseTest.TestHBaseRun.run(TestHBaseRun.java:67)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at Main.Main.main(Main.java:27)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.IllegalStateException: The value of the hbase.metrics.showTableName conf option has not been specified in SchemaMetrics
at org.apache.hadoop.hbase.regionserver.metrics.SchemaMetrics.getEffectiveTableName(SchemaMetrics.java:607)
at org.apache.hadoop.hbase.regionserver.metrics.SchemaMetrics.getInstance(SchemaMetrics.java:333)
at org.apache.hadoop.hbase.regionserver.metrics.SchemaConfigured.getSchemaMetrics(SchemaConfigured.java:185)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.close(HFileReaderV2.java:441)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.close(HFileReaderV2.java:419)
at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplit(LoadIncrementalHFiles.java:410)
at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:323)
at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:321)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)
Error: java.lang.IllegalStateException: The value of the hbase.metrics.showTableName conf option has not been specified in SchemaMetrics


Solution

Finally found a solution.



I made sure the $HBASE_HOME/conf folder was on the classpath of my Java application, and in my code I added SchemaMetrics.configureGlobally(conf), which sets the flag to use table names in metric names.

I hope this can help someone later on.

EDIT: I found out this latter step was something I had to do when using HBase 0.94.3.
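
In code, the fix reduces to one extra call before the bulk load starts. The sketch below is a hedged example against the 0.94-era API, not the asker's actual program; the class name is illustrative and the table name and HFile path are placeholders reused from the question.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;
    import org.apache.hadoop.hbase.regionserver.metrics.SchemaMetrics;

    public class BulkLoadFixed {
        public static void main(String[] args) throws Exception {
            // With $HBASE_HOME/conf on the classpath, this picks up hbase-site.xml.
            Configuration conf = HBaseConfiguration.create();

            // The actual fix: initialize the schema metrics flags (including
            // hbase.metrics.showTableName) before any HFile is opened, so that
            // HFileReaderV2.close() no longer throws the IllegalStateException.
            SchemaMetrics.configureGlobally(conf);

            // Placeholder table name and HFile directory; substitute your own.
            HTable table = new HTable(conf, "test");
            new LoadIncrementalHFiles(conf).doBulkLoad(
                    new Path("hdfs://localhost:9000/hadoopdir/user/data/output/hfiles/test"),
                    table);
            table.close();
        }
    }

Note that configureGlobally(conf) reads its settings from the supplied configuration, which is why having $HBASE_HOME/conf (and thus hbase-site.xml) on the classpath matters.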


That wraps up this article on the HBase bulk load failure. We hope the answer above helps, and thank you for your continued support!
