I have installed Hadoop 2.6.0, HBase 0.99.0, Hive 1.2, and Kylin 1.5.0.

I set all of the above up in standalone mode, and before running Kylin I verified that Hadoop, HBase, and Hive were working. Everything is installed, but when I start Kylin it fails with an "HBase common lib not found" error.
Below is the Apache Kylin log.

KYLIN_HOME is set to bin/../
16/03/24 18:02:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
kylin.security.profile is set to testing
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/kunalgupta/Downloads/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/kunalgupta/Downloads/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar!/hive-log4j.properties
cut: illegal option -- -
usage: cut -b list [-n] [file ...]
       cut -c list [file ...]
       cut -f list [-s] [-d delim] [file ...]
HIVE_CONF is set to: /Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/apache-hive-1.2.1-bin/conf/, use it to locate hive configurations.
HCAT_HOME is set to: /Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/apache-hive-1.2.1-bin/hcatalog, use it to find hcatalog path:
usage: dirname path
find: -printf: unknown primary or operator
hive dependency: /Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/apache-hive-1.2.1-bin/conf/::/Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/apache-hive-1.2.1-bin/hcatalog/share/hcatalog/hive-hcatalog-core-1.2.1.jar
cut: illegal option -- -
usage: cut -b list [-n] [file ...]
       cut -c list [file ...]
       cut -f list [-s] [-d delim] [file ...]
hbase-common lib not found

Can someone please help me?

Best answer

The problem you are facing is that the `cut` command on Mac OS X does not support the "--output-delimiter" option. I hit the same error while installing kylin-1.5.1.
The easy fix is to make your shell use the GNU binaries instead of the OS X (BSD) ones.
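As a quick illustration (a generic example, not Kylin's actual startup script), GNU `cut` accepts the long option while BSD `cut` rejects it:

```shell
# GNU cut supports long options such as --output-delimiter;
# on macOS the BSD cut fails with "cut: illegal option -- -",
# which is exactly the message seen in the Kylin log above.
printf 'a:b:c\n' | cut -d: -f1,3 --output-delimiter=','
# GNU cut prints: a,c
```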

Install coreutils with brew (I switched all the common shell utilities to their GNU versions).
To do that, use the following command.

brew install coreutils findutils gnu-tar gnu-sed gawk gnutls gnu-indent gnu-getopt --default-names

Now, to make the shell use these instead of the Mac binaries, add the path to these utilities to your shell profile.
vi ~/.profile

Add the following line to this file:
PATH="/usr/local/opt/coreutils/libexec/gnubin:$PATH"
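The Kylin log above also shows `find: -printf: unknown primary or operator`, which comes from BSD `find`, so the GNU findutils path may need the same treatment. The paths below assume Homebrew's default keg locations; you can confirm yours with `brew --prefix coreutils` and `brew --prefix findutils`:

```shell
# Prepend the GNU gnubin directories so they shadow the BSD tools.
# (Homebrew default paths; verify with `brew --prefix <formula>`.)
PATH="/usr/local/opt/findutils/libexec/gnubin:$PATH"
PATH="/usr/local/opt/coreutils/libexec/gnubin:$PATH"
export PATH
```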

After that, open a new terminal window and run:
echo $PATH

The output should include the path we prepended in the previous step.
Now start Kylin and it should work smoothly.
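You can also confirm directly that the shell now resolves the GNU version of `cut` (the expected path assumes the Homebrew coreutils install from the step above):

```shell
# Check which cut is first on the PATH.
which cut
# expected: /usr/local/opt/coreutils/libexec/gnubin/cut

# A GNU build identifies itself in its version banner,
# e.g. "cut (GNU coreutils) 8.x"; BSD cut has no --version option.
cut --version | head -n 1
```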

Some reference links that helped me:
Mac forum link
Installation guide from apple.se

Regarding "hadoop - Apache Kylin could not find the HBase common lib", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/36200523/
