Spark-HBase-GCP template (1/3) - how to locally package the Hortonworks connector?

Problem Description

I'm trying to test the Spark-HBase connector in the GCP context and tried to follow [1], which asks to locally package the connector [2] for Spark 2.4 using Maven (I tried Maven 3.6.3), and this leads to the following issue.

Error "branch-2.4":

[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project shc-core: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. [Help 1]

References

[1] https://github.com/GoogleCloudPlatform/cloud-bigtable-examples/tree/master/scala/bigtable-shc

[2] https://github.com/hortonworks-spark/shc/tree/branch-2.4

Recommended Answer

As suggested in the comments (thanks @Ismail!), using Java 8 works to build the connector:

sdk use java 8.0.275-zulu

mvn clean package -DskipTests

One can then import the jar in Dependencies.scala of the GCP template as follows.

...
val shcCore = "com.hortonworks" % "shc-core" % "1.1.3-2.4-s_2.11" from "file:///<path_to_jar>/shc-core-1.1.3-2.4-s_2.11.jar"
...
// shcCore % (shcVersionPrefix + scalaBinaryVersion) excludeAll(
shcCore excludeAll(
...
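
For context, here is a minimal, self-contained sketch of what a Dependencies.scala along these lines could look like, assuming the usual sbt conventions; the object layout, the shcVersion value, and the exclusion targets are illustrative assumptions, not the template's actual contents:

import sbt._

object Dependencies {
  // Version suffix of the locally built SHC jar (Spark 2.4, Scala 2.11).
  val shcVersion = "1.1.3-2.4-s_2.11"

  // Resolve shc-core straight from the local file instead of a repository.
  val shcCore = "com.hortonworks" % "shc-core" % shcVersion from
    s"file:///<path_to_jar>/shc-core-$shcVersion.jar"

  // Hypothetical exclusions: drop transitive dependencies already provided
  // by the Spark runtime to avoid classpath conflicts.
  val shc: ModuleID = shcCore excludeAll (
    ExclusionRule(organization = "org.apache.spark"),
    ExclusionRule(organization = "org.scala-lang")
  )
}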

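To sanity-check the locally built connector from Spark, a minimal read sketch following the SHC README is shown below; the table name, column family, and column mappings are made-up examples, not part of the template:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog

object ShcSmokeTest {
  // Catalog mapping an HBase table to a DataFrame schema
  // ("testtable" and its columns are illustrative).
  val catalog: String =
    s"""{
       |"table":{"namespace":"default", "name":"testtable"},
       |"rowkey":"key",
       |"columns":{
       |"col0":{"cf":"rowkey", "col":"key", "type":"string"},
       |"col1":{"cf":"cf1", "col":"col1", "type":"string"}
       |}
       |}""".stripMargin

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("shc-smoke-test").getOrCreate()

    // Read the table through the locally packaged shc-core data source.
    val df = spark.read
      .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
      .format("org.apache.spark.sql.execution.datasources.hbase")
      .load()

    df.show()
    spark.stop()
  }
}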