This article explains how to fix the "libraryDependencies Spark error in build.sbt (IntelliJ)" problem; it may serve as a useful reference for anyone hitting the same issue.

Problem description

I am trying to learn Scala with Spark. I am following a tutorial, but I get an error when I try to import Spark's library dependency:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3"
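As context (an assumption added here, not stated in the question): the `%%` operator makes sbt append the project's Scala binary version to the artifact name, so the line above resolves to `spark-core_2.11` or `spark-core_2.12` depending on `scalaVersion`. The explicit single-`%` form would be:

```scala
// Equivalent to the %% form above when scalaVersion is a 2.12.x release:
libraryDependencies += "org.apache.spark" % "spark-core_2.12" % "2.4.3"
```

If `scalaVersion` is set to a version for which no matching `spark-core_<version>` artifact is published, the dependency cannot be resolved.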

I get the following error: IntelliJ reports 3 unknown artifacts.

What could be the problem here?

My code is very simple; it is just a Hello World.

Recommended answer

You probably need to add a resolver to your build.sbt:

resolvers += "spark-core" at "https://mvnrepository.com/artifact/org.apache.spark/spark-core"

Please note that this library is only published for Scala 2.11 and Scala 2.12.
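Putting this together, a minimal build.sbt that should make the dependency resolve might look like the following (a sketch; the project name and the exact Scala patch version are assumptions):

```scala
// build.sbt — minimal sketch; name and Scala patch version are assumptions
name := "spark-hello-world"
version := "0.1"

// Spark 2.4.3 is published only for Scala 2.11 and 2.12, so pick one of those;
// with a 2.13 (or other) scalaVersion, %% produces an artifact name that does not exist.
scalaVersion := "2.12.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3"
```

After editing build.sbt, reload the sbt project in IntelliJ so the dependencies are re-resolved.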

