How to override a dependency for certain tasks in sbt

This article explains how to override a dependency for certain tasks in sbt; the question and recommended answer below should be a useful reference for anyone facing the same problem.

Problem Description

I want to override a dependency of the project in a certain task. I have an sbt multi-project build that uses Spark.

lazy val core = // Some Project

val sparkLibs = Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1"
)

val sparkLibsProvided = Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1" % "provided"
)

lazy val main = Project(
  id = "main",
  base = file("main-project"),
  settings = sharedSettings
).settings(
  name := "main",
  libraryDependencies ++= sparkLibs,
  dependencyOverrides ++= Set(
    "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"
  )
).dependsOn(core)

When I try to build a fat jar to submit to my YARN cluster, I use the assembly task from https://github.com/sbt/sbt-assembly. But in this case I want to use sparkLibsProvided instead of sparkLibs, with something like:

lazy val sparkProvided = (project in assembly).settings(
  dependencyOverrides ++= sparkLibsProvided.toSet
)

How can I properly override this dependency?

Recommended Answer

You can create a new project dedicated to building your Spark uber jar with the provided scope:

lazy val sparkUberJar = (project in file("spark-project"))
  .settings(sharedSettings: _*)
  .settings(
    libraryDependencies ++= sparkLibsProvided,
    dependencyOverrides ++= Set(
      "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"
    )
  )
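One caveat (an observation, not part of the original answer): this dedicated project must also depend on the projects that contain your application code, and any compile-scoped Spark dependency those projects declare would leak back in transitively. A minimal sketch, assuming the shared code lives in core and core itself does not declare the Spark libraries:

lazy val sparkUberJar = (project in file("spark-project"))
  .dependsOn(core)  // assumption: pull the application code from `core` into the jar
  .settings(sharedSettings: _*)
  .settings(
    // "provided" keeps Spark on the compile classpath but out of the assembled jar
    libraryDependencies ++= sparkLibsProvided,
    dependencyOverrides ++= Set(
      "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"
    )
  )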

When you assemble in sbt, switch to that project first:

sbt
> project sparkUberJar
> assembly
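A side note, assuming a newer build than the sbt 0.13 this question targets: in sbt 1.x, dependencyOverrides is a Seq[ModuleID] rather than a Set, so the overrides above would be written without Set or .toSet:

// sbt 1.x: dependencyOverrides is a Seq, so Set(...)/.toSet is no longer used
dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"
)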

This concludes this article on how to override a dependency for certain tasks in sbt. We hope the recommended answer is helpful, and we appreciate your continued support!
