I'm using Databricks Connect to run code in my Azure Databricks cluster locally from IntelliJ IDEA (Scala).

Everything works fine. I can connect, debug, and inspect locally in the IDE.

I created a Databricks Job to run my custom app JAR, but it fails with the following exception:

19/08/17 19:20:26 ERROR Uncaught throwable from user code: java.lang.NoClassDefFoundError: com/databricks/service/DBUtils$
at Main$.<init>(Main.scala:30)
at Main$.<clinit>(Main.scala)

Line 30 of my Main.scala class is
val dbutils: DBUtils.type = com.databricks.service.DBUtils

just as described on this documentation page

That page shows a way of accessing DBUtils that works both locally and on the cluster. But the example only shows Python, and I'm using Scala.

What's the right way to access it so that it works both locally using databricks-connect and in a Databricks Job running the JAR?

UPDATE

There seem to be two ways of using DBUtils.

1) The DbUtils class described here.
Quoting the docs, this library allows you to build and compile your project, but not run it. It doesn't let you run your local code on the cluster.

2) Databricks Connect, described here. This lets you run your local Spark code in a Databricks cluster.

The problem is that these two approaches have different setups and package names. There doesn't seem to be a way to use Databricks Connect locally (it isn't available on the cluster) while also adding the jar application that uses the DbUtils class via sbt/maven so that the cluster has access to it.
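
To illustrate the mismatch, here is a sketch of the two access paths side by side (only the package names come from the docs above; the comments are mine):

// 1) dbutils-api library: compiles locally, but only runs on the cluster
import com.databricks.dbutils_v1.DBUtilsHolder.dbutils

// 2) Databricks Connect: runs local code against the cluster
val dbutils = com.databricks.service.DBUtils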

Best Answer

I don't know why the docs you mentioned don't work. Maybe you're using a different dependency?

These docs have an example application that you can download. It's a very minimally tested project, so it doesn't create jobs or try to run them on the cluster -- but it's a start. Also, note that it uses an older version of dbutils-api, 0.0.1.

So, to fix your current issue, instead of using com.databricks.service.DBUtils, try importing dbutils from a different place:

import com.databricks.dbutils_v1.DBUtilsHolder.dbutils

Or, if you prefer:
import com.databricks.dbutils_v1.{DBUtilsV1, DBUtilsHolder}

type DBUtils = DBUtilsV1
val dbutils: DBUtils = DBUtilsHolder.dbutils
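
With either form, the calls themselves look the same as on a cluster; for example (a quick sketch, assuming a dbfs:/tmp directory exists in your workspace):

val files = dbutils.fs.ls("dbfs:/tmp")
files.foreach(f => println(s"${f.path} (${f.size} bytes)"))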

Also, make sure you have the following dependency in SBT (if 0.0.3 doesn't work, try playing with the version -- the latest is 0.0.4):
libraryDependencies += "com.databricks" % "dbutils-api_2.11" % "0.0.3"
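
Note that this artifact name hard-codes the Scala binary version; with scalaVersion set to 2.11.x, the equivalent cross-built form is (purely a matter of style):

libraryDependencies += "com.databricks" %% "dbutils-api" % "0.0.3"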

This question and answer pointed me in the right direction. The answer contains a link to a working Github repository that uses dbutils: waimak. I hope that repo can help you with further questions about Databricks configuration and dependencies.

Good luck!

UPDATE

I see, so we have two similar but not identical APIs, and no good way to switch between the local and the backend version (though Databricks Connect promises that it should just work). Let me propose a workaround.

It's good that Scala makes it convenient to write adapters. Here's a code snippet that should work as a bridge -- the DBUtils object defined here provides a sufficient API abstraction over both versions of the API: the Databricks Connect one at com.databricks.service.DBUtils, and the backend com.databricks.dbutils_v1.DBUtilsHolder.dbutils one. We achieve this by loading and then calling com.databricks.service.DBUtils through reflection -- there are no hard-coded imports of it.

package com.example.my.proxy.adapter

import org.apache.hadoop.fs.FileSystem
import org.apache.spark.sql.catalyst.DefinedByConstructorParams

import scala.util.Try

import scala.language.implicitConversions
import scala.language.reflectiveCalls


trait DBUtilsApi {
    type FSUtils
    type FileInfo

    type SecretUtils
    type SecretMetadata
    type SecretScope

    val fs: FSUtils
    val secrets: SecretUtils
}

trait DBUtils extends DBUtilsApi {
    trait FSUtils {
        def dbfs: org.apache.hadoop.fs.FileSystem
        def ls(dir: String): Seq[FileInfo]
        def rm(dir: String, recurse: Boolean = false): Boolean
        def mkdirs(dir: String): Boolean
        def cp(from: String, to: String, recurse: Boolean = false): Boolean
        def mv(from: String, to: String, recurse: Boolean = false): Boolean
        def head(file: String, maxBytes: Int = 65536): String
        def put(file: String, contents: String, overwrite: Boolean = false): Boolean
    }

    case class FileInfo(path: String, name: String, size: Long)

    trait SecretUtils {
        def get(scope: String, key: String): String
        def getBytes(scope: String, key: String): Array[Byte]
        def list(scope: String): Seq[SecretMetadata]
        def listScopes(): Seq[SecretScope]
    }

    case class SecretMetadata(key: String) extends DefinedByConstructorParams
    case class SecretScope(name: String) extends DefinedByConstructorParams
}

object DBUtils extends DBUtils {

    import Adapters._

    override lazy val (fs, secrets): (FSUtils, SecretUtils) = Try[(FSUtils, SecretUtils)](
        (ReflectiveDBUtils.fs, ReflectiveDBUtils.secrets)    // try to use the Databricks Connect API
    ).getOrElse(
        (BackendDBUtils.fs, BackendDBUtils.secrets)    // if it's not available, use com.databricks.dbutils_v1.DBUtilsHolder
    )

    private object Adapters {
        // The apparent code copying here is for performance -- the ones for `ReflectiveDBUtils` use reflection, while
        // the `BackendDBUtils` call the functions directly.

        implicit class FSUtilsFromBackend(underlying: BackendDBUtils.FSUtils) extends FSUtils {
            override def dbfs: FileSystem = underlying.dbfs
            override def ls(dir: String): Seq[FileInfo] = underlying.ls(dir).map(fi => FileInfo(fi.path, fi.name, fi.size))
            override def rm(dir: String, recurse: Boolean = false): Boolean = underlying.rm(dir, recurse)
            override def mkdirs(dir: String): Boolean = underlying.mkdirs(dir)
            override def cp(from: String, to: String, recurse: Boolean = false): Boolean = underlying.cp(from, to, recurse)
            override def mv(from: String, to: String, recurse: Boolean = false): Boolean = underlying.mv(from, to, recurse)
            override def head(file: String, maxBytes: Int = 65536): String = underlying.head(file, maxBytes)
            override def put(file: String, contents: String, overwrite: Boolean = false): Boolean = underlying.put(file, contents, overwrite)
        }

        implicit class FSUtilsFromReflective(underlying: ReflectiveDBUtils.FSUtils) extends FSUtils {
            override def dbfs: FileSystem = underlying.dbfs
            override def ls(dir: String): Seq[FileInfo] = underlying.ls(dir).map(fi => FileInfo(fi.path, fi.name, fi.size))
            override def rm(dir: String, recurse: Boolean = false): Boolean = underlying.rm(dir, recurse)
            override def mkdirs(dir: String): Boolean = underlying.mkdirs(dir)
            override def cp(from: String, to: String, recurse: Boolean = false): Boolean = underlying.cp(from, to, recurse)
            override def mv(from: String, to: String, recurse: Boolean = false): Boolean = underlying.mv(from, to, recurse)
            override def head(file: String, maxBytes: Int = 65536): String = underlying.head(file, maxBytes)
            override def put(file: String, contents: String, overwrite: Boolean = false): Boolean = underlying.put(file, contents, overwrite)
        }

        implicit class SecretUtilsFromBackend(underlying: BackendDBUtils.SecretUtils) extends SecretUtils {
            override def get(scope: String, key: String): String = underlying.get(scope, key)
            override def getBytes(scope: String, key: String): Array[Byte] = underlying.getBytes(scope, key)
            override def list(scope: String): Seq[SecretMetadata] = underlying.list(scope).map(sm => SecretMetadata(sm.key))
            override def listScopes(): Seq[SecretScope] = underlying.listScopes().map(ss => SecretScope(ss.name))
        }

        implicit class SecretUtilsFromReflective(underlying: ReflectiveDBUtils.SecretUtils) extends SecretUtils {
            override def get(scope: String, key: String): String = underlying.get(scope, key)
            override def getBytes(scope: String, key: String): Array[Byte] = underlying.getBytes(scope, key)
            override def list(scope: String): Seq[SecretMetadata] = underlying.list(scope).map(sm => SecretMetadata(sm.key))
            override def listScopes(): Seq[SecretScope] = underlying.listScopes().map(ss => SecretScope(ss.name))
        }
    }
}

object BackendDBUtils extends DBUtilsApi {
    import com.databricks.dbutils_v1

    private lazy val dbutils: DBUtils = dbutils_v1.DBUtilsHolder.dbutils
    override lazy val fs: FSUtils = dbutils.fs
    override lazy val secrets: SecretUtils = dbutils.secrets

    type DBUtils = dbutils_v1.DBUtilsV1
    type FSUtils = dbutils_v1.DbfsUtils
    type FileInfo = com.databricks.backend.daemon.dbutils.FileInfo

    type SecretUtils = dbutils_v1.SecretUtils
    type SecretMetadata = dbutils_v1.SecretMetadata
    type SecretScope = dbutils_v1.SecretScope
}

object ReflectiveDBUtils extends DBUtilsApi {
    // This throws a ClassNotFoundException when the Databricks Connect API isn't available -- which is much better than
    // the NoClassDefFoundError we would get with a hard-coded import of com.databricks.service.DBUtils .
    // Since we're only using reflection, we can recover if it's not found.
    private lazy val dbutils: DBUtils =
        Class.forName("com.databricks.service.DBUtils$").getField("MODULE$").get(null).asInstanceOf[DBUtils]

    override lazy val fs: FSUtils = dbutils.fs
    override lazy val secrets: SecretUtils = dbutils.secrets

    type DBUtils = AnyRef {
        val fs: FSUtils
        val secrets: SecretUtils
    }

    type FSUtils = AnyRef {
        def dbfs: org.apache.hadoop.fs.FileSystem
        def ls(dir: String): Seq[FileInfo]
        def rm(dir: String, recurse: Boolean): Boolean
        def mkdirs(dir: String): Boolean
        def cp(from: String, to: String, recurse: Boolean): Boolean
        def mv(from: String, to: String, recurse: Boolean): Boolean
        def head(file: String, maxBytes: Int): String
        def put(file: String, contents: String, overwrite: Boolean): Boolean
    }

    type FileInfo = AnyRef {
        val path: String
        val name: String
        val size: Long
    }

    type SecretUtils = AnyRef {
        def get(scope: String, key: String): String
        def getBytes(scope: String, key: String): Array[Byte]
        def list(scope: String): Seq[SecretMetadata]
        def listScopes(): Seq[SecretScope]
    }

    type SecretMetadata = DefinedByConstructorParams { val key: String }

    type SecretScope = DefinedByConstructorParams { val name: String }
}


If you replace the val dbutils: DBUtils.type = com.databricks.service.DBUtils mentioned in your Main with val dbutils: DBUtils.type = com.example.my.proxy.adapter.DBUtils, then everything should work as a drop-in replacement, both locally and remotely.
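
In other words, the only change to Main.scala would be along these lines (a sketch; the body of main is hypothetical):

import com.example.my.proxy.adapter.DBUtils

object Main {
    // Drop-in replacement: same field as before, now pointing at the adapter.
    val dbutils: DBUtils.type = DBUtils

    def main(args: Array[String]): Unit = {
        // Resolves to Databricks Connect locally, or to dbutils_v1 on the cluster.
        dbutils.fs.ls("dbfs:/").foreach(f => println(f.path))
    }
}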

If you get some new NoClassDefFoundErrors, try adding specific dependencies to the JAR job, rearranging them, changing versions, or marking dependencies as provided.
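
For example, in sbt that could look like this (a sketch -- the Spark version here is an assumption; match it to your cluster's runtime):

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-sql" % "2.4.3" % "provided",    // already provided by the Databricks runtime
    "com.databricks" % "dbutils-api_2.11" % "0.0.3"
)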

This adapter isn't pretty and it uses reflection, but I hope it will do as a workaround. Good luck :)
