Problem description
I'm trying to use some Scala code in Zeppelin 0.8.0 with the Spark interpreter:
%spark
import scala.beans.BeanProperty
class Node(@BeanProperty val parent: Option[Node]) {
}
But the import does not seem to be taken into account:
import scala.beans.BeanProperty
<console>:14: error: not found: type BeanProperty
@BeanProperty val parent: Option[Node]) {
^
I found that the following code works:
class Node(@scala.beans.BeanProperty val parent: Option[Node]) {
}
This also works fine:
def loadCsv(CSVPATH: String): DataFrame = {
  import org.apache.spark.sql.types._
  // [...] some code
  val schema = StructType(
    firstRow.map(s => StructField(s, StringType))
  )
  // [...] some code again
}
So I guess everything works fine if the import is placed between braces, or if the class is referenced directly by its full path.to.package.Class when used.
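To illustrate the "between braces" case, here is a minimal sketch (not from the original question; it assumes the whole block is run as a single %spark paragraph): keeping the import, the class, and its use inside one block puts them in the same scope, so the annotation resolves.
{
  import scala.beans.BeanProperty            // only visible inside this block
  class Node(@BeanProperty val parent: Option[Node]) {}
  println(new Node(None).getParent)          // prints: None
}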
QUESTION: How do I import outside of a class/function definition?
Recommended answer
Importing by path.to.package.Class works well in Zeppelin. You can try it by importing and using java.sql.Date:
import java.sql.Date
val date = Date.valueOf("2019-01-01")
The problem is with the Zeppelin context: the Spark interpreter's REPL apparently does not keep a top-level import in scope when the annotated class is compiled, so wrapping the import and the class in a single object keeps them in one compilation unit. If you try the following code snippet in Zeppelin, you will see that it works fine:
object TestImport {
  import scala.beans.BeanProperty

  class Node(@BeanProperty val parent: Option[Node]) {}
}

val testObj = new TestImport.Node(None)
testObj.getParent
// prints: Option[Node] = None
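As a small follow-up usage sketch (the names root and child are hypothetical, not part of the original answer), the wrapped class behaves like any other class defined in the notebook:
val root  = new TestImport.Node(None)
val child = new TestImport.Node(Some(root))
child.getParent   // Option[Node] = Some(...)
root.getParent    // Option[Node] = None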
Hope this helps!