Problem description
I am trying to write test cases for Spark Scala application code, so I am planning to use SharedSparkSession for this purpose.
I've seen other frameworks such as com.holdenkarau, but I am looking for an alternative, especially one using SharedSparkSession.
I tried to find sample examples using SharedSparkSession on the web, but I was unable to do so.
If you have any example, please post it.
Recommended answer
Spark's own test framework can be used from Scala, and SharedSparkSession is available there. A few test-jar dependencies have to be included; the Maven coordinates are below and can be converted to Sbt. A ScalaTest example from the Spark source tree: https://apache.googlesource.com/spark/+/master/sql/core/src/test/scala/org/apache/spark/sql/ColumnExpressionSuite.scala
A minimal test sketch follows the dependencies.
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.suffix}</artifactId>
    <version>${spark.version}</version>
    <scope>test</scope>
    <type>test-jar</type>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.suffix}</artifactId>
    <version>${spark.version}</version>
    <scope>test</scope>
    <type>test-jar</type>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-catalyst_${scala.suffix}</artifactId>
    <version>${spark.version}</version>
    <scope>test</scope>
    <type>test-jar</type>
</dependency>
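With those test-jars on the classpath, a suite can mix in SharedSparkSession the same way ColumnExpressionSuite does. The sketch below is illustrative only: the class name, column name, and test data are made up, and it assumes a Spark 3.x layout where the trait lives in org.apache.spark.sql.test (older versions place it elsewhere).

import org.apache.spark.sql.{QueryTest, Row}
import org.apache.spark.sql.test.SharedSparkSession

// Hypothetical suite using Spark's own test harness.
// QueryTest provides checkAnswer; SharedSparkSession provides the shared `spark` session.
class MyDataFrameSuite extends QueryTest with SharedSparkSession {

  // toDF/toDS conversions backed by the shared session
  import testImplicits._

  test("uppercase column values") {
    val input  = Seq("a", "b").toDF("letter")
    val result = input.selectExpr("upper(letter) AS letter")

    // Compares the DataFrame content against the expected rows, ignoring order
    checkAnswer(result, Seq(Row("A"), Row("B")))
  }
}

Run it like any other ScalaTest suite (for example via the scalatest-maven-plugin or sbt test). SharedSparkSession starts one SparkSession for the whole suite and reuses it across tests, which keeps the suite reasonably fast.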