How to correctly use Spark in ScalaTest tests?

Question: I have multiple ScalaTest classes which use BeforeAndAfterAll to construct a SparkContext and stop it afterwards, like so:

```scala
class MyTest extends FlatSpec with Matchers with BeforeAndAfterAll {
  private var sc: SparkContext = null

  override protected def beforeAll(): Unit = {
    sc = ... // Create SparkContext
  }

  override protected def afterAll(): Unit = {
    sc.stop()
  }

  // my tests follow
}
```

These tests run fine when started from IntelliJ IDEA, but when running `sbt test` I get:

```
WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243).
```

followed by a bunch of other exceptions which are, I suppose, related to this issue.

How do I use Spark correctly here? Do I have to create one global SparkContext for the whole test suite, and if so, how do I do that?

Solution: It seems I lost sight of the wood for the trees; I had forgotten the following line in my build.sbt:

```scala
parallelExecution in test := false
```

With this line, the tests run. By default sbt runs test suites in parallel in the same JVM, so two suites can try to construct a SparkContext at the same time, which Spark forbids (SPARK-2243: only one SparkContext per JVM). Disabling parallel execution runs the suites one after another, so each SparkContext is stopped before the next one is created.
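For completeness, here is the same setting written out as a build.sbt fragment. The `Test / ...` slash form is the sbt 1.x spelling of the same setting; which one you need depends on your sbt version, so pick whichever matches your build:

```scala
// build.sbt (fragment)

// sbt 0.13 style, as in the answer above:
parallelExecution in test := false

// sbt 1.x slash syntax, scoped to the Test configuration
// (use one or the other, not both):
Test / parallelExecution := false
```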
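The `sc = ...` in the question is the author's elision. As an illustration only, a self-contained suite that creates a local-mode context in `beforeAll` might look like the sketch below; the class name, app name, and test body are made up for the example, and the imports assume ScalaTest 3.0-style FlatSpec/Matchers as used in the question:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FlatSpec, Matchers}

class MyTest extends FlatSpec with Matchers with BeforeAndAfterAll {
  private var sc: SparkContext = null

  override protected def beforeAll(): Unit = {
    // Run Spark locally, using as many worker threads as logical cores.
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("MyTest")
    sc = new SparkContext(conf)
  }

  override protected def afterAll(): Unit = {
    // Stop the context so the next suite can create its own.
    sc.stop()
  }

  "An RDD" should "sum its elements" in {
    sc.parallelize(1 to 10).reduce(_ + _) shouldBe 55
  }
}
```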