Problem description
I am running Spark code as script in Intellij (CE 2017.1) Scala Console on Linux 64 (Fedora 25). I set SparkContext at the start:
import org.apache.spark.{SparkConf, SparkContext}
val conf = new SparkConf().
setAppName("RandomForest").
setMaster("local[*]").
set("spark.local.dir", "/spark-tmp").
set("spark.driver.memory", "4g").
set("spark.executor.memory", "4g")
val sc = new SparkContext(conf)
But the running SparkContext always starts with the same line:
17/03/27 20:12:21 INFO SparkContext: Running Spark version 2.1.0
17/03/27 20:12:21 INFO MemoryStore: MemoryStore started with capacity 871.6 MB
17/03/27 20:12:21 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.65:38119 with 871.8 MB RAM, BlockManagerId(driver, 192.168.1.65, 38119, None)
And the Executors tab in the Spark web UI shows the same amount. Exporting _JAVA_OPTIONS="-Xms2g -Xmx4g" from the terminal before starting also had no effect.
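For reference, one way to check whether any of these settings actually reach the console JVM is to query the runtime and the SparkConf from the same session (a minimal sketch, assuming the sc defined above is in scope; not part of the original question):
// Heap the console JVM was actually started with, in MB
println(s"max heap: ${Runtime.getRuntime.maxMemory / (1024 * 1024)} MB")
// What Spark believes spark.driver.memory is; informational only, since a value
// passed through SparkConf cannot resize a JVM that is already running
println(s"spark.driver.memory: ${sc.getConf.get("spark.driver.memory", "not set")}")
If the first number stays at its default regardless of SparkConf or _JAVA_OPTIONS, the options are not reaching the JVM that hosts the console.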
Accepted answer
The only way to increase the Spark MemoryStore (and, in turn, the Storage Memory shown on the web UI's Executors tab) was to add -Xms2g -Xmx4g to the VM options directly in the IntelliJ Scala Console settings before starting the console.
Now the INFO lines print:
17/03/27 20:12:21 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
17/03/27 20:12:21 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.65:41997 with 2004.6 MB RAM, BlockManagerId(driver, 192.168.1.65, 41997, None)
and the Storage Memory column on the Spark web UI's Executors tab shows 2.1 GB.
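The new capacity can also be confirmed from the console itself (a small sketch using SparkContext.getExecutorMemoryStatus; in local mode the map contains a single entry for the driver, and the raw byte values will differ slightly from the rounded UI figures):
// (maxMem, remainingMem) in bytes per registered block manager
sc.getExecutorMemoryStatus.foreach { case (blockManager, (maxMem, remaining)) =>
  println(s"$blockManager: max ${maxMem / (1024 * 1024)} MB, free ${remaining / (1024 * 1024)} MB")
}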