Problem Description
1. My Spark (standalone) cluster: spmaster, spslave1, spslave2.
2. My simple Spark app, which selects some records from MySQL:
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public static void main(String[] args) {
    SparkConf conf = new SparkConf()
            .setMaster("spark://spmaster:7077")
            .setAppName("SparkApp")
            // the MySQL JDBC driver jar was uploaded to all nodes
            .set("spark.driver.extraClassPath", "/usr/lib/spark-1.6.1-bin-hadoop2.6/lib/mysql-connector-java-5.1.24.jar")
            .set("spark.executor.extraClassPath", "/usr/lib/spark-1.6.1-bin-hadoop2.6/lib/mysql-connector-java-5.1.24.jar");
    JavaSparkContext sc = new JavaSparkContext(conf);
    SQLContext sqlContext = new SQLContext(sc);

    String url = "jdbc:mysql://192.168.31.43:3306/mytest";
    Map<String, String> options = new HashMap<String, String>();
    options.put("url", url);
    options.put("dbtable", "mytable");
    options.put("user", "root");
    options.put("password", "password");

    DataFrame jdbcDF = sqlContext.read().format("jdbc").options(options).load();
    // the name registered here must match the table name used in the query below
    jdbcDF.registerTempTable("c_picrecord");
    DataFrame sql = sqlContext.sql("select * from c_picrecord limit 10");
    sql.show(); // shows the result on the Eclipse console
    sc.close();
}
3. My question: when I right-click the project and choose Run As -> Java Application, it runs successfully, and I can find the job on the web UI <spark://spmaster:7077>. I don't understand how this works, and what the difference is compared to using spark-submit.sh.
Recommended Answer
When we use spark-submit.sh to submit an application, spark-submit creates the Spark Context (a.k.a. the Driver) for us by default.
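For comparison, a typical spark-submit invocation for this app might look roughly like the following. The --class value and the final application jar path are hypothetical placeholders, not taken from the question; --jars ships the MySQL driver to the cluster (depending on the Spark version, --driver-class-path may additionally be needed for JDBC drivers):

# NOTE: the --class value and the jar path are hypothetical placeholders
/usr/lib/spark-1.6.1-bin-hadoop2.6/bin/spark-submit \
    --class com.example.SparkApp \
    --master spark://spmaster:7077 \
    --jars /usr/lib/spark-1.6.1-bin-hadoop2.6/lib/mysql-connector-java-5.1.24.jar \
    /path/to/spark-app.jar

Here the launcher script starts the Driver JVM itself, and --master/--jars replace the setMaster(...) and extraClassPath settings that were hard-coded in the code above.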
But when we use the Java API (JavaSparkContext) to connect to the master, the Java application itself becomes the Driver, and through this Driver all applications/jobs are submitted to the master.
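To make the same code spark-submit-friendly, a minimal sketch (our assumption, not code from the question) would drop the hard-coded master URL and classpath, since spark-submit supplies them:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkApp {  // hypothetical class name, matches the --class placeholder above
    public static void main(String[] args) {
        // no setMaster(...) here: spark-submit passes --master spark://spmaster:7077,
        // and --jars distributes the MySQL driver jar
        SparkConf conf = new SparkConf().setAppName("SparkApp");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... same JDBC DataFrame code as in the question ...
        sc.close();
    }
}

Run as a plain Java application in Eclipse, the main method constructs the Driver inside the Eclipse JVM; run through spark-submit, the launcher sets up the classpath and configuration before your main method ever executes.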