Step 1: Launch IntelliJ IDEA, select Create New Project, choose Scala, click Next, enter the project name wujiadong.spark, and click Next to continue.

Step 2: Import the spark-assembly-1.5.1-hadoop2.6.0.jar

File –> Project Structure –> Libraries –> click the + button –> choose Java –> browse to the downloaded spark-assembly-1.5.1-hadoop2.6.0.jar –> click OK.

Step 3: Create a WordCount class and write the code

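The screenshots here originally showed the WordCount source. A minimal sketch of such a class, written against the Spark 1.5.x RDD API and matching the package name, class name, and command-line argument used in the spark-submit call in step 5 (the exact code in the original screenshots may differ):

```scala
package wujiadong.spark

import org.apache.spark.{SparkConf, SparkContext}

// Minimal word-count sketch for the Spark 1.5.x RDD API.
// The input path (e.g. hdfs://master:9000/wordcount.txt) is passed
// as the first command-line argument, matching the spark-submit call in step 5.
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)

    val counts = sc.textFile(args(0))   // read the input text file
      .flatMap(_.split(" "))            // split each line into words
      .map(word => (word, 1))           // pair each word with a count of 1
      .reduceByKey(_ + _)               // sum the counts per word

    counts.collect().foreach(println)   // print (word, count) pairs to the driver
    sc.stop()
  }
}
```

Printing the collected pairs with `println` produces tuple output of the form `(word,count)`, which is the format seen in the run log in step 5.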

Step 4: Export the jar

Choose File –> Project Structure –> Artifacts, click + –> Jar –> From Modules with dependencies, select the class containing the main function, choose the output jar location in the dialog, and click OK.

Finally, choose Build –> Build Artifacts to compile and generate the jar.

Step 5: Submit and run with spark-submit

hadoop@master:~/wujiadong$ spark-submit --class wujiadong.spark.WordCount  --executor-memory 500m --total-executor-cores 2 /home/hadoop/wujiadong/wujiadong.spark.jar hdfs://master:9000/wordcount.txt
17/02/02 20:27:34 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/02 20:27:37 INFO Slf4jLogger: Slf4jLogger started
17/02/02 20:27:37 INFO Remoting: Starting remoting
17/02/02 20:27:37 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:52310]
17/02/02 20:27:41 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
17/02/02 20:27:44 INFO FileInputFormat: Total input paths to process : 1
17/02/02 20:27:51 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
17/02/02 20:27:51 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
17/02/02 20:27:51 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
17/02/02 20:27:51 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
17/02/02 20:27:51 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
(spark,1)
(wujiadong,1)
(hadoop,1)
(python,1)
(hello,4)
17/02/02 20:27:52 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
17/02/02 20:27:52 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
17/02/02 20:27:52 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
