Question
I am trying to automatically launch a Spark job on an HDInsight cluster from Microsoft Azure. I am aware that several methods exist to automate Hadoop job submission (provided by Azure itself), but so far I have not been able to find a way to remotely run a Spark job without setting up RDP with the master instance.
Is there any way to achieve this?
Answer
Spark-jobserver provides a RESTful interface for submitting and managing Apache Spark jobs, jars, and job contexts.
My solution is to use a scheduler together with Spark-jobserver to launch the Spark job periodically.
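As a rough illustration of the approach, Spark-jobserver's REST API lets you upload a job jar with `POST /jars/<appName>` and start a job with `POST /jobs?appName=...&classPath=...`, so a scheduled script can drive it remotely. The sketch below builds those two endpoint URLs and issues the submission request; the host, app name, and main class are placeholders, not values from the original question.

```python
import urllib.parse
import urllib.request

def jar_upload_url(base, app_name):
    """spark-jobserver endpoint for uploading a job jar: POST /jars/<appName>."""
    return "{}/jars/{}".format(base, app_name)

def job_submit_url(base, app_name, class_path):
    """spark-jobserver endpoint for starting a job: POST /jobs?appName=...&classPath=..."""
    query = urllib.parse.urlencode({"appName": app_name, "classPath": class_path})
    return "{}/jobs?{}".format(base, query)

def submit_job(base, app_name, class_path, config=b""):
    """POST an (optional) job-config body to start the job; returns the response body."""
    req = urllib.request.Request(
        job_submit_url(base, app_name, class_path), data=config, method="POST"
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Example (hypothetical host and class):
#   submit_job("http://my-jobserver:8090", "myapp", "com.example.SparkMain")
```

A cron entry (or any scheduler) that runs a script like this gives the periodic, RDP-free launch described above.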