Submit & kill a Spark application programmatically from another application

This article discusses how to submit and programmatically kill Spark applications from another application; the recommended answer below may be a useful reference.

Problem Description

I am wondering if it is possible to submit, monitor & kill Spark applications from another service.

My requirements are as follows:

I have written a service that:

  1. parses user commands
  2. translates them into understandable arguments for an already prepared Spark-SQL application
  3. submits the application along with the arguments to the Spark cluster using spark-submit from ProcessBuilder (a sketch of such a wrapper script follows this list)
  4. plans to run the generated application's driver in cluster mode
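
For step 3, the service can shell out to a small wrapper script instead of assembling the full spark-submit command line inline. The following is a minimal sketch, assuming a hypothetical main class com.example.SparkSqlApp, a hypothetical jar location hdfs:///apps/spark-sql-app.jar, and the master URL used later in the answer; adapt all three to your deployment:

#!/bin/bash
# submit.sh -- sketch of a wrapper the service can invoke via ProcessBuilder.
# The class name, jar path and master URL below are assumptions.

MASTER_URL="spark://node-1:6066"          # standalone REST submission port
APP_JAR="hdfs:///apps/spark-sql-app.jar"  # hypothetical application jar
APP_CLASS="com.example.SparkSqlApp"       # hypothetical main class

# All script arguments are passed straight through to the application,
# i.e. the "understandable arguments" produced in step 2.
spark-submit --class "$APP_CLASS" \
        --master "$MASTER_URL" \
        --deploy-mode cluster \
        --supervise \
        "$APP_JAR" "$@" > output 2>&1

cat output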

Other requirements:

  • Query an application's status, for example, the percentage of work remaining
  • Kill queries accordingly

What I found in the Spark standalone documentation suggests killing an application using:

./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>

And I should find the driver ID through the standalone Master web UI at http://<master url>:8080.

So what should I do?

Related SO questions:
Spark application finished callback
Deploy Apache Spark application from another application in Java, best practice

Recommended Answer

You could use shell scripts to do this.

The deploy script:

#!/bin/bash

# Submit in cluster mode and capture the submission response, which
# contains the driver ID needed later for monitoring and killing.
spark-submit --class "xx.xx.xx" \
        --deploy-mode cluster \
        --supervise \
        --executor-memory 6G hdfs:///spark-stat.jar > output 2>&1

cat output

and you will get output like this:

16/06/23 08:37:21 INFO rest.RestSubmissionClient: Submitting a request to launch an application in spark://node-1:6066.
16/06/23 08:37:22 INFO rest.RestSubmissionClient: Submission successfully created as driver-20160623083722-0026. Polling submission state...
16/06/23 08:37:22 INFO rest.RestSubmissionClient: Submitting a request for the status of submission driver-20160623083722-0026 in spark://node-1:6066.
16/06/23 08:37:22 INFO rest.RestSubmissionClient: State of driver driver-20160623083722-0026 is now RUNNING.
16/06/23 08:37:22 INFO rest.RestSubmissionClient: Driver is running on worker worker-20160621162532-192.168.1.200-7078 at 192.168.1.200:7078.
16/06/23 08:37:22 INFO rest.RestSubmissionClient: Server responded with CreateSubmissionResponse:
{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20160623083722-0026",
  "serverSparkVersion" : "1.6.0",
  "submissionId" : "driver-20160623083722-0026",
  "success" : true
}

And based on this, create your kill driver script:

#!/bin/bash

# Extract the driver ID from the captured submission response.
driverid=$(grep submissionId output | grep -Po 'driver-\d+-\d+')

# Ask the standalone master, via its REST submission port, to kill the driver.
spark-submit --master spark://node-1:6066 --kill "$driverid"
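
The same mechanism covers the status requirement: spark-submit also accepts a --status flag for standalone cluster mode, so a companion script can poll the submission state with the same captured driver ID:

#!/bin/bash

# Reuse the driver ID captured by the deploy script.
driverid=$(grep submissionId output | grep -Po 'driver-\d+-\d+')

# Reports the driver's state, e.g. RUNNING, FINISHED, KILLED or FAILED.
spark-submit --master spark://node-1:6066 --status "$driverid"

Note that this reports the driver's state, not a progress percentage; for finer-grained progress you would have to expose it from the application itself.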

Make sure the scripts are executable with chmod +x.
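
If you would rather not shell out at all, the standalone master's REST submission server (the same port 6066 the scripts above use) can also be called over HTTP. The endpoints below are the ones RestSubmissionClient uses internally, so treat them as an assumption to verify against your Spark version:

#!/bin/bash

driverid=$(grep submissionId output | grep -Po 'driver-\d+-\d+')

# Query the submission status endpoint; returns JSON including a
# "driverState" field such as RUNNING or FINISHED.
curl "http://node-1:6066/v1/submissions/status/$driverid"

# Killing works the same way, via an HTTP POST:
# curl -X POST "http://node-1:6066/v1/submissions/kill/$driverid"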
