This article explains how to run a Spark application as a daemon. It should be a useful reference for anyone facing the same problem; follow along with the write-up below.

Problem Description


I have a basic question about running a Spark application.

I have a Java client that sends me requests to query data residing in HDFS.

The requests come in as a REST API over HTTP; I need to interpret each request, form Spark SQL queries, and return the response to the client.

What I cannot figure out is how to make my Spark application a daemon that waits for requests and can execute the queries using a pre-instantiated SQL context.

Recommended Answer

You can have a thread that runs in an infinite loop and does the computation with Spark.

while (true) {
  val request = incomingQueue.take()   // blocks until a request arrives
  // Process the request with Spark
  val result = ...
  outgoingQueue.put(result)
}
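
For concreteness, here is a minimal, self-contained sketch of that worker loop. The queue names, the HDFS path, the temp-view name, and the string result type are assumptions for illustration, not part of the original answer; the key point is that the SparkSession (the successor to SQLContext) is created once, before the loop, and reused for every query.

import java.util.concurrent.LinkedBlockingQueue
import org.apache.spark.sql.SparkSession

object QueryDaemon {
  // Hypothetical queues: SQL text in, serialized rows out.
  val incomingQueue = new LinkedBlockingQueue[String]()
  val outgoingQueue = new LinkedBlockingQueue[Array[String]]()

  def main(args: Array[String]): Unit = {
    // Instantiated once; the daemon reuses it for every request.
    val spark = SparkSession.builder().appName("QueryDaemon").getOrCreate()

    // Register the HDFS data once so later queries can reference it by name
    // (hypothetical path and view name).
    spark.read.parquet("hdfs:///data/events").createOrReplaceTempView("events")

    while (true) {
      val sql = incomingQueue.take()   // blocks until a request arrives
      val result = spark.sql(sql).collect().map(_.mkString(","))
      outgoingQueue.put(result)
    }
  }
}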

Then, in the thread that handles the REST request, you put the request on incomingQueue and wait for the result from outgoingQueue.

// Create the request from the REST call
val request = ...
incomingQueue.put(request)
val result = outgoingQueue.take()   // blocks until the worker publishes a result
result
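
One caveat worth adding: with a single shared outgoingQueue, two REST threads that submit queries concurrently could pick up each other's results. A common variant, sketched below with the same hypothetical names, attaches a per-request reply queue to each request so every caller waits on its own result.

import java.util.concurrent.{LinkedBlockingQueue, SynchronousQueue}

object RestBridge {
  // Each request carries its own reply queue, so results cannot get crossed.
  case class QueryRequest(sql: String, reply: SynchronousQueue[Array[String]])

  val incomingQueue = new LinkedBlockingQueue[QueryRequest]()

  // Called from the REST handler thread.
  def submitQuery(sql: String): Array[String] = {
    val reply = new SynchronousQueue[Array[String]]()
    incomingQueue.put(QueryRequest(sql, reply))
    reply.take()   // blocks until the worker replies
  }
}

On the worker side, the loop would then call request.reply.put(result) instead of writing to the shared outgoingQueue.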

That concludes this article on how to run a Spark application as a daemon. We hope the recommended answer helps, and thank you for your support!
