Question
I am building a Spark Structured Streaming application where I am doing a batch-stream join, and the source for the batch data gets updated periodically.
So I am planning to persist/unpersist that batch data periodically.
Below is the sample code I am using to persist and unpersist the batch data.
Flow:
- Read the batch data
- Persist the batch data
- Every hour, unpersist the data, read the batch data again, and persist it
However, I am not seeing the batch data getting refreshed every hour.
Code:
import java.time.{Duration, Instant}
import org.apache.spark.storage.StorageLevel

// Initial load of the batch data
var batchDF = handler.readBatchDF(sparkSession)
batchDF.persist(StorageLevel.MEMORY_AND_DISK)
var refreshedTime: Instant = Instant.now()

// refreshTime is the refresh threshold in seconds, defined elsewhere
if (Duration.between(refreshedTime, Instant.now()).getSeconds > refreshTime) {
  refreshedTime = Instant.now()
  batchDF.unpersist(false)
  batchDF = handler.readBatchDF(sparkSession)
    .persist(StorageLevel.MEMORY_AND_DISK)
}
Is there a better way to achieve this scenario in Spark Structured Streaming jobs?
Answer
You could do this by making use of the streaming scheduling capabilities that Structured Streaming provides. Note that plain driver-side code like the if block above executes only once when the application starts, which is most likely why the batch data never gets refreshed.

You can trigger the refreshing (unpersist -> load -> persist) of the static Dataframe by creating an artificial "Rate" stream that fires periodically. The idea is to:
- Load the static Dataframe initially and keep it as a var
- Define a method that refreshes the static Dataframe
- Use a "Rate" stream that gets triggered at the required interval (e.g. 1 hour)
- Read the actual streaming data and perform the join operation with the static Dataframe
- Within that Rate stream, have a foreachBatch sink that calls the refresher method created in step 2
The following code runs fine with Spark 3.0.1, Scala 2.12.10 and Delta 0.7.0:
import java.util.Calendar
import org.apache.spark.sql.{Dataset, SaveMode}
import org.apache.spark.sql.streaming.Trigger

// 1. Load the static Dataframe initially and keep it as a `var`
// (deltaPath is defined in the full example at the end)
var staticDf = spark.read.format("delta").load(deltaPath)
staticDf.persist()

// 2. Define a method that refreshes the static Dataframe
def foreachBatchMethod[T](batchDf: Dataset[T], batchId: Long) = {
  staticDf.unpersist()
  staticDf = spark.read.format("delta").load(deltaPath)
  staticDf.persist()
  println(s"${Calendar.getInstance().getTime}: Refreshing static Dataframe from DeltaLake")
}
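The order inside the refresher matters: unpersisting first drops the stale cached blocks, so the freshly loaded Dataframe repopulates the cache on its next use. unpersist() is non-blocking by default; if the old cache must be fully evicted before reloading, staticDf.unpersist(blocking = true) waits until all blocks are removed.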
// 3. Use a "Rate" stream that gets triggered at the required interval (e.g. 1 hour)
val staticRefreshStream = spark.readStream
  .format("rate")
  .option("rowsPerSecond", 1)
  .option("numPartitions", 1)
  .load()
  .selectExpr("CAST(value as LONG) as trigger")
  .as[Long]
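Note that the rate source only acts as a heartbeat here; rowsPerSecond does not control the refresh cadence. The refresh frequency is determined by the processing-time trigger set on this stream's writeStream in step 5, since foreachBatch runs once per micro-batch.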
// 4. Read the actual streaming data and perform the join operation with the static Dataframe
// As an example, Kafka is used as the streaming source
val streamingDf = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "test")
  .option("startingOffsets", "earliest")
  .option("failOnDataLoss", "false")
  .load()
  .selectExpr("CAST(value AS STRING) as id", "offset as streamingField")

val joinDf = streamingDf.join(staticDf, "id")
val query = joinDf.writeStream
  .format("console")
  .option("truncate", false)
  .option("checkpointLocation", "/path/to/sparkCheckpoint")
  .start()
// 5. Within that Rate stream have a `foreachBatch` sink that calls the refresher method
staticRefreshStream.writeStream
  .outputMode("append")
  .foreachBatch(foreachBatchMethod[Long] _)
  .queryName("RefreshStream")
  .trigger(Trigger.ProcessingTime("5 seconds")) // or e.g. 1 hour
  .start()
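Since this starts two independent streaming queries (the join query and the refresh stream), the driver has to be kept alive. One common way to do that, not shown in the original snippet, is to block on the StreamingQueryManager:

// Block until any of the running queries terminates (or fails)
spark.streams.awaitAnyTermination()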
To make the example complete, the delta table was created and updated with new values as shown below:
val deltaPath = "file:///tmp/delta/table"

import spark.implicits._
val df = Seq(
  (1L, "static1"),
  (2L, "static2")
).toDF("id", "deltaField")

df.write
  .mode(SaveMode.Overwrite)
  .format("delta")
  .save(deltaPath)
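The update step itself is not shown above. A minimal sketch of such an update (the row values here are made up for illustration) is to overwrite the table with changed data while both queries are running:

// Hypothetical update: overwrite the table with changed rows so the
// next refresh cycle of the static Dataframe picks them up
val updatedDf = Seq(
  (1L, "static1_updated"),
  (3L, "static3")
).toDF("id", "deltaField")

updatedDf.write
  .mode(SaveMode.Overwrite)
  .format("delta")
  .save(deltaPath)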