This article discusses an approach for a Spring Data Flow stream in which 2 sources feed a single processor/sink.

Problem description

I'm looking for some advice on setting up a Spring Data Flow stream for a specific use case.

My use case:

I have 2 RDBMS and I need to compare the results of queries run against each. The queries should be run roughly simultaneously. Based on the result of the comparison, I should be able to send an email through a custom email sink app which I have created.

I envision the stream diagram to look something like this (sorry for the paint):

The problem is that SDF does not, to my knowledge, allow a stream to be composed of 2 sources. It seems to me that something like this ought to be possible without pushing the limits of the framework too far. I'm looking for answers that provide a good approach to this scenario while working within the SDF framework.

I am using Kafka as the message broker, and the Data Flow server uses MySQL to persist stream information.

I have considered creating a custom Source app which polls two datasources and sends the messages on the output channel. This would eliminate my requirement of 2 sources, but it looks like it would require a significant amount of customization of the jdbc source application.
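The customization may be less daunting than it looks if the two polls are combined into a single payload, since the downstream processor then never has to correlate messages from separate topics. The class and method names below (`DualQuerySource`, `combine`) are purely illustrative, not part of any existing starter app; this is only a sketch of the payload-building core, assuming the row lists come from two `JdbcTemplate.queryForList(...)` calls in a real Spring Cloud Stream source:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical core of a custom source that polls two RDBMS on each trigger.
// In a real Spring Cloud Stream app, the two row lists would come from two
// JdbcTemplate.queryForList(...) calls against the two datasources, and the
// returned map would be sent as a single message on the output channel.
class DualQuerySource {

    static Map<String, Object> combine(List<Map<String, Object>> leftRows,
                                       List<Map<String, Object>> rightRows) {
        Map<String, Object> payload = new LinkedHashMap<>();
        payload.put("left", leftRows);    // result set from datasource 1
        payload.put("right", rightRows);  // result set from datasource 2
        // Naive equality comparison; a real app might compare row counts,
        // checksums, or selected columns instead.
        payload.put("match", leftRows.equals(rightRows));
        return payload;
    }
}
```

With both result sets in one message, the email sink (or an intermediate processor) only needs to inspect the `match` flag to decide whether to send the alert.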

Thanks in advance.

Recommended answer

I have not really tried this, but you should be able to use named destinations to achieve that. Take a look here: http://docs.spring.io/spring-cloud-dataflow/docs/current-SNAPSHOT/reference/htmlsingle/#spring-cloud-dataflow-stream-advanced

stream create --name jdbc1 --definition "jdbc > :dbSource"

stream create --name jdbc2 --definition "jdbc > :dbSource"

stream create --name processor --definition ":dbSource > aggregator | sink"
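Filled in with connection details, the definitions might look like the following. The database URLs, queries, and the `email-sink` name are assumptions for illustration (the sink would be the custom email app registered under whatever name was chosen); `jdbc.query` and `trigger.fixed-delay` are properties of the out-of-the-box jdbc source. Because both source streams publish to the same `dbSource` named destination, the aggregator consumes an interleaved stream of messages from both databases and can correlate and compare them:

```
stream create --name jdbc1 --definition "jdbc --spring.datasource.url=jdbc:mysql://db1:3306/app --jdbc.query='SELECT id, total FROM orders' --trigger.fixed-delay=60 > :dbSource"

stream create --name jdbc2 --definition "jdbc --spring.datasource.url=jdbc:mysql://db2:3306/app --jdbc.query='SELECT id, total FROM orders' --trigger.fixed-delay=60 > :dbSource"

stream create --name processor --definition ":dbSource > aggregator | email-sink"
```

One caveat: with messages from both sources arriving on one destination, the aggregator needs some correlation key (for example a query timestamp or batch id carried in the payload) to know which pair of results to compare.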
