Problem Description
I have a hosted web service that allows pulling records in batches. It takes a starting record number (ROWID) and a page size (max 800) as parameters. There could be 50-60k records to pull from this service, which then need to be posted in chunks to another web service, without persisting the data in between.
How could I use Spring Batch to pull the records page by page (chunking) by calling the web service, and how do I post the same records to another web service?
I was able to do this using Spring Integration, but for a large data set I am not sure Spring Integration is the ideal approach when we have Spring Batch for processing large data sets.
Accepted Answer
Spring Batch doesn't have a web service ItemReader per se. That being said, if you create a custom ItemReader that extends AbstractPagingItemReader, the paging logic itself should be taken care of for you (you implement the doReadPage() method that handles getting a page of data; the superclass handles keeping track of which page you're on, etc.).
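A minimal sketch of such a reader, assuming a hypothetical RecordDto payload class, a RestTemplate client, and an illustrative URL template (the endpoint, parameter names, and DTO all stand in for whatever the real service's contract is):

```java
import java.util.Arrays;
import java.util.concurrent.CopyOnWriteArrayList;

import org.springframework.batch.item.database.AbstractPagingItemReader;
import org.springframework.web.client.RestTemplate;

// Sketch of a paging reader over a remote service. The superclass tracks the
// current page; this class only fetches one page per doReadPage() call.
public class WebServicePagingItemReader extends AbstractPagingItemReader<RecordDto> {

    private final RestTemplate restTemplate = new RestTemplate();

    // Illustrative template, e.g. "http://host/records?start={start}&pageSize={size}"
    private final String serviceUrl;

    public WebServicePagingItemReader(String serviceUrl) {
        this.serviceUrl = serviceUrl;
        setName("webServicePagingItemReader"); // required for restartability
        setPageSize(800);                      // the service's documented maximum
    }

    @Override
    protected void doReadPage() {
        if (results == null) {
            results = new CopyOnWriteArrayList<>();
        } else {
            results.clear();
        }
        // getPage() starts at 0 and is advanced by the superclass.
        int startRow = getPage() * getPageSize();
        RecordDto[] page = restTemplate.getForObject(
                serviceUrl, RecordDto[].class, startRow, getPageSize());
        if (page != null) {
            results.addAll(Arrays.asList(page));
        }
        // An empty page tells the framework the input is exhausted.
    }

    @Override
    protected void doJumpToPage(int itemIndex) {
        // Required by older Spring Batch versions; no state to restore
        // beyond the page number the superclass already tracks.
    }
}
```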
For the ItemWriter side of things, if you have a client you want called, you can use the ItemWriterAdapter. This will call a method on a Java object, passing it each item within the list passed to the ItemWriter#write(List items) method. Otherwise, you'll need to write your own.
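A sketch of the adapter configuration, where RecordPostingClient and its post(RecordDto) method are assumed stand-ins for whatever client calls the second web service:

```java
import org.springframework.batch.item.adapter.ItemWriterAdapter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class WriterConfiguration {

    // The adapter invokes the target method once per item in each chunk,
    // so RecordPostingClient.post(RecordDto) runs for every record.
    @Bean
    public ItemWriterAdapter<RecordDto> recordItemWriter(RecordPostingClient client) {
        ItemWriterAdapter<RecordDto> writer = new ItemWriterAdapter<>();
        writer.setTargetObject(client);
        writer.setTargetMethod("post");
        return writer;
    }
}
```

Note that the adapter posts items one at a time; if the downstream service accepts a whole chunk per call, a small custom ItemWriter that posts the full list is the alternative mentioned above.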
In either case, the custom code you'll need should be minimal.
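For reference, a chunk-oriented step wiring the two together might look like this (a sketch in the Spring Batch 4 StepBuilderFactory style, reusing the hypothetical classes above):

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.adapter.ItemWriterAdapter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JobConfiguration {

    // One chunk = one page: each transaction reads up to 800 records from
    // the first service and posts them, so nothing is persisted in between.
    @Bean
    public Step pullAndPostStep(StepBuilderFactory steps,
                                WebServicePagingItemReader reader,
                                ItemWriterAdapter<RecordDto> writer) {
        return steps.get("pullAndPostStep")
                .<RecordDto, RecordDto>chunk(800)
                .reader(reader)
                .writer(writer)
                .build();
    }
}
```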