
Problem Description

I have a data factory with multiple pipelines, and each pipeline has around 20 copy activities that copy Azure tables between two storage accounts.

Each pipeline handles a snapshot of each Azure table, so I want to run the pipelines sequentially to avoid the risk of overwriting the latest data with old data.

I know that by giving the first pipeline's output as input to the second pipeline I can achieve this. But since I have many activities in a pipeline, I am not sure which activity will complete last.

Is there any way I can know that a pipeline has completed, or any way that one pipeline's completed status triggers the next pipeline?

In an activity, inputs is an array. So is it possible to give multiple inputs? If yes, will all inputs run asynchronously or one after the other?

In the context of multiple inputs I have read about scheduling dependency. So can an external input act as a scheduling dependency, or only an internal dataset?
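For background on what the inputs array does: in Data Factory v1, scheduling is dataset-driven, and an activity runs only once every dataset in its inputs array has a ready slice, so multiple inputs are waited on together (an AND of availability) rather than run one after the other; an external dataset (one marked "external": true) can serve as a scheduling dependency just like an internally produced one. A rough sketch of a v1 copy activity waiting on two inputs, with hypothetical dataset names:

```json
{
  "name": "CopyTableX",
  "type": "Copy",
  "inputs": [
    { "name": "TableXSnapshotFromPipeline1" },
    { "name": "ExternalReadySignal" }
  ],
  "outputs": [
    { "name": "TableXDestination" }
  ],
  "typeProperties": {
    "source": { "type": "AzureTableSource" },
    "sink": { "type": "AzureTableSink" }
  }
}
```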

Accepted Answer

This is an old one, but I was still having this issue with Data Factory v2, so in case anyone comes here looking for a solution on Data Factory v2: the "Wait on completion" tick box is hidden under the 'Advanced' part of the Settings tab for the Execute Pipeline activity. Just check it to get the desired result.

Note that the 'Advanced' section on the Settings tab is not the same as the 'Advanced' free-coding tab. See screenshot:
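For reference, the same tick box can be set in the pipeline JSON rather than through the designer: it corresponds to the waitOnCompletion property of the Execute Pipeline activity. A sketch, with hypothetical activity and pipeline names, of a parent pipeline that runs two child pipelines strictly one after the other:

```json
{
  "name": "RunSnapshotPipeline2",
  "type": "ExecutePipeline",
  "typeProperties": {
    "pipeline": {
      "referenceName": "SnapshotPipeline2",
      "type": "PipelineReference"
    },
    "waitOnCompletion": true
  },
  "dependsOn": [
    {
      "activity": "RunSnapshotPipeline1",
      "dependencyConditions": [ "Succeeded" ]
    }
  ]
}
```

With waitOnCompletion set to true, the parent activity only reports success once the child pipeline has finished, so the dependsOn chain guarantees sequential execution.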

