Question
I have a number of pipeline/linked service/dataset JSON files and I need to upload them to my Data Factory, as opposed to creating new versions and copying the text over. What's the simplest way to do this?
Answer
If you are using version 1, you can use Visual Studio to do so, as shown here: https://azure.microsoft.com/en-us/blog/azure-data-factory-visual-studio-extension-for-authoring-pipelines/
If you are using version 2, you can do this with PowerShell. First download and install the Azure SDK for PowerShell from here: https://azure.microsoft.com/en-us/downloads/ Then, from PowerShell, log in and select your subscription:
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionName "your subs name here"
Then you can upload a pipeline JSON file with the following command:
Set-AzureRmDataFactoryV2Pipeline -DataFactoryName "your df name" -ResourceGroupName "your RG name" -Name "pipelineName" -DefinitionFile "path to json file"
Replace these with your own data factory, resource group, pipeline, and file names.
The same arguments are used to upload linked services and datasets, with the commands:
Set-AzureRmDataFactoryV2LinkedService
Set-AzureRmDataFactoryV2Dataset
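If you have many files, the three cmdlets above can be wrapped in a short loop. A minimal sketch, assuming the JSON files are sorted into linkedservice\, dataset\, and pipeline\ folders, that each file is named after its resource, and that $rg and $df hold your own names (all of these are assumptions, not part of the original answer). The uploads run in dependency order, since datasets reference linked services and pipelines reference datasets:

```powershell
# Assumed placeholders - replace with your own resource group and factory names.
$rg = "your RG name"
$df = "your df name"

# Upload in dependency order: linked services, then datasets, then pipelines.
# Assumes each file's base name (e.g. MyPipeline.json) is the resource name.
Get-ChildItem ".\linkedservice\*.json" | ForEach-Object {
    Set-AzureRmDataFactoryV2LinkedService -ResourceGroupName $rg -DataFactoryName $df `
        -Name $_.BaseName -DefinitionFile $_.FullName
}
Get-ChildItem ".\dataset\*.json" | ForEach-Object {
    Set-AzureRmDataFactoryV2Dataset -ResourceGroupName $rg -DataFactoryName $df `
        -Name $_.BaseName -DefinitionFile $_.FullName
}
Get-ChildItem ".\pipeline\*.json" | ForEach-Object {
    Set-AzureRmDataFactoryV2Pipeline -ResourceGroupName $rg -DataFactoryName $df `
        -Name $_.BaseName -DefinitionFile $_.FullName
}
```

The folder layout is just one convention; the important part is ordering the calls so that each resource's dependencies already exist when it is uploaded.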
Hope this helps!