Problem Description
I need to access Azure Files from Azure Databricks. According to the documentation, Azure Blobs are supported, but I need this code to work with Azure Files:
dbutils.fs.mount(
  source = "wasbs://<your-container-name>@<your-storage-account-name>.file.core.windows.net",
  mount_point = "/mnt/<mount-name>",
  extra_configs = {"<conf-key>": dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")})
Or is there another way to mount/access Azure Files to/from an Azure Databricks cluster? Thanks.
Recommended Answer
On Azure, you can generally mount an Azure Files file share on Linux via the SMB protocol. I tried to follow the official tutorial Use Azure Files with Linux by creating a Python notebook to run the commands below, but it failed.
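For reference, the tutorial's approach boils down to a CIFS mount, which from a Databricks Python notebook would be attempted in a %sh shell cell roughly like this (a sketch only; the storage account, share name, and key are placeholders, and this is the step that fails on Databricks):

%sh
# Sketch of the CIFS/SMB mount from the "Use Azure Files with Linux" tutorial,
# run from a Databricks notebook shell cell. All <...> values are placeholders.
sudo mkdir -p /mnt/azurefiles
sudo mount -t cifs //<your-storage-account-name>.file.core.windows.net/<your-share-name> /mnt/azurefiles \
  -o vers=3.0,username=<your-storage-account-name>,password=<your-storage-account-key>,dir_mode=0777,file_mode=0777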
It seems that Azure Databricks does not allow this. I even searched the Databricks community for mount NFS, SMB, Samba, and so on, and found no discussion of it.
So the only way to access files in Azure Files is to install the azure-storage package and use the Azure Files SDK for Python directly on Azure Databricks.
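For example, after installing the azure-storage package as a library on the cluster, a minimal sketch using the legacy SDK's File service looks like this (account name, key, share name, and file name are placeholders):

# Minimal sketch using the legacy azure-storage SDK's File service.
# <account-name>, <account-key>, <share-name>, and example.txt are placeholders.
from azure.storage.file import FileService

file_service = FileService(account_name='<account-name>', account_key='<account-key>')

# List the contents of the share's root directory.
for item in file_service.list_directories_and_files('<share-name>'):
    print(item.name)

# Download one file from the share root (directory_name=None) to the driver's local disk.
file_service.get_file_to_path('<share-name>', None, 'example.txt', '/tmp/example.txt')

Note that this downloads files to the driver's local filesystem rather than mounting the share into DBFS, so it works within Databricks' restrictions but does not give you a /mnt/ path shared across the cluster.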