Handling the 30Mb limit when uploading to Azure Data Lake with DataLakeStoreFileSystemManagementClient

This article describes how to handle the 30Mb limit when uploading to Azure Data Lake with DataLakeStoreFileSystemManagementClient; we hope it serves as a useful reference for anyone facing the same problem.

Problem description

I am using

_adlsFileSystemClient.FileSystem.Create(_adlsAccountName, destFilePath, stream, overwrite)

to upload files to a datalake. The error comes up with files over 30Mb. It works fine with smaller files.
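For context, here is a minimal sketch of how such an upload is typically wired up, assuming the Microsoft.Azure.Management.DataLake.Store SDK and an already authenticated client as in the question; the local and destination paths are illustrative:

using System.IO;
using Microsoft.Azure.Management.DataLake.Store;

// _adlsFileSystemClient: an authenticated DataLakeStoreFileSystemManagementClient
// _adlsAccountName: the Data Lake Store account name (both as in the question)
string destFilePath = "/data/myfile.csv";                  // hypothetical destination path
using (var stream = File.OpenRead(@"C:\data\myfile.csv"))  // hypothetical local file
{
    // Works for small files; fails once the stream exceeds ~30,000,000 bytes.
    _adlsFileSystemClient.FileSystem.Create(_adlsAccountName, destFilePath, stream, overwrite: true);
}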

The error is:

Has anybody else encountered this? Or observed similar behaviour? I am getting around this by splitting my files into 30Mb pieces and uploading them.

However, this is impractical in the long term because the original file is 380Mb and potentially quite a bit larger. I do not want to have 10-15 dissected files in my datalake; I would like to upload it as a single file.

I am able to upload the exact same file to the datalake through the portal interface.

Recommended answer

It is answered here.

Currently there is a size limit of 30,000,000 bytes. You can work around this by creating an initial file and then appending to it, with each stream kept under the limit.
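A minimal sketch of that create-then-append workaround, again assuming the Microsoft.Azure.Management.DataLake.Store SDK and an authenticated client as in the question; the method name and chunk size are illustrative:

using System.IO;
using Microsoft.Azure.Management.DataLake.Store;

static void UploadAsSingleFile(DataLakeStoreFileSystemManagementClient client,
                               string accountName, string localPath, string destFilePath)
{
    // Hypothetical chunk size, kept safely under the 30,000,000-byte limit.
    const int ChunkSize = 25 * 1024 * 1024;
    var buffer = new byte[ChunkSize];

    using (var source = File.OpenRead(localPath))
    {
        bool first = true;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            using (var chunk = new MemoryStream(buffer, 0, read))
            {
                if (first)
                {
                    // First chunk: create the destination file (overwriting any existing one).
                    client.FileSystem.Create(accountName, destFilePath, chunk, overwrite: true);
                    first = false;
                }
                else
                {
                    // Later chunks: append to the same file, preserving order.
                    client.FileSystem.Append(accountName, destFilePath, chunk);
                }
            }
        }
    }
}

Because each chunk is appended in order, the result is a single file in the data lake rather than 10-15 separate pieces.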

This concludes this article on the 30Mb limit when uploading to Azure Data Lake with DataLakeStoreFileSystemManagementClient; we hope the recommended answer helps.
