This article describes how to upload a large file (~100 MB) to Azure Blob Storage using the Python SDK; it may serve as a useful reference for anyone facing the same problem.

Problem Description

I am using the latest Azure Storage SDK (azure-storage-blob 12.7.1). It works for smaller files, but for larger files (>30 MB) it throws an exception.

from azure.storage.blob import ContainerClient

def upload(file):
    settings = read_settings()
    connection_string = settings['connection_string']
    container_client = ContainerClient.from_connection_string(connection_string, 'backup')
    blob_client = container_client.get_blob_client(file)
    with open(file, "rb") as data:
        blob_client.upload_blob(data)
        print(f'{file} uploaded to blob storage')

upload('crashes.csv')

Recommended Answer

Your code seemed to work fine when I tried uploading a ~180 MB .txt file. However, since uploading small files works for you, splitting the large file into smaller parts and uploading them one at a time may be a workaround. Try the following code:

from azure.storage.blob import BlobClient

storage_connection_string = ''
container_name = ''
dest_file_name = ''

local_file_path = ''

blob_client = BlobClient.from_connection_string(
    storage_connection_string, container_name, dest_file_name)

# Upload 4 MB per request
chunk_size = 4 * 1024 * 1024

# Recreate the append blob so each run starts from an empty blob.
# Note: exists() must be called with parentheses; testing the bound
# method itself (blob_client.exists) is always truthy.
if blob_client.exists():
    blob_client.delete_blob()
blob_client.create_append_blob()

with open(local_file_path, "rb") as stream:
    while True:
        read_data = stream.read(chunk_size)
        if not read_data:
            print('uploaded')
            break
        blob_client.append_block(read_data)
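The read-in-a-loop pattern above can be factored into a small reusable generator. This is a stdlib-only sketch (the `iter_chunks` helper and its 4 MB default are illustrative, not part of the Azure SDK):

```python
import io

def iter_chunks(stream, chunk_size=4 * 1024 * 1024):
    """Yield successive chunks of at most chunk_size bytes from a
    binary stream, stopping at end-of-file."""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            return
        yield chunk

# Each yielded chunk would be passed to blob_client.append_block(...).
data = io.BytesIO(b"x" * 10)
chunks = list(iter_chunks(data, chunk_size=4))
print([len(c) for c in chunks])  # → [4, 4, 2]
```

With this helper the upload loop reduces to `for chunk in iter_chunks(stream): blob_client.append_block(chunk)`. Also worth noting: for block blobs, `upload_blob` already chunks large uploads internally; in azure-storage-blob 12.x the chunk size can be tuned with the `max_block_size` keyword when constructing the client, and parallelism with the `max_concurrency` parameter of `upload_blob`, which may make a manual append loop unnecessary.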



08-18 22:46