Question
I've been looking around for ways to upload a large file with additional data, but there doesn't seem to be any solution. To upload a file, I've been using this code, and it's been working fine with small files:
import requests

session = requests.Session()

with open("my_file.csv", "rb") as f:
    files = {"documents": ("my_file.csv", f, "application/octet-stream")}
    data = {"composite": "NONE"}
    headers = {"Prefer": "respond-async"}
    resp = session.post("my/url", headers=headers, data=data, files=files)
The problem is that this code loads the whole file into memory before sending it, so I run into a MemoryError when uploading large files. I've looked around, and the way to stream data is to set
resp = session.post("my/url", headers=headers, data=f)
but I also need to add {"composite": "NONE"} to the data; without it, the server doesn't recognize the file.
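The raw-streaming idea above can be sketched with a generator: requests will send a generator body using chunked transfer encoding, but it is a bare request body with no multipart fields, which is exactly why the composite field has nowhere to go (a minimal illustration, not part of the original question):

```python
def file_chunks(path, chunk_size=64 * 1024):
    """Yield a file in fixed-size chunks so requests can stream the
    body without ever holding the whole file in memory."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return
            yield chunk

# Passing a generator as `data=` streams a *raw* chunked body:
#   resp = session.post("my/url", headers=headers,
#                       data=file_chunks("my_file.csv"))
# There is no form encoding here, so {"composite": "NONE"} cannot
# ride along -- hence the question.
```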
Accepted Answer
You can use requests-toolbelt to do this:
import requests
from requests_toolbelt.multipart import encoder

session = requests.Session()
with open("my_file.csv", "rb") as f:
    form = encoder.MultipartEncoder({
        "documents": ("my_file.csv", f, "application/octet-stream"),
        "composite": "NONE",
    })
    headers = {"Prefer": "respond-async", "Content-Type": form.content_type}
    resp = session.post("my/url", headers=headers, data=form)
session.close()
This will cause requests to stream the multipart/form-data upload for you.