I'm trying to download a 1.1 GB file with httr, but I'm hitting the following error:

x <- GET( extract.path )
Error in curlPerform(curl = handle$handle, .opts = curl_opts$values) :
  cannot allocate more space: 1728053248 bytes

My C: drive has 400 GB of free space.

In the RCurl package, I see the maxfilesize and maxfilesize.large options listed by getCurlOptionsConstants(), but I don't know whether/how they can be passed through to httr via config or set_config, or whether I need to switch to RCurl for this. And even if I do have to switch, will raising the maximum file size actually fix this?
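In case it helps clarify what I mean, this is roughly the attempt I had in mind (untested sketch only -- whether config()/set_config() actually forward these RCurl-style option names is exactly the part I'm unsure about):

require(httr)
require(RCurl)

# confirm the option names really are in RCurl's constant table
curl.opts <- getCurlOptionsConstants()
grep( "maxfilesize" , names( curl.opts ) , value = TRUE )

# the attempt: raise the curl maximum file size to ~2 GB for this one request
x <- GET( extract.path , config( maxfilesize.large = 2e9 ) )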

Here's my sessionInfo.
> sessionInfo()
R version 3.0.0 (2013-04-03)
Platform: i386-w64-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_United States.1252  LC_CTYPE=English_United States.1252    LC_MONETARY=English_United States.1252 LC_NUMERIC=C                           LC_TIME=English_United States.1252

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base

other attached packages:
[1] XML_3.96-1.1 httr_0.2

loaded via a namespace (and not attached):
[1] digest_0.6.0   RCurl_1.95-4.1 stringr_0.6.2  tools_3.0.0

..and (not recommended, since it takes a while) if you want to reproduce my error, you can go to https://usa.ipums.org/usa-action/samples, register a new account, select the 2011 5-year ACS extract, add about a hundred variables, and then wait for the extract to be ready. Then edit the first three lines and run the code below. (Again, not recommended.)
your.email <- "[email protected]"
your.password <- "password"
extract.path <- "https://usa.ipums.org/usa-action/downloads/extract_files/some_file.csv.gz"

require(httr)

values <-
    list(
        "login[email]" = your.email ,
        "login[password]" = your.password ,
        "login[is_for_login]" = 1
    )

POST( "https://usa.ipums.org/usa-action/users/validate_login" , body = values )
GET( "https://usa.ipums.org/usa-action/extract_requests/download" , query = values )

# this line breaks
x <- GET( extract.path )

Best answer

FYI - this has since been added to httr as the write_disk() control:
https://github.com/hadley/httr/blob/master/man/write_disk.Rd
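A minimal usage sketch (the output file name below is just a placeholder):

require(httr)

# stream the response body straight to disk instead of buffering it in RAM
x <- GET( extract.path , write_disk( "ipums_extract.csv.gz" , overwrite = TRUE ) )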
