This article describes a way to work around exceeding the memory limit in R (even with 24 GB of RAM) when merging large data frames, and may be a useful reference for anyone hitting the same problem.

Problem description

I am trying to merge two data frames: one has 908450 observations of 33 variables, and the other has 908450 observations of 2 variables.

dataframe2 <- merge(dataframe1, dataframe2, by = "id")

I have cleared all other data frames from working memory and reset my memory limit (on a brand new desktop with 24 GB of RAM) using:

memory.limit(24576)

But I am still getting the error "cannot allocate vector of size 173 Mb". Any thoughts on how to get around this problem?

Solution

To follow up on my comments, use data.table. I put together a quick example matching your data to illustrate:

library(data.table)

dt1 <- data.table(id = 1:908450, matrix(rnorm(908450*32), ncol = 32))
dt2 <- data.table(id = 1:908450, rnorm(908450))

#set keys
setkey(dt1, id)
setkey(dt2, id)

#check dims
> dim(dt1)
[1] 908450     33
> dim(dt2)
[1] 908450      2

#merge together and check system time:
> system.time(dt3 <- dt1[dt2])
   user  system elapsed
   0.43    0.03    0.47

So it took less than half a second to merge them. I took before and after screenshots while watching my memory usage: before the merge I was using 3.4 GB of RAM, and when I merged the tables it jumped to 3.7 GB and leveled off. I think you will be hard pressed to find anything more memory- or time-efficient than that.

Before: (screenshot of memory usage before the merge)
After: (screenshot of memory usage after the merge)
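As a side note beyond the original answer: recent versions of data.table (1.9.6 and later) also support an ad-hoc join via the on= argument, so the keys do not have to be set beforehand. Below is a minimal sketch under that assumption, with synthetic data of the same shape as in the question (the column name "value" is illustrative, not from the original post):

library(data.table)

# Synthetic tables matching the shapes in the question: 33 and 2 columns.
n <- 908450
dt1 <- data.table(id = 1:n, matrix(rnorm(n * 32), ncol = 32))
dt2 <- data.table(id = 1:n, value = rnorm(n))

# Ad-hoc join on "id" without calling setkey() first: for each row of dt2,
# look up the matching row of dt1 and append dt2's extra column. This avoids
# the large intermediate copies that base::merge() tends to make.
dt3 <- dt1[dt2, on = "id"]

dim(dt3)   # 908450 rows, 34 columns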