This article describes how to deal with the "JavaScript heap out of memory" error in Node.js. It should be a useful reference for anyone running into the same problem.

Problem description

Today I ran my filesystem indexing script to refresh the RAID file index, and after 4 hours it crashed with the following error:

[md5:]  241613/241627 97.5%
[md5:]  241614/241627 97.5%
[md5:]  241625/241627 98.1%
Creating missing list... (79570 files missing)
Creating new files list... (241627 new files)

<--- Last few GCs --->

11629672 ms: Mark-sweep 1174.6 (1426.5) -> 1172.4 (1418.3) MB, 659.9 / 0 ms [allocation failure] [GC in old space requested].
11630371 ms: Mark-sweep 1172.4 (1418.3) -> 1172.4 (1411.3) MB, 698.9 / 0 ms [allocation failure] [GC in old space requested].
11631105 ms: Mark-sweep 1172.4 (1411.3) -> 1172.4 (1389.3) MB, 733.5 / 0 ms [last resort gc].
11631778 ms: Mark-sweep 1172.4 (1389.3) -> 1172.4 (1368.3) MB, 673.6 / 0 ms [last resort gc].


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x3d1d329c9e59 <JS Object>
1: SparseJoinWithSeparatorJS(aka SparseJoinWithSeparatorJS) [native array.js:~84] [pc=0x3629ef689ad0] (this=0x3d1d32904189 <undefined>,w=0x2b690ce91071 <JS Array[241627]>,L=241627,M=0x3d1d329b4a11 <JS Function ConvertToString (SharedFunctionInfo 0x3d1d3294ef79)>,N=0x7c953bf4d49 <String[4]\: ,\n  >)
2: Join(aka Join) [native array.js:143] [pc=0x3629ef616696] (this=0x3d1d32904189 <undefin...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: node::Abort() [/usr/bin/node]
 2: 0xe2c5fc [/usr/bin/node]
 3: v8::Utils::ReportApiFailure(char const*, char const*) [/usr/bin/node]
 4: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [/usr/bin/node]
 5: v8::internal::Factory::NewRawTwoByteString(int, v8::internal::PretenureFlag) [/usr/bin/node]
 6: v8::internal::Runtime_SparseJoinWithSeparator(int, v8::internal::Object**, v8::internal::Isolate*) [/usr/bin/node]
 7: 0x3629ef50961b

The server is equipped with 16 GB of RAM and 24 GB of SSD swap. I highly doubt my script exceeded 36 GB of memory; at least it shouldn't have.
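Note that the GC trace above already hints at the real ceiling: the heap dies at around 1.4 GB, far below the available system memory. As a minimal sketch (not part of the original script), heap usage could be logged periodically while the indexer runs so that this ceiling becomes visible before the crash:

 // Hypothetical monitoring helper, not taken from the pastebin script:
 // print V8 heap usage every 30 seconds.
 const timer = setInterval(() => {
   const { heapUsed, heapTotal, rss } = process.memoryUsage();
   console.error(`[mem] heapUsed=${(heapUsed / 1048576).toFixed(1)} MB ` +
                 `heapTotal=${(heapTotal / 1048576).toFixed(1)} MB ` +
                 `rss=${(rss / 1048576).toFixed(1)} MB`);
 }, 30000);
 timer.unref(); // don't keep the process alive just for monitoring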

The script creates an index of files, stored as an array of objects containing file metadata (modification dates, permissions, etc.; no big data).
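For illustration only, one such index entry might look roughly like the sketch below (the field names and the makeIndexEntry helper are assumptions, not code from the pastebin script). Each entry is tiny, but 241,627 of them, plus the strings built from them, add up:

 const fs = require('fs');

 // Illustrative shape of a single index entry.
 function makeIndexEntry(filePath) {
   const stat = fs.statSync(filePath);
   return {
     path: filePath,
     size: stat.size,
     mtime: stat.mtime.getTime(), // modification date as a timestamp
     mode: stat.mode              // permissions
   };
 }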

Here's the full script code: http://pastebin.com/mjaD76c3

I've already experienced weird Node issues with this script in the past, which forced me, for example, to split the index into multiple files, because Node was glitching when working on such big files as a String. Is there any way to improve Node.js memory management with huge datasets?
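As a concrete illustration of that "big String" problem: the stack trace above fails inside Array join (SparseJoinWithSeparator) on the 241,627-element array, i.e. while concatenating everything into one giant string. A hedged sketch of a workaround, with made-up function and file names rather than the original code, is to stream entries to disk one by one so that no single huge string is ever materialized:

 const fs = require('fs');

 // Sketch only: write one JSON line per entry instead of joining
 // the whole array into a single string (backpressure handling omitted).
 function writeIndex(entries, outPath) {
   const out = fs.createWriteStream(outPath);
   for (const entry of entries) {
     out.write(JSON.stringify(entry) + '\n');
   }
   out.end();
 }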

Recommended answer

If I remember correctly, there is a strict standard limit on memory usage in V8 of around 1.7 GB if you do not increase it manually.

In one of our products we used this solution in our deploy script:

 node --max-old-space-size=4096 yourFile.js
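To confirm the flag actually took effect, a quick sanity check (assuming your Node version exposes the v8 core module's getHeapStatistics()) is to print V8's configured heap limit at startup; with --max-old-space-size=4096 the reported heap_size_limit should be in the region of 4 GB:

 // Run inside the process started with the flag above.
 const v8 = require('v8');
 const limitMb = v8.getHeapStatistics().heap_size_limit / 1048576;
 console.log('V8 heap size limit: ' + limitMb.toFixed(0) + ' MB');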

There is also a flag for the new space, but as I read in a-tour-of-v8-garbage-collection, the new space only collects newly created, short-lived data, while the old space contains all long-referenced data structures, so in your case increasing the old space should be the right choice.

That concludes this article on the Node.js heap out of memory error. We hope the recommended answer helps, and thank you for your continued support!
