This article looks at how to write a large file with Node.js using streams; the question and answer below should be a useful reference for anyone running into the same problem.

Problem description

I'm writing a large file with Node.js using a writable stream:

var fs     = require('fs');
var stream = fs.createWriteStream('someFile.txt', { flags : 'w' });

var lines;
while (lines = getLines()) {
    for (var i = 0; i < lines.length; i++) {
        stream.write( lines[i] );
    }
}

I'm wondering whether this scheme is safe without using the drain event? If it is not (which I think is the case), what is the pattern for writing arbitrarily large data to a file?
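
For reference, the usual backpressure pattern is to check the boolean returned by write() and, once it returns false, stop writing until the stream emits 'drain'. The sketch below is only an illustration built around the asker's getLines() placeholder (assumed to return the next batch of lines, or a falsy value when nothing is left); it is not code from the original post:

var fs     = require('fs');
var stream = fs.createWriteStream('someFile.txt', { flags : 'w' });

var lines = [];   // current batch from getLines()
var i = 0;        // index of the next line to write within that batch

function write() {
    var ok = true;
    while (ok) {
        if (i >= lines.length) {
            // current batch is exhausted: fetch the next one
            lines = getLines();
            i = 0;
            if (!lines) {
                stream.end();             // no more data: finish the file
                return;
            }
        }
        // write() returns false once the internal buffer is full
        ok = stream.write(lines[i++]);
    }
    // buffer is full: resume writing once it has drained
    stream.once('drain', write);
}

write();
stream.on('close', function () {
    console.log('All done!');
});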


Solution

That's how I finally did it. The idea behind it is to create a readable stream that implements the ReadStream interface, and then use the pipe() method to pipe its data into the writable stream:

var fs = require('fs');
var writeStream = fs.createWriteStream('someFile.txt', { flags : 'w' });
var readStream = new MyReadStream();

readStream.pipe(writeStream);
writeStream.on('close', function () {
    console.log('All done!');
});

An example of such a MyReadStream class can be taken from mongoose's QueryStream.
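
mongoose's QueryStream is just one example of the old-style readable stream interface. On current Node.js versions the same idea is usually expressed by subclassing stream.Readable and implementing _read(); the minimal sketch below again assumes the getLines() placeholder from the question:

var fs = require('fs');
var stream = require('stream');
var util = require('util');

// A readable stream that pulls batches of lines from getLines().
function MyReadStream(options) {
    stream.Readable.call(this, options);
}
util.inherits(MyReadStream, stream.Readable);

// _read() is called whenever the consumer is ready for more data,
// so pipe() gets backpressure handling for free.
MyReadStream.prototype._read = function () {
    var lines = getLines();
    if (!lines) {
        this.push(null);                  // no more data: end the stream
        return;
    }
    for (var i = 0; i < lines.length; i++) {
        this.push(lines[i]);              // queue each line for the consumer
    }
};

new MyReadStream().pipe(fs.createWriteStream('someFile.txt', { flags : 'w' }));

Because pipe() pauses the readable side whenever the writable side's buffer is full, this approach avoids listening for 'drain' by hand, which is the main reason the answer prefers it.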

That concludes this article on writing a large file with Node.js. We hope the answer above helps, and thank you for your continued support!
