Problem description
I need to allow users to export their data in CSV format. I have written the app in Node.js. The exported data for a user can be huge, so I was wondering how to handle such a situation in Node.js. Should I use process.nextTick or the child process API of Node.js? Also, are there any good modules available for Node.js to convert data from MySQL to CSV?
Recommended answer
Read line by line from your MySQL database, and append line by line to your file.
I don't know that much about the mysql module, so I'm assuming here that each row is just an array, hence the row.join(';'). If that's not the case (maybe it's an object), you should fix that; a sketch for object rows follows the code below.
var fs = require('fs');
var connection = require('mysql').createConnection({ /* your db settings here */ });

// append one CSV line to the file, then hand control back via the callback
function processRow (row, callback) {
  fs.appendFile('your-file.csv', row.join(';') + '\n', callback);
}

var query = connection.query('SELECT * FROM WHATEVER');

query
  .on('error', function (err) {
    // do something when an error happens
  })
  .on('fields', function (fields) {
    // 'fields' delivers column metadata; write the column names as the header row
    processRow(fields.map(function (field) { return field.name; }), function (err) {
      // handle a possible write error here
    });
  })
  .on('result', function (row) {
    // Pausing the connection is useful if your processing involves I/O
    connection.pause();
    processRow(row, function (err) {
      connection.resume();
    });
  })
  .on('end', function () {
    // now you can mail your user
  });
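In practice the mysql module tends to hand each row to the 'result' event as an object keyed by column name rather than an array. If that is what you see, a small variant of processRow (a sketch, not tested against your schema) covers it:

// variant of processRow for rows that arrive as plain objects
// such as { id: 1, name: 'Alice' } instead of arrays
function processObjectRow (row, callback) {
  var line = Object.keys(row).map(function (key) {
    var value = row[key];
    return value === null ? '' : String(value);
  }).join(';');
  fs.appendFile('your-file.csv', line + '\n', callback);
}

Note that values containing the ';' delimiter, quotes, or newlines would still need proper CSV quoting; a dedicated CSV library handles that more robustly.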
If you have a lot of requests, you could use the compute-cluster module to distribute your workload; a sketch follows.
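A minimal sketch of what that might look like, assuming a hypothetical worker file named export-worker.js that runs the row-by-row export above; the enqueue/message pattern shown follows the compute-cluster README, but check the module's documentation for the exact details.

// main process: hand each export request to a pool of worker processes
var ComputeCluster = require('compute-cluster');

// spawns child processes that run export-worker.js (hypothetical file name)
var cc = new ComputeCluster({ module: './export-worker.js' });

function exportForUser (userId, done) {
  // enqueue the job; the callback fires when the worker reports back
  cc.enqueue({ userId: userId }, function (err, result) {
    done(err, result);
  });
}

// export-worker.js: receives jobs from the parent via process messaging
process.on('message', function (job) {
  // run the row-by-row export from the code above for job.userId,
  // then report completion back to the parent process
  process.send({ ok: true, userId: job.userId });
});

Running the export in separate worker processes keeps the main event loop free to serve other requests, which is what the child process part of the question is really about.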