Problem Description
I have a function that strips out lines from files. I'm working with large files (more than 100 MB). My PHP memory limit is 256 MB, but the function that strips out the lines blows up with a 100 MB CSV file.
What the function must do is this:
Originally I have a CSV like this:
When I pass the CSV file to this function I get:
It only strips out the first line, nothing more. The problem is the performance of this function with large files: it blows up the memory.
The function is:
public function deleteLine($line_no, $csvFileName) {
    // This function strips a specific line from a file.
    // If a line is stripped, the function returns TRUE, else FALSE.
    //
    // e.g.
    // deleteLine(-1, xyz.csv); // strip last line
    // deleteLine(1, xyz.csv);  // strip first line

    // Assign the file name
    $filename = $csvFileName;
    $strip_return = FALSE;

    $data = file($filename);
    $pipe = fopen($filename, 'w');
    $size = count($data);

    if ($line_no == -1)
        $skip = $size - 1;
    else
        $skip = $line_no - 1;

    for ($line = 0; $line < $size; $line++)
        if ($line != $skip)
            fputs($pipe, $data[$line]);
        else
            $strip_return = TRUE;

    return $strip_return;
}

Is it possible to refactor this function so it does not blow up with the 256 MB PHP memory limit?
Give me some clues.
Best Regards,
Solution

The cause of your blowup is the file() function, which reads the entire file into memory. To overcome this you need to read the file line by line, write every line except the one to be deleted to a temporary file, and finally rename the temporary file over the original.
public function deleteLine($line_no, $csvFileName) {
    // Get a temp file name in the current working directory.
    // You can use any other directory, say /tmp.
    $tmpFileName = tempnam(".", "csv");
    $strip_return = FALSE;

    // Open the input file for reading and the temp file for writing.
    // (Check for fopen() errors here in real code.)
    $readFD = fopen($csvFileName, 'r');
    $writeFD = fopen($tmpFileName, 'w');

    if ($line_no == -1) {
        // Deleting the last line: count the lines in a first pass,
        // since the whole file is no longer held in memory.
        $size = 0;
        while (fgets($readFD) !== false) {
            $size++;
        }
        rewind($readFD);
        $skip = $size - 1;
    } else {
        $skip = $line_no - 1;
    }

    $line = 0;
    // Read the input file line by line and write every line
    // except the one to be deleted.
    while (($buffer = fgets($readFD)) !== false) {
        if ($line != $skip) {
            fputs($writeFD, $buffer);
        } else {
            $strip_return = TRUE;
        }
        $line++;
    }

    fclose($readFD);
    fclose($writeFD);

    // Replace the input file with the temp file.
    rename($tmpFileName, $csvFileName);
    return $strip_return;
}
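
As a rough way to check that the streaming version stays well under the 256 MB limit, you can call it on a large CSV and print the peak memory usage afterwards. The sketch below is only illustrative: the CsvTool class name and the file path are placeholders, not part of the original code.

<?php
// Hypothetical class wrapping the refactored deleteLine() method above.
$tool = new CsvTool();

// Strip the first line of a large CSV (the path is a placeholder).
$removed = $tool->deleteLine(1, "big-file.csv");
var_dump($removed); // TRUE if a line was actually removed

// Because the file is read with fgets(), peak usage should stay at a few MB
// regardless of the file size, far below the 256 MB limit.
echo "Peak memory: " . round(memory_get_peak_usage(true) / 1048576, 1) . " MB\n";

The design point is that only one line of the CSV is held in memory at a time, so peak usage is bounded by the length of the longest line (plus PHP's own overhead) rather than by the size of the file.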