This article covers how to deal with PHP's scandir() being too slow. It should be a useful reference for anyone facing this problem; read on below.

Problem Description



I have to make a function that lists all subfolders inside a folder. I have a no-file filter, but the function uses scandir() for the listing, which makes the application very slow. Is there an alternative to scandir(), even a non-native PHP function? Thanks in advance!

Recommended Answer


You can use readdir(), which may be faster; something like this:

function readDirectory($Directory, $Recursive = true)
{
    if(is_dir($Directory) === false)
    {
        return false;
    }

    // Ensure a trailing slash so paths concatenate correctly
    $Directory = rtrim($Directory, '/') . '/';

    // opendir() does not throw exceptions, so check for failure directly
    $Resource = opendir($Directory);
    if($Resource === false)
    {
        return false;
    }

    $Found = array();

    while(false !== ($Item = readdir($Resource)))
    {
        if($Item == "." || $Item == "..")
        {
            continue;
        }

        $Path = $Directory . $Item;

        if($Recursive === true && is_dir($Path))
        {
            // Merge the recursive results so the output stays a flat list
            $Found = array_merge($Found, readDirectory($Path, $Recursive));
        }else
        {
            $Found[] = $Path;
        }
    }

    closedir($Resource);

    return $Found;
}


This may require some tweaking, but it is essentially what scandir() does internally, and it should be faster. If not, please write an update, as I would like to see if I can make a faster solution.


Another issue is that if you're reading a very large directory, you're filling up an array in internal memory, and that may be where your memory is going.


You could try to create a function that reads in offsets, so that you can return 50 files at a time!


Reading chunks of files at a time would be just as simple to use; it would look like this:

$Offset = 0;
while(false !== ($Batch = ReadFilesByOffset("/tmp", $Offset)))
{
    // Use $Batch here, which contains 50 or fewer files!

    // Increment the offset:
    $Offset += 50;
}
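ReadFilesByOffset() is not defined in the answer; its name and signature are only implied by the loop above. A minimal sketch of what such a helper might look like, assuming it skips the first $Offset entries and returns false once the offset is past the end of the directory:

```php
<?php
// Hypothetical helper implied by the loop above: skip $Offset directory
// entries, then return up to $Limit file paths, or false when exhausted.
function ReadFilesByOffset($Directory, $Offset, $Limit = 50)
{
    $Resource = opendir($Directory);
    if($Resource === false)
    {
        return false;
    }

    $Position = 0;
    $Batch = array();

    while(false !== ($Item = readdir($Resource)))
    {
        if($Item == "." || $Item == "..")
        {
            continue;
        }

        // Skip entries before the requested offset
        if($Position++ < $Offset)
        {
            continue;
        }

        $Batch[] = rtrim($Directory, '/') . '/' . $Item;

        if(count($Batch) >= $Limit)
        {
            break;
        }
    }

    closedir($Resource);

    // Signal the end of the directory so the calling loop terminates
    return count($Batch) > 0 ? $Batch : false;
}
```

Note that readdir() returns entries in no guaranteed order, so paging by raw offset only makes sense if the directory is not modified between calls.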

This concludes the article on PHP: scandir() being too slow. We hope the recommended answer is helpful, and thank you for your support!


09-07 17:29