Is there a way to upload multiple files to S3 at once, without having to reconnect for each one?

I'm using S3 as storage for a PHP application that needs to store a large number (100 at a time) of mostly small (~10k) image files. Currently I loop over them and upload each one individually with the following code:

$s3->putObjectFile($uploadFile, $bucketName, ($uploadFile), S3::ACL_PUBLIC_READ)

This takes far too long: roughly a minute to a minute and a half for a batch of files. Turning off SSL, as suggested in other answers, brings that down to about 40 seconds, but it is still slow.

Here is my current code, using the Amazon S3 REST implementation for PHP:
$s3 = new S3($awsAccessKey, $awsSecretKey, false);

function send_to_s3($s3, $bucketName, $uploadFile)
{
    $start = microtime(true);

    // Check if our upload file exists
    if (!file_exists($uploadFile) || !is_file($uploadFile))
        exit("\nERROR: No such file: $uploadFile\n\n");

    // Check for CURL
    if (!extension_loaded('curl') && !@dl(PHP_SHLIB_SUFFIX == 'so' ? 'curl.so' : 'php_curl.dll'))
        exit("\nERROR: CURL extension not loaded\n\n");

    if ($s3->putObjectFile($uploadFile, $bucketName, $uploadFile, S3::ACL_PUBLIC_READ))
    {
        $end = microtime(true);
        $took = $end - $start;
        echo "S3::putObjectFile(): File copied to {$bucketName}/{$uploadFile}" . PHP_EOL . ' - ' . filesize($uploadFile) . ' in ' . $took . ' seconds<br />';
        return $took;
    }
    else
    {
        print 'error';
    }
}

Thanks for your help.

Best answer
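One option is the AWS SDK for PHP (v3) and its CommandPool: instead of issuing one blocking request per file, you build a PutObject command for each file and let the pool send them concurrently through the client's shared HTTP handler.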

use Aws\S3\S3Client;
use Aws\CommandPool;
use Aws\Exception\AwsException;

// Build one PutObject command per file instead of uploading them one at a time
$commands = array();
foreach ($objects as $key => $file) {
    $commands[] = $clientS3->getCommand('PutObject', array(
        'ACL'    => 'bucket-owner-full-control',
        'Bucket' => 'bucket_name',
        'Key'    => 's3_path',      // destination key for this object
        'Body'   => $file['body'],
    ));
}

// Execute all commands concurrently; failures come back as exceptions in the results array
$results = CommandPool::batch($clientS3, $commands);
foreach ($results as $result) {
    if ($result instanceof AwsException) {
        echo "Failed command:\n" . $result->getMessage() . "\n";
    }
}
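The snippet above assumes an existing `$clientS3` client and an `$objects` array. A minimal sketch of how those might be set up; the region, credential variables, and file paths here are placeholders, not part of the original answer:

use Aws\S3\S3Client;

// Hypothetical client setup; region and credentials are placeholders
$clientS3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1',
    'credentials' => [
        'key'    => $awsAccessKey,
        'secret' => $awsSecretKey,
    ],
]);

// One entry per file to upload; 'body' matches what the answer's loop reads
$objects = [];
foreach (glob('/path/to/images/*.jpg') as $path) {
    $objects[basename($path)] = ['body' => fopen($path, 'r')];
}

Passing a stream from fopen() as the body, rather than loading each file with file_get_contents(), keeps memory use low when batching many files.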
