This article covers how to prevent timeouts during a large request in PHP.
Problem Description
I'm making a large request to the Brightcove servers to batch-change the metadata on my videos. It seems to have made it through only 1000 iterations and then stopped - can anyone help adjust this code to prevent a timeout? It needs to make about 7000-8000 iterations.
<?php
include 'echove.php';

$e = new Echove(
    'xxxxx',
    'xxxxx'
);

// Read video IDs
# Define our parameters
$params = array(
    'fields' => 'id,referenceId'
);

# Make our API call
$videos = $e->findAll('video', $params);
//print_r($videos);

foreach ($videos as $video) {
    //print_r($video);
    $ref_id = $video->referenceId;
    $vid_id = $video->id;
    switch ($ref_id) {
        case "":
            $metaData = array(
                'id' => $vid_id,
                'referenceId' => $vid_id
            );
            # Update the video with the new metadata
            $e->update('video', $metaData);
            echo "$vid_id updated successfully!<br />";
            break;
        default:
            echo "$ref_id was not updated.<br />";
            break;
    }
}
?>
Thanks!
Recommended Answer
Try the set_time_limit() function. Calling set_time_limit(0) removes any time limit on script execution.
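As a minimal sketch of how this applies to the batch loop above, the call goes at the top of the script, before the long-running work begins. The loop body here is a stand-in for the per-video API call; the periodic flush() is an optional extra so progress output reaches the browser while the job runs:

```php
<?php
// Remove PHP's max_execution_time limit for this long batch job.
// Note: on most platforms set_time_limit() counts only PHP execution
// time, not time spent waiting on network I/O, but removing the limit
// is still the safe choice for a 7000+ iteration batch.
set_time_limit(0);

$processed = 0;
for ($i = 0; $i < 8000; $i++) {
    // ... the per-video $e->update('video', $metaData) call would go here ...
    $processed++;
    if ($processed % 1000 === 0) {
        echo "$processed videos processed\n";
        flush(); // push progress output to the client immediately
    }
}
echo "done: $processed\n";
```

If the script runs behind a web server, the server may impose its own request timeout (e.g. a FastCGI or proxy timeout) independent of PHP's limit; for very long batches, running the script from the command line avoids that entirely.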