Question
I am working on a task in which I need to fetch BigCommerce products and check product URLs to generate a sitemap.xml file.
There are actually 180,000 products on the site, so I need to create multiple sitemap XML files and a single index (sitemap.xml) file.
I have completed the script to do that; I am grouping 50,000 URLs into each sitemap XML file created.
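For reference, the single index file described above is a sitemap index that points at each per-batch sitemap file. A minimal sketch, per the sitemaps.org protocol (the file names and domain are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-products-2.xml</loc>
  </sitemap>
  <!-- one <sitemap> entry per 50,000-URL file -->
</sitemapindex>
```

The 50,000-URL grouping matches the sitemap protocol's per-file limit, so each batch file stays within spec.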
When I first ran it, it executed correctly and created 4 sitemap files. Everything worked fine.
But now I am unable to execute it, because after running for some time it gives me a network error (something like "BigCommerce connection lost").
The issue is that there is a limit when calling the BigCommerce API: we need to send it a page number, and only 250 products are fetched at a time.
So I asked a BigCommerce support person about extending the limit for fetching products in a single API call. He suggested I use a loop and informed me that there is no other solution; we can fetch only 250 products at a time.
It's difficult to fetch 180,000 products in a single script calling the API in a loop, but in my case it is compulsory to do it in a single script (I need to set that script up as a cron job).
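For the cron job mentioned above, a crontab entry along these lines would run the script unattended, e.g. nightly at 02:00 (the script path and log path are illustrative):

```
0 2 * * * /usr/bin/php /path/to/generate_sitemaps.php >> /var/log/sitemap.log 2>&1
```

Running under the PHP CLI also avoids the web server's request timeout, which matters for a script that loops over 720 API pages.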
Is there any solution to accomplish this task without any network errors? Are there any BigCommerce experts here?
Any help would be greatly appreciated!!
Answer
I had the same problem trying to pull all the products in the store I was working on. As it stands, they do have a maximum number of products per request.
What you need to do instead is to use a filter and loop... I believe there is no other way to do this.
// Round up so the last partial page is included
$count = ceil(Bigcommerce::getProductsCount() / 250);
for ($x = 1; $x <= $count; $x++) {
    $filter = array("page" => $x, "limit" => 250);
    $products = Bigcommerce::getProducts($filter);
    // Process this batch of up to 250 products here
}
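Since the original question was about the connection dropping partway through a long run, it may also help to wrap each page fetch in a retry with exponential backoff, so a transient network error does not kill the whole script. A sketch, assuming the same `Bigcommerce` client as above (the helper names `backoffDelay` and `fetchPageWithRetry` are my own, not part of the library):

```php
<?php
// Exponential backoff delay in seconds: 1, 2, 4, 8, ... for attempts 1, 2, 3, 4, ...
function backoffDelay($attempt) {
    return pow(2, $attempt - 1);
}

// Fetch one page of up to 250 products, retrying on transient failures.
function fetchPageWithRetry($page, $maxAttempts = 5) {
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        try {
            return Bigcommerce::getProducts(array("page" => $page, "limit" => 250));
        } catch (Exception $e) {
            if ($attempt === $maxAttempts) {
                throw $e; // give up after the last attempt
            }
            sleep(backoffDelay($attempt)); // wait before retrying
        }
    }
}
```

Calling `fetchPageWithRetry($x)` inside the loop in place of the direct `Bigcommerce::getProducts($filter)` call, and optionally adding a short `sleep()` between pages, reduces the chance of the run dying on a single dropped connection or rate-limit hiccup.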
I hope this answers your question. Though this reply is a bit late, it might help someone.