Multiple wget -r against the same site simultaneously?

This article looks at how to run several wget -r processes against one site at the same time; the recommended answer below may be a useful reference for anyone facing the same problem.

问题描述


Is there any command, or wget with options, for downloading a site recursively and multithreaded, i.e. several downloads running simultaneously?

Recommended answer

I found a decent solution. Read the original at http://www.linuxquestions.org/questions/linux-networking-3/wget-multi-threaded-downloading-457375/

wget -r -np -N [url] &
wget -r -np -N [url] &
wget -r -np -N [url] &
wget -r -np -N [url] &

Copy the line as many times as you deem fitting to have as many processes downloading. This isn't as elegant as a properly multithreaded app, but it will get the job done with only a slight amount of overhead. The key here is the "-N" switch: it means transfer the file only if it is newer than what's on disk. This will (mostly) prevent each process from downloading a file a different process has already downloaded; instead it skips that file and downloads what some other process hasn't fetched yet. It uses the timestamp as the means of doing this, hence the slight overhead.

It works great for me and saves a lot of time. Don't run too many processes, as this may saturate the web site's connection and tick off the owner. Keep it to a maximum of around 4. Beyond that, the number is limited only by CPU and network bandwidth on both ends.
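The repeated lines above can be wrapped in a small POSIX-shell helper so the process count is a parameter rather than copy-pasted lines. This is only a sketch, not part of the original answer: the function name mirror_parallel is my own, and the default of 4 jobs follows the answer's recommendation.

```shell
#!/bin/sh
# mirror_parallel URL [JOBS]
# Spawn several recursive wget processes against the same site; -N makes
# each process skip files a sibling process has already fetched.
mirror_parallel() {
    url=$1
    jobs=${2:-4}    # default to 4 processes, as the answer suggests
    i=1
    while [ "$i" -le "$jobs" ]; do
        wget -r -np -N "$url" &   # run each recursive download in the background
        i=$((i + 1))
    done
    wait            # return only after every background wget has exited
}

# Only start downloading when a URL is actually supplied on the command line.
if [ -n "$1" ]; then
    mirror_parallel "$@"
fi
```

Because the function ends with `wait`, the script blocks until all background downloads finish, so it can be chained with further commands.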

That concludes this article on running multiple wget -r processes against one site simultaneously; hopefully the recommended answer above is helpful.
