Question
I'm using Mac's terminal.
I want to copy images from a remote URL (http://media.pragprog.com/titles/rails4/code/depot_b/public/images/) to a local directory.
What's the command to do that?
Thanks,
Answer
Alternatively, you may want all the images from a website. wget can do this with its recursive option, for example:
$ wget -r -A jpeg,jpg,bmp,png,gif,tiff,xpm,ico http://www.website.com/
This downloads only files matching the comma-separated extensions, recursively, starting at the site index. It works like a web spider, so if an image isn't referenced anywhere on the site, it will be missed.
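For the specific directory in the question, a minimal sketch might look like the following. The extra flags shown here (`-l1`, `-nd`, `-np`, `-P`) are assumptions about what you want, not part of the original answer: they keep the download shallow, flatten the remote directory tree, and place the files in a local `images/` folder. This approach only works if the server exposes a browsable directory listing at that URL.

```shell
# Fetch image files from the remote directory into ./images.
# -r         recurse into links found on the page
# -l1        limit recursion to one level deep
# -nd        do not recreate the remote directory hierarchy locally
# -np        never ascend to the parent directory
# -A ...     accept only files with these extensions
# -P images  save downloaded files under ./images
wget -r -l1 -nd -np \
     -A png,jpg,jpeg,gif \
     -P images \
     http://media.pragprog.com/titles/rails4/code/depot_b/public/images/
```

If the server does not provide an index page, you would instead need to fetch each image by its full URL, e.g. `curl -O http://.../logo.png` per file.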