Problem description
I can ssh to our box and do a hadoop fs -ls /theFolder and browse the files, etc., but that's about all I know :) My goal is to copy one of those files - they are Avro - onto my local home folder.
How can I do this? I also found a get command, but I'm not sure how to use that either.
Recommended answer
First, use hadoop fs -get /theFolder to copy it into the current directory you are ssh'ed into on your box.
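For example (data.avro is a placeholder name here; substitute one of the paths you saw in the -ls output), -get takes an HDFS source path and an optional local destination:

# Copy one file from HDFS into the current directory on the box
hadoop fs -get /theFolder/data.avro .

# Or copy the whole folder in one go
hadoop fs -get /theFolder .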
Then you can use either scp or (my preference) rsync to copy the files between your box and your local system. Here's how I'd use rsync after the -get, still in the same directory:
rsync -av ./theFolder username@yourlocalmachine:/home/username
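If you'd rather use scp, a roughly equivalent command (same placeholder username and hostname, with -r to copy the directory recursively) would be:

scp -r ./theFolder username@yourlocalmachine:/home/username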
Either command will copy theFolder from the local fs on your box into your home folder on your machine's fs. Be sure to replace username with your actual username in both places, and yourlocalmachine with your machine's hostname or IP address.
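Note that both commands above push from the box to your machine, which assumes your local machine accepts incoming ssh connections. If it doesn't, you can run the copy from your local machine instead and pull from the box (yourbox is a placeholder for the box's hostname):

# Run this on your local machine; it pulls theFolder from your home directory on the box
rsync -av username@yourbox:theFolder /home/username/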