When running s3distcp from S3 to HDFS:
sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/s3distcp.jar --src s3n://workAAAA-KKKK-logs/production-logs/Log-XXXX-click/Log-XXXXX-click-2013-03-27_06-21-19_i-7XXb2x39_00037.gz --dest hdfs:///test/
I get the exception below.
Is there something wrong with my path syntax (s3n://; hdfs:///)?
Has anyone run into this before?
13/04/04 12:10:52 INFO s3distcp.S3DistCp: Using output path 'hdfs:/tmp/96a8e57b-4c68-406c-b4ca-bf212de12d93/output'
13/04/04 12:10:53 INFO s3distcp.FileInfoListing: Opening new file: hdfs:/tmp/96a8e57b-4c68-406c-b4ca-bf212de12d93/files/1
Exception in thread "main" java.lang.IllegalArgumentException: Can not create a Path from an empty string
at org.apache.hadoop.fs.Path.checkPathArg(Path.java:91)
at org.apache.hadoop.fs.Path.<init>(Path.java:99)
at org.apache.hadoop.fs.Path.<init>(Path.java:58)
at com.amazon.external.elasticmapreduce.s3distcp.FileInfoListing.getOutputFilePath(FileInfoListing.java:155)
at com.amazon.external.elasticmapreduce.s3distcp.FileInfoListing.add(FileInfoListing.java:111)
at com.amazon.external.elasticmapreduce.s3distcp.FileInfoListing.add(FileInfoListing.java:78)
at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.createInputFileListS3(S3DistCp.java:122)
at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.createInputFileList(S3DistCp.java:60)
at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.run(S3DistCp.java:529)
at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.run(S3DistCp.java:216)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at com.amazon.external.elasticmapreduce.s3distcp.Main.main(Main.java:12)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Best answer
If you only need specific files, there is a way to request them explicitly: the --copyFromManifest option lets you hand s3distcp a manifest file listing the paths of all the files to copy (even if they sit in different folders); see the sketch below.
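As a minimal sketch (not from the original post), --copyFromManifest is normally paired with --previousManifest, and the manifest itself can be produced by an earlier run via --outputManifest. The manifests/ prefix and the file name manifest-1.gz below are placeholders.

# 1) optional: have an earlier s3distcp run record what it copied into a manifest
sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/s3distcp.jar \
  --src s3n://workAAAA-KKKK-logs/production-logs/Log-XXXX-click/ \
  --dest hdfs:///test/ \
  --outputManifest=manifest-1.gz

# 2) copy exactly the files listed in that manifest instead of scanning --src
#    (manifest location below is a placeholder path)
sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/s3distcp.jar \
  --src s3n://workAAAA-KKKK-logs/production-logs/Log-XXXX-click/ \
  --dest hdfs:///test/ \
  --copyFromManifest \
  --previousManifest=s3n://workAAAA-KKKK-logs/manifests/manifest-1.gz

Note that --src here points at a directory prefix rather than a single .gz object; when --copyFromManifest is given, the list of files to transfer comes from the manifest referenced by --previousManifest.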
For hadoop - s3distcp: can not create path from empty string, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/15810968/