Suppose I have an HDFS directory named mainDirectory that contains a variable number of subdirectories, like this:

/tmp
     |___mainDirectory
            |___ subDirectory1
            |___ subDirectory2
            .
            .
            |___ subDirectoryN

How can I capture the path of each subdirectory under the main directory in a bash variable?

For example, in the case above I would end up with N bash variables, where each would look something like:
var_1=/tmp/mainDirectory/subDirectory1
var_2=/tmp/mainDirectory/subDirectory2

...etc.

So far, I've gotten as far as running hadoop fs -ls /tmp/mainDirectory and figuring out which column contains the directory paths:
$hadoop fs -ls /tmp/mainDirectory | awk '{print $8}'
/tmp/mainDirectory/subDirectory1
/tmp/mainDirectory/subDirectory2
.
.
/tmp/mainDirectory/subDirectoryN

However, I haven't been able to capture the individual directory paths into separate bash variables.

Any help would be greatly appreciated. Thanks!

Accepted answer

If your command gives you the following results:

$hadoop fs -ls /tmp/mainDirectory | awk '{print $8}'
/tmp/mainDirectory/subDirectory1
/tmp/mainDirectory/subDirectory2
.
.
/tmp/mainDirectory/subDirectoryN

you can assign them to bash variables with the following command:
$ declare $(hadoop fs -ls /tmp/mainDirectory | awk '{print "var_"NR"="$8}')
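
The declare runs in the current shell, so the variables are available right away. As a quick sanity check, you should see something like the following (the compgen -v loop is just one way to list every variable whose name begins with var_, using bash indirect expansion to print each value):

$ echo "$var_1"
/tmp/mainDirectory/subDirectory1

$ for name in $(compgen -v var_); do echo "$name=${!name}"; done
var_1=/tmp/mainDirectory/subDirectory1
var_2=/tmp/mainDirectory/subDirectory2
.
.
var_N=/tmp/mainDirectory/subDirectoryN

Note that this trick relies on word splitting of the command substitution, so it assumes the paths contain no whitespace.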

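As a side note that goes beyond the original answer: if you don't specifically need numbered variables, collecting the paths into a bash array is usually easier to work with. A minimal sketch, assuming bash 4+ (for mapfile), paths without whitespace, and an arbitrary array name subdirs:

# Read every non-empty value from column 8 into the array "subdirs"
mapfile -t subdirs < <(hadoop fs -ls /tmp/mainDirectory | awk '$8 != "" {print $8}')

# How many subdirectories were found
echo "${#subdirs[@]}"

# Loop over all of them
for d in "${subdirs[@]}"; do
    echo "$d"
done

# Or address a single one by zero-based index
echo "${subdirs[0]}"    # /tmp/mainDirectory/subDirectory1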