Problem Description
The problem below has since been solved, but I am facing another one.
Here is what I am doing:
DistributedCache.createSymlink(job.getConfiguration());
DistributedCache.addCacheFile(
        new URI("hdfs:/user/hadoop/harsh/libnative1.so"),
        job.getConfiguration());
and in the mapper:
System.loadLibrary("libnative1.so");
(I also tried System.loadLibrary("libnative1"); and System.loadLibrary("native1");)
But I get this error:
java.lang.UnsatisfiedLinkError: no libnative1.so in java.library.path
I am totally clueless about what I should set java.library.path to. I tried setting it to /home and copied every .so from the distributed cache to /home/, but it still didn't work :(
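One thing worth checking before touching java.library.path at all: System.loadLibrary expects the bare library name, not the file name. The JVM itself maps the bare name to the platform-specific file name, which System.mapLibraryName demonstrates. A minimal sketch (the class name is just for illustration):

```java
public class LibNameDemo {
    public static void main(String[] args) {
        // The JVM maps a bare library name to the platform-specific
        // file name: on Linux, "native1" becomes "libnative1.so".
        String mapped = System.mapLibraryName("native1");
        System.out.println(mapped); // "libnative1.so" on Linux

        // So the correct call is System.loadLibrary("native1");
        // passing "libnative1.so" will never match, because the JVM
        // would then look for a file named "liblibnative1.so.so".
    }
}
```

Even with the right name, the directory containing libnative1.so must appear on java.library.path (or in LD_LIBRARY_PATH) for loadLibrary to succeed.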
Any suggestions / solutions, please?
---
I want to set a system environment variable (specifically, LD_LIBRARY_PATH) on the machine where the mapper runs.
I tried:
Runtime run = Runtime.getRuntime();
Process pr=run.exec("export LD_LIBRARY_PATH=/usr/local/:$LD_LIBRARY_PATH");
But it throws an IOException.
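The IOException is expected: export is a shell builtin, not an executable, so Runtime.exec cannot find any program by that name. And even if a shell were launched successfully, a child process can never change the environment of the parent JVM. What Java can do is set the environment of a child process it spawns, via ProcessBuilder. A small sketch of that distinction (the path /usr/local/lib is just an example value):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class EnvDemo {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Runtime.exec("export ...") fails because "export" is a shell
        // builtin, not a program on the PATH -- hence the IOException.
        // To hand a *child* process an environment variable, use
        // ProcessBuilder and modify its environment map:
        ProcessBuilder pb = new ProcessBuilder("sh", "-c", "echo $LD_LIBRARY_PATH");
        pb.environment().put("LD_LIBRARY_PATH", "/usr/local/lib");
        Process p = pb.start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            System.out.println(r.readLine()); // prints /usr/local/lib
        }
        p.waitFor();
        // Note: this only affects the child process. It cannot change
        // the environment of the JVM running the mapper itself.
    }
}
```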
I also know about
JobConf.MAPRED_MAP_TASK_ENV
But I am using Hadoop version 0.20.2, which has Job & Configuration instead of JobConf.
I am unable to find any such variable there; besides, this is not a Hadoop-specific variable but a system environment variable.
Any solution/suggestion? Thanks in advance.
Recommended Answer
Why don't you export this variable on all nodes of the cluster?
In any case, use the Configuration class as follows when submitting the job:
Configuration conf = new Configuration();
conf.set("mapred.map.child.env",<string value>);
Job job = new Job(conf);
The format of the value is k1=v1,k2=v2
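Putting the answer together for this specific question, the job setup might look like the sketch below. The property name mapred.map.child.env and the k1=v1,k2=v2 format come from the answer above; the path /usr/local/lib and the job name are assumed examples, so substitute the directory that actually holds libnative1.so on the task nodes.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

Configuration conf = new Configuration();
// Environment variables for map tasks, in k1=v1,k2=v2 format.
// "/usr/local/lib" is an example path -- use wherever libnative1.so
// actually lives on the task-tracker nodes.
conf.set("mapred.map.child.env", "LD_LIBRARY_PATH=/usr/local/lib");
Job job = new Job(conf, "native-lib-job");
```

With this set, each map task's JVM is launched with LD_LIBRARY_PATH already in its environment, which sidesteps the impossibility of changing the environment from inside the running mapper.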