This article describes an error I ran into when using Hive's TRANSFORM feature and how it was resolved; it may serve as a useful reference for anyone who hits the same problem.

Problem description

java.lang.RuntimeException: Hive Runtime Error while closing operators
        at org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:226)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: [Error 20003]: An error occurred when trying to close the Operator running your custom script.
        at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:486)
        at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:567)
        at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:567)
        at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:567)
        at org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:193)
        ... 8 more
The HQL script is as follows:

SELECT
  TRANSFORM (userid, movieid, rating)
  USING 'python /home/daxingyu930/test_data_mapper2.py'
  AS userid, movieid, rating
;

The Python script is very simple: it uses \t to split each line into fields.
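
For reference, here is a minimal sketch of what such a tab-splitting mapper could look like; the field layout and names below are assumptions, since the original test_data_mapper2.py is not shown:

#!/usr/bin/env python
# Minimal tab-splitting mapper (hypothetical; the original script is not shown).
# Reads lines from stdin, splits each one on \t, and echoes the first three fields.
import sys

for line in sys.stdin:
    fields = line.strip().split('\t')
    if len(fields) < 3:
        continue  # skip malformed lines
    # assuming u_data.txt rows look like: userid, movieid, rating[, timestamp]
    print('\t'.join(fields[:3]))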

I have tested the Python script on Linux with the following shell command:

  cat test_data/u_data.txt | python test_data_mapper2.py

Please give me some ideas about this problem; it is driving me crazy and keeping me from sleeping.
Thanks very much.

Solution

Before using your custom script, you should add it to the distributed cache. ADD FILE ships the file to every task node, so the TRANSFORM ... USING clause can then refer to the script by its bare file name.

For example:

add file /home/daxingyu930/test_data_mapper2.py;

SELECT
    TRANSFORM (userid, movieid, rating)
    USING 'python test_data_mapper2.py'
    AS userid, movieid, rating
;



That concludes this article on the error encountered when using Hive's TRANSFORM feature. I hope the answer above is helpful.
