hive> select count(*) from ipaddress where country='China';
 WARNING: Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. tez, spark) or using Hive 1.X releases.
 Query ID = pruthviraj_20160922163728_79a0f8d6-5ea6-4cb5-8dd2-d3bb63f8baaf
 Total jobs = 1
 Launching Job 1 out of 1
 Number of reduce tasks determined at compile time: 1
 In order to change the average load for a reducer (in bytes):
   set hive.exec.reducers.bytes.per.reducer=<number>
 In order to limit the maximum number of reducers:
   set hive.exec.reducers.max=<number>
 In order to set a constant number of reducers:
   set mapreduce.job.reduces=<number>
 Starting Job = job_1474512819880_0032, Tracking URL = http://Pruthvis-MacBook-Pro.local:8088/proxy/application_1474512819880_0032/
 Kill Command = /Users/pruthviraj/lab/software/hadoop-2.7.0/bin/hadoop job  -kill job_1474512819880_0032
 Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
 2016-09-22 16:37:45,094 Stage-1 map = 0%,  reduce = 0%
 2016-09-22 16:37:52,532 Stage-1 map = 100%,  reduce = 0%
 2016-09-22 16:37:59,901 Stage-1 map = 100%,  reduce = 100%
 Ended Job = job_1474512819880_0032
 MapReduce Jobs Launched:
 Stage-Stage-1: Map: 1  Reduce: 1   HDFS Read: 10393 HDFS Write: 102 SUCCESS
 Total MapReduce CPU Time Spent: 0 msec
 OK
 Exception in thread "main"
 Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "main"
 Pruthvis-MacBook-Pro:apache-hive-2.1.0-bin pruthviraj$

I am running this on Mac OS 10. I have already tried increasing the perm size (MaxPermSize), but it still does not work. Any help would be appreciated.

Best answer

Go to the environment file and increase -Xmx2048m to -Xmx4096m:

-Xmx4096m -XX:PermSize=128m -XX:MaxPermSize=128m
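If the "environment file" here means $HIVE_HOME/conf/hive-env.sh (an assumption; the answer does not name the file), the change could look roughly like the sketch below. HADOOP_HEAPSIZE and HADOOP_CLIENT_OPTS are variables the Hive launcher reads; the PermGen flags only apply on Java 7 and earlier.

# conf/hive-env.sh (copied from hive-env.sh.template)
# Heap size, in MB, for the Hive client JVM; roughly equivalent to -Xmx4096m
export HADOOP_HEAPSIZE=4096
# Extra JVM flags for client processes; PermGen flags are ignored on Java 8+
export HADOOP_CLIENT_OPTS="-Xmx4096m -XX:PermSize=128m -XX:MaxPermSize=128m $HADOOP_CLIENT_OPTS"

Restart the Hive CLI after editing the file so the new heap settings take effect.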

For more on this java - Hive count(*) out-of-memory issue, see the similar question on Stack Overflow: https://stackoverflow.com/questions/39637530/
