I have exported tables from Hive to SQL Server many times and have never run into this problem before.

I used "," as the field delimiter, and I have already created the corresponding table in SQL Server.
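For reference, a comma-delimited Hive table is typically declared like the sketch below. The column list here is a made-up placeholder, since the question does not show the actual schema of tmptempmeasurereport:

```shell
# Hypothetical DDL for a comma-delimited Hive text table;
# the real columns of tmptempmeasurereport are not shown in the question.
hive -e "
CREATE TABLE tmptempmeasurereport (
  id INT,
  measure STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
STORED AS TEXTFILE;
"
```

The delimiters declared here are what the files under /user/hive/warehouse/tmptempmeasurereport actually contain, so any export tool reading that directory must be told the same delimiters.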

hadoop@ubuntu:~/sqoop-1.3.0-cdh3u1/bin$ ./sqoop-export --connect 'jdbc:sqlserver://192.168.1.1;username=abcd;password=12345;database=HadoopTest' --table tmptempmeasurereport --export-dir /user/hive/warehouse/tmptempmeasurereport

12/03/29 16:20:21 INFO SqlServer.MSSQLServerManagerFactory: Using Microsoft's SQL Server - Hadoop Connector
12/03/29 16:20:21 INFO manager.SqlManager: Using default fetchSize of 1000
12/03/29 16:20:21 INFO tool.CodeGenTool: Beginning code generation
12/03/29 16:20:21 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [tmptempmeasurereport]
12/03/29 16:20:21 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [tmptempmeasurereport]
12/03/29 16:20:21 INFO orm.CompilationManager: HADOOP_HOME is /home/hadoop/hadoop-0.20.2-cdh3u2
12/03/29 16:20:22 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/1c5aae88cd7daca66aa665d4bab5b470/tmptempmeasurereport.jar
12/03/29 16:20:22 INFO mapreduce.ExportJobBase: Beginning export of tmptempmeasurereport
12/03/29 16:20:22 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [tmptempmeasurereport]
12/03/29 16:20:22 WARN mapreduce.ExportJobBase: IOException checking SequenceFile header: java.io.EOFException
12/03/29 16:20:23 INFO input.FileInputFormat: Total input paths to process : 2
12/03/29 16:20:23 INFO input.FileInputFormat: Total input paths to process : 2
12/03/29 16:20:23 INFO mapred.JobClient: Running job: job_201203291108_0645
12/03/29 16:20:24 INFO mapred.JobClient:  map 0% reduce 0%
12/03/29 16:20:29 INFO mapred.JobClient: Task Id : attempt_201203291108_0645_m_000000_0, Status : FAILED
java.util.NoSuchElementException
    at java.util.AbstractList$Itr.next(AbstractList.java:350)
    at tmptempmeasurereport.__loadFromFields(tmptempmeasurereport.java:383)
    at tmptempmeasurereport.parse(tmptempmeasurereport.java:332)
    at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:79)
    at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:38)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:187)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
    at org.apache.hadoop.mapred.Child.main(Child.java:264)

12/03/29 16:20:34 INFO mapred.JobClient: Task Id : attempt_201203291108_0645_m_000000_1, Status : FAILED
java.util.NoSuchElementException
    at java.util.AbstractList$Itr.next(AbstractList.java:350)
    at tmptempmeasurereport.__loadFromFields(tmptempmeasurereport.java:383)
    at tmptempmeasurereport.parse(tmptempmeasurereport.java:332)
    at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:79)
    at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:38)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:187)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
    at org.apache.hadoop.mapred.Child.main(Child.java:264)

12/03/29 16:20:38 INFO mapred.JobClient: Task Id : attempt_201203291108_0645_m_000000_2, Status : FAILED
java.util.NoSuchElementException
    at java.util.AbstractList$Itr.next(AbstractList.java:350)
    at tmptempmeasurereport.__loadFromFields(tmptempmeasurereport.java:383)
    at tmptempmeasurereport.parse(tmptempmeasurereport.java:332)
    at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:79)
    at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:38)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:187)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
    at org.apache.hadoop.mapred.Child.main(Child.java:264)

12/03/29 16:20:43 INFO mapred.JobClient: Job complete: job_201203291108_0645
12/03/29 16:20:43 INFO mapred.JobClient: Counters: 7
12/03/29 16:20:43 INFO mapred.JobClient:   Job Counters
12/03/29 16:20:43 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=18742
12/03/29 16:20:43 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
12/03/29 16:20:43 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
12/03/29 16:20:43 INFO mapred.JobClient:     Launched map tasks=4
12/03/29 16:20:43 INFO mapred.JobClient:     Data-local map tasks=4
12/03/29 16:20:43 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
12/03/29 16:20:43 INFO mapred.JobClient:     Failed map tasks=1
12/03/29 16:20:43 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 21.0326 seconds (0 bytes/sec)
12/03/29 16:20:43 INFO mapreduce.ExportJobBase: Exported 0 records.
12/03/29 16:20:43 ERROR tool.ExportTool: Error during export: Export job failed!

[My versions are: hadoop-0.20.2-cdh3, sqoop-1.3.0-cdh3u1, hive-0.7.1]

Am I doing something wrong? Please help me.

Thanks in advance.

Best Answer

I suggest adding the --fields-terminated-by and --lines-terminated-by options to your sqoop command. If the delimiters Sqoop assumes do not match the ones actually used in the Hive table's data files, the generated record parser finds fewer fields per line than expected, which raises exactly the java.util.NoSuchElementException shown in your log.
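A minimal sketch of the amended command, reusing the connection string and paths from the question (credentials and delimiter values are as stated there; '\n' as the line terminator is an assumption, since Hive text files normally use newline-terminated rows):

```shell
# Same export as in the question, but with the delimiters stated explicitly
# so Sqoop's generated parser matches the Hive files' actual format.
./sqoop-export \
  --connect 'jdbc:sqlserver://192.168.1.1;username=abcd;password=12345;database=HadoopTest' \
  --table tmptempmeasurereport \
  --export-dir /user/hive/warehouse/tmptempmeasurereport \
  --fields-terminated-by ',' \
  --lines-terminated-by '\n'
```

If the table had been created with Hive's default delimiters instead of ",", the field terminator would be the Ctrl-A character ('\001') rather than a comma.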
