I set up CDH 5.4.1 to run some Spark SQL tests. Spark itself works fine, but Spark SQL has problems.

I start pyspark as follows:
/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/lib/spark/bin/pyspark --master yarn-client
Then I try to select a table from Hive with Spark SQL: results = sqlCtx.sql("SELECT * FROM my_table").collect()
The error log is printed below (also at http://pastebin.com/u98psBG8):

> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/    /__ / .__/\_,_/_/ /_/\_\   version 1.3.0
>       /_/
>
> Using Python version 2.7.6 (default, Mar 22 2014 22:59:56)
> SparkContext available as sc, HiveContext available as sqlCtx.
> >>> results = sqlCtx.sql("SELECT * FROM vt_call_histories").collect() 15/05/20 06:57:07 INFO HiveMetaStore: 0: Opening raw store with
> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> 15/05/20 06:57:07 INFO ObjectStore: ObjectStore, initialize called
> 15/05/20 06:57:08 WARN General: Plugin (Bundle) "org.datanucleus" is
> already registered. Ensure you dont have multiple JAR versions of the
> same plugin in the classpath. The URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-core-3.2.10.jar"
> is already registered, and you are trying to register an identical
> plugin located at URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-core-3.2.2.jar."
> 15/05/20 06:57:08 WARN General: Plugin (Bundle)
> "org.datanucleus.api.jdo" is already registered. Ensure you dont have
> multiple JAR versions of the same plugin in the classpath. The URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-api-jdo-3.2.6.jar"
> is already registered, and you are trying to register an identical
> plugin located at URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-api-jdo-3.2.1.jar."
> 15/05/20 06:57:08 WARN General: Plugin (Bundle)
> "org.datanucleus.store.rdbms" is already registered. Ensure you dont
> have multiple JAR versions of the same plugin in the classpath. The
> URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-rdbms-3.2.1.jar"
> is already registered, and you are trying to register an identical
> plugin located at URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-rdbms-3.2.9.jar."
> 15/05/20 06:57:08 INFO Persistence: Property datanucleus.cache.level2
> unknown - will be ignored 15/05/20 06:57:08 INFO Persistence: Property
> hive.metastore.integral.jdo.pushdown unknown - will be ignored
> 15/05/20 06:57:08 WARN HiveMetaStore: Retrying creating default
> database after error: Error creating transactional connection factory
> javax.jdo.JDOFatalInternalException: Error creating transactional
> connection factory   at
> org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)   at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)   at
> javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)   at
> java.security.AccessController.doPrivileged(Native Method)   at
> javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)   at
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
> at
> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
> at
> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
> at
> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
> at
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
> at
> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:56)
> at
> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:579)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:557)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:606)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:448)
> at
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
> at
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5601)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:193)
> at
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1486)
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:64)
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)
> at
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2845)
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2864)   at
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:453)
> at
> org.apache.spark.sql.hive.HiveContext.sessionState$lzycompute(HiveContext.scala:229)
> at
> org.apache.spark.sql.hive.HiveContext.sessionState(HiveContext.scala:225)
> at
> org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:241)
> at
> org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:240)
> at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:86)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)   at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)   at
> py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)   at
> py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
> at py4j.Gateway.invoke(Gateway.java:259)   at
> py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
> at py4j.commands.CallCommand.execute(CallCommand.java:79)   at
> py4j.GatewayConnection.run(GatewayConnection.java:207)   at
> java.lang.Thread.run(Thread.java:745) NestedThrowablesStackTrace:
> java.lang.reflect.InvocationTargetException   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
> at
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)
> at
> org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)
> at
> org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)
> at
> org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
> at
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
> at
> org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
> at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)   at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)   at
> javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)   at
> java.security.AccessController.doPrivileged(Native Method)   at
> javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)   at
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
> at
> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
> at
> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
> at
> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
> at
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
> at
> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:56)
> at
> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:579)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:557)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:606)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:448)
> at
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
> at
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5601)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:193)
> at
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1486)
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:64)
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)
> at
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2845)
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2864)   at
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:453)
> at
> org.apache.spark.sql.hive.HiveContext.sessionState$lzycompute(HiveContext.scala:229)
> at
> org.apache.spark.sql.hive.HiveContext.sessionState(HiveContext.scala:225)
> at
> org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:241)
> at
> org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:240)
> at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:86)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)   at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)   at
> py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)   at
> py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
> at py4j.Gateway.invoke(Gateway.java:259)   at
> py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
> at py4j.commands.CallCommand.execute(CallCommand.java:79)   at
> py4j.GatewayConnection.run(GatewayConnection.java:207)   at
> java.lang.Thread.run(Thread.java:745) Caused by:
> java.lang.ExceptionInInitializerError   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at java.lang.Class.newInstance(Class.java:374)   at
> org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:47)
> at
> org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)
> at
> org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)
> at
> org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)
> at
> org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)
> ... 73 more Caused by: java.lang.SecurityException: sealing violation:
> package org.apache.derby.impl.services.locks is sealed   at
> java.net.URLClassLoader.getAndVerifyPackage(URLClassLoader.java:388)
> at java.net.URLClassLoader.defineClass(URLClassLoader.java:417)   at
> java.net.URLClassLoader.access$100(URLClassLoader.java:71)   at
> java.net.URLClassLoader$1.run(URLClassLoader.java:361)   at
> java.net.URLClassLoader$1.run(URLClassLoader.java:355)   at
> java.security.AccessController.doPrivileged(Native Method)   at
> java.net.URLClassLoader.findClass(URLClassLoader.java:354)   at
> java.lang.ClassLoader.loadClass(ClassLoader.java:425)   at
> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)   at
> java.lang.ClassLoader.loadClass(ClassLoader.java:358)   at
> java.lang.ClassLoader.defineClass1(Native Method)   at
> java.lang.ClassLoader.defineClass(ClassLoader.java:800)   at
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)   at
> java.net.URLClassLoader.access$100(URLClassLoader.java:71)   at
> java.net.URLClassLoader$1.run(URLClassLoader.java:361)   at
> java.net.URLClassLoader$1.run(URLClassLoader.java:355)   at
> java.security.AccessController.doPrivileged(Native Method)   at
> java.net.URLClassLoader.findClass(URLClassLoader.java:354)   at
> java.lang.ClassLoader.loadClass(ClassLoader.java:425)   at
> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)   at
> java.lang.ClassLoader.loadClass(ClassLoader.java:358)   at
> java.lang.Class.forName0(Native Method)   at
> java.lang.Class.forName(Class.java:190)   at
> org.apache.derby.impl.services.monitor.BaseMonitor.getImplementations(Unknown
> Source)   at
> org.apache.derby.impl.services.monitor.BaseMonitor.getDefaultImplementations(Unknown
> Source)   at
> org.apache.derby.impl.services.monitor.BaseMonitor.runWithState(Unknown
> Source)   at
> org.apache.derby.impl.services.monitor.FileMonitor.<init>(Unknown
> Source)   at
> org.apache.derby.iapi.services.monitor.Monitor.startMonitor(Unknown
> Source)   at org.apache.derby.iapi.jdbc.JDBCBoot.boot(Unknown Source)
> at org.apache.derby.jdbc.EmbeddedDriver.boot(Unknown Source)   at
> org.apache.derby.jdbc.EmbeddedDriver.<clinit>(Unknown Source)   ... 83
> more 15/05/20 06:57:08 INFO HiveMetaStore: 0: Opening raw store with
> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> 15/05/20 06:57:08 INFO ObjectStore: ObjectStore, initialize called
> 15/05/20 06:57:08 WARN General: Plugin (Bundle) "org.datanucleus" is
> already registered. Ensure you dont have multiple JAR versions of the
> same plugin in the classpath. The URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-core-3.2.10.jar"
> is already registered, and you are trying to register an identical
> plugin located at URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-core-3.2.2.jar."
> 15/05/20 06:57:08 WARN General: Plugin (Bundle)
> "org.datanucleus.api.jdo" is already registered. Ensure you dont have
> multiple JAR versions of the same plugin in the classpath. The URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-api-jdo-3.2.6.jar"
> is already registered, and you are trying to register an identical
> plugin located at URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-api-jdo-3.2.1.jar."
> 15/05/20 06:57:08 WARN General: Plugin (Bundle)
> "org.datanucleus.store.rdbms" is already registered. Ensure you dont
> have multiple JAR versions of the same plugin in the classpath. The
> URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-rdbms-3.2.1.jar"
> is already registered, and you are trying to register an identical
> plugin located at URL
> "file:/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/jars/datanucleus-rdbms-3.2.9.jar."
> 15/05/20 06:57:08 INFO Persistence: Property datanucleus.cache.level2
> unknown - will be ignored 15/05/20 06:57:08 INFO Persistence: Property
> hive.metastore.integral.jdo.pushdown unknown - will be ignored
> Traceback (most recent call last):   File "<stdin>", line 1, in
> <module>   File
> "/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/lib/spark/python/pyspark/sql/context.py",
> line 528, in sql
>     return DataFrame(self._ssql_ctx.sql(sqlQuery), self)   File "/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/lib/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py",
> line 538, in __call__   File
> "/opt/cloudera/parcels/CDH-5.4.1-1.cdh5.4.1.p0.6/lib/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py",
> line 300, in get_return_value py4j.protocol.Py4JJavaError: An error
> occurred while calling o31.sql. : java.lang.RuntimeException:
> java.lang.RuntimeException: Unable to instantiate
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient   at
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:472)
> at
> org.apache.spark.sql.hive.HiveContext.sessionState$lzycompute(HiveContext.scala:229)
> at
> org.apache.spark.sql.hive.HiveContext.sessionState(HiveContext.scala:225)
> at
> org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:241)
> at
> org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:240)
> at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:86)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)   at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)   at
> py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)   at
> py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
> at py4j.Gateway.invoke(Gateway.java:259)   at
> py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
> at py4j.commands.CallCommand.execute(CallCommand.java:79)   at
> py4j.GatewayConnection.run(GatewayConnection.java:207)   at
> java.lang.Thread.run(Thread.java:745) Caused by:
> java.lang.RuntimeException: Unable to instantiate
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient   at
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1488)
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:64)
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)
> at
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2845)
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2864)   at
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:453)
> ... 16 more Caused by: java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1486)
> ... 21 more Caused by: javax.jdo.JDOFatalInternalException: Error
> creating transactional connection factory NestedThrowables:
> java.lang.reflect.InvocationTargetException   at
> org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)   at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)   at
> javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)   at
> java.security.AccessController.doPrivileged(Native Method)   at
> javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)   at
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
> at
> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
> at
> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
> at
> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
> at
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
> at
> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:56)
> at
> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:579)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:557)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:610)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:448)
> at
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
> at
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5601)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:193)
> at
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
> ... 26 more Caused by: java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
> at
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)
> at
> org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)
> at
> org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)
> at
> org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
> at
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
> at
> org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
> at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
> ... 55 more Caused by: java.lang.NoClassDefFoundError: Could not
> initialize class org.apache.derby.jdbc.EmbeddedDriver   at
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

Best Answer

I found the solution to my problem.

After installing CDH 5.3 and enabling Spark through Cloudera Manager, follow these steps to enable Hive access:

  • Make sure Hive works through both the Hive CLI and JDBC via HiveServer2 (it should by default).
  • Copy hive-site.xml into your SPARK_HOME/conf folder.
  • Add the Hive libraries to the Spark classpath: edit the SPARK_HOME/bin/compute-classpath.sh file and append:
    CLASSPATH="$CLASSPATH:/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/hive/lib/*" (a CDH-specific example; use your own Hive library location).
  • Restart the Spark cluster for everything to take effect.

  • Read full document here
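The file operations in the steps above can be sketched as a short script. This is a sandbox demonstration only: every path below is a hypothetical stand-in, and on a real cluster you would use your actual SPARK_HOME, Hive configuration directory, and Hive library location.

```shell
# Sandbox demonstration of the answer's steps. All paths are hypothetical
# stand-ins; substitute your real SPARK_HOME, Hive conf dir, and Hive libs.
set -e

SPARK_HOME=$(mktemp -d)                 # stands in for the real SPARK_HOME
mkdir -p "$SPARK_HOME/conf" "$SPARK_HOME/bin"
touch "$SPARK_HOME/bin/compute-classpath.sh"

HIVE_CONF=$(mktemp -d)                  # stands in for /etc/hive/conf
printf '<configuration/>\n' > "$HIVE_CONF/hive-site.xml"
HIVE_LIB=/opt/cloudera/parcels/CDH/lib/hive/lib   # illustrative only

# Step 2: copy hive-site.xml into SPARK_HOME/conf so Spark sees the
# Hive metastore configuration.
cp "$HIVE_CONF/hive-site.xml" "$SPARK_HOME/conf/"

# Step 3: append the Hive jars to Spark's classpath script.
echo "CLASSPATH=\"\$CLASSPATH:$HIVE_LIB/*\"" >> "$SPARK_HOME/bin/compute-classpath.sh"

tail -n 1 "$SPARK_HOME/bin/compute-classpath.sh"
```

After running the real equivalent of this on each node, restart the Spark services so the new classpath and configuration are picked up.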

Regarding "hive - running Spark SQL on CDH 5.4.1 NoClassDefFoundError", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/30343932/
