How to start Spark Thrift Server on Datastax Enterprise (fails with java.lang.NoSuchMethodError: ...LogDivertAppender.setWriter)?

Posted: 2017-10-05 20:22:49

Question:

When I try to run the DataStax spark-sql-thriftserver, it fails with the error below:

dse spark-sql-thriftserver start \
  --conf spark.cores.max=10 \
  --conf spark.executor.memory=2g \
  --hiveconf hive.server2.thrift.port=10001

Spark Command: /opt/jdk1.8.0_112/jre//bin/java -cp /etc/dse/spark/:/usr/share/dse/spark/jars/*:/etc/dse/hadoop2-client/ -Djava.library.path=/usr/share/dse/hadoop2-client/lib/native:/usr/share/dse/cassandra/lib/sigar-bin: -Dcassandra.logdir=/var/log/cassandra -XX:MaxHeapFreeRatio=50 -XX:MinHeapFreeRatio=20 -Dguice_include_stack_traces=OFF -Ddse.system_memory_in_mb=32174 -Dcassandra.config.loader=com.datastax.bdp.config.DseConfigurationLoader -Dlogback.configurationFile=/etc/dse/spark/logback-spark.xml -Dcassandra.logdir=/var/log/cassandra -Ddse.client.configuration.impl=com.datastax.bdp.transport.client.HadoopBasedClientConfiguration -Dderby.stream.error.method=com.datastax.bdp.derby.LogbackBridge.getLogger -Xmx1024M org.apache.spark.deploy.SparkSubmit --conf spark.executor.memory=2g --conf spark.cores.max=10 --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal --hiveconf hive.server2.thrift.port=10001
========================================
WARN  2017-05-07 22:21:55 org.apache.spark.SparkContext: Use an existing SparkContext, some configuration may not take effect.
ERROR 2017-05-07 22:22:04 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application
java.lang.NoSuchMethodError: org.apache.hive.service.cli.operation.LogDivertAppender.setWriter(Ljava/io/Writer;)V
    at org.apache.hive.service.cli.operation.LogDivertAppender.<init>(LogDivertAppender.java:166) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.hive.service.cli.operation.OperationManager.initOperationLogCapture(OperationManager.java:85) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.hive.service.cli.operation.OperationManager.init(OperationManager.java:63) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$$anonfun$initCompositeService$1.apply(SparkSQLCLIService.scala:79) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$$anonfun$initCompositeService$1.apply(SparkSQLCLIService.scala:79) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at scala.collection.Iterator$class.foreach(Iterator.scala:893) ~[scala-library-2.11.8.jar:na]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336) ~[scala-library-2.11.8.jar:na]
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[scala-library-2.11.8.jar:na]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[scala-library-2.11.8.jar:na]
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$class.initCompositeService(SparkSQLCLIService.scala:79) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.initCompositeService(SparkSQLSessionManager.scala:36) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.init(SparkSQLSessionManager.scala:58) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$$anonfun$initCompositeService$1.apply(SparkSQLCLIService.scala:79) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$$anonfun$initCompositeService$1.apply(SparkSQLCLIService.scala:79) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at scala.collection.Iterator$class.foreach(Iterator.scala:893) ~[scala-library-2.11.8.jar:na]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336) ~[scala-library-2.11.8.jar:na]
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[scala-library-2.11.8.jar:na]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[scala-library-2.11.8.jar:na]
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$class.initCompositeService(SparkSQLCLIService.scala:79) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIService.initCompositeService(SparkSQLCLIService.scala:39) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIService.init(SparkSQLCLIService.scala:62) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$$anonfun$initCompositeService$1.apply(SparkSQLCLIService.scala:79) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$$anonfun$initCompositeService$1.apply(SparkSQLCLIService.scala:79) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at scala.collection.Iterator$class.foreach(Iterator.scala:893) ~[scala-library-2.11.8.jar:na]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336) ~[scala-library-2.11.8.jar:na]
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[scala-library-2.11.8.jar:na]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[scala-library-2.11.8.jar:na]
    at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService$class.initCompositeService(SparkSQLCLIService.scala:79) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.initCompositeService(HiveThriftServer2.scala:272) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.init(HiveThriftServer2.scala:292) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:94) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_112]
    at org.apache.spark.deploy.DseSparkSubmit$.org$apache$spark$deploy$DseSparkSubmit$$runMain(DseSparkSubmit.scala:730) ~[dse-spark-5.1.0.jar:5.1.0]
    at org.apache.spark.deploy.DseSparkSubmit$.doRunMain$1(DseSparkSubmit.scala:175) ~[dse-spark-5.1.0.jar:5.1.0]
    at org.apache.spark.deploy.DseSparkSubmit$.submit(DseSparkSubmit.scala:200) ~[dse-spark-5.1.0.jar:5.1.0]
    at org.apache.spark.deploy.DseSparkSubmit$.main(DseSparkSubmit.scala:109) ~[dse-spark-5.1.0.jar:5.1.0]
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:74) ~[dse-spark-5.1.0.jar:5.1.0]
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala) [dse-spark-5.1.0.jar:5.1.0]
ERROR 2017-05-07 22:22:15 org.apache.spark.util.Utils: Uncaught exception in thread Thread-0
java.lang.NullPointerException: null
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$$anonfun$main$1.apply$mcV$sp(HiveThriftServer2.scala:85) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:215) ~[spark-core_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:187) ~[spark-core_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:187) ~[spark-core_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:187) ~[spark-core_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1953) ~[spark-core_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:187) [spark-core_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:187) [spark-core_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:187) [spark-core_2.11-2.0.2.6.jar:2.0.2.6]
    at scala.util.Try$.apply(Try.scala:192) [scala-library-2.11.8.jar:na]
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:187) [spark-core_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:177) [spark-core_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54) [hadoop-common-2.7.1.3.jar:na]

The relevant part of the log:

ERROR 2017-05-07 22:22:04 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application
java.lang.NoSuchMethodError: org.apache.hive.service.cli.operation.LogDivertAppender.setWriter(Ljava/io/Writer;)V
    at org.apache.hive.service.cli.operation.LogDivertAppender.<init>(LogDivertAppender.java:166) ~[spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]

I have OpsCenter 6.1 and DSE 5.1.

====

Update 1

Files in /usr/share/dse/hadoop2-client/lib/native:

libhadoop.a libhadoop.so libhadoop.so.1.0.0 libhadooppipes.a libhadooputils.a libhdfs.a libhdfs.so libhdfs.so.0.0.0

Files in /usr/share/dse/spark/jars: (none; the directory is empty)

Files in /usr/share/dse/spark/lib:

JavaEWAH-0.3.2.jar commons-dbcp-1.4.jar httpclient-4.5.2.jar jersey-server-2.22.2.jar netty-3.9.8.Final.jar reflectasm-1.10.1.jar spark-sketch_2.11-2.0.2.6.jar RoaringBitmap-0.5.11.jar commons-io-2.5.jar httpcore-4.4.4.jar jline-2.14.2.jar noggit-0.6.jar scala-compiler-2.11.8.jar spark-sql_2.11-2.0.2.6.jar antlr4-runtime-4.5.3.jar commons-lang3-3.4.jar httpmime-4.4.1.jar joda-convert-1.2.jar objenesis-2.1.jar scala-xml_2.11-1.0.4.jar spark-streaming_2.11-2.0.2.6.jar aopalliance-repackaged-2.4.0-b34.jar commons-math3-3.4.1.jar ivy-2.4.0.jar joda-time-2.9.3.jar opencsv-2.3.jar scalap-2.11.8.jar spark-tags_2.11-2.0.2.6.jar arpack_combined_all-0.1.jar compress-lzf-1.0.3.jar jackson-annotations-2.5.3.jar jodd-core-3.5.2.jar oro-2.0.8.jar scalatest_2.11-2.2.6.jar spark-unsafe_2.11-2.0.2.6.jar avro-1.7.7.jar core-1.1.2.jar jackson-core-asl-1.9.13.jar jpam-1.1.jar osgi-resource-locator-1.0.1.jar snappy-0.2.jar spire-macros_2.11-0.7.4.jar avro-ipc-1.7.7.jar datanucleus-api-jdo-3.2.6.jar jackson-mapper-asl-1.9.13.jar json4s-ast_2.11-3.2.11.jar paranamer-2.8.jar solr-solrj-6.0.1.0.1596.jar spire_2.11-0.7.4.jar avro-mapred-1.7.7-hadoop2.jar datanucleus-core-3.2.10.jar jackson-module-scala_2.11-2.5.3.jar json4s-core_2.11-3.2.11.jar parquet-column-1.7.0.jar spark-cassandra-connector-unshaded_2.11-2.0.1.jar stax-api-1.0.1.jar bonecp-0.8.0.RELEASE.jar datanucleus-rdbms-3.2.9.jar javax.annotation-api-1.2.jar json4s-jackson_2.11-3.2.11.jar parquet-common-1.7.0.jar spark-catalyst_2.11-2.0.2.6.jar stax2-api-3.1.4.jar breeze-macros_2.11-0.11.2.jar derby-10.10.2.0.jar javax.inject-2.4.0-b34.jar jsr166e-1.1.0.jar parquet-encoding-1.7.0.jar spark-core_2.11-2.0.2.6.jar stream-2.7.0.jar breeze_2.11-0.11.2.jar eigenbase-properties-1.1.5.jar javax.ws.rs-api-2.0.1.jar jta-1.1.jar parquet-format-2.3.0-incubating.jar spark-graphx_2.11-2.0.2.6.jar super-csv-2.2.0.jar calcite-avatica-1.2.0-incubating.jar hive-beeline-1.2.1.spark2.jar javolution-5.5.1.jar jtransforms-2.4.0.jar parquet-generator-1.7.0.jar spark-hive-thriftserver_2.11-2.0.2.6.jar univocity-parsers-2.1.1.jar calcite-core-1.2.0-incubating.jar hive-cli-1.2.1.spark2.jar jdo-api-3.0.1.jar jul-to-slf4j-1.7.13.jar parquet-hadoop-1.7.0.jar spark-hive_2.11-2.0.2.6.jar unused-1.0.0.jar calcite-linq4j-1.2.0-incubating.jar hive-exec-1.2.1.spark2.jar jersey-client-2.22.2.jar kryo-3.0.3.jar parquet-hadoop-bundle-1.6.0.jar spark-launcher_2.11-2.0.2.6.jar velocity-1.7.jar cassandra-driver-mapping-3.1.4.jar hive-jdbc-1.2.1.spark2.jar jersey-common-2.22.2.jar libfb303-0.9.3.jar parquet-jackson-1.7.0.jar spark-mllib-local_2.11-2.0.2.6.jar woodstox-core-asl-4.4.1.jar chill-java-0.8.0.jar hive-metastore-1.2.1.spark2.jar jersey-container-servlet-2.22.2.jar mail-1.4.7.jar pmml-model-1.2.15.jar spark-mllib_2.11-2.0.2.6.jar xbean-asm5-shaded-4.4.jar chill_2.11-0.8.0.jar hk2-api-2.4.0-b34.jar jersey-container-servlet-core-2.22.2.jar mesos-0.21.1-shaded-protobuf.jar pmml-schema-1.2.15.jar spark-network-common_2.11-2.0.2.6.jar commons-beanutils-1.9.3.jar hk2-locator-2.4.0-b34.jar jersey-guava-2.22.2.jar metrics-json-3.1.2.jar py4j-0.10.1.jar spark-network-shuffle_2.11-2.0.2.6.jar commons-codec-1.10.jar hk2-utils-2.4.0-b34.jar jersey-media-jaxb-2.22.2.jar minlog-1.3.0.jar pyrolite-4.13.jar spark-repl_2.11-2.0.2.6.jar


Update 2

I also have DSE 5.0 installed; it doesn't ship any apache-log4j jars either, yet spark-sql-thriftserver works fine there.

After putting apache-log4j-extras-1.2.17.jar into /usr/share/dse/spark/lib, I get the error below (see https://pastebin.com/KjgsEhnw for the whole log):

Spark Command: /opt/jdk1.8.0_112/bin/java -cp /etc/dse/spark/:/usr/share/dse/spark/jars/*:/etc/dse/hadoop2-client/ -Djava.library.path=/usr/share/dse/hadoop2-client/lib/native:/usr/share/dse/cassandra/lib/sigar-bin: -Dcassandra.logdir=/var/log/cassandra -XX:MaxHeapFreeRatio=50 -XX:MinHeapFreeRatio=20 -Dguice_include_stack_traces=OFF -Ddse.system_memory_in_mb=32174 -Dcassandra.config.loader=com.datastax.bdp.config.DseConfigurationLoader -Dlogback.configurationFile=/etc/dse/spark/logback-spark.xml -Dcassandra.logdir=/var/log/cassandra -Ddse.client.configuration.impl=com.datastax.bdp.transport.client.HadoopBasedClientConfiguration -Dderby.stream.error.method=com.datastax.bdp.derby.LogbackBridge.getLogger -Xmx1024M org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal
========================================
WARN  2017-05-19 14:11:31 org.apache.spark.SparkContext: Use an existing SparkContext, some configuration may not take effect.
WARN  2017-05-19 14:11:36 org.apache.hadoop.hive.metastore.HiveMetaStore: Retrying creating default database after error: Unexpected exception caught.
javax.jdo.JDOFatalInternalException: Unexpected exception caught.
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193) ~[jdo-api-3.0.1.jar:3.0.1]
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808) ~[jdo-api-3.0.1.jar:3.0.1]
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701) ~[jdo-api-3.0.1.jar:3.0.1]
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365) ~[hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394) ~[hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291) ~[hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258) ~[hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76) ~[hadoop-common-2.7.1.3.jar:na]
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136) ~[hadoop-common-2.7.1.3.jar:na]
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57) ~[hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66) ~[hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74) [hive-exec-1.2.1.spark2.jar:1.2.1.spark2]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) [na:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [na:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [na:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [na:1.8.0_112]
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) [hive-metastore-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005) [hive-exec-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024) [hive-exec-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234) [hive-exec-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174) [hive-exec-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166) [hive-exec-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503) [hive-exec-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:189) [spark-hive_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:247) [spark-hive_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.HiveUtils$.newClientForExecution(HiveUtils.scala:250) [spark-hive_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:88) [spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala) [spark-hive-thriftserver_2.11-2.0.2.6.jar:2.0.2.6]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_112]
    at org.apache.spark.deploy.DseSparkSubmit$.org$apache$spark$deploy$DseSparkSubmit$$runMain(DseSparkSubmit.scala:730) [dse-spark-5.1.0.jar:5.1.0]
    at org.apache.spark.deploy.DseSparkSubmit$.doRunMain$1(DseSparkSubmit.scala:175) [dse-spark-5.1.0.jar:5.1.0]
    at org.apache.spark.deploy.DseSparkSubmit$.submit(DseSparkSubmit.scala:200) [dse-spark-5.1.0.jar:5.1.0]
    at org.apache.spark.deploy.DseSparkSubmit$.main(DseSparkSubmit.scala:109) [dse-spark-5.1.0.jar:5.1.0]
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:74) [dse-spark-5.1.0.jar:5.1.0]
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala) [dse-spark-5.1.0.jar:5.1.0]
Caused by: java.lang.reflect.InvocationTargetException: null
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_112]
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965) ~[jdo-api-3.0.1.jar:3.0.1]
    at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_112]
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960) ~[jdo-api-3.0.1.jar:3.0.1]
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166) ~[jdo-api-3.0.1.jar:3.0.1]
    ... 48 common frames omitted
Caused by: java.lang.NoClassDefFoundError: org/apache/log4j/or/RendererMap
    at org.apache.log4j.Hierarchy.<init>(Hierarchy.java:97) ~[apache-log4j-extras-1.2.17.jar:na]
    at org.apache.log4j.LogManager.<clinit>(LogManager.java:82) ~[apache-log4j-extras-1.2.17.jar:na]
    at org.apache.log4j.Logger.getLogger(Logger.java:104) ~[apache-log4j-extras-1.2.17.jar:na]
    at org.datanucleus.util.Log4JLogger.<init>(Log4JLogger.java:49) ~[datanucleus-core-3.2.10.jar:na]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) [na:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [na:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [na:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [na:1.8.0_112]
    at org.datanucleus.util.NucleusLogger.getLoggerInstance(NucleusLogger.java:237) ~[datanucleus-core-3.2.10.jar:na]
    at org.datanucleus.util.NucleusLogger.<clinit>(NucleusLogger.java:205) ~[datanucleus-core-3.2.10.jar:na]
    at org.datanucleus.plugin.PluginRegistryFactory.newPluginRegistry(PluginRegistryFactory.java:74) ~[datanucleus-core-3.2.10.jar:na]
    at org.datanucleus.plugin.PluginManager.<init>(PluginManager.java:61) ~[datanucleus-core-3.2.10.jar:na]
    at org.datanucleus.plugin.PluginManager.createPluginManager(PluginManager.java:427) ~[datanucleus-core-3.2.10.jar:na]
    at org.datanucleus.NucleusContext.<init>(NucleusContext.java:266) ~[datanucleus-core-3.2.10.jar:na]
    at org.datanucleus.NucleusContext.<init>(NucleusContext.java:247) ~[datanucleus-core-3.2.10.jar:na]
    at org.datanucleus.NucleusContext.<init>(NucleusContext.java:225) ~[datanucleus-core-3.2.10.jar:na]
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:416) ~[datanucleus-api-jdo-3.2.6.jar:na]
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:301) ~[datanucleus-api-jdo-3.2.6.jar:na]
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202) ~[datanucleus-api-jdo-3.2.6.jar:na]
    ... 56 common frames omitted
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.or.RendererMap
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[na:1.8.0_112]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[na:1.8.0_112]
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331) ~[na:1.8.0_112]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[na:1.8.0_112]
    ... 75 common frames omitted
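
The NoClassDefFoundError above suggests the extras jar references classes from the core log4j 1.2 jar (such as org.apache.log4j.or.RendererMap) that it does not bundle itself. A quick way to confirm that, as a sketch with paths assumed from my layout above (SPARK_DIST is a hypothetical path to an unpacked vanilla Spark download):

    # the extras jar should produce no match for the class the JVM cannot find
    unzip -l /usr/share/dse/spark/lib/apache-log4j-extras-1.2.17.jar | grep RendererMap
    # the core log4j jar from a vanilla Spark distribution should contain it
    unzip -l "$SPARK_DIST/jars/log4j-1.2.17.jar" | grep RendererMap
    # expected: org/apache/log4j/or/RendererMap.class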

Comments:

The extra jar alone didn't help; I was also missing the log4j jar itself, which I copied from a freshly downloaded Spark distribution into my DSE 5.1 installation, and now it seems to work. Thanks for pointing me in the right direction. If you could turn your comment into an answer, I'd like to award you the bounty.

Answer 1:

Looking at org.apache.hive.service.cli.operation.LogDivertAppender.<init>(LogDivertAppender.java:166) shows the following line:

setWriter(writer);

That call leads to apache-log4j-extras-1.2.17.jar, and indeed I can see jars/apache-log4j-extras-1.2.17.jar in my Spark installation.

I don't see apache-log4j-extras-1.2.17.jar anywhere in your jar listing, which seems somehow related to the missing method.
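
If you want to check which jar on the DSE classpath actually wins for the log4j classes involved, a rough diagnostic sketch like the following should work. The jar locations are assumptions taken from the listings in the question, and it presumes (as the Hive 1.2 sources suggest) that LogDivertAppender inherits setWriter(Writer) from org.apache.log4j.WriterAppender:

    # find every jar that bundles org.apache.log4j.WriterAppender
    for jar in /usr/share/dse/spark/lib/*.jar /usr/share/dse/spark/jars/*.jar; do
      if unzip -l "$jar" 2>/dev/null | grep -q 'org/apache/log4j/WriterAppender.class'; then
        echo "== $jar"
        # dump the methods the bundled class actually declares
        javap -p -classpath "$jar" org.apache.log4j.WriterAppender | grep setWriter \
          || echo "   no setWriter(Writer) here"
      fi
    done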

My suggestion is to take apache-log4j-extras-1.2.17.jar from another Spark download and place it under /usr/share/dse/spark/lib, or even /usr/share/dse/spark/jars since that directory is currently empty.

I found it can also be downloaded from the Download Apache Extras Companion™ for Apache log4j™ page.

This may or may not also require the corresponding core log4j jar.
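
As a concrete sketch of that copy step (SPARK_DIST is a hypothetical path to an unpacked vanilla Spark 2.0.x download; the follow-up comment above reports that the core log4j jar was needed as well):

    SPARK_DIST=/tmp/spark-2.0.2-bin-hadoop2.7
    sudo cp "$SPARK_DIST/jars/apache-log4j-extras-1.2.17.jar" /usr/share/dse/spark/lib/
    sudo cp "$SPARK_DIST/jars/log4j-1.2.17.jar" /usr/share/dse/spark/lib/
    # then restart the thrift server
    dse spark-sql-thriftserver stop
    dse spark-sql-thriftserver start --hiveconf hive.server2.thrift.port=10001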

I'd also suggest reporting the issue to DSE support.
