Spark NoSuchMethodError on SQLContext.sql (Spark 1.6.0 on Cloudera 5.8.0)
Posted: 2017-04-25 16:33:50
Question: I am trying to use Spark SQL from a Java program whose pom.xml dependency points to Spark version 1.6.0. Below is the program:
package spark_test;
import java.util.List;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.hive.HiveContext;
public class MyTest {
    private static SparkConf sparkConf;

    public static void main(String[] args) {
        String warehouseLocation = args[0];
        sparkConf = new SparkConf().setAppName("Hive Test").setMaster("local[*]")
                .set("spark.sql.warehouse.dir", warehouseLocation);
        JavaSparkContext ctx = new JavaSparkContext(sparkConf);
        SQLContext sc = new HiveContext(ctx.sc());
        System.out.println(" Current Tables: ");
        DataFrame results = sc.sql("show tables");
        results.show();
    }
}
However, I get Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;. I am building a flat (uber) jar and running it from the command line:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/PortalHandlerTest.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/SparkTest.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [file:/home/cloudera/workspace/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/workspace/JARs/slf4j-log4j12-1.7.22.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/04/25 08:44:07 INFO SparkContext: Running Spark version 2.1.0
17/04/25 08:44:07 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/04/25 08:44:07 WARN SparkContext: Support for Scala 2.10 is deprecated as of Spark 2.1.0
17/04/25 08:44:08 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/25 08:44:08 INFO SecurityManager: Changing view acls to: cloudera
17/04/25 08:44:08 INFO SecurityManager: Changing modify acls to: cloudera
17/04/25 08:44:08 INFO SecurityManager: Changing view acls groups to:
17/04/25 08:44:08 INFO SecurityManager: Changing modify acls groups to:
17/04/25 08:44:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(cloudera); groups with view permissions: Set(); users with modify permissions: Set(cloudera); groups with modify permissions: Set()
17/04/25 08:44:09 INFO Utils: Successfully started service 'sparkDriver' on port 43850.
17/04/25 08:44:09 INFO SparkEnv: Registering MapOutputTracker
17/04/25 08:44:09 INFO SparkEnv: Registering BlockManagerMaster
17/04/25 08:44:09 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/04/25 08:44:09 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/04/25 08:44:09 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-4199c353-4e21-4863-8b78-cfa280ce2de3
17/04/25 08:44:09 INFO MemoryStore: MemoryStore started with capacity 375.7 MB
17/04/25 08:44:09 INFO SparkEnv: Registering OutputCommitCoordinator
17/04/25 08:44:09 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
17/04/25 08:44:09 INFO Utils: Successfully started service 'SparkUI' on port 4041.
17/04/25 08:44:09 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.0.2.15:4041
17/04/25 08:44:10 INFO Executor: Starting executor ID driver on host localhost
17/04/25 08:44:10 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41716.
17/04/25 08:44:10 INFO NettyBlockTransferService: Server created on 10.0.2.15:41716
17/04/25 08:44:10 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/04/25 08:44:10 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.0.2.15, 41716, None)
17/04/25 08:44:10 INFO BlockManagerMasterEndpoint: Registering block manager 10.0.2.15:41716 with 375.7 MB RAM, BlockManagerId(driver, 10.0.2.15, 41716, None)
17/04/25 08:44:10 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.0.2.15, 41716, None)
17/04/25 08:44:10 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.0.2.15, 41716, None)
Current Tables:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;
at spark_test.MyTest.main(MyTest.java:31)
17/04/25 08:44:10 INFO SparkContext: Invoking stop() from shutdown hook
17/04/25 08:44:10 INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4041
17/04/25 08:44:10 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/04/25 08:44:10 INFO MemoryStore: MemoryStore cleared
17/04/25 08:44:10 INFO BlockManager: BlockManager stopped
17/04/25 08:44:10 INFO BlockManagerMaster: BlockManagerMaster stopped
17/04/25 08:44:10 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/04/25 08:44:10 INFO SparkContext: Successfully stopped SparkContext
17/04/25 08:44:10 INFO ShutdownHookManager: Shutdown hook called
17/04/25 08:44:10 INFO ShutdownHookManager: Deleting directory /tmp/spark-93fca3d1-ff79-4d2b-b07f-a340c1a60416
This is probably because my pom specifies Spark version 1.6.0 while the Cloudera VM is running 2.1.0. spark-shell runs Spark version 1.6.0 and works fine. How can I force version 1.6.0 in my Java program?
Any help would be appreciated.
Comments:
Answer 1: DataFrame was replaced by Dataset in Spark 2. If you are running a Spark 1.6 client against a Spark 2.1 server, you need to import org.apache.spark.sql.Dataset and use that instead. More information here. From a developer-experience point of view, most of the APIs are similar. Honestly, it would be better to use at least Spark 2.0 dependencies in the client, if not the exact server version.
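In Spark 2.x the original program would be written against SparkSession, with sql() returning a Dataset&lt;Row&gt;. A minimal sketch of that rewrite (the class name is illustrative, and it assumes Spark 2.x jars on the classpath):

```java
package spark_test;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class MyTestSpark2 {
    public static void main(String[] args) {
        // SparkSession with Hive support replaces SQLContext/HiveContext.
        SparkSession spark = SparkSession.builder()
                .appName("Hive Test")
                .master("local[*]")
                .config("spark.sql.warehouse.dir", args[0])
                .enableHiveSupport()
                .getOrCreate();

        // In Spark 2.x, sql() returns Dataset<Row>, not DataFrame.
        Dataset<Row> results = spark.sql("show tables");
        results.show();
        spark.stop();
    }
}
```

Note that the client jar must then be compiled against the same Spark 2.x version the cluster runs, or the same binary-compatibility problem reappears in the other direction.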
Discussion:
I compiled the same program under Spark 2.1.0 to use Datasets, but I cannot read any Hive tables from the program, even though I set warehouse.dir and copied hive-site.xml into /usr/lib/spark/conf/. Please see the post here.
That is a different question; I will address it in your other post. Please confirm whether your original problem was solved by the steps above.
Dataset is not available in Spark 1.6.0, so once the program was compiled with Spark 2.1.0, using Dataset did solve this problem.
Answer 2: Your log shows that you are running Spark 2.1 libraries against a 1.6.0 Spark cluster. My guess is that your client and server libraries are not binary-compatible. I suggest using the same version in your application as the one on the server to ensure compatibility.
Discussion:
That was my guess as well. However, I am not sure how to force my Java program to run Spark 1.6.0 instead of Spark 2.1.0. Also, I compiled the same program under Spark 2.1.0, but I cannot read any Hive tables through the program even though I set warehouse.dir and copied hive-site.xml into /usr/lib/spark/conf/. Please see the post here.
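To pin the client to the cluster's Spark line, the usual approach is to declare the matching version in pom.xml and mark it provided, so the build compiles against it but the cluster's own jars are used at run time. A minimal sketch, assuming the stock Apache Spark 1.6.0 artifacts rather than the CDH-specific ones:

```xml
<!-- Match the cluster's Spark version; scope "provided" keeps these jars
     out of the fat jar so the cluster's own copies are used at run time. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>1.6.0</version>
    <scope>provided</scope>
</dependency>
```

With provided scope, the NoSuchMethodError from mixed Spark versions cannot arise from the fat jar itself, since no Spark classes are bundled into it.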