Why does the Spark application fail with java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig even though the jar exists?
Posted: 2021-04-16 14:14:51

I am working on a Hadoop cluster that ships with Spark 2.3.x. For my use case I need Spark 2.4.x, so I downloaded it from the Internet, moved it to my server, and extracted it into a new directory: ~/john/spark247ext/spark-2.4.7-bin-hadoop2.7
This is what my Spark 2.4.7 directory looks like:
username@:[~/john/spark247ext/spark-2.4.7-bin-hadoop2.7] 173 $ ls
bin conf data examples jars kubernetes LICENSE licenses NOTICE python R README.md RELEASE sbin yarn
These are the contents of my bin directory:
username@:[~/john/spark247ext/spark-2.4.7-bin-hadoop2.7/bin] 175 $ ls
beeline find-spark-home.cmd pyspark2.cmd spark-class sparkR2.cmd spark-shell.cmd spark-submit
beeline.cmd load-spark-env.cmd pyspark.cmd spark-class2.cmd sparkR.cmd spark-sql spark-submit2.cmd
docker-image-tool.sh load-spark-env.sh run-example spark-class.cmd spark-shell spark-sql2.cmd spark-submit.cmd
find-spark-home pyspark run-example.cmd sparkR spark-shell2.cmd spark-sql.cmd
I am submitting my Spark code with the following spark-submit command:
./spark-submit --master yarn --deploy-mode cluster --driver-class-path /home/john/jars/mssql-jdbc-9.2.0.jre8.jar --jars /home/john/jars/spark-bigquery-with-dependencies_2.11-0.19.1.jar,/home/john/jars/mssql-jdbc-9.2.0.jre8.jar --driver-memory 1g --executor-memory 4g --executor-cores 4 --num-executors 4 --class com.loader /home/john/jars/HiveLoader-1.0-SNAPSHOT-jar-with-dependencies.jar somearg1 somearg2 somearg3
The job fails with the exception java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
So I added that jar to my spark-submit command, as shown below:
./spark-submit --master yarn --deploy-mode cluster --driver-class-path /home/john/jars/mssql-jdbc-9.2.0.jre8.jar --jars /home/john/jars/spark-bigquery-with-dependencies_2.11-0.19.1.jar,/home/john/jars/mssql-jdbc-9.2.0.jre8.jar,/home/john/jars/jersey-client-1.19.4.jar --driver-memory 1g --executor-memory 4g --executor-cores 4 --num-executors 4 --class com.loader /home/john/jars/HiveLoader-1.0-SNAPSHOT-jar-with-dependencies.jar somearg1 somearg2 somearg3
I also checked the directory /john/spark247ext/spark-2.4.7-bin-hadoop2.7/jars and found that the jar jersey-client-x.xx.x.jar exists there:
username@:[~/john/spark247ext/spark-2.4.7-bin-hadoop2.7/jars] 179 $ ls -ltr | grep jersey
-rwxrwxrwx 1 john john 951701 Sep 8 2020 jersey-server-2.22.2.jar
-rwxrwxrwx 1 john john 72733 Sep 8 2020 jersey-media-jaxb-2.22.2.jar
-rwxrwxrwx 1 john john 971310 Sep 8 2020 jersey-guava-2.22.2.jar
-rwxrwxrwx 1 john john 66270 Sep 8 2020 jersey-container-servlet-core-2.22.2.jar
-rwxrwxrwx 1 john john 18098 Sep 8 2020 jersey-container-servlet-2.22.2.jar
-rwxrwxrwx 1 john john 698375 Sep 8 2020 jersey-common-2.22.2.jar
-rwxrwxrwx 1 john john 167421 Sep 8 2020 jersey-client-2.22.2.jar
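One thing worth checking before trusting the listing above: Jersey 1.x classes live under the com/sun/jersey package, while Jersey 2.x (such as jersey-client-2.22.2.jar) moved everything to org/glassfish/jersey, so a 2.x jar cannot contain com.sun.jersey.api.client.config.ClientConfig. A jar is just a zip archive, so listing its entries confirms which package prefix it actually ships. A minimal, self-contained sketch (it builds a tiny stand-in jar with an illustrative entry; against a real install you would point unzip at jersey-client-2.22.2.jar directly):

```shell
# Build a stand-in jar whose entry mimics the layout of a Jersey 2.x jar.
# The entry name is illustrative, not taken from the real jar.
mkdir -p demo/org/glassfish/jersey/client
touch demo/org/glassfish/jersey/client/ClientConfig.class
(cd demo && zip -qr ../jersey-demo.jar org)

# The same check you would run against Spark's jars/ directory:
# does the jar contain any com/sun/jersey (Jersey 1.x) entries?
if unzip -l jersey-demo.jar | grep -q 'com/sun/jersey'; then
  echo "Jersey 1.x classes present"
else
  echo "no com/sun/jersey entries: this jar cannot satisfy the error"
fi
```

If the grep finds nothing, the jar's presence in jars/ is irrelevant to this particular NoClassDefFoundError.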
I also added the dependency to my pom.xml file:
<dependency>
    <groupId>com.sun.jersey</groupId>
    <artifactId>jersey-client</artifactId>
    <version>1.19.4</version>
</dependency>
Even after providing the jar file in my spark-submit command and building a fat jar with all dependencies from my Maven project, I still see the exception:
Exception in thread "main" java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:161)
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1135)
at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1530)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
The Spark I downloaded is only for my own use case, so I have not changed any settings of the Spark version already on the cluster, i.e. Spark 2.3.
Can anyone tell me what I can do to fix this so the code runs correctly?
Comments:
Did you set the SPARK_HOME environment variable to $HOME/john/spark247ext/spark-2.4.7-bin-hadoop2.7?
Can you try Jersey version 2.22? It should be a compatibility issue.
Are you using the fat jar or the original one?
@Emer Yes, I tried that too. Same exception.
@MohdAvais I tried that too, passing the jar separately in the spark-submit command along with the other jars, e.g. --jars /home/john/spark247ext/spark-2.4.7-bin-hadoop2.7/jars/jersey-client-2.22.2.jar. Same result.
Answer 1:
Can you use this property in your spark-submit:
--conf "spark.driver.userClassPathFirst=true"
I think you are running into a jar conflict, i.e. a different version of the same jar is being picked up from the environment.
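Applied to the submit command from the question, this would look like the sketch below (paths, class name, and arguments are copied from the question; spark.executor.userClassPathFirst is the analogous executor-side setting and may or may not be needed here):

```shell
# Sketch: the asker's spark-submit with userClassPathFirst added, so jars
# passed via --jars win over same-named classes from the cluster environment.
./spark-submit --master yarn --deploy-mode cluster \
  --conf "spark.driver.userClassPathFirst=true" \
  --conf "spark.executor.userClassPathFirst=true" \
  --driver-class-path /home/john/jars/mssql-jdbc-9.2.0.jre8.jar \
  --jars /home/john/jars/spark-bigquery-with-dependencies_2.11-0.19.1.jar,/home/john/jars/mssql-jdbc-9.2.0.jre8.jar,/home/john/jars/jersey-client-1.19.4.jar \
  --driver-memory 1g --executor-memory 4g --executor-cores 4 --num-executors 4 \
  --class com.loader \
  /home/john/jars/HiveLoader-1.0-SNAPSHOT-jar-with-dependencies.jar \
  somearg1 somearg2 somearg3
```

Note that userClassPathFirst can itself introduce conflicts if the user jars shadow classes Spark needs, so it is a diagnostic lever as much as a fix.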