Symbol 'type scala.package.Serializable' is missing from the classpath
Posted: 2021-12-09 01:13:00

My classpath is missing the Serializable and Cloneable classes, and I don't know how to fix this.
I have an sbt application that looks like this:
name := "realtime-spark-streaming"
version := "0.1"
resolvers += "confluent" at "https://packages.confluent.io/maven/"
resolvers += "Public Maven Repository" at "https://repository.com/content/repositories/pangaea_releases"
val sparkVersion = "3.2.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.2.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "3.2.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.0"
libraryDependencies += "com.walmart.grcaml" % "us-aml-commons" % "latest.release"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
//libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "3.2.0" % "2.1.3"
//libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.7.12"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka
libraryDependencies += "org.apache.kafka" %% "kafka" % "6.1.0-ccs"
resolvers += Resolver.mavenLocal
scalaVersion := "2.13.6"
When I do an sbt build, I get:
Symbol 'type scala.package.Serializable' is missing from the classpath.
This symbol is required by 'class org.apache.spark.sql.SparkSession'.
Make sure that type Serializable is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
A full rebuild may help if 'SparkSession.class' was compiled against an incompatible version of scala.package.
import org.apache.spark.sql.{DataFrame, SparkSession}
Symbol 'type scala.package.Serializable' is missing from the classpath.
This symbol is required by 'class org.apache.spark.sql.Dataset'.
Make sure that type Serializable is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
A full rebuild may help if 'Dataset.class' was compiled against an incompatible version of scala.package.
def extractData(spark: SparkSession, configDetails: ReadProperties, pcSql: String, query: String): DataFrame =
My dependency tree only shows the jars, but this looks like a class/package conflict or a missing class.
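The compiler hint can be followed literally: adding the flag below to build.sbt makes scalac log the full classpath it compiles against, which helps spot mixed Scala versions. A minimal sketch:

// build.sbt — make scalac print the classpath it resolves, per the error's own suggestion
scalacOptions += "-Ylog-classpath"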
Comments:
How did you get this error? sbt clean compile?
Just building the project from IntelliJ.
Have you tried with sbt, to make sure this isn't an issue of IntelliJ not picking up some dependency change?
Answer 1:
You are using an incompatible Scala version (2.13.6). From the Spark documentation:
Spark runs on Java 8/11, Scala 2.12, Python 3.6+ and R 3.5+. Python 3.6
support is deprecated as of Spark 3.2.0. Java 8 prior to version 8u201 support
is deprecated as of Spark 3.2.0. For the Scala API, Spark 3.2.0 uses Scala 2.12.
You will need to use a compatible Scala version (2.12.x).
If you use a Scala version from the 2.12.x line, it should work fine.
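Concretely, that means pinning scalaVersion to a 2.12.x release and letting the %% operator pick up the matching _2.12 artifacts. A minimal sketch of the relevant build.sbt lines (2.12.15 is an assumed patch version; any recent 2.12.x should do):

scalaVersion := "2.12.15"

val sparkVersion = "3.2.0"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"                 % sparkVersion, // %% resolves spark-core_2.12
  "org.apache.spark" %% "spark-streaming"            % sparkVersion,
  "org.apache.spark" %% "spark-sql"                  % sparkVersion,
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
)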
Comments:
Spark 3.2.0 is available for Scala 2.13. Not sure where you found that documentation. Otherwise sbt would not even have been able to resolve the dependencies.
@Gaël J: spark.apache.org/docs/latest. Visited today.
It seems they haven't updated the docs yet, or there is a typo, but Spark 3.2.0 does indeed support Scala 2.13 (and even Scala 3.0, by using the 2.13 libs).
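If staying on Scala 2.13, the usual culprit is a transitive dependency compiled against 2.12 (here, possibly us-aml-commons, which is added with % rather than %%). sbt 1.4+ ships a dependencyTree task that helps spot mixed Scala suffixes; a sketch of the sbt shell session (task names assume sbt 1.4 or later):

sbt> dependencyTree   // scan the output for artifacts ending in _2.12 mixed with _2.13
sbt> evicted          // lists version conflicts that sbt resolved by eviction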