Conflicting cross-version suffixes: Spark job
Below is my build.sbt file. I have searched other questions like this one, but none of them led me to an answer. I have tried several ways to use Scala 2.11 explicitly, but for some reason I keep getting this error:
```
[error] (*:ssExtractDependencies) Conflicting cross-version suffixes: org.json4s:json4s-ast, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-core, org.apache.spark:spark-network-common
[error] (*:update) Conflicting cross-version suffixes: org.json4s:json4s-ast, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-core, org.apache.spark:spark-network-common
```
```scala
name := "ubi-journeyData-validation"

version := "2.0"

scalaVersion := "2.11.11"

dependencyOverrides += "org.scala-lang" % "scala-compiler" % scalaVersion.value
//updateOptions := updateOptions.value.withCachedResolution(false)

libraryDependencies ++= {
  val sparkVersion = "2.3.0"
  //val sparkVersion = "1.6.3"
  Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion,
    "org.apache.spark" %% "spark-sql" % sparkVersion,
    "org.apache.spark" %% "spark-hive" % sparkVersion,
    "org.elasticsearch" %% "elasticsearch-spark-20" % "5.6.9",
    //"org.elasticsearch" %% "elasticsearch-spark-13" % "5.6.9",
    //"org.elasticsearch" %% "elasticsearch-spark-13" % "5.2.0",
    "org.spark-project.hive" % "hive-cli" % "1.2.1.spark2",
    "org.spark-project.hive" % "hive-metastore" % "1.2.1.spark2",
    "org.spark-project.hive" % "hive-exec" % "1.2.1.spark2"
    //"org.json4s" %% "json4s-jackson" % "3.2.11",
    //"org.apache.calcite" % "calcite-core" % "1.2.0-incubating",
    //"org.pentaho" % "pentaho-aggdesigner" % "5.1.5-jhyde" pomOnly(),
    //"org.pentaho" % "pentaho-aggdesigner-algorithm" % "5.1.5-jhyde" % Test
  )
}

resolvers += Resolver.mavenLocal
resolvers += "Cascading repo" at "http://conjars.org/repo"

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
```
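For reference, the "Conflicting cross-version suffixes" error means the resolved dependency graph mixes artifacts built for different Scala binary versions, e.g. `spark-core_2.10` next to `spark-core_2.11`. One way to find out which dependency drags in the `_2.10` artifacts is the sbt-dependency-graph plugin; a minimal sketch, assuming sbt 0.13.x (this setup is not part of the original post):

```scala
// project/plugins.sbt — illustrative setup, not from the original build.
// The plugin adds tasks such as dependencyTree and whatDependsOn, which
// print where a given module (e.g. spark-core_2.10) enters the graph.
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")
```

With the plugin installed, `sbt "whatDependsOn org.apache.spark spark-core_2.10 1.3.1"` prints the reverse dependency tree for the offending artifact.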
This is because one of your dependencies itself depends on Scala 2.10 artifacts, for example:

`"org.spark-project.hive" % "hive-exec" % "1.2.1.spark2"`

If you check (for example here), it depends on:

org.apache.spark » spark-core_2.10 » 1.3.1 » (optional)

So I am afraid you cannot upgrade your Spark to 2.3.0 and Scala to 2.11 unless you drop those org.spark-project.hive modules.
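A minimal sketch of the fix suggested above (my own illustration, not from the original answer): Spark 2.x's `spark-hive` artifact already pulls in Spark's Hive 1.2.1 fork built for Scala 2.11, so the standalone `org.spark-project.hive` entries can usually be dropped outright:

```scala
// Hypothetical cleaned-up dependency list: the org.spark-project.hive
// modules (built against Spark 1.x / Scala 2.10) are removed, and Hive
// support comes from spark-hive alone.
libraryDependencies ++= {
  val sparkVersion = "2.3.0"
  Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion,
    "org.apache.spark" %% "spark-sql"  % sparkVersion,
    "org.apache.spark" %% "spark-hive" % sparkVersion,
    "org.elasticsearch" %% "elasticsearch-spark-20" % "5.6.9"
  )
}
```

If one of those modules really is needed (say, hive-cli), an untested alternative is to keep it but exclude its Scala 2.10 transitives explicitly:

```scala
// Exclude whole organizations whose _2.10 artifacts cause the conflict;
// the _2.11 equivalents already come in via the Spark 2.3.0 dependencies.
libraryDependencies += ("org.spark-project.hive" % "hive-cli" % "1.2.1.spark2")
  .excludeAll(
    ExclusionRule(organization = "org.apache.spark"),
    ExclusionRule(organization = "org.json4s"),
    ExclusionRule(organization = "com.twitter")
  )
```

After either change, `sbt clean update` should resolve without the cross-version warning.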