Unresolved dependencies path for SBT project in IntelliJ
Posted: 2017-11-12 09:30:59
I am developing a Spark application with IntelliJ. I am following this instruction on how to make IntelliJ work nicely with an SBT project.
Since my whole team is using IntelliJ, we can just modify build.sbt, but we get this unresolved dependencies error:
Error: error while importing SBT project:
[info] Resolving org.apache.thrift#libfb303;0.9.2 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-parent_2.10;2.1.0 ...
[info] Resolving org.scala-lang#jline;2.10.6 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn]     sparrow-to-orc:sparrow-to-orc_2.10:0.1
[warn]       +- mainrunner:mainrunner_2.10:0.1-SNAPSHOT
[trace] Stack trace suppressed: run 'last mainRunner/:ssExtractDependencies' for the full output.
[trace] Stack trace suppressed: run 'last mainRunner/:update' for the full output.
[error] (mainRunner/:ssExtractDependencies) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] (mainRunner/:update) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] Total time: 47 s, completed Jun 10, 2017 8:39:57 AM
And this is my build.sbt:
name := "sparrow-to-orc"
version := "0.1"
scalaVersion := "2.11.8"
lazy val sparkDependencies = Seq(
"org.apache.spark" %% "spark-core" % "2.1.0",
"org.apache.spark" %% "spark-sql" % "2.1.0",
"org.apache.spark" %% "spark-hive" % "2.1.0",
"org.apache.spark" %% "spark-streaming" % "2.1.0"
)
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.4"
libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.1"
libraryDependencies ++= sparkDependencies.map(_ % "provided")
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
libraryDependencies ++= sparkDependencies.map(_ % "compile")
)
assemblyMergeStrategy in assembly := {
  case PathList("org", "aopalliance", xs @ _*) => MergeStrategy.last
  case PathList("javax", "inject", xs @ _*) => MergeStrategy.last
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("com", "google", xs @ _*) => MergeStrategy.last
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case PathList("com", "codahale", xs @ _*) => MergeStrategy.last
  case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
  case "about.html" => MergeStrategy.rename
  case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
  case "META-INF/mailcap" => MergeStrategy.last
  case "META-INF/mimetypes.default" => MergeStrategy.last
  case "plugin.properties" => MergeStrategy.last
  case "log4j.properties" => MergeStrategy.last
  case "overview.html" => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
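A side note on the last line above: the <<= operator is deprecated in recent sbt releases. A minimal sketch of the equivalent assignment, assuming sbt 0.13.13 or newer, would be:
// Same intent as the <<= form: run with the Compile classpath,
// which (unlike the Runtime classpath) still contains the "provided" Spark jars.
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated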
If I don't have this block, then the program works fine:
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
libraryDependencies ++= sparkDependencies.map(_ % "compile")
)
But then I won't be able to run the application in IntelliJ, since the Spark dependencies won't be included in the classpath.
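To see why the extra mainRunner project is needed at all, here is a quick check, as a sketch run from the sbt shell: dependencies marked "provided" stay on the compile classpath but are dropped from the runtime classpath, which is the one the default run task (and an IDE run configuration) uses.
show compile:fullClasspath
show runtime:fullClasspath
The first listing should include the Spark jars while the second should not; IntelliJ's run configuration mirrors the second, so without mainRunner the Spark classes are missing at run time.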
Comments:
sparrow-to-orc#sparrow-to-orc_2.10;0.1 appears to be your own project, but built against a different Scala version. Not sure where that comes from, but it smells like a misconfiguration.
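A quick way to confirm what this comment suspects, as a sketch from the sbt shell (assuming the project names as posted), is to compare the Scala version each project actually uses:
show scalaVersion
show mainRunner/scalaVersion
If the second command prints a 2.10.x version while the first prints 2.11.8, mainRunner is falling back to sbt 0.13's bundled Scala and therefore looks for a sparrow-to-orc_2.10 artifact that was never built.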
Answer 1:
I had the same problem. The solution is to set the Scala version in mainRunner to the same one declared at the top of the build.sbt file:
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
libraryDependencies ++= sparkDependencies.map(_ % "compile"),
scalaVersion := "2.11.8"
)
Good luck!
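An alternative sketch that avoids repeating the version in two places, assuming the rest of the build stays as posted: declare the Scala version once at the build level so mainRunner (and any other subproject) inherits it.
// Build-level default: any subproject that does not set its own scalaVersion,
// such as mainRunner, picks up this value instead of sbt's bundled Scala.
scalaVersion in ThisBuild := "2.11.8"

lazy val mainRunner = project.in(file("mainRunner"))
  .dependsOn(RootProject(file(".")))
  .settings(
    libraryDependencies ++= sparkDependencies.map(_ % "compile")
  )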