How to exclude test dependencies with sbt-assembly
Posted: 2019-08-23 11:55:06

Question:
I have an sbt project that I am trying to build into a jar using the sbt-assembly plugin.
build.sbt:
name := "project-name"
version := "0.1"
scalaVersion := "2.11.12"
val sparkVersion = "2.4.0"
libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "3.0.5" % "test",
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "test",
  // spark-hive dependencies for DataFrameSuiteBase. https://github.com/holdenk/spark-testing-base/issues/143
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
  "com.amazonaws" % "aws-java-sdk" % "1.11.513" % "provided",
  "com.amazonaws" % "aws-java-sdk-sqs" % "1.11.513" % "provided",
  "com.amazonaws" % "aws-java-sdk-s3" % "1.11.513" % "provided",
  //"org.apache.hadoop" % "hadoop-aws" % "3.1.1"
  "org.json" % "json" % "20180813"
)
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

test in assembly := {} // skip running tests during assembly

// https://github.com/holdenk/spark-testing-base
fork in Test := true
javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
parallelExecution in Test := false
When I build the project with sbt assembly, the resulting jar contains /org/junit/... and /org/opentest4j/... files.
Is there a way to keep these test-related files out of the final jar?
I have tried replacing the line:
"org.scalatest" %% "scalatest" % "3.0.5" % "test"
with:
"org.scalatest" %% "scalatest" % "3.0.5" % "provided"
I also wonder how these files end up in the jar in the first place, since junit is not referenced anywhere in build.sbt (although the project does contain junit tests).
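One way to answer that is to ask sbt which dependency is dragging junit in. A minimal sketch, assuming the sbt-dependency-graph plugin (the version below is an assumption; recent sbt releases bundle equivalent functionality):

// project/plugins.sbt
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

In the sbt shell, dependencyTree then prints the full transitive tree, and whatDependsOn junit junit 4.12 prints the reverse path from junit back to the direct dependencies that pull it in (adjust the revision to whatever version shows up in the tree; depending on the plugin version the revision argument may be optional).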
Update:
name := "project-name"
version := "0.1"
scalaVersion := "2.11.12"
val sparkVersion = "2.4.0"
val excludeJUnitBinding = ExclusionRule(organization = "junit")
libraryDependencies ++= Seq(
  // Provided
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "provided" excludeAll(excludeJUnitBinding),
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
  "com.amazonaws" % "aws-java-sdk" % "1.11.513" % "provided",
  "com.amazonaws" % "aws-java-sdk-sqs" % "1.11.513" % "provided",
  "com.amazonaws" % "aws-java-sdk-s3" % "1.11.513" % "provided",
  // Test
  "org.scalatest" %% "scalatest" % "3.0.5" % "test",
  // Necessary
  "org.json" % "json" % "20180813"
)
excludeDependencies += excludeJUnitBinding
// https://***.com/questions/25144484/sbt-assembly-deduplication-found-error
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

// https://github.com/holdenk/spark-testing-base
fork in Test := true
javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
parallelExecution in Test := false
Comments:
By default, sbt-assembly does not include test jars. I ran into this when one of my own dependencies (incorrectly) listed a test framework as a runtime dependency. Do you know which package is pulling junit in?
I'm not sure. If I mark every dependency as "provided", the test files are still included. Doesn't that mean they are not being pulled in at runtime by any of the included dependencies?

Answer 1:
To exclude certain transitive dependencies of a dependency, use the excludeAll or exclude methods. The exclude method should be used when a pom will be published for the project. It requires the organization and module name of the exclusion.
For example:

libraryDependencies +=
  "log4j" % "log4j" % "1.2.15" exclude("javax.jms", "jms")
The excludeAll method is more flexible, but because it cannot be represented in a pom.xml, it should only be used when a pom is not needed. For example:
libraryDependencies +=
  "log4j" % "log4j" % "1.2.15" excludeAll(
    ExclusionRule(organization = "com.sun.jdmk"),
    ExclusionRule(organization = "com.sun.jmx"),
    ExclusionRule(organization = "javax.jms")
  )
In some cases a transitive dependency should be excluded from all dependencies. This can be achieved by setting up ExclusionRules in excludeDependencies (available in sbt 0.13.8 and above):
excludeDependencies ++= Seq(
  ExclusionRule("commons-logging", "commons-logging")
)
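Applied to the question above, the same global mechanism can cover both artifact families that showed up in the jar. A sketch, assuming sbt 0.13.8+ and the Maven Central coordinates junit:junit and org.opentest4j:opentest4j:

excludeDependencies ++= Seq(
  ExclusionRule(organization = "junit"),
  ExclusionRule(organization = "org.opentest4j")
)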
The JUnit jar files are downloaded as part of the following dependencies:
"org.apache.spark" %% "spark-core" % sparkVersion % "provided" //(junit)
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided"// (junit)
"com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "test" //(org.junit)
To exclude the junit files, update your dependencies as follows:
val excludeJUnitBinding = ExclusionRule(organization = "junit")
"org.scalatest" %% "scalatest" % "3.0.5" % "test",
"org.apache.spark" %% "spark-core" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
"org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
"com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "test" excludeAll(excludeJUnitBinding)
Update: please update your build.sbt as follows.
resolvers += Resolver.url("bintray-sbt-plugins",
  url("https://dl.bintray.com/eed3si9n/sbt-plugins/"))(Resolver.ivyStylePatterns)

val sparkVersion = "2.4.0" // as defined in the question's build.sbt
val excludeJUnitBinding = ExclusionRule(organization = "junit")

libraryDependencies ++= Seq(
  // Provided
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "provided" excludeAll(excludeJUnitBinding),
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
  //"com.amazonaws" % "aws-java-sdk" % "1.11.513" % "provided",
  //"com.amazonaws" % "aws-java-sdk-sqs" % "1.11.513" % "provided",
  //"com.amazonaws" % "aws-java-sdk-s3" % "1.11.513" % "provided",
  // Test
  "org.scalatest" %% "scalatest" % "3.0.5" % "test",
  // Necessary
  "org.json" % "json" % "20180813"
)
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

fork in Test := true
javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
parallelExecution in Test := false
project/plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
I tried this, and the junit jar files were no longer downloaded.
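To double-check the jar itself rather than only the resolution log, a small sbt task can scan the assembly output. This is a hypothetical sketch (the task name checkNoTestClasses is made up; assembly.value comes from sbt-assembly and returns the fat jar's File):

lazy val checkNoTestClasses = taskKey[Unit]("Fail if test classes leak into the assembly jar")

checkNoTestClasses := {
  import java.util.jar.JarFile
  import scala.collection.JavaConverters._
  // builds the assembly jar and opens the result for inspection
  val jar = new JarFile(assembly.value)
  try {
    val leaked = jar.entries.asScala.map(_.getName)
      .filter(n => n.startsWith("org/junit/") || n.startsWith("org/opentest4j/"))
      .toList
    if (leaked.nonEmpty)
      sys.error(s"Test classes leaked into the assembly: ${leaked.take(5).mkString(", ")}")
  } finally jar.close()
}

Running sbt checkNoTestClasses (or simply listing the jar with jar tf) then confirms the exclusions took effect.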
Discussion:
I tried changing the only import that is not "provided" or "test" to "org.json" % "json" % "20180813" exclude("org.junit", "junit"), and the org/junit... files are still present in the jar. Am I missing something?
org.junit is not downloaded as part of the org.json dependency. Do you know the package name of the junit module?
My understanding is that since org.json is the only library not marked "provided" or "test", only its dependencies should get pulled in, so excluding org.junit from that library should keep those files out of the jar. Does that make sense?
@AlexShapovalov The junit:junit:jar files are being downloaded as part of "org.apache.spark" %% "spark-sql" % sparkVersion % "provided" and "com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "test". Please check my updated answer for how to exclude the JUnit jar files.
Let me know if you still hit the same problem.