Error while importing sbt project for Apache Phoenix

Posted: 2020-01-28 04:18:14

Question:

I am a beginner. I am trying to pull in the Phoenix library with sbt so that I can read HBase tables from Spark, but my build.sbt keeps giving me errors.

Error while importing the sbt project:

[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.hbase:hbase-common:$cdh.hbase.version
[error]   Not found
[error]   Not found
[error]   not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-common/$cdh.hbase.version/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-common/$cdh.hbase.version/hbase-common-$cdh.hbase.version.pom
[error] Error downloading org.apache.hbase:hbase-hadoop-compat:$cdh.hbase.version
[error]   Not found
[error]   Not found
[error]   not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-hadoop-compat/$cdh.hbase.version/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-hadoop-compat/$cdh.hbase.version/hbase-hadoop-compat-$cdh.hbase.version.pom

[error]   not found: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/$cdh.hadoop.version/hadoop-common-$cdh.hadoop.version.pom
[error] Error downloading org.apache.hbase:hbase-hadoop2-compat:$cdh.hbase.version
[error]   Not found
[error]   Not found
[error]   not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-hadoop2-compat/$cdh.hbase.version/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-hadoop2-compat/$cdh.hbase.version/hbase-hadoop2-compat-$cdh.hbase.version.pom
[error] Error downloading org.apache.hbase:hbase-annotations:$cdh.hbase.version
[error]   Not found
[error]   Not found
[error]   not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-annotations/$cdh.hbase.version/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-annotations/$cdh.hbase.version/hbase-annotations-$cdh.hbase.version.pom
[error] Error downloading org.apache.hbase:hbase-protocol:$cdh.hbase.version
[error]   Not found
[error]   Not found
[error]   not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-protocol/$cdh.hbase.version/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-protocol/$cdh.hbase.version/hbase-protocol-$cdh.hbase.version.pom
[error] Error downloading org.apache.hbase:hbase-client:$cdh.hbase.version
[error]   Not found
[error]   Not found
[error]   not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-client/$cdh.hbase.version/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-client/$cdh.hbase.version/hbase-client-$cdh.hbase.version.pom
[error] Error downloading org.apache.hbase:hbase-server:$cdh.hbase.version
[error]   Not found
[error]   Not found
[error]   not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-server/$cdh.hbase.version/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-server/$cdh.hbase.version/hbase-server-$cdh.hbase.version.pom
[error] Error downloading com.cloudera.cdh:cdh-root:5.11.2
[error]   Not found
[error]   Not found
[error]   not found: /Users/johnny/.ivy2/local/com.cloudera.cdh/cdh-root/5.11.2/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/com/cloudera/cdh/cdh-root/5.11.2/cdh-root-5.11.2.pom
[error] Total time: 3 s, completed Sep 27, 2019, 4:54:09 PM
[info] shutting down sbt server

My build.sbt is:

name := "SparkHbase"
version := "0.1"
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
  ,"org.apache.spark" %% "spark-sql" % "2.2.0"  % "provided"
  ,"org.apache.spark" %% "spark-hive" % "2.2.0" % "provided"
  ,"org.apache.phoenix" % "phoenix-spark" % "4.13.2-cdh5.11.2"
)

I even added this resolver: resolvers += "ClouderaRepo" at "https://repository.cloudera.com/content/repositories/releases"

But the errors persist. What am I doing wrong?
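
For reference, the failed resolution can be replayed from the sbt shell: last update (the command the error output itself suggests) prints the full log of the failed dependency resolution, and show resolvers lists the repositories sbt searches, in order. A sketch of that interactive session (the sbt:SparkHbase> prompt simply reflects the project name above):

sbt:SparkHbase> last update
sbt:SparkHbase> show resolvers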

Comments:

Where does the value of $cdh.hbase.version come from? It shows up throughout the log.

Answer 1:

The problem is that you are trying to use a very old phoenix-spark version. The unresolved $cdh.hbase.version placeholders in the log mean that sbt could not download the CDH parent POM (com.cloudera.cdh:cdh-root:5.11.2, the last error above) which defines those version properties. If you are on HBase 1.3, you can use version 4.14.3-HBase-1.3 instead; see this build.sbt:

name := "SparkHbase"
version := "0.1"
scalaVersion := "2.11.12"

resolvers += "Cloudera" at "https://repository.cloudera.com/content/repositories/releases/"
resolvers += "Cloudera_Artifactory" at "https://repository.cloudera.com/artifactory/cloudera-repos/"
resolvers += Resolver.sonatypeRepo("releases")

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
  ,"org.apache.spark" %% "spark-sql" % "2.2.0"  % "provided"
  ,"org.apache.spark" %% "spark-hive" % "2.2.0" % "provided"
  ,"org.apache.phoenix" % "phoenix-spark" % "4.14.3-HBase-1.3"
)
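
Once the dependency resolves, the table can be read through the phoenix-spark data source. A minimal sketch, assuming a Phoenix table named MY_TABLE and a ZooKeeper quorum at zk-host:2181 (both are placeholders for your environment):

import org.apache.spark.sql.SparkSession

object SparkHbase {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("SparkHbase").getOrCreate()

    // Load the Phoenix table as a DataFrame via the phoenix-spark connector
    val df = spark.read
      .format("org.apache.phoenix.spark")
      .option("table", "MY_TABLE")
      .option("zkUrl", "zk-host:2181")
      .load()

    df.show()
  }
}

Because the Spark dependencies are marked provided, this is meant to be packaged and submitted with spark-submit to a cluster that already ships Spark.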

Comments:
