Not found: value spark (SBT project)

Posted: 2017-09-10 09:04:20

[Question]: Hi, I'm trying to set up a small Spark application in SBT. My build.sbt is:
import Dependencies._
name := "hello"
version := "1.0"
scalaVersion := "2.11.8"
val sparkVersion = "1.6.1"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" %% "spark-streaming" % sparkVersion,
"org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)
libraryDependencies += scalaTest % Test
Everything works fine and SBT resolves all the dependencies, but when I try to use spark in my hello.scala project file I get this error: not found: value spark
My hello.scala file is:
package example

import org.apache.spark._
import org.apache.spark.SparkContext._

object Hello extends fileImport with App {
  println(greeting)
  anime.select("*").orderBy($"rating".desc).limit(10).show()
}

trait fileImport {
  lazy val greeting: String = "hello"
  var anime = spark.read.option("header", true).csv("C:/anime.csv")
  var ratings = spark.read.option("header", true).csv("C:/rating.csv")
}
Here is the error output I get:
[info] Compiling 1 Scala source to C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\target\scala-2.11\classes...
[error] C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\src\main\scala\example\Hello.scala:12: not found: value spark
[error] var anime = spark.read.option("header", true).csv("C:/anime.csv")
[error] ^
[error] C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\src\main\scala\example\Hello.scala:13: not found: value spark
[error] var ratings = spark.read.option("header", true).csv("C:/rating.csv")
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 3 s, completed Sep 10, 2017 1:44:47 PM
[Answer 1]: spark is initialized only in spark-shell; in your own code you need to initialize the spark variable yourself:
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder().appName("testings").master("local").getOrCreate
If you want to run the code with spark-submit, you can change the testings name to whatever you want; the .master option is optional.
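Putting that together with the asker's file layout, a minimal sketch of a fixed Hello.scala might look like this (a sketch only, assuming Spark 2.x with spark-sql on the classpath, since that is where SparkSession lives; the CSV paths are the asker's own):

package example

import org.apache.spark.sql.SparkSession

trait fileImport {
  lazy val greeting: String = "hello"
  // Build the session yourself; spark-shell would normally do this for you.
  lazy val spark: SparkSession = SparkSession.builder().appName("testings").master("local").getOrCreate()
  lazy val anime = spark.read.option("header", true).csv("C:/anime.csv")
  lazy val ratings = spark.read.option("header", true).csv("C:/rating.csv")
}

object Hello extends fileImport with App {
  println(greeting)
  import spark.implicits._ // brings the $"col" string interpolator into scope
  anime.select("*").orderBy($"rating".desc).limit(10).show()
}

You could then run it with sbt run, or package it and launch it with spark-submit, e.g. spark-submit --class example.Hello target/scala-2.11/hello_2.11-1.0.jar (the jar name is assumed from the name and version settings in the build above).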
[Comments]:
Thanks Ramesh, one thing to add here: SparkSession is only available from Spark 2.0.0, so my build.sbt is now updated:

import Dependencies._
name := "hello"
version := "1.0"
scalaVersion := "2.11.8"
val sparkVersion = "2.0.0"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion
)
libraryDependencies += scalaTest % Test
[***.com/questions/37337461/…
Yes, that's right. For older versions you need to create a sqlContext.
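For reference, a rough sketch of the pre-2.0 equivalent (assuming Spark 1.x, where SparkSession does not exist yet):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val conf = new SparkConf().setAppName("testings").setMaster("local")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._ // enables the $"col" syntax

Note that on Spark 1.x the built-in csv reader does not exist either, so reading CSV would additionally need the external spark-csv package (format "com.databricks.spark.csv").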
Ramesh, one more thing: I'm stuck again now even after trying online help. I get value $ is not a member of StringContext
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{SparkSession, SQLContext}
import org.apache.spark.sql._

object Hello {
  val spark = SparkSession.builder().master("local").appName("TV Series Analysis").getOrCreate()
  import spark.sqlContext.implicits._
  import spark.implicits._
  var anime = spark.read.option("header", true).csv("C:/anime.csv")
  anime.select("*").orderBy($"rating".desc).limit(10).show()
}
Thanks, I'm done. Here is the new code file:

val spark = SparkSession.builder().master("local").appName("TV Series Analysis").config("spark.sql.warehouse.dir", "file:///c:/tmp/spark-warehouse").getOrCreate()
import spark.implicits._
var anime = spark.read.option("header", true).csv("C:/anime.csv")
var ratings = spark.read.option("header", true).csv("C:/rating.csv")
anime.select("*").orderBy($"rating".desc).limit(10).show()
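(For context: the spark.sql.warehouse.dir setting here is most likely a workaround for a known Spark 2.0.0 problem on Windows, where the default relative warehouse path triggered a URI error; pointing it at an explicit file:/// location avoids that.)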