On Scala project structure (using sbt)

Posted by sayhihi


scala_project:
Typical directory layout:
  |lib: manually added dependency jars
  |project
  |  |build.properties: pins the sbt version; optional, the launcher downloads sbt automatically
  |  |plugins.sbt: plugins to add, including sbt plugins (addSbtPlugin); without them commands such as sbt gen-idea and assembly fail (can't find key gen-idea in sbt)
  |  |build.scala: plays the same role as build.sbt; declares the project and sets its name, version, dependencies (via a Dependencies file), and so on through settings
  |src
  |  |main
  |    |resources: the project's config files; an sbt project reads config from this directory by default (ConfigFactory.load() or ConfigFactory.parseFile() yields a Config; see the sketch below)
  |    |scala
  |    |java
  |  |test
  |    |resources
  |    |scala
  |    |java
  |target: the generated project jar and the IDEA project XML
  |build.sbt: the project's name, version, scalaVersion, libraryDependencies, etc.


Note: only one of build.sbt and build.scala is needed; pick either.
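The resources entry above mentions Typesafe Config. As a minimal sketch of that pattern (application.conf is the library's default file name; the app.name key and its value are assumptions for illustration, not from the original post):

import com.typesafe.config.{Config, ConfigFactory}

object ConfigDemo extends App {
  // ConfigFactory.load() reads application.conf from src/main/resources on the classpath
  val config: Config = ConfigFactory.load()

  // hypothetical key; assumes application.conf contains: app { name = "config_read" }
  println(config.getString("app.name"))
}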

1. Setting up a project with build.sbt:

A. build.sbt (settings separated by blank lines)

name := "config_read"

version := "1.1"

scalaVersion := "2.10.1"

libraryDependencies += "com.typesafe" % "config" % "1.2.1"  // third-party dependency

libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.7"  // third-party dependency

B. plugins.sbt (lines separated by blank lines)

// blank lines are required between entries; the plugins below are needed for sbt project generation, building, and packaging
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

addSbtPlugin("de.johoop" % "jacoco4sbt" % "2.1.5")

C. build.properties (pins the sbt version)

sbt.version=0.13.7
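With these three files in place, the usual workflow (assuming the sbt 0.13 launcher is installed) is: run sbt gen-idea to generate the IntelliJ IDEA project files, and sbt assembly to build a fat jar under target/.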


2. Using build.scala

Directory layout (several sub-projects under one build):

|project
|  |BuildProject.scala
|  |Dependencies.scala
|  |plugins.sbt
|  |build.properties
|  |Common.scala
|rdd_proj
|  |date_goodsstatistic
|  |  |src
|  |    |main
|  |      |resources
|  |      |scala
|  |      |java
|  |date_salaryincharge
|  |  |src
|  |    |main
|  |      |resources
|  |      |scala
|  |      |java

A. BuildProject.scala (a Scala object; depends on Dependencies)

import sbt.Keys._
import sbt._
import sbtassembly.AssemblyKeys._

// note: assemblyJarName and the plain sbtassembly.AssemblyKeys import correspond to
// sbt-assembly 0.12+; with the 0.11.x plugin shown earlier the key was jarName in assembly
object BuildProject extends Build {

  // sub-project rooted at rdd_proj/date_goodsstatistic
  lazy val goodsstatistic = project.in(file("rdd_proj/date_goodsstatistic")).
    settings(name := "date_goodsstatistic").
    settings(Common.settings: _*).
    settings(libraryDependencies ++= Dependencies.sparkCounterDependencies).
    settings(assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)).  // keep the Scala library out of the fat jar
    settings(assemblyJarName in assembly := "date_goodsstatistic_1.2.1.jar")

  // sub-project rooted at rdd_proj/date_salaryincharge
  lazy val salaryincharge = project.in(file("rdd_proj/date_salaryincharge")).
    settings(name := "date_salaryincharge").
    settings(Common.settings: _*).
    settings(libraryDependencies ++= Dependencies.sparkCounterDependencies).
    settings(assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)).
    settings(assemblyJarName in assembly := "date_salaryincharge_1.3.1.jar")
}
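Each sub-project can then be packaged on its own, e.g. sbt goodsstatistic/assembly (or sbt "project goodsstatistic" assembly), which writes date_goodsstatistic_1.2.1.jar under that sub-project's target directory.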

B. Dependencies.scala (a Scala object)

import sbt._

object Dependencies {

  // "provided,test" means the runtime environment already ships these jars:
  // they are downloaded for local tests but excluded when the assembly jar is packaged
  val sparkDependencies: Seq[ModuleID] = Seq(
    "org.apache.spark" % "spark-core_2.11" % "2.0.0" % "provided,test",
    "org.apache.hadoop" % "hadoop-common" % "2.2.0" % "provided,test",  // group fixed from the original post's coordinate; pick the version matching your cluster
    "org.apache.spark" % "spark-hive_2.11" % "2.0.2" % "provided,test")

  val featureDependencies: Seq[ModuleID] = Seq(
    "org.scalatest" %% "scalatest" % "2.2.4" % "test",
    "mysql" % "mysql-connector-java" % "5.1.31" % "test")

  val sparkCounterDependencies: Seq[ModuleID] = sparkDependencies ++ featureDependencies
}

C. plugins.sbt (same as in section 1)

D. build.properties (same as in section 1)

E. Common.scala (sets scalaVersion and other shared settings)
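The post never shows Common.scala itself; as a minimal sketch of what Common.settings might contain (the organization and version values are placeholders, not from the original):

import sbt._
import sbt.Keys._

object Common {

  // shared settings, applied to every sub-project via settings(Common.settings: _*)
  val settings: Seq[Def.Setting[_]] = Seq(
    organization := "com.example",  // placeholder
    version := "1.0",               // placeholder
    scalaVersion := "2.11.8",       // should match the _2.11 artifacts in Dependencies
    scalacOptions ++= Seq("-deprecation", "-feature")
  )
}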

