How to use Spark 1.2.0 in a Play 2.2.3 project when it fails with NoSuchMethodError: akka.util.Helpers?

Posted: 2015-03-17 11:18:53

Have you ever run into this kind of problem with the Play framework? In my case, I first built everything into a single jar, spark-assembly-1.2.0-hadoop2.4.0.jar, and Spark runs perfectly from the shell. But there are two questions:

    Should I use this assembled Spark jar in the Play project at all, and if so, how? I tried moving it into the lib directory, but that did not make any of the Spark imports resolve.

    What if I define the Spark libraries as managed dependencies instead, e.g.: "org.apache.spark" %% "spark-core" % "1.2.0"?

The Play framework code:

Build.scala

val appDependencies = Seq(
        jdbc
        ,"org.apache.spark" %% "spark-streaming" % "1.2.0"
        ,"org.apache.spark" %% "spark-core" % "1.2.0"
        ,"org.apache.spark" %% "spark-sql" % "1.2.0"
)

TestEntity.scala

package models
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.hadoop.conf.Configuration
import models.SparkMain
import org.apache.spark.rdd.RDD

object TestEntity {
  // load the dictionary file as an RDD of lines and cache it
  val TestEntityPath = "/home/t/PROD/dict/TestEntity .txt"
  val TestEntitySpark = SparkMain.sc.textFile(TestEntityPath, 4).cache
  val TestEntityData = TestEntitySpark.flatMap(_.split(","))

  // return the first five comma-separated tokens
  def getFive(): Seq[String] = {
    println("TestEntity.getFive")
    TestEntityData.take(5)
  }
}

SparkMain.scala

package models
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.hadoop.conf.Configuration
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{Seconds, StreamingContext}
import StreamingContext._
import org.apache.spark.sql.SQLContext
import org.apache.spark.SparkConf

object SparkMain {
  val driverPort = 8080
  val driverHost = "localhost"

  val conf = new SparkConf(false) // skip loading external settings
    .setMaster("local[4]")        // run locally with enough threads
    .setAppName("firstSparkApp")
    .set("spark.logConf", "true")
    .set("spark.driver.port", s"$driverPort")
    .set("spark.driver.host", s"$driverHost")
    .set("spark.akka.logLifecycleEvents", "true")

  val sc = new SparkContext(conf)
}

And the controller code that uses the Spark objects:

def test = Action { implicit req =>
  val chk = TestEntity.getFive
  Ok("it works")
}

..which fails at runtime with this error:

[info] o.a.s.SparkContext - Spark configuration:
spark.akka.logLifecycleEvents=true
spark.app.name=firstSparkApp
spark.driver.host=localhost
spark.driver.port=8080
spark.logConf=true
spark.master=local[4]
[warn] o.a.s.u.Utils - Your hostname, uisprk resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface eth0)
[warn] o.a.s.u.Utils - Set SPARK_LOCAL_IP if you need to bind to another address
[info] o.a.s.SecurityManager - Changing view acls to: t
[info] o.a.s.SecurityManager - Changing modify acls to: t
[info] o.a.s.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(t); users with modify permissions: Set(t)
[error] application -

! @6l039e8d5 - Internal server error, for (GET) [/ui] ->

play.api.Application$$anon$1: Execution exception[[RuntimeException: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;]]
        at play.api.Application$class.handleError(Application.scala:293) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.api.DefaultApplication.handleError(Application.scala:399) [play_2.10-2.2.3.jar:2.2.3]
        at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$13$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:166) [play_2.10-2.2.3.jar:2.2.3]
        at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$13$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:163) [play_2.10-2.2.3.jar:2.2.3]
        at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33) [scala-library-2.10.4.jar:na]
        at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185) [scala-library-2.10.4.jar:na]
Caused by: java.lang.RuntimeException: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;
        at play.api.mvc.ActionBuilder$$anon$1.apply(Action.scala:314) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:109) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:109) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.utils.Threads$.withContextClassLoader(Threads.scala:18) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:108) ~[play_2.10-2.2.3.jar:2.2.3]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:107) ~[play_2.10-2.2.3.jar:2.2.3]
Caused by: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;
        at akka.remote.RemoteSettings.<init>(RemoteSettings.scala:48) ~[akka-remote_2.10-2.3.4-spark.jar:na]
        at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:114) ~[akka-remote_2.10-2.3.4-spark.jar:na]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.7.0_72]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) ~[na:1.7.0_72]
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.7.0_72]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526) ~[na:1.7.0_72]

How should the library be wired in: as a managed dependency or as the assembled jar? Please advise.

Comments:

Looks like an incompatible version of the com.typesafe:config dependency: Execution exception[[RuntimeException: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;]]. Check your build for version conflicts.

Is there a reason you are not using version 2.3.7? I have tried the sample with that version and it works fine.

I also found that Akka 2.3.9 makes this error go away.

Answer 1:

A NoSuchMethodError exception is, 100% of the time, caused by a mismatch between the jar versions used at compile time and at runtime.

Check the versions of your jars. I also have some questions about your application architecture.

Instead of calling the Spark code from the Play framework, you could invoke spark-submit from a shell script, which looks like a better fit in your case. You could even do that from within your Play application, so there is no need to include the jar on the Play application's classpath; see the sketch below.
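For illustration, here is a minimal sketch of shelling out to spark-submit from Scala using scala.sys.process. The paths, job class, and jar name are hypothetical and would need to match your own installation and build output:

import scala.sys.process._

object SparkJobLauncher {
  // hypothetical locations; adjust to your Spark installation and packaged job jar
  val sparkSubmit = "/opt/spark-1.2.0/bin/spark-submit"
  val appJar = "target/scala-2.10/my-spark-job.jar"

  // build the spark-submit command line and run it,
  // returning the process exit code (0 on success)
  def runJob(): Int =
    Seq(sparkSubmit,
        "--class", "jobs.TestEntityJob",
        "--master", "local[4]",
        appJar).!
}

This keeps Spark and its Akka dependencies in a separate JVM, so they never clash with Play's classpath at all.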

Comments:

I described the problem; of course it is a JAR issue.

Are you suggesting using Spark only from the shell? No, thanks!

Answer 2:

The problem with this configuration is the Akka dependency shared by Apache Spark and the Play Framework: both depend on Akka and, as you have seen, the different and incompatible versions have to be resolved at build time, which you can investigate with the evicted command in sbt.

You may also want to run the update command; the reports under target/resolution-cache/reports are quite useful for tracking down the conflicting artifacts.
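As an illustrative sketch only (not a confirmed fix): once the conflict is identified, one common workaround is to exclude the Akka artifacts that Spark bundles (the stack trace shows they come from org.spark-project.akka) and pin a single Akka version in Build.scala. A commenter above reports that Akka 2.3.9 made the error go away:

val appDependencies = Seq(
  jdbc,
  // keep Spark's bundled Akka off the classpath so only one akka-actor is resolved
  "org.apache.spark" %% "spark-core" % "1.2.0"
    exclude("org.spark-project.akka", "akka-actor_2.10")
    exclude("org.spark-project.akka", "akka-remote_2.10"),
  // pin one Akka version explicitly; 2.3.9 is the version reported to work above
  "com.typesafe.akka" %% "akka-actor" % "2.3.9",
  "com.typesafe.akka" %% "akka-remote" % "2.3.9"
)

Whether Play 2.2.3 itself tolerates that Akka version is something the evicted and update reports would have to confirm; the other commenter's suggestion of upgrading to Play 2.3.7 sidesteps the question entirely.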

Comments:
