Spark: Task not serializable

Posted by 骑小象去远方

Job aborted due to stage failure: Task not serializable:

If you see this error:

org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: ...

The above error can be triggered when you initialize a variable on the driver (master) but then try to use it on one of the workers. In that case, Spark will try to serialize the object to send it over to the worker, and fail if the object is not serializable. Consider the following code snippet:

// Created on the driver (master) JVM.
NotSerializable notSerializable = new NotSerializable();
JavaRDD<String> rdd = sc.textFile("/tmp/myfile");

// The lambda captures notSerializable, so Spark must serialize it to
// ship the task to the workers; that serialization is what fails.
rdd.map(s -> notSerializable.doSomething(s)).collect();
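
Here NotSerializable stands in for any class that does not implement java.io.Serializable; a minimal hypothetical sketch:

public class NotSerializable {
  // No "implements Serializable", so Spark cannot serialize
  // instances of this class when they are captured by a closure.
  public String doSomething(String s) {
    return s.toUpperCase();
  }
}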

Because the map lambda captures notSerializable, running it triggers the error above. Here are some ways to fix it:

  • Make the class serializable by implementing java.io.Serializable (first sketch after this list).
  • Declare the instance only within the lambda function passed to map (second sketch after this list).
  • Make the NotSerializable object static so that it is created once per machine, i.e. once per worker JVM (third sketch after this list).
  • Call rdd.foreachPartition and create the NotSerializable object in there, like this:
rdd.foreachPartition(iter -> {
  NotSerializable notSerializable = new NotSerializable();

  // ... now process iter using the locally created instance
});
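
A minimal sketch of the first fix, assuming you control the class's source: implement java.io.Serializable so the captured instance can be shipped with the closure.

import java.io.Serializable;

public class NotSerializable implements Serializable {
  // Instances can now be serialized on the driver and recreated
  // on each worker together with the task closure.
  public String doSomething(String s) {
    return s.toUpperCase();
  }
}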
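
A sketch of the second fix: construct the object inside the lambda, so nothing non-serializable is captured from the driver. This creates one instance per record, which is fine for cheap objects but wasteful for expensive ones (prefer the foreachPartition pattern above in that case).

rdd.map(s -> {
  // Created on the worker for each record; never serialized.
  NotSerializable notSerializable = new NotSerializable();
  return notSerializable.doSomething(s);
}).collect();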
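
A sketch of the third fix, using a hypothetical holder class: static fields are never serialized with a closure, so each worker JVM lazily creates its own instance.

public class NotSerializableHolder {
  // Static state is per-JVM; it is initialized on each worker the
  // first time get() is called there, never shipped from the driver.
  private static NotSerializable instance;

  public static synchronized NotSerializable get() {
    if (instance == null) {
      instance = new NotSerializable();
    }
    return instance;
  }
}

The lambda then captures nothing but a static method reference:

rdd.map(s -> NotSerializableHolder.get().doSomething(s)).collect();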

Reference: https://databricks.gitbooks.io/databricks-spark-knowledge-base/content/troubleshooting/javaionotserializableexception.html
