Spark error messages
1. On Windows 10, creating the WordCount project in IDEA fails with a "Hadoop binaries" error plus a NullPointerException. The cause is that Hadoop is not installed locally and the Hadoop environment variable (HADOOP_HOME) is not set.
Fix: download Hadoop (on Windows this includes the winutils binaries) and configure the environment variable.
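If changing the system environment variable is inconvenient, a common workaround for local runs is to point hadoop.home.dir at the unpacked Hadoop/winutils directory before the SparkContext is created. A minimal sketch; the path C:\hadoop-2.7.2 and the input file name are assumptions, not from the original project:

import org.apache.spark.{SparkConf, SparkContext}

object WordCountLocal {
  def main(args: Array[String]): Unit = {
    // Assumed local path: tells Hadoop's shell utilities where winutils lives,
    // avoiding the "Hadoop binaries" NullPointerException on Windows.
    System.setProperty("hadoop.home.dir", "C:\\hadoop-2.7.2")

    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc = new SparkContext(conf)
    sc.textFile("input.txt")            // hypothetical input file
      .flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _)
      .collect()
      .foreach(println)
    sc.stop()
  }
}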
2. Running the WordCount job on the Spark standalone cluster fails with
19/09/11 20:19:54 INFO spark.SparkContext: Created broadcast 0 from textFile at WordCount.scala:14
Exception in thread "main" java.lang.RuntimeException: Error in configuring object
............
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
... 48 more
Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
...............
while the same job runs without error on the YARN cluster. The reason is that compression is enabled in Hadoop's core-site.xml and mapred-site.xml with LZO as the codec, so files written/uploaded to HDFS are automatically compressed as LZO, while Spark's own classpath does not contain the LZO codec.
Fix: in spark-env.sh,
add Hadoop's native library directory to SPARK_LIBRARY_PATH
and the hadoop-lzo jar to SPARK_CLASSPATH:
export SPARK_LIBRARY_PATH=$SPARK_LIBRARY_PATH:hadoop-2.7.2/lib/native
export SPARK_CLASSPATH=$SPARK_CLASSPATH:hadoop-2.7.2/share/hadoop/common/hadoop-lzo-0.4.20.jar
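To confirm that the jar really is visible before resubmitting the job, the codec class named in the exception can be loaded from a tiny Scala program run with the same classpath. A minimal sketch; only the class name comes from the error above:

object LzoCodecCheck {
  def main(args: Array[String]): Unit = {
    // Try to load the codec class that the exception reports as missing.
    try {
      Class.forName("com.hadoop.compression.lzo.LzoCodec")
      println("LzoCodec found on the classpath")
    } catch {
      case _: ClassNotFoundException =>
        println("LzoCodec not found: check the hadoop-lzo jar on SPARK_CLASSPATH")
    }
  }
}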
3. Compilation fails with
Error:(52, 27) overloaded method value / with alternatives:
(x: Double)Double <and>
(x: Float)Double <and>
(x: Long)Double <and>
(x: Int)Double <and>
(x: Char)Double <and>
(x: Short)Double <and>
(x: Byte)Double
cannot be applied to (AnyVal)
val rate = double / d
Source code:
val result = ppp.map {
  case (flow, fc) =>
    val page = flow.split("->")(0)
    val d = rdd2.getOrElse(page.toLong, Double.MaxValue)
    val double = fc.toDouble
    val rate = double / d
    val formater = new DecimalFormat(".00%")
    (flow, formater.format(rate))
}
Cause: rdd2 was built from an array of tuples that mixes two different types, map[String, Int] and map1[String, Double]. The compiler infers the common supertype of the values, so the combined collection gets the inferred type map1[String, AnyVal]. getOrElse therefore returns AnyVal, which has no / method, and that is exactly the overloaded-method error above. Once every value is written as a Double, the compiler can infer map2[String, Double] instead.
Fix: change rdd2's data to (String, Double).
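A minimal, self-contained sketch of the inference problem and the fix; the names and values are hypothetical, not from the original job:

object AnyValDivisionDemo {
  def main(args: Array[String]): Unit = {
    // Mixing Int and Double values makes the compiler widen the value type to AnyVal,
    // and AnyVal has no / method, which reproduces the overload error above:
    // val mixed = Map("a" -> 1, "b" -> 2.0)                   // Map[String, AnyVal]
    // val bad = 10.0 / mixed.getOrElse("a", Double.MaxValue)  // does not compile

    // Declaring every value as Double lets the compiler infer Map[String, Double]:
    val counts: Map[String, Double] = Map("a" -> 1.0, "b" -> 2.0)
    val d = counts.getOrElse("a", Double.MaxValue)             // d: Double
    println(10.0 / d)                                          // compiles, prints 10.0
  }
}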