
Spark Error Messages

Author: Internet

1. On Windows 10, creating a WordCount project in IDEA fails with a Hadoop binary error plus a NullPointerException. The cause is that Hadoop is not installed and the Hadoop environment variables are not set.

Fix: download Hadoop and configure the environment variables.
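On Windows the environment setup usually means pointing HADOOP_HOME at the unpacked Hadoop directory and adding its bin folder to PATH. A minimal sketch (cmd syntax; C:\hadoop is an example path, adjust to your install):

```
:: run in an administrator cmd prompt; C:\hadoop is a placeholder path
setx HADOOP_HOME "C:\hadoop"
setx PATH "%PATH%;%HADOOP_HOME%\bin"
:: restart IDEA afterwards so it picks up the new environment variables
```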

2. The WordCount job fails with the following exception when run on the Spark standalone cluster:

19/09/11 20:19:54 INFO spark.SparkContext: Created broadcast 0 from textFile at WordCount.scala:14
Exception in thread "main" java.lang.RuntimeException: Error in configuring object

............

Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
... 48 more
Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.

...............

The same job does not fail in yarn-cluster mode. The cause is that compression is enabled in Hadoop's core-site.xml and mapred-site.xml with LZO as the codec, so files written/uploaded to HDFS are automatically compressed as LZO, and the standalone Spark runtime cannot find the LZO codec class.

 

Fix:

In spark-env.sh:

add Hadoop's native library directory to SPARK_LIBRARY_PATH

add Hadoop's LZO jar to SPARK_CLASSPATH

export SPARK_LIBRARY_PATH=$SPARK_LIBRARY_PATH:hadoop-2.7.2/lib/native
export SPARK_CLASSPATH=$SPARK_CLASSPATH:hadoop-2.7.2/share/hadoop/common/hadoop-lzo-0.4.20.jar
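In newer Spark versions SPARK_CLASSPATH and SPARK_LIBRARY_PATH are deprecated in favor of the extraClassPath/extraLibraryPath settings, so an equivalent spark-defaults.conf sketch may be preferable (paths are examples and depend on your install layout):

```
# spark-defaults.conf; adjust paths to your Hadoop install
spark.driver.extraClassPath      /opt/hadoop-2.7.2/share/hadoop/common/hadoop-lzo-0.4.20.jar
spark.executor.extraClassPath    /opt/hadoop-2.7.2/share/hadoop/common/hadoop-lzo-0.4.20.jar
spark.driver.extraLibraryPath    /opt/hadoop-2.7.2/lib/native
spark.executor.extraLibraryPath  /opt/hadoop-2.7.2/lib/native
```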

 

Error 3:

Error:(52, 27) overloaded method value / with alternatives:
(x: Double)Double <and>
(x: Float)Double <and>
(x: Long)Double <and>
(x: Int)Double <and>
(x: Char)Double <and>
(x: Short)Double <and>
(x: Byte)Double
cannot be applied to (AnyVal)
val rate = double / d

 

Source code:
val result = ppp.map {
  case (flow, fc) =>
    val page = flow.split("->")(0)
    val d = rdd2.getOrElse(page.toLong, Double.MaxValue)
    val double = fc.toDouble
    val rate = double / d
    val formater = new DecimalFormat(".00%")
    (flow, formater.format(rate))
}

 

Cause: the map was built from values of two different types (Int and Double). The compiler infers the value type as their common supertype, so the map ends up as Map[String, AnyVal]. getOrElse then returns AnyVal, and Double's / operator has no overload accepting AnyVal. If every value is written as a Double, the compiler infers Map[String, Double] and the division compiles.

So change rdd2's data so the values are uniformly Double.
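The inference problem can be reproduced in isolation; a minimal sketch (names are illustrative, not from the original job):

```scala
// Mixed Int and Double values make the compiler infer the common
// supertype AnyVal for the map's value type:
val mixed = Map("a" -> 1, "b" -> 2.0)      // inferred as Map[String, AnyVal]
// val bad = 3.0 / mixed("b")              // does not compile: no '/' overload takes AnyVal

// Writing every value as a Double keeps the value type at Double:
val uniform = Map("a" -> 1.0, "b" -> 2.0)  // Map[String, Double]
val rate = 3.0 / uniform.getOrElse("b", Double.MaxValue)
println(rate)                              // prints 1.5
```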

Tags: java, val, Double, error messages, hadoop, lzo, spark, SPARK
Source: https://www.cnblogs.com/mqc1992/p/11508571.html