java - Spark SQL does not support the Date type
I am trying to use a date field with Spark SQL, but it is not working.
I tried to add a date column, dob.
In the Person class I added a setter and getter for dob of type Date.
The error occurs when executing
SELECT dob,name,age,count(*) as totalCount FROM Person WHERE dob >= '1995-01-01' AND age <= '2014-02-01';
I also tried < and > in the query instead of <= and >=.
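For reference, the setup looks roughly like the sketch below. This is a minimal sketch assuming the Spark 1.0.x Java bean-reflection API (JavaSQLContext.applySchema / registerAsTable); the variable names and the peopleRdd input are illustrative, and only Person and dob come from the question. Submitting the job then fails as shown below.

import java.io.Serializable;
import java.util.Date;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.api.java.JavaSQLContext;
import org.apache.spark.sql.api.java.JavaSchemaRDD;

// Bean with the problematic java.util.Date field.
public class Person implements Serializable {
    private String name;
    private int age;
    private Date dob;   // java.util.Date -- schema inference has no case for this type

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
    public Date getDob() { return dob; }
    public void setDob(Date dob) { this.dob = dob; }
}

// In the driver (SimpleApp), the bean class drives schema inference:
//   JavaSparkContext sc = new JavaSparkContext("local[4]", "SimpleApp");
//   JavaSQLContext sqlCtx = new JavaSQLContext(sc);
//   JavaRDD<Person> peopleRdd = ...;   // load the data
//   JavaSchemaRDD people = sqlCtx.applySchema(peopleRdd, Person.class);
//   people.registerAsTable("Person");
//   sqlCtx.sql("SELECT dob, name, age FROM Person WHERE dob >= '1995-01-01'").collect();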
/Volumes/Official/spark-1.0.2-bin-hadoop2$: bin/spark-submit --class "SimpleApp" --master local[4] try/simple-project/target/simple-project-1.0.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
2014-08-21 11:42:47.360 java[955:1903] Unable to load realm mapping info from SCDynamicStore
=== Data source: RDD ===
Exception in thread "main" scala.MatchError: class java.util.Date (of class java.lang.Class)
Solution:
This is still pending; in the meantime, Timestamp can be used instead of Date in the Person class (a sketch follows after the issue details below). The relevant issue is
SPARK-2552
Spark SQL currently supports Timestamp, but not Date.
We will need to wait a while, until the 1.2.0 release.
Details:
> Type: Improvement
> Status: Open
> Priority: Minor
> Resolution: Unresolved
> Affects Version: 1.0.1
> Fix Version: None
> Component: SQL
> Target Version: 1.2.0
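Until Date support ships, the workaround described above is to declare dob as java.sql.Timestamp in the Person bean, which Spark SQL 1.0.x does map. A minimal sketch under the same assumptions as before; the conversion helper is illustrative, and the query literal may need the full '1995-01-01 00:00:00' form:

import java.io.Serializable;
import java.sql.Timestamp;

public class Person implements Serializable {
    private String name;
    private int age;
    private Timestamp dob;   // java.sql.Timestamp is handled by the bean schema inference

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
    public Timestamp getDob() { return dob; }
    public void setDob(Timestamp dob) { this.dob = dob; }

    // Illustrative helper: convert an existing java.util.Date when populating the bean.
    public void setDobFromDate(java.util.Date d) {
        this.dob = new Timestamp(d.getTime());
    }
}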
Tags: apache-spark, java  Source: https://codeday.me/bug/20191121/2051141.html