Garbled Chinese characters in Hive on Spark when running GROUP BY queries
This problem is usually caused by columns of type varchar; the same query returns readable text as long as no GROUP BY is involved. A workaround is to cast the varchar columns to string before grouping, along the lines of the following query:
with temp as (
  select
    -- cast varchar columns to string so the GROUP BY result keeps Chinese text intact
    cast(primary_channel as string) as primary_channel,
    cast(channel_keywords as string) as channel_keywords,
    cast(channel_code as string) as channel_code,
    cast(channel_name as string) as channel_name
  from ddi.dim_fill_strategic_channel_level_attribution_df_test
)
select
  primary_channel,
  channel_keywords,
  channel_code,
  channel_name
from temp
group by
  primary_channel,
  channel_keywords,
  channel_code,
  channel_name
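If you control the table schema, another option is to change the affected columns from varchar to string once, so the ad-hoc casts are no longer needed. This is only a sketch, assuming the columns really are varchar and the underlying data files are stored correctly; in Hive, ALTER TABLE ... CHANGE COLUMN only updates metadata and does not rewrite existing data:

-- Hypothetical one-time schema change, reusing the table and column names from the query above.
alter table ddi.dim_fill_strategic_channel_level_attribution_df_test
  change column channel_name channel_name string;

Repeat the statement for each varchar column; afterwards the GROUP BY query should return readable Chinese without explicit casts.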
Source: https://blog.csdn.net/smsmtiger/article/details/122058903