[Hive Study Notes 6] Hive Lateral View, Views & Indexes
Environment
VM: VMware 10
Linux: CentOS-6.5-x86_64
Client: Xshell 4
FTP: Xftp 4
JDK 8
hadoop-3.1.1
apache-hive-3.1.1
1. Hive Lateral View
Lateral View is used together with UDTF functions such as explode (often paired with split, which turns a delimited string into an array for explode to consume).
The UDTF first expands a single row into multiple rows, and Lateral View then joins those rows back into a virtual table that can be given an alias.
It mainly works around the limitation that a SELECT containing a UDTF cannot also reference other columns, nor use more than one UDTF.
Syntax:
LATERAL VIEW udtf(expression) tableAlias AS columnAlias (',' columnAlias)*
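As an illustration, here is a minimal sketch against the psn2 table queried below (where likes is an array<string>): a bare explode() cannot be mixed with ordinary columns in the same SELECT, while the LATERAL VIEW form keeps each exploded value joined to its source row.

-- Rejected by Hive: a UDTF cannot appear alongside other select expressions
-- select id, explode(likes) from psn2;

-- Works: the lateral view joins every exploded hobby back to its row
select id, name, myCol1 AS hobby
from psn2
LATERAL VIEW explode(likes) myTable1 AS myCol1;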
Example: how many distinct hobbies and how many distinct cities appear in the person table?
hive> select * from psn2;
OK
psn2.id psn2.name       psn2.likes      psn2.address    psn2.age
1       小明1   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  10
2       小明2   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  10
3       小明3   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  10
4       小明4   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  10
5       小明5   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  10
6       小明6   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  10
1       小明1   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  20
2       小明2   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  20
3       小明3   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  20
4       小明4   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  20
5       小明5   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  20
6       小明6   ["lol","book","movie"]  {"beijing":"shangxuetang","shanghai":"pudong"}  20
Time taken: 0.138 seconds, Fetched: 12 row(s)
hive> select explode(likes) from psn2;
OK
col
lol
book
movie
lol
book
movie
lol
book
movie
lol
book
movie
lol
book
movie
lol
book
movie
lol
book
movie
lol
book
movie
lol
book
movie
lol
book
movie
lol
book
movie
lol
book
movie
Time taken: 0.294 seconds, Fetched: 36 row(s)
hive> select count(distinct(myCol1)), count(distinct(myCol2)) from psn2
    > LATERAL VIEW explode(likes) myTable1 AS myCol1
    > LATERAL VIEW explode(address) myTable2 AS myCol2, myCol3;
Query ID = root_20190216171853_af297af9-dcc6-4e1e-8674-fa0969727b23
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Starting Job = job_1548397153910_0012, Tracking URL = http://PCS102:8088/proxy/application_1548397153910_0012/
Kill Command = /usr/local/hadoop-3.1.1/bin/mapred job -kill job_1548397153910_0012
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
2019-02-16 17:19:00,480 Stage-1 map = 0%, reduce = 0%
2019-02-16 17:19:04,582 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 4.08 sec
2019-02-16 17:19:09,693 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 7.24 sec
MapReduce Total cumulative CPU time: 7 seconds 240 msec
Ended Job = job_1548397153910_0012
MapReduce Jobs Launched:
Stage-Stage-1: Map: 1  Reduce: 1   Cumulative CPU: 7.24 sec   HDFS Read: 15860 HDFS Write: 103 SUCCESS
Total MapReduce CPU Time Spent: 7 seconds 240 msec
OK
_c0     _c1
3       2
Time taken: 16.894 seconds, Fetched: 1 row(s)
hive>
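Each source row here has 3 likes and 2 address entries, so the two LATERAL VIEW clauses expand every row into 3 × 2 = 6 combinations (72 intermediate rows from the 12 source rows); count(distinct ...) then collapses these to the 3 hobbies and 2 cities. As a small follow-up sketch (not part of the original session), the values can be listed instead of counted:

select distinct myCol1 AS hobby
from psn2
LATERAL VIEW explode(likes) myTable1 AS myCol1;

select distinct myCol2 AS city
from psn2
LATERAL VIEW explode(address) myTable2 AS myCol2, myCol3;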