
Notes 5


#!/bin/sh

# Resolve the project root (the parent of this script's bin/ directory)
home=$(cd "$(dirname "$0")/.."; pwd)
# Shared settings; expected to define ${cluster}, ${day}, and the fsimage_* path variables used below
. ${home}/bin/common.sh
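The sourced common.sh is not shown in the post. A minimal sketch of the variables the rest of the script relies on might look like the following; the variable names are taken from their use below, but every value and path here is a hypothetical placeholder, not from the original:

```shell
#!/bin/sh
# Hypothetical common.sh -- names match this script's usage; values are assumptions.
cluster="prod-hdfs"                        # cluster identifier embedded in fsimage file names
day=$(date +%Y%m%d)                        # partition day, e.g. 20210401
fsimage_binary_path="/data/fsimage/binary" # local dir where fetched binary fsimages land
fsimage_txt_path="/data/fsimage/txt"       # local dir for the delimited text dumps
fsimage_org_hdfs_path="/warehouse/xunshan/dwd_fsimage_org/day=${day}/cluster=${cluster}"
```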

# Raise the client JVM heap (MB): the Offline Image Viewer loads large fsimages into memory
export HADOOP_HEAPSIZE=20000

# Locate the fsimage snapshot for this cluster and day
fsimage_binary_name=$(ls "${fsimage_binary_path}" | grep "${cluster}" | grep "${day}")

fsimage_binary_file=${fsimage_binary_path}/${fsimage_binary_name}

fsimage_txt_name=${fsimage_binary_name}.txt
fsimage_txt_file=${fsimage_txt_path}/${fsimage_txt_name}

# Convert the binary fsimage to tab-delimited text with the Offline Image Viewer
hdfs oiv -p Delimited -i ${fsimage_binary_file} -o ${fsimage_txt_file}
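The Delimited processor emits one tab-separated record per inode; the exact column set varies by Hadoop version, but it typically covers path, replication, timestamps, block size and count, file size, quotas, permission, user, and group. As an illustration of working with such a dump (the sample records and the column order here are assumptions, not output from the original cluster), per-user totals can be computed with awk:

```shell
dump_sample() {
  # Emit sample records in the assumed Delimited column order:
  # Path, Replication, ModTime, AccessTime, BlockSize, BlocksCount,
  # FileSize, NSQUOTA, DSQUOTA, Permission, UserName, GroupName
  printf '/user/alice/a.log\t3\t2021-04-01 10:00\t2021-04-01 10:00\t134217728\t1\t1024\t-1\t-1\t-rw-r--r--\talice\thadoop\n'
  printf '/user/bob/b.log\t3\t2021-04-01 11:00\t2021-04-01 11:00\t134217728\t1\t2048\t-1\t-1\t-rw-r--r--\tbob\thadoop\n'
  printf '/user/alice/c.log\t3\t2021-04-01 12:00\t2021-04-01 12:00\t134217728\t1\t512\t-1\t-1\t-rw-r--r--\talice\thadoop\n'
}

# Total file bytes per owner: column 7 is FileSize, column 11 is UserName
dump_sample | awk -F'\t' '{sum[$11] += $7} END {for (u in sum) print u, sum[u]}'
```

This same shape is what the Hive table below ends up querying once the dump is registered as a partition.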

# Replace any previous copy of this day's dump on HDFS (-f keeps the rm from failing on first run)
hdfs dfs -mkdir -p ${fsimage_org_hdfs_path}
hdfs dfs -rm -f ${fsimage_org_hdfs_path}/${fsimage_txt_name}
hdfs dfs -put ${fsimage_txt_file} ${fsimage_org_hdfs_path}

echo -e "==========> load to hive start ==========>"
# Drop and re-add the partition so Hive picks up the fresh dump at the same location
hive -e "ALTER TABLE xunshan.dwd_fsimage_org DROP IF EXISTS PARTITION(day=${day},cluster='${cluster}');"
hive -e "ALTER TABLE xunshan.dwd_fsimage_org ADD PARTITION(day=${day},cluster='${cluster}') LOCATION '${fsimage_org_hdfs_path}';"
echo -e "==========> load to hive end ==========>"
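For the ADD PARTITION above to work, xunshan.dwd_fsimage_org must already exist as a table partitioned by (day, cluster) whose row format matches the tab-delimited dump. The post does not show its DDL; a hypothetical sketch, assuming the column order of the Delimited processor, could be:

```sql
-- Hypothetical DDL, not from the original post; column names and types
-- are assumptions mirroring a typical oiv Delimited dump.
CREATE EXTERNAL TABLE IF NOT EXISTS xunshan.dwd_fsimage_org (
    path                 STRING,
    replication          INT,
    modification_time    STRING,
    access_time          STRING,
    preferred_block_size BIGINT,
    blocks_count         INT,
    file_size            BIGINT,
    ns_quota             BIGINT,
    ds_quota             BIGINT,
    permission           STRING,
    user_name            STRING,
    group_name           STRING
)
PARTITIONED BY (day BIGINT, cluster STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE;
```

Declaring the table EXTERNAL means dropping a partition only deregisters it, which fits the script's drop-then-re-add pattern over a fixed LOCATION.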


Source: https://www.cnblogs.com/hackerer/p/14608699.html