
【Spark】spark.sql.sources.partitionOverwriteMode


Reference: "spark优化之分区插入" (Spark optimization: partitioned inserts), 大怀特's blog on CSDN

 

// Dynamic mode: overwrite only the partitions that appear in the written data
import org.apache.spark.sql.SaveMode
spark.table("tv_group").write.option("partitionOverwriteMode", "dynamic")
  .partitionBy("store_id", "group_id")
  .mode(SaveMode.Overwrite).save("xxx")

// Static mode (the default): delete all matching partitions first, then insert
spark.table("tv_group").write.option("partitionOverwriteMode", "static")
  .partitionBy("store_id", "group_id")
  .mode(SaveMode.Overwrite).save("xx")

Source: https://www.cnblogs.com/144823836yj/p/16455713.html