Using Spark SQL with Apache Hudi
Author: Internet (repost)
1. Add the required jar packages to Spark's jars directory
https://repo1.maven.org/maven2/org/apache/hudi/hudi-spark3.1.2-bundle_2.12/0.10.1/hudi-spark3.1.2-bundle_2.12-0.10.1.jar
https://repo1.maven.org/maven2/org/apache/spark/spark-avro_2.12/3.1.2/spark-avro_2.12-3.1.2.jar
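The two bundles above can be fetched straight into the jars directory. A minimal sketch, assuming `SPARK_HOME` points at the Spark installation and `wget` is available (the URLs and versions are the ones listed above):

```shell
# Download the Hudi Spark bundle and spark-avro into Spark's jars directory.
# Assumes SPARK_HOME is set; versions must match your Spark build (3.1.2 / Scala 2.12 here).
cd "$SPARK_HOME/jars"
wget https://repo1.maven.org/maven2/org/apache/hudi/hudi-spark3.1.2-bundle_2.12/0.10.1/hudi-spark3.1.2-bundle_2.12-0.10.1.jar
wget https://repo1.maven.org/maven2/org/apache/spark/spark-avro_2.12/3.1.2/spark-avro_2.12-3.1.2.jar
```

Note that the Hudi bundle is built against a specific Spark and Scala version; mixing versions is a common cause of startup failures.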
2. Restart the Spark cluster
./stop-all.sh
./start-all.sh
3. Launch the Spark SQL client
./spark-sql \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
4. Create a table
-- create a COW (copy-on-write) table
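The comment above stops short of the DDL itself. A minimal sketch of a copy-on-write table, following Hudi 0.10's Spark SQL syntax (the table name `hudi_cow_tbl` and its columns are illustrative):

```sql
-- Minimal COW table; type = 'cow' selects copy-on-write storage,
-- primaryKey identifies the record key, preCombineField breaks ties on upsert.
create table hudi_cow_tbl (
  id int,
  name string,
  price double,
  ts bigint
) using hudi
tblproperties (
  type = 'cow',
  primaryKey = 'id',
  preCombineField = 'ts'
);
```

After creation, rows can be written with a plain `insert into hudi_cow_tbl values (...)` from the same spark-sql session.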
Source: https://blog.csdn.net/u011095039/article/details/123092717