
Spark Packaging and Kerberos-Related Commands

Author: 互联网

Building the release package

mvn clean package -pl project -am -Pcdp -DskipTests=true   # -pl added: Maven's -am (also-make) requires a module list
After the build succeeds, upload the packaged jar to the appropriate path.
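After the Maven build, the produced jar has to be located and uploaded. A minimal helper sketch; the `target/` layout, the single-artifact assumption, and the upload host/path are mine, not from the original post:

```shell
# Hypothetical helper: echo the first jar found under the Maven output
# directory (default: target/). Adjust the glob if the module produces
# several artifacts.
latest_jar() {
  local dir="${1:-target}"
  ls "$dir"/*.jar 2>/dev/null | head -n 1
}

# Example upload step (host and destination path are placeholders):
# scp "$(latest_jar)" user@edge-node:/opt/jobs/
```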

Submitting the job and configuring the Kerberos files

spark-submit \
--master yarn \
--deploy-mode  cluster \
--driver-memory 4g  --num-executors 4 --executor-memory 8g   --executor-cores 4 \
--keytab /keytab/hdfs.keytab \
--principal hdfs@SQ.CN \
--jars ./common-2.2.jar,tools-2.2.jar,kudu-spark2_2.11-1.15.0.7.1.7.0-551.jar \
--class com.bigdata.utils.MainUtils ./engine-test-2.2.jar
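Before running spark-submit with --keytab/--principal, it is worth verifying the keytab locally. A hedged sketch using the standard MIT Kerberos tools (`klist -kt` lists the principals stored in a keytab; the helper function name is mine):

```shell
# Hypothetical pre-flight check: fail fast if the keytab is missing or
# unreadable, otherwise list the principals it contains.
check_keytab() {
  local kt="$1"
  if [ ! -r "$kt" ]; then
    echo "keytab not readable: $kt" >&2
    return 1
  fi
  klist -kt "$kt"   # standard MIT Kerberos tool; lists keytab entries
}

# Usage (keytab path and principal taken from the post):
# check_keytab /keytab/hdfs.keytab && kinit -kt /keytab/hdfs.keytab hdfs@SQ.CN
```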

# The Flink part still needs to be clarified in a follow-up step
 ./bin/submit.sh ./config-debe.properties testtask-1.1.0.jar 
flink run \
    -m yarn-cluster \
    -yqu default \
    -yjm 2048 -ytm 4096 \
    -yD taskmanager.memory.managed.size=128m \
    -yD flink.hadoop.hadoop.kerberos.keytab.login.autorenewal.enabled=true \
    -yD security.kerberos.login.keytab=$DEPLOY_DIR/kerberos/hdfs.keytab \
    -yD security.kerberos.login.principal=hdfs@SQ.CN \
    -yD java.security.auth.login.config=$DEPLOY_DIR/jaas.conf $JAR_PATH --config "$1"
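The Flink command above references several files under $DEPLOY_DIR. A small sanity-check sketch, assuming the directory layout shown in the command (the helper name is hypothetical):

```shell
# Hypothetical sanity check for the Flink submission: make sure the
# keytab and JAAS config referenced by the -yD options actually exist
# under the deploy directory before calling flink run.
check_flink_security_files() {
  local deploy_dir="$1"
  local f
  for f in "$deploy_dir/kerberos/hdfs.keytab" "$deploy_dir/jaas.conf"; do
    if [ ! -f "$f" ]; then
      echo "missing: $f" >&2
      return 1
    fi
  done
}

# Usage:
# check_flink_security_files "$DEPLOY_DIR" || exit 1
```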


Source: https://www.cnblogs.com/hbym/p/16152615.html