Spark Monitoring
I. Configuration for the history page (event log) monitoring:
$ vi spark-defaults.conf
spark.eventLog.enabled true
spark.eventLog.dir hdfs://hadoop000:8020/g6_directory
$ vi spark-env.sh
SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=hdfs://hadoop000:8020/g6_directory"
# Create the event log directory on HDFS
$ hadoop fs -mkdir hdfs://hadoop000:8020/g6_directory
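Before the history page can show anything, at least one application has to run with these settings and write its event log to the directory above. A minimal sketch of generating one, assuming the SparkPi example jar bundled with this Spark distribution (the exact jar file name may differ, hence the wildcard):
# Confirm the event log directory exists on HDFS
$ hadoop fs -ls hdfs://hadoop000:8020/ | grep g6_directory
# Run a short local job; its events are written to spark.eventLog.dir
$ ./bin/spark-submit --master local[2] \
    --class org.apache.spark.examples.SparkPi \
    examples/jars/spark-examples_*.jar 10
# The finished application's event log should now be visible
$ hadoop fs -ls hdfs://hadoop000:8020/g6_directory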
II. Start the HistoryServer daemon
$ ./sbin/start-history-server.sh
starting org.apache.spark.deploy.history.HistoryServer, logging to /home/hadoop/apps/spark-2.4.2-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.history.HistoryServer-1-hadoop000.out
Check the startup status from the log:
$ tail -200f /home/hadoop/apps/spark-2.4.2-bin-2.6.0-cdh5.7.0/logs/spark-hadoop-org.apache.spark.deploy.history.HistoryServer-1-hadoop000.out
19/06/19 19:38:10 INFO Utils: Successfully started service on port 18080.
19/06/19 19:38:11 INFO HistoryServer: Bound HistoryServer to 0.0.0.0, and started at http://hadoop000:18080
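Optionally, before opening the browser, a quick sanity check that the server is actually listening on port 18080 (this step is not part of the original setup, just a convenience):
# Prints the HTTP status code; 200 means the HistoryServer is up
$ curl -s -o /dev/null -w "%{http_code}\n" http://hadoop000:18080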
III. View the monitoring page
Open http://hadoop000:18080 in a browser; every completed application whose event log was written to the configured directory is listed there.
IV. Using the REST API
1. Basic usage — list all applications:
http://hadoop000:18080/api/v1/applications 
2. Going further — the jobs of a specific application (a command-line sketch of both calls follows at the end of this section):
http://hadoop000:18080/api/v1/applications/local-1560944971495/jobs 
3. For the full list of REST endpoints, see the official documentation:
http://spark.apache.org/docs/2.2.0/monitoring.html#rest-api
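A minimal command-line sketch of the two calls above; the application ID local-1560944971495 is the one from this example, so substitute an ID returned by the first call:
# List all applications known to the history server
$ curl -s http://hadoop000:18080/api/v1/applications | python -m json.tool
# List the jobs of a single application, by its ID
$ curl -s http://hadoop000:18080/api/v1/applications/local-1560944971495/jobs | python -m json.tool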