If a Spark Streaming program runs in yarn-client mode, how do you specify the log4j configuration files for the driver and the executors?
Driver
Add the --driver-java-options parameter:
spark-submit --driver-java-options "-Dlog4j.configuration=file:/data1/conf/log4j-driver.properties"
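In yarn-client mode the driver runs on the submitting machine, so the file: path above has to exist locally there. A minimal sketch of what log4j-driver.properties could contain is shown below; the log path /data1/logs/streaming-driver.log and the appender name are only placeholders, not from the original example:
# Sketch of a driver-side log4j config: write to a local rolling file on the client machine (path is a placeholder)
log4j.rootLogger=INFO, driverFile
log4j.appender.driverFile=org.apache.log4j.RollingFileAppender
log4j.appender.driverFile.append=true
log4j.appender.driverFile.file=/data1/logs/streaming-driver.log
log4j.appender.driverFile.MaxFileSize=256MB
log4j.appender.driverFile.MaxBackupIndex=20
log4j.appender.driverFile.layout=org.apache.log4j.PatternLayout
log4j.appender.driverFile.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %p [%t] %c{1}:%L - %m%n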
Executor
Since the executors run inside the YARN cluster, the configuration file first has to be shipped to them with --files:
spark-submit --files /data1/conf/log4j.properties --conf spark.executor.extraJavaOptions="-Dlog4j.configuration=log4j.properties"
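The bare file name in -Dlog4j.configuration works because --files ships log4j.properties into each executor container's working directory, so it can be resolved by name there. Putting the driver and executor settings together, a complete submission might look like the sketch below; the application class, jar, and paths are placeholders:
# Sketch of a full yarn-client submission with both driver and executor log4j configs (class/jar/paths are placeholders)
spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.StreamingApp \
  --driver-java-options "-Dlog4j.configuration=file:/data1/conf/log4j-driver.properties" \
  --files /data1/conf/log4j.properties \
  --conf spark.executor.extraJavaOptions="-Dlog4j.configuration=log4j.properties" \
  /data1/app/streaming-app.jar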
In log4j.properties, make sure the log file path uses ${spark.yarn.app.container.log.dir}, the property Spark sets to the YARN container's log directory. For example:
log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.append=true
log4j.appender.file.file=${spark.yarn.app.container.log.dir}/stdout
log4j.appender.file.MaxFileSize=256MB
log4j.appender.file.MaxBackupIndex=20
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %p [%t] %c{1}:%L - %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
Because the file is written into the container's log directory, the executor logs can then be viewed directly from the log links in the Spark Web UI.
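If YARN log aggregation is enabled, the same container logs can also be pulled from the command line once the application has finished; the application id below is a placeholder:
# Fetch the aggregated container logs (application id is a placeholder)
yarn logs -applicationId application_1400000000000_0001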
Other
If you run your own jar directly with java -cp, the log4j configuration can be added the same way; for example (the classpath and main class are placeholders):
java -cp "${APP_HOME}/lib/*" -Dlog4j.configuration=file:${APP_HOME}/conf/log4j.properties com.example.Main