At work we often have to deal with log4j output. For a background process, logs are usually the only way to diagnose problems in the program. Logs fall into system logs and business logs, and a typical log4j configuration is log4j.rootLogger=INFO. So how do we pull the information we actually need out of a huge volume of log output?
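For reference, the kind of log4j 1.x setup assumed throughout this post looks roughly like the following. This is a minimal sketch: the appender name, file path, date pattern and layout are illustrative assumptions, and the scripts below assume daily files named data.yyyy-mm-dd.log, which depends on how rolling is configured in the real project.
log4j.rootLogger=INFO, file
# daily rolling file appender; the path and naming here are assumptions for illustration
log4j.appender.file=org.apache.log4j.DailyRollingFileAppender
log4j.appender.file.File=/home/gzdacp/logs/data.log
log4j.appender.file.DatePattern='.'yyyy-MM-dd
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c - %m%n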
》Write error logs to a separate file
Logs have levels and a fixed format, so error lines can usually be picked out with simple filtering. Take the following output as an example:
Java HotSpot(TM) 64-Bit Server VM warning: UseCMSCompactAtFullCollection is deprecated and will likely be removed in a future release
Java HotSpot(TM) 64-Bit Server VM warning: CMSFullGCsBeforeCompaction is deprecated and will likely be removed in a future release.
Java HotSpot(TM) 64-Bit Server VM warning: UseCMSCompactAtFullCollection is deprecated and will likely be removed in a future release.
Java HotSpot(TM) 64-Bit Server VM warning: CMSFullGCsBeforeCompaction is deprecated and will likely be removed in a future release.
Logging initialized using configuration in jar:file:/home/gzdacp/hive-ocdp3.5.1/lib/hive-common-1.1.0.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/gzdacp/hadoop-ocdp3.5.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/gzdacp/hive-ocdp3.5.1/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:472)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:671)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
The goal is to capture error lines such as:
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
To do this, write the following script on the server (for example in a SecureCRT session) and save it as logerror.sh:
#!/bin/bash
# drop the known noise (JVM warnings, SLF4J binding messages) from yesterday's log
# and write everything that remains, including exceptions, to a separate error file
current_dir=/home/gzdacp/logs
grep -Ev "warning|SLF4J|slf4j" ${current_dir}/data.`date -d "1 days ago" +%Y-%m-%d`.log > ${current_dir}/error_`date -d "1 days ago" +%Y%m%d`.log
Then create a crontab entry to schedule it:
00 03 * * * sh /home/gzdacp/logs/logerror.sh
This way an error file for the previous day, error_yyyymmdd.log, is generated at 3 a.m. every day; checking that one file is enough to see the previous day's errors.
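For a quick daily check, something like the following is enough (a sketch using the path and file naming assumed above; -B/-A merely print a little context around each match):
# show each exception with one line before and three lines after it
grep -B1 -A3 "Exception" /home/gzdacp/logs/error_`date -d "1 days ago" +%Y%m%d`.log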
》Extract the important information
Besides writing error logs to a separate file, you can also extract other kinds of data from the output to help analyze how the program is behaving, such as call volume or access times. Since log formats differ from environment to environment, there are many ways to pull this information out (awk, tr, cut and so on) and they will not be listed one by one here; a small awk example is sketched below.
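As a single illustration, assuming each business log line starts with a timestamp such as 2024-01-01 10:15:30 (the line format is an assumption; adjust the field numbers to your own layout), the per-hour call volume in yesterday's log can be counted like this:
# count lines (calls) per hour, keyed on the date plus the hour part of the timestamp
awk '{ split($2, t, ":"); cnt[$1" "t[1]]++ } END { for (h in cnt) print h, cnt[h] }' \
  /home/gzdacp/logs/data.`date -d "1 days ago" +%Y-%m-%d`.log | sort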
》Maintain the logs on the log host periodically
Logs keep piling up, and if the program has no built-in cleanup of old logs, we have to write a maintenance script ourselves, logclear.sh.
Step 1:
compress the data.yyyy-mm-dd.log files that are more than two days old
# gzip plain log files older than two days
for i in `find /home/gzdacp/logs -ctime +2 -name 'data.*.log' -print`;do
  echo "************logclear.sh********* log file matching the compression criteria *******"
  echo "$i"
  if [ -n "$i" ];then
    gzip "$i"
  fi
done
Step 2:
keep the compressed files for only 10 days
# delete compressed log files older than ten days
for j in `find /home/gzdacp/logs -ctime +10 -name 'data.*.log.gz' -print`;do
  echo "************logclear.sh********* compressed log file matching the deletion criteria *******"
  echo "$j"
  if [ -n "$j" ];then
    rm -f "$j"
  fi
done
Step 3:
schedule the maintenance script with crontab
00 04 * * * sh /home/gzdacp/logs/logclear.sh
With this in place, the logs are compressed and cleaned up at 4 a.m. every day, and the log directory looks after itself.
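If the two cron entries above have not been registered yet, they can be added and verified with the standard crontab commands:
# open the current user's crontab in an editor and paste in the two entries above
crontab -e
# list the installed entries to confirm they took effect
crontab -l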
That is the basics of managing log files with shell: it gives flexible control over background logs, makes it easy to see how the background processes are running, and helps keep the system stable.