1. Set the log level in settings by adding one line to settings.py:
LOG_LEVEL = 'WARNING'
Scrapy provides five logging levels:
- CRITICAL - serious errors
- ERROR - regular errors
- WARNING - warning messages
- INFO - informational messages
- DEBUG - debugging messages
Scrapy displays DEBUG-level log messages (i.e., everything) by default.
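As a quick illustration of how LOG_LEVEL filters output, here is a minimal sketch of a throwaway spider (the name demo and the example.com URL are placeholders, not part of the original project) that emits one message at each level through Scrapy's built-in self.logger; with LOG_LEVEL = 'WARNING', only the last three lines would reach the log:

import scrapy

class DemoSpider(scrapy.Spider):
    # placeholder spider used only to demonstrate the five levels
    name = 'demo'
    start_urls = ['http://example.com']

    def parse(self, response):
        self.logger.debug('debugging message')      # suppressed at WARNING
        self.logger.info('informational message')   # suppressed at WARNING
        self.logger.warning('warning message')
        self.logger.error('regular error')
        self.logger.critical('serious error')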
2. To save the output to a log file, add a file path in settings.py:
LOG_FILE = './log.log'
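Putting steps 1 and 2 together, a settings.py sketch might look like the following; LOG_FORMAT and LOG_DATEFORMAT are optional Scrapy settings for customizing how each line is rendered (the values shown are, to the best of my knowledge, Scrapy's own defaults):

LOG_LEVEL = 'WARNING'   # suppress DEBUG and INFO output
LOG_FILE = './log.log'  # write to this file instead of the console

# optional: customize the layout of each log line
LOG_FORMAT = '%(asctime)s [%(name)s] %(levelname)s: %(message)s'
LOG_DATEFORMAT = '%Y-%m-%d %H:%M:%S'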
3. To show where each log message originates, create a module-level logger in pipelines.py:
import logging

logger = logging.getLogger(__name__)  # logger named after this module

class DcdAppPipeline:
    def process_item(self, item, spider):
        logger.warning(item)  # logged at WARNING so it survives LOG_LEVEL filtering
        return item
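Because the logger is created with logging.getLogger(__name__), every line it writes is tagged with the module path (here autospider.pipelines), so the log itself tells you which file each message came from; returning the item keeps the rest of the pipeline chain working.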
4. Configure logging per spider in the spider file:
import scrapy

class DcdappSpider(scrapy.Spider):
    name = 'dcdapp'
    allowed_domains = ['m.dcdapp.com']
    custom_settings = {
        # enable the item pipeline
        'ITEM_PIPELINES': {
            'autospider.pipelines.DcdAppPipeline': 300,
        },
        # per-spider log settings
        'LOG_LEVEL': 'DEBUG',
        'LOG_FILE': './././Log/dcdapp_log.log',
    }
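custom_settings is a class attribute that overrides the project-wide settings.py values for this spider only. Run the spider as usual and the output lands in the configured file:

scrapy crawl dcdapp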
Reposted from CSDN: https://blog.csdn.net/iswangrl/article/details/78286467