麦兜搞IT

A customized log based on Twisted

(2011-12-02 13:14:57)
Tags: twisted, python, log, it | Category: Python

    I stumbled on this in the Scrapy source code: a new log class customized on top of Twisted's log, which can print to the console or write to a file, and lets you set the log level. I will certainly need this module later, so I am recording it here rather than writing my own.

"""
Scrapy logging facility

See documentation in docs/topics/logging.rst
"""
import sys
import logging
import warnings

from twisted.python import log

import scrapy
from scrapy.conf import settings
from scrapy.utils.python import unicode_to_str
from scrapy.utils.misc import load_object
from scrapy.exceptions import ScrapyDeprecationWarning
 
# Logging levels
DEBUG = logging.DEBUG
INFO = logging.INFO
WARNING = logging.WARNING
ERROR = logging.ERROR
CRITICAL = logging.CRITICAL
SILENT = CRITICAL + 1

level_names = {
    logging.DEBUG: "DEBUG",
    logging.INFO: "INFO",
    logging.WARNING: "WARNING",
    logging.ERROR: "ERROR",
    logging.CRITICAL: "CRITICAL",
    SILENT: "SILENT",
}

started = False

class ScrapyFileLogObserver(log.FileLogObserver):

    def __init__(self, f, level=INFO, encoding='utf-8'):
        self.level = level
        self.encoding = encoding
        log.FileLogObserver.__init__(self, f)

    def emit(self, eventDict):
        ev = _adapt_eventdict(eventDict, self.level, self.encoding)
        if ev is not None:
            log.FileLogObserver.emit(self, ev)

def _adapt_eventdict(eventDict, log_level=INFO, encoding='utf-8', prepend_level=True):
    """Adapt Twisted log eventDict making it suitable for logging with a Scrapy
    log observer. It may return None to indicate that the event should be
    ignored by a Scrapy log observer.

    `log_level` is the minimum level being logged, and `encoding` is the log
    encoding.
    """
    ev = eventDict.copy()
    if ev['isError']:
        ev.setdefault('logLevel', ERROR)
    # ignore non-error messages from outside scrapy
    if ev.get('system') != 'scrapy' and not ev['isError']:
        return
    level = ev.get('logLevel')
    if level < log_level:
        return
    spider = ev.get('spider')
    if spider:
        ev['system'] = spider.name
    message = ev.get('message')
    lvlname = level_names.get(level, 'NOLEVEL')
    if message:
        message = [unicode_to_str(x, encoding) for x in message]
        if prepend_level:
            message[0] = "%s: %s" % (lvlname, message[0])
    ev['message'] = message
    why = ev.get('why')
    if why:
        why = unicode_to_str(why, encoding)
        if prepend_level:
            why = "%s: %s" % (lvlname, why)
    ev['why'] = why
    return ev

def _get_log_level(level_name_or_id=None):
    if level_name_or_id is None:
        lvlname = settings['LOG_LEVEL']
        return globals()[lvlname]
    elif isinstance(level_name_or_id, int):
        return level_name_or_id
    elif isinstance(level_name_or_id, basestring):
        return globals()[level_name_or_id]
    else:
        raise ValueError("Unknown log level: %r" % level_name_or_id)

def start(logfile=None, loglevel=None, logstdout=None):
    global started
    if started or not settings.getbool('LOG_ENABLED'):
        return
    started = True

    if log.defaultObserver: # check twisted log not already started
        loglevel = _get_log_level(loglevel)
        logfile = logfile or settings['LOG_FILE']
        file = open(logfile, 'a') if logfile else sys.stderr
        if logstdout is None:
            logstdout = settings.getbool('LOG_STDOUT')
        sflo = ScrapyFileLogObserver(file, loglevel, settings['LOG_ENCODING'])
        _oldshowwarning = warnings.showwarning
        log.startLoggingWithObserver(sflo.emit, setStdout=logstdout)
        # restore warnings, wrongly silenced by Twisted
        warnings.showwarning = _oldshowwarning
        msg("Scrapy %s started (bot: %s)" % (scrapy.__version__, \
            settings['BOT_NAME']))

def msg(message, level=INFO, **kw):
    if 'component' in kw:
        warnings.warn("Argument `component` of scrapy.log.msg() is deprecated", \
            ScrapyDeprecationWarning, stacklevel=2)
    kw.setdefault('system', 'scrapy')
    kw['logLevel'] = level
    log.msg(message, **kw)

def err(_stuff=None, _why=None, **kw):
    kw.setdefault('system', 'scrapy')
    kw['logLevel'] = kw.pop('level', ERROR)
    log.err(_stuff, _why, **kw)

formatter = load_object(settings['LOG_FORMATTER'])()
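The core of the module is the filtering done in `_adapt_eventdict`: errors default to the ERROR level, non-error events from outside Scrapy are dropped, and anything below the configured minimum level is dropped. To see those rules in isolation, here is a small self-contained sketch (the name `should_emit` is mine, not Scrapy's) that reproduces the decision logic without needing Twisted installed:

```python
import logging

DEBUG, INFO, ERROR = logging.DEBUG, logging.INFO, logging.ERROR

def should_emit(event, min_level=INFO):
    """Mirror the filtering in _adapt_eventdict: errors default to ERROR,
    non-scrapy non-error events are dropped, and events below the
    configured minimum level are dropped."""
    ev = dict(event)
    if ev.get('isError'):
        # errors without an explicit level are treated as ERROR
        ev.setdefault('logLevel', ERROR)
    if ev.get('system') != 'scrapy' and not ev.get('isError'):
        # ignore non-error chatter from other Twisted components
        return False
    return ev.get('logLevel', 0) >= min_level

# A scrapy INFO message passes, a scrapy DEBUG message is filtered out at
# the default INFO threshold, and an error from outside scrapy still gets
# through because it is promoted to ERROR.
print(should_emit({'system': 'scrapy', 'isError': False, 'logLevel': INFO}))   # True
print(should_emit({'system': 'scrapy', 'isError': False, 'logLevel': DEBUG}))  # False
print(should_emit({'system': 'twisted', 'isError': True}))                     # True
```

The real function also rewrites `system` to the spider name and prepends the level name to the message; the sketch keeps only the accept/reject decision.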

 

.. _topics-logging:

=======
Logging
=======

Scrapy provides a logging facility which can be used through the
:mod:`scrapy.log` module. The current underlying implementation uses `Twisted
logging`_ but this may change in the future.

.. _Twisted logging: http://twistedmatrix.com/projects/core/documentation/howto/logging.html

The logging service must be explicitly started through the :func:`scrapy.log.start` function.

.. _topics-logging-levels:

Log levels
==========

Scrapy provides 5 logging levels:

1. :data:`~scrapy.log.CRITICAL` - for critical errors
2. :data:`~scrapy.log.ERROR` - for regular errors
3. :data:`~scrapy.log.WARNING` - for warning messages
4. :data:`~scrapy.log.INFO` - for informational messages
5. :data:`~scrapy.log.DEBUG` - for debugging messages
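These levels reuse the numeric constants from the standard library's ``logging`` module, so severity is just an integer comparison; the module above also defines an extra ``SILENT`` level one step above ``CRITICAL`` so that nothing is logged. A quick sketch of the ordering:

```python
import logging

# The five documented levels map onto stdlib numeric values, so a
# higher-severity level always compares greater than a lower one.
SILENT = logging.CRITICAL + 1  # Scrapy's extra "log nothing" level

levels = [logging.DEBUG, logging.INFO, logging.WARNING,
          logging.ERROR, logging.CRITICAL, SILENT]
assert levels == sorted(levels)

def visible(msg_level, configured_level):
    # filtering at level L keeps messages whose level is >= L
    return msg_level >= configured_level

print(visible(logging.WARNING, logging.INFO))  # True: WARNING shows at INFO
print(visible(logging.DEBUG, logging.INFO))    # False: DEBUG hidden at INFO
```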

How to set the log level
========================

You can set the log level using the ``--loglevel/-L`` command line option, or
using the :setting:`LOG_LEVEL` setting.
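Both mechanisms resolve to the :setting:`LOG_LEVEL` setting, which takes a level name as a string; ``_get_log_level`` above turns that name into the numeric constant. For example, in a (hypothetical) project's ``settings.py``:

```python
# settings.py: only WARNING and above are logged
LOG_LEVEL = 'WARNING'
```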

How to log messages
===================

Here's a quick example of how to log a message using the ``WARNING`` level::

    from scrapy import log
    log.msg("This is a warning", level=log.WARNING)

Logging from Spiders
====================

The recommended way to log from spiders is by using the Spider
:meth:`~scrapy.spider.BaseSpider.log` method, which already populates the
``spider`` argument of the :func:`scrapy.log.msg` function. The other arguments
are passed directly to the :func:`~scrapy.log.msg` function.
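A minimal sketch of that forwarding (not Scrapy's actual code, and the stand-in ``msg`` here just records calls): the spider's ``log()`` fills in ``spider=self`` and passes everything else through.

```python
import logging

INFO = logging.INFO
captured = []

def msg(message, level=INFO, **kw):
    # stand-in for scrapy.log.msg: record what would be logged
    captured.append((message, level, kw.get('spider')))

class BaseSpider:
    name = 'demo'

    def log(self, message, level=INFO, **kw):
        # populate the spider argument automatically, pass the rest through
        msg(message, level=level, spider=self, **kw)

spider = BaseSpider()
spider.log("parsed a page", level=logging.DEBUG)
print(captured[0][0])             # parsed a page
print(captured[0][2] is spider)   # True
```

Because the ``spider`` key is set, ``_adapt_eventdict`` will later rewrite the event's ``system`` field to the spider's name, so log lines are attributed to the right spider.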

scrapy.log module
=================

.. module:: scrapy.log
   :synopsis: Logging facility

.. attribute:: started

   A boolean which is ``True`` if logging has been started or ``False`` otherwise.

.. function:: start(logfile=None, loglevel=None, logstdout=None)

    Start the logging facility. This must be called before actually logging any
    messages. Otherwise, messages logged before this call will get lost.

    :param logfile: the file path to use for logging output. If omitted, the
        :setting:`LOG_FILE` setting will be used. If both are ``None``, the log
        will be sent to standard error.
    :type logfile: str

    :param loglevel: the minimum logging level to log. Available values are:
        :data:`CRITICAL`, :data:`ERROR`, :data:`WARNING`, :data:`INFO` and
        :data:`DEBUG`.

    :param logstdout: if ``True``, all standard output (and error) of your
        application will be logged instead. For example if you "print 'hello'"
        it will appear in the Scrapy log. If omitted, the :setting:`LOG_STDOUT`
        setting will be used.
    :type logstdout: boolean

.. function:: msg(message, level=INFO, spider=None)

    Log a message.

    :param message: the message to log
    :type message: str

    :param level: the log level for this message. See
        :ref:`topics-logging-levels`.

    :param spider: the spider to use for logging this message. This parameter
        should always be used when logging things related to a particular
        spider.
    :type spider: :class:`~scrapy.spider.BaseSpider` object

.. data:: CRITICAL

    Log level for critical errors

.. data:: ERROR

    Log level for errors

.. data:: WARNING

    Log level for warnings

.. data:: INFO

    Log level for informational messages (recommended level for production
    deployments)

.. data:: DEBUG

    Log level for debugging messages (recommended level for development)

Logging settings
================

These settings can be used to configure the logging:

* :setting:`LOG_ENABLED`
* :setting:`LOG_ENCODING`
* :setting:`LOG_FILE`
* :setting:`LOG_LEVEL`
* :setting:`LOG_STDOUT`
