
Caplog doesn't capture all logs

Open FRiMN opened this issue 5 years ago • 7 comments

caplog starts capturing, but at a specific point capturing stops. The problem reproduces in one test and does not reproduce in the rest. Additionally, I need caplog.set_level('DEBUG') for this test (to specify the logging level for caplog, otherwise nothing gets into it at all); other tests do not need this.

Example of output:

root@b1ad842cef21:/usr/src/app# pytest ./tests/functional/test_scraping_wix_site.py -s
/usr/local/lib/python3.7/site-packages/twisted/internet/address.py:101: DeprecationWarning: The usage of `cmp` is deprecated and will be removed on or after 2021-06-01.  Please use `eq` and `order` instead.
  @attr.s(hash=False, repr=False, cmp=False)
====================================================================== test session starts ======================================================================
platform linux -- Python 3.7.3, pytest-5.3.5, py-1.8.1, pluggy-0.13.1
rootdir: /usr/src/app, inifile: pytest.ini
plugins: twisted-1.12, cov-2.8.1
collected 1 item                                                                                                                                                

tests/functional/test_scraping_wix_site.py ---- init logging
2020-06-08 04:43:55,519 - botocore.hooks - DEBUG - Changing event name from creating-client-class.iot-data to creating-client-class.iot-data-plane
2020-06-08 04:43:55,526 - botocore.hooks - DEBUG - Changing event name from before-call.apigateway to before-call.api-gateway

<... some logs ...>

2020-06-08 04:43:55,740 - scrapy.middleware - INFO - Enabled item pipelines:
['crawler.pipelines.MysqlPipeline', 'crawler.pipelines.ScreenshotPipeline']
2020-06-08 04:43:55,740 - scrapy.core.engine - INFO - Spider opened
2020-06-08 04:43:55,760 - scrapy.extensions.logstats - INFO - Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-06-08 04:43:56,447 - urllib3.connectionpool - DEBUG - Starting new HTTP connection (1): splash:8050
2020-06-08 04:43:56,640 - urllib3.connectionpool - DEBUG - http://splash:8050 "POST /_gc HTTP/1.1" 200 72
2020-06-08 04:43:56,666 - urllib3.connectionpool - DEBUG - Starting new HTTP connection (1): splash:8050

<... some logs ...>

2020-06-08 04:44:28,617 - scrapy.statscollectors - INFO - Dumping Scrapy stats:
{'crawler_error_code': 5,
 'downloader/request_bytes': 4233,
 'downloader/request_count': 6,
 'downloader/request_method_count/POST': 6,
 'downloader/response_bytes': 1219766,
 'downloader/response_count': 6,
 'downloader/response_status_count/200': 6,
 'elapsed_time_seconds': 32.852748,
 'finish_reason': 'expected_crawler_error',
 'finish_time': datetime.datetime(2020, 6, 8, 4, 44, 28, 614023),
 'item_scraped_count': 4,
 'log_count/INFO': 4,
 'memusage/max': 148881408,
 'memusage/startup': 148881408,
 'parse_start_url': 'https://www.webseowriter.net',
 'pipeline_mysql/external_links_saved': 0,
 'pipeline_mysql/images_saved': 0,
 'pipeline_mysql/pages_saved': 1,
 'pipeline_mysql/url_links_saved': 0,
 'pipeline_screenshot/screenshots_failure': 0,
 'request_depth_count/0': 1,
 'request_depth_count/1': 1,
 'request_depth_max': 1,
 'response_received_count': 3,
 'robotstxt/request_count': 1,
 'robotstxt/response_count': 1,
 'robotstxt/response_status_count/200': 1,
 'scheduler/dequeued': 6,
 'scheduler/dequeued/memory': 6,
 'scheduler/enqueued': 6,
 'scheduler/enqueued/memory': 6,
 'splash/render.html/request_count': 3,
 'splash/render.html/response_count/200': 3,
 'splash/render.png/request_count': 3,
 'splash/render.png/response_count/200': 3,
 'start_time': datetime.datetime(2020, 6, 8, 4, 43, 55, 761275)}
2020-06-08 04:44:28,618 - scrapy.core.engine - INFO - Spider closed (expected_crawler_error)

F

=========================================================================== FAILURES ============================================================================

In caplog.text:

INFO     scrapy.crawler:crawler.py:49 Overridden settings: {'AJAXCRAWL_ENABLED': True, 'BOT_NAME': 'crawler', 'CONCURRENT_ITEMS': 500, 'CONCURRENT_REQUESTS': 100, 'CONCURRENT_REQUESTS_PER_DOMAIN': 500, 'COOKIES_ENABLED': False, 'DEPTH_LIMIT': 20, 'DEPTH_PRIORITY': 1, 'DEPTH_STATS_VERBOSE': True, 'DOWNLOAD_DELAY': 0.1, 'DOWNLOAD_MAXSIZE': 2097152, 'DOWNLOAD_TIMEOUT': 50, 'FILES_STORE_S3_ACL': 'public-read', 'HTTPCACHE_POLICY': 'scrapy.extensions.httpcache.RFC2616Policy', 'HTTPCACHE_STORAGE': 'scrapy.extensions.httpcache.DbmCacheStorage', 'NEWSPIDER_MODULE': 'crawler.spiders', 'REACTOR_THREADPOOL_MAXSIZE': 50, 'REDIRECT_MAX_TIMES': 10, 'RETRY_ENABLED': False, 'ROBOTSTXT_OBEY': True, 'SPIDER_MODULES': ['crawler.spiders'], 'TELNETCONSOLE_ENABLED': False, 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0 Win64x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36'}
INFO     scrapy.middleware:middleware.py:48 Enabled extensions:
['scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.corestats.CoreStats',
 'crawler.extentions.SpiderDetectJS']
INFO     scrapy.middleware:middleware.py:48 Enabled downloader middlewares:
['scrapy.spidermiddlewares.depth.DepthMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.ajaxcrawl.AjaxCrawlMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'crawler.middlewares.CrawlerDownloaderWixMiddleware',
 'rotating_proxies.middlewares.BanDetectionMiddleware',
 'crawler.middlewares.CrawlerDownloaderMiddleware',
 'scrapy_splash.SplashCookiesMiddleware',
 'scrapy_splash.SplashMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
INFO     scrapy.middleware:middleware.py:48 Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy_splash.SplashDeduplicateArgsMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'crawler.middlewares.CrawlerSpiderMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']

Additional Info

root@b1ad842cef21:/usr/src/app# pip list
Package                 Version   
----------------------- ----------
attrs                   19.3.0    
Automat                 20.2.0    
beautifulsoup4          4.8.2     
boto3                   1.12.4    
botocore                1.15.49   
certifi                 2020.4.5.1
cffi                    1.14.0    
chardet                 3.0.4     
click                   7.1.2     
click-log               0.3.2     
constantly              15.1.0    
coverage                5.1       
cryptography            2.9.2     
cssselect               1.1.0     
decorator               4.4.2     
detectem                0.7.1     
docker                  4.1.0     
docutils                0.15.2    
greenlet                0.4.15    
hyperlink               19.0.0    
idna                    2.8       
importlib-metadata      1.6.0     
incremental             17.5.0    
jmespath                0.10.0    
logstash-formatter      0.5.17    
lxml                    4.5.1     
more-itertools          8.3.0     
mysqlclient             1.4.6     
packaging               20.4      
pact-python             0.21.0    
parsel                  1.6.0     
pip                     19.1.1    
pluggy                  0.13.1    
Protego                 0.1.16    
psutil                  5.7.0     
py                      1.8.1     
pyasn1                  0.4.8     
pyasn1-modules          0.2.8     
pycparser               2.20      
PyDispatcher            2.0.5     
PyHamcrest              2.0.2     
pyOpenSSL               19.1.0    
pyparsing               2.4.7     
pytest                  5.3.5     
pytest-cov              2.8.1     
pytest-twisted          1.12      
python-dateutil         2.8.1     
python3-logstash        0.4.80    
queuelib                1.5.0     
requests                2.22.0    
requests-file           1.5.1     
responses               0.10.9    
s3transfer              0.3.3     
scrapoxy                1.11      
Scrapy                  1.8.0     
scrapy-rotating-proxies 0.6.2     
scrapy-splash           0.7.2     
service-identity        18.1.0    
setuptools              41.0.1    
simplejson              3.17.0    
six                     1.15.0    
soupsieve               2.0.1     
SQLAlchemy              1.3.13    
tldextract              2.2.2     
Twisted                 19.10.0   
typing                  3.7.4.1   
urllib3                 1.25.9    
w3lib                   1.21.0    
wcwidth                 0.1.9     
websocket-client        0.57.0    
wheel                   0.33.1    
zipp                    3.1.0     
zope.interface          5.1.0     
WARNING: You are using pip version 19.1.1, however version 20.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
root@b1ad842cef21:/usr/src/app# pytest --version
This is pytest version 5.3.5, imported from /usr/local/lib/python3.7/site-packages/pytest/__init__.py
setuptools registered plugins:
  pytest-twisted-1.12 at /usr/local/lib/python3.7/site-packages/pytest_twisted.py
  pytest-cov-2.8.1 at /usr/local/lib/python3.7/site-packages/pytest_cov/plugin.py
root@b1ad842cef21:/usr/src/app# uname -a
Linux b1ad842cef21 5.3.0-46-generic #38~18.04.1-Ubuntu SMP Tue Mar 31 04:17:56 UTC 2020 x86_64 GNU/Linux

Reproduce

I will try to prepare a minimal example a bit later.

  • [x] a detailed description of the bug or suggestion
  • [x] output of pip list from the virtual environment you are using
  • [x] pytest and operating system versions
  • [ ] minimal example if possible

FRiMN avatar Jun 08 '20 06:06 FRiMN

I'm affected too by this bug, and it seems that it doesn't capture INFO log messages. Here is a minimal repro setup I've come up with:

repro.py

import logging

logger = logging.getLogger()


def repro_7335():
    logger.info('info message')
    logger.warning('warning message')


def main():
    logging.basicConfig(level=logging.INFO)
    repro_7335()


if __name__ == '__main__':
    main()

test_repro.py

from repro import repro_7335

def test_repro_7335(caplog):
    repro_7335()
    assert 'warning message' in caplog.text
    assert 'info message' in caplog.text

venv setup

virtualenv --python=python3.7 pytest
. pytest/bin/activate
pip install pytest  # got pytest==6.0.1

run the tests

$ pytest
===================================================================== test session starts ======================================================================
platform darwin -- Python 3.7.7, pytest-6.0.1, py-1.9.0, pluggy-0.13.1
rootdir: ~/pytest-7335
collected 1 item

test_repro.py F                                                                                                                                          [100%]

=========================================================================== FAILURES ===========================================================================
_______________________________________________________________________ test_repro_7335 ________________________________________________________________________

caplog = <_pytest.logging.LogCaptureFixture object at 0x10d397f90>

    def test_repro_7335(caplog):
        repro_7335()
        assert 'warning message' in caplog.text
>       assert 'info message' in caplog.text
E       AssertionError: assert 'info message' in 'WARNING  root:repro.py:8 warning message\n'
E        +  where 'WARNING  root:repro.py:8 warning message\n' = <_pytest.logging.LogCaptureFixture object at 0x10d397f90>.text

test_repro.py:6: AssertionError
---------------------------------------------------------------------- Captured log call -----------------------------------------------------------------------
WARNING  root:repro.py:8 warning message
=================================================================== short test summary info ====================================================================
FAILED test_repro.py::test_repro_7335 - AssertionError: assert 'info message' in 'WARNING  root:repro.py:8 warning message\n'
====================================================================== 1 failed in 0.05s =======================================================================

EDIT

repro.py edited according to the reply below, calling logging.basicConfig() from within main().

volans- avatar Aug 17 '20 09:08 volans-

@FRiMN Can you provide a reproduction? Otherwise it is not very actionable.


@volans-

The problem in your case is that the logging.basicConfig(level=logging.INFO) call does nothing. pytest's log capturing necessarily works by hooking into the logging infrastructure, so it adds various handlers to the root logger on initialization. When logging.basicConfig() sees this, it exits without doing anything (unless force=True is used, but you shouldn't use that, as it would interfere with pytest).

Basically, calling logging.basicConfig at the global scope of some module is not a good idea; you should instead call it from your main function or similar.
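A minimal sketch of that layout (hypothetical module and function names), with basicConfig deferred to main() so pytest's handlers stay in control under test:

```python
# app.py -- hypothetical module illustrating the recommendation above
import logging

logger = logging.getLogger(__name__)


def run():
    logger.info("starting up")


def main():
    # Configure logging only when running as a program, not at import
    # time; under pytest this function is never called, so pytest's own
    # root-logger handlers are left undisturbed.
    logging.basicConfig(level=logging.INFO)
    run()


if __name__ == "__main__":
    main()
```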

The way you ended up fixing this (using caplog.set_level('INFO')) is not really a workaround, but a good way to do it. Alternatively you can set log_level = INFO in your pytest.ini, but note this also affects which messages are shown on test failures.
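Applied to the repro above, a sketch of the test with the capture level lowered explicitly (reusing the same repro_7335 helper inline for self-containedness):

```python
import logging

logger = logging.getLogger()


def repro_7335():
    logger.info('info message')
    logger.warning('warning message')


def test_repro_7335(caplog):
    # Lower the capture threshold for this test so the INFO record
    # is captured too, not just the WARNING.
    caplog.set_level(logging.INFO)
    repro_7335()
    assert 'warning message' in caplog.text
    assert 'info message' in caplog.text
```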

bluetech avatar Aug 22 '20 13:08 bluetech

@bluetech my actual software doesn't initialize the logger like that; this was the minimal setup I came up with as a repro. The actual code adds various handlers to the root logger and calls setLevel() on it. My expectation would be that pytest should not alter the logs generated by the application under test, but maybe I'm missing something here. I'll see if I can come up with a better repro example.

volans- avatar Aug 24 '20 08:08 volans-

My expectation would be that pytest should not alter the logs generated by the application under test

Yes that would have been best, but there are competing desires and in order to function we do have to hook up into the logging flow somehow, which has global effects.

bluetech avatar Aug 24 '20 08:08 bluetech

We ran into this issue (or something very closely related) in https://github.com/adobe-type-tools/psautohint where caplog is used extensively in tests. To reproduce, simply run the test suite using pytest 6.0.0+. We are temporarily working around by pinning pytest to the previous version, which works fine. See https://github.com/adobe-type-tools/psautohint/runs/959541927?check_suite_focus=true and https://github.com/adobe-type-tools/psautohint/pull/264 for more info.

josh-hadley avatar Aug 25 '20 02:08 josh-hadley

@bluetech just FYI I've updated my previous repro example, moving the call to basicConfig, and it still reproduces the issue.

I've no problem in my specific case setting log_level in pytest's config, but it feels like it bends the concept of testing a bit in some cases, not to mention that it looks like a regression from pytest releases prior to 6.0.0.

Just to give an example, say I'd like to test a CLI script with and without a --debug option: if pytest alters the logging level that is reported during testing, there could be false positives/negatives where the test passes but the real application would behave differently.

volans- avatar Aug 25 '20 17:08 volans-

We have the same issue, but what I observed is that caplog behaves differently when different parts of the test suite are selected with -k.

kissgyorgy avatar Jun 10 '22 11:06 kissgyorgy

Similar issue here. caplog works fine when you run a subset of our tests, but when you run the whole test suite it's missing log lines. The log lines are definitely being emitted, I tested via the debugger.

I wish I could provide a minimal reproduction case, but I can run 100 tests and it works, yet when I run all 900+ tests it fails. I'm having a really hard time finding the pattern.

basepi avatar Nov 17 '22 17:11 basepi

I finally got this working by changing

with caplog.at_level("DEBUG"):

to

with caplog.at_level("DEBUG", logger="elasticapm.transport.http"):

It turns out I was misunderstanding what caplog.at_level is supposed to do. I thought it filtered captured logs at a given level. But it doesn't: caplog always captures all logs emitted during a test. What with caplog.at_level does is temporarily set the log level for the given logger during the with block. If no logger is specified, it sets the level of the root logger.

So, if you're having issues where it's not capturing what you expect, make sure your log levels are actually set the way you expect. Another part of your test suite may be changing your log levels.
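That distinction can be illustrated with plain logging, no pytest required. In this sketch (hypothetical logger name), the capturing handler sees everything that is actually emitted, while the logger's own level decides whether a record is emitted at all:

```python
import logging

# Records everything it receives, mimicking how caplog's handler
# captures every record that actually reaches it.
class ListHandler(logging.Handler):
    def __init__(self):
        super().__init__(level=logging.NOTSET)  # the handler filters nothing
        self.records = []

    def emit(self, record):
        self.records.append(record)


logger = logging.getLogger("demo.transport")  # hypothetical logger name
logger.setLevel(logging.WARNING)              # e.g. set elsewhere in the suite
handler = ListHandler()
logger.addHandler(handler)

logger.debug("dropped")  # below the logger's WARNING level: never emitted

# The equivalent of `with caplog.at_level("DEBUG", logger="demo.transport")`:
# temporarily lower the logger's level so the record is emitted at all.
old_level = logger.level
logger.setLevel(logging.DEBUG)
logger.debug("captured")
logger.setLevel(old_level)

print([r.getMessage() for r in handler.records])  # ['captured']
```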

Edit: I should note the docs are clear on this, I think I learned about caplog from reading other tests and made assumptions which later bit me. Apologies for the noise, but hopefully this can help someone else.

basepi avatar Nov 17 '22 18:11 basepi

I should note the [docs](https://docs.pytest.org/en/7.1.x/how-to/logging.html#caplog-fixture) are clear on this

I don't think the docs are clear on this, and I'd love a docs update with a complete example unit test. The docs say "Inside tests it is possible to change the log level for the captured log messages."

However, the log level is not really changed for the whole test. Consider:

with caplog.at_level("DEBUG", logger=__name__):
    built_in_logger.info('info message')
    built_in_logger.warning('warning message')

assert caplog.records[-1].levelname == "DEBUG"

This fails because the levelname is WARNING. The docs suggest the level would be changed throughout the test, but it is not.
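The behavior can be checked with plain logging, independent of caplog: a record's levelname is fixed at emission time, so lowering a logger's level changes which records get emitted, not what level they carry (sketch, hypothetical logger name):

```python
import logging

captured = []


class CapturingHandler(logging.Handler):
    def emit(self, record):
        captured.append(record)


logger = logging.getLogger("levelname_demo")  # hypothetical logger name
logger.addHandler(CapturingHandler())
logger.setLevel(logging.DEBUG)  # analogous to caplog.at_level("DEBUG", ...)

logger.info("info message")
logger.warning("warning message")

# Each record carries the level it was *emitted* at; setting the logger
# to DEBUG does not rewrite levelname on the records it lets through.
print([r.levelname for r in captured])  # ['INFO', 'WARNING']
```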

upstartjohnvandivier avatar Nov 28 '22 19:11 upstartjohnvandivier