2024-11-22 01:00:24 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: catalog_discovery)
2024-11-22 01:00:25 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.10 (main, Nov 12 2024, 02:25:24) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-6.4.10-dirty-x86_64-with-glibc2.36
2024-11-22 01:00:25 [scrapy.addons] INFO: Enabled addons:
[]
2024-11-22 01:00:25 [scrapy.extensions.telnet] INFO: Telnet Password: 9282822c56acb7b4
2024-11-22 01:00:25 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.closespider.CloseSpider',
 'scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats',
 'spidermon.contrib.scrapy.extensions.Spidermon']
2024-11-22 01:00:25 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'catalog_discovery',
 'CONCURRENT_ITEMS': 1000,
 'CONCURRENT_REQUESTS': 32,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_FILE': '/var/lib/scrapyd/logs/catalog_discovery/grainger/2659b0aaa86d11ef9b0f4200a9fe0102.log',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'catalog_discovery.spiders',
 'REQUEST_FINGERPRINTER_CLASS': 'scrapy_poet.ScrapyPoetRequestFingerprinter',
 'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
 'RETRY_TIMES': 5,
 'SPIDER_MODULES': ['catalog_discovery.spiders'],
 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
 'USER_AGENT': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:125.0) '
               'Gecko/20100101 Firefox/125.0'}
2024-11-22 01:00:25 [scrapy_poet.injection] INFO: Loading providers: [, , , , , , ]
2024-11-22 01:00:25 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scraping_utils.middlewares.downloaders.ProxyManagerDownloaderMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy_poet.InjectionMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy_poet.DownloaderStatsMiddleware']
2024-11-22 01:00:25 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy_poet.RetryMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2024-11-22 01:01:15 [catalog_discovery.pipelines] INFO: Starting CatalogItemFilterPipeline with 1388286 records from GRAINGER.
2024-11-22 01:01:15 [scrapy.middleware] INFO: Enabled item pipelines:
['catalog_discovery.pipelines.CatalogItemFilterPipeline',
 'scraping_utils.pipelines.AttachSupplierPipeline',
 'spidermon.contrib.scrapy.pipelines.ItemValidationPipeline']
2024-11-22 01:01:15 [scrapy.core.engine] INFO: Spider opened
2024-11-22 01:01:15 [scrapy.extensions.closespider] INFO: Spider will stop when no items are produced after 1800 seconds.
2024-11-22 01:01:15 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2024-11-22 01:01:15 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2024-11-22 01:01:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/defer.py", line 295, in aiter_errback
    yield await it.__anext__()
          ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 374, in __anext__
    return await self.data.__anext__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 355, in _async_chain
    async for o in as_async_generator(it):
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
    async for r in it:
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 374, in __anext__
    return await self.data.__anext__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 355, in _async_chain
    async for o in as_async_generator(it):
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
    async for r in it:
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 118, in process_async
    async for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
    async for r in result or ():
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 118, in process_async
    async for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
    async for r in result or ():
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 118, in process_async
    async for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
    async for r in result or ():
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 118, in process_async
    async for r in iterable:
  File "/var/lib/scrapyd/eggs/catalog_discovery/1732237119.egg/catalog_discovery/spiders/__init__.py", line 79, in _parse_sitemap
    if page.is_ready_for_extraction(decompressed_body):
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/var/lib/scrapyd/eggs/catalog_discovery/1732237119.egg/catalog_discovery/pages/__init__.py", line 45, in is_ready_for_extraction
    self._sitemap = Sitemap(decompressed_body)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/sitemap.py", line 22, in __init__
    rt = self._root.tag
         ^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'tag'
2024-11-22 01:01:16 [scrapy.core.engine] INFO: Closing spider (finished)
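Note on the traceback above: scrapy.utils.sitemap.Sitemap parses the body with lxml's recovering XML parser, which returns a None root element when the body is not XML at all (for example an HTML error or bot-block page instead of a sitemap), so the subsequent self._root.tag lookup raises AttributeError. A minimal defensive sketch of the page object's is_ready_for_extraction, using only the names visible in the traceback (the surrounding class is a hypothetical stand-in for catalog_discovery.pages):

import logging

from scrapy.utils.sitemap import Sitemap

logger = logging.getLogger(__name__)

class SitemapPage:  # hypothetical stand-in; real class lives in catalog_discovery/pages/__init__.py
    _sitemap = None

    def is_ready_for_extraction(self, decompressed_body: bytes) -> bool:
        # Sitemap() raises AttributeError when lxml's recover-mode parser
        # produces a None root, i.e. the body was not parseable as XML.
        try:
            self._sitemap = Sitemap(decompressed_body)
        except AttributeError:
            logger.warning("Body is not valid sitemap XML; skipping extraction.")
            self._sitemap = None
            return False
        return True

With a guard like this, a non-XML response would be reported and skipped rather than killing the only scheduled request, which is what ended this run after a single download.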
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] ------------------------------ MONITORS ------------------------------
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] Extracted Items Monitor/test_stat_monitor... FAIL
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] Item Validation Monitor/test_stat_monitor... SKIPPED (Unable to find 'spidermon/validation/fields/errors' in job stats.)
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] Error Count Monitor/test_stat_monitor... FAIL
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] Warning Count Monitor/test_stat_monitor... SKIPPED (Unable to find 'log_count/WARNING' in job stats.)
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] Finish Reason Monitor/Should have the expected finished reason(s)... OK
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] Unwanted HTTP codes monitor/Should not hit the limit of unwanted http status... OK
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] Field Coverage Monitor/test_check_if_field_coverage_rules_are_met... FAIL
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] Retry Count monitor/Should not hit the limit of requests that reached the maximum retry amount... OK
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] Downloader Exceptions monitor/test_stat_monitor... SKIPPED (Unable to find 'downloader/exception_count' in job stats.)
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] Successful Requests monitor/Should have at least the minimum number of successful requests... OK
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] Total Requests monitor/Should not hit the total limit of requests... OK
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] ----------------------------------------------------------------------
2024-11-22 01:01:16 [grainger] ERROR: [Spidermon] ======================================================================
FAIL: Extracted Items Monitor/test_stat_monitor
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/scrapy/monitors/base.py", line 177, in test_stat_monitor
    self.fail(message)
AssertionError: Unable to find 'item_scraped_count' in job stats.
2024-11-22 01:01:16 [grainger] ERROR: [Spidermon] ======================================================================
FAIL: Error Count Monitor/test_stat_monitor
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/scrapy/monitors/base.py", line 184, in test_stat_monitor
    assertion_method(
AssertionError: Expecting 'log_count/ERROR' to be '<=' to '0.0'. Current value: '1'
2024-11-22 01:01:16 [grainger] ERROR: [Spidermon] ======================================================================
FAIL: Field Coverage Monitor/test_check_if_field_coverage_rules_are_met
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/scrapy/monitors/monitors.py", line 476, in test_check_if_field_coverage_rules_are_met
    self.assertTrue(len(failures) == 0, msg=msg)
AssertionError: The following items did not meet field coverage rules:
dict/url (expected 1.0, got 0)
dict/supplier (expected 1.0, got 0)
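Note on the Field Coverage failure: it is a direct consequence of the crash. With zero items scraped, coverage for every required field is 0, so all three failing monitors share the same root cause. The rules named in the message come from Spidermon's SPIDERMON_FIELD_COVERAGE_RULES setting; a plausible reconstruction of this project's settings.py (the exact values are assumptions, though 1.0 for both fields is implied by the "expected 1.0" text above):

# settings.py (reconstruction; thresholds assumed from the monitor output)
SPIDERMON_ENABLED = True
SPIDERMON_ADD_FIELD_COVERAGE = True  # required for coverage stats to be collected
SPIDERMON_FIELD_COVERAGE_RULES = {
    "dict/url": 1.0,       # 100% of scraped items must include a url
    "dict/supplier": 1.0,  # 100% of scraped items must include a supplier
}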
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] 11 monitors in 0.004s
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] FAILED (failures=3, skipped=3)
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] -------------------------- FINISHED ACTIONS --------------------------
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] ----------------------------------------------------------------------
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] 0 actions in 0.000s
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] OK
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] --------------------------- PASSED ACTIONS ---------------------------
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] ----------------------------------------------------------------------
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] 0 actions in 0.000s
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] OK
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] --------------------------- FAILED ACTIONS ---------------------------
2024-11-22 01:01:16 [spidermon.contrib.actions.slack] INFO: :skull: `grainger` *spider finished with errors!* _(errors=3)_
2024-11-22 01:01:16 [spidermon.contrib.actions.slack] INFO: [
  {
    "text": "• _Extracted Items Monitor/test_stat_monitor_\n• _Error Count Monitor/test_stat_monitor_\n• _Field Coverage Monitor/test_check_if_field_coverage_rules_are_met_\n",
    "color": "danger",
    "mrkdwn_in": ["text", "pretext"]
  }
]
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] SendSlackMessageSpiderFinished... OK
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] ----------------------------------------------------------------------
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] 1 action in 0.011s
2024-11-22 01:01:16 [grainger] INFO: [Spidermon] OK
2024-11-22 01:01:16 [scrapy.extensions.feedexport] INFO: No data to insert into BigQuery - closing feed storage
2024-11-22 01:01:16 [scrapy.extensions.feedexport] INFO: Stored bq feed (0 items) in: bq://response-elt.dev_scrapers.catalog_urls/batch:1
2024-11-22 01:01:16 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 400,
 'downloader/request_count': 1,
 'downloader/request_method_count/GET': 1,
 'downloader/response_bytes': 12569,
 'downloader/response_count': 1,
 'downloader/response_status_count/200': 1,
 'elapsed_time_seconds': 0.900441,
 'feedexport/success_count/BigQueryFeedStorage': 1,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2024, 11, 22, 1, 1, 16, 523397, tzinfo=datetime.timezone.utc),
 'httpcompression/response_bytes': 19287,
 'httpcompression/response_count': 1,
 'log_count/ERROR': 4,
 'log_count/INFO': 45,
 'memusage/max': 534048768,
 'memusage/startup': 534048768,
 'poet/injector/catalog_discovery.pages.grainger.GraingerSitemapPageObject': 1,
 'response_received_count': 1,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'spider_exceptions/AttributeError': 1,
 'spidermon/validation/validators': 1,
 'spidermon/validation/validators/item/jsonschema': True,
 'start_time': datetime.datetime(2024, 11, 22, 1, 1, 15, 622956, tzinfo=datetime.timezone.utc)}
2024-11-22 01:01:16 [scrapy.core.engine] INFO: Spider closed (finished)