2025-11-30 01:45:07 [scrapy.utils.log] (PID: 156) INFO: Scrapy 2.12.0 started (bot: catalog_extraction)
2025-11-30 01:45:07 [scrapy.utils.log] (PID: 156) INFO: Versions: lxml 5.3.1.0, libxml2 2.12.9, cssselect 1.3.0, parsel 1.10.0, w3lib 2.3.1, Twisted 24.11.0, Python 3.11.13 (main, Jun 10 2025, 23:54:42) [GCC 12.2.0], pyOpenSSL 25.0.0 (OpenSSL 3.4.1 11 Feb 2025), cryptography 44.0.2, Platform Linux-6.9.12-x86_64-with-glibc2.36
2025-11-30 01:45:08 [lowes] (PID: 156) INFO: Starting extraction spider lowes...
2025-11-30 01:45:08 [scrapy.addons] (PID: 156) INFO: Enabled addons: []
2025-11-30 01:45:08 [py.warnings] (PID: 156) WARNING: /usr/local/lib/python3.11/site-packages/scrapy/utils/request.py:120: ScrapyDeprecationWarning: 'REQUEST_FINGERPRINTER_IMPLEMENTATION' is a deprecated setting. It will be removed in a future version of Scrapy.
  return cls(crawler)
2025-11-30 01:45:08 [scrapy.extensions.telnet] (PID: 156) INFO: Telnet Password: 40afb7e23846a011
2025-11-30 01:45:08 [py.warnings] (PID: 156) WARNING: /var/lib/scrapyd/eggs/catalog_extraction/1758126308.egg/catalog_extraction/extensions/bq_feedstorage.py:33: ScrapyDeprecationWarning: scrapy.extensions.feedexport.build_storage() is deprecated, call the builder directly.
2025-11-30 01:45:08 [scrapy.middleware] (PID: 156) INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.closespider.CloseSpider',
 'scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats',
 'scrapy_playwright.memusage.ScrapyPlaywrightMemoryUsageExtension',
 'spidermon.contrib.scrapy.extensions.Spidermon']
2025-11-30 01:45:08 [scrapy.crawler] (PID: 156) INFO: Overridden settings:
{'BOT_NAME': 'catalog_extraction',
 'CONCURRENT_ITEMS': 250,
 'CONCURRENT_REQUESTS': 24,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'HTTPPROXY_ENABLED': False,
 'LOG_FILE': '/var/lib/scrapyd/logs/catalog_extraction/lowes/2fd96a34cd8e11f08c504200a9fe0102.log',
 'LOG_FORMAT': '%(asctime)s [%(name)s] (PID: %(process)d) %(levelname)s: %(message)s',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'catalog_extraction.spiders',
 'REQUEST_FINGERPRINTER_CLASS': 'scrapy_poet.ScrapyPoetRequestFingerprinter',
 'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
 'RETRY_HTTP_CODES': [500, 502, 503, 504, 522, 524, 408, 429, 403],
 'SPIDER_MODULES': ['catalog_extraction.spiders'],
 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
 'USER_AGENT': None}
2025-11-30 01:45:08 [scrapy-playwright] (PID: 156) WARNING: Connecting to remote browser, ignoring PLAYWRIGHT_LAUNCH_OPTIONS
2025-11-30 01:45:08 [scrapy-playwright] (PID: 156) WARNING: Connecting to remote browser, ignoring PLAYWRIGHT_LAUNCH_OPTIONS
2025-11-30 01:45:08 [scrapy-playwright] (PID: 156) WARNING: Connecting to remote browser, ignoring PLAYWRIGHT_LAUNCH_OPTIONS
2025-11-30 01:45:08 [scrapy-playwright] (PID: 156) WARNING: Connecting to remote browser, ignoring PLAYWRIGHT_LAUNCH_OPTIONS
2025-11-30 01:45:08 [scrapy_poet.injection] (PID: 156) INFO: Loading providers: [, , , , , , ]
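----------------------------------------------------------------------
[annotation] The ScrapyDeprecationWarning above fires because the
overridden settings still pin 'REQUEST_FINGERPRINTER_IMPLEMENTATION':
'2.7' alongside a custom REQUEST_FINGERPRINTER_CLASS. A minimal settings
sketch for clearing the warning, assuming the project keeps its
scrapy_poet fingerprinter (the project's actual settings.py is not part
of this log):

    # settings.py (sketch; not the project's actual file)
    # Keep the custom fingerprinter class already in use:
    REQUEST_FINGERPRINTER_CLASS = "scrapy_poet.ScrapyPoetRequestFingerprinter"
    # Drop the deprecated pin that Scrapy 2.12 warns will be removed:
    # REQUEST_FINGERPRINTER_IMPLEMENTATION = "2.7"
----------------------------------------------------------------------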
2025-11-30 01:45:08 [scrapy.middleware] (PID: 156) INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scraping_utils.middlewares.downloaders.HeadersSpooferDownloaderMiddleware',
 'scrapy_poet.InjectionMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy_poet.DownloaderStatsMiddleware']
2025-11-30 01:45:08 [NotFoundHandlerSpiderMiddleware] (PID: 156) INFO: NotFoundHandlerSpiderMiddleware running on PRODUCTION environment.
2025-11-30 01:45:08 [scrapy.middleware] (PID: 156) INFO: Enabled spider middlewares:
['catalog_extraction.middlewares.NotFoundHandlerSpiderMiddleware',
 'catalog_extraction.middlewares.FixtureSavingMiddleware',
 'scrapy_poet.RetryMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2025-11-30 01:45:08 [scrapy.middleware] (PID: 156) INFO: Enabled item pipelines:
['catalog_extraction.pipelines.DuplicatedSKUsFilterPipeline',
 'catalog_extraction.pipelines.DiscontinuedProductsAdjustmentPipeline',
 'catalog_extraction.pipelines.PriceRoundingPipeline',
 'scraping_utils.pipelines.AttachSupplierPipeline',
 'spidermon.contrib.scrapy.pipelines.ItemValidationPipeline']
2025-11-30 01:45:08 [scrapy.core.engine] (PID: 156) INFO: Spider opened
2025-11-30 01:45:08 [scrapy.extensions.closespider] (PID: 156) INFO: Spider will stop when no items are produced after 7200 seconds.
2025-11-30 01:45:09 [scrapy.extensions.logstats] (PID: 156) INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2025-11-30 01:45:09 [scrapy.extensions.telnet] (PID: 156) INFO: Telnet console listening on 127.0.0.1:6024
2025-11-30 01:45:09 [scrapy-playwright] (PID: 156) INFO: Starting download handler
2025-11-30 01:45:09 [scrapy-playwright] (PID: 156) INFO: Starting download handler
2025-11-30 01:45:14 [scrapy-playwright] (PID: 156) INFO: Connecting using CDP: wss://brd-customer-hl_13cda1e4-zone-main_scraping_browser:l9p73ctebkrc@brd.superproxy.io:9222
2025-11-30 01:45:14 [scrapy.core.scraper] (PID: 156) ERROR: Error downloading
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/twisted/internet/defer.py", line 2013, in _inlineCallbacks
    result = context.run(
             ^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/twisted/python/failure.py", line 467, in throwExceptionIntoGenerator
    return g.throw(self.value.with_traceback(self.tb))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/downloader/middleware.py", line 68, in process_request
    return (yield download_func(request, spider))
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/twisted/internet/defer.py", line 1253, in adapt
    extracted: _SelfResultT | Failure = result.result()
                                        ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy_playwright/handler.py", line 380, in _download_request
    return await self._download_request_with_retry(request=request, spider=spider)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy_playwright/handler.py", line 399, in _download_request_with_retry
    page = await self._create_page(request=request, spider=spider)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy_playwright_stealth/handler.py", line 38, in _create_page
    page = await super()._create_page(request, spider)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy_playwright/handler.py", line 297, in _create_page
    ctx_wrapper = await self._create_browser_context(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy_playwright/handler.py", line 250, in _create_browser_context
    await self._maybe_connect_remote_devtools()
  File "/usr/local/lib/python3.11/site-packages/scrapy_playwright/handler.py", line 215, in _maybe_connect_remote_devtools
    self.browser = await self.browser_type.connect_over_cdp(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/playwright/async_api/_generated.py", line 14835, in connect_over_cdp
    await self._impl_obj.connect_over_cdp(
  File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_browser_type.py", line 183, in connect_over_cdp
    response = await self._channel.send_return_as_dict("connectOverCDP", params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_connection.py", line 67, in send_return_as_dict
    return await self._connection.wrap_api_call(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_connection.py", line 528, in wrap_api_call
    raise rewrite_error(error, f"{parsed_st['apiName']}: {error}") from None
playwright._impl._errors.Error: BrowserType.connect_over_cdp: WebSocket error: wss://brd-customer-hl_13cda1e4-zone-main_scraping_browser:l9p73ctebkrc@brd.superproxy.io:9222/ 403 Auth Failed (customer_suspended) Account is suspended
Call log:
- wss://brd-customer-hl_13cda1e4-zone-main_scraping_browser:l9p73ctebkrc@brd.superproxy.io:9222/
- wss://brd-customer-hl_13cda1e4-zone-main_scraping_browser:l9p73ctebkrc@brd.superproxy.io:9222/ 403 Auth Failed (customer_suspended) Account is suspended
- wss://brd-customer-hl_13cda1e4-zone-main_scraping_browser:l9p73ctebkrc@brd.superproxy.io:9222/ error WebSocket was closed before the connection was established
- wss://brd-customer-hl_13cda1e4-zone-main_scraping_browser:l9p73ctebkrc@brd.superproxy.io:9222/ WebSocket was closed before the connection was established
- wss://brd-customer-hl_13cda1e4-zone-main_scraping_browser:l9p73ctebkrc@brd.superproxy.io:9222/ code=1006 reason=
2025-11-30 01:45:14 [scrapy.core.engine] (PID: 156) INFO: Closing spider (finished)
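----------------------------------------------------------------------
[annotation] The crawl's only request died here: the remote scraping
browser rejected the CDP handshake with "403 Auth Failed
(customer_suspended)", so this is an account problem rather than a
transient network error, and RETRY_HTTP_CODES cannot help because no
HTTP response ever reaches Scrapy. A standalone sketch for verifying the
endpoint before scheduling a crawl; the URL below is a placeholder for
the credentialed wss:// endpoint shown above:

    import asyncio
    from playwright.async_api import async_playwright, Error as PlaywrightError

    # Placeholder -- substitute the real credentialed endpoint from the settings.
    CDP_URL = "wss://<customer>:<password>@brd.superproxy.io:9222"

    async def check_cdp_endpoint(url: str) -> bool:
        """Return True if the remote browser accepts a CDP connection."""
        async with async_playwright() as pw:
            try:
                browser = await pw.chromium.connect_over_cdp(url, timeout=15_000)
            except PlaywrightError as exc:
                # A suspended account surfaces here, e.g.
                # "403 Auth Failed (customer_suspended)".
                print(f"endpoint rejected the connection: {exc}")
                return False
            print(f"connected; existing contexts: {len(browser.contexts)}")
            await browser.close()
            return True

    if __name__ == "__main__":
        asyncio.run(check_cdp_endpoint(CDP_URL))
----------------------------------------------------------------------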
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] ------------------------------ MONITORS ------------------------------
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] Extracted Items Monitor/test_stat_monitor... FAIL
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] Item Validation Monitor/test_stat_monitor... SKIPPED (Unable to find 'spidermon/validation/fields/errors' in job stats.)
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] Error Count Monitor/test_stat_monitor... FAIL
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] Warning Count Monitor/test_stat_monitor... OK
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] Finish Reason Monitor/Should have the expected finished reason(s)... OK
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] Unwanted HTTP codes monitor/Should not hit the limit of unwanted http status... OK
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] Field Coverage Monitor/test_check_if_field_coverage_rules_are_met... FAIL
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] Retry Count monitor/Should not hit the limit of requests that reached the maximum retry amount... OK
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] Downloader Exceptions monitor/test_stat_monitor... OK
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] Successful Requests monitor/Should have at least the minimum number of successful requests... OK
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] Total Requests monitor/Should not hit the total limit of requests... OK
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] ----------------------------------------------------------------------
2025-11-30 01:45:14 [lowes] (PID: 156) ERROR: [Spidermon] ======================================================================
FAIL: Extracted Items Monitor/test_stat_monitor
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/scrapy/monitors/base.py", line 177, in test_stat_monitor
    self.fail(message)
AssertionError: Unable to find 'item_scraped_count' in job stats.
2025-11-30 01:45:14 [lowes] (PID: 156) ERROR: [Spidermon] ======================================================================
FAIL: Error Count Monitor/test_stat_monitor
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/scrapy/monitors/base.py", line 184, in test_stat_monitor
    assertion_method(
AssertionError: Expecting 'log_count/ERROR' to be '<=' to '0.0'. Current value: '1'
2025-11-30 01:45:14 [lowes] (PID: 156) ERROR: [Spidermon] ======================================================================
FAIL: Field Coverage Monitor/test_check_if_field_coverage_rules_are_met
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/scrapy/monitors/monitors.py", line 477, in test_check_if_field_coverage_rules_are_met
    self.assertTrue(len(failures) == 0, msg=msg)
AssertionError: The following items did not meet field coverage rules:
    dict/inStock (expected 1.0, got 0)
    dict/name (expected 1.0, got 0)
    dict/prices (expected 1.0, got 0)
    dict/productStatus (expected 1.0, got 0)
    dict/supplier (expected 1.0, got 0)
    dict/supplierSku (expected 1.0, got 0)
    dict/url (expected 1.0, got 0)
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] 11 monitors in 0.005s
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] FAILED (failures=3, skipped=1)
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] -------------------------- FINISHED ACTIONS --------------------------
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] ----------------------------------------------------------------------
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] 0 actions in 0.000s
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] OK
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] --------------------------- PASSED ACTIONS ---------------------------
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] ----------------------------------------------------------------------
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] 0 actions in 0.000s
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] OK
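----------------------------------------------------------------------
[annotation] The three failures above are threshold monitors: Extracted
Items wants 'item_scraped_count' present in the stats (never set, since
zero items were scraped), Error Count wants 'log_count/ERROR' <= 0, and
Field Coverage expects 100% coverage for the seven dict/* fields. A
sketch of Spidermon settings that would produce these expectations,
reconstructed from the messages (the setting names are the standard
Spidermon ones; the project's actual values are not in this log):

    # settings.py (sketch; values inferred from the monitor output above)
    SPIDERMON_ENABLED = True
    SPIDERMON_MIN_ITEMS = 1    # Extracted Items Monitor (exact value unknown)
    SPIDERMON_MAX_ERRORS = 0   # Error Count Monitor: log_count/ERROR <= 0
    SPIDERMON_ADD_FIELD_COVERAGE = True
    SPIDERMON_FIELD_COVERAGE_RULES = {
        "dict/inStock": 1.0,
        "dict/name": 1.0,
        "dict/prices": 1.0,
        "dict/productStatus": 1.0,
        "dict/supplier": 1.0,
        "dict/supplierSku": 1.0,
        "dict/url": 1.0,
    }
----------------------------------------------------------------------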
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] --------------------------- FAILED ACTIONS ---------------------------
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] CustomTemplateSendSlackMessageSpiderFinished... ERROR
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] ----------------------------------------------------------------------
2025-11-30 01:45:14 [lowes] (PID: 156) ERROR: [Spidermon] ======================================================================
ERROR: CustomTemplateSendSlackMessageSpiderFinished
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/spidermon/core/actions.py", line 39, in run
    self.run_action()
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/actions/slack/__init__.py", line 252, in run_action
    self.manager.send_message(
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/actions/slack/__init__.py", line 58, in send_message
    return [
           ^
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/actions/slack/__init__.py", line 59, in <listcomp>
    self.send_message(
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/actions/slack/__init__.py", line 85, in send_message
    return self._send_channel_message(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/actions/slack/__init__.py", line 129, in _send_channel_message
    attachments=self._parse_attachments(attachments),
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/actions/slack/__init__.py", line 171, in _parse_attachments
    python_attachments = ast.literal_eval(attachments)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/ast.py", line 64, in literal_eval
    node_or_string = parse(node_or_string.lstrip(" \t"), mode='eval')
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/ast.py", line 50, in parse
    return compile(source, filename, mode, flags,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<unknown>", line 4
    "text": "• _Extracted Items Monitor/test_stat_monitor_: Unable to find 'item_scraped_count' in job stats.\n• _Error Count Monitor/test_stat_monitor_: Expecting 'log_count/ERROR' to be '<=' to '0.0'. Current value: '1'\n• _Field Coverage Monitor/test_check_if_field_coverage_rules_are_met_:
            ^
SyntaxError: unterminated string literal (detected at line 4)
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] 1 action in 0.221s
2025-11-30 01:45:14 [lowes] (PID: 156) INFO: [Spidermon] FAILED (errors=1)
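----------------------------------------------------------------------
[annotation] The Slack action itself crashed while reporting the monitor
failures: per the traceback, spidermon renders the attachments template
to a string and parses it back with ast.literal_eval, and the rendered
text above breaks off inside a string literal, so parsing fails. What
truncated the rendered template is not visible in this log. A minimal
reproduction of the failure mode:

    import ast

    # A dict literal cut off mid-string, like the rendered attachments above.
    truncated = '{\n    "text": "monitor failures: unterminated'

    try:
        ast.literal_eval(truncated)
    except SyntaxError as exc:
        # Matches the log: "unterminated string literal (detected at line N)"
        print(f"SyntaxError: {exc}")
----------------------------------------------------------------------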
2025-11-30 01:45:14 [scrapy.utils.signal] (PID: 156) ERROR: Error caught on signal handler: >
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/twisted/internet/defer.py", line 1253, in adapt
    extracted: _SelfResultT | Failure = result.result()
                                        ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/extensions/feedexport.py", line 504, in close_spider
    self._close_slot(slot, spider)
  File "/usr/local/lib/python3.11/site-packages/scrapy/extensions/feedexport.py", line 535, in _close_slot
    d: Deferred[None] = maybeDeferred(slot.storage.store, get_file(slot))  # type: ignore[call-overload]
                                                          ^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/extensions/feedexport.py", line 517, in get_file
    assert slot_.file
           ^^^^^^^^^^
AssertionError
2025-11-30 01:45:14 [scrapy.statscollectors] (PID: 156) INFO: Dumping Scrapy stats:
{'HeadersSpooferDownloaderMiddleware/ignored/playwright': 1,
 'downloader/exception_count': 1,
 'downloader/exception_type_count/playwright._impl._errors.Error': 1,
 'downloader/request_bytes': 157,
 'downloader/request_count': 1,
 'downloader/request_method_count/GET': 1,
 'elapsed_time_seconds': 5.552473,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2025, 11, 30, 1, 45, 14, 551711, tzinfo=datetime.timezone.utc),
 'items_per_minute': None,
 'log_count/ERROR': 6,
 'log_count/INFO': 44,
 'log_count/WARNING': 6,
 'memusage/max': 128892928,
 'memusage/startup': 128892928,
 'responses_per_minute': None,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'spidermon/validation/validators': 1,
 'spidermon/validation/validators/item/jsonschema': True,
 'start_time': datetime.datetime(2025, 11, 30, 1, 45, 8, 999238, tzinfo=datetime.timezone.utc)}
2025-11-30 01:45:14 [scrapy.core.engine] (PID: 156) INFO: Spider closed (finished)
2025-11-30 01:45:14 [scrapy-playwright] (PID: 156) INFO: Closing download handler
2025-11-30 01:45:14 [scrapy-playwright] (PID: 156) INFO: Closing download handler
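----------------------------------------------------------------------
[annotation] Two closing observations from the stats dump: the run ends
with finish_reason 'finished' despite the fatal downloader exception,
and 'item_scraped_count' never appears, so the FeedExporter slot never
opened a file, which is what trips the "assert slot_.file" in the
signal-handler traceback above when the custom bq_feedstorage backend is
closed. A defensive sketch that flags a zero-item run explicitly at
shutdown (the class name and wiring are illustrative, not part of the
project):

    from scrapy import signals

    class ZeroItemAlert:
        """Illustrative extension: warn loudly when a crawl ends with no items."""

        def __init__(self, stats):
            self.stats = stats

        @classmethod
        def from_crawler(cls, crawler):
            ext = cls(crawler.stats)
            crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
            return ext

        def spider_closed(self, spider, reason):
            scraped = self.stats.get_value("item_scraped_count", 0)
            exceptions = self.stats.get_value("downloader/exception_count", 0)
            if scraped == 0:
                spider.logger.error(
                    "Zero items scraped (finish_reason=%r, downloader "
                    "exceptions=%d); feed export has nothing to store.",
                    reason, exceptions,
                )

It would be enabled through the EXTENSIONS setting like any other Scrapy
extension.
----------------------------------------------------------------------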