2024-09-25 13:38:32 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: catalog_discovery)
2024-09-25 13:38:32 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.9 (main, Aug 13 2024, 01:19:58) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-6.4.10-dirty-x86_64-with-glibc2.36
2024-09-25 13:38:32 [scrapy.addons] INFO: Enabled addons:
[]
2024-09-25 13:38:32 [scrapy.extensions.telnet] INFO: Telnet Password: 1273dd69766319ef
2024-09-25 13:38:32 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats',
 'spidermon.contrib.scrapy.extensions.Spidermon']
2024-09-25 13:38:32 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'catalog_discovery',
 'CONCURRENT_ITEMS': 1000,
 'CONCURRENT_REQUESTS': 32,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_FILE': '/var/lib/scrapyd/logs/catalog_discovery/lowes/71705b7c7b4311efa9cf4200a9fe0102.log',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'catalog_discovery.spiders',
 'REQUEST_FINGERPRINTER_CLASS': 'scrapy_poet.ScrapyPoetRequestFingerprinter',
 'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
 'RETRY_TIMES': 5,
 'SPIDER_MODULES': ['catalog_discovery.spiders'],
 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
 'USER_AGENT': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:125.0) '
               'Gecko/20100101 Firefox/125.0'}
2024-09-25 13:38:33 [scrapy_poet.injection] INFO: Loading providers: [, , , , , , ]
2024-09-25 13:38:33 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy_poet.InjectionMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy_poet.DownloaderStatsMiddleware']
2024-09-25 13:38:33 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy_poet.RetryMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2024-09-25 13:38:33 [scrapy.middleware] INFO: Enabled item pipelines:
['scraping_utils.pipelines.DuplicatesFilterPipeline',
 'scraping_utils.pipelines.AttachSupplierPipeline',
 'spidermon.contrib.scrapy.pipelines.ItemValidationPipeline']
2024-09-25 13:38:33 [scrapy.core.engine] INFO: Spider opened
2024-09-25 13:38:33 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2024-09-25 13:38:33 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2024-09-25 13:38:33 [scrapy.core.scraper] ERROR: Error downloading
Traceback (most recent call last):
"/usr/local/lib/python3.11/site-packages/twisted/internet/defer.py", line 1999, in _inlineCallbacks result = context.run( File "/usr/local/lib/python3.11/site-packages/twisted/python/failure.py", line 519, in throwExceptionIntoGenerator return g.throw(self.value.with_traceback(self.tb)) File "/usr/local/lib/python3.11/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request return (yield download_func(request=request, spider=spider)) File "/usr/local/lib/python3.11/site-packages/scrapy/utils/defer.py", line 81, in mustbe_deferred result = f(*args, **kw) File "/usr/local/lib/python3.11/site-packages/scrapy/core/downloader/handlers/__init__.py", line 86, in download_request return cast(Deferred, handler.download_request(request, spider)) File "/usr/local/lib/python3.11/site-packages/scrapy/core/downloader/handlers/http11.py", line 72, in download_request return agent.download_request(request) File "/usr/local/lib/python3.11/site-packages/scrapy/core/downloader/handlers/http11.py", line 362, in download_request agent = self._get_agent(request, timeout) File "/usr/local/lib/python3.11/site-packages/scrapy/core/downloader/handlers/http11.py", line 326, in _get_agent proxyScheme, proxyNetloc, proxyHost, proxyPort, proxyParams = _parse(proxy) File "/usr/local/lib/python3.11/site-packages/scrapy/core/downloader/webclient.py", line 42, in _parse return _parsed_url_args(parsed) File "/usr/local/lib/python3.11/site-packages/scrapy/core/downloader/webclient.py", line 24, in _parsed_url_args port = parsed.port File "/usr/local/lib/python3.11/urllib/parse.py", line 182, in port raise ValueError(f"Port could not be cast to integer value as {port!r}") ValueError: Port could not be cast to integer value as 'None' 2024-09-25 13:38:33 [scrapy.core.engine] INFO: Closing spider (finished) 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] ------------------------------ MONITORS ------------------------------ 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Extracted Items Monitor/test_stat_monitor... FAIL 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Item Validation Monitor/test_stat_monitor... SKIPPED (Unable to find 'spidermon/validation/fields/errors' in job stats.) 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Error Count Monitor/test_stat_monitor... FAIL 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Warning Count Monitor/test_stat_monitor... SKIPPED (Unable to find 'log_count/WARNING' in job stats.) 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Finish Reason Monitor/Should have the expected finished reason(s)... OK 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Unwanted HTTP codes monitor/Should not hit the limit of unwanted http status... OK 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Field Coverage Monitor/test_check_if_field_coverage_rules_are_met... FAIL 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Retry Count monitor/Should not hit the limit of requests that reached the maximum retry amount... OK 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Downloader Exceptions monitor/test_stat_monitor... OK 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Successful Requests monitor/Should have at least the minimum number of successful requests... OK 2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Total Requests monitor/Should not hit the total limit of requests... 
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] ------------------------------ MONITORS ------------------------------
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Extracted Items Monitor/test_stat_monitor... FAIL
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Item Validation Monitor/test_stat_monitor... SKIPPED (Unable to find 'spidermon/validation/fields/errors' in job stats.)
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Error Count Monitor/test_stat_monitor... FAIL
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Warning Count Monitor/test_stat_monitor... SKIPPED (Unable to find 'log_count/WARNING' in job stats.)
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Finish Reason Monitor/Should have the expected finished reason(s)... OK
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Unwanted HTTP codes monitor/Should not hit the limit of unwanted http status... OK
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Field Coverage Monitor/test_check_if_field_coverage_rules_are_met... FAIL
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Retry Count monitor/Should not hit the limit of requests that reached the maximum retry amount... OK
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Downloader Exceptions monitor/test_stat_monitor... OK
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Successful Requests monitor/Should have at least the minimum number of successful requests... OK
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] Total Requests monitor/Should not hit the total limit of requests... OK
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] ----------------------------------------------------------------------
2024-09-25 13:38:33 [lowes] ERROR: [Spidermon] ======================================================================
FAIL: Extracted Items Monitor/test_stat_monitor
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/scrapy/monitors/base.py", line 177, in test_stat_monitor
    self.fail(message)
AssertionError: Unable to find 'item_scraped_count' in job stats.
2024-09-25 13:38:33 [lowes] ERROR: [Spidermon] ======================================================================
FAIL: Error Count Monitor/test_stat_monitor
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/scrapy/monitors/base.py", line 184, in test_stat_monitor
    assertion_method(
AssertionError: Expecting 'log_count/ERROR' to be '<=' to '0.0'. Current value: '1'
2024-09-25 13:38:33 [lowes] ERROR: [Spidermon] ======================================================================
FAIL: Field Coverage Monitor/test_check_if_field_coverage_rules_are_met
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/spidermon/contrib/scrapy/monitors/monitors.py", line 476, in test_check_if_field_coverage_rules_are_met
    self.assertTrue(len(failures) == 0, msg=msg)
AssertionError: The following items did not meet field coverage rules:
dict/url (expected 1.0, got 0)
dict/supplier (expected 1.0, got 0)
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] 11 monitors in 0.006s
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] FAILED (failures=3, skipped=2)
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] -------------------------- FINISHED ACTIONS --------------------------
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] ----------------------------------------------------------------------
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] 0 actions in 0.000s
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] OK
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] --------------------------- PASSED ACTIONS ---------------------------
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] ----------------------------------------------------------------------
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] 0 actions in 0.000s
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] OK
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] --------------------------- FAILED ACTIONS ---------------------------
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] ----------------------------------------------------------------------
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] 0 actions in 0.000s
2024-09-25 13:38:33 [lowes] INFO: [Spidermon] OK
2024-09-25 13:38:33 [scrapy.extensions.feedexport] INFO: No data to insert into BigQuery - closing feed storage
2024-09-25 13:38:33 [scrapy.extensions.feedexport] INFO: Stored bq feed (0 items) in: bq://response-elt.dev_scrapers.catalog_urls/batch:1
2024-09-25 13:38:33 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/exception_count': 1,
 'downloader/exception_type_count/builtins.ValueError': 1,
 'downloader/request_bytes': 315,
 'downloader/request_count': 1,
 'downloader/request_method_count/GET': 1,
 'elapsed_time_seconds': 0.355364,
 'feedexport/success_count/BigQueryFeedStorage': 1,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2024, 9, 25, 13, 38, 33, 402809, tzinfo=datetime.timezone.utc),
 'log_count/ERROR': 4,
 'log_count/INFO': 40,
 'memusage/max': 113819648,
 'memusage/startup': 113819648,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'spidermon/validation/validators': 1,
 'spidermon/validation/validators/item/jsonschema': True,
 'start_time': datetime.datetime(2024, 9, 25, 13, 38, 33, 47445, tzinfo=datetime.timezone.utc)}
2024-09-25 13:38:33 [scrapy.core.engine] INFO: Spider closed (finished)
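Of the three Spidermon failures, Extracted Items and Error Count follow mechanically from the crawl producing zero items and one ERROR log line. The Field Coverage failure points at rules requiring url and supplier on every item; in Spidermon's SPIDERMON_FIELD_COVERAGE_RULES convention those rules would look roughly like the sketch below (the project's actual settings are not visible in this log):

    # settings.py (sketch) -- rules consistent with the monitor output above.
    SPIDERMON_ENABLED = True
    SPIDERMON_ADD_FIELD_COVERAGE = True  # record per-field coverage in job stats

    # "<item type>/<field>": minimum fraction of items carrying the field.
    # Items are plain dicts here, hence the "dict/" prefix in the message.
    SPIDERMON_FIELD_COVERAGE_RULES = {
        "dict/url": 1.0,       # every item must have a url
        "dict/supplier": 1.0,  # every item must have a supplier
    }

With zero items scraped, coverage for both fields evaluates to 0, so this failure is a downstream symptom of the proxy error rather than an extraction regression.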