2024-11-22 00:52:14 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: catalog_discovery)
2024-11-22 00:52:14 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.10 (main, Nov 12 2024, 02:25:24) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-6.4.10-dirty-x86_64-with-glibc2.36
2024-11-22 00:52:14 [scrapy.addons] INFO: Enabled addons:
[]
2024-11-22 00:52:14 [scrapy.extensions.telnet] INFO: Telnet Password: 8f4f1ed00de4d4f0
2024-11-22 00:52:14 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.closespider.CloseSpider',
 'scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats',
 'spidermon.contrib.scrapy.extensions.Spidermon']
2024-11-22 00:52:14 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'catalog_discovery',
 'CONCURRENT_ITEMS': 1000,
 'CONCURRENT_REQUESTS': 32,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_FILE': '/var/lib/scrapyd/logs/catalog_discovery/office_depot/0161746ea86c11ef823f4200a9fe0102.log',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'catalog_discovery.spiders',
 'REQUEST_FINGERPRINTER_CLASS': 'scrapy_poet.ScrapyPoetRequestFingerprinter',
 'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
 'RETRY_TIMES': 5,
 'SPIDER_MODULES': ['catalog_discovery.spiders'],
 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
 'USER_AGENT': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:125.0) '
               'Gecko/20100101 Firefox/125.0'}
2024-11-22 00:52:14 [scrapy_poet.injection] INFO: Loading providers: [, , , , , , ]
2024-11-22 00:52:14 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scraping_utils.middlewares.downloaders.ProxyManagerDownloaderMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy_poet.InjectionMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy_poet.DownloaderStatsMiddleware']
2024-11-22 00:52:14 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy_poet.RetryMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2024-11-22 00:52:14 [twisted] CRITICAL: Unhandled error in Deferred:
2024-11-22 00:52:14 [twisted] CRITICAL:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/twisted/internet/defer.py", line 2003, in _inlineCallbacks
    result = context.run(gen.send, result)
  File "/usr/local/lib/python3.11/site-packages/scrapy/crawler.py", line 158, in crawl
    self.engine = self._create_engine()
  File "/usr/local/lib/python3.11/site-packages/scrapy/crawler.py", line 172, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
File "/usr/local/lib/python3.11/site-packages/scrapy/core/engine.py", line 101, in __init__ self.scraper = Scraper(crawler) File "/usr/local/lib/python3.11/site-packages/scrapy/core/scraper.py", line 109, in __init__ self.itemproc: ItemPipelineManager = itemproc_cls.from_crawler(crawler) File "/usr/local/lib/python3.11/site-packages/scrapy/middleware.py", line 90, in from_crawler return cls.from_settings(crawler.settings, crawler) File "/usr/local/lib/python3.11/site-packages/scrapy/middleware.py", line 67, in from_settings mw = create_instance(mwcls, settings, crawler) File "/usr/local/lib/python3.11/site-packages/scrapy/utils/misc.py", line 188, in create_instance instance = objcls.from_crawler(crawler, *args, **kwargs) File "/var/lib/scrapyd/eggs/catalog_discovery/1732236689.egg/catalog_discovery/pipelines.py", line 58, in from_crawler TypeError: CatalogItemFilterPipeline.__init__() takes 3 positional arguments but 5 were given