2025-01-13 02:31:11 [scrapy.utils.log] (PID: 10198) INFO: Scrapy 2.11.2 started (bot: catalog_extraction)
2025-01-13 02:31:11 [scrapy.utils.log] (PID: 10198) INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.11 (main, Dec 4 2024, 20:38:25) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-6.4.10-dirty-x86_64-with-glibc2.36
2025-01-13 02:31:11 [webstaurant] (PID: 10198) INFO: Starting extraction spider webstaurant...
2025-01-13 02:31:11 [scrapy.addons] (PID: 10198) INFO: Enabled addons:
[]
2025-01-13 02:31:11 [scrapy.extensions.telnet] (PID: 10198) INFO: Telnet Password: 0db80df6f334f733
2025-01-13 02:31:13 [scrapy.middleware] (PID: 10198) INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.closespider.CloseSpider',
 'scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats',
 'spidermon.contrib.scrapy.extensions.Spidermon']
2025-01-13 02:31:13 [scrapy.crawler] (PID: 10198) INFO: Overridden settings:
{'BOT_NAME': 'catalog_extraction',
 'CONCURRENT_ITEMS': 250,
 'CONCURRENT_REQUESTS': 24,
 'DOWNLOAD_DELAY': 1.25,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_FILE': '/var/lib/scrapyd/logs/catalog_extraction/webstaurant/6e5f8836d15611efa4cb4200a9fe0102.log',
 'LOG_FORMAT': '%(asctime)s [%(name)s] (PID: %(process)d) %(levelname)s: '
               '%(message)s',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'catalog_extraction.spiders',
 'REQUEST_FINGERPRINTER_CLASS': 'scrapy_poet.ScrapyPoetRequestFingerprinter',
 'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
 'RETRY_TIMES': 5,
 'SPIDER_MODULES': ['catalog_extraction.spiders'],
 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
 'USER_AGENT': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, '
               'like Gecko) Chrome/129.0.0.0 Safari/537.36'}
2025-01-13 02:31:14 [scrapy_poet.injection] (PID: 10198) INFO: Loading providers:
[, , , , , , ]
2025-01-13 02:31:14 [scrapy.middleware] (PID: 10198) INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scraping_utils.middlewares.downloaders.ProxyManagerDownloaderMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy_poet.InjectionMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy_poet.DownloaderStatsMiddleware']
2025-01-13 02:31:14 [twisted] (PID: 10198) CRITICAL: Unhandled error in Deferred:
2025-01-13 02:31:14 [twisted] (PID: 10198) CRITICAL:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/misc.py", line 82, in load_object
    obj = getattr(mod, name)
          ^^^^^^^^^^^^^^^^^^
AttributeError: module 'scraping_utils.middlewares.spider' has no attribute 'CloudflareSolverMiddleware'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/twisted/internet/defer.py", line 2003, in _inlineCallbacks
    result = context.run(gen.send, result)
  File "/usr/local/lib/python3.11/site-packages/scrapy/crawler.py", line 158, in crawl
    self.engine = self._create_engine()
  File "/usr/local/lib/python3.11/site-packages/scrapy/crawler.py", line 172, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/engine.py", line 101, in __init__
    self.scraper = Scraper(crawler)
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/scraper.py", line 103, in __init__
    self.spidermw: SpiderMiddlewareManager = SpiderMiddlewareManager.from_crawler(
  File "/usr/local/lib/python3.11/site-packages/scrapy/middleware.py", line 90, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/local/lib/python3.11/site-packages/scrapy/middleware.py", line 66, in from_settings
    mwcls = load_object(clspath)
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/misc.py", line 84, in load_object
    raise NameError(f"Module '{module}' doesn't define any object named '{name}'")
NameError: Module 'scraping_utils.middlewares.spider' doesn't define any object named 'CloudflareSolverMiddleware'
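The crash above happens while the spider middleware manager resolves dotted class paths from settings: the module `scraping_utils.middlewares.spider` imports fine, but it defines no `CloudflareSolverMiddleware`, so Scrapy's `load_object` converts the `AttributeError` into a `NameError` and the engine never starts. A minimal sketch of that resolution step (a simplified re-implementation, not Scrapy's actual code, demonstrated against the stdlib `math` module instead of the unavailable `scraping_utils` package):

```python
import importlib


def load_object(path: str):
    """Resolve a dotted path like 'pkg.module.Name' to the named object,
    mimicking scrapy.utils.misc.load_object's failure mode: if the module
    imports but lacks the attribute, raise NameError as seen in the log."""
    module, _, name = path.rpartition(".")
    mod = importlib.import_module(module)
    try:
        return getattr(mod, name)
    except AttributeError:
        raise NameError(
            f"Module '{module}' doesn't define any object named '{name}'"
        ) from None


# An existing attribute resolves normally:
sqrt = load_object("math.sqrt")
print(sqrt(9.0))  # 3.0

# A missing attribute reproduces the crash from the traceback:
try:
    load_object("math.no_such_name")
except NameError as exc:
    print(exc)  # Module 'math' doesn't define any object named 'no_such_name'
```

The practical implication is that the dotted path listed under `SPIDER_MIDDLEWARES` (or wherever `CloudflareSolverMiddleware` is registered) no longer matches the class actually defined in `scraping_utils.middlewares.spider`; the setting and the module need to be brought back in sync.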