2026-02-28 00:00:11 [scrapy.utils.log] (PID: 39) INFO: Scrapy 2.12.0 started (bot: catalog_extraction)
2026-02-28 00:00:11 [scrapy.utils.log] (PID: 39) INFO: Versions: lxml 5.3.1.0, libxml2 2.12.9, cssselect 1.3.0, parsel 1.10.0, w3lib 2.3.1, Twisted 24.11.0, Python 3.11.13 (main, Jun 10 2025, 23:54:42) [GCC 12.2.0], pyOpenSSL 25.0.0 (OpenSSL 3.4.1 11 Feb 2025), cryptography 44.0.2, Platform Linux-6.9.12-x86_64-with-glibc2.36
2026-02-28 00:00:12 [twisted] (PID: 39) CRITICAL: Unhandled error in Deferred:
2026-02-28 00:00:12 [twisted] (PID: 39) CRITICAL: Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/twisted/internet/defer.py", line 2017, in _inlineCallbacks
    result = context.run(gen.send, result)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/crawler.py", line 149, in crawl
    self.spider = self._create_spider(*args, **kwargs)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/crawler.py", line 163, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/var/lib/scrapyd/eggs/catalog_extraction/1769537350.egg/catalog_extraction/spiders/grainger.py", line 45, in from_crawler
    spider = super().from_crawler(crawler, *args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/var/lib/scrapyd/eggs/catalog_extraction/1769537350.egg/catalog_extraction/spiders/__init__.py", line 268, in from_crawler
    raise ValueError(
ValueError: The environment variable `BRD_SCRAPING_BROWSER` is required to start a spider based on BaseCDPExtractionSpider
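The crash above comes from a startup guard in the project's spider base class: `from_crawler` refuses to build any `BaseCDPExtractionSpider` subclass unless `BRD_SCRAPING_BROWSER` is set. The project's actual code is not shown in the log, so the following is only a hedged sketch of what such a guard typically looks like; the helper name `require_scraping_browser_endpoint` is invented for illustration.

```python
import os


def require_scraping_browser_endpoint() -> str:
    """Return the remote CDP browser endpoint, or fail fast with the
    same ValueError seen in the log when the variable is unset/empty.

    Sketch only: mirrors the check raised from
    catalog_extraction/spiders/__init__.py, not the real implementation.
    """
    endpoint = os.environ.get("BRD_SCRAPING_BROWSER")
    if not endpoint:
        raise ValueError(
            "The environment variable `BRD_SCRAPING_BROWSER` is required "
            "to start a spider based on BaseCDPExtractionSpider"
        )
    return endpoint
```

In a Scrapy project this check would normally run inside the base class's `from_crawler` (the frame at line 268 in the traceback), so a misconfigured scrapyd deployment fails immediately at spider creation rather than mid-crawl. The fix is to export the variable in the scrapyd environment (e.g. `BRD_SCRAPING_BROWSER=wss://...`) before scheduling the job.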