I ran an instance with 2 crawlers over the top 1000 sites. At some point during the test, Crawler 1 failed and never recovered. I had the timeout set at 60 seconds. Here's the readout from the console when the crawler died:
Crawler 1 timed out fetching http://www.patch.com/
Stopping Crawler 1
Starting Crawler 1
Process Crawler 1:
Traceback (most recent call last):
  File "/home/alex/chameleon-crawler/crawler/crawler_manager.py", line 43, in __init__
    timeout * ((num_timeouts + 1) ** 2)
  File "/usr/lib/python3.5/multiprocessing/queues.py", line 105, in get
    raise Empty
queue.Empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/process.py", line 249, in _bootstrap
    self.run()
  File "/usr/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/home/alex/chameleon-crawler/crawler/crawler_process.py", line 35, in __init__
    self.crawl()
  File "/home/alex/chameleon-crawler/crawler/crawler_process.py", line 40, in crawl
    with self.selenium():
  File "/usr/lib/python3.5/contextlib.py", line 59, in __enter__
    return next(self.gen)
  File "/home/alex/chameleon-crawler/crawler/crawler_process.py", line 106, in selenium
    self.startup()
  File "/home/alex/chameleon-crawler/crawler/crawler_process.py", line 122, in startup
    self.driver = webdriver.Chrome(chrome_options=opts)
  File "/home/alex/.local/lib/python3.5/site-packages/selenium/webdriver/chrome/webdriver.py", line 65, in __init__
    keep_alive=True)
  File "/home/alex/.local/lib/python3.5/site-packages/selenium/webdriver/remote/webdriver.py", line 73, in __init__
    self.start_session(desired_capabilities, browser_profile)
  File "/home/alex/.local/lib/python3.5/site-packages/selenium/webdriver/remote/webdriver.py", line 121, in start_session
    'desiredCapabilities': desired_capabilities,
  File "/home/alex/.local/lib/python3.5/site-packages/selenium/webdriver/remote/webdriver.py", line 173, in execute
    self.error_handler.check_response(response)
  File "/home/alex/.local/lib/python3.5/site-packages/selenium/webdriver/remote/errorhandler.py", line 166, in check_response
    raise exception_class(message, screen, stacktrace)
selenium.common.exceptions.WebDriverException: Message: unknown error: failed to wait for extension background page to load: chrome-extension://mcgekeccgjgcmhnhbabplanchdogjcnh/_generated_background_page.html
from timeout
  (Driver info: chromedriver=2.29,platform=Linux 4.4.0-87-generic x86_64)
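
For context, the first frame shows the manager waiting on its result queue with a timeout of timeout * ((num_timeouts + 1) ** 2), i.e. quadratic backoff: with timeout set to 60, consecutive timeouts wait 60 s, 240 s, 540 s, and so on. Here is a minimal sketch of the restart loop that traceback implies; this is my reconstruction, not the project's actual code, and names like supervise and make_crawler are hypothetical:

    # Hypothetical reconstruction of the manager's restart loop, based only
    # on the traceback above -- not the actual chameleon-crawler code.
    import queue
    from multiprocessing import Process, Queue
    from typing import Callable

    def supervise(make_crawler: Callable[[], Process], results: Queue,
                  timeout: float = 60.0) -> None:
        """Restart the crawler process whenever the result queue stays empty too long."""
        num_timeouts = 0
        crawler = make_crawler()
        crawler.start()
        while True:
            try:
                # Quadratic backoff, matching the first frame of the traceback:
                # with timeout=60, consecutive waits are 60 s, 240 s, 540 s, ...
                result = results.get(timeout=timeout * ((num_timeouts + 1) ** 2))
            except queue.Empty:
                num_timeouts += 1
                crawler.terminate()  # "Stopping Crawler 1"
                crawler.join()
                crawler = make_crawler()
                crawler.start()      # "Starting Crawler 1"
            else:
                num_timeouts = 0
                print(result)        # stand-in for recording a crawl result

If webdriver.Chrome() raises on every restart attempt, as the second traceback shows, the replacement process dies during startup before it can ever put anything on the queue, which would explain why Crawler 1 never recovered.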