I'm fetching and caching (for performance) lots of URLs with something like:

```python
import requests
import requests_cache
from multiprocessing.pool import ThreadPool

urls = ['http://www.google.com', ...]
with requests_cache.enabled():
    responses = ThreadPool(100).map(requests.get, urls)
```
However, I'm getting a lot of these errors:

```
sqlite3.OperationalError: database is locked
```
Clearly too many threads are accessing the cache at the same time.
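The error itself is easy to reproduce without any network traffic; this is a minimal sketch showing that a second SQLite connection fails with exactly this message while another connection holds the write lock (the table and file names are made up for the demo):

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'cache.sqlite')

# First connection takes the write lock, like one cache-writing thread.
# isolation_level=None puts the module in autocommit mode so the explicit
# BEGIN IMMEDIATE below is sent to SQLite as-is.
writer = sqlite3.connect(path, isolation_level=None)
writer.execute('CREATE TABLE responses (url TEXT, body TEXT)')
writer.execute('BEGIN IMMEDIATE')
writer.execute("INSERT INTO responses VALUES ('u1', 'b1')")

# Second connection, like another thread; timeout=0 fails fast
# instead of waiting for the lock to clear.
other = sqlite3.connect(path, timeout=0)
try:
    other.execute("INSERT INTO responses VALUES ('u2', 'b2')")
    error = None
except sqlite3.OperationalError as e:
    error = str(e)

writer.execute('COMMIT')
print(error)
```

With 100 pool threads all funneling writes into one SQLite file, collisions like this are essentially guaranteed.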
Does requests_cache support some kind of transaction, so that the writes only happen once all the threads have finished? E.g.

```python
with requests_cache.enabled():
    with requests_cache.transaction():
        responses = ThreadPool(100).map(requests.get, urls)
```
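In case it helps frame the question: the pattern I'm after could be approximated by fetching concurrently without touching the cache, then doing all the cache writes from a single thread after the pool finishes. This is only a sketch of that idea, with a plain dict standing in for the SQLite-backed cache and a fake `fetch` instead of `requests.get`:

```python
from multiprocessing.pool import ThreadPool

# Stand-in for the SQLite-backed cache; written from one thread only.
cache = {}

def fetch(url):
    # In real code this would be requests.get(url) with caching disabled,
    # so worker threads never touch the SQLite file.
    return 'body of ' + url  # hypothetical response body

urls = ['http://a.example', 'http://b.example']
bodies = ThreadPool(2).map(fetch, urls)

# Single-threaded "commit": every cache write happens here, serially,
# after all concurrent fetches are done.
for url, body in zip(urls, bodies):
    cache[url] = body
```

Is there a built-in way to get this batched-write behaviour, rather than wiring it up by hand?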