Description
If a filelock.Timeout is raised, the resulting exception does not pickle correctly.
This is a problem because the concurrent.futures module assumes exceptions can be
pickled so that they can be sent back to the main process.
filelock: 3.10.0
Python: 3.11.2
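For context, the helper below is a rough, hypothetical sketch of the round trip concurrent.futures relies on (the real machinery lives in concurrent.futures.process and multiprocessing.connection); it only illustrates that a worker's exception is pickled and then unpickled on the parent side before being re-raised.

import pickle

def run_in_worker(fn, *args):
    # "Worker" side: pickle either the result or the raised exception.
    try:
        payload = pickle.dumps(("ok", fn(*args)))
    except BaseException as exc:
        payload = pickle.dumps(("err", exc))
    # "Parent" side: unpickle, and re-raise the exception if there was one.
    kind, value = pickle.loads(payload)
    if kind == "err":
        raise value
    return value

run_in_worker(lambda: 1 / 0)  # a well-behaved exception round-trips and is re-raised here
# With filelock 3.10.0, a function raising filelock.Timeout instead breaks at the
# pickle.loads step, which is essentially the failure shown in the traceback below.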
Steps to Reproduce
Run the following file:
# demo.py
import time
import concurrent.futures

from filelock import FileLock

file_path = "high_ground.txt"
lock_path = "high_ground.txt.lock"
lock = FileLock(lock_path, timeout=1)

def worker_use_lock(msg):
    with lock:
        with open(file_path, "a") as f:
            f.write(f"{msg}\n")
        # Simulate doing some work with the lock held
        time.sleep(10)
    return msg

def main():
    # Create executor to call function in another process
    with concurrent.futures.ProcessPoolExecutor(2) as executor:
        messages = ["Hello there!", "General Kenobi!"]
        for msg in executor.map(worker_use_lock, messages):
            print(f"Saved message {msg}")

if __name__ == "__main__":
    main()
It fails with the following traceback (the second worker times out waiting for the lock, and the resulting Timeout cannot be unpickled in the parent process):
concurrent.futures.process._RemoteTraceback:
'''
Traceback (most recent call last):
  File ".../concurrent/futures/process.py", line 414, in wait_result_broken_or_wakeup
    result_item = result_reader.recv()
                  ^^^^^^^^^^^^^^^^^^^^
  File ".../multiprocessing/connection.py", line 250, in recv
    return _ForkingPickler.loads(buf.getbuffer())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: Timeout.__init__() missing 1 required positional argument: 'lock_file'
'''

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "demo.py", line 27, in <module>
    main()
  File "demo.py", line 23, in main
    for msg in executor.map(worker_use_lock, messages):
  File ".../concurrent/futures/process.py", line 597, in _chain_from_iterable_of_lists
    for element in iterable:
  File ".../concurrent/futures/_base.py", line 619, in result_iterator
    yield _result_or_cancel(fs.pop())
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../concurrent/futures/_base.py", line 317, in _result_or_cancel
    return fut.result(timeout)
           ^^^^^^^^^^^^^^^^^^^
  File ".../concurrent/futures/_base.py", line 456, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File ".../concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
concurrent.futures.process.BrokenProcessPool: A process in the process pool was terminated abruptly while the future was running or pending.
The message TypeError: Timeout.__init__() missing 1 required positional argument: 'lock_file'
is rather cryptic, but after some investigation the problem appears to be in how filelock._error.Timeout
is implemented: the default pickling behavior (via __reduce__) does not account for the custom
lock_file argument, so the exception cannot be reconstructed when it is unpickled.
This can be shown with a minimal example:
import filelock, pickle
exc = filelock.Timeout("/path/to/lock")
exc2 = pickle.loads(pickle.dumps(exc)) # TypeError: Timeout.__init__() missing 1 required positional argument: 'lock_file'
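For reference, here is a toy sketch of one way this failure arises (the class below is hypothetical, not the filelock source): by default an exception is reconstructed roughly as type(exc)(*exc.args), so if the lock file path never ends up in exc.args, unpickling calls the constructor without its required argument.

import pickle

class BrokenTimeout(TimeoutError):
    """Hypothetical stand-in with the same failure mode as filelock.Timeout."""

    def __init__(self, lock_file: str) -> None:
        super().__init__()           # no arguments forwarded, so self.args ends up empty
        self._lock_file = lock_file  # the path only lives on the instance

exc = BrokenTimeout("/path/to/lock")
print(exc.args)                      # () -- the lock file path is not part of args
pickle.loads(pickle.dumps(exc))      # TypeError: __init__() missing 1 required positional argument: 'lock_file'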
Expected Behavior
filelock.Timeout should pickle correctly, allowing the exception to be handled properly by concurrent.futures.
There should also be a test case to make sure that exception pickling keeps working; a possible sketch follows.
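A minimal pytest-style test could look like this (the test name and assertions are only suggestions; it fails on 3.10.0 and should pass once the change below is applied):

import pickle

from filelock import Timeout

def test_timeout_pickle_round_trip():
    exc = Timeout("/path/to/lock")
    exc2 = pickle.loads(pickle.dumps(exc))
    assert isinstance(exc2, Timeout)
    assert exc2.lock_file == "/path/to/lock"
    assert str(exc2) == str(exc)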
Suggested Change
I suggest the following Timeout class be adopted:
from typing import Any, Tuple, Union

class Timeout(TimeoutError):
    """Raised when the lock could not be acquired in *timeout* seconds."""

    def __init__(self, lock_file: str) -> None:
        super().__init__(f"The file lock '{lock_file}' could not be acquired.")
        #: The path of the file lock; stored as ``filename`` so the name of the lock file is visible.
        self.filename = lock_file

    def __reduce__(self) -> Union[str, Tuple[Any, ...]]:
        # Properly pickle the exception
        return self.__class__, (self.filename,)

    def __str__(self) -> str:
        return self.args[0]

    @property
    def lock_file(self) -> str:
        # For backwards compatibility
        return self.filename
This version uses OSError's filename attribute as the standard way of marking the file associated with the error (TimeoutError is a subclass of OSError). A lock_file property is added so the old attribute name still works.
Providing an explicit __reduce__() method corrects the pickling error. (The type hint for __reduce__() is based on the discussion in python/typeshed#3452.)
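With this class in place, the isolated reproduction from above round-trips cleanly (a quick sketch, assuming the class above replaces filelock._error.Timeout):

import pickle

exc = Timeout("/path/to/lock")
exc2 = pickle.loads(pickle.dumps(exc))

assert exc2.filename == "/path/to/lock"   # new OSError-style attribute
assert exc2.lock_file == "/path/to/lock"  # old name still works via the property
assert str(exc2) == "The file lock '/path/to/lock' could not be acquired."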