././@PaxHeader0000000000000000000000000000003200000000000010210 xustar0026 mtime=1688244862.32539 crochet-2.1.1/0000755000175100001730000000000014450111176012543 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/LICENSE0000644000175100001730000000210214450111171013536 0ustar00runnerdockerCopyright (c) 2013 Itamar Turner-Trauring and Twisted Matrix Labs Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/MANIFEST.in0000644000175100001730000000027214450111171014275 0ustar00runnerdockerinclude LICENSE include README.rst include requirements-dev.txt recursive-include docs * prune docs/_build recursive-include examples * include versioneer.py include crochet/_version.py ././@PaxHeader0000000000000000000000000000003200000000000010210 xustar0026 mtime=1688244862.32539 crochet-2.1.1/PKG-INFO0000644000175100001730000002355214450111176013647 0ustar00runnerdockerMetadata-Version: 2.1 Name: crochet Version: 2.1.1 Summary: Use Twisted anywhere! Home-page: https://github.com/itamarst/crochet Maintainer: Itamar Turner-Trauring Maintainer-email: itamar@itamarst.org License: MIT Keywords: twisted threading Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: 3.11 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Programming Language :: Python :: Implementation :: PyPy Requires-Python: >=3.8.0 License-File: LICENSE Crochet: Use Twisted anywhere! ============================== Crochet is an MIT-licensed library that makes it easier to use Twisted from regular blocking code. Some use cases include: * Easily use Twisted from a blocking framework like Django or Flask. * Write a library that provides a blocking API, but uses Twisted for its implementation. * Port blocking code to Twisted more easily, by keeping a backwards compatibility layer. * Allow normal Twisted programs that use threads to interact with Twisted more cleanly from their threaded parts. For example, this can be useful when using Twisted as a `WSGI container`_. .. 
_WSGI container: https://twistedmatrix.com/documents/current/web/howto/web-in-60/wsgi.html Crochet is maintained by Itamar Turner-Trauring. **Note:** Crochet development is pretty slow these days because mostly it **Just Works**. PyPI shows about 30,000 downloads a month, so existing users seem happy: https://pypistats.org/packages/crochet You can install Crochet by running:: $ pip install crochet Downloads are available on `PyPI`_. Documentation can be found on `Read The Docs`_. Bugs and feature requests should be filed at the project `Github page`_. .. _Read the Docs: https://crochet.readthedocs.org/ .. _Github page: https://github.com/itamarst/crochet/ .. _PyPI: https://pypi.python.org/pypi/crochet API and features ================ Crochet supports Python 3.8, 3.9, 3.10, and 3.11 as well as PyPy3. Crochet provides the following basic APIs: * Allow blocking code to call into Twisted and block until results are available or a timeout is hit, using the ``crochet.wait_for`` decorator. * A lower-level API (``crochet.run_in_reactor``) allows blocking code to run code "in the background" in the Twisted thread, with the ability to repeatedly check if it's done. Crochet will do the following on your behalf in order to enable these APIs: * Transparently start Twisted's reactor in a thread it manages. * Shut down the reactor automatically when the process' main thread finishes. * Hook up Twisted's log system to the Python standard library ``logging`` framework. Unlike Twisted's built-in ``logging`` bridge, this includes support for blocking `Handler` instances. What's New ========== 2.1.0 ^^^^^ * Various internal modernizations and maintenance. * Dropped Python 3.6 and 3.7 support. 2.0.0 ^^^^^ New features: * It's possible to decorate ``async/await`` Twisted functions with ``@wait_for`` and ``@run_in_reactor``, thanks to Árni Már Jónsson. * Added type hints, thanks to Merlin Davis. * Added formal support for Python 3.9. Removed features: * Dropped the deprecated APIs ``@wait_for_reactor``, ``@in_reactor``, ``DeferredResult``, the ``wrapped_function`` attribute, and unlimited timeouts on ``EventualResult.wait()``. * Dropped support for Python 2.7 and 3.5. 1.12.0 ^^^^^^ Bug fixes: * Fix a timeout overflow bug in 32-bit machines. 1.11.0 ^^^^^^ New features: * Added support for Python 3.8 and PyPy 3. Backwards incompatibility: * Dropped support for Python 3.4, since latest Twisted doesn't support it. 1.10.0 ^^^^^^ New features: * Added support for Python 3.7. Thanks to Jeremy Cline for the patch. 1.9.0 ^^^^^ New features: * The underlying callable wrapped ``@run_in_reactor`` and ``@wait_for`` is now available via the more standard ``__wrapped__`` attribute. Backwards incompatibility (in tests): * This was actually introduced in 1.8.0: ``wrapped_function`` may not always be available on decorated callables. You should use ``__wrapped__`` instead. Bug fixes: * Fixed regression in 1.8.0 where bound method couldn't be wrapped. Thanks to 2mf for the bug report. 1.8.0 ^^^^^ New features: * Signatures on decorated functions now match the original functions. Thanks to Mikhail Terekhov for the original patch. * Documentation improvements, including an API reference. Bug fixes: * Switched to EPoll reactor for logging thread. Anecdotal evidence suggests this fixes some issues on AWS Lambda, but it's not clear why. Thanks to Rolando Espinoza for the patch. * It's now possible to call ``@run_in_reactor`` and ``@wait_for`` above a ``@classmethod``. Thanks to vak for the bug report. 
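As a hedged illustration of the ``__wrapped__`` attribute described in the notes above (the function name here is invented, not part of Crochet)::

    import crochet
    crochet.setup()

    @crochet.run_in_reactor
    def double(x):
        # Runs in the reactor thread; a plain return value is wrapped for us.
        return x * 2

    assert double.__wrapped__(3) == 6        # call the original synchronously
    assert double(21).wait(timeout=5) == 42  # normal use returns an EventualResult
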
1.7.0 ^^^^^ Bug fixes: * If the Python ``logging.Handler`` throws an exception Crochet no longer goes into a death spiral. Thanks to Michael Schlenker for the bug report. Removed features: * Versions of Twisted < 16.0 are no longer supported (i.e. no longer tested in CI.) 1.6.0 ^^^^^ New features: * Added support for Python 3.6. 1.5.0 ^^^^^ New features: * Added support for Python 3.5. Removed features: * Python 2.6, Python 3.3, and versions of Twisted < 15.0 are no longer supported. 1.4.0 ^^^^^ New features: * Added support for Python 3.4. Documentation: * Added a section on known issues and workarounds. Bug fixes: * Main thread detection (used to determine when Crochet should shutdown) is now less fragile. This means Crochet now supports more environments, e.g. uWSGI. Thanks to Ben Picolo for the patch. 1.3.0 ^^^^^ Bug fixes: * It is now possible to call ``EventualResult.wait()`` (or functions wrapped in ``wait_for``) at import time if another thread holds the import lock. Thanks to Ken Struys for the patch. 1.2.0 ^^^^^ New features: * ``crochet.wait_for`` implements the timeout/cancellation pattern documented in previous versions of Crochet. ``crochet.wait_for_reactor`` and ``EventualResult.wait(timeout=None)`` are now deprecated, since lacking timeouts they could potentially block forever. * Functions wrapped with ``wait_for`` and ``run_in_reactor`` can now be accessed via the ``wrapped_function`` attribute, to ease unit testing of the underlying Twisted code. API changes: * It is no longer possible to call ``EventualResult.wait()`` (or functions wrapped with ``wait_for``) at import time, since this can lead to deadlocks or prevent other threads from importing. Thanks to Tom Prince for the bug report. Bug fixes: * ``warnings`` are no longer erroneously turned into Twisted log messages. * The reactor is now only imported when ``crochet.setup()`` or ``crochet.no_setup()`` are called, allowing daemonization if only ``crochet`` is imported (http://tm.tl/7105). Thanks to Daniel Nephin for the bug report. Documentation: * Improved motivation, added contact info and news to the documentation. * Better example of using Crochet from a normal Twisted application. 1.1.0 ^^^^^ Bug fixes: * ``EventualResult.wait()`` can now be used safely from multiple threads, thanks to Gavin Panella for reporting the bug. * Fixed reentrancy deadlock in the logging code caused by http://bugs.python.org/issue14976, thanks to Rod Morehead for reporting the bug. * Crochet now installs on Python 3.3 again, thanks to Ben Cordero. * Crochet should now work on Windows, thanks to Konstantinos Koukopoulos. * Crochet tests can now run without adding its absolute path to PYTHONPATH or installing it first. Documentation: * ``EventualResult.original_failure`` is now documented. 1.0.0 ^^^^^ Documentation: * Added section on use cases and alternatives. Thanks to Tobias Oberstein for the suggestion. Bug fixes: * Twisted does not have to be pre-installed to run ``setup.py``, thanks to Paul Weaver for bug report and Chris Scutcher for patch. * Importing Crochet does not have side-effects (installing reactor event) any more. * Blocking calls are interrupted earlier in the shutdown process, to reduce scope for deadlocks. Thanks to rmorehead for bug report. 0.9.0 ^^^^^ New features: * Expanded and much improved documentation, including a new section with design suggestions. * New decorator ``@wait_for_reactor`` added, a simpler alternative to ``@run_in_reactor``. * Refactored ``@run_in_reactor``, making it a bit more responsive. 
* Blocking operations which would otherwise never finish due to reactor having stopped (``EventualResult.wait()`` or ``@wait_for_reactor`` decorated call) will be interrupted with a ``ReactorStopped`` exception. Thanks to rmorehead for the bug report. Bug fixes: * ``@run_in_reactor`` decorated functions (or rather, their generated wrapper) are interrupted by Ctrl-C. * On POSIX platforms, a workaround is installed to ensure processes started by `reactor.spawnProcess` have their exit noticed. See `Twisted ticket 6378`_ for more details about the underlying issue. .. _Twisted ticket 6378: http://tm.tl/6738 0.8.1 ^^^^^ * ``EventualResult.wait()`` now raises error if called in the reactor thread, thanks to David Buchmann. * Unittests are now included in the release tarball. * Allow Ctrl-C to interrupt ``EventualResult.wait(timeout=None)``. 0.7.0 ^^^^^ * Improved documentation. 0.6.0 ^^^^^ * Renamed ``DeferredResult`` to ``EventualResult``, to reduce confusion with Twisted's ``Deferred`` class. The old name still works, but is deprecated. * Deprecated ``@in_reactor``, replaced with ``@run_in_reactor`` which doesn't change the arguments to the wrapped function. The deprecated API still works, however. * Unhandled exceptions in ``EventualResult`` objects are logged. * Added more examples. * ``setup.py sdist`` should work now. 0.5.0 ^^^^^ * Initial release. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/README.rst0000644000175100001730000000430414450111171014226 0ustar00runnerdockerCrochet: Use Twisted anywhere! ============================== Crochet is an MIT-licensed library that makes it easier to use Twisted from regular blocking code. Some use cases include: * Easily use Twisted from a blocking framework like Django or Flask. * Write a library that provides a blocking API, but uses Twisted for its implementation. * Port blocking code to Twisted more easily, by keeping a backwards compatibility layer. * Allow normal Twisted programs that use threads to interact with Twisted more cleanly from their threaded parts. For example, this can be useful when using Twisted as a `WSGI container`_. .. _WSGI container: https://twistedmatrix.com/documents/current/web/howto/web-in-60/wsgi.html Crochet is maintained by Itamar Turner-Trauring. **Note:** Crochet development is pretty slow these days because mostly it **Just Works**. PyPI shows about 30,000 downloads a month, so existing users seem happy: https://pypistats.org/packages/crochet You can install Crochet by running:: $ pip install crochet Downloads are available on `PyPI`_. Documentation can be found on `Read The Docs`_. Bugs and feature requests should be filed at the project `Github page`_. .. _Read the Docs: https://crochet.readthedocs.org/ .. _Github page: https://github.com/itamarst/crochet/ .. _PyPI: https://pypi.python.org/pypi/crochet API and features ================ Crochet supports Python 3.8, 3.9, 3.10, and 3.11 as well as PyPy3. Crochet provides the following basic APIs: * Allow blocking code to call into Twisted and block until results are available or a timeout is hit, using the ``crochet.wait_for`` decorator. * A lower-level API (``crochet.run_in_reactor``) allows blocking code to run code "in the background" in the Twisted thread, with the ability to repeatedly check if it's done. Crochet will do the following on your behalf in order to enable these APIs: * Transparently start Twisted's reactor in a thread it manages. 
* Shut down the reactor automatically when the process' main thread finishes.
* Hook up Twisted's log system to the Python standard library ``logging``
  framework. Unlike Twisted's built-in ``logging`` bridge, this includes
  support for blocking `Handler` instances.

crochet-2.1.1/crochet/__init__.py
"""
Crochet: Use Twisted Anywhere!
"""
from twisted.python.log import startLoggingWithObserver
from twisted.python.runtime import platform

from ._shutdown import _watchdog, register
from ._eventloop import (
    EventualResult, EventLoop, _store, ReactorStopped
)
from ._eventloop import TimeoutError  # pylint: disable=redefined-builtin
from ._version import get_versions

if platform.type == "posix":
    from twisted.internet.process import reapAllProcesses
else:
    # waitpid() is only necessary on POSIX:
    def reapAllProcesses():
        pass

__version__ = get_versions()['version']
del get_versions


def _importReactor():
    from twisted.internet import reactor
    return reactor


_main = EventLoop(
    _importReactor, register, startLoggingWithObserver,
    _watchdog, reapAllProcesses)
setup = _main.setup
no_setup = _main.no_setup
run_in_reactor = _main.run_in_reactor
wait_for = _main.wait_for
retrieve_result = _store.retrieve

__all__ = [
    "setup", "run_in_reactor", "EventualResult", "TimeoutError",
    "retrieve_result", "no_setup", "wait_for", "ReactorStopped",
    "__version__",
]

crochet-2.1.1/crochet/__init__.pyi
import sys
from typing import Any, Callable, Generic, Optional, TypeVar

from twisted.python.failure import Failure

_T = TypeVar("_T")
_T_co = TypeVar("_T_co", covariant=True)
_F = TypeVar("_F", bound=Callable[..., Any])

def setup() -> None: ...
def run_in_reactor(
    function: Callable[..., _T]
) -> Callable[..., EventualResult[_T]]: ...

class EventualResult(Generic[_T_co]):
    def cancel(self) -> None: ...
    def wait(self, timeout: float) -> _T_co: ...
    def stash(self) -> int: ...
    def original_failure(self) -> Optional[Failure]: ...

class TimeoutError(Exception): ...

def retrieve_result(result_id: int) -> EventualResult[object]: ...
def no_setup() -> None: ...
def wait_for(timeout: float) -> Callable[[_F], _F]: ...

class ReactorStopped(Exception): ...

__version__: str

crochet-2.1.1/crochet/_eventloop.py
"""
Expose Twisted's event loop to threaded programs.
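A minimal usage sketch (illustrative only; the function name and the delay
are invented, not part of this module):

    import crochet
    crochet.setup()

    from twisted.internet import reactor
    from twisted.internet.task import deferLater

    @crochet.run_in_reactor
    def delayed_value(value):
        # Runs in the reactor thread and returns a Deferred.
        return deferLater(reactor, 0.1, lambda: value)

    result = delayed_value(42)       # returns an EventualResult immediately
    print(result.wait(timeout=5.0))  # blocks the calling thread, prints 42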
""" import threading import weakref import warnings from inspect import iscoroutinefunction from functools import wraps from queue import SimpleQueue from twisted.python import threadable from twisted.python.runtime import platform from twisted.python.failure import Failure from twisted.python.log import PythonLoggingObserver, err from twisted.internet.defer import maybeDeferred, ensureDeferred from twisted.internet.task import LoopingCall import wrapt from ._util import synchronized from ._resultstore import ResultStore _store = ResultStore() class TimeoutError(Exception): # pylint: disable=redefined-builtin """ A timeout has been hit. """ class ReactorStopped(Exception): """ The reactor has stopped, and therefore no result will ever become available from this EventualResult. """ class ResultRegistry(object): """ Keep track of EventualResults. Once the reactor has shutdown: 1. Registering new EventualResult instances is an error, since no results will ever become available. 2. Already registered EventualResult instances are "fired" with a ReactorStopped exception to unblock any remaining EventualResult.wait() calls. """ def __init__(self): self._results = weakref.WeakSet() self._stopped = False self._lock = threading.Lock() @synchronized def register(self, result): """ Register an EventualResult. May be called in any thread. """ if self._stopped: raise ReactorStopped() self._results.add(result) @synchronized def stop(self): """ Indicate no more results will get pushed into EventualResults, since the reactor has stopped. This should be called in the reactor thread. """ self._stopped = True for result in self._results: result._set_result(Failure(ReactorStopped())) class EventualResult(object): """ A blocking interface to Deferred results. This allows you to access results from Twisted operations that may not be available immediately, using the wait() method. In general you should not create these directly; instead use functions decorated with @run_in_reactor. """ def __init__(self, deferred, _reactor): """ The deferred parameter should be a Deferred or None indicating _connect_deferred will be called separately later. """ self._deferred = deferred self._reactor = _reactor self._value = None self._result_retrieved = False self._result_set = threading.Event() if deferred is not None: self._connect_deferred(deferred) def _connect_deferred(self, deferred): """ Hook up the Deferred that that this will be the result of. Should only be run in Twisted thread, and only called once. """ self._deferred = deferred # Because we use __del__, we need to make sure there are no cycles # involving this object, which is why we use a weakref: def put(result, eventual=weakref.ref(self)): eventual = eventual() if eventual: eventual._set_result(result) else: err(result, "Unhandled error in EventualResult") deferred.addBoth(put) def _set_result(self, result): """ Set the result of the EventualResult, if not already set. This can only happen in the reactor thread, either as a result of Deferred firing, or as a result of ResultRegistry.stop(). So, no need for thread-safety. """ if self._result_set.isSet(): return self._value = result self._result_set.set() def __del__(self): if self._result_retrieved or not self._result_set.isSet(): return if isinstance(self._value, Failure): err(self._value, "Unhandled error in EventualResult") def cancel(self): """ Try to cancel the operation by cancelling the underlying Deferred. 
Cancellation of the operation may or may not happen depending on underlying cancellation support and whether the operation has already finished. In any case, however, the underlying Deferred will be fired. Multiple calls will have no additional effect. """ self._reactor.callFromThread(lambda: self._deferred.cancel()) def _result(self, timeout): """ Return the result, if available. It may take an unknown amount of time to return the result, so a timeout option is provided. If the given number of seconds pass with no result, a TimeoutError will be thrown. If a previous call timed out, additional calls to this function will still wait for a result and return it if available. If a result was returned on one call, additional calls will return/raise the same result. """ self._result_set.wait(timeout) # In Python 2.6 we can't rely on the return result of wait(), so we # have to check manually: if not self._result_set.is_set(): raise TimeoutError() self._result_retrieved = True return self._value def wait(self, timeout): """ Return the result, or throw the exception if result is a failure. It may take an unknown amount of time to return the result, so a timeout option is provided. If the given number of seconds pass with no result, a TimeoutError will be thrown. If a previous call timed out, additional calls to this function will still wait for a result and return it if available. If a result was returned or raised on one call, additional calls will return/raise the same result. """ if threadable.isInIOThread(): raise RuntimeError( "EventualResult.wait() must not be run in the reactor thread.") result = self._result(timeout) if isinstance(result, Failure): result.raiseException() return result def stash(self): """ Store the EventualResult in memory for later retrieval. Returns a integer uid which can be passed to crochet.retrieve_result() to retrieve the instance later on. """ return _store.store(self) def original_failure(self): """ Return the underlying Failure object, if the result is an error. If no result is yet available, or the result was not an error, None is returned. This method is useful if you want to get the original traceback for an error result. """ try: result = self._result(0.0) except TimeoutError: return None if isinstance(result, Failure): return result else: return None _STOP = object() class ThreadLogObserver(object): """ A log observer that wraps another observer, and calls it in a thread. In particular, used to wrap PythonLoggingObserver, so that blocking logging.py Handlers don't block the event loop. """ def __init__(self, observer): self._observer = observer self._queue = SimpleQueue() self._thread = threading.Thread( target=self._reader, name="CrochetLogWriter") self._thread.start() def _reader(self): """ Runs in a thread, reads messages from a queue and writes them to the wrapped observer. """ while True: msg = self._queue.get() if msg is _STOP: return try: self._observer(msg) except Exception: # Lower-level logging system blew up, nothing we can do, so # just drop on the floor. pass def stop(self): """ Stop the thread. """ self._queue.put(_STOP) def __call__(self, msg): """ A log observer that writes to a queue. """ self._queue.put(msg) class EventLoop(object): """ Initialization infrastructure for running a reactor in a thread. """ def __init__( self, reactorFactory, atexit_register, startLoggingWithObserver=None, watchdog_thread=None, reapAllProcesses=None ): """ reactorFactory: Zero-argument callable that returns a reactor. 
atexit_register: atexit.register, or look-alike. startLoggingWithObserver: Either None, or twisted.python.log.startLoggingWithObserver or lookalike. watchdog_thread: crochet._shutdown.Watchdog instance, or None. reapAllProcesses: twisted.internet.process.reapAllProcesses or lookalike. """ self._reactorFactory = reactorFactory self._atexit_register = atexit_register self._startLoggingWithObserver = startLoggingWithObserver self._started = False self._lock = threading.Lock() self._watchdog_thread = watchdog_thread self._reapAllProcesses = reapAllProcesses def _startReapingProcesses(self): """ Start a LoopingCall that calls reapAllProcesses. """ lc = LoopingCall(self._reapAllProcesses) lc.clock = self._reactor lc.start(0.1, False) def _common_setup(self): """ The minimal amount of setup done by both setup() and no_setup(). """ self._started = True self._reactor = self._reactorFactory() self._registry = ResultRegistry() # We want to unblock EventualResult regardless of how the reactor is # run, so we always register this: self._reactor.addSystemEventTrigger( "before", "shutdown", self._registry.stop) @synchronized def setup(self): """ Initialize the crochet library. This starts the reactor in a thread, and connect's Twisted's logs to Python's standard library logging module. This must be called at least once before the library can be used, and can be called multiple times. """ if self._started: return self._common_setup() if platform.type == "posix": self._reactor.callFromThread(self._startReapingProcesses) if self._startLoggingWithObserver: observer = ThreadLogObserver(PythonLoggingObserver().emit) def start(): # Twisted is going to override warnings.showwarning; let's # make sure that has no effect: from twisted.python import log original = log.showwarning log.showwarning = warnings.showwarning self._startLoggingWithObserver(observer, False) log.showwarning = original self._reactor.callFromThread(start) # We only want to stop the logging thread once the reactor has # shut down: self._reactor.addSystemEventTrigger( "after", "shutdown", observer.stop) t = threading.Thread( target=lambda: self._reactor.run(installSignalHandlers=False), name="CrochetReactor") t.start() self._atexit_register(self._reactor.callFromThread, self._reactor.stop) self._atexit_register(_store.log_errors) if self._watchdog_thread is not None: self._watchdog_thread.start() @synchronized def no_setup(self): """ Initialize the crochet library with no side effects. No reactor will be started, logging is uneffected, etc.. Future calls to setup() will have no effect. This is useful for applications that intend to run Twisted's reactor themselves, and so do not want libraries using crochet to attempt to start it on their own. If no_setup() is called after setup(), a RuntimeError is raised. """ if self._started: raise RuntimeError( "no_setup() is intended to be called once, by a" " Twisted application, before any libraries " "using crochet are imported and call setup().") self._common_setup() def run_in_reactor(self, function): """ A decorator that ensures the wrapped function runs in the reactor thread. When the wrapped function is called, an EventualResult is returned. """ def _run_in_reactor(wrapped, _, args, kwargs): """ Implementation: A decorator that ensures the wrapped function runs in the reactor thread. When the wrapped function is called, an EventualResult is returned. 
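As a sketch of the coroutine branch handled below (the function name is
invented for illustration), an async def callable decorated this way also
returns an EventualResult when called from a blocking thread:

    import crochet
    crochet.setup()

    @crochet.run_in_reactor
    async def fetch_answer():
        return 23

    print(fetch_answer().wait(timeout=1))  # 23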
""" if iscoroutinefunction(wrapped): def runs_in_reactor(result, args, kwargs): d = ensureDeferred(wrapped(*args, **kwargs)) result._connect_deferred(d) else: def runs_in_reactor(result, args, kwargs): d = maybeDeferred(wrapped, *args, **kwargs) result._connect_deferred(d) result = EventualResult(None, self._reactor) self._registry.register(result) self._reactor.callFromThread(runs_in_reactor, result, args, kwargs) return result if iscoroutinefunction(function): # Create a non-async wrapper with same signature. @wraps(function) def non_async_wrapper(): pass else: # Just use default behavior of looking at underlying object. non_async_wrapper = None return wrapt.decorator(_run_in_reactor, adapter=non_async_wrapper)(function) def wait_for(self, timeout): """ A decorator factory that ensures the wrapped function runs in the reactor thread. When the wrapped function is called, its result is returned or its exception raised. Deferreds are handled transparently. Calls will timeout after the given number of seconds (a float), raising a crochet.TimeoutError, and cancelling the Deferred being waited on. """ def decorator(function): def wrapper(function, _, args, kwargs): @self.run_in_reactor def run(): if iscoroutinefunction(function): return ensureDeferred(function(*args, **kwargs)) else: return function(*args, **kwargs) eventual_result = run() try: return eventual_result.wait(timeout) except TimeoutError: eventual_result.cancel() raise if iscoroutinefunction(function): # Create a non-async wrapper with same signature. @wraps(function) def non_async_wrapper(): pass else: # Just use default behavior of looking at underlying object. non_async_wrapper = None wrapper = wrapt.decorator(wrapper, adapter=non_async_wrapper) return wrapper(function) return decorator ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/_resultstore.py0000644000175100001730000000272014450111171017272 0ustar00runnerdocker""" In-memory store for EventualResults. """ import threading from twisted.python import log from ._util import synchronized class ResultStore(object): """ An in-memory store for EventualResult instances. Each EventualResult put in the store gets a unique identifier, which can be used to retrieve it later. This is useful for referring to results in e.g. web sessions. EventualResults that are not retrieved by shutdown will be logged if they have an error result. """ def __init__(self): self._counter = 0 self._stored = {} self._lock = threading.Lock() @synchronized def store(self, deferred_result): """ Store a EventualResult. Return an integer, a unique identifier that can be used to retrieve the object. """ self._counter += 1 self._stored[self._counter] = deferred_result return self._counter @synchronized def retrieve(self, result_id): """ Return the given EventualResult, and remove it from the store. """ return self._stored.pop(result_id) @synchronized def log_errors(self): """ Log errors for all stored EventualResults that have error results. """ for result in self._stored.values(): failure = result.original_failure() if failure is not None: log.err(failure, "Unhandled error in stashed EventualResult:") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/_shutdown.py0000644000175100001730000000306714450111171016557 0ustar00runnerdocker""" Support for calling code when the main thread exits. 
atexit cannot be used, since registered atexit functions only run after *all* threads have exited. The watchdog thread will be started by crochet.setup(). """ import threading import time from twisted.python import log class Watchdog(threading.Thread): """ Watch a given thread, call a list of functions when that thread exits. """ def __init__(self, canary, shutdown_function): threading.Thread.__init__(self, name="CrochetShutdownWatchdog") self._canary = canary self._shutdown_function = shutdown_function def run(self): while self._canary.is_alive(): time.sleep(0.1) self._shutdown_function() class FunctionRegistry(object): """ A registry of functions that can be called all at once. """ def __init__(self): self._functions = [] def register(self, f, *args, **kwargs): """ Register a function and arguments to be called later. """ self._functions.append(lambda: f(*args, **kwargs)) def run(self): """ Run all registered functions in reverse order of registration. """ for f in reversed(self._functions): try: f() except Exception: log.err() # This is... fragile. Not sure how else to do it though. _registry = FunctionRegistry() _watchdog = Watchdog( [t for t in threading.enumerate() if isinstance(t, threading._MainThread)][0], _registry.run, ) register = _registry.register ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/_util.py0000644000175100001730000000063314450111171015655 0ustar00runnerdocker""" Utility functions and classes. """ import wrapt @wrapt.decorator def _synced(method, self, args, kwargs): """Underlying synchronized wrapper.""" with self._lock: return method(*args, **kwargs) def synchronized(method): """ Decorator that wraps a method with an acquire/release of self._lock. """ result = _synced(method) result.synchronized = True return result ././@PaxHeader0000000000000000000000000000003200000000000010210 xustar0026 mtime=1688244862.32539 crochet-2.1.1/crochet/_version.py0000644000175100001730000000072714450111176016376 0ustar00runnerdocker # This file was generated by 'versioneer.py' (0.16) from # revision-control system data, or from the parent directory name of an # unpacked source archive. Distribution tarballs contain a pre-generated copy # of this file. import json import sys version_json = ''' { "dirty": false, "error": null, "full-revisionid": "9ca30d02f4dc2376545aea14cbed52fc7880c6ba", "version": "2.1.1" } ''' # END VERSION_JSON def get_versions(): return json.loads(version_json) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/mypy.py0000644000175100001730000000333714450111171015543 0ustar00runnerdocker""" Mypy plugin to aid with typechecking code that uses Crochet. """ import typing from typing import Callable, Optional from mypy.plugin import FunctionContext, Plugin # pylint: disable=no-name-in-module from mypy.types import CallableType, Type, get_proper_type # pylint: disable=no-name-in-module def plugin(_version: str) -> typing.Type[Plugin]: return CrochetMypyPlugin class CrochetMypyPlugin(Plugin): """ Assists mypy with type checking APIs not (yet) fully covered by Python's type hint annotation types, by copying run_in_reactor decorated function's argument types to the type mypy deduces for the wrapped function. 
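The plugin is typically enabled from mypy's configuration; a minimal sketch,
assuming an ini-style config file:

    [mypy]
    plugins = crochet.mypy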
""" def get_function_hook( self, fullname: str, ) -> Optional[Callable[[FunctionContext], Type]]: if fullname == "crochet.run_in_reactor": return _copyargs_callback return None def _copyargs_callback(ctx: FunctionContext) -> Type: """ Copy the parameters from the signature of the type of the argument of the call to the signature of the return type. """ original_return_type = ctx.default_return_type if not ctx.arg_types or len(ctx.arg_types[0]) != 1: return original_return_type arg_type = get_proper_type(ctx.arg_types[0][0]) default_return_type = get_proper_type(original_return_type) if not ( isinstance(arg_type, CallableType) and isinstance(default_return_type, CallableType) ): return original_return_type return default_return_type.copy_modified( arg_types=arg_type.arg_types, arg_kinds=arg_type.arg_kinds, arg_names=arg_type.arg_names, variables=arg_type.variables, is_ellipsis_args=arg_type.is_ellipsis_args, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/py.typed0000644000175100001730000000000014450111171015652 0ustar00runnerdocker././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1688244862.3213902 crochet-2.1.1/crochet/tests/0000755000175100001730000000000014450111176015334 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/tests/__init__.py0000644000175100001730000000027614450111171017445 0ustar00runnerdocker""" Crochet tests. """ from twisted.python.filepath import FilePath # Directory where the crochet package is stored: crochet_directory = FilePath(__file__).parent().parent().parent().path ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/tests/test_api.py0000644000175100001730000010105314450111171017511 0ustar00runnerdocker""" Tests for the crochet APIs. """ from __future__ import absolute_import import threading import subprocess import time import gc import sys import weakref import tempfile import os import inspect from unittest import SkipTest from twisted.trial.unittest import TestCase from twisted.internet.defer import succeed, Deferred, fail, CancelledError from twisted.python.failure import Failure from twisted.python import threadable from twisted.python.runtime import platform from .._eventloop import ( EventLoop, EventualResult, TimeoutError, ResultRegistry, ReactorStopped) from .test_setup import FakeReactor from .. import ( _main, setup as setup_crochet, retrieve_result, _store, no_setup, run_in_reactor, wait_for) from ..tests import crochet_directory if platform.type == "posix": try: from twisted.internet.process import reapAllProcesses except (SyntaxError, ImportError): if sys.version_info < (3, 3, 0): raise else: # Process support is still not ported to Python 3 on some versions # of Twisted. reapAllProcesses = None else: # waitpid() is only necessary on POSIX: reapAllProcesses = None class ResultRegistryTests(TestCase): """ Tests for ResultRegistry. """ def test_stopped_registered(self): """ ResultRegistery.stop() fires registered EventualResult with ReactorStopped. """ registry = ResultRegistry() er = EventualResult(None, None) registry.register(er) registry.stop() self.assertRaises(ReactorStopped, er.wait, timeout=0) def test_stopped_new_registration(self): """ After ResultRegistery.stop() is called subsequent register() calls raise ReactorStopped. 
""" registry = ResultRegistry() er = EventualResult(None, None) registry.stop() self.assertRaises(ReactorStopped, registry.register, er) def test_stopped_already_have_result(self): """ ResultRegistery.stop() has no impact on registered EventualResult which already have a result. """ registry = ResultRegistry() er = EventualResult(succeed(123), None) registry.register(er) registry.stop() self.assertEqual(er.wait(0.1), 123) self.assertEqual(er.wait(0.1), 123) self.assertEqual(er.wait(0.1), 123) def test_weakref(self): """ Registering an EventualResult with a ResultRegistry does not prevent it from being garbage collected. """ registry = ResultRegistry() er = EventualResult(None, None) registry.register(er) ref = weakref.ref(er) del er gc.collect() self.assertIdentical(ref(), None) def test_runs_with_lock(self): """ All code in ResultRegistry.stop() and register() is protected by a lock. """ self.assertTrue(ResultRegistry.stop.synchronized) self.assertTrue(ResultRegistry.register.synchronized) def append_in_thread(a_list, f, *args, **kwargs): """ Call a function in a thread, append its result to the given list. Only return once the thread has actually started. Will return a threading.Event that will be set when the action is done. """ started = threading.Event() done = threading.Event() def go(): started.set() try: result = f(*args, **kwargs) except Exception as e: a_list.extend([False, e]) else: a_list.extend([True, result]) done.set() threading.Thread(target=go).start() started.wait() return done class EventualResultTests(TestCase): """ Tests for EventualResult. """ def setUp(self): self.patch(threadable, "isInIOThread", lambda: False) def test_success_result(self): """ wait() returns the value the Deferred fired with. """ dr = EventualResult(succeed(123), None) self.assertEqual(dr.wait(0.1), 123) def test_later_success_result(self): """ wait() returns the value the Deferred fired with, in the case where the Deferred is fired after wait() is called. """ d = Deferred() dr = EventualResult(d, None) result_list = [] done = append_in_thread(result_list, dr.wait, 100) time.sleep(0.1) # At this point dr.wait() should have started: d.callback(345) done.wait(100) self.assertEqual(result_list, [True, 345]) def test_success_result_twice(self): """ A second call to wait() returns same value as the first call. """ dr = EventualResult(succeed(123), None) self.assertEqual(dr.wait(0.1), 123) self.assertEqual(dr.wait(0.1), 123) def test_failure_result(self): """ wait() raises the exception the Deferred fired with. """ dr = EventualResult(fail(RuntimeError()), None) self.assertRaises(RuntimeError, dr.wait, 0.1) def test_later_failure_result(self): """ wait() raises the exception the Deferred fired with, in the case where the Deferred is fired after wait() is called. """ d = Deferred() dr = EventualResult(d, None) result_list = [] done = append_in_thread(result_list, dr.wait, 100) time.sleep(0.1) d.errback(RuntimeError()) done.wait(100) self.assertEqual( (result_list[0], result_list[1].__class__), (False, RuntimeError)) def test_failure_result_twice(self): """ A second call to wait() raises same value as the first call. """ dr = EventualResult(fail(ZeroDivisionError()), None) self.assertRaises(ZeroDivisionError, dr.wait, 0.1) self.assertRaises(ZeroDivisionError, dr.wait, 0.1) def test_timeout(self): """ If no result is available, wait(timeout) will throw a TimeoutError. 
""" start = time.time() dr = EventualResult(Deferred(), None) self.assertRaises(TimeoutError, dr.wait, timeout=0.03) # be a little lenient for slow computers: self.assertTrue(abs(time.time() - start) < 0.05) def test_timeout_twice(self): """ If no result is available, a second call to wait(timeout) will also result in a TimeoutError exception. """ dr = EventualResult(Deferred(), None) self.assertRaises(TimeoutError, dr.wait, timeout=0.01) self.assertRaises(TimeoutError, dr.wait, timeout=0.01) def test_timeout_then_result(self): """ If a result becomes available after a timeout, a second call to wait() will return it. """ d = Deferred() dr = EventualResult(d, None) self.assertRaises(TimeoutError, dr.wait, timeout=0.01) d.callback(u"value") self.assertEqual(dr.wait(0.1), u"value") self.assertEqual(dr.wait(0.1), u"value") def test_reactor_thread_disallowed(self): """ wait() cannot be called from the reactor thread. """ self.patch(threadable, "isInIOThread", lambda: True) d = Deferred() dr = EventualResult(d, None) self.assertRaises(RuntimeError, dr.wait, 0) def test_cancel(self): """ cancel() cancels the wrapped Deferred, running cancellation in the event loop thread. """ reactor = FakeReactor() cancelled = [] def error(f): cancelled.append(reactor.in_call_from_thread) cancelled.append(f) d = Deferred().addErrback(error) dr = EventualResult(d, _reactor=reactor) dr.cancel() self.assertTrue(cancelled[0]) self.assertIsInstance(cancelled[1].value, CancelledError) def test_stash(self): """ EventualResult.stash() stores the object in the global ResultStore. """ dr = EventualResult(Deferred(), None) uid = dr.stash() self.assertIdentical(dr, _store.retrieve(uid)) def test_original_failure(self): """ original_failure() returns the underlying Failure of the Deferred wrapped by the EventualResult. """ try: 1 / 0 except ZeroDivisionError: f = Failure() dr = EventualResult(fail(f), None) self.assertIdentical(dr.original_failure(), f) def test_original_failure_no_result(self): """ If there is no result yet, original_failure() returns None. """ dr = EventualResult(Deferred(), None) self.assertIdentical(dr.original_failure(), None) def test_original_failure_not_error(self): """ If the result is not an error, original_failure() returns None. """ dr = EventualResult(succeed(3), None) self.assertIdentical(dr.original_failure(), None) def test_error_logged_no_wait(self): """ If the result is an error and wait() was never called, the error will be logged once the EventualResult is garbage-collected. """ dr = EventualResult(fail(ZeroDivisionError()), None) del dr gc.collect() excs = self.flushLoggedErrors(ZeroDivisionError) self.assertEqual(len(excs), 1) def test_error_logged_wait_timeout(self): """ If the result is an error and wait() was called but timed out, the error will be logged once the EventualResult is garbage-collected. """ d = Deferred() dr = EventualResult(d, None) try: dr.wait(0) except TimeoutError: pass d.errback(ZeroDivisionError()) del dr if sys.version_info[0] == 2: sys.exc_clear() gc.collect() excs = self.flushLoggedErrors(ZeroDivisionError) self.assertEqual(len(excs), 1) def test_error_after_gc_logged(self): """ If the result is an error that occurs after all user references to the EventualResult are lost, the error is still logged. 
""" d = Deferred() dr = EventualResult(d, None) del dr d.errback(ZeroDivisionError()) gc.collect() excs = self.flushLoggedErrors(ZeroDivisionError) self.assertEqual(len(excs), 1) def test_control_c_is_possible(self): """ If you're wait()ing on an EventualResult in main thread, make sure the KeyboardInterrupt happens in timely manner. """ if platform.type != "posix": raise SkipTest("I don't have the energy to fight Windows semantics.") program = """\ import os, threading, signal, time, sys import crochet crochet.setup() from twisted.internet.defer import Deferred if sys.platform.startswith('win'): signal.signal(signal.SIGBREAK, signal.default_int_handler) sig_int=signal.CTRL_BREAK_EVENT sig_kill=signal.SIGTERM else: sig_int=signal.SIGINT sig_kill=signal.SIGKILL def interrupt(): time.sleep(0.1) # Make sure we've hit wait() os.kill(os.getpid(), sig_int) time.sleep(1) # Still running, test shall fail... os.kill(os.getpid(), sig_kill) t = threading.Thread(target=interrupt, daemon=True) t.start() d = Deferred() e = crochet.EventualResult(d, None) try: e.wait(10000) except KeyboardInterrupt: sys.exit(23) """ kw = {'cwd': crochet_directory} # on Windows the only way to interrupt a subprocess reliably is to # create a new process group: # http://docs.python.org/2/library/subprocess.html#subprocess.CREATE_NEW_PROCESS_GROUP if platform.type.startswith('win'): kw['creationflags'] = subprocess.CREATE_NEW_PROCESS_GROUP process = subprocess.Popen([sys.executable, "-c", program], **kw) self.assertEqual(process.wait(), 23) def test_connect_deferred(self): """ If an EventualResult is created with None, EventualResult._connect_deferred can be called later to register a Deferred as the one it is wrapping. """ er = EventualResult(None, None) self.assertRaises(TimeoutError, er.wait, 0) d = Deferred() er._connect_deferred(d) self.assertRaises(TimeoutError, er.wait, 0) d.callback(123) self.assertEqual(er.wait(0.1), 123) def test_reactor_stop_unblocks_EventualResult(self): """ Any EventualResult.wait() calls still waiting when the reactor has stopped will get a ReactorStopped exception. """ program = """\ import os, threading, signal, time, sys from twisted.internet.defer import Deferred from twisted.internet import reactor import crochet crochet.setup() @crochet.run_in_reactor def run(): reactor.callLater(0.1, reactor.stop) return Deferred() er = run() try: er.wait(timeout=10) except crochet.ReactorStopped: sys.exit(23) """ process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory) self.assertEqual(process.wait(), 23) def test_reactor_stop_unblocks_EventualResult_in_threadpool(self): """ Any EventualResult.wait() calls still waiting when the reactor has stopped will get a ReactorStopped exception, even if it is running in Twisted's thread pool. """ program = """\ import os, threading, signal, time, sys from twisted.internet.defer import Deferred from twisted.internet import reactor import crochet crochet.setup() @crochet.run_in_reactor def run(): reactor.callLater(0.1, reactor.stop) return Deferred() result = [13] def inthread(): er = run() try: er.wait(timeout=10) except crochet.ReactorStopped: result[0] = 23 reactor.callInThread(inthread) time.sleep(1) sys.exit(result[0]) """ process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory) self.assertEqual(process.wait(), 23) def test_immediate_cancel(self): """ Immediately cancelling the result of @run_in_reactor function will still cancel the Deferred. 
""" # This depends on the way reactor runs callFromThread calls, so need # real functional test. program = """\ import os, threading, signal, time, sys from twisted.internet.defer import Deferred, CancelledError import crochet crochet.setup() @crochet.run_in_reactor def run(): return Deferred() er = run() er.cancel() try: er.wait(1) except CancelledError: sys.exit(23) else: sys.exit(3) """ process = subprocess.Popen( [sys.executable, "-c", program], cwd=crochet_directory, ) self.assertEqual(process.wait(), 23) def test_noWaitingDuringImport(self): """ EventualResult.wait() raises an exception if called while a module is being imported. This prevents the imports from taking a long time, preventing other imports from running in other threads. It also prevents deadlocks, which can happen if the code being waited on also tries to import something. """ if sys.version_info[0] > 2: from unittest import SkipTest raise SkipTest( "This test is too fragile (and insufficient) on " "Python 3 - see " "https://github.com/itamarst/crochet/issues/43") directory = tempfile.mktemp() os.mkdir(directory) sys.path.append(directory) self.addCleanup(sys.path.remove, directory) with open(os.path.join(directory, "shouldbeunimportable.py"), "w") as f: f.write( """\ from crochet import EventualResult from twisted.internet.defer import Deferred EventualResult(Deferred(), None).wait(1.0) """) self.assertRaises(RuntimeError, __import__, "shouldbeunimportable") class RunInReactorTests(TestCase): """ Tests for the run_in_reactor decorator. """ def test_signature(self): """ The function decorated with the run_in_reactor decorator has the same signature as the original function. """ c = EventLoop(lambda: FakeReactor(), lambda f, g: None) def some_name(arg1, arg2, karg1=2, *args, **kw): pass decorated = c.run_in_reactor(some_name) self.assertEqual(inspect.signature(some_name), inspect.signature(decorated)) def test_name(self): """ The function decorated with run_in_reactor has the same name as the original function. """ c = EventLoop(lambda: FakeReactor(), lambda f, g: None) @c.run_in_reactor def some_name(): pass self.assertEqual(some_name.__name__, "some_name") def test_run_in_reactor_thread(self): """ The function decorated with run_in_reactor is run in the reactor thread. """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() calls = [] @c.run_in_reactor def func(a, b, c): self.assertTrue(myreactor.in_call_from_thread) calls.append((a, b, c)) func(1, 2, c=3) self.assertEqual(calls, [(1, 2, 3)]) def test_method(self): """ The function decorated with the wait decorator can be a method. """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() calls = [] class C(object): @c.run_in_reactor def func(self, a, b, c): calls.append((self, a, b, c)) o = C() o.func(1, 2, c=3) self.assertEqual(calls, [(o, 1, 2, 3)]) def test_classmethod(self): """ The function decorated with the wait decorator can be a classmethod. 
""" myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() calls = [] class C(object): @c.run_in_reactor @classmethod def func(cls, a, b, c): calls.append((cls, a, b, c)) @classmethod @c.run_in_reactor def func2(cls, a, b, c): calls.append((cls, a, b, c)) C.func(1, 2, c=3) C.func2(1, 2, c=3) self.assertEqual(calls, [(C, 1, 2, 3), (C, 1, 2, 3)]) def test_wrap_method(self): """ The object decorated with the wait decorator can be a method object """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() calls = [] class C(object): def func(self, a, b, c): calls.append((a, b, c)) f = c.run_in_reactor(C().func) f(4, 5, c=6) self.assertEqual(calls, [(4, 5, 6)]) def make_wrapped_function(self): """ Return a function wrapped with run_in_reactor that returns its first argument. """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() @c.run_in_reactor def passthrough(argument): return argument return passthrough def test_deferred_success_result(self): """ If the underlying function returns a Deferred, the wrapper returns a EventualResult hooked up to the Deferred. """ passthrough = self.make_wrapped_function() result = passthrough(succeed(123)) self.assertIsInstance(result, EventualResult) self.assertEqual(result.wait(0.1), 123) def test_deferred_failure_result(self): """ If the underlying function returns a Deferred, the wrapper returns a EventualResult hooked up to the Deferred that can deal with failures as well. """ passthrough = self.make_wrapped_function() result = passthrough(fail(ZeroDivisionError())) self.assertIsInstance(result, EventualResult) self.assertRaises(ZeroDivisionError, result.wait, 0.1) def test_regular_result(self): """ If the underlying function returns a non-Deferred, the wrapper returns a EventualResult hooked up to a Deferred wrapping the result. """ passthrough = self.make_wrapped_function() result = passthrough(123) self.assertIsInstance(result, EventualResult) self.assertEqual(result.wait(0.1), 123) def test_exception_result(self): """ If the underlying function throws an exception, the wrapper returns a EventualResult hooked up to a Deferred wrapping the exception. """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() @c.run_in_reactor def raiser(): 1 / 0 result = raiser() self.assertIsInstance(result, EventualResult) self.assertRaises(ZeroDivisionError, result.wait, 0.1) def test_registry(self): """ @run_in_reactor registers the EventualResult in the ResultRegistry. """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() @c.run_in_reactor def run(): return result = run() self.assertIn(result, c._registry._results) def test_wrapped_function(self): """ The function wrapped by @run_in_reactor can be accessed via the `__wrapped__` attribute. """ c = EventLoop(lambda: None, lambda f, g: None) def func(): pass wrapper = c.run_in_reactor(func) self.assertIdentical(wrapper.__wrapped__, func) def test_async_function(self): """ Async functions can be wrapped with @run_in_reactor. """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() calls = [] @c.run_in_reactor async def go(): self.assertTrue(myreactor.in_call_from_thread) calls.append(1) return 23 self.assertEqual((go().wait(0.1), go().wait(0.1)), (23, 23)) self.assertEqual(len(calls), 2) self.assertFalse(inspect.iscoroutinefunction(go)) class WaitTests(TestCase): """ Tests for wait_for decorators. 
""" def setUp(self): self.reactor = FakeReactor() self.eventloop = EventLoop(lambda: self.reactor, lambda f, g: None) self.eventloop.no_setup() DECORATOR_CALL = "wait_for(timeout=5)" def decorator(self): return lambda func: self.eventloop.wait_for(timeout=5)(func) def make_wrapped_function(self): """ Return a function wrapped with the decorator being tested that returns its first argument, or raises it if it's an exception. """ decorator = self.decorator() @decorator def passthrough(argument): if isinstance(argument, Exception): raise argument return argument return passthrough def test_name(self): """ The function decorated with the wait decorator has the same name as the original function. """ decorator = self.decorator() @decorator def some_name(argument): pass self.assertEqual(some_name.__name__, "some_name") def test_signature(self): """ The function decorated with the wait decorator has the same signature as the original function. """ decorator = self.decorator() def some_name(arg1, arg2, karg1=2, *args, **kw): pass decorated = decorator(some_name) self.assertEqual(inspect.signature(some_name), inspect.signature(decorated)) def test_wrapped_function(self): """ The function wrapped by the wait decorator can be accessed via the `__wrapped__` attribute. """ decorator = self.decorator() def func(): pass wrapper = decorator(func) self.assertIdentical(wrapper.__wrapped__, func) def test_reactor_thread_disallowed(self): """ Functions decorated with the wait decorator cannot be called from the reactor thread. """ self.patch(threadable, "isInIOThread", lambda: True) f = self.make_wrapped_function() self.assertRaises(RuntimeError, f, None) def test_wait_for_reactor_thread(self): """ The function decorated with the wait decorator is run in the reactor thread. """ in_call_from_thread = [] decorator = self.decorator() @decorator def func(): in_call_from_thread.append(self.reactor.in_call_from_thread) in_call_from_thread.append(self.reactor.in_call_from_thread) func() in_call_from_thread.append(self.reactor.in_call_from_thread) self.assertEqual(in_call_from_thread, [False, True, False]) def test_arguments(self): """ The function decorated with wait decorator gets all arguments passed to the wrapper. """ calls = [] decorator = self.decorator() @decorator def func(a, b, c): calls.append((a, b, c)) func(1, 2, c=3) self.assertEqual(calls, [(1, 2, 3)]) def test_classmethod(self): """ The function decorated with the wait decorator can be a classmethod. """ calls = [] decorator = self.decorator() class C(object): @decorator @classmethod def func(cls, a, b, c): calls.append((a, b, c)) @classmethod @decorator def func2(cls, a, b, c): calls.append((a, b, c)) C.func(1, 2, c=3) C.func2(1, 2, c=3) self.assertEqual(calls, [(1, 2, 3), (1, 2, 3)]) def test_deferred_success_result(self): """ If the underlying function returns a Deferred, the wrapper returns a the Deferred's result. """ passthrough = self.make_wrapped_function() result = passthrough(succeed(123)) self.assertEqual(result, 123) def test_deferred_failure_result(self): """ If the underlying function returns a Deferred with an errback, the wrapper throws an exception. """ passthrough = self.make_wrapped_function() self.assertRaises( ZeroDivisionError, passthrough, fail(ZeroDivisionError())) def test_regular_result(self): """ If the underlying function returns a non-Deferred, the wrapper returns that result. 
""" passthrough = self.make_wrapped_function() result = passthrough(123) self.assertEqual(result, 123) def test_exception_result(self): """ If the underlying function throws an exception, the wrapper raises that exception. """ raiser = self.make_wrapped_function() self.assertRaises(ZeroDivisionError, raiser, ZeroDivisionError()) def test_control_c_is_possible(self): """ A call to a decorated function responds to a Ctrl-C (i.e. with a KeyboardInterrupt) in a timely manner. """ if platform.type != "posix": raise SkipTest("I don't have the energy to fight Windows semantics.") program = """\ import os, threading, signal, time, sys import crochet crochet.setup() from twisted.internet.defer import Deferred if sys.platform.startswith('win'): signal.signal(signal.SIGBREAK, signal.default_int_handler) sig_int=signal.CTRL_BREAK_EVENT sig_kill=signal.SIGTERM else: sig_int=signal.SIGINT sig_kill=signal.SIGKILL def interrupt(): time.sleep(0.1) # Make sure we've hit wait() os.kill(os.getpid(), sig_int) time.sleep(1) # Still running, test shall fail... os.kill(os.getpid(), sig_kill) t = threading.Thread(target=interrupt, daemon=True) t.start() @crochet.%s def wait(): return Deferred() try: wait() except KeyboardInterrupt: sys.exit(23) """ % (self.DECORATOR_CALL, ) kw = {'cwd': crochet_directory} if platform.type.startswith('win'): kw['creationflags'] = subprocess.CREATE_NEW_PROCESS_GROUP process = subprocess.Popen([sys.executable, "-c", program], **kw) self.assertEqual(process.wait(), 23) def test_reactor_stop_unblocks(self): """ Any @wait_for_reactor-decorated calls still waiting when the reactor has stopped will get a ReactorStopped exception. """ program = """\ import os, threading, signal, time, sys from twisted.internet.defer import Deferred from twisted.internet import reactor import crochet crochet.setup() @crochet.%s def run(): reactor.callLater(0.1, reactor.stop) return Deferred() try: er = run() except crochet.ReactorStopped: sys.exit(23) """ % (self.DECORATOR_CALL, ) process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory) self.assertEqual(process.wait(), 23) def test_timeoutRaises(self): """ If a function wrapped with wait_for hits the timeout, it raises TimeoutError. """ @self.eventloop.wait_for(timeout=0.5) def times_out(): return Deferred().addErrback(lambda f: f.trap(CancelledError)) start = time.time() self.assertRaises(TimeoutError, times_out) self.assertTrue(abs(time.time() - start - 0.5) < 0.1) def test_timeoutCancels(self): """ If a function wrapped with wait_for hits the timeout, it cancels the underlying Deferred. """ result = Deferred() error = [] result.addErrback(error.append) @self.eventloop.wait_for(timeout=0.0) def times_out(): return result self.assertRaises(TimeoutError, times_out) self.assertIsInstance(error[0].value, CancelledError) def test_async_function(self): """ Async functions can be wrapped with @wait_for. """ @self.eventloop.wait_for(timeout=0.1) async def go(): self.assertTrue(self.reactor.in_call_from_thread) return 17 self.assertEqual((go(), go()), (17, 17)) self.assertFalse(inspect.iscoroutinefunction(go)) class PublicAPITests(TestCase): """ Tests for the public API. """ def test_no_sideeffects(self): """ Creating an EventLoop object, as is done in crochet.__init__, does not call any methods on the objects it is created with. 
""" c = EventLoop( lambda: None, lambda f, g: 1 / 0, lambda *args: 1 / 0, watchdog_thread=object(), reapAllProcesses=lambda: 1 / 0) del c def test_eventloop_api(self): """ An EventLoop object configured with the real reactor and _shutdown.register is exposed via its public methods. """ from twisted.python.log import startLoggingWithObserver from crochet import _shutdown self.assertIsInstance(_main, EventLoop) self.assertEqual(_main.setup, setup_crochet) self.assertEqual(_main.no_setup, no_setup) self.assertEqual(_main.run_in_reactor, run_in_reactor) self.assertEqual(_main.wait_for, wait_for) self.assertIdentical(_main._atexit_register, _shutdown.register) self.assertIdentical( _main._startLoggingWithObserver, startLoggingWithObserver) self.assertIdentical(_main._watchdog_thread, _shutdown._watchdog) def test_eventloop_api_reactor(self): """ The publicly exposed EventLoop will, when setup, use the global reactor. """ from twisted.internet import reactor _main.no_setup() self.assertIdentical(_main._reactor, reactor) def test_retrieve_result(self): """ retrieve_result() calls retrieve() on the global ResultStore. """ dr = EventualResult(Deferred(), None) uid = dr.stash() self.assertIdentical(dr, retrieve_result(uid)) def test_reapAllProcesses(self): """ An EventLoop object configured with the real reapAllProcesses on POSIX plaforms. """ self.assertIdentical(_main._reapAllProcesses, reapAllProcesses) if platform.type != "posix": test_reapAllProcesses.skip = "Only relevant on POSIX platforms" if reapAllProcesses is None: test_reapAllProcesses.skip = "Twisted does not yet support processes" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/tests/test_logging.py0000644000175100001730000000511614450111171020371 0ustar00runnerdocker"""Tests for the logging bridge.""" from __future__ import absolute_import from twisted.trial.unittest import SynchronousTestCase import threading from twisted.python import threadable from .._eventloop import ThreadLogObserver class ThreadLogObserverTest(SynchronousTestCase): """ Tests for ThreadLogObserver. We use Twisted's SyncTestCase to ensure that unhandled logged errors get reported as errors, in particular for test_error. """ def test_stop(self): """ ThreadLogObserver.stop() stops the thread started in __init__. """ threadLog = ThreadLogObserver(None) self.assertTrue(threadLog._thread.is_alive()) threadLog.stop() threadLog._thread.join() self.assertFalse(threadLog._thread.is_alive()) def test_emit(self): """ ThreadLogObserver.emit runs the wrapped observer's in its thread, with the given message. """ messages = [] def observer(msg): messages.append((threading.current_thread().ident, msg)) threadLog = ThreadLogObserver(observer) ident = threadLog._thread.ident msg1 = {} msg2 = {"a": "b"} threadLog(msg1) threadLog(msg2) threadLog.stop() # Wait for writing to finish: threadLog._thread.join() self.assertEqual(messages, [(ident, msg1), (ident, msg2)]) def test_errors(self): """ ThreadLogObserver.emit catches and silently drops exceptions from its observer. 
""" messages = [] counter = [] def observer(msg): counter.append(1) if len(counter) == 2: raise RuntimeError("ono a bug") messages.append(msg) threadLog = ThreadLogObserver(observer) msg1 = {"m": "1"} msg2 = {"m": "2"} msg3 = {"m": "3"} threadLog(msg1) threadLog(msg2) threadLog(msg3) threadLog.stop() # Wait for writing to finish: threadLog._thread.join() self.assertEqual(messages, [msg1, msg3]) def test_ioThreadUnchanged(self): """ ThreadLogObserver does not change the Twisted I/O thread (which is supposed to match the thread the main reactor is running in.) """ threadLog = ThreadLogObserver(None) threadLog.stop() threadLog._thread.join() self.assertIn( threadable.ioThread, # Either reactor was never run, or run in thread running # the tests: (None, threading.current_thread().ident)) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/tests/test_mypy.py0000644000175100001730000003361414450111171017745 0ustar00runnerdocker""" Tests for crochet.mypy. """ from tempfile import NamedTemporaryFile from textwrap import dedent, indent from unittest import TestCase, skipUnless try: import mypy.api MYPY_AVAILABLE = True except ImportError: MYPY_AVAILABLE = False MYPY_CONFIG = dedent( """\ [mypy] plugins = crochet.mypy """ ) @skipUnless(MYPY_AVAILABLE, "Tests require mypy to be installed.") class MypyTests(TestCase): def test_mypy_working(self) -> None: """ mypy's API is able to function and produce errors when expected. """ _assert_mypy(True, "ivar: int = 1\n") _assert_mypy(False, "ivar: int = 'bad'\n") def test_setup_no_args(self) -> None: """ setup() and no_setup() take no arguments. """ _assert_mypy( True, dedent( r""" from crochet import setup setup() """ ), ) _assert_mypy( True, dedent( r""" from crochet import no_setup no_setup() """ ), ) def test_run_in_reactor_func_takes_same_args(self) -> None: """ The mypy plugin correctly passes the wrapped parameter signature through the @run_in_reactor decorator. """ template = dedent( """\ from crochet import run_in_reactor @run_in_reactor def foo({params}) -> None: pass foo({args}) """ ) for params, args, good in ( ( "x: int, y: str, z: float, *a: int, **kw: str", "1, 'something', -1, 4, 5, 6, k1='x', k2='y'", True, ), ( "", "1", False, ), ( "x: int", "", False, ), ( "x: int", "1, 2", False, ), ( "x: int", "'something'", False, ), ( "*x: int", "1, 2, 3", True, ), ( "*x: int", "'something'", False, ), ( "**x: int", "k1=16, k2=-5", True, ), ( "**x: int", "k1='something'", False, ), ( "x: int, y: str", "1, 'ok'", True, ), ( "x: int, y: str", "'not ok', 1", False, ), ( "x: str, y: int", "'ok', 1", True, ), ( "x: str, y: int", "1, 'not ok'", False, ), ): with self.subTest(params=params, args=args): _assert_mypy(good, template.format(params=params, args=args)) def test_run_in_reactor_func_returns_typed_eventual(self) -> None: """ run_in_reactor preserves the decorated function's return type indirectly through an EventualResult. 
""" template = dedent( """\ from typing import Optional from crochet import EventualResult, run_in_reactor @run_in_reactor def foo() -> {return_type}: return {return_value} eventual_result: {receiver_type} = foo() final_result: {final_type} = eventual_result.wait(1) """ ) for return_type, return_value, receiver_type, final_type, good in ( ( "int", "1", "EventualResult[int]", "int", True, ), ( "int", "'str'", "EventualResult[int]", "int", False, ), ( "int", "1", "EventualResult[str]", "int", False, ), ( "int", "1", "EventualResult[str]", "str", False, ), ( "int", "1", "int", "int", False, ), ( "int", "1", "EventualResult[int]", "Optional[int]", True, ), ( "Optional[int]", "1", "EventualResult[Optional[int]]", "Optional[int]", True, ), ( "Optional[int]", "None", "EventualResult[Optional[int]]", "Optional[int]", True, ), ( "Optional[int]", "1", "EventualResult[int]", "Optional[int]", False, ), ( "Optional[int]", "1", "EventualResult[Optional[int]]", "int", False, ), ): with self.subTest( return_type=return_type, return_value=return_value, receiver_type=receiver_type, final_type=final_type, ): _assert_mypy( good, template.format( return_type=return_type, return_value=return_value, receiver_type=receiver_type, final_type=final_type, ), ) def test_run_in_reactor_func_signature_transform(self) -> None: """ The mypy plugin correctly passes the wrapped signature though the @run_in_reactor decorator with an EventualResult-wrapped return type. """ template = dedent( """\ from typing import Callable from crochet import EventualResult, run_in_reactor class Thing: pass @run_in_reactor def foo(x: int, y: str, z: float) -> Thing: return Thing() re_foo: {result_type} = foo """ ) for result_type, good in ( ("Callable[[int, str, float], EventualResult[Thing]]", True), ("Callable[[int, str, float], EventualResult[object]]", True), ("Callable[[int, str, float], EventualResult[int]]", False), ("Callable[[int, str, float], Thing]", False), ("Callable[[int, str, float], int]", False), ("Callable[[int, str], EventualResult[Thing]]", False), ("Callable[[int], EventualResult[Thing]]", False), ("Callable[[], EventualResult[Thing]]", False), ("Callable[[float, int, str], EventualResult[Thing]]", False), ): with self.subTest(result_type=result_type): _assert_mypy(good, template.format(result_type=result_type)) def test_eventual_result_cancel_signature(self) -> None: """ EventualResult's cancel() method takes no arguments. """ _assert_mypy( True, dedent( """\ from crochet import EventualResult def foo(er: EventualResult[object]) -> None: er.cancel() """ ), ) def test_eventual_result_wait_signature(self) -> None: """ EventualResult's wait() method takes one timeout float argument. """ _assert_mypy( True, dedent( """\ from crochet import EventualResult def foo(er: EventualResult[object]) -> object: return er.wait(2.0) """ ), ) _assert_mypy( True, dedent( """\ from crochet import EventualResult def foo(er: EventualResult[object]) -> object: return er.wait(timeout=2.0) """ ), ) def test_eventual_result_stash_signature(self) -> None: """ EventualResult's stash() method takes no arguments and returns the same type retrieve_result's one result_id parameter takes. 
""" _assert_mypy( True, dedent( """\ from crochet import EventualResult, retrieve_result def foo(er: EventualResult[object]) -> None: retrieve_result(er.stash()) retrieve_result(result_id=er.stash()) """ ), ) def test_eventual_result_original_failure_signature(self) -> None: """ EventualResult's original_failure() method takes no arguments and returns an optional Failure. """ _assert_mypy( True, dedent( """\ from typing import Optional from twisted.python.failure import Failure from crochet import EventualResult def foo(er: EventualResult[object]) -> Optional[Failure]: return er.original_failure() """ ), ) _assert_mypy( False, dedent( """\ from twisted.python.failure import Failure from crochet import EventualResult def foo(er: EventualResult[object]) -> Failure: return er.original_failure() """ ), ) def test_exceptions(self) -> None: """ ReactorStopped and TimeoutError are Exception types. """ _assert_mypy( True, dedent( """\ from crochet import ReactorStopped, TimeoutError e1: Exception = ReactorStopped() e2: Exception = TimeoutError() """ ), ) def test_retrieve_result_returns_untyped_eventual_result(self) -> None: """ retrieve_result() returns an untyped EventualResult. """ _assert_mypy( True, dedent( """\ from crochet import EventualResult, retrieve_result r: EventualResult[object] = retrieve_result(3) """ ), ) _assert_mypy( False, dedent( """\ from crochet import EventualResult, retrieve_result r: EventualResult[int] = retrieve_result(3) """ ), ) def test_wait_for_signature(self) -> None: """ The @wait_for decorator takes a timeout float. """ _assert_mypy( True, dedent( """\ from crochet import wait_for @wait_for(1.5) def foo() -> None: pass """ ), ) _assert_mypy( True, dedent( """\ from crochet import wait_for @wait_for(timeout=1.5) def foo() -> None: pass """ ), ) def test_wait_for_func_signature_unchanged(self) -> None: """ The @wait_for(timeout) decorator preserves the wrapped function's signature. """ template = dedent( """\ from typing import Callable from crochet import wait_for class Thing: pass @wait_for(1) def foo(x: int, y: str, z: float) -> Thing: return Thing() re_foo: {result_type} = foo """ ) for result_type, good in ( ("Callable[[int, str, float], Thing]", True), ("Callable[[int, str, float], object]", True), ("Callable[[int, str, float], int]", False), ("Callable[[int, str, float], EventualResult[Thing]]", False), ("Callable[[int, str, float], None]", False), ("Callable[[int, str], Thing]", False), ("Callable[[int], Thing]", False), ("Callable[[], Thing]", False), ("Callable[[float, int, str], Thing]", False), ): with self.subTest(result_type=result_type): _assert_mypy(good, template.format(result_type=result_type)) def test_version_string(self) -> None: """ __version__ is a string. 
""" _assert_mypy( True, dedent( """\ import crochet x: str = crochet.__version__ """ ), ) def _assert_mypy(expect_success: bool, source_code: str) -> None: with NamedTemporaryFile(mode="w+t", delete=False) as config_file: config_file.write(MYPY_CONFIG) out, err, status = mypy.api.run( ["--config-file", config_file.name, "-c", source_code] ) if status not in (0, 1): raise RuntimeError( f"Unexpected mypy error (status {status}):\n{indent(err, ' ' * 2)}" ) if expect_success: assert ( status == 0 ), f"Unexpected mypy failure (status {status}):\n{indent(out, ' ' * 2)}" else: assert status == 1, f"Unexpected mypy success: stdout: {out}\nstderr: {err}\n" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/tests/test_process.py0000644000175100001730000000353514450111171020424 0ustar00runnerdocker""" Tests for IReactorProcess. """ import subprocess import sys from twisted.trial.unittest import TestCase from twisted.python.runtime import platform from ..tests import crochet_directory class ProcessTests(TestCase): """ Tests for process support. """ def test_processExit(self): """ A Crochet-managed reactor notice when a process it started exits. On POSIX platforms this requies waitpid() to be called, which in default Twisted implementation relies on a SIGCHLD handler which is not installed by Crochet at the moment. """ program = """\ from crochet import setup, run_in_reactor setup() import sys import os from twisted.internet.protocol import ProcessProtocol from twisted.internet.defer import Deferred from twisted.internet import reactor class Waiter(ProcessProtocol): def __init__(self): self.result = Deferred() def processExited(self, reason): self.result.callback(None) @run_in_reactor def run(): waiter = Waiter() # Closing FDs before exit forces us to rely on SIGCHLD to notice process # exit: reactor.spawnProcess(waiter, sys.executable, [sys.executable, '-c', 'import os; os.close(0); os.close(1); os.close(2)'], env=os.environ) return waiter.result run().wait(10) # If we don't notice process exit, TimeoutError will be thrown and we won't # reach the next line: sys.stdout.write("abc") """ process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory, stdout=subprocess.PIPE) result = process.stdout.read() self.assertEqual(result, b"abc") if platform.type != "posix": test_processExit.skip = "SIGCHLD is a POSIX-specific issue" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/tests/test_resultstore.py0000644000175100001730000000446114450111171021340 0ustar00runnerdocker""" Tests for _resultstore. """ from twisted.trial.unittest import TestCase from twisted.internet.defer import Deferred, fail, succeed from .._resultstore import ResultStore from .._eventloop import EventualResult class ResultStoreTests(TestCase): """ Tests for ResultStore. """ def test_store_and_retrieve(self): """ EventualResult instances be be stored in a ResultStore and then retrieved using the id returned from store(). """ store = ResultStore() dr = EventualResult(Deferred(), None) uid = store.store(dr) self.assertIdentical(store.retrieve(uid), dr) def test_retrieve_only_once(self): """ Once a result is retrieved, it can no longer be retrieved again. """ store = ResultStore() dr = EventualResult(Deferred(), None) uid = store.store(dr) store.retrieve(uid) self.assertRaises(KeyError, store.retrieve, uid) def test_synchronized(self): """ store() and retrieve() are synchronized. 
""" self.assertTrue(ResultStore.store.synchronized) self.assertTrue(ResultStore.retrieve.synchronized) self.assertTrue(ResultStore.log_errors.synchronized) def test_uniqueness(self): """ Each store() operation returns a larger number, ensuring uniqueness. """ store = ResultStore() dr = EventualResult(Deferred(), None) previous = store.store(dr) for i in range(100): store.retrieve(previous) dr = EventualResult(Deferred(), None) uid = store.store(dr) self.assertTrue(uid > previous) previous = uid def test_log_errors(self): """ Unretrieved EventualResults have their errors, if any, logged on shutdown. """ store = ResultStore() store.store(EventualResult(Deferred(), None)) store.store(EventualResult(fail(ZeroDivisionError()), None)) store.store(EventualResult(succeed(1), None)) store.store(EventualResult(fail(RuntimeError()), None)) store.log_errors() excs = self.flushLoggedErrors(ZeroDivisionError) self.assertEqual(len(excs), 1) excs = self.flushLoggedErrors(RuntimeError) self.assertEqual(len(excs), 1) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/tests/test_setup.py0000644000175100001730000002622014450111171020102 0ustar00runnerdocker""" Tests for the initial setup. """ from __future__ import absolute_import import threading import warnings import subprocess import sys from unittest import SkipTest, TestCase import twisted from twisted.python.log import PythonLoggingObserver from twisted.python import log from twisted.python.runtime import platform from twisted.internet.task import Clock from .._eventloop import EventLoop, ThreadLogObserver, _store from ..tests import crochet_directory class FakeReactor(Clock): """ A fake reactor for testing purposes. """ thread_id = None runs = 0 in_call_from_thread = False def __init__(self): Clock.__init__(self) self.started = threading.Event() self.stopping = False self.events = [] def run(self, installSignalHandlers=True): self.runs += 1 self.thread_id = threading.current_thread().ident self.installSignalHandlers = installSignalHandlers self.started.set() def callFromThread(self, f, *args, **kwargs): self.in_call_from_thread = True f(*args, **kwargs) self.in_call_from_thread = False def stop(self): self.stopping = True def addSystemEventTrigger(self, when, event, f): self.events.append((when, event, f)) class FakeThread: started = False def start(self): self.started = True class SetupTests(TestCase): """ Tests for setup(). """ def test_first_runs_reactor(self): """ With it first call, setup() runs the reactor in a thread. """ reactor = FakeReactor() EventLoop(lambda: reactor, lambda f, *g: None).setup() reactor.started.wait(5) self.assertNotEqual(reactor.thread_id, None) self.assertNotEqual( reactor.thread_id, threading.current_thread().ident) self.assertFalse(reactor.installSignalHandlers) def test_second_does_nothing(self): """ The second call to setup() does nothing. """ reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *g: None) s.setup() s.setup() reactor.started.wait(5) self.assertEqual(reactor.runs, 1) def test_stop_on_exit(self): """ setup() registers an exit handler that stops the reactor, and an exit handler that logs stashed EventualResults. 
""" atexit = [] reactor = FakeReactor() s = EventLoop( lambda: reactor, lambda f, *args: atexit.append((f, args))) s.setup() self.assertEqual(len(atexit), 2) self.assertFalse(reactor.stopping) f, args = atexit[0] self.assertEqual(f, reactor.callFromThread) self.assertEqual(args, (reactor.stop, )) f(*args) self.assertTrue(reactor.stopping) f, args = atexit[1] self.assertEqual(f, _store.log_errors) self.assertEqual(args, ()) f(*args) # make sure it doesn't throw an exception def test_runs_with_lock(self): """ All code in setup() and no_setup() is protected by a lock. """ self.assertTrue(EventLoop.setup.synchronized) self.assertTrue(EventLoop.no_setup.synchronized) def test_logging(self): """ setup() registers a PythonLoggingObserver wrapped in a ThreadLogObserver, removing the default log observer. """ logging = [] def fakeStartLoggingWithObserver(observer, setStdout=1): self.assertIsInstance(observer, ThreadLogObserver) wrapped = observer._observer expected = PythonLoggingObserver.emit # Python 3 and 2 differ in value of __func__: expected = getattr(expected, "__func__", expected) self.assertIs(wrapped.__func__, expected) self.assertEqual(setStdout, False) self.assertTrue(reactor.in_call_from_thread) logging.append(observer) reactor = FakeReactor() loop = EventLoop( lambda: reactor, lambda f, *g: None, fakeStartLoggingWithObserver) loop.setup() self.assertTrue(logging) logging[0].stop() def test_stop_logging_on_exit(self): """ setup() registers a reactor shutdown event that stops the logging thread. """ observers = [] reactor = FakeReactor() s = EventLoop( lambda: reactor, lambda f, *arg: None, lambda observer, setStdout=1: observers.append(observer)) s.setup() self.addCleanup(observers[0].stop) self.assertIn(("after", "shutdown", observers[0].stop), reactor.events) def test_warnings_untouched(self): """ setup() ensure the warnings module's showwarning is unmodified, overriding the change made by normal Twisted logging setup. """ def fakeStartLoggingWithObserver(observer, setStdout=1): warnings.showwarning = log.showwarning self.addCleanup(observer.stop) original = warnings.showwarning reactor = FakeReactor() loop = EventLoop( lambda: reactor, lambda f, *g: None, fakeStartLoggingWithObserver) loop.setup() self.assertIs(warnings.showwarning, original) def test_start_watchdog_thread(self): """ setup() starts the shutdown watchdog thread. """ thread = FakeThread() reactor = FakeReactor() loop = EventLoop( lambda: reactor, lambda *args: None, watchdog_thread=thread) loop.setup() self.assertTrue(thread.started) def test_no_setup(self): """ If called first, no_setup() makes subsequent calls to setup() do nothing. """ observers = [] atexit = [] thread = FakeThread() reactor = FakeReactor() loop = EventLoop( lambda: reactor, lambda f, *arg: atexit.append(f), lambda observer, *a, **kw: observers.append(observer), watchdog_thread=thread) loop.no_setup() loop.setup() self.assertFalse(observers) self.assertFalse(atexit) self.assertFalse(reactor.runs) self.assertFalse(thread.started) def test_no_setup_after_setup(self): """ If called after setup(), no_setup() throws an exception. """ reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *g: None) s.setup() self.assertRaises(RuntimeError, s.no_setup) def test_setup_registry_shutdown(self): """ ResultRegistry.stop() is registered to run before reactor shutdown by setup(). 
""" reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *g: None) s.setup() self.assertEqual( reactor.events, [("before", "shutdown", s._registry.stop)]) def test_no_setup_registry_shutdown(self): """ ResultRegistry.stop() is registered to run before reactor shutdown by setup(). """ reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *g: None) s.no_setup() self.assertEqual( reactor.events, [("before", "shutdown", s._registry.stop)]) class ProcessSetupTests(TestCase): """ setup() enables support for IReactorProcess on POSIX plaforms. """ def test_posix(self): """ On POSIX systems, setup() installs a LoopingCall that runs t.i.process.reapAllProcesses() 10 times a second. """ if platform.type != "posix": raise SkipTest("SIGCHLD is a POSIX-specific issue") reactor = FakeReactor() reaps = [] s = EventLoop( lambda: reactor, lambda f, *g: None, reapAllProcesses=lambda: reaps.append(1)) s.setup() reactor.advance(0.1) self.assertEqual(reaps, [1]) reactor.advance(0.1) self.assertEqual(reaps, [1, 1]) reactor.advance(0.1) self.assertEqual(reaps, [1, 1, 1]) def test_non_posix(self): """ On non-POSIX systems, setup() does not install a LoopingCall. """ if platform.type == "posix": raise SkipTest("This test is for non-POSIX systems.") reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *g: None) s.setup() self.assertFalse(reactor.getDelayedCalls()) class ReactorImportTests(TestCase): """ Tests for when the reactor gets imported. The reactor should only be imported as part of setup()/no_setup(), rather than as side-effect of Crochet import, since daemonization doesn't work if reactor is imported (https://twistedmatrix.com/trac/ticket/7105). """ def test_crochet_import_no_reactor(self): """ Importing crochet should not import the reactor. """ program = """\ import sys import crochet if "twisted.internet.reactor" not in sys.modules: sys.exit(23) """ process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory) self.assertEqual(process.wait(), 23) LOGGING_PROGRAM = """\ import sys from logging import StreamHandler, Formatter, getLogger, DEBUG handler = StreamHandler(sys.stdout) handler.setFormatter(Formatter("%%(levelname)s %%(message)s")) l = getLogger("twisted") l.addHandler(handler) l.setLevel(DEBUG) import crochet crochet.setup() from twisted.python import log %s log.msg("log-info") log.msg("log-error", isError=True) """ class LoggingTests(TestCase): """ End-to-end tests for Twisted->stdlib logging bridge. """ maxDiff = None def test_old_logging(self): """ Messages from the old Twisted logging API are emitted to Python standard library logging. """ if tuple(map(int, twisted.__version__.split("."))) >= (15, 2, 0): raise SkipTest("This test is for Twisted < 15.2.") program = LOGGING_PROGRAM % ("", ) output = subprocess.check_output([sys.executable, "-u", "-c", program], cwd=crochet_directory) self.assertTrue( output.startswith( """\ INFO Log opened. INFO log-info ERROR log-error """)) def test_new_logging(self): """ Messages from both new and old Twisted logging APIs are emitted to Python standard library logging. """ if tuple(map(int, twisted.__version__.split("."))) < (15, 2, 0): raise SkipTest("This test is for Twisted 15.2 and later.") program = LOGGING_PROGRAM % ( """\ from twisted.logger import Logger l2 = Logger() import time time.sleep(1) # workaround, there is race condition... 
somewhere l2.info("logger-info") l2.critical("logger-critical") l2.warn("logger-warning") l2.debug("logger-debug") """, ) output = subprocess.check_output([sys.executable, "-u", "-c", program], cwd=crochet_directory) self.assertIn( """\ INFO logger-info CRITICAL logger-critical WARNING logger-warning DEBUG logger-debug INFO log-info CRITICAL log-error """, output.decode("utf-8").replace("\r\n", "\n")) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/tests/test_shutdown.py0000644000175100001730000000644514450111171020624 0ustar00runnerdocker""" Tests for _shutdown. """ from __future__ import absolute_import import sys import subprocess import time from twisted.trial.unittest import TestCase from crochet._shutdown import ( Watchdog, FunctionRegistry, _watchdog, register, _registry) from ..tests import crochet_directory class ShutdownTests(TestCase): """ Tests for shutdown registration. """ def test_shutdown(self): """ A function registered with _shutdown.register() is called when the main thread exits. """ program = """\ import threading, sys from crochet._shutdown import register, _watchdog _watchdog.start() end = False def thread(): while not end: pass sys.stdout.write("byebye") sys.stdout.flush() def stop(x, y): # Move this into separate test at some point. assert x == 1 assert y == 2 global end end = True threading.Thread(target=thread).start() register(stop, 1, y=2) sys.exit() """ process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory, stdout=subprocess.PIPE) result = process.stdout.read() self.assertEqual(process.wait(), 0) self.assertEqual(result, b"byebye") def test_watchdog(self): """ The watchdog thread exits when the thread it is watching exits, and calls its shutdown function. """ done = [] alive = True class FakeThread: def is_alive(self): return alive w = Watchdog(FakeThread(), lambda: done.append(True)) w.start() time.sleep(0.2) self.assertTrue(w.is_alive()) self.assertFalse(done) alive = False time.sleep(0.2) self.assertTrue(done) self.assertFalse(w.is_alive()) def test_api(self): """ The module exposes a shutdown thread that will call a global registry's run(), and a register function tied to the global registry. """ self.assertIsInstance(_registry, FunctionRegistry) self.assertEqual(register, _registry.register) self.assertIsInstance(_watchdog, Watchdog) self.assertEqual(_watchdog._shutdown_function, _registry.run) class FunctionRegistryTests(TestCase): """ Tests for FunctionRegistry. """ def test_called(self): """ Functions registered with a FunctionRegistry are called in reverse order by run(). """ result = [] registry = FunctionRegistry() registry.register(lambda: result.append(1)) registry.register(lambda x: result.append(x), 2) registry.register(lambda y: result.append(y), y=3) registry.run() self.assertEqual(result, [3, 2, 1]) def test_log_errors(self): """ Registered functions that raise an error have the error logged, and run() continues processing. """ result = [] registry = FunctionRegistry() registry.register(lambda: result.append(2)) registry.register(lambda: 1 / 0) registry.register(lambda: result.append(1)) registry.run() self.assertEqual(result, [1, 2]) excs = self.flushLoggedErrors(ZeroDivisionError) self.assertEqual(len(excs), 1) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/crochet/tests/test_util.py0000644000175100001730000000327414450111171017723 0ustar00runnerdocker""" Tests for crochet._util. 
""" from __future__ import absolute_import from twisted.trial.unittest import TestCase from .._util import synchronized class FakeLock(object): locked = False def __enter__(self): self.locked = True def __exit__(self, type, value, traceback): self.locked = False class Lockable(object): def __init__(self): self._lock = FakeLock() @synchronized def check(self, x, y): if not self._lock.locked: raise RuntimeError() return x, y @synchronized def raiser(self): if not self._lock.locked: raise RuntimeError() raise ZeroDivisionError() class SynchronizedTests(TestCase): """ Tests for the synchronized decorator. """ def test_return(self): """ A method wrapped with @synchronized is called with the lock acquired, and it is released on return. """ obj = Lockable() self.assertEqual(obj.check(1, y=2), (1, 2)) self.assertFalse(obj._lock.locked) def test_raise(self): """ A method wrapped with @synchronized is called with the lock acquired, and it is released on exception raise. """ obj = Lockable() self.assertRaises(ZeroDivisionError, obj.raiser) self.assertFalse(obj._lock.locked) def test_name(self): """ A method wrapped with @synchronized preserves its name. """ self.assertEqual(Lockable.check.__name__, "check") def test_marked(self): """ A method wrapped with @synchronized is marked as synchronized. """ self.assertEqual(Lockable.check.synchronized, True) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1688244862.3213902 crochet-2.1.1/crochet.egg-info/0000755000175100001730000000000014450111176015664 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244862.0 crochet-2.1.1/crochet.egg-info/PKG-INFO0000644000175100001730000002355214450111176016770 0ustar00runnerdockerMetadata-Version: 2.1 Name: crochet Version: 2.1.1 Summary: Use Twisted anywhere! Home-page: https://github.com/itamarst/crochet Maintainer: Itamar Turner-Trauring Maintainer-email: itamar@itamarst.org License: MIT Keywords: twisted threading Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: 3.11 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Programming Language :: Python :: Implementation :: PyPy Requires-Python: >=3.8.0 License-File: LICENSE Crochet: Use Twisted anywhere! ============================== Crochet is an MIT-licensed library that makes it easier to use Twisted from regular blocking code. Some use cases include: * Easily use Twisted from a blocking framework like Django or Flask. * Write a library that provides a blocking API, but uses Twisted for its implementation. * Port blocking code to Twisted more easily, by keeping a backwards compatibility layer. * Allow normal Twisted programs that use threads to interact with Twisted more cleanly from their threaded parts. For example, this can be useful when using Twisted as a `WSGI container`_. .. _WSGI container: https://twistedmatrix.com/documents/current/web/howto/web-in-60/wsgi.html Crochet is maintained by Itamar Turner-Trauring. **Note:** Crochet development is pretty slow these days because mostly it **Just Works**. 
PyPI shows about 30,000 downloads a month, so existing users seem happy: https://pypistats.org/packages/crochet You can install Crochet by running:: $ pip install crochet Downloads are available on `PyPI`_. Documentation can be found on `Read The Docs`_. Bugs and feature requests should be filed at the project `Github page`_. .. _Read the Docs: https://crochet.readthedocs.org/ .. _Github page: https://github.com/itamarst/crochet/ .. _PyPI: https://pypi.python.org/pypi/crochet API and features ================ Crochet supports Python 3.8, 3.9, 3.10, and 3.11 as well as PyPy3. Crochet provides the following basic APIs: * Allow blocking code to call into Twisted and block until results are available or a timeout is hit, using the ``crochet.wait_for`` decorator. * A lower-level API (``crochet.run_in_reactor``) allows blocking code to run code "in the background" in the Twisted thread, with the ability to repeatedly check if it's done. Crochet will do the following on your behalf in order to enable these APIs: * Transparently start Twisted's reactor in a thread it manages. * Shut down the reactor automatically when the process' main thread finishes. * Hook up Twisted's log system to the Python standard library ``logging`` framework. Unlike Twisted's built-in ``logging`` bridge, this includes support for blocking `Handler` instances. What's New ========== 2.1.0 ^^^^^ * Various internal modernizations and maintenance. * Dropped Python 3.6 and 3.7 support. 2.0.0 ^^^^^ New features: * It's possible to decorate ``async/await`` Twisted functions with ``@wait_for`` and ``@run_in_reactor``, thanks to Árni Már Jónsson. * Added type hints, thanks to Merlin Davis. * Added formal support for Python 3.9. Removed features: * Dropped the deprecated APIs ``@wait_for_reactor``, ``@in_reactor``, ``DeferredResult``, the ``wrapped_function`` attribute, and unlimited timeouts on ``EventualResult.wait()``. * Dropped support for Python 2.7 and 3.5. 1.12.0 ^^^^^^ Bug fixes: * Fix a timeout overflow bug in 32-bit machines. 1.11.0 ^^^^^^ New features: * Added support for Python 3.8 and PyPy 3. Backwards incompatibility: * Dropped support for Python 3.4, since latest Twisted doesn't support it. 1.10.0 ^^^^^^ New features: * Added support for Python 3.7. Thanks to Jeremy Cline for the patch. 1.9.0 ^^^^^ New features: * The underlying callable wrapped ``@run_in_reactor`` and ``@wait_for`` is now available via the more standard ``__wrapped__`` attribute. Backwards incompatibility (in tests): * This was actually introduced in 1.8.0: ``wrapped_function`` may not always be available on decorated callables. You should use ``__wrapped__`` instead. Bug fixes: * Fixed regression in 1.8.0 where bound method couldn't be wrapped. Thanks to 2mf for the bug report. 1.8.0 ^^^^^ New features: * Signatures on decorated functions now match the original functions. Thanks to Mikhail Terekhov for the original patch. * Documentation improvements, including an API reference. Bug fixes: * Switched to EPoll reactor for logging thread. Anecdotal evidence suggests this fixes some issues on AWS Lambda, but it's not clear why. Thanks to Rolando Espinoza for the patch. * It's now possible to call ``@run_in_reactor`` and ``@wait_for`` above a ``@classmethod``. Thanks to vak for the bug report. 1.7.0 ^^^^^ Bug fixes: * If the Python ``logging.Handler`` throws an exception Crochet no longer goes into a death spiral. Thanks to Michael Schlenker for the bug report. Removed features: * Versions of Twisted < 16.0 are no longer supported (i.e. 
no longer tested in CI.) 1.6.0 ^^^^^ New features: * Added support for Python 3.6. 1.5.0 ^^^^^ New features: * Added support for Python 3.5. Removed features: * Python 2.6, Python 3.3, and versions of Twisted < 15.0 are no longer supported. 1.4.0 ^^^^^ New features: * Added support for Python 3.4. Documentation: * Added a section on known issues and workarounds. Bug fixes: * Main thread detection (used to determine when Crochet should shutdown) is now less fragile. This means Crochet now supports more environments, e.g. uWSGI. Thanks to Ben Picolo for the patch. 1.3.0 ^^^^^ Bug fixes: * It is now possible to call ``EventualResult.wait()`` (or functions wrapped in ``wait_for``) at import time if another thread holds the import lock. Thanks to Ken Struys for the patch. 1.2.0 ^^^^^ New features: * ``crochet.wait_for`` implements the timeout/cancellation pattern documented in previous versions of Crochet. ``crochet.wait_for_reactor`` and ``EventualResult.wait(timeout=None)`` are now deprecated, since lacking timeouts they could potentially block forever. * Functions wrapped with ``wait_for`` and ``run_in_reactor`` can now be accessed via the ``wrapped_function`` attribute, to ease unit testing of the underlying Twisted code. API changes: * It is no longer possible to call ``EventualResult.wait()`` (or functions wrapped with ``wait_for``) at import time, since this can lead to deadlocks or prevent other threads from importing. Thanks to Tom Prince for the bug report. Bug fixes: * ``warnings`` are no longer erroneously turned into Twisted log messages. * The reactor is now only imported when ``crochet.setup()`` or ``crochet.no_setup()`` are called, allowing daemonization if only ``crochet`` is imported (http://tm.tl/7105). Thanks to Daniel Nephin for the bug report. Documentation: * Improved motivation, added contact info and news to the documentation. * Better example of using Crochet from a normal Twisted application. 1.1.0 ^^^^^ Bug fixes: * ``EventualResult.wait()`` can now be used safely from multiple threads, thanks to Gavin Panella for reporting the bug. * Fixed reentrancy deadlock in the logging code caused by http://bugs.python.org/issue14976, thanks to Rod Morehead for reporting the bug. * Crochet now installs on Python 3.3 again, thanks to Ben Cordero. * Crochet should now work on Windows, thanks to Konstantinos Koukopoulos. * Crochet tests can now run without adding its absolute path to PYTHONPATH or installing it first. Documentation: * ``EventualResult.original_failure`` is now documented. 1.0.0 ^^^^^ Documentation: * Added section on use cases and alternatives. Thanks to Tobias Oberstein for the suggestion. Bug fixes: * Twisted does not have to be pre-installed to run ``setup.py``, thanks to Paul Weaver for bug report and Chris Scutcher for patch. * Importing Crochet does not have side-effects (installing reactor event) any more. * Blocking calls are interrupted earlier in the shutdown process, to reduce scope for deadlocks. Thanks to rmorehead for bug report. 0.9.0 ^^^^^ New features: * Expanded and much improved documentation, including a new section with design suggestions. * New decorator ``@wait_for_reactor`` added, a simpler alternative to ``@run_in_reactor``. * Refactored ``@run_in_reactor``, making it a bit more responsive. * Blocking operations which would otherwise never finish due to reactor having stopped (``EventualResult.wait()`` or ``@wait_for_reactor`` decorated call) will be interrupted with a ``ReactorStopped`` exception. Thanks to rmorehead for the bug report. 
Bug fixes: * ``@run_in_reactor`` decorated functions (or rather, their generated wrapper) are interrupted by Ctrl-C. * On POSIX platforms, a workaround is installed to ensure processes started by `reactor.spawnProcess` have their exit noticed. See `Twisted ticket 6378`_ for more details about the underlying issue. .. _Twisted ticket 6378: http://tm.tl/6738 0.8.1 ^^^^^ * ``EventualResult.wait()`` now raises error if called in the reactor thread, thanks to David Buchmann. * Unittests are now included in the release tarball. * Allow Ctrl-C to interrupt ``EventualResult.wait(timeout=None)``. 0.7.0 ^^^^^ * Improved documentation. 0.6.0 ^^^^^ * Renamed ``DeferredResult`` to ``EventualResult``, to reduce confusion with Twisted's ``Deferred`` class. The old name still works, but is deprecated. * Deprecated ``@in_reactor``, replaced with ``@run_in_reactor`` which doesn't change the arguments to the wrapped function. The deprecated API still works, however. * Unhandled exceptions in ``EventualResult`` objects are logged. * Added more examples. * ``setup.py sdist`` should work now. 0.5.0 ^^^^^ * Initial release. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244862.0 crochet-2.1.1/crochet.egg-info/SOURCES.txt0000644000175100001730000000202514450111176017547 0ustar00runnerdockerLICENSE MANIFEST.in README.rst requirements-dev.txt setup.cfg setup.py versioneer.py crochet/__init__.py crochet/__init__.pyi crochet/_eventloop.py crochet/_resultstore.py crochet/_shutdown.py crochet/_util.py crochet/_version.py crochet/mypy.py crochet/py.typed crochet.egg-info/PKG-INFO crochet.egg-info/SOURCES.txt crochet.egg-info/dependency_links.txt crochet.egg-info/requires.txt crochet.egg-info/top_level.txt crochet/tests/__init__.py crochet/tests/test_api.py crochet/tests/test_logging.py crochet/tests/test_mypy.py crochet/tests/test_process.py crochet/tests/test_resultstore.py crochet/tests/test_setup.py crochet/tests/test_shutdown.py crochet/tests/test_util.py docs/Makefile docs/api-reference.rst docs/api.rst docs/async.rst docs/conf.py docs/index.rst docs/introduction.rst docs/make.bat docs/news.rst docs/type-checking.rst docs/using.rst docs/workarounds.rst examples/async.py examples/blockingdns.py examples/downloader.py examples/fromtwisted.py examples/mxquery.py examples/scheduling.py examples/ssh.py examples/testing.py././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244862.0 crochet-2.1.1/crochet.egg-info/dependency_links.txt0000644000175100001730000000000114450111176021732 0ustar00runnerdocker ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244862.0 crochet-2.1.1/crochet.egg-info/requires.txt0000644000175100001730000000002414450111176020260 0ustar00runnerdockerTwisted>=16.0 wrapt ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244862.0 crochet-2.1.1/crochet.egg-info/top_level.txt0000644000175100001730000000001014450111176020405 0ustar00runnerdockercrochet ././@PaxHeader0000000000000000000000000000003200000000000010210 xustar0026 mtime=1688244862.32539 crochet-2.1.1/docs/0000755000175100001730000000000014450111176013473 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/Makefile0000644000175100001730000001270014450111171015126 0ustar00runnerdocker# Makefile for Sphinx documentation # # You can set these variables from the command line. 
SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . # the i18n builder cannot share the environment and doctrees with the others I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " text to make text files" @echo " man to make manual pages" @echo " texinfo to make Texinfo files" @echo " info to make Texinfo files and run them through makeinfo" @echo " gettext to make PO message catalogs" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: -rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Crochet.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Crochet.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/Crochet" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Crochet" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." 
latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." $(MAKE) -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." texinfo: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." @echo "Run \`make' in that directory to run these through makeinfo" \ "(use \`make info' here to do that automatically)." info: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo "Running Texinfo files through makeinfo..." make -C $(BUILDDIR)/texinfo info @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." gettext: $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale @echo @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/api-reference.rst0000644000175100001730000000057514450111171016734 0ustar00runnerdockerAPI Reference ============= .. autofunction:: crochet.setup() .. autofunction:: crochet.no_setup() .. autofunction:: crochet.run_in_reactor(function) .. autofunction:: crochet.wait_for(timeout) .. autoclass:: crochet.EventualResult :members: .. autofunction:: crochet.retrieve_result(result_id) .. autoexception:: crochet.TimeoutError .. autoexception:: crochet.ReactorStopped ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/api.rst0000644000175100001730000001667414450111171015007 0ustar00runnerdockerUsing Crochet ------------- Using Crochet involves three parts: reactor setup, defining functions that call into Twisted's reactor, and using those functions. Setup ^^^^^ Crochet does a number of things for you as part of setup. Most significantly, it runs Twisted's reactor in a thread it manages. Doing setup is easy, just call the ``setup()`` function: .. code-block:: python from crochet import setup setup() Since Crochet is intended to be used as a library, multiple calls work just fine; if more than one library does ``crochet.setup()`` only the first one will do anything. @wait_for: Blocking calls into Twisted ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Now that you've got the reactor running, the next stage is defining some functions that will run inside the Twisted reactor thread. Twisted's APIs are not thread-safe, and so they cannot be called directly from another thread. Moreover, results may not be available immediately. The easiest way to deal with these issues is to decorate a function that calls Twisted APIs with ``crochet.wait_for``. 
* When the decorated function is called, the code will not run in the calling thread, but rather in the reactor thread. * The function blocks until a result is available from the code running in the Twisted thread. The returned result is the result of running the code; if the code throws an exception, an exception is thrown. * If the underlying code returns a ``Deferred``, it is handled transparently; its results are extracted and passed to the caller. * ``crochet.wait_for`` takes a ``timeout`` argument, a ``float`` indicating the number of seconds to wait until a result is available. If the given number of seconds pass and the underlying operation is still unfinished, a ``crochet.TimeoutError`` exception is raised, and the wrapped ``Deferred`` is canceled. If the underlying API supports cancellation, this might free up any unused resources, close outgoing connections etc., but cancellation is not guaranteed and should not be relied on. To see what this means, let's return to the first example in the documentation: .. literalinclude:: ../examples/blockingdns.py Twisted's ``lookupAddress()`` call returns a ``Deferred``, but the code calling the decorated ``gethostbyname()`` doesn't know that. As far as the caller is concerned, it is just calling a blocking function that returns a result or raises an exception. Run on the command line with a valid domain, we get:: $ python blockingdns.py twistedmatrix.com twistedmatrix.com -> 66.35.39.66 If we try to call the function with an invalid domain, we get back an exception:: $ python blockingdns.py doesnotexist Traceback (most recent call last): File "examples/blockingdns.py", line 33, in ip = gethostbyname(name) File "/home/itamar/crochet/crochet/_eventloop.py", line 434, in wrapper return eventual_result.wait(timeout) File "/home/itamar/crochet/crochet/_eventloop.py", line 216, in wait result.raiseException() File "", line 2, in raiseException twisted.names.error.DNSNameError: ]> You can, similarly, wrap an ``async`` function: .. literalinclude:: ../examples/async.py @run_in_reactor: Asynchronous results ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ``wait_for`` is implemented using ``run_in_reactor``, a more sophisticated and lower-level API. Rather than waiting until a result is available, it returns a special object supporting multiple attempts to retrieve results, as well as manual cancellation. This can be useful for running tasks "in the background", i.e. asynchronously, as opposed to blocking and waiting for them to finish. Decorating a function that calls Twisted APIs with ``run_in_reactor`` has two consequences: * When the function is called, the code will not run in the calling thread, but rather in the reactor thread. * The return value from a decorated function is an ``EventualResult`` instance, wrapping the result of the underlying code, with built-in support for functions that return ``Deferred`` instances as well as ``async`` functions. ``EventualResult`` has the following basic methods: * ``wait(timeout)``: Return the result when it becomes available; if the result is an exception it will be raised. The timeout argument is a ``float`` indicating a number of seconds; ``wait()`` will throw ``crochet.TimeoutError`` if the timeout is hit. * ``cancel()``: Cancel the operation tied to the underlying ``Deferred``. Many, but not all, ``Deferred`` results returned from Twisted allow the underlying operation to be canceled. Even if implemented, cancellation may not be possible for a variety of reasons, e.g. it may be too late.
Its main purpose is to free up no-longer-used resources, and it should not be relied on otherwise. There are also some more specialized methods: * ``original_failure()`` returns the underlying Twisted `Failure`_ object if your result was a raised exception, allowing you to print the original traceback that caused the exception. This is necessary because the default exception you will see raised from ``EventualResult.wait()`` won't include the stack from the underlying Twisted code where the exception originated. * ``stash()``: Sometimes you want to store the ``EventualResult`` in memory for later retrieval. This is specifically useful when you want to store a reference to the ``EventualResult`` in a web session like Flask's (see the example below). ``stash()`` stores the ``EventualResult`` in memory, and returns an integer uid that can be used to retrieve the result using ``crochet.retrieve_result(uid)``. Note that retrieval works only once per uid. You will need to stash the ``EventualResult`` again (with a new resulting uid) if you want to retrieve it again later. In the following example, you can see all of these APIs in use. For each user session, a download is started in the background. Subsequent page refreshes will eventually show the downloaded page. .. literalinclude:: ../examples/downloader.py .. _Failure: https://twistedmatrix.com/documents/current/api/twisted.python.failure.Failure.html Using Crochet from Twisted applications ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ If your application is already planning on running the Twisted reactor itself (e.g. you're using Twisted as a WSGI container), Crochet's default behavior of running the reactor in a thread is a problem. To solve this, Crochet provides the ``no_setup()`` function, which causes future calls to ``setup()`` to do nothing. Thus, an application that will run the Twisted reactor but also wants to use a Crochet-using library must run it first: .. code-block:: python from crochet import no_setup no_setup() # Only now do we import libraries that might run crochet.setup(): import blockinglib # ... setup application ... from twisted.internet import reactor reactor.run() Unit testing ^^^^^^^^^^^^ Both ``@wait_for`` and ``@run_in_reactor`` expose the underlying Twisted function via a ``__wrapped__`` attribute. This allows unit testing of the Twisted code without having to go through the Crochet layer. .. literalinclude:: ../examples/testing.py When run, this gives the following output:: add(1, 2) returns EventualResult: add.__wrapped__(1, 2) is the result of the underlying function: 3 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/async.rst0000644000175100001730000000123014450111171015331 0ustar00runnerdockerDifferences from async/await ============================ Python 3.6 introduces a new mechanism, ``async``/``await``, that allows integrating asynchronous code in a seemingly blocking way. This mechanism is, however, quite different from Crochet. ``await`` gives the illusion of blocking, but can only be used in functions that are marked as ``async``. As such, this is not true blocking integration: the async-ness percolates throughout your program and cannot be restricted to just a single function. In contrast, Crochet allows you to truly block on an asynchronous event: it's just another blocking function call, and can be used in any normal Python function.
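To make the difference concrete, here is a small illustrative sketch (the function names are invented for this example). With plain ``async``/``await``, every caller up the stack must itself be ``async``; the Crochet-wrapped version of the same coroutine is an ordinary function that any blocking code can call:

.. code-block:: python

    from twisted.internet.defer import succeed
    from crochet import setup, wait_for

    setup()

    async def fetch_greeting():
        # Stand-in for some Twisted-based asynchronous work;
        # Deferreds can be awaited inside async functions.
        return await succeed("hello")

    async def async_caller():
        # Pure async/await: only other async functions can call this directly.
        return await fetch_greeting()

    @wait_for(timeout=1.0)
    async def blocking_greeting():
        # Crochet runs the coroutine in the reactor thread and blocks the
        # calling thread until the result is available.
        return await fetch_greeting()

    print(blocking_greeting())  # just a normal, blocking function call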
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/conf.py0000644000175100001730000001725114450111171014773 0ustar00runnerdocker# -*- coding: utf-8 -*- # # Crochet documentation build configuration file, created by # sphinx-quickstart on Mon Sep 16 19:37:18 2013. # # This file is execfile()d with the current directory set to its containing dir. # # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. import sys, os # Make sure local crochet is used when importing: sys.path.insert(0, os.path.abspath('..')) # -- General configuration ----------------------------------------------------- # If your documentation needs a minimal Sphinx version, state it here. #needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be extensions # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. extensions = ["sphinx.ext.autodoc"] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. #source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # General information about the project. project = u'Crochet' copyright = u'2013, Itamar Turner-Trauring' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The short X.Y version. import crochet version = crochet.__version__ # Versioneer adds -dirty suffix to version if checkout is dirty, and # therefore ReadTheDocs somehow ends up with this in its version, so strip # it out. if version.endswith(".dirty"): version = version[:-len(".dirty")] # The full version, including alpha/beta/rc tags. release = version # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. #language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: #today = '' # Else, today_fmt is used as the format for a strftime call. #today_fmt = '%B %d, %Y' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns = ['_build'] # The reST default role (used for this markup: `text`) to use for all documents. #default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. #add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). #add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. #show_authors = False # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' # A list of ignored prefixes for module index sorting. #modindex_common_prefix = [] # -- Options for HTML output --------------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'default' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. 
#html_theme_options = {} # Add any paths that contain custom themes here, relative to this directory. #html_theme_path = [] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". #html_title = None # A shorter title for the navigation bar. Default is the same as html_title. #html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. #html_logo = None # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. #html_favicon = None # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. #html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. #html_use_smartypants = True # Custom sidebar templates, maps document names to template names. #html_sidebars = {} # Additional templates that should be rendered to pages, maps page names to # template names. #html_additional_pages = {} # If false, no module index is generated. #html_domain_indices = True # If false, no index is generated. html_use_index = False # If true, the index is split into individual pages for each letter. #html_split_index = False # If true, links to the reST sources are added to the pages. #html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. #html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. #html_show_copyright = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. #html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). #html_file_suffix = None # Output file base name for HTML help builder. htmlhelp_basename = 'crochetdoc' # -- Options for LaTeX output -------------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). #'papersize': 'letterpaper', # The font size ('10pt', '11pt' or '12pt'). #'pointsize': '10pt', # Additional stuff for the LaTeX preamble. #'preamble': '', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, documentclass [howto/manual]). latex_documents = [ ('index', 'Crochet.tex', u'Crochet Documentation', u'Itamar Turner-Trauring', 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. #latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. #latex_use_parts = False # If true, show page references after internal links. #latex_show_pagerefs = False # If true, show URL addresses after external links. #latex_show_urls = False # Documents to append as an appendix to all manuals. #latex_appendices = [] # If false, no module index is generated. #latex_domain_indices = True # -- Options for manual page output -------------------------------------------- # One entry per manual page. 
List of tuples # (source start file, name, description, authors, manual section). man_pages = [ ('index', 'crochet', u'Crochet Documentation', [u'Itamar Turner-Trauring'], 1) ] # If true, show URL addresses after external links. #man_show_urls = False # -- Options for Texinfo output ------------------------------------------------ # Grouping the document tree into Texinfo files. List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ ('index', 'Crochet', u'Crochet Documentation', u'Itamar Turner-Trauring', 'Crochet', 'One line description of project.', 'Miscellaneous'), ] # Documents to append as an appendix to all manuals. #texinfo_appendices = [] # If false, no module index is generated. #texinfo_domain_indices = True # How to display URL addresses: 'footnote', 'no', or 'inline'. #texinfo_show_urls = 'footnote' ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/index.rst0000644000175100001730000000241214450111171015326 0ustar00runnerdockerUse Twisted Anywhere! ===================== .. raw:: html Crochet is an MIT-licensed library that makes it easier for blocking and threaded applications like Flask or Django to use the Twisted networking framework. Here's an example of a program using Crochet. Notice that you get a completely blocking interface to Twisted and do not need to run the Twisted reactor, the event loop, yourself. .. literalinclude:: ../examples/blockingdns.py Run on the command line:: $ python blockingdns.py twistedmatrix.com twistedmatrix.com -> 66.35.39.66 You can also wrap ``async`` functions. Here is the equivalent code to the previous example, but using an ``async/await`` function: .. code-block:: python @wait_for(timeout=5.0) async def gethostbyname(name): result = await client.lookupAddress(name) return result[0][0].payload.dottedQuad() Table of Contents ^^^^^^^^^^^^^^^^^ .. toctree:: :maxdepth: 3 introduction api using type-checking workarounds async api-reference news ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/introduction.rst0000644000175100001730000000152514450111171016744 0ustar00runnerdockerIntroduction ------------ .. include:: ../README.rst Examples ======== Background scheduling ^^^^^^^^^^^^^^^^^^^^^ You can use Crochet to schedule events that will run in the background without slowing down the page rendering of your web applications: .. literalinclude:: ../examples/scheduling.py SSH into your server ^^^^^^^^^^^^^^^^^^^^ You can SSH into your Python process and get a Python prompt, allowing you to poke around in the internals of your running program: .. literalinclude:: ../examples/ssh.py DNS query ^^^^^^^^^ Twisted also has a fully featured DNS library: .. literalinclude:: ../examples/mxquery.py Using Crochet in normal Twisted code ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ You can use Crochet's APIs for calling into the reactor thread from normal Twisted applications: .. literalinclude:: ../examples/fromtwisted.py ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/make.bat0000644000175100001730000001175214450111171015101 0ustar00runnerdocker@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=_build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . set I18NSPHINXOPTS=%SPHINXOPTS% . 
if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^` where ^ is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. texinfo to make Texinfo files echo. gettext to make PO message catalogs echo. changes to make an overview over all changed/added/deprecated items echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\Crochet.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\Crochet.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. 
The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "texinfo" ( %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo if errorlevel 1 exit /b 1 echo. echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. goto end ) if "%1" == "gettext" ( %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale if errorlevel 1 exit /b 1 echo. echo.Build finished. The message catalogs are in %BUILDDIR%/locale. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) :end ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/news.rst0000644000175100001730000001556214450111171015205 0ustar00runnerdockerWhat's New ========== 2.1.0 ^^^^^ * Various internal modernizations and maintenance. * Dropped Python 3.6 and 3.7 support. 2.0.0 ^^^^^ New features: * It's possible to decorate ``async/await`` Twisted functions with ``@wait_for`` and ``@run_in_reactor``, thanks to Árni Már Jónsson. * Added type hints, thanks to Merlin Davis. * Added formal support for Python 3.9. Removed features: * Dropped the deprecated APIs ``@wait_for_reactor``, ``@in_reactor``, ``DeferredResult``, the ``wrapped_function`` attribute, and unlimited timeouts on ``EventualResult.wait()``. * Dropped support for Python 2.7 and 3.5. 1.12.0 ^^^^^^ Bug fixes: * Fix a timeout overflow bug in 32-bit machines. 1.11.0 ^^^^^^ New features: * Added support for Python 3.8 and PyPy 3. Backwards incompatibility: * Dropped support for Python 3.4, since latest Twisted doesn't support it. 1.10.0 ^^^^^^ New features: * Added support for Python 3.7. Thanks to Jeremy Cline for the patch. 1.9.0 ^^^^^ New features: * The underlying callable wrapped ``@run_in_reactor`` and ``@wait_for`` is now available via the more standard ``__wrapped__`` attribute. Backwards incompatibility (in tests): * This was actually introduced in 1.8.0: ``wrapped_function`` may not always be available on decorated callables. You should use ``__wrapped__`` instead. Bug fixes: * Fixed regression in 1.8.0 where bound method couldn't be wrapped. Thanks to 2mf for the bug report. 1.8.0 ^^^^^ New features: * Signatures on decorated functions now match the original functions. Thanks to Mikhail Terekhov for the original patch. * Documentation improvements, including an API reference. Bug fixes: * Switched to EPoll reactor for logging thread. Anecdotal evidence suggests this fixes some issues on AWS Lambda, but it's not clear why. Thanks to Rolando Espinoza for the patch. * It's now possible to call ``@run_in_reactor`` and ``@wait_for`` above a ``@classmethod``. Thanks to vak for the bug report. 1.7.0 ^^^^^ Bug fixes: * If the Python ``logging.Handler`` throws an exception Crochet no longer goes into a death spiral. Thanks to Michael Schlenker for the bug report. Removed features: * Versions of Twisted < 16.0 are no longer supported (i.e. no longer tested in CI.) 1.6.0 ^^^^^ New features: * Added support for Python 3.6. 
1.5.0 ^^^^^ New features: * Added support for Python 3.5. Removed features: * Python 2.6, Python 3.3, and versions of Twisted < 15.0 are no longer supported. 1.4.0 ^^^^^ New features: * Added support for Python 3.4. Documentation: * Added a section on known issues and workarounds. Bug fixes: * Main thread detection (used to determine when Crochet should shutdown) is now less fragile. This means Crochet now supports more environments, e.g. uWSGI. Thanks to Ben Picolo for the patch. 1.3.0 ^^^^^ Bug fixes: * It is now possible to call ``EventualResult.wait()`` (or functions wrapped in ``wait_for``) at import time if another thread holds the import lock. Thanks to Ken Struys for the patch. 1.2.0 ^^^^^ New features: * ``crochet.wait_for`` implements the timeout/cancellation pattern documented in previous versions of Crochet. ``crochet.wait_for_reactor`` and ``EventualResult.wait(timeout=None)`` are now deprecated, since lacking timeouts they could potentially block forever. * Functions wrapped with ``wait_for`` and ``run_in_reactor`` can now be accessed via the ``wrapped_function`` attribute, to ease unit testing of the underlying Twisted code. API changes: * It is no longer possible to call ``EventualResult.wait()`` (or functions wrapped with ``wait_for``) at import time, since this can lead to deadlocks or prevent other threads from importing. Thanks to Tom Prince for the bug report. Bug fixes: * ``warnings`` are no longer erroneously turned into Twisted log messages. * The reactor is now only imported when ``crochet.setup()`` or ``crochet.no_setup()`` are called, allowing daemonization if only ``crochet`` is imported (http://tm.tl/7105). Thanks to Daniel Nephin for the bug report. Documentation: * Improved motivation, added contact info and news to the documentation. * Better example of using Crochet from a normal Twisted application. 1.1.0 ^^^^^ Bug fixes: * ``EventualResult.wait()`` can now be used safely from multiple threads, thanks to Gavin Panella for reporting the bug. * Fixed reentrancy deadlock in the logging code caused by http://bugs.python.org/issue14976, thanks to Rod Morehead for reporting the bug. * Crochet now installs on Python 3.3 again, thanks to Ben Cordero. * Crochet should now work on Windows, thanks to Konstantinos Koukopoulos. * Crochet tests can now run without adding its absolute path to PYTHONPATH or installing it first. Documentation: * ``EventualResult.original_failure`` is now documented. 1.0.0 ^^^^^ Documentation: * Added section on use cases and alternatives. Thanks to Tobias Oberstein for the suggestion. Bug fixes: * Twisted does not have to be pre-installed to run ``setup.py``, thanks to Paul Weaver for bug report and Chris Scutcher for patch. * Importing Crochet does not have side-effects (installing reactor event) any more. * Blocking calls are interrupted earlier in the shutdown process, to reduce scope for deadlocks. Thanks to rmorehead for bug report. 0.9.0 ^^^^^ New features: * Expanded and much improved documentation, including a new section with design suggestions. * New decorator ``@wait_for_reactor`` added, a simpler alternative to ``@run_in_reactor``. * Refactored ``@run_in_reactor``, making it a bit more responsive. * Blocking operations which would otherwise never finish due to reactor having stopped (``EventualResult.wait()`` or ``@wait_for_reactor`` decorated call) will be interrupted with a ``ReactorStopped`` exception. Thanks to rmorehead for the bug report. 
Bug fixes: * ``@run_in_reactor`` decorated functions (or rather, their generated wrapper) are interrupted by Ctrl-C. * On POSIX platforms, a workaround is installed to ensure processes started by `reactor.spawnProcess` have their exit noticed. See `Twisted ticket 6378`_ for more details about the underlying issue. .. _Twisted ticket 6378: http://tm.tl/6738 0.8.1 ^^^^^ * ``EventualResult.wait()`` now raises error if called in the reactor thread, thanks to David Buchmann. * Unittests are now included in the release tarball. * Allow Ctrl-C to interrupt ``EventualResult.wait(timeout=None)``. 0.7.0 ^^^^^ * Improved documentation. 0.6.0 ^^^^^ * Renamed ``DeferredResult`` to ``EventualResult``, to reduce confusion with Twisted's ``Deferred`` class. The old name still works, but is deprecated. * Deprecated ``@in_reactor``, replaced with ``@run_in_reactor`` which doesn't change the arguments to the wrapped function. The deprecated API still works, however. * Unhandled exceptions in ``EventualResult`` objects are logged. * Added more examples. * ``setup.py sdist`` should work now. 0.5.0 ^^^^^ * Initial release. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/type-checking.rst0000644000175100001730000000445314450111171016760 0ustar00runnerdockerStatic Type Checking -------------------- Crochet comes with type hints for Python 3.6+. However, due to current limitations in ``Callable`` generic construction (see `PEP 612 — Parameter Specification Variables`_), the arguments of a call to a ``@run_in_reactor``-decorated function or method cannot be checked without giving type checkers some special help. Crochet ships with a plugin which fills this role when the ``mypy`` static type checker is used. It resides in ``crochet.mypy`` and must be configured as described in `Configuring mypy to use plugins`_. For example, in a ``mypy.ini`` configuration file:: [mypy] plugins = crochet.mypy This type checking is intended primarily for code which calls the decorated function. As Twisted isn't fully type-hinted yet, and in particular Deferred does not yet have a generic type argument so that the eventual result type can vary, the analysis of the return type of a ``@run_in_reactor`` function/method does not account for a Deferred result. This requires you to lie to the type checker when returning a Deferred; just cast it to the known, eventual result type using ``typing.cast``. For example:: @run_in_reactor def get_time_in_x_seconds(delay: float) -> float: def get_time() -> float: return reactor.seconds() # type: ignore if delay < 0.001: # Close enough; just return the current time. return get_time() else: d = Deferred() def complete(): d.callback(get_time()) reactor.callLater(delay, complete) # type: ignore return typing.cast(float, d) If the mypy plugin is correctly installed, the client code will expect a float from the ``wait()`` of the ``EventualResult`` returned by a call to this function:: # OK t1: float = get_time_in_x_seconds(2).wait(3) print(f"The reactor time is {t1}") # mypy error: Incompatible types in assignment # (expression has type "float", variable has type "str") t2: str = get_time_in_x_seconds(2).wait(3) print(f"The reactor time is {t2}") .. _PEP 612 — Parameter Specification Variables: https://www.python.org/dev/peps/pep-0612/ .. 
_Configuring mypy to use plugins: https://mypy.readthedocs.io/en/latest/extending_mypy.html#configuring-mypy-to-use-plugins ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/using.rst0000644000175100001730000000442014450111171015345 0ustar00runnerdockerBest Practices -------------- Hide Twisted and Crochet ^^^^^^^^^^^^^^^^^^^^^^^^ Consider some synchronous do-one-thing-after-the-other application code that wants to use event-driven Twisted-using code. We have two threads at a minimum: the application thread(s) and the reactor thread. There are also multiple layers of code involved in this interaction: * **Twisted code:** Should only be called in the reactor thread. This may be code from the Twisted package itself, or more likely code you have written that is built on top of Twisted. * **@wait_for/@run_in_reactor wrappers:** The body of the functions runs in the reactor thread... but the caller should be in the application thread. * **The application code:** Runs in the application thread(s), expects synchronous/blocking calls. Sometimes the first two layers suffice, but there are some issues with only having these. First, if you're using ``@run_in_reactor`` it requires the application code to understand Crochet's API, i.e. ``EventualResult`` objects. Second, if the wrapped function returns an object that expects to interact with Twisted, the application code will not be able to use that object since it will be called in the wrong thread. A better solution is to have an additional layer in-between the application code and ``@wait_for/@run_in_reactor`` wrappers. This layer can hide the details of the Crochet API and wrap returned Twisted objects if necessary. As a result, the application code simply sees a normal API, with no need to understand ``EventualResult`` objects or Twisted. In the following example the different layers of the code demonstrate this separation: ``_ExchangeRate`` is just Twisted code, and ``ExchangeRate`` provides a blocking wrapper using Crochet. .. literalinclude:: ../examples/scheduling.py Minimize decorated code ^^^^^^^^^^^^^^^^^^^^^^^ It's best to have as little code as possible in the ``@wait_for/@run_in_reactor`` wrappers. As this code straddles two worlds (or at least, two threads) it is more difficult to unit test. Having an extra layer between this code and the application code is useful in this regard as well: Twisted code can be pushed into the lower-level Twisted layer, and code hiding the Twisted details from the application code can be pushed into the higher-level layer. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/docs/workarounds.rst0000644000175100001730000000702414450111171016601 0ustar00runnerdockerKnown Issues and Workarounds ---------------------------- Don't Call Twisted APIs from non-Twisted threads ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ As is the case in any Twisted program, you should never call Twisted APIs (e.g. ``reactor.callLater``) from non-Twisted threads. Only call Twisted APIs from functions decorated by ``@wait_for`` and friends. Preventing deadlocks on shutdown ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ To ensure a timely process exit, during reactor shutdown Crochet will try to interrupt calls to ``EventualResult.wait()`` or functions decorated with ``@wait_for`` with a ``crochet.ReactorStopped`` exception. This is still not a complete solution, unfortunately.
If you are shutting down a thread pool as part of Twisted's reactor shutdown, this will wait until all threads are done. If you're blocking indefinitely, this may rely on Crochet interrupting those blocking calls... but Crochet's shutdown may be delayed until the thread pool finishes shutting down, depending on the ordering of shutdown events. The solution is to interrupt all blocking calls yourself. You can do this by firing or canceling any ``Deferred`` instances you are waiting on as part of your application shutdown, and do so before you stop any thread pools. Reducing Twisted log messages ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Twisted can be rather verbose with its log messages. If you wish to reduce the message flow you can limit them to error messages only: .. code-block:: python import logging logging.getLogger('twisted').setLevel(logging.ERROR) Missing tracebacks ^^^^^^^^^^^^^^^^^^ In order to prevent massive memory leaks, Twisted currently wipes out the traceback from exceptions it captures (see https://tm.tl/7873 for ideas on improving this). This means that often exceptions re-raised by Crochet will be missing their tracebacks. You can however get access to a string version of the traceback, suitable for logging, from ``EventualResult`` objects returned by ``@run_in_reactor``\-wrapped functions: .. code-block:: python from crochet import run_in_reactor, TimeoutError @run_in_reactor def download_page(url): from twisted.web.client import getPage return getPage(url) result = download_page("https://github.com") try: page = result.wait(timeout=1000) except TimeoutError: # Handle timeout ... except: # Something else happened: print(result.original_failure().getTraceback()) uWSGI, multiprocessing, Celery ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ uWSGI, the standard library ``multiprocessing.py`` library and Celery by default use ``fork()`` without ``exec()`` to create child processes on Unix systems. This means they effectively clone a running parent Python process, preserving all existing imported modules. This is a fundamentally broken thing to do, e.g. it breaks the standard library's ``logging`` package. It also breaks Crochet. You have two options for dealing with this problem. The ideal solution is to avoid this "feature": uWSGI Use the ``--lazy-apps`` command-line option. ``multiprocessing.py`` Use the ``spawn`` (or possibly ``forkserver``) start methods when using Python 3. See https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods for more details. Alternatively, you can ensure you only start Crochet inside the child process: uWSGI Only run ``crochet.setup()`` inside the WSGI application function. ``multiprocessing.py`` Only run ``crochet.setup()`` in the child process. Celery Only run ``crochet.setup()`` inside tasks. ././@PaxHeader0000000000000000000000000000003200000000000010210 xustar0026 mtime=1688244862.32539 crochet-2.1.1/examples/0000755000175100001730000000000014450111176014361 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/examples/async.py0000644000175100001730000000170214450111171016043 0ustar00runnerdocker#!/usr/bin/python """ Async/await DNS lookup using Twisted's APIs. """ from __future__ import print_function # The Twisted code we'll be using: from twisted.names import client from crochet import setup, wait_for setup() # Crochet layer, wrapping Twisted's DNS library in a blocking call. # Uses async/await. 
@wait_for(timeout=5.0) async def gethostbyname(name): """Lookup the IP of a given hostname. Unlike socket.gethostbyname() which can take an arbitrary amount of time to finish, this function will raise crochet.TimeoutError if more than 5 seconds elapse without an answer being received. """ result = await client.lookupAddress(name) return result[0][0].payload.dottedQuad() if __name__ == '__main__': # Application code using the public API - notice it works in a normal # blocking manner, with no event loop visible: import sys name = sys.argv[1] ip = gethostbyname(name) print(name, "->", ip) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/examples/blockingdns.py0000644000175100001730000000167314450111171017232 0ustar00runnerdocker#!/usr/bin/python """ Do a DNS lookup using Twisted's APIs. """ from __future__ import print_function # The Twisted code we'll be using: from twisted.names import client from crochet import setup, wait_for setup() # Crochet layer, wrapping Twisted's DNS library in a blocking call. @wait_for(timeout=5.0) def gethostbyname(name): """Lookup the IP of a given hostname. Unlike socket.gethostbyname() which can take an arbitrary amount of time to finish, this function will raise crochet.TimeoutError if more than 5 seconds elapse without an answer being received. """ d = client.lookupAddress(name) d.addCallback(lambda result: result[0][0].payload.dottedQuad()) return d if __name__ == '__main__': # Application code using the public API - notice it works in a normal # blocking manner, with no event loop visible: import sys name = sys.argv[1] ip = gethostbyname(name) print(name, "->", ip) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/examples/downloader.py0000644000175100001730000000260714450111171017071 0ustar00runnerdocker#!/usr/bin/python """ A flask web application that downloads a page in the background. """ import logging from flask import Flask, session, escape from crochet import setup, run_in_reactor, retrieve_result, TimeoutError # Can be called multiple times with no ill-effect: setup() app = Flask(__name__) @run_in_reactor def download_page(url): """ Download a page. """ from twisted.web.client import getPage return getPage(url) @app.route('/') def index(): if 'download' not in session: # Calling an @run_in_reactor function returns an EventualResult: result = download_page('http://www.google.com') session['download'] = result.stash() return "Starting download, refresh to track progress." # Retrieval is a one-time operation, so the uid in the session cannot be # reused: result = retrieve_result(session.pop('download')) try: download = result.wait(timeout=0.1) return "Downloaded: " + escape(download) except TimeoutError: session['download'] = result.stash() return "Download in progress..." except: # The original traceback of the exception: return "Download failed:\n" + result.original_failure().getTraceback() if __name__ == '__main__': import os, sys logging.basicConfig(stream=sys.stderr, level=logging.DEBUG) app.secret_key = os.urandom(24) app.run() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/examples/fromtwisted.py0000644000175100001730000000231414450111171017275 0ustar00runnerdocker#!/usr/bin/python """ An example of using Crochet from a normal Twisted application.
""" import sys from crochet import no_setup, wait_for # Tell Crochet not to run the reactor: no_setup() from twisted.internet import reactor from twisted.python import log from twisted.web.wsgi import WSGIResource from twisted.web.server import Site from twisted.names import client # A WSGI application; it will be run in a thread pool: def application(environ, start_response): start_response('200 OK', []) try: ip = gethostbyname('twistedmatrix.com') return [("%s has IP %s" % ('twistedmatrix.com', ip)).encode("utf-8")] except Exception as e: return [('Error doing lookup: %s' % (e,)).encode("utf-8")] # A blocking API that will be called from the WSGI application, but actually # uses DNS: @wait_for(timeout=10) def gethostbyname(name): d = client.lookupAddress(name) d.addCallback(lambda result: result[0][0].payload.dottedQuad()) return d # Normal Twisted code, serving the WSGI application and running the reactor: def main(): log.startLogging(sys.stdout) pool = reactor.getThreadPool() reactor.listenTCP(5000, Site(WSGIResource(reactor, pool, application))) reactor.run() if __name__ == '__main__': main() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/examples/mxquery.py0000644000175100001730000000177414450111171016451 0ustar00runnerdocker#!/usr/bin/python """ A command-line application that uses Twisted to do an MX DNS query. """ from __future__ import print_function from twisted.names.client import lookupMailExchange from crochet import setup, wait_for setup() # Twisted code: def _mx(domain): """ Return Deferred that fires with a list of (priority, MX domain) tuples for a given domain. """ def got_records(result): return sorted( [(int(record.payload.preference), str(record.payload.name)) for record in result[0]]) d = lookupMailExchange(domain) d.addCallback(got_records) return d # Blocking wrapper: @wait_for(timeout=5) def mx(domain): """ Return list of (priority, MX domain) tuples for a given domain. """ return _mx(domain) # Application code: def main(domain): print("Mail servers for %s:" % (domain,)) for priority, mailserver in mx(domain): print(priority, mailserver) if __name__ == '__main__': import sys main(sys.argv[1]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/examples/scheduling.py0000644000175100001730000000513014450111171017052 0ustar00runnerdocker#!/usr/bin/python """ An example of scheduling time-based events in the background. Download the latest EUR/USD exchange rate from Yahoo every 30 seconds in the background; the rendered Flask web page can use the latest value without having to do the request itself. Note this example is for demonstration purposes only, and is not actually used in the real world. You should not do this in a real application without reading Yahoo's terms-of-service and following them. """ from __future__ import print_function from flask import Flask from twisted.internet.task import LoopingCall from twisted.web.client import getPage from twisted.python import log from crochet import wait_for, run_in_reactor, setup setup() # Twisted code: class _ExchangeRate(object): """Download an exchange rate from Yahoo Finance using Twisted.""" def __init__(self, name): self._value = None self._name = name # External API: def latest_value(self): """Return the latest exchange rate value. May be None if no value is available.
""" return self._value def start(self): """Start the background process.""" self._lc = LoopingCall(self._download) # Run immediately, and then every 30 seconds: self._lc.start(30, now=True) def _download(self): """Download the page.""" print("Downloading!") def parse(result): print("Got %r back from Yahoo." % (result,)) values = result.strip().split(",") self._value = float(values[1]) d = getPage( "http://download.finance.yahoo.com/d/quotes.csv?e=.csv&f=c4l1&s=%s=X" % (self._name,)) d.addCallback(parse) d.addErrback(log.err) return d # Blocking wrapper: class ExchangeRate(object): """Blocking API for downloading exchange rate.""" def __init__(self, name): self._exchange = _ExchangeRate(name) @run_in_reactor def start(self): self._exchange.start() @wait_for(timeout=1) def latest_value(self): """Return the latest exchange rate value. May be None if no value is available. """ return self._exchange.latest_value() EURUSD = ExchangeRate("EURUSD") app = Flask(__name__) @app.route('/') def index(): rate = EURUSD.latest_value() if rate is None: rate = "unavailable, please refresh the page" return "Current EUR/USD exchange rate is %s." % (rate,) if __name__ == '__main__': import sys, logging logging.basicConfig(stream=sys.stderr, level=logging.DEBUG) EURUSD.start() app.run() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/examples/ssh.py0000644000175100001730000000366514450111171015535 0ustar00runnerdocker#!/usr/bin/python """ A demonstration of Conch, allowing you to SSH into a running Python server and inspect objects at a Python prompt. If you're using the system install of Twisted, you may need to install Conch separately, e.g. on Ubuntu: $ sudo apt-get install python-twisted-conch Once you've started the program, you can ssh in by doing: $ ssh admin@localhost -p 5022 The password is 'secret'. Once you've reached the Python prompt, you have access to the app object, and can import code, etc.: >>> 3 + 4 7 >>> print(app) """ import logging from flask import Flask from crochet import setup, run_in_reactor setup() # Web server: app = Flask(__name__) @app.route('/') def index(): return "Welcome to my boring web server!" @run_in_reactor def start_ssh_server(port, username, password, namespace): """ Start an SSH server on the given port, exposing a Python prompt with the given namespace. """ # This is a lot of boilerplate, see http://tm.tl/6429 for a ticket to # provide a utility function that simplifies this. from twisted.internet import reactor from twisted.conch.insults import insults from twisted.conch import manhole, manhole_ssh from twisted.cred.checkers import ( InMemoryUsernamePasswordDatabaseDontUse as MemoryDB) from twisted.cred.portal import Portal sshRealm = manhole_ssh.TerminalRealm() def chainedProtocolFactory(): return insults.ServerProtocol(manhole.Manhole, namespace) sshRealm.chainedProtocolFactory = chainedProtocolFactory sshPortal = Portal(sshRealm, [MemoryDB(**{username: password})]) reactor.listenTCP(port, manhole_ssh.ConchFactory(sshPortal), interface="127.0.0.1") if __name__ == '__main__': import sys logging.basicConfig(stream=sys.stderr, level=logging.DEBUG) start_ssh_server(5022, "admin", "secret", {"app": app}) app.run() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/examples/testing.py0000644000175100001730000000067614450111171016414 0ustar00runnerdocker#!/usr/bin/python """ Demonstration of accessing wrapped functions for testing. 
""" from __future__ import print_function from crochet import setup, run_in_reactor setup() @run_in_reactor def add(x, y): return x + y if __name__ == '__main__': print("add(1, 2) returns EventualResult:") print(" ", add(1, 2)) print("add.__wrapped__(1, 2) is the result of the underlying function:") print(" ", add.__wrapped__(1, 2)) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/requirements-dev.txt0000644000175100001730000000005514450111171016576 0ustar00runnerdockersphinx tox tox-gh-actions mypy flake8 pylint ././@PaxHeader0000000000000000000000000000003200000000000010210 xustar0026 mtime=1688244862.32539 crochet-2.1.1/setup.cfg0000644000175100001730000000031014450111176014356 0ustar00runnerdocker[versioneer] VCS = git style = pep440 versionfile_source = crochet/_version.py versionfile_build = crochet/_version.py tag_prefix = parentdir_prefix = crochet- [egg_info] tag_build = tag_date = 0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/setup.py0000644000175100001730000000256614450111171014261 0ustar00runnerdockertry: from setuptools import setup except ImportError: from distutils.core import setup import versioneer def read(path): """ Read the contents of a file. """ with open(path, encoding="utf-8") as f: return f.read() setup( classifiers=[ 'Intended Audience :: Developers', 'License :: OSI Approved :: MIT License', 'Operating System :: OS Independent', 'Programming Language :: Python', 'Programming Language :: Python :: 3.8', 'Programming Language :: Python :: 3.9', 'Programming Language :: Python :: 3.10', 'Programming Language :: Python :: 3.11', 'Programming Language :: Python :: Implementation :: CPython', 'Programming Language :: Python :: Implementation :: PyPy', ], name='crochet', version=versioneer.get_version(), cmdclass=versioneer.get_cmdclass(), description="Use Twisted anywhere!", python_requires=">=3.8.0", install_requires=[ "Twisted>=16.0", "wrapt", ], keywords="twisted threading", license="MIT", package_data={"crochet": ["py.typed", "*.pyi"]}, packages=["crochet", "crochet.tests"], url="https://github.com/itamarst/crochet", maintainer='Itamar Turner-Trauring', maintainer_email='itamar@itamarst.org', long_description=read('README.rst') + '\n' + read('docs/news.rst'), ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1688244857.0 crochet-2.1.1/versioneer.py0000644000175100001730000020032314450111171015271 0ustar00runnerdocker # Version: 0.16 """The Versioneer - like a rocketeer, but for versions. The Versioneer ============== * like a rocketeer, but for versions! * https://github.com/warner/python-versioneer * Brian Warner * License: Public Domain * Compatible With: python2.6, 2.7, 3.3, 3.4, 3.5, and pypy * [![Latest Version] (https://pypip.in/version/versioneer/badge.svg?style=flat) ](https://pypi.python.org/pypi/versioneer/) * [![Build Status] (https://travis-ci.org/warner/python-versioneer.png?branch=master) ](https://travis-ci.org/warner/python-versioneer) This is a tool for managing a recorded version number in distutils-based python projects. The goal is to remove the tedious and error-prone "update the embedded version string" step from your release process. Making a new release should be as easy as recording a new tag in your version-control system, and maybe making new tarballs. 
## Quick Install * `pip install versioneer` to somewhere to your $PATH * add a `[versioneer]` section to your setup.cfg (see below) * run `versioneer install` in your source tree, commit the results ## Version Identifiers Source trees come from a variety of places: * a version-control system checkout (mostly used by developers) * a nightly tarball, produced by build automation * a snapshot tarball, produced by a web-based VCS browser, like github's "tarball from tag" feature * a release tarball, produced by "setup.py sdist", distributed through PyPI Within each source tree, the version identifier (either a string or a number, this tool is format-agnostic) can come from a variety of places: * ask the VCS tool itself, e.g. "git describe" (for checkouts), which knows about recent "tags" and an absolute revision-id * the name of the directory into which the tarball was unpacked * an expanded VCS keyword ($Id$, etc) * a `_version.py` created by some earlier build step For released software, the version identifier is closely related to a VCS tag. Some projects use tag names that include more than just the version string (e.g. "myproject-1.2" instead of just "1.2"), in which case the tool needs to strip the tag prefix to extract the version identifier. For unreleased software (between tags), the version identifier should provide enough information to help developers recreate the same tree, while also giving them an idea of roughly how old the tree is (after version 1.2, before version 1.3). Many VCS systems can report a description that captures this, for example `git describe --tags --dirty --always` reports things like "0.7-1-g574ab98-dirty" to indicate that the checkout is one revision past the 0.7 tag, has a unique revision id of "574ab98", and is "dirty" (it has uncommitted changes. The version identifier is used for multiple purposes: * to allow the module to self-identify its version: `myproject.__version__` * to choose a name and prefix for a 'setup.py sdist' tarball ## Theory of Operation Versioneer works by adding a special `_version.py` file into your source tree, where your `__init__.py` can import it. This `_version.py` knows how to dynamically ask the VCS tool for version information at import time. `_version.py` also contains `$Revision$` markers, and the installation process marks `_version.py` to have this marker rewritten with a tag name during the `git archive` command. As a result, generated tarballs will contain enough information to get the proper version. To allow `setup.py` to compute a version too, a `versioneer.py` is added to the top level of your source tree, next to `setup.py` and the `setup.cfg` that configures it. This overrides several distutils/setuptools commands to compute the version when invoked, and changes `setup.py build` and `setup.py sdist` to replace `_version.py` with a small static file that contains just the generated version data. ## Installation First, decide on values for the following configuration variables: * `VCS`: the version control system you use. Currently accepts "git". * `style`: the style of version string to be produced. See "Styles" below for details. Defaults to "pep440", which looks like `TAG[+DISTANCE.gSHORTHASH[.dirty]]`. * `versionfile_source`: A project-relative pathname into which the generated version strings should be written. This is usually a `_version.py` next to your project's main `__init__.py` file, so it can be imported at runtime. 
If your project uses `src/myproject/__init__.py`, this should be `src/myproject/_version.py`. This file should be checked in to your VCS as usual: the copy created below by `setup.py setup_versioneer` will include code that parses expanded VCS keywords in generated tarballs. The 'build' and 'sdist' commands will replace it with a copy that has just the calculated version string. This must be set even if your project does not have any modules (and will therefore never import `_version.py`), since "setup.py sdist" -based trees still need somewhere to record the pre-calculated version strings. Anywhere in the source tree should do. If there is a `__init__.py` next to your `_version.py`, the `setup.py setup_versioneer` command (described below) will append some `__version__`-setting assignments, if they aren't already present. * `versionfile_build`: Like `versionfile_source`, but relative to the build directory instead of the source directory. These will differ when your setup.py uses 'package_dir='. If you have `package_dir={'myproject': 'src/myproject'}`, then you will probably have `versionfile_build='myproject/_version.py'` and `versionfile_source='src/myproject/_version.py'`. If this is set to None, then `setup.py build` will not attempt to rewrite any `_version.py` in the built tree. If your project does not have any libraries (e.g. if it only builds a script), then you should use `versionfile_build = None`. To actually use the computed version string, your `setup.py` will need to override `distutils.command.build_scripts` with a subclass that explicitly inserts a copy of `versioneer.get_version()` into your script file. See `test/demoapp-script-only/setup.py` for an example. * `tag_prefix`: a string, like 'PROJECTNAME-', which appears at the start of all VCS tags. If your tags look like 'myproject-1.2.0', then you should use tag_prefix='myproject-'. If you use unprefixed tags like '1.2.0', this should be an empty string, using either `tag_prefix=` or `tag_prefix=''`. * `parentdir_prefix`: a optional string, frequently the same as tag_prefix, which appears at the start of all unpacked tarball filenames. If your tarball unpacks into 'myproject-1.2.0', this should be 'myproject-'. To disable this feature, just omit the field from your `setup.cfg`. This tool provides one script, named `versioneer`. That script has one mode, "install", which writes a copy of `versioneer.py` into the current directory and runs `versioneer.py setup` to finish the installation. To versioneer-enable your project: * 1: Modify your `setup.cfg`, adding a section named `[versioneer]` and populating it with the configuration values you decided earlier (note that the option names are not case-sensitive): ```` [versioneer] VCS = git style = pep440 versionfile_source = src/myproject/_version.py versionfile_build = myproject/_version.py tag_prefix = parentdir_prefix = myproject- ```` * 2: Run `versioneer install`. This will do the following: * copy `versioneer.py` into the top of your source tree * create `_version.py` in the right place (`versionfile_source`) * modify your `__init__.py` (if one exists next to `_version.py`) to define `__version__` (by calling a function from `_version.py`) * modify your `MANIFEST.in` to include both `versioneer.py` and the generated `_version.py` in sdist tarballs `versioneer install` will complain about any problems it finds with your `setup.py` or `setup.cfg`. Run it multiple times until you have fixed all the problems. 
* 3: add a `import versioneer` to your setup.py, and add the following arguments to the setup() call: version=versioneer.get_version(), cmdclass=versioneer.get_cmdclass(), * 4: commit these changes to your VCS. To make sure you won't forget, `versioneer install` will mark everything it touched for addition using `git add`. Don't forget to add `setup.py` and `setup.cfg` too. ## Post-Installation Usage Once established, all uses of your tree from a VCS checkout should get the current version string. All generated tarballs should include an embedded version string (so users who unpack them will not need a VCS tool installed). If you distribute your project through PyPI, then the release process should boil down to two steps: * 1: git tag 1.0 * 2: python setup.py register sdist upload If you distribute it through github (i.e. users use github to generate tarballs with `git archive`), the process is: * 1: git tag 1.0 * 2: git push; git push --tags Versioneer will report "0+untagged.NUMCOMMITS.gHASH" until your tree has at least one tag in its history. ## Version-String Flavors Code which uses Versioneer can learn about its version string at runtime by importing `_version` from your main `__init__.py` file and running the `get_versions()` function. From the "outside" (e.g. in `setup.py`), you can import the top-level `versioneer.py` and run `get_versions()`. Both functions return a dictionary with different flavors of version information: * `['version']`: A condensed version string, rendered using the selected style. This is the most commonly used value for the project's version string. The default "pep440" style yields strings like `0.11`, `0.11+2.g1076c97`, or `0.11+2.g1076c97.dirty`. See the "Styles" section below for alternative styles. * `['full-revisionid']`: detailed revision identifier. For Git, this is the full SHA1 commit id, e.g. "1076c978a8d3cfc70f408fe5974aa6c092c949ac". * `['dirty']`: a boolean, True if the tree has uncommitted changes. Note that this is only accurate if run in a VCS checkout, otherwise it is likely to be False or None * `['error']`: if the version string could not be computed, this will be set to a string describing the problem, otherwise it will be None. It may be useful to throw an exception in setup.py if this is set, to avoid e.g. creating tarballs with a version string of "unknown". Some variants are more useful than others. Including `full-revisionid` in a bug report should allow developers to reconstruct the exact code being tested (or indicate the presence of local changes that should be shared with the developers). `version` is suitable for display in an "about" box or a CLI `--version` output: it can be easily compared against release notes and lists of bugs fixed in various releases. The installer adds the following text to your `__init__.py` to place a basic version in `YOURPROJECT.__version__`: from ._version import get_versions __version__ = get_versions()['version'] del get_versions ## Styles The setup.cfg `style=` configuration controls how the VCS information is rendered into a version string. The default style, "pep440", produces a PEP440-compliant string, equal to the un-prefixed tag name for actual releases, and containing an additional "local version" section with more detail for in-between builds. For Git, this is TAG[+DISTANCE.gHEX[.dirty]] , using information from `git describe --tags --dirty --always`. 
For example "0.11+2.g1076c97.dirty" indicates that the tree is like the "1076c97" commit but has uncommitted changes (".dirty"), and that this commit is two revisions ("+2") beyond the "0.11" tag. For released software (exactly equal to a known tag), the identifier will only contain the stripped tag, e.g. "0.11". Other styles are available. See details.md in the Versioneer source tree for descriptions. ## Debugging Versioneer tries to avoid fatal errors: if something goes wrong, it will tend to return a version of "0+unknown". To investigate the problem, run `setup.py version`, which will run the version-lookup code in a verbose mode, and will display the full contents of `get_versions()` (including the `error` string, which may help identify what went wrong). ## Updating Versioneer To upgrade your project to a new release of Versioneer, do the following: * install the new Versioneer (`pip install -U versioneer` or equivalent) * edit `setup.cfg`, if necessary, to include any new configuration settings indicated by the release notes * re-run `versioneer install` in your source tree, to replace `SRC/_version.py` * commit any changed files ### Upgrading to 0.16 Nothing special. ### Upgrading to 0.15 Starting with this version, Versioneer is configured with a `[versioneer]` section in your `setup.cfg` file. Earlier versions required the `setup.py` to set attributes on the `versioneer` module immediately after import. The new version will refuse to run (raising an exception during import) until you have provided the necessary `setup.cfg` section. In addition, the Versioneer package provides an executable named `versioneer`, and the installation process is driven by running `versioneer install`. In 0.14 and earlier, the executable was named `versioneer-installer` and was run without an argument. ### Upgrading to 0.14 0.14 changes the format of the version string. 0.13 and earlier used hyphen-separated strings like "0.11-2-g1076c97-dirty". 0.14 and beyond use a plus-separated "local version" section with dot-separated components, like "0.11+2.g1076c97". PEP440-strict tools did not like the old format, but should be ok with the new one. ### Upgrading from 0.11 to 0.12 Nothing special. ### Upgrading from 0.10 to 0.11 You must add `versioneer.VCS = "git"` to your `setup.py` before re-running `setup.py setup_versioneer`. This will enable the use of additional version-control systems (SVN, etc) in the future. ## Future Directions This tool is designed to be easily extended to other version-control systems: all VCS-specific components are in separate directories like src/git/ . The top-level `versioneer.py` script is assembled from these components by running make-versioneer.py . In the future, make-versioneer.py will take a VCS name as an argument, and will construct a version of `versioneer.py` that is specific to the given VCS. It might also take the configuration arguments that are currently provided manually during installation by editing setup.py . Alternatively, it might go the other direction and include code from all supported VCS systems, reducing the number of intermediate scripts. ## License To make Versioneer easier to embed, all its code is dedicated to the public domain. The `_version.py` that it creates is also in the public domain. Specifically, both are released under the Creative Commons "Public Domain Dedication" license (CC0-1.0), as described in https://creativecommons.org/publicdomain/zero/1.0/ .
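## Example: Reading the Version at Runtime

The "Version-String Flavors" section above describes the dictionary returned by `get_versions()`. The sketch below shows one way a project's own code might consume it; the module path `myproject/about.py` and the `version_report()` helper are illustrative placeholders, not part of Versioneer, and assume the installer has already generated `myproject/_version.py` (your `versionfile_source`).

````
# Hypothetical myproject/about.py (illustrative, not generated by Versioneer).
# Assumes 'versioneer install' has created myproject/_version.py.
from ._version import get_versions

_info = get_versions()

# Short PEP440-style string, e.g. "0.11+2.g1076c97.dirty"
__version__ = _info["version"]


def version_report():
    # Longer string suitable for pasting into a bug report.
    parts = [
        "myproject " + _info["version"],
        "revision: " + str(_info["full-revisionid"]),
        "dirty: " + str(_info["dirty"]),
    ]
    if _info["error"]:
        parts.append("error: " + _info["error"])
    return "; ".join(parts)
````

From `setup.py`, the same dictionary is available via the top-level module (e.g. `versioneer.get_versions()`), which can be used to abort a release when the `error` field is set.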
""" from __future__ import print_function try: import configparser except ImportError: import ConfigParser as configparser import errno import json import os import re import subprocess import sys class VersioneerConfig: """Container for Versioneer configuration parameters.""" def get_root(): """Get the project root directory. We require that all commands are run from the project root, i.e. the directory that contains setup.py, setup.cfg, and versioneer.py . """ root = os.path.realpath(os.path.abspath(os.getcwd())) setup_py = os.path.join(root, "setup.py") versioneer_py = os.path.join(root, "versioneer.py") if not (os.path.exists(setup_py) or os.path.exists(versioneer_py)): # allow 'python path/to/setup.py COMMAND' root = os.path.dirname(os.path.realpath(os.path.abspath(sys.argv[0]))) setup_py = os.path.join(root, "setup.py") versioneer_py = os.path.join(root, "versioneer.py") if not (os.path.exists(setup_py) or os.path.exists(versioneer_py)): err = ("Versioneer was unable to run the project root directory. " "Versioneer requires setup.py to be executed from " "its immediate directory (like 'python setup.py COMMAND'), " "or in a way that lets it use sys.argv[0] to find the root " "(like 'python path/to/setup.py COMMAND').") raise VersioneerBadRootError(err) try: # Certain runtime workflows (setup.py install/develop in a setuptools # tree) execute all dependencies in a single python process, so # "versioneer" may be imported multiple times, and python's shared # module-import table will cache the first one. So we can't use # os.path.dirname(__file__), as that will find whichever # versioneer.py was first imported, even in later projects. me = os.path.realpath(os.path.abspath(__file__)) if os.path.splitext(me)[0] != os.path.splitext(versioneer_py)[0]: print("Warning: build in %s is using versioneer.py from %s" % (os.path.dirname(me), versioneer_py)) except NameError: pass return root def get_config_from_root(root): """Read the project setup.cfg file to determine Versioneer config.""" # This might raise EnvironmentError (if setup.cfg is missing), or # configparser.NoSectionError (if it lacks a [versioneer] section), or # configparser.NoOptionError (if it lacks "VCS="). See the docstring at # the top of versioneer.py for instructions on writing your setup.cfg . 
setup_cfg = os.path.join(root, "setup.cfg") parser = configparser.SafeConfigParser() with open(setup_cfg, "r") as f: parser.readfp(f) VCS = parser.get("versioneer", "VCS") # mandatory def get(parser, name): if parser.has_option("versioneer", name): return parser.get("versioneer", name) return None cfg = VersioneerConfig() cfg.VCS = VCS cfg.style = get(parser, "style") or "" cfg.versionfile_source = get(parser, "versionfile_source") cfg.versionfile_build = get(parser, "versionfile_build") cfg.tag_prefix = get(parser, "tag_prefix") if cfg.tag_prefix in ("''", '""'): cfg.tag_prefix = "" cfg.parentdir_prefix = get(parser, "parentdir_prefix") cfg.verbose = get(parser, "verbose") return cfg class NotThisMethod(Exception): """Exception raised if a method is not valid for the current scenario.""" # these dictionaries contain VCS-specific tools LONG_VERSION_PY = {} HANDLERS = {} def register_vcs_handler(vcs, method): # decorator """Decorator to mark a method as the handler for a particular VCS.""" def decorate(f): """Store f in HANDLERS[vcs][method].""" if vcs not in HANDLERS: HANDLERS[vcs] = {} HANDLERS[vcs][method] = f return f return decorate def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False): """Call the given command(s).""" assert isinstance(commands, list) p = None for c in commands: try: dispcmd = str([c] + args) # remember shell=False, so use git.cmd on windows, not just git p = subprocess.Popen([c] + args, cwd=cwd, stdout=subprocess.PIPE, stderr=(subprocess.PIPE if hide_stderr else None)) break except EnvironmentError: e = sys.exc_info()[1] if e.errno == errno.ENOENT: continue if verbose: print("unable to run %s" % dispcmd) print(e) return None else: if verbose: print("unable to find command, tried %s" % (commands,)) return None stdout = p.communicate()[0].strip() if sys.version_info[0] >= 3: stdout = stdout.decode() if p.returncode != 0: if verbose: print("unable to run %s (error)" % dispcmd) return None return stdout LONG_VERSION_PY['git'] = ''' # This file helps to compute a version number in source trees obtained from # git-archive tarball (such as those provided by githubs download-from-tag # feature). Distribution tarballs (built by setup.py sdist) and build # directories (produced by setup.py build) will contain a much shorter file # that just contains the computed version number. # This file is released into the public domain. Generated by # versioneer-0.16 (https://github.com/warner/python-versioneer) """Git implementation of _version.py.""" import errno import os import re import subprocess import sys def get_keywords(): """Get the keywords needed to look up the version information.""" # these strings will be replaced by git during git-archive. # setup.py/versioneer.py will grep for the variable names, so they must # each be defined on a line of their own. _version.py will just call # get_keywords(). 
git_refnames = "%(DOLLAR)sFormat:%%d%(DOLLAR)s" git_full = "%(DOLLAR)sFormat:%%H%(DOLLAR)s" keywords = {"refnames": git_refnames, "full": git_full} return keywords class VersioneerConfig: """Container for Versioneer configuration parameters.""" def get_config(): """Create, populate and return the VersioneerConfig() object.""" # these strings are filled in when 'setup.py versioneer' creates # _version.py cfg = VersioneerConfig() cfg.VCS = "git" cfg.style = "%(STYLE)s" cfg.tag_prefix = "%(TAG_PREFIX)s" cfg.parentdir_prefix = "%(PARENTDIR_PREFIX)s" cfg.versionfile_source = "%(VERSIONFILE_SOURCE)s" cfg.verbose = False return cfg class NotThisMethod(Exception): """Exception raised if a method is not valid for the current scenario.""" LONG_VERSION_PY = {} HANDLERS = {} def register_vcs_handler(vcs, method): # decorator """Decorator to mark a method as the handler for a particular VCS.""" def decorate(f): """Store f in HANDLERS[vcs][method].""" if vcs not in HANDLERS: HANDLERS[vcs] = {} HANDLERS[vcs][method] = f return f return decorate def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False): """Call the given command(s).""" assert isinstance(commands, list) p = None for c in commands: try: dispcmd = str([c] + args) # remember shell=False, so use git.cmd on windows, not just git p = subprocess.Popen([c] + args, cwd=cwd, stdout=subprocess.PIPE, stderr=(subprocess.PIPE if hide_stderr else None)) break except EnvironmentError: e = sys.exc_info()[1] if e.errno == errno.ENOENT: continue if verbose: print("unable to run %%s" %% dispcmd) print(e) return None else: if verbose: print("unable to find command, tried %%s" %% (commands,)) return None stdout = p.communicate()[0].strip() if sys.version_info[0] >= 3: stdout = stdout.decode() if p.returncode != 0: if verbose: print("unable to run %%s (error)" %% dispcmd) return None return stdout def versions_from_parentdir(parentdir_prefix, root, verbose): """Try to determine the version from the parent directory name. Source tarballs conventionally unpack into a directory that includes both the project name and a version string. """ dirname = os.path.basename(root) if not dirname.startswith(parentdir_prefix): if verbose: print("guessing rootdir is '%%s', but '%%s' doesn't start with " "prefix '%%s'" %% (root, dirname, parentdir_prefix)) raise NotThisMethod("rootdir doesn't start with parentdir_prefix") return {"version": dirname[len(parentdir_prefix):], "full-revisionid": None, "dirty": False, "error": None} @register_vcs_handler("git", "get_keywords") def git_get_keywords(versionfile_abs): """Extract version information from the given file.""" # the code embedded in _version.py can just fetch the value of these # keywords. When used from setup.py, we don't want to import _version.py, # so we do it with a regexp instead. This function is not used from # _version.py. 
keywords = {} try: f = open(versionfile_abs, "r") for line in f.readlines(): if line.strip().startswith("git_refnames ="): mo = re.search(r'=\s*"(.*)"', line) if mo: keywords["refnames"] = mo.group(1) if line.strip().startswith("git_full ="): mo = re.search(r'=\s*"(.*)"', line) if mo: keywords["full"] = mo.group(1) f.close() except EnvironmentError: pass return keywords @register_vcs_handler("git", "keywords") def git_versions_from_keywords(keywords, tag_prefix, verbose): """Get version information from git keywords.""" if not keywords: raise NotThisMethod("no keywords at all, weird") refnames = keywords["refnames"].strip() if refnames.startswith("$Format"): if verbose: print("keywords are unexpanded, not using") raise NotThisMethod("unexpanded keywords, not a git-archive tarball") refs = set([r.strip() for r in refnames.strip("()").split(",")]) # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of # just "foo-1.0". If we see a "tag: " prefix, prefer those. TAG = "tag: " tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)]) if not tags: # Either we're using git < 1.8.3, or there really are no tags. We use # a heuristic: assume all version tags have a digit. The old git %%d # expansion behaves like git log --decorate=short and strips out the # refs/heads/ and refs/tags/ prefixes that would let us distinguish # between branches and tags. By ignoring refnames without digits, we # filter out many common branch names like "release" and # "stabilization", as well as "HEAD" and "master". tags = set([r for r in refs if re.search(r'\d', r)]) if verbose: print("discarding '%%s', no digits" %% ",".join(refs-tags)) if verbose: print("likely tags: %%s" %% ",".join(sorted(tags))) for ref in sorted(tags): # sorting will prefer e.g. "2.0" over "2.0rc1" if ref.startswith(tag_prefix): r = ref[len(tag_prefix):] if verbose: print("picking %%s" %% r) return {"version": r, "full-revisionid": keywords["full"].strip(), "dirty": False, "error": None } # no suitable tags, so version is "0+unknown", but full hex is still there if verbose: print("no suitable tags, using unknown + full revision id") return {"version": "0+unknown", "full-revisionid": keywords["full"].strip(), "dirty": False, "error": "no suitable tags"} @register_vcs_handler("git", "pieces_from_vcs") def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): """Get version from 'git describe' in the root of the source tree. This only gets called if the git-archive 'subst' keywords were *not* expanded, and _version.py hasn't already been rewritten with a short version string, meaning we're inside a checked out source tree. """ if not os.path.exists(os.path.join(root, ".git")): if verbose: print("no .git in %%s" %% root) raise NotThisMethod("no .git directory") GITS = ["git"] if sys.platform == "win32": GITS = ["git.cmd", "git.exe"] # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty] # if there isn't one, this yields HEX[-dirty] (no NUM) describe_out = run_command(GITS, ["describe", "--tags", "--dirty", "--always", "--long", "--match", "%%s*" %% tag_prefix], cwd=root) # --long was added in git-1.5.5 if describe_out is None: raise NotThisMethod("'git describe' failed") describe_out = describe_out.strip() full_out = run_command(GITS, ["rev-parse", "HEAD"], cwd=root) if full_out is None: raise NotThisMethod("'git rev-parse' failed") full_out = full_out.strip() pieces = {} pieces["long"] = full_out pieces["short"] = full_out[:7] # maybe improved later pieces["error"] = None # parse describe_out. 
It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty] # TAG might have hyphens. git_describe = describe_out # look for -dirty suffix dirty = git_describe.endswith("-dirty") pieces["dirty"] = dirty if dirty: git_describe = git_describe[:git_describe.rindex("-dirty")] # now we have TAG-NUM-gHEX or HEX if "-" in git_describe: # TAG-NUM-gHEX mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe) if not mo: # unparseable. Maybe git-describe is misbehaving? pieces["error"] = ("unable to parse git-describe output: '%%s'" %% describe_out) return pieces # tag full_tag = mo.group(1) if not full_tag.startswith(tag_prefix): if verbose: fmt = "tag '%%s' doesn't start with prefix '%%s'" print(fmt %% (full_tag, tag_prefix)) pieces["error"] = ("tag '%%s' doesn't start with prefix '%%s'" %% (full_tag, tag_prefix)) return pieces pieces["closest-tag"] = full_tag[len(tag_prefix):] # distance: number of commits since tag pieces["distance"] = int(mo.group(2)) # commit: short hex revision ID pieces["short"] = mo.group(3) else: # HEX: no tags pieces["closest-tag"] = None count_out = run_command(GITS, ["rev-list", "HEAD", "--count"], cwd=root) pieces["distance"] = int(count_out) # total number of commits return pieces def plus_or_dot(pieces): """Return a + if we don't already have one, else return a .""" if "+" in pieces.get("closest-tag", ""): return "." return "+" def render_pep440(pieces): """Build up version string, with post-release "local version identifier". Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you get a tagged build and then dirty it, you'll get TAG+0.gHEX.dirty Exceptions: 1: no tags. git_describe was just HEX. 0+untagged.DISTANCE.gHEX[.dirty] """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] if pieces["distance"] or pieces["dirty"]: rendered += plus_or_dot(pieces) rendered += "%%d.g%%s" %% (pieces["distance"], pieces["short"]) if pieces["dirty"]: rendered += ".dirty" else: # exception #1 rendered = "0+untagged.%%d.g%%s" %% (pieces["distance"], pieces["short"]) if pieces["dirty"]: rendered += ".dirty" return rendered def render_pep440_pre(pieces): """TAG[.post.devDISTANCE] -- No -dirty. Exceptions: 1: no tags. 0.post.devDISTANCE """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] if pieces["distance"]: rendered += ".post.dev%%d" %% pieces["distance"] else: # exception #1 rendered = "0.post.dev%%d" %% pieces["distance"] return rendered def render_pep440_post(pieces): """TAG[.postDISTANCE[.dev0]+gHEX] . The ".dev0" means dirty. Note that .dev0 sorts backwards (a dirty tree will appear "older" than the corresponding clean one), but you shouldn't be releasing software with -dirty anyways. Exceptions: 1: no tags. 0.postDISTANCE[.dev0] """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] if pieces["distance"] or pieces["dirty"]: rendered += ".post%%d" %% pieces["distance"] if pieces["dirty"]: rendered += ".dev0" rendered += plus_or_dot(pieces) rendered += "g%%s" %% pieces["short"] else: # exception #1 rendered = "0.post%%d" %% pieces["distance"] if pieces["dirty"]: rendered += ".dev0" rendered += "+g%%s" %% pieces["short"] return rendered def render_pep440_old(pieces): """TAG[.postDISTANCE[.dev0]] . The ".dev0" means dirty. Eexceptions: 1: no tags. 
0.postDISTANCE[.dev0] """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] if pieces["distance"] or pieces["dirty"]: rendered += ".post%%d" %% pieces["distance"] if pieces["dirty"]: rendered += ".dev0" else: # exception #1 rendered = "0.post%%d" %% pieces["distance"] if pieces["dirty"]: rendered += ".dev0" return rendered def render_git_describe(pieces): """TAG[-DISTANCE-gHEX][-dirty]. Like 'git describe --tags --dirty --always'. Exceptions: 1: no tags. HEX[-dirty] (note: no 'g' prefix) """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] if pieces["distance"]: rendered += "-%%d-g%%s" %% (pieces["distance"], pieces["short"]) else: # exception #1 rendered = pieces["short"] if pieces["dirty"]: rendered += "-dirty" return rendered def render_git_describe_long(pieces): """TAG-DISTANCE-gHEX[-dirty]. Like 'git describe --tags --dirty --always -long'. The distance/hash is unconditional. Exceptions: 1: no tags. HEX[-dirty] (note: no 'g' prefix) """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] rendered += "-%%d-g%%s" %% (pieces["distance"], pieces["short"]) else: # exception #1 rendered = pieces["short"] if pieces["dirty"]: rendered += "-dirty" return rendered def render(pieces, style): """Render the given version pieces into the requested style.""" if pieces["error"]: return {"version": "unknown", "full-revisionid": pieces.get("long"), "dirty": None, "error": pieces["error"]} if not style or style == "default": style = "pep440" # the default if style == "pep440": rendered = render_pep440(pieces) elif style == "pep440-pre": rendered = render_pep440_pre(pieces) elif style == "pep440-post": rendered = render_pep440_post(pieces) elif style == "pep440-old": rendered = render_pep440_old(pieces) elif style == "git-describe": rendered = render_git_describe(pieces) elif style == "git-describe-long": rendered = render_git_describe_long(pieces) else: raise ValueError("unknown style '%%s'" %% style) return {"version": rendered, "full-revisionid": pieces["long"], "dirty": pieces["dirty"], "error": None} def get_versions(): """Get version information or return default if unable to do so.""" # I am in _version.py, which lives at ROOT/VERSIONFILE_SOURCE. If we have # __file__, we can work backwards from there to the root. Some # py2exe/bbfreeze/non-CPython implementations don't do __file__, in which # case we can only use expanded keywords. cfg = get_config() verbose = cfg.verbose try: return git_versions_from_keywords(get_keywords(), cfg.tag_prefix, verbose) except NotThisMethod: pass try: root = os.path.realpath(__file__) # versionfile_source is the relative path from the top of the source # tree (where the .git directory might live) to this file. Invert # this to find the root from __file__. for i in cfg.versionfile_source.split('/'): root = os.path.dirname(root) except NameError: return {"version": "0+unknown", "full-revisionid": None, "dirty": None, "error": "unable to find root of source tree"} try: pieces = git_pieces_from_vcs(cfg.tag_prefix, root, verbose) return render(pieces, cfg.style) except NotThisMethod: pass try: if cfg.parentdir_prefix: return versions_from_parentdir(cfg.parentdir_prefix, root, verbose) except NotThisMethod: pass return {"version": "0+unknown", "full-revisionid": None, "dirty": None, "error": "unable to compute version"} ''' @register_vcs_handler("git", "get_keywords") def git_get_keywords(versionfile_abs): """Extract version information from the given file.""" # the code embedded in _version.py can just fetch the value of these # keywords. 
When used from setup.py, we don't want to import _version.py, # so we do it with a regexp instead. This function is not used from # _version.py. keywords = {} try: f = open(versionfile_abs, "r") for line in f.readlines(): if line.strip().startswith("git_refnames ="): mo = re.search(r'=\s*"(.*)"', line) if mo: keywords["refnames"] = mo.group(1) if line.strip().startswith("git_full ="): mo = re.search(r'=\s*"(.*)"', line) if mo: keywords["full"] = mo.group(1) f.close() except EnvironmentError: pass return keywords @register_vcs_handler("git", "keywords") def git_versions_from_keywords(keywords, tag_prefix, verbose): """Get version information from git keywords.""" if not keywords: raise NotThisMethod("no keywords at all, weird") refnames = keywords["refnames"].strip() if refnames.startswith("$Format"): if verbose: print("keywords are unexpanded, not using") raise NotThisMethod("unexpanded keywords, not a git-archive tarball") refs = set([r.strip() for r in refnames.strip("()").split(",")]) # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of # just "foo-1.0". If we see a "tag: " prefix, prefer those. TAG = "tag: " tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)]) if not tags: # Either we're using git < 1.8.3, or there really are no tags. We use # a heuristic: assume all version tags have a digit. The old git %d # expansion behaves like git log --decorate=short and strips out the # refs/heads/ and refs/tags/ prefixes that would let us distinguish # between branches and tags. By ignoring refnames without digits, we # filter out many common branch names like "release" and # "stabilization", as well as "HEAD" and "master". tags = set([r for r in refs if re.search(r'\d', r)]) if verbose: print("discarding '%s', no digits" % ",".join(refs-tags)) if verbose: print("likely tags: %s" % ",".join(sorted(tags))) for ref in sorted(tags): # sorting will prefer e.g. "2.0" over "2.0rc1" if ref.startswith(tag_prefix): r = ref[len(tag_prefix):] if verbose: print("picking %s" % r) return {"version": r, "full-revisionid": keywords["full"].strip(), "dirty": False, "error": None } # no suitable tags, so version is "0+unknown", but full hex is still there if verbose: print("no suitable tags, using unknown + full revision id") return {"version": "0+unknown", "full-revisionid": keywords["full"].strip(), "dirty": False, "error": "no suitable tags"} @register_vcs_handler("git", "pieces_from_vcs") def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command): """Get version from 'git describe' in the root of the source tree. This only gets called if the git-archive 'subst' keywords were *not* expanded, and _version.py hasn't already been rewritten with a short version string, meaning we're inside a checked out source tree. 
""" if not os.path.exists(os.path.join(root, ".git")): if verbose: print("no .git in %s" % root) raise NotThisMethod("no .git directory") GITS = ["git"] if sys.platform == "win32": GITS = ["git.cmd", "git.exe"] # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty] # if there isn't one, this yields HEX[-dirty] (no NUM) describe_out = run_command(GITS, ["describe", "--tags", "--dirty", "--always", "--long", "--match", "%s*" % tag_prefix], cwd=root) # --long was added in git-1.5.5 if describe_out is None: raise NotThisMethod("'git describe' failed") describe_out = describe_out.strip() full_out = run_command(GITS, ["rev-parse", "HEAD"], cwd=root) if full_out is None: raise NotThisMethod("'git rev-parse' failed") full_out = full_out.strip() pieces = {} pieces["long"] = full_out pieces["short"] = full_out[:7] # maybe improved later pieces["error"] = None # parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty] # TAG might have hyphens. git_describe = describe_out # look for -dirty suffix dirty = git_describe.endswith("-dirty") pieces["dirty"] = dirty if dirty: git_describe = git_describe[:git_describe.rindex("-dirty")] # now we have TAG-NUM-gHEX or HEX if "-" in git_describe: # TAG-NUM-gHEX mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe) if not mo: # unparseable. Maybe git-describe is misbehaving? pieces["error"] = ("unable to parse git-describe output: '%s'" % describe_out) return pieces # tag full_tag = mo.group(1) if not full_tag.startswith(tag_prefix): if verbose: fmt = "tag '%s' doesn't start with prefix '%s'" print(fmt % (full_tag, tag_prefix)) pieces["error"] = ("tag '%s' doesn't start with prefix '%s'" % (full_tag, tag_prefix)) return pieces pieces["closest-tag"] = full_tag[len(tag_prefix):] # distance: number of commits since tag pieces["distance"] = int(mo.group(2)) # commit: short hex revision ID pieces["short"] = mo.group(3) else: # HEX: no tags pieces["closest-tag"] = None count_out = run_command(GITS, ["rev-list", "HEAD", "--count"], cwd=root) pieces["distance"] = int(count_out) # total number of commits return pieces def do_vcs_install(manifest_in, versionfile_source, ipy): """Git-specific installation logic for Versioneer. For Git, this means creating/changing .gitattributes to mark _version.py for export-time keyword substitution. """ GITS = ["git"] if sys.platform == "win32": GITS = ["git.cmd", "git.exe"] files = [manifest_in, versionfile_source] if ipy: files.append(ipy) try: me = __file__ if me.endswith(".pyc") or me.endswith(".pyo"): me = os.path.splitext(me)[0] + ".py" versioneer_file = os.path.relpath(me) except NameError: versioneer_file = "versioneer.py" files.append(versioneer_file) present = False try: f = open(".gitattributes", "r") for line in f.readlines(): if line.strip().startswith(versionfile_source): if "export-subst" in line.strip().split()[1:]: present = True f.close() except EnvironmentError: pass if not present: f = open(".gitattributes", "a+") f.write("%s export-subst\n" % versionfile_source) f.close() files.append(".gitattributes") run_command(GITS, ["add", "--"] + files) def versions_from_parentdir(parentdir_prefix, root, verbose): """Try to determine the version from the parent directory name. Source tarballs conventionally unpack into a directory that includes both the project name and a version string. 
""" dirname = os.path.basename(root) if not dirname.startswith(parentdir_prefix): if verbose: print("guessing rootdir is '%s', but '%s' doesn't start with " "prefix '%s'" % (root, dirname, parentdir_prefix)) raise NotThisMethod("rootdir doesn't start with parentdir_prefix") return {"version": dirname[len(parentdir_prefix):], "full-revisionid": None, "dirty": False, "error": None} SHORT_VERSION_PY = """ # This file was generated by 'versioneer.py' (0.16) from # revision-control system data, or from the parent directory name of an # unpacked source archive. Distribution tarballs contain a pre-generated copy # of this file. import json import sys version_json = ''' %s ''' # END VERSION_JSON def get_versions(): return json.loads(version_json) """ def versions_from_file(filename): """Try to determine the version from _version.py if present.""" try: with open(filename) as f: contents = f.read() except EnvironmentError: raise NotThisMethod("unable to read _version.py") mo = re.search(r"version_json = '''\n(.*)''' # END VERSION_JSON", contents, re.M | re.S) if not mo: raise NotThisMethod("no version_json in _version.py") return json.loads(mo.group(1)) def write_to_version_file(filename, versions): """Write the given version number to the given _version.py file.""" os.unlink(filename) contents = json.dumps(versions, sort_keys=True, indent=1, separators=(",", ": ")) with open(filename, "w") as f: f.write(SHORT_VERSION_PY % contents) print("set %s to '%s'" % (filename, versions["version"])) def plus_or_dot(pieces): """Return a + if we don't already have one, else return a .""" if "+" in pieces.get("closest-tag", ""): return "." return "+" def render_pep440(pieces): """Build up version string, with post-release "local version identifier". Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you get a tagged build and then dirty it, you'll get TAG+0.gHEX.dirty Exceptions: 1: no tags. git_describe was just HEX. 0+untagged.DISTANCE.gHEX[.dirty] """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] if pieces["distance"] or pieces["dirty"]: rendered += plus_or_dot(pieces) rendered += "%d.g%s" % (pieces["distance"], pieces["short"]) if pieces["dirty"]: rendered += ".dirty" else: # exception #1 rendered = "0+untagged.%d.g%s" % (pieces["distance"], pieces["short"]) if pieces["dirty"]: rendered += ".dirty" return rendered def render_pep440_pre(pieces): """TAG[.post.devDISTANCE] -- No -dirty. Exceptions: 1: no tags. 0.post.devDISTANCE """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] if pieces["distance"]: rendered += ".post.dev%d" % pieces["distance"] else: # exception #1 rendered = "0.post.dev%d" % pieces["distance"] return rendered def render_pep440_post(pieces): """TAG[.postDISTANCE[.dev0]+gHEX] . The ".dev0" means dirty. Note that .dev0 sorts backwards (a dirty tree will appear "older" than the corresponding clean one), but you shouldn't be releasing software with -dirty anyways. Exceptions: 1: no tags. 0.postDISTANCE[.dev0] """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] if pieces["distance"] or pieces["dirty"]: rendered += ".post%d" % pieces["distance"] if pieces["dirty"]: rendered += ".dev0" rendered += plus_or_dot(pieces) rendered += "g%s" % pieces["short"] else: # exception #1 rendered = "0.post%d" % pieces["distance"] if pieces["dirty"]: rendered += ".dev0" rendered += "+g%s" % pieces["short"] return rendered def render_pep440_old(pieces): """TAG[.postDISTANCE[.dev0]] . The ".dev0" means dirty. Eexceptions: 1: no tags. 
0.postDISTANCE[.dev0] """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] if pieces["distance"] or pieces["dirty"]: rendered += ".post%d" % pieces["distance"] if pieces["dirty"]: rendered += ".dev0" else: # exception #1 rendered = "0.post%d" % pieces["distance"] if pieces["dirty"]: rendered += ".dev0" return rendered def render_git_describe(pieces): """TAG[-DISTANCE-gHEX][-dirty]. Like 'git describe --tags --dirty --always'. Exceptions: 1: no tags. HEX[-dirty] (note: no 'g' prefix) """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] if pieces["distance"]: rendered += "-%d-g%s" % (pieces["distance"], pieces["short"]) else: # exception #1 rendered = pieces["short"] if pieces["dirty"]: rendered += "-dirty" return rendered def render_git_describe_long(pieces): """TAG-DISTANCE-gHEX[-dirty]. Like 'git describe --tags --dirty --always -long'. The distance/hash is unconditional. Exceptions: 1: no tags. HEX[-dirty] (note: no 'g' prefix) """ if pieces["closest-tag"]: rendered = pieces["closest-tag"] rendered += "-%d-g%s" % (pieces["distance"], pieces["short"]) else: # exception #1 rendered = pieces["short"] if pieces["dirty"]: rendered += "-dirty" return rendered def render(pieces, style): """Render the given version pieces into the requested style.""" if pieces["error"]: return {"version": "unknown", "full-revisionid": pieces.get("long"), "dirty": None, "error": pieces["error"]} if not style or style == "default": style = "pep440" # the default if style == "pep440": rendered = render_pep440(pieces) elif style == "pep440-pre": rendered = render_pep440_pre(pieces) elif style == "pep440-post": rendered = render_pep440_post(pieces) elif style == "pep440-old": rendered = render_pep440_old(pieces) elif style == "git-describe": rendered = render_git_describe(pieces) elif style == "git-describe-long": rendered = render_git_describe_long(pieces) else: raise ValueError("unknown style '%s'" % style) return {"version": rendered, "full-revisionid": pieces["long"], "dirty": pieces["dirty"], "error": None} class VersioneerBadRootError(Exception): """The project root directory is unknown or missing key files.""" def get_versions(verbose=False): """Get the project version from whatever source is available. Returns dict with two keys: 'version' and 'full'. """ if "versioneer" in sys.modules: # see the discussion in cmdclass.py:get_cmdclass() del sys.modules["versioneer"] root = get_root() cfg = get_config_from_root(root) assert cfg.VCS is not None, "please set [versioneer]VCS= in setup.cfg" handlers = HANDLERS.get(cfg.VCS) assert handlers, "unrecognized VCS '%s'" % cfg.VCS verbose = verbose or cfg.verbose assert cfg.versionfile_source is not None, \ "please set versioneer.versionfile_source" assert cfg.tag_prefix is not None, "please set versioneer.tag_prefix" versionfile_abs = os.path.join(root, cfg.versionfile_source) # extract version from first of: _version.py, VCS command (e.g. 'git # describe'), parentdir. This is meant to work for developers using a # source checkout, for users of a tarball created by 'setup.py sdist', # and for users of a tarball/zipball created by 'git archive' or github's # download-from-tag feature or the equivalent in other VCSes. 
get_keywords_f = handlers.get("get_keywords") from_keywords_f = handlers.get("keywords") if get_keywords_f and from_keywords_f: try: keywords = get_keywords_f(versionfile_abs) ver = from_keywords_f(keywords, cfg.tag_prefix, verbose) if verbose: print("got version from expanded keyword %s" % ver) return ver except NotThisMethod: pass try: ver = versions_from_file(versionfile_abs) if verbose: print("got version from file %s %s" % (versionfile_abs, ver)) return ver except NotThisMethod: pass from_vcs_f = handlers.get("pieces_from_vcs") if from_vcs_f: try: pieces = from_vcs_f(cfg.tag_prefix, root, verbose) ver = render(pieces, cfg.style) if verbose: print("got version from VCS %s" % ver) return ver except NotThisMethod: pass try: if cfg.parentdir_prefix: ver = versions_from_parentdir(cfg.parentdir_prefix, root, verbose) if verbose: print("got version from parentdir %s" % ver) return ver except NotThisMethod: pass if verbose: print("unable to compute version") return {"version": "0+unknown", "full-revisionid": None, "dirty": None, "error": "unable to compute version"} def get_version(): """Get the short version string for this project.""" return get_versions()["version"] def get_cmdclass(): """Get the custom setuptools/distutils subclasses used by Versioneer.""" if "versioneer" in sys.modules: del sys.modules["versioneer"] # this fixes the "python setup.py develop" case (also 'install' and # 'easy_install .'), in which subdependencies of the main project are # built (using setup.py bdist_egg) in the same python process. Assume # a main project A and a dependency B, which use different versions # of Versioneer. A's setup.py imports A's Versioneer, leaving it in # sys.modules by the time B's setup.py is executed, causing B to run # with the wrong versioneer. Setuptools wraps the sub-dep builds in a # sandbox that restores sys.modules to it's pre-build state, so the # parent is protected against the child's "import versioneer". By # removing ourselves from sys.modules here, before the child build # happens, we protect the child from the parent's versioneer too. # Also see https://github.com/warner/python-versioneer/issues/52 cmds = {} # we add "version" to both distutils and setuptools from distutils.core import Command class cmd_version(Command): description = "report generated version string" user_options = [] boolean_options = [] def initialize_options(self): pass def finalize_options(self): pass def run(self): vers = get_versions(verbose=True) print("Version: %s" % vers["version"]) print(" full-revisionid: %s" % vers.get("full-revisionid")) print(" dirty: %s" % vers.get("dirty")) if vers["error"]: print(" error: %s" % vers["error"]) cmds["version"] = cmd_version # we override "build_py" in both distutils and setuptools # # most invocation pathways end up running build_py: # distutils/build -> build_py # distutils/install -> distutils/build ->.. # setuptools/bdist_wheel -> distutils/install ->.. # setuptools/bdist_egg -> distutils/install_lib -> build_py # setuptools/install -> bdist_egg ->.. # setuptools/develop -> ? 
# we override different "build_py" commands for both environments if "setuptools" in sys.modules: from setuptools.command.build_py import build_py as _build_py else: from distutils.command.build_py import build_py as _build_py class cmd_build_py(_build_py): def run(self): root = get_root() cfg = get_config_from_root(root) versions = get_versions() _build_py.run(self) # now locate _version.py in the new build/ directory and replace # it with an updated value if cfg.versionfile_build: target_versionfile = os.path.join(self.build_lib, cfg.versionfile_build) print("UPDATING %s" % target_versionfile) write_to_version_file(target_versionfile, versions) cmds["build_py"] = cmd_build_py if "cx_Freeze" in sys.modules: # cx_freeze enabled? from cx_Freeze.dist import build_exe as _build_exe class cmd_build_exe(_build_exe): def run(self): root = get_root() cfg = get_config_from_root(root) versions = get_versions() target_versionfile = cfg.versionfile_source print("UPDATING %s" % target_versionfile) write_to_version_file(target_versionfile, versions) _build_exe.run(self) os.unlink(target_versionfile) with open(cfg.versionfile_source, "w") as f: LONG = LONG_VERSION_PY[cfg.VCS] f.write(LONG % {"DOLLAR": "$", "STYLE": cfg.style, "TAG_PREFIX": cfg.tag_prefix, "PARENTDIR_PREFIX": cfg.parentdir_prefix, "VERSIONFILE_SOURCE": cfg.versionfile_source, }) cmds["build_exe"] = cmd_build_exe del cmds["build_py"] # we override different "sdist" commands for both environments if "setuptools" in sys.modules: from setuptools.command.sdist import sdist as _sdist else: from distutils.command.sdist import sdist as _sdist class cmd_sdist(_sdist): def run(self): versions = get_versions() self._versioneer_generated_versions = versions # unless we update this, the command will keep using the old # version self.distribution.metadata.version = versions["version"] return _sdist.run(self) def make_release_tree(self, base_dir, files): root = get_root() cfg = get_config_from_root(root) _sdist.make_release_tree(self, base_dir, files) # now locate _version.py in the new base_dir directory # (remembering that it may be a hardlink) and replace it with an # updated value target_versionfile = os.path.join(base_dir, cfg.versionfile_source) print("UPDATING %s" % target_versionfile) write_to_version_file(target_versionfile, self._versioneer_generated_versions) cmds["sdist"] = cmd_sdist return cmds CONFIG_ERROR = """ setup.cfg is missing the necessary Versioneer configuration. You need a section like: [versioneer] VCS = git style = pep440 versionfile_source = src/myproject/_version.py versionfile_build = myproject/_version.py tag_prefix = parentdir_prefix = myproject- You will also need to edit your setup.py to use the results: import versioneer setup(version=versioneer.get_version(), cmdclass=versioneer.get_cmdclass(), ...) Please read the docstring in ./versioneer.py for configuration instructions, edit setup.cfg, and re-run the installer or 'python versioneer.py setup'. """ SAMPLE_CONFIG = """ # See the docstring in versioneer.py for instructions. Note that you must # re-run 'versioneer.py setup' after changing this section, and commit the # resulting files. 
[versioneer] #VCS = git #style = pep440 #versionfile_source = #versionfile_build = #tag_prefix = #parentdir_prefix = """ INIT_PY_SNIPPET = """ from ._version import get_versions __version__ = get_versions()['version'] del get_versions """ def do_setup(): """Main VCS-independent setup function for installing Versioneer.""" root = get_root() try: cfg = get_config_from_root(root) except (EnvironmentError, configparser.NoSectionError, configparser.NoOptionError) as e: if isinstance(e, (EnvironmentError, configparser.NoSectionError)): print("Adding sample versioneer config to setup.cfg", file=sys.stderr) with open(os.path.join(root, "setup.cfg"), "a") as f: f.write(SAMPLE_CONFIG) print(CONFIG_ERROR, file=sys.stderr) return 1 print(" creating %s" % cfg.versionfile_source) with open(cfg.versionfile_source, "w") as f: LONG = LONG_VERSION_PY[cfg.VCS] f.write(LONG % {"DOLLAR": "$", "STYLE": cfg.style, "TAG_PREFIX": cfg.tag_prefix, "PARENTDIR_PREFIX": cfg.parentdir_prefix, "VERSIONFILE_SOURCE": cfg.versionfile_source, }) ipy = os.path.join(os.path.dirname(cfg.versionfile_source), "__init__.py") if os.path.exists(ipy): try: with open(ipy, "r") as f: old = f.read() except EnvironmentError: old = "" if INIT_PY_SNIPPET not in old: print(" appending to %s" % ipy) with open(ipy, "a") as f: f.write(INIT_PY_SNIPPET) else: print(" %s unmodified" % ipy) else: print(" %s doesn't exist, ok" % ipy) ipy = None # Make sure both the top-level "versioneer.py" and versionfile_source # (PKG/_version.py, used by runtime code) are in MANIFEST.in, so # they'll be copied into source distributions. Pip won't be able to # install the package without this. manifest_in = os.path.join(root, "MANIFEST.in") simple_includes = set() try: with open(manifest_in, "r") as f: for line in f: if line.startswith("include "): for include in line.split()[1:]: simple_includes.add(include) except EnvironmentError: pass # That doesn't cover everything MANIFEST.in can do # (http://docs.python.org/2/distutils/sourcedist.html#commands), so # it might give some false negatives. Appending redundant 'include' # lines is safe, though. if "versioneer.py" not in simple_includes: print(" appending 'versioneer.py' to MANIFEST.in") with open(manifest_in, "a") as f: f.write("include versioneer.py\n") else: print(" 'versioneer.py' already in MANIFEST.in") if cfg.versionfile_source not in simple_includes: print(" appending versionfile_source ('%s') to MANIFEST.in" % cfg.versionfile_source) with open(manifest_in, "a") as f: f.write("include %s\n" % cfg.versionfile_source) else: print(" versionfile_source already in MANIFEST.in") # Make VCS-specific changes. For git, this means creating/changing # .gitattributes to mark _version.py for export-time keyword # substitution. do_vcs_install(manifest_in, cfg.versionfile_source, ipy) return 0 def scan_setup_py(): """Validate the contents of setup.py against Versioneer's expectations.""" found = set() setters = False errors = 0 with open("setup.py", "r") as f: for line in f.readlines(): if "import versioneer" in line: found.add("import") if "versioneer.get_cmdclass()" in line: found.add("cmdclass") if "versioneer.get_version()" in line: found.add("get_version") if "versioneer.VCS" in line: setters = True if "versioneer.versionfile_source" in line: setters = True if len(found) != 3: print("") print("Your setup.py appears to be missing some important items") print("(but I might be wrong). 
Please make sure it has something") print("roughly like the following:") print("") print(" import versioneer") print(" setup( version=versioneer.get_version(),") print(" cmdclass=versioneer.get_cmdclass(), ...)") print("") errors += 1 if setters: print("You should remove lines like 'versioneer.VCS = ' and") print("'versioneer.versionfile_source = ' . This configuration") print("now lives in setup.cfg, and should be removed from setup.py") print("") errors += 1 return errors if __name__ == "__main__": cmd = sys.argv[1] if cmd == "setup": errors = do_setup() errors += scan_setup_py() if errors: sys.exit(1)