crochet-1.4.0/0000775000175000017500000000000012522534160014637 5ustar itamarstitamarst00000000000000crochet-1.4.0/PKG-INFO0000664000175000017500000002151612522534160015741 0ustar itamarstitamarst00000000000000Metadata-Version: 1.1 Name: crochet Version: 1.4.0 Summary: Use Twisted anywhere! Home-page: https://github.com/itamarst/crochet Author: Itamar Turner-Trauring Author-email: itamar@itamarst.org License: MIT Description: Crochet: Use Twisted Anywhere! ============================== Crochet is an MIT-licensed library that makes it easier to use Twisted from regular blocking code. Some use cases include: * Easily use Twisted from a blocking framework like Django or Flask. * Write a library that provides a blocking API, but uses Twisted for its implementation. * Port blocking code to Twisted more easily, by keeping a backwards compatibility layer. * Allow normal Twisted programs that use threads to interact with Twisted more cleanly from their threaded parts. For example this can be useful when using Twisted as a `WSGI container`_. .. _WSGI container: https://twistedmatrix.com/documents/current/web/howto/web-in-60/wsgi.html Crochet is maintained by Itamar Turner-Trauring. Downloads are available on `PyPI`_. Documentation can be found on `Read The Docs`_. Bugs and feature requests should be filed at the project `Github page`_. .. _Read the Docs: https://crochet.readthedocs.org/ .. _Github page: https://github.com/itamarst/crochet/ .. _PyPI: https://pypi.python.org/pypi/crochet Features ======== Crochet aims for 100% unit test coverage, and supports Python 2.6, 2.7, 3.3 and 3.4 as well as PyPy. .. image:: https://travis-ci.org/itamarst/crochet.png?branch=master :target: http://travis-ci.org/itamarst/crochet :alt: Build Status Crochet provides the following general features: * Allow blocking code to call into Twisted and block until results are available or a timeout is hit, using the ``crochet.wait_for`` decorator. * A lower-level API (``crochet.run_in_reactor``) allows blocking code to run code "in the background" in the Twisted thread, with ability to repeatedly check if it's done. Additionally Crochet can: * Transparently start Twisted's reactor in a thread it manages. * The reactor shuts down automatically when the process' main thread finishes. * Hooks up Twisted's log system to the Python standard library ``logging`` framework. Unlike Twisted's built-in ``logging`` bridge, this includes support for blocking `Handler` instances. What's New ========== 1.4.0 ^^^^^ New features: * Added support for Python 3.4. Documentation: * Added a section on known issues and workarounds. Bug fixes: * Main thread detection (used to determine when Crochet should shutdown) is now less fragile. This means Crochet now supports more environments, e.g. uWSGI. Thanks to Ben Picolo for the patch. 1.3.0 ^^^^^ Bug fixes: * It is now possible to call ``EventualResult.wait()`` (or functions wrapped in ``wait_for``) at import time if another thread holds the import lock. Thanks to Ken Struys for the patch. 1.2.0 ^^^^^ New features: * ``crochet.wait_for`` implements the timeout/cancellation pattern documented in previous versions of Crochet. ``crochet.wait_for_reactor`` and ``EventualResult.wait(timeout=None)`` are now deprecated, since lacking timeouts they could potentially block forever. * Functions wrapped with ``wait_for`` and ``run_in_reactor`` can now be accessed via the ``wrapped_function`` attribute, to ease unit testing of the underlying Twisted code. 
API changes: * It is no longer possible to call ``EventualResult.wait()`` (or functions wrapped with ``wait_for``) at import time, since this can lead to deadlocks or prevent other threads from importing. Thanks to Tom Prince for the bug report. Bug fixes: * ``warnings`` are no longer erroneously turned into Twisted log messages. * The reactor is now only imported when ``crochet.setup()`` or ``crochet.no_setup()`` are called, allowing daemonization if only ``crochet`` is imported (http://tm.tl/7105). Thanks to Daniel Nephin for the bug report. Documentation: * Improved motivation, added contact info and news to the documentation. * Better example of using Crochet from a normal Twisted application. 1.1.0 ^^^^^ Bug fixes: * ``EventualResult.wait()`` can now be used safely from multiple threads, thanks to Gavin Panella for reporting the bug. * Fixed reentrancy deadlock in the logging code caused by http://bugs.python.org/issue14976, thanks to Rod Morehead for reporting the bug. * Crochet now installs on Python 3.3 again, thanks to Ben Cordero. * Crochet should now work on Windows, thanks to Konstantinos Koukopoulos. * Crochet tests can now run without adding its absolute path to PYTHONPATH or installing it first. Documentation: * ``EventualResult.original_failure`` is now documented. 1.0.0 ^^^^^ Documentation: * Added section on use cases and alternatives. Thanks to Tobias Oberstein for the suggestion. Bug fixes: * Twisted does not have to be pre-installed to run ``setup.py``, thanks to Paul Weaver for bug report and Chris Scutcher for patch. * Importing Crochet does not have side-effects (installing reactor event) any more. * Blocking calls are interrupted earlier in the shutdown process, to reduce scope for deadlocks. Thanks to rmorehead for bug report. 0.9.0 ^^^^^ New features: * Expanded and much improved documentation, including a new section with design suggestions. * New decorator ``@wait_for_reactor`` added, a simpler alternative to ``@run_in_reactor``. * Refactored ``@run_in_reactor``, making it a bit more responsive. * Blocking operations which would otherwise never finish due to reactor having stopped (``EventualResult.wait()`` or ``@wait_for_reactor`` decorated call) will be interrupted with a ``ReactorStopped`` exception. Thanks to rmorehead for the bug report. Bug fixes: * ``@run_in_reactor`` decorated functions (or rather, their generated wrapper) are interrupted by Ctrl-C. * On POSIX platforms, a workaround is installed to ensure processes started by `reactor.spawnProcess` have their exit noticed. See `Twisted ticket 6378`_ for more details about the underlying issue. .. _Twisted ticket 6378: http://tm.tl/6738 0.8.1 ^^^^^ * ``EventualResult.wait()`` now raises error if called in the reactor thread, thanks to David Buchmann. * Unittests are now included in the release tarball. * Allow Ctrl-C to interrupt ``EventualResult.wait(timeout=None)``. 0.7.0 ^^^^^ * Improved documentation. 0.6.0 ^^^^^ * Renamed ``DeferredResult`` to ``EventualResult``, to reduce confusion with Twisted's ``Deferred`` class. The old name still works, but is deprecated. * Deprecated ``@in_reactor``, replaced with ``@run_in_reactor`` which doesn't change the arguments to the wrapped function. The deprecated API still works, however. * Unhandled exceptions in ``EventualResult`` objects are logged. * Added more examples. * ``setup.py sdist`` should work now. 0.5.0 ^^^^^ * Initial release. 
Keywords: twisted threading Platform: UNKNOWN Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2.6 Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3.3 Classifier: Programming Language :: Python :: 3.4 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Programming Language :: Python :: Implementation :: PyPy crochet-1.4.0/README.rst0000664000175000017500000000407012516747054016342 0ustar itamarstitamarst00000000000000Crochet: Use Twisted Anywhere! ============================== Crochet is an MIT-licensed library that makes it easier to use Twisted from regular blocking code. Some use cases include: * Easily use Twisted from a blocking framework like Django or Flask. * Write a library that provides a blocking API, but uses Twisted for its implementation. * Port blocking code to Twisted more easily, by keeping a backwards compatibility layer. * Allow normal Twisted programs that use threads to interact with Twisted more cleanly from their threaded parts. For example this can be useful when using Twisted as a `WSGI container`_. .. _WSGI container: https://twistedmatrix.com/documents/current/web/howto/web-in-60/wsgi.html Crochet is maintained by Itamar Turner-Trauring. Downloads are available on `PyPI`_. Documentation can be found on `Read The Docs`_. Bugs and feature requests should be filed at the project `Github page`_. .. _Read the Docs: https://crochet.readthedocs.org/ .. _Github page: https://github.com/itamarst/crochet/ .. _PyPI: https://pypi.python.org/pypi/crochet Features ======== Crochet aims for 100% unit test coverage, and supports Python 2.6, 2.7, 3.3 and 3.4 as well as PyPy. .. image:: https://travis-ci.org/itamarst/crochet.png?branch=master :target: http://travis-ci.org/itamarst/crochet :alt: Build Status Crochet provides the following general features: * Allow blocking code to call into Twisted and block until results are available or a timeout is hit, using the ``crochet.wait_for`` decorator. * A lower-level API (``crochet.run_in_reactor``) allows blocking code to run code "in the background" in the Twisted thread, with ability to repeatedly check if it's done. Additionally Crochet can: * Transparently start Twisted's reactor in a thread it manages. * The reactor shuts down automatically when the process' main thread finishes. * Hooks up Twisted's log system to the Python standard library ``logging`` framework. Unlike Twisted's built-in ``logging`` bridge, this includes support for blocking `Handler` instances. crochet-1.4.0/crochet/0000775000175000017500000000000012522534160016266 5ustar itamarstitamarst00000000000000crochet-1.4.0/crochet/tests/0000775000175000017500000000000012522534160017430 5ustar itamarstitamarst00000000000000crochet-1.4.0/crochet/tests/test_util.py0000664000175000017500000000327112127024027022016 0ustar itamarstitamarst00000000000000""" Tests for crochet._util. 
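
These tests exercise the ``synchronized`` decorator, which wraps a method in an
acquire/release of the instance's ``_lock``. A minimal sketch of the intended
usage pattern (``Counter`` is a hypothetical example, not part of crochet)::

    import threading
    from crochet._util import synchronized

    class Counter(object):
        def __init__(self):
            self._lock = threading.Lock()
            self.value = 0

        @synchronized
        def increment(self):
            # Runs with self._lock held; released on return or exception.
            self.value += 1
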
""" from __future__ import absolute_import from twisted.trial.unittest import TestCase from .._util import synchronized class FakeLock(object): locked = False def __enter__(self): self.locked = True def __exit__(self, type, value, traceback): self.locked = False class Lockable(object): def __init__(self): self._lock = FakeLock() @synchronized def check(self, x, y): if not self._lock.locked: raise RuntimeError() return x, y @synchronized def raiser(self): if not self._lock.locked: raise RuntimeError() raise ZeroDivisionError() class SynchronizedTests(TestCase): """ Tests for the synchronized decorator. """ def test_return(self): """ A method wrapped with @synchronized is called with the lock acquired, and it is released on return. """ obj = Lockable() self.assertEqual(obj.check(1, y=2), (1, 2)) self.assertFalse(obj._lock.locked) def test_raise(self): """ A method wrapped with @synchronized is called with the lock acquired, and it is released on exception raise. """ obj = Lockable() self.assertRaises(ZeroDivisionError, obj.raiser) self.assertFalse(obj._lock.locked) def test_name(self): """ A method wrapped with @synchronized preserves its name. """ self.assertEqual(Lockable.check.__name__, "check") def test_marked(self): """ A method wrapped with @synchronized is marked as synchronized. """ self.assertEqual(Lockable.check.synchronized, True) crochet-1.4.0/crochet/tests/test_resultstore.py0000664000175000017500000000446112342417332023442 0ustar itamarstitamarst00000000000000""" Tests for _resultstore. """ from twisted.trial.unittest import TestCase from twisted.internet.defer import Deferred, fail, succeed from .._resultstore import ResultStore from .._eventloop import EventualResult class ResultStoreTests(TestCase): """ Tests for ResultStore. """ def test_store_and_retrieve(self): """ EventualResult instances be be stored in a ResultStore and then retrieved using the id returned from store(). """ store = ResultStore() dr = EventualResult(Deferred(), None) uid = store.store(dr) self.assertIdentical(store.retrieve(uid), dr) def test_retrieve_only_once(self): """ Once a result is retrieved, it can no longer be retrieved again. """ store = ResultStore() dr = EventualResult(Deferred(), None) uid = store.store(dr) store.retrieve(uid) self.assertRaises(KeyError, store.retrieve, uid) def test_synchronized(self): """ store() and retrieve() are synchronized. """ self.assertTrue(ResultStore.store.synchronized) self.assertTrue(ResultStore.retrieve.synchronized) self.assertTrue(ResultStore.log_errors.synchronized) def test_uniqueness(self): """ Each store() operation returns a larger number, ensuring uniqueness. """ store = ResultStore() dr = EventualResult(Deferred(), None) previous = store.store(dr) for i in range(100): store.retrieve(previous) dr = EventualResult(Deferred(), None) uid = store.store(dr) self.assertTrue(uid > previous) previous = uid def test_log_errors(self): """ Unretrieved EventualResults have their errors, if any, logged on shutdown. 
""" store = ResultStore() store.store(EventualResult(Deferred(), None)) store.store(EventualResult(fail(ZeroDivisionError()), None)) store.store(EventualResult(succeed(1), None)) store.store(EventualResult(fail(RuntimeError()), None)) store.log_errors() excs = self.flushLoggedErrors(ZeroDivisionError) self.assertEqual(len(excs), 1) excs = self.flushLoggedErrors(RuntimeError) self.assertEqual(len(excs), 1) crochet-1.4.0/crochet/tests/test_api.py0000664000175000017500000007700712516747054021640 0ustar itamarstitamarst00000000000000""" Tests for the crochet APIs. """ from __future__ import absolute_import import threading import subprocess import time import gc import sys import weakref import tempfile import os import imp from twisted.trial.unittest import TestCase from twisted.internet.defer import succeed, Deferred, fail, CancelledError from twisted.python.failure import Failure from twisted.python import threadable from twisted.python.runtime import platform if platform.type == "posix": try: from twisted.internet.process import reapAllProcesses except (SyntaxError, ImportError): if sys.version_info < (3, 3, 0): raise else: # Process support is still not ported to Python 3 on some versions # of Twisted. reapAllProcesses = None else: # waitpid() is only necessary on POSIX: reapAllProcesses = None from .._eventloop import (EventLoop, EventualResult, TimeoutError, ResultRegistry, ReactorStopped) from .test_setup import FakeReactor from .. import (_main, setup, in_reactor, retrieve_result, _store, no_setup, run_in_reactor, wait_for_reactor, wait_for) from ..tests import crochet_directory class ResultRegistryTests(TestCase): """ Tests for ResultRegistry. """ def test_stopped_registered(self): """ ResultRegistery.stop() fires registered EventualResult with ReactorStopped. """ registry = ResultRegistry(FakeReactor()) er = EventualResult(None, None) registry.register(er) registry.stop() self.assertRaises(ReactorStopped, er.wait, timeout=0) def test_stopped_new_registration(self): """ After ResultRegistery.stop() is called subsequent register() calls raise ReactorStopped. """ registry = ResultRegistry(FakeReactor()) er = EventualResult(None, None) registry.stop() self.assertRaises(ReactorStopped, registry.register, er) def test_stopped_already_have_result(self): """ ResultRegistery.stop() has no impact on registered EventualResult which already have a result. """ registry = ResultRegistry(FakeReactor()) er = EventualResult(succeed(123), None) registry.register(er) registry.stop() self.assertEqual(er.wait(), 123) self.assertEqual(er.wait(), 123) self.assertEqual(er.wait(), 123) def test_weakref(self): """ Registering an EventualResult with a ResultRegistry does not prevent it from being garbage collected. """ registry = ResultRegistry(FakeReactor()) er = EventualResult(None, None) registry.register(er) ref = weakref.ref(er) del er gc.collect() self.assertIdentical(ref(), None) def test_runs_with_lock(self): """ All code in ResultRegistry.stop() and register() is protected by a lock. """ self.assertTrue(ResultRegistry.stop.synchronized) self.assertTrue(ResultRegistry.register.synchronized) class EventualResultTests(TestCase): """ Tests for EventualResult. """ def setUp(self): self.patch(threadable, "isInIOThread", lambda: False) def test_success_result(self): """ wait() returns the value the Deferred fired with. 
""" dr = EventualResult(succeed(123), None) self.assertEqual(dr.wait(), 123) def test_later_success_result(self): """ wait() returns the value the Deferred fired with, in the case where the Deferred is fired after wait() is called. """ d = Deferred() def fireSoon(): import time; time.sleep(0.01) d.callback(345) threading.Thread(target=fireSoon).start() dr = EventualResult(d, None) self.assertEqual(dr.wait(), 345) def test_success_result_twice(self): """ A second call to wait() returns same value as the first call. """ dr = EventualResult(succeed(123), None) self.assertEqual(dr.wait(), 123) self.assertEqual(dr.wait(), 123) def test_failure_result(self): """ wait() raises the exception the Deferred fired with. """ dr = EventualResult(fail(RuntimeError()), None) self.assertRaises(RuntimeError, dr.wait) def test_later_failure_result(self): """ wait() raises the exception the Deferred fired with, in the case where the Deferred is fired after wait() is called. """ d = Deferred() def fireSoon(): time.sleep(0.01) d.errback(RuntimeError()) threading.Thread(target=fireSoon).start() dr = EventualResult(d, None) self.assertRaises(RuntimeError, dr.wait) def test_failure_result_twice(self): """ A second call to wait() raises same value as the first call. """ dr = EventualResult(fail(ZeroDivisionError()), None) self.assertRaises(ZeroDivisionError, dr.wait) self.assertRaises(ZeroDivisionError, dr.wait) def test_timeout(self): """ If no result is available, wait(timeout) will throw a TimeoutError. """ start = time.time() dr = EventualResult(Deferred(), None) self.assertRaises(TimeoutError, dr.wait, timeout=0.03) self.assertTrue(abs(time.time() - start - 0.03) < 0.005) def test_timeout_twice(self): """ If no result is available, a second call to wait(timeout) will also result in a TimeoutError exception. """ dr = EventualResult(Deferred(), None) self.assertRaises(TimeoutError, dr.wait, timeout=0.01) self.assertRaises(TimeoutError, dr.wait, timeout=0.01) def test_timeout_then_result(self): """ If a result becomes available after a timeout, a second call to wait() will return it. """ d = Deferred() dr = EventualResult(d, None) self.assertRaises(TimeoutError, dr.wait, timeout=0.01) d.callback(u"value") self.assertEqual(dr.wait(), u"value") self.assertEqual(dr.wait(), u"value") def test_reactor_thread_disallowed(self): """ wait() cannot be called from the reactor thread. """ self.patch(threadable, "isInIOThread", lambda: True) d = Deferred() dr = EventualResult(d, None) self.assertRaises(RuntimeError, dr.wait, 0) def test_cancel(self): """ cancel() cancels the wrapped Deferred, running cancellation in the event loop thread. """ reactor = FakeReactor() cancelled = [] def error(f): cancelled.append(reactor.in_call_from_thread) cancelled.append(f) d = Deferred().addErrback(error) dr = EventualResult(d, _reactor=reactor) dr.cancel() self.assertTrue(cancelled[0]) self.assertIsInstance(cancelled[1].value, CancelledError) def test_stash(self): """ EventualResult.stash() stores the object in the global ResultStore. """ dr = EventualResult(Deferred(), None) uid = dr.stash() self.assertIdentical(dr, _store.retrieve(uid)) def test_original_failure(self): """ original_failure() returns the underlying Failure of the Deferred wrapped by the EventualResult. """ try: 1/0 except: f = Failure() dr = EventualResult(fail(f), None) self.assertIdentical(dr.original_failure(), f) def test_original_failure_no_result(self): """ If there is no result yet, original_failure() returns None. 
""" dr = EventualResult(Deferred(), None) self.assertIdentical(dr.original_failure(), None) def test_original_failure_not_error(self): """ If the result is not an error, original_failure() returns None. """ dr = EventualResult(succeed(3), None) self.assertIdentical(dr.original_failure(), None) def test_error_logged_no_wait(self): """ If the result is an error and wait() was never called, the error will be logged once the EventualResult is garbage-collected. """ dr = EventualResult(fail(ZeroDivisionError()), None) del dr gc.collect() excs = self.flushLoggedErrors(ZeroDivisionError) self.assertEqual(len(excs), 1) def test_error_logged_wait_timeout(self): """ If the result is an error and wait() was called but timed out, the error will be logged once the EventualResult is garbage-collected. """ d = Deferred() dr = EventualResult(d, None) try: dr.wait(0) except TimeoutError: pass d.errback(ZeroDivisionError()) del dr if sys.version_info[0] == 2: sys.exc_clear() gc.collect() excs = self.flushLoggedErrors(ZeroDivisionError) self.assertEqual(len(excs), 1) def test_error_after_gc_logged(self): """ If the result is an error that occurs after all user references to the EventualResult are lost, the error is still logged. """ d = Deferred() dr = EventualResult(d, None) del dr d.errback(ZeroDivisionError()) gc.collect() excs = self.flushLoggedErrors(ZeroDivisionError) self.assertEqual(len(excs), 1) def test_control_c_is_possible(self): """ If you're wait()ing on an EventualResult in main thread, make sure the KeyboardInterrupt happens in timely manner. """ program = """\ import os, threading, signal, time, sys import crochet crochet.setup() from twisted.internet.defer import Deferred if sys.platform.startswith('win'): signal.signal(signal.SIGBREAK, signal.default_int_handler) sig_int=signal.CTRL_BREAK_EVENT sig_kill=signal.SIGTERM else: sig_int=signal.SIGINT sig_kill=signal.SIGKILL def interrupt(): time.sleep(0.1) # Make sure we've hit wait() os.kill(os.getpid(), sig_int) time.sleep(1) # Still running, test shall fail... os.kill(os.getpid(), sig_kill) t = threading.Thread(target=interrupt) t.setDaemon(True) t.start() d = Deferred() e = crochet.EventualResult(d, None) try: # Queue.get() has special non-interruptible behavior if not given timeout, # so don't give timeout here. e.wait() except KeyboardInterrupt: sys.exit(23) """ kw = { 'cwd': crochet_directory } # on Windows the only way to interrupt a subprocess reliably is to # create a new process group: # http://docs.python.org/2/library/subprocess.html#subprocess.CREATE_NEW_PROCESS_GROUP if platform.type.startswith('win'): kw['creationflags'] = subprocess.CREATE_NEW_PROCESS_GROUP process = subprocess.Popen([sys.executable, "-c", program], **kw) self.assertEqual(process.wait(), 23) def test_connect_deferred(self): """ If an EventualResult is created with None, EventualResult._connect_deferred can be called later to register a Deferred as the one it is wrapping. """ er = EventualResult(None, None) self.assertRaises(TimeoutError, er.wait, 0) d = Deferred() er._connect_deferred(d) self.assertRaises(TimeoutError, er.wait, 0) d.callback(123) self.assertEqual(er.wait(), 123) def test_reactor_stop_unblocks_EventualResult(self): """ Any EventualResult.wait() calls still waiting when the reactor has stopped will get a ReactorStopped exception. 
""" program = """\ import os, threading, signal, time, sys from twisted.internet.defer import Deferred from twisted.internet import reactor import crochet crochet.setup() @crochet.run_in_reactor def run(): reactor.callLater(0.1, reactor.stop) return Deferred() er = run() try: er.wait(timeout=10) except crochet.ReactorStopped: sys.exit(23) """ process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory) self.assertEqual(process.wait(), 23) def test_reactor_stop_unblocks_EventualResult_in_threadpool(self): """ Any EventualResult.wait() calls still waiting when the reactor has stopped will get a ReactorStopped exception, even if it is running in Twisted's thread pool. """ program = """\ import os, threading, signal, time, sys from twisted.internet.defer import Deferred from twisted.internet import reactor import crochet crochet.setup() @crochet.run_in_reactor def run(): reactor.callLater(0.1, reactor.stop) return Deferred() result = [13] def inthread(): er = run() try: er.wait(timeout=10) except crochet.ReactorStopped: result[0] = 23 reactor.callInThread(inthread) time.sleep(1) sys.exit(result[0]) """ process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory) self.assertEqual(process.wait(), 23) def test_immediate_cancel(self): """ Immediately cancelling the result of @run_in_reactor function will still cancel the Deferred. """ # This depends on the way reactor runs callFromThread calls, so need # real functional test. program = """\ import os, threading, signal, time, sys from twisted.internet.defer import Deferred, CancelledError import crochet crochet.setup() @crochet.run_in_reactor def run(): return Deferred() er = run() er.cancel() try: er.wait(1) except CancelledError: sys.exit(23) else: sys.exit(3) """ process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory,) self.assertEqual(process.wait(), 23) def test_noWaitingDuringImport(self): """ EventualResult.wait() raises an exception if called while a module is being imported. This prevents the imports from taking a long time, preventing other imports from running in other threads. It also prevents deadlocks, which can happen if the code being waited on also tries to import something. """ if sys.version_info[0] > 2: from unittest import SkipTest raise SkipTest("This test is too fragile (and insufficient) on " "Python 3 - see " "https://github.com/itamarst/crochet/issues/43") directory = tempfile.mktemp() os.mkdir(directory) sys.path.append(directory) self.addCleanup(sys.path.remove, directory) with open(os.path.join(directory, "shouldbeunimportable.py"), "w") as f: f.write("""\ from crochet import EventualResult from twisted.internet.defer import Deferred EventualResult(Deferred(), None).wait(1.0) """) self.assertRaises(RuntimeError, __import__, "shouldbeunimportable") def test_waiting_during_different_thread_importing(self): """ EventualResult.wait() should work if called while a module is being imported in a different thread. See EventualResultTests.test_noWaitingDuringImport for the explanation of what should happen if an import is happening in the current thread. 
""" test_complete = threading.Event() lock_held = threading.Event() er = EventualResult(succeed(123), None) def other_thread(): imp.acquire_lock() lock_held.set() test_complete.wait() imp.release_lock() t = threading.Thread(target=other_thread) t.start() lock_held.wait() # While the imp lock is held by the other thread, we can't # allow exceptions/assertions to happen because trial will # try to do an import causing a deadlock instead of a # failure. We collect all assertion pairs (result, expected), # wait for the import lock to be released, and then check our # assertions at the end of the test. assertions = [] # we want to run .wait while the other thread has the lock acquired assertions.append((imp.lock_held(), True)) try: assertions.append((er.wait(), 123)) finally: test_complete.set() assertions.append((imp.lock_held(), True)) test_complete.set() t.join() [self.assertEqual(result, expected) for result, expected in assertions] class InReactorTests(TestCase): """ Tests for the deprecated in_reactor decorator. """ def test_name(self): """ The function decorated with in_reactor has the same name as the original function. """ c = EventLoop(lambda: FakeReactor(), lambda f, g: None) @c.in_reactor def some_name(reactor): pass self.assertEqual(some_name.__name__, "some_name") def test_in_reactor_thread(self): """ The function decorated with in_reactor is run in the reactor thread, and takes the reactor as its first argument. """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() calls = [] @c.in_reactor def func(reactor, a, b, c): self.assertIdentical(reactor, myreactor) self.assertTrue(reactor.in_call_from_thread) calls.append((a, b, c)) func(1, 2, c=3) self.assertEqual(calls, [(1, 2, 3)]) def test_run_in_reactor_wrapper(self): """ in_reactor is implemented on top of run_in_reactor. """ wrapped = [False] def fake_run_in_reactor(function): def wrapper(*args, **kwargs): wrapped[0] = True result = function(*args, **kwargs) wrapped[0] = False return result return wrapper myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() c.run_in_reactor = fake_run_in_reactor @c.in_reactor def func(reactor): self.assertTrue(wrapped[0]) return 17 result = func() self.assertFalse(wrapped[0]) self.assertEqual(result, 17) class RunInReactorTests(TestCase): """ Tests for the run_in_reactor decorator. """ def test_name(self): """ The function decorated with run_in_reactor has the same name as the original function. """ c = EventLoop(lambda: FakeReactor(), lambda f, g: None) @c.run_in_reactor def some_name(): pass self.assertEqual(some_name.__name__, "some_name") def test_run_in_reactor_thread(self): """ The function decorated with run_in_reactor is run in the reactor thread. """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() calls = [] @c.run_in_reactor def func(a, b, c): self.assertTrue(myreactor.in_call_from_thread) calls.append((a, b, c)) func(1, 2, c=3) self.assertEqual(calls, [(1, 2, 3)]) def make_wrapped_function(self): """ Return a function wrapped with run_in_reactor that returns its first argument. """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() @c.run_in_reactor def passthrough(argument): return argument return passthrough def test_deferred_success_result(self): """ If the underlying function returns a Deferred, the wrapper returns a EventualResult hooked up to the Deferred. 
""" passthrough = self.make_wrapped_function() result = passthrough(succeed(123)) self.assertIsInstance(result, EventualResult) self.assertEqual(result.wait(), 123) def test_deferred_failure_result(self): """ If the underlying function returns a Deferred, the wrapper returns a EventualResult hooked up to the Deferred that can deal with failures as well. """ passthrough = self.make_wrapped_function() result = passthrough(fail(ZeroDivisionError())) self.assertIsInstance(result, EventualResult) self.assertRaises(ZeroDivisionError, result.wait) def test_regular_result(self): """ If the underlying function returns a non-Deferred, the wrapper returns a EventualResult hooked up to a Deferred wrapping the result. """ passthrough = self.make_wrapped_function() result = passthrough(123) self.assertIsInstance(result, EventualResult) self.assertEqual(result.wait(), 123) def test_exception_result(self): """ If the underlying function throws an exception, the wrapper returns a EventualResult hooked up to a Deferred wrapping the exception. """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() @c.run_in_reactor def raiser(): 1/0 result = raiser() self.assertIsInstance(result, EventualResult) self.assertRaises(ZeroDivisionError, result.wait) def test_registry(self): """ @run_in_reactor registers the EventualResult in the ResultRegistry. """ myreactor = FakeReactor() c = EventLoop(lambda: myreactor, lambda f, g: None) c.no_setup() @c.run_in_reactor def run(): return result = run() self.assertIn(result, c._registry._results) def test_wrapped_function(self): """ The function wrapped by @run_in_reactor can be accessed via the `wrapped_function` attribute. """ c = EventLoop(lambda: None, lambda f, g: None) def func(): pass wrapper = c.run_in_reactor(func) self.assertIdentical(wrapper.wrapped_function, func) class WaitTestsMixin(object): """ Tests mixin for the wait_for_reactor/wait_for decorators. """ def setUp(self): self.reactor = FakeReactor() self.eventloop = EventLoop(lambda: self.reactor, lambda f, g: None) self.eventloop.no_setup() def decorator(self): """ Return a callable that decorates a function, using the decorator being tested. """ raise NotImplementedError() def make_wrapped_function(self): """ Return a function wrapped with the decorator being tested that returns its first argument, or raises it if it's an exception. """ decorator = self.decorator() @decorator def passthrough(argument): if isinstance(argument, Exception): raise argument return argument return passthrough def test_name(self): """ The function decorated with the wait decorator has the same name as the original function. """ decorator = self.decorator() @decorator def some_name(argument): pass self.assertEqual(some_name.__name__, "some_name") def test_wrapped_function(self): """ The function wrapped by the wait decorator can be accessed via the `wrapped_function` attribute. """ decorator = self.decorator() def func(): pass wrapper = decorator(func) self.assertIdentical(wrapper.wrapped_function, func) def test_reactor_thread_disallowed(self): """ Functions decorated with the wait decorator cannot be called from the reactor thread. """ self.patch(threadable, "isInIOThread", lambda: True) f = self.make_wrapped_function() self.assertRaises(RuntimeError, f, None) def test_wait_for_reactor_thread(self): """ The function decorated with the wait decorator is run in the reactor thread. 
""" in_call_from_thread = [] decorator = self.decorator() @decorator def func(): in_call_from_thread.append(self.reactor.in_call_from_thread) in_call_from_thread.append(self.reactor.in_call_from_thread) func() in_call_from_thread.append(self.reactor.in_call_from_thread) self.assertEqual(in_call_from_thread, [False, True, False]) def test_arguments(self): """ The function decorated with wait decorator gets all arguments passed to the wrapper. """ calls = [] decorator = self.decorator() @decorator def func(a, b, c): calls.append((a, b, c)) func(1, 2, c=3) self.assertEqual(calls, [(1, 2, 3)]) def test_deferred_success_result(self): """ If the underlying function returns a Deferred, the wrapper returns a the Deferred's result. """ passthrough = self.make_wrapped_function() result = passthrough(succeed(123)) self.assertEqual(result, 123) def test_deferred_failure_result(self): """ If the underlying function returns a Deferred with an errback, the wrapper throws an exception. """ passthrough = self.make_wrapped_function() self.assertRaises( ZeroDivisionError, passthrough, fail(ZeroDivisionError())) def test_regular_result(self): """ If the underlying function returns a non-Deferred, the wrapper returns that result. """ passthrough = self.make_wrapped_function() result = passthrough(123) self.assertEqual(result, 123) def test_exception_result(self): """ If the underlying function throws an exception, the wrapper raises that exception. """ raiser = self.make_wrapped_function() self.assertRaises(ZeroDivisionError, raiser, ZeroDivisionError()) def test_control_c_is_possible(self): """ A call to a decorated function responds to a Ctrl-C (i.e. with a KeyboardInterrupt) in a timely manner. """ program = """\ import os, threading, signal, time, sys import crochet crochet.setup() from twisted.internet.defer import Deferred if sys.platform.startswith('win'): signal.signal(signal.SIGBREAK, signal.default_int_handler) sig_int=signal.CTRL_BREAK_EVENT sig_kill=signal.SIGTERM else: sig_int=signal.SIGINT sig_kill=signal.SIGKILL def interrupt(): time.sleep(0.1) # Make sure we've hit wait() os.kill(os.getpid(), sig_int) time.sleep(1) # Still running, test shall fail... os.kill(os.getpid(), sig_kill) t = threading.Thread(target=interrupt) t.setDaemon(True) t.start() @crochet.%s def wait(): return Deferred() try: wait() except KeyboardInterrupt: sys.exit(23) """ % (self.DECORATOR_CALL,) kw = { 'cwd': crochet_directory } if platform.type.startswith('win'): kw['creationflags'] = subprocess.CREATE_NEW_PROCESS_GROUP process = subprocess.Popen([sys.executable, "-c", program], **kw) self.assertEqual(process.wait(), 23) def test_reactor_stop_unblocks(self): """ Any @wait_for_reactor-decorated calls still waiting when the reactor has stopped will get a ReactorStopped exception. """ program = """\ import os, threading, signal, time, sys from twisted.internet.defer import Deferred from twisted.internet import reactor import crochet crochet.setup() @crochet.%s def run(): reactor.callLater(0.1, reactor.stop) return Deferred() try: er = run() except crochet.ReactorStopped: sys.exit(23) """ % (self.DECORATOR_CALL,) process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory) self.assertEqual(process.wait(), 23) class WaitForReactorTests(WaitTestsMixin, TestCase): """ Tests for the wait_for_reactor decorator. """ DECORATOR_CALL = "wait_for_reactor" def decorator(self): return self.eventloop.wait_for_reactor class WaitForTests(WaitTestsMixin, TestCase): """ Tests for the wait_for_reactor decorator. 
""" DECORATOR_CALL = "wait_for(timeout=5)" def decorator(self): return lambda func: self.eventloop.wait_for(timeout=5)(func) def test_timeoutRaises(self): """ If a function wrapped with wait_for hits the timeout, it raises TimeoutError. """ @self.eventloop.wait_for(timeout=0.5) def times_out(): return Deferred().addErrback(lambda f: f.trap(CancelledError)) start = time.time() self.assertRaises(TimeoutError, times_out) self.assertTrue(abs(time.time() - start - 0.5) < 0.1) def test_timeoutCancels(self): """ If a function wrapped with wait_for hits the timeout, it cancels the underlying Deferred. """ result = Deferred() error = [] result.addErrback(error.append) @self.eventloop.wait_for(timeout=0.0) def times_out(): return result self.assertRaises(TimeoutError, times_out) self.assertIsInstance(error[0].value, CancelledError) class PublicAPITests(TestCase): """ Tests for the public API. """ def test_no_sideeffects(self): """ Creating an EventLoop object, as is done in crochet.__init__, does not call any methods on the objects it is created with. """ c = EventLoop(lambda: None, lambda f, g: 1/0, lambda *args: 1/0, watchdog_thread=object(), reapAllProcesses=lambda: 1/0) del c def test_eventloop_api(self): """ An EventLoop object configured with the real reactor and _shutdown.register is exposed via its public methods. """ from twisted.python.log import startLoggingWithObserver from crochet import _shutdown self.assertIsInstance(_main, EventLoop) self.assertEqual(_main.setup, setup) self.assertEqual(_main.no_setup, no_setup) self.assertEqual(_main.in_reactor, in_reactor) self.assertEqual(_main.run_in_reactor, run_in_reactor) self.assertEqual(_main.wait_for_reactor, wait_for_reactor) self.assertEqual(_main.wait_for, wait_for) self.assertIdentical(_main._atexit_register, _shutdown.register) self.assertIdentical(_main._startLoggingWithObserver, startLoggingWithObserver) self.assertIdentical(_main._watchdog_thread, _shutdown._watchdog) def test_eventloop_api_reactor(self): """ The publicly exposed EventLoop will, when setup, use the global reactor. """ from twisted.internet import reactor _main.no_setup() self.assertIdentical(_main._reactor, reactor) def test_retrieve_result(self): """ retrieve_result() calls retrieve() on the global ResultStore. """ dr = EventualResult(Deferred(), None) uid = dr.stash() self.assertIdentical(dr, retrieve_result(uid)) def test_reapAllProcesses(self): """ An EventLoop object configured with the real reapAllProcesses on POSIX plaforms. """ self.assertIdentical(_main._reapAllProcesses, reapAllProcesses) if platform.type != "posix": test_reapAllProcesses.skip = "Only relevant on POSIX platforms" if reapAllProcesses is None: test_reapAllProcesses.skip = "Twisted does not yet support processes" crochet-1.4.0/crochet/tests/test_shutdown.py0000664000175000017500000000647412264332676022742 0ustar itamarstitamarst00000000000000""" Tests for _shutdown. """ from __future__ import absolute_import import sys import subprocess import time from twisted.trial.unittest import TestCase from crochet._shutdown import (Watchdog, FunctionRegistry, _watchdog, register, _registry) from ..tests import crochet_directory class ShutdownTests(TestCase): """ Tests for shutdown registration. """ def test_shutdown(self): """ A function registered with _shutdown.register() is called when the main thread exits. 
""" program = """\ import threading, sys from crochet._shutdown import register, _watchdog _watchdog.start() end = False def thread(): while not end: pass sys.stdout.write("byebye") sys.stdout.flush() def stop(x, y): # Move this into separate test at some point. assert x == 1 assert y == 2 global end end = True threading.Thread(target=thread).start() register(stop, 1, y=2) sys.exit() """ process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory, stdout=subprocess.PIPE) result = process.stdout.read() self.assertEqual(process.wait(), 0) self.assertEqual(result, b"byebye") def test_watchdog(self): """ The watchdog thread exits when the thread it is watching exits, and calls its shutdown function. """ done = [] alive = True class FakeThread: def is_alive(self): return alive w = Watchdog(FakeThread(), lambda: done.append(True)) w.start() time.sleep(0.2) self.assertTrue(w.is_alive()) self.assertFalse(done) alive = False time.sleep(0.2) self.assertTrue(done) self.assertFalse(w.is_alive()) def test_api(self): """ The module exposes a shutdown thread that will call a global registry's run(), and a register function tied to the global registry. """ self.assertIsInstance(_registry, FunctionRegistry) self.assertEqual(register, _registry.register) self.assertIsInstance(_watchdog, Watchdog) self.assertEqual(_watchdog._shutdown_function, _registry.run) class FunctionRegistryTests(TestCase): """ Tests for FunctionRegistry. """ def test_called(self): """ Functions registered with a FunctionRegistry are called in reverse order by run(). """ result = [] registry = FunctionRegistry() registry.register(lambda: result.append(1)) registry.register(lambda x: result.append(x), 2) registry.register(lambda y: result.append(y), y=3) registry.run() self.assertEqual(result, [3, 2, 1]) def test_log_errors(self): """ Registered functions that raise an error have the error logged, and run() continues processing. """ result = [] registry = FunctionRegistry() registry.register(lambda: result.append(2)) registry.register(lambda: 1/0) registry.register(lambda: result.append(1)) registry.run() self.assertEqual(result, [1, 2]) excs = self.flushLoggedErrors(ZeroDivisionError) self.assertEqual(len(excs), 1) crochet-1.4.0/crochet/tests/test_setup.py0000664000175000017500000002523512342431664022215 0ustar itamarstitamarst00000000000000""" Tests for the initial setup. """ from __future__ import absolute_import import threading import warnings import subprocess import sys from twisted.trial.unittest import TestCase from twisted.python.log import PythonLoggingObserver from twisted.python import log from twisted.python.runtime import platform from twisted.python import threadable from twisted.internet.task import Clock from .._eventloop import EventLoop, ThreadLogObserver, _store from ..tests import crochet_directory class FakeReactor(Clock): """ A fake reactor for testing purposes. 
""" thread_id = None runs = 0 in_call_from_thread = False def __init__(self): Clock.__init__(self) self.started = threading.Event() self.stopping = False self.events = [] def run(self, installSignalHandlers=True): self.runs += 1 self.thread_id = threading.current_thread().ident self.installSignalHandlers = installSignalHandlers self.started.set() def callFromThread(self, f, *args, **kwargs): self.in_call_from_thread = True f(*args, **kwargs) self.in_call_from_thread = False def stop(self): self.stopping = True def addSystemEventTrigger(self, when, event, f): self.events.append((when, event, f)) class FakeThread: started = False def start(self): self.started = True class SetupTests(TestCase): """ Tests for setup(). """ def test_first_runs_reactor(self): """ With it first call, setup() runs the reactor in a thread. """ reactor = FakeReactor() EventLoop(lambda: reactor, lambda f, *g: None).setup() reactor.started.wait(5) self.assertNotEqual(reactor.thread_id, None) self.assertNotEqual(reactor.thread_id, threading.current_thread().ident) self.assertFalse(reactor.installSignalHandlers) def test_second_does_nothing(self): """ The second call to setup() does nothing. """ reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *g: None) s.setup() s.setup() reactor.started.wait(5) self.assertEqual(reactor.runs, 1) def test_stop_on_exit(self): """ setup() registers an exit handler that stops the reactor, and an exit handler that logs stashed EventualResults. """ atexit = [] reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *args: atexit.append((f, args))) s.setup() self.assertEqual(len(atexit), 2) self.assertFalse(reactor.stopping) f, args = atexit[0] self.assertEqual(f, reactor.callFromThread) self.assertEqual(args, (reactor.stop,)) f(*args) self.assertTrue(reactor.stopping) f, args = atexit[1] self.assertEqual(f, _store.log_errors) self.assertEqual(args, ()) f(*args) # make sure it doesn't throw an exception def test_runs_with_lock(self): """ All code in setup() and no_setup() is protected by a lock. """ self.assertTrue(EventLoop.setup.synchronized) self.assertTrue(EventLoop.no_setup.synchronized) def test_logging(self): """ setup() registers a PythonLoggingObserver wrapped in a ThreadLogObserver, removing the default log observer. """ logging = [] def fakeStartLoggingWithObserver(observer, setStdout=1): self.assertIsInstance(observer, ThreadLogObserver) wrapped = observer._observer expected = PythonLoggingObserver.emit # Python 3 and 2 differ in value of __func__: expected = getattr(expected, "__func__", expected) self.assertIdentical(wrapped.__func__, expected) self.assertEqual(setStdout, False) self.assertTrue(reactor.in_call_from_thread) logging.append(observer) reactor = FakeReactor() loop = EventLoop(lambda: reactor, lambda f, *g: None, fakeStartLoggingWithObserver) loop.setup() self.assertTrue(logging) logging[0].stop() def test_stop_logging_on_exit(self): """ setup() registers a reactor shutdown event that stops the logging thread. """ observers = [] reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *arg: None, lambda observer, setStdout=1: observers.append(observer)) s.setup() self.addCleanup(observers[0].stop) self.assertIn(("after", "shutdown", observers[0].stop), reactor.events) def test_warnings_untouched(self): """ setup() ensure the warnings module's showwarning is unmodified, overriding the change made by normal Twisted logging setup. 
""" def fakeStartLoggingWithObserver(observer, setStdout=1): warnings.showwarning = log.showwarning self.addCleanup(observer.stop) original = warnings.showwarning reactor = FakeReactor() loop = EventLoop(lambda: reactor, lambda f, *g: None, fakeStartLoggingWithObserver) loop.setup() self.assertIdentical(warnings.showwarning, original) def test_start_watchdog_thread(self): """ setup() starts the shutdown watchdog thread. """ thread = FakeThread() reactor = FakeReactor() loop = EventLoop(lambda: reactor, lambda *args: None, watchdog_thread=thread) loop.setup() self.assertTrue(thread.started) def test_no_setup(self): """ If called first, no_setup() makes subsequent calls to setup() do nothing. """ observers = [] atexit = [] thread = FakeThread() reactor = FakeReactor() loop = EventLoop(lambda: reactor, lambda f, *arg: atexit.append(f), lambda observer, *a, **kw: observers.append(observer), watchdog_thread=thread) loop.no_setup() loop.setup() self.assertFalse(observers) self.assertFalse(atexit) self.assertFalse(reactor.runs) self.assertFalse(thread.started) def test_no_setup_after_setup(self): """ If called after setup(), no_setup() throws an exception. """ reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *g: None) s.setup() self.assertRaises(RuntimeError, s.no_setup) def test_setup_registry_shutdown(self): """ ResultRegistry.stop() is registered to run before reactor shutdown by setup(). """ reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *g: None) s.setup() self.assertEqual(reactor.events, [("before", "shutdown", s._registry.stop)]) def test_no_setup_registry_shutdown(self): """ ResultRegistry.stop() is registered to run before reactor shutdown by setup(). """ reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *g: None) s.no_setup() self.assertEqual(reactor.events, [("before", "shutdown", s._registry.stop)]) class ProcessSetupTests(TestCase): """ setup() enables support for IReactorProcess on POSIX plaforms. """ def test_posix(self): """ On POSIX systems, setup() installs a LoopingCall that runs t.i.process.reapAllProcesses() 10 times a second. """ reactor = FakeReactor() reaps = [] s = EventLoop(lambda: reactor, lambda f, *g: None, reapAllProcesses=lambda: reaps.append(1)) s.setup() reactor.advance(0.1) self.assertEquals(reaps, [1]) reactor.advance(0.1) self.assertEquals(reaps, [1, 1]) reactor.advance(0.1) self.assertEquals(reaps, [1, 1, 1]) if platform.type != "posix": test_posix.skip = "SIGCHLD is a POSIX-specific issue" def test_non_posix(self): """ On POSIX systems, setup() does not install a LoopingCall. """ reactor = FakeReactor() s = EventLoop(lambda: reactor, lambda f, *g: None) s.setup() self.assertFalse(reactor.getDelayedCalls()) if platform.type == "posix": test_non_posix.skip = "SIGCHLD is a POSIX-specific issue" class ThreadLogObserverTest(TestCase): """ Tests for ThreadLogObserver. """ def test_stop(self): """ ThreadLogObserver.stop() stops the thread started in __init__. """ threadLog = ThreadLogObserver(None) self.assertTrue(threadLog._thread.is_alive()) threadLog.stop() threadLog._thread.join() self.assertFalse(threadLog._thread.is_alive()) def test_emit(self): """ ThreadLogObserver.emit runs the wrapped observer's in its thread, with the given message. 
""" messages = [] def observer(msg): messages.append((threading.current_thread().ident, msg)) threadLog = ThreadLogObserver(observer) ident = threadLog._thread.ident msg1 = {} msg2 = {"a": "b"} threadLog(msg1) threadLog(msg2) threadLog.stop() # Wait for writing to finish: threadLog._thread.join() self.assertEqual(messages, [(ident, msg1), (ident, msg2)]) def test_ioThreadUnchanged(self): """ ThreadLogObserver does not change the Twisted I/O thread (which is supposed to match the thread the main reactor is running in.) """ threadLog = ThreadLogObserver(None) threadLog.stop() threadLog._thread.join() self.assertIn(threadable.ioThread, # Either reactor was never run, or run in thread running # the tests: (None, threading.current_thread().ident)) class ReactorImportTests(TestCase): """ Tests for when the reactor gets imported. The reactor should only be imported as part of setup()/no_setup(), rather than as side-effect of Crochet import, since daemonization doesn't work if reactor is imported (https://twistedmatrix.com/trac/ticket/7105). """ def test_crochet_import_no_reactor(self): """ Importing crochet should not import the reactor. """ program = """\ import sys import crochet if "twisted.internet.reactor" not in sys.modules: sys.exit(23) """ process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory) self.assertEqual(process.wait(), 23) crochet-1.4.0/crochet/tests/test_process.py0000664000175000017500000000372512264332653022534 0ustar itamarstitamarst00000000000000""" Tests for IReactorProcess. """ import subprocess import sys from twisted.trial.unittest import TestCase from twisted.python.runtime import platform from ..tests import crochet_directory class ProcessTests(TestCase): """ Tests for process support. """ def test_processExit(self): """ A Crochet-managed reactor notice when a process it started exits. On POSIX platforms this requies waitpid() to be called, which in default Twisted implementation relies on a SIGCHLD handler which is not installed by Crochet at the moment. """ program = """\ from crochet import setup, run_in_reactor setup() import sys import os from twisted.internet.protocol import ProcessProtocol from twisted.internet.defer import Deferred from twisted.internet import reactor class Waiter(ProcessProtocol): def __init__(self): self.result = Deferred() def processExited(self, reason): self.result.callback(None) @run_in_reactor def run(): waiter = Waiter() # Closing FDs before exit forces us to rely on SIGCHLD to notice process # exit: reactor.spawnProcess(waiter, sys.executable, [sys.executable, '-c', 'import os; os.close(0); os.close(1); os.close(2)'], env=os.environ) return waiter.result run().wait(10) # If we don't notice process exit, TimeoutError will be thrown and we won't # reach the next line: sys.stdout.write("abc") """ process = subprocess.Popen([sys.executable, "-c", program], cwd=crochet_directory, stdout=subprocess.PIPE) result = process.stdout.read() self.assertEqual(result, b"abc") if platform.type != "posix": test_processExit.skip = "SIGCHLD is a POSIX-specific issue" if sys.version_info >= (3, 0, 0): test_processExit.skip = "Twisted does not support processes on Python 3 yet" crochet-1.4.0/crochet/tests/__init__.py0000664000175000017500000000027612264332721021550 0ustar itamarstitamarst00000000000000""" Crochet tests. 
""" from twisted.python.filepath import FilePath # Directory where the crochet package is stored: crochet_directory = FilePath(__file__).parent().parent().parent().path crochet-1.4.0/crochet/_util.py0000664000175000017500000000055612127024027017757 0ustar itamarstitamarst00000000000000""" Utility functions and classes. """ from functools import wraps def synchronized(method): """ Decorator that wraps a method with an acquire/release of self._lock. """ @wraps(method) def synced(self, *args, **kwargs): with self._lock: return method(self, *args, **kwargs) synced.synchronized = True return synced crochet-1.4.0/crochet/_version.py0000664000175000017500000000064112522534160020465 0ustar itamarstitamarst00000000000000 # This file was generated by 'versioneer.py' (0.10) from # revision-control system data, or from the parent directory name of an # unpacked source archive. Distribution tarballs contain a pre-generated copy # of this file. version_version = '1.4.0' version_full = 'ecc17a072aa8804c73e4ab8af1b3528cdd206f61' def get_versions(default={}, verbose=False): return {'version': version_version, 'full': version_full} crochet-1.4.0/crochet/_eventloop.py0000664000175000017500000004124312514553127021023 0ustar itamarstitamarst00000000000000""" Expose Twisted's event loop to threaded programs. """ from __future__ import absolute_import import select import threading import weakref import warnings from functools import wraps import imp from twisted.python import threadable from twisted.python.runtime import platform from twisted.python.failure import Failure from twisted.python.log import PythonLoggingObserver, err from twisted.internet.defer import maybeDeferred from twisted.internet.task import LoopingCall from ._util import synchronized from ._resultstore import ResultStore _store = ResultStore() if hasattr(weakref, "WeakSet"): WeakSet = weakref.WeakSet else: class WeakSet(object): """ Minimal WeakSet emulation. """ def __init__(self): self._items = weakref.WeakKeyDictionary() def add(self, value): self._items[value] = True def __iter__(self): return iter(self._items) class TimeoutError(Exception): """ A timeout has been hit. """ class ReactorStopped(Exception): """ The reactor has stopped, and therefore no result will ever become available from this EventualResult. """ class ResultRegistry(object): """ Keep track of EventualResults. Once the reactor has shutdown: 1. Registering new EventualResult instances is an error, since no results will ever become available. 2. Already registered EventualResult instances are "fired" with a ReactorStopped exception to unblock any remaining EventualResult.wait() calls. """ def __init__(self, reactor): self._results = WeakSet() self._stopped = False self._lock = threading.Lock() @synchronized def register(self, result): """ Register an EventualResult. May be called in any thread. """ if self._stopped: raise ReactorStopped() self._results.add(result) @synchronized def stop(self): """ Indicate no more results will get pushed into EventualResults, since the reactor has stopped. This should be called in the reactor thread. """ self._stopped = True for result in self._results: result._set_result(Failure(ReactorStopped())) class EventualResult(object): """ A blocking interface to Deferred results. This allows you to access results from Twisted operations that may not be available immediately, using the wait() method. In general you should not create these directly; instead use functions decorated with @run_in_reactor. 
""" def __init__(self, deferred, _reactor): """ The deferred parameter should be a Deferred or None indicating _connect_deferred will be called separately later. """ self._deferred = deferred self._reactor = _reactor self._value = None self._result_retrieved = False self._result_set = threading.Event() if deferred is not None: self._connect_deferred(deferred) def _connect_deferred(self, deferred): """ Hook up the Deferred that that this will be the result of. Should only be run in Twisted thread, and only called once. """ self._deferred = deferred # Because we use __del__, we need to make sure there are no cycles # involving this object, which is why we use a weakref: def put(result, eventual=weakref.ref(self)): eventual = eventual() if eventual: eventual._set_result(result) else: err(result, "Unhandled error in EventualResult") deferred.addBoth(put) def _set_result(self, result): """ Set the result of the EventualResult, if not already set. This can only happen in the reactor thread, either as a result of Deferred firing, or as a result of ResultRegistry.stop(). So, no need for thread-safety. """ if self._result_set.isSet(): return self._value = result self._result_set.set() def __del__(self): if self._result_retrieved or not self._result_set.isSet(): return if isinstance(self._value, Failure): err(self._value, "Unhandled error in EventualResult") def cancel(self): """ Try to cancel the operation by cancelling the underlying Deferred. Cancellation of the operation may or may not happen depending on underlying cancellation support and whether the operation has already finished. In any case, however, the underlying Deferred will be fired. Multiple calls will have no additional effect. """ self._reactor.callFromThread(lambda: self._deferred.cancel()) def _result(self, timeout=None): """ Return the result, if available. It may take an unknown amount of time to return the result, so a timeout option is provided. If the given number of seconds pass with no result, a TimeoutError will be thrown. If a previous call timed out, additional calls to this function will still wait for a result and return it if available. If a result was returned on one call, additional calls will return/raise the same result. """ if timeout is None: warnings.warn("Unlimited timeouts are deprecated.", DeprecationWarning, stacklevel=3) # Queue.get(None) won't get interrupted by Ctrl-C... timeout = 2 ** 31 self._result_set.wait(timeout) # In Python 2.6 we can't rely on the return result of wait(), so we # have to check manually: if not self._result_set.is_set(): raise TimeoutError() self._result_retrieved = True return self._value def wait(self, timeout=None): """ Return the result, or throw the exception if result is a failure. It may take an unknown amount of time to return the result, so a timeout option is provided. If the given number of seconds pass with no result, a TimeoutError will be thrown. If a previous call timed out, additional calls to this function will still wait for a result and return it if available. If a result was returned or raised on one call, additional calls will return/raise the same result. """ if threadable.isInIOThread(): raise RuntimeError( "EventualResult.wait() must not be run in the reactor thread.") if imp.lock_held(): try: imp.release_lock() except RuntimeError: # The lock is held by some other thread. We should be safe # to continue. 
pass else: # If EventualResult.wait() is run during module import, if the # Twisted code that is being run also imports something the result # will be a deadlock. Even if that is not an issue it would # prevent importing in other threads until the call returns. raise RuntimeError( "EventualResult.wait() must not be run at module import time.") result = self._result(timeout) if isinstance(result, Failure): result.raiseException() return result def stash(self): """ Store the EventualResult in memory for later retrieval. Returns a integer uid which can be passed to crochet.retrieve_result() to retrieve the instance later on. """ return _store.store(self) def original_failure(self): """ Return the underlying Failure object, if the result is an error. If no result is yet available, or the result was not an error, None is returned. This method is useful if you want to get the original traceback for an error result. """ try: result = self._result(0.0) except TimeoutError: return None if isinstance(result, Failure): return result else: return None class ThreadLogObserver(object): """ A log observer that wraps another observer, and calls it in a thread. In particular, used to wrap PythonLoggingObserver, so that blocking logging.py Handlers don't block the event loop. """ def __init__(self, observer): self._observer = observer if getattr(select, "poll", None): from twisted.internet.pollreactor import PollReactor reactorFactory = PollReactor else: from twisted.internet.selectreactor import SelectReactor reactorFactory = SelectReactor self._logWritingReactor = reactorFactory() self._logWritingReactor._registerAsIOThread = False self._thread = threading.Thread(target=self._reader, name="CrochetLogWriter") self._thread.start() def _reader(self): """ Runs in a thread, reads messages from a queue and writes them to the wrapped observer. """ self._logWritingReactor.run(installSignalHandlers=False) def stop(self): """ Stop the thread. """ self._logWritingReactor.callFromThread(self._logWritingReactor.stop) def __call__(self, msg): """ A log observer that writes to a queue. """ self._logWritingReactor.callFromThread(self._observer, msg) class EventLoop(object): """ Initialization infrastructure for running a reactor in a thread. """ def __init__(self, reactorFactory, atexit_register, startLoggingWithObserver=None, watchdog_thread=None, reapAllProcesses=None): """ reactorFactory: Zero-argument callable that returns a reactor. atexit_register: atexit.register, or look-alike. startLoggingWithObserver: Either None, or twisted.python.log.startLoggingWithObserver or lookalike. watchdog_thread: crochet._shutdown.Watchdog instance, or None. reapAllProcesses: twisted.internet.process.reapAllProcesses or lookalike. """ self._reactorFactory = reactorFactory self._atexit_register = atexit_register self._startLoggingWithObserver = startLoggingWithObserver self._started = False self._lock = threading.Lock() self._watchdog_thread = watchdog_thread self._reapAllProcesses = reapAllProcesses def _startReapingProcesses(self): """ Start a LoopingCall that calls reapAllProcesses. """ lc = LoopingCall(self._reapAllProcesses) lc.clock = self._reactor lc.start(0.1, False) def _common_setup(self): """ The minimal amount of setup done by both setup() and no_setup(). 
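        Specifically, this creates the reactor via the configured factory,
        sets up the ResultRegistry, and registers a "before shutdown" trigger
        so that waiting EventualResult instances are unblocked when the
        reactor stops.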
""" self._started = True self._reactor = self._reactorFactory() self._registry = ResultRegistry(self._reactor) # We want to unblock EventualResult regardless of how the reactor is # run, so we always register this: self._reactor.addSystemEventTrigger( "before", "shutdown", self._registry.stop) @synchronized def setup(self): """ Initialize the crochet library. This starts the reactor in a thread, and connect's Twisted's logs to Python's standard library logging module. This must be called at least once before the library can be used, and can be called multiple times. """ if self._started: return self._common_setup() if platform.type == "posix": self._reactor.callFromThread(self._startReapingProcesses) if self._startLoggingWithObserver: observer = ThreadLogObserver(PythonLoggingObserver().emit) def start(): # Twisted is going to override warnings.showwarning; let's # make sure that has no effect: from twisted.python import log original = log.showwarning log.showwarning = warnings.showwarning self._startLoggingWithObserver(observer, False) log.showwarning = original self._reactor.callFromThread(start) # We only want to stop the logging thread once the reactor has # shut down: self._reactor.addSystemEventTrigger("after", "shutdown", observer.stop) t = threading.Thread( target=lambda: self._reactor.run(installSignalHandlers=False), name="CrochetReactor") t.start() self._atexit_register(self._reactor.callFromThread, self._reactor.stop) self._atexit_register(_store.log_errors) if self._watchdog_thread is not None: self._watchdog_thread.start() @synchronized def no_setup(self): """ Initialize the crochet library with no side effects. No reactor will be started, logging is uneffected, etc.. Future calls to setup() will have no effect. This is useful for applications that intend to run Twisted's reactor themselves, and so do not want libraries using crochet to attempt to start it on their own. If no_setup() is called after setup(), a RuntimeError is raised. """ if self._started: raise RuntimeError("no_setup() is intended to be called once, by a" " Twisted application, before any libraries " "using crochet are imported and call setup().") self._common_setup() def run_in_reactor(self, function): """ A decorator that ensures the wrapped function runs in the reactor thread. When the wrapped function is called, an EventualResult is returned. """ def runs_in_reactor(result, args, kwargs): d = maybeDeferred(function, *args, **kwargs) result._connect_deferred(d) @wraps(function) def wrapper(*args, **kwargs): result = EventualResult(None, self._reactor) self._registry.register(result) self._reactor.callFromThread(runs_in_reactor, result, args, kwargs) return result wrapper.wrapped_function = function return wrapper def wait_for_reactor(self, function): """ DEPRECATED, use wait_for(timeout) instead. A decorator that ensures the wrapped function runs in the reactor thread. When the wrapped function is called, its result is returned or its exception raised. Deferreds are handled transparently. """ warnings.warn("@wait_for_reactor is deprecated, use @wait_for instead", DeprecationWarning, stacklevel=2) # This will timeout, in theory. In practice the process will be dead # long before that. return self.wait_for(2 ** 31)(function) def wait_for(self, timeout): """ A decorator factory that ensures the wrapped function runs in the reactor thread. When the wrapped function is called, its result is returned or its exception raised. Deferreds are handled transparently. 
Calls will timeout after the given number of seconds (a float), raising a crochet.TimeoutError, and cancelling the Deferred being waited on. """ def decorator(function): @wraps(function) def wrapper(*args, **kwargs): @self.run_in_reactor def run(): return function(*args, **kwargs) eventual_result = run() try: return eventual_result.wait(timeout) except TimeoutError: eventual_result.cancel() raise wrapper.wrapped_function = function return wrapper return decorator def in_reactor(self, function): """ DEPRECATED, use run_in_reactor. A decorator that ensures the wrapped function runs in the reactor thread. The wrapped function will get the reactor passed in as a first argument, in addition to any arguments it is called with. When the wrapped function is called, an EventualResult is returned. """ warnings.warn("@in_reactor is deprecated, use @run_in_reactor", DeprecationWarning, stacklevel=2) @self.run_in_reactor @wraps(function) def add_reactor(*args, **kwargs): return function(self._reactor, *args, **kwargs) return add_reactor crochet-1.4.0/crochet/_resultstore.py0000664000175000017500000000272012215625072021375 0ustar itamarstitamarst00000000000000""" In-memory store for EventualResults. """ import threading from twisted.python import log from ._util import synchronized class ResultStore(object): """ An in-memory store for EventualResult instances. Each EventualResult put in the store gets a unique identifier, which can be used to retrieve it later. This is useful for referring to results in e.g. web sessions. EventualResults that are not retrieved by shutdown will be logged if they have an error result. """ def __init__(self): self._counter = 0 self._stored = {} self._lock = threading.Lock() @synchronized def store(self, deferred_result): """ Store a EventualResult. Return an integer, a unique identifier that can be used to retrieve the object. """ self._counter += 1 self._stored[self._counter] = deferred_result return self._counter @synchronized def retrieve(self, result_id): """ Return the given EventualResult, and remove it from the store. """ return self._stored.pop(result_id) @synchronized def log_errors(self): """ Log errors for all stored EventualResults that have error results. """ for result in self._stored.values(): failure = result.original_failure() if failure is not None: log.err(failure, "Unhandled error in stashed EventualResult:") crochet-1.4.0/crochet/_shutdown.py0000664000175000017500000000307512522257664020672 0ustar itamarstitamarst00000000000000""" Support for calling code when the main thread exits. atexit cannot be used, since registered atexit functions only run after *all* threads have exited. The watchdog thread will be started by crochet.setup(). """ import threading import time from twisted.python import log class Watchdog(threading.Thread): """ Watch a given thread, call a list of functions when that thread exits. """ def __init__(self, canary, shutdown_function): threading.Thread.__init__(self, name="CrochetShutdownWatchdog") self._canary = canary self._shutdown_function = shutdown_function def run(self): while self._canary.is_alive(): time.sleep(0.1) self._shutdown_function() class FunctionRegistry(object): """ A registry of functions that can be called all at once. """ def __init__(self): self._functions = [] def register(self, f, *args, **kwargs): """ Register a function and arguments to be called later. """ self._functions.append(lambda: f(*args, **kwargs)) def run(self): """ Run all registered functions in reverse order of registration. 
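        Exceptions raised by an individual function are logged and do not
        prevent the remaining functions from running.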
""" for f in reversed(self._functions): try: f() except: log.err() # This is... fragile. Not sure how else to do it though. _registry = FunctionRegistry() _watchdog = Watchdog( [ t for t in threading.enumerate() if isinstance(t, threading._MainThread) ][0], _registry.run, ) register = _registry.register crochet-1.4.0/crochet/__init__.py0000664000175000017500000000326212342431670020404 0ustar itamarstitamarst00000000000000""" Crochet: Use Twisted Anywhere! """ from __future__ import absolute_import import sys from twisted.python.log import startLoggingWithObserver from twisted.python.runtime import platform if platform.type == "posix": try: from twisted.internet.process import reapAllProcesses except (SyntaxError, ImportError): if sys.version_info < (3, 3, 0): raise else: # Process support is still not ported to Python 3 on some versions # of Twisted. reapAllProcesses = lambda: None else: # waitpid() is only necessary on POSIX: reapAllProcesses = lambda: None from ._shutdown import _watchdog, register from ._eventloop import (EventualResult, TimeoutError, EventLoop, _store, ReactorStopped) from ._version import get_versions __version__ = get_versions()['version'] del get_versions def _importReactor(): from twisted.internet import reactor return reactor _main = EventLoop(_importReactor, register, startLoggingWithObserver, _watchdog, reapAllProcesses) setup = _main.setup no_setup = _main.no_setup run_in_reactor = _main.run_in_reactor wait_for = _main.wait_for retrieve_result = _store.retrieve # Backwards compatibility with 0.5.0: in_reactor = _main.in_reactor DeferredResult = EventualResult # Backwards compatibility with 1.1.0 and earlier: wait_for_reactor = _main.wait_for_reactor __all__ = ["setup", "run_in_reactor", "EventualResult", "TimeoutError", "retrieve_result", "no_setup", "wait_for", "ReactorStopped", "__version__", # Backwards compatibility: "DeferredResult", "in_reactor", "wait_for_reactor", ] crochet-1.4.0/LICENSE0000664000175000017500000000210212224075317015642 0ustar itamarstitamarst00000000000000Copyright (c) 2013 Itamar Turner-Trauring and Twisted Matrix Labs Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. crochet-1.4.0/examples/0000775000175000017500000000000012522534160016455 5ustar itamarstitamarst00000000000000crochet-1.4.0/examples/mxquery.py0000664000175000017500000000177412514553133020554 0ustar itamarstitamarst00000000000000#!/usr/bin/python """ A command-line application that uses Twisted to do an MX DNS query. 
""" from __future__ import print_function from twisted.names.client import lookupMailExchange from crochet import setup, wait_for setup() # Twisted code: def _mx(domain): """ Return Deferred that fires with a list of (priority, MX domain) tuples for a given domain. """ def got_records(result): return sorted( [(int(record.payload.preference), str(record.payload.name)) for record in result[0]]) d = lookupMailExchange(domain) d.addCallback(got_records) return d # Blocking wrapper: @wait_for(timeout=5) def mx(domain): """ Return list of (priority, MX domain) tuples for a given domain. """ return _mx(domain) # Application code: def main(domain): print("Mail servers for %s:" % (domain,)) for priority, mailserver in mx(domain): print(priority, mailserver) if __name__ == '__main__': import sys main(sys.argv[1]) crochet-1.4.0/examples/fromtwisted.py0000664000175000017500000000231412342417332021377 0ustar itamarstitamarst00000000000000#!/usr/bin/python """ An example of using Crochet from a normal Twisted application. """ import sys from crochet import no_setup, wait_for # Tell Crochet not to run the reactor: no_setup() from twisted.internet import reactor from twisted.python import log from twisted.web.wsgi import WSGIResource from twisted.web.server import Site from twisted.names import client # A WSGI application, will be run in thread pool: def application(environ, start_response): start_response('200 OK', []) try: ip = gethostbyname('twistedmatrix.com') return "%s has IP %s" % ('twistedmatrix.com', ip) except Exception, e: return 'Error doing lookup: %s' % (e,) # A blocking API that will be called from the WSGI application, but actually # uses DNS: @wait_for(timeout=10) def gethostbyname(name): d = client.lookupAddress(name) d.addCallback(lambda result: result[0][0].payload.dottedQuad()) return d # Normal Twisted code, serving the WSGI application and running the reactor: def main(): log.startLogging(sys.stdout) pool = reactor.getThreadPool() reactor.listenTCP(5000, Site(WSGIResource(reactor, pool, application))) reactor.run() if __name__ == '__main__': main() crochet-1.4.0/examples/ssh.py0000664000175000017500000000366512320347622017637 0ustar itamarstitamarst00000000000000#!/usr/bin/python """ A demonstration of Conch, allowing you to SSH into a running Python server and inspect objects at a Python prompt. If you're using the system install of Twisted, you may need to install Conch separately, e.g. on Ubuntu: $ sudo apt-get install python-twisted-conch Once you've started the program, you can ssh in by doing: $ ssh admin@localhost -p 5022 The password is 'secret'. Once you've reached the Python prompt, you have access to the app object, and can import code, etc.: >>> 3 + 4 7 >>> print(app) """ import logging from flask import Flask from crochet import setup, run_in_reactor setup() # Web server: app = Flask(__name__) @app.route('/') def index(): return "Welcome to my boring web server!" @run_in_reactor def start_ssh_server(port, username, password, namespace): """ Start an SSH server on the given port, exposing a Python prompt with the given namespace. """ # This is a lot of boilerplate, see http://tm.tl/6429 for a ticket to # provide a utility function that simplifies this. 
from twisted.internet import reactor from twisted.conch.insults import insults from twisted.conch import manhole, manhole_ssh from twisted.cred.checkers import ( InMemoryUsernamePasswordDatabaseDontUse as MemoryDB) from twisted.cred.portal import Portal sshRealm = manhole_ssh.TerminalRealm() def chainedProtocolFactory(): return insults.ServerProtocol(manhole.Manhole, namespace) sshRealm.chainedProtocolFactory = chainedProtocolFactory sshPortal = Portal(sshRealm, [MemoryDB(**{username: password})]) reactor.listenTCP(port, manhole_ssh.ConchFactory(sshPortal), interface="127.0.0.1") if __name__ == '__main__': import sys logging.basicConfig(stream=sys.stderr, level=logging.DEBUG) start_ssh_server(5022, "admin", "secret", {"app": app}) app.run() crochet-1.4.0/examples/scheduling.py0000664000175000017500000000513012320347622021154 0ustar itamarstitamarst00000000000000#!/usr/bin/python """ An example of scheduling time-based events in the background. Download the latest EUR/USD exchange rate from Yahoo every 30 seconds in the background; the rendered Flask web page can use the latest value without having to do the request itself. Note this is example is for demonstration purposes only, and is not actually used in the real world. You should not do this in a real application without reading Yahoo's terms-of-service and following them. """ from __future__ import print_function from flask import Flask from twisted.internet.task import LoopingCall from twisted.web.client import getPage from twisted.python import log from crochet import wait_for, run_in_reactor, setup setup() # Twisted code: class _ExchangeRate(object): """Download an exchange rate from Yahoo Finance using Twisted.""" def __init__(self, name): self._value = None self._name = name # External API: def latest_value(self): """Return the latest exchange rate value. May be None if no value is available. """ return self._value def start(self): """Start the background process.""" self._lc = LoopingCall(self._download) # Run immediately, and then every 30 seconds: self._lc.start(30, now=True) def _download(self): """Download the page.""" print("Downloading!") def parse(result): print("Got %r back from Yahoo." % (result,)) values = result.strip().split(",") self._value = float(values[1]) d = getPage( "http://download.finance.yahoo.com/d/quotes.csv?e=.csv&f=c4l1&s=%s=X" % (self._name,)) d.addCallback(parse) d.addErrback(log.err) return d # Blocking wrapper: class ExchangeRate(object): """Blocking API for downloading exchange rate.""" def __init__(self, name): self._exchange = _ExchangeRate(name) @run_in_reactor def start(self): self._exchange.start() @wait_for(timeout=1) def latest_value(self): """Return the latest exchange rate value. May be None if no value is available. """ return self._exchange.latest_value() EURUSD = ExchangeRate("EURUSD") app = Flask(__name__) @app.route('/') def index(): rate = EURUSD.latest_value() if rate is None: rate = "unavailable, please refresh the page" return "Current EUR/USD exchange rate is %s." % (rate,) if __name__ == '__main__': import sys, logging logging.basicConfig(stream=sys.stderr, level=logging.DEBUG) EURUSD.start() app.run() crochet-1.4.0/examples/downloader.py0000664000175000017500000000260712264334233021174 0ustar itamarstitamarst00000000000000#!/usr/bin/python """ A flask web application that downloads a page in the background. 
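To try it (assuming Flask's default port of 5000): run ``python
downloader.py``, browse to http://localhost:5000/, and refresh the page to
track the download's progress.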
""" import logging from flask import Flask, session, escape from crochet import setup, run_in_reactor, retrieve_result, TimeoutError # Can be called multiple times with no ill-effect: setup() app = Flask(__name__) @run_in_reactor def download_page(url): """ Download a page. """ from twisted.web.client import getPage return getPage(url) @app.route('/') def index(): if 'download' not in session: # Calling an @run_in_reactor function returns an EventualResult: result = download_page('http://www.google.com') session['download'] = result.stash() return "Starting download, refresh to track progress." # Retrieval is a one-time operation, so the uid in the session cannot be # reused: result = retrieve_result(session.pop('download')) try: download = result.wait(timeout=0.1) return "Downloaded: " + escape(download) except TimeoutError: session['download'] = result.stash() return "Download in progress..." except: # The original traceback of the exception: return "Download failed:\n" + result.original_failure().getTraceback() if __name__ == '__main__': import os, sys logging.basicConfig(stream=sys.stderr, level=logging.DEBUG) app.secret_key = os.urandom(24) app.run() crochet-1.4.0/examples/testing.py0000664000175000017500000000067512342417332020515 0ustar itamarstitamarst00000000000000#!/usr/bin/python """ Demonstration of accessing wrapped functions for testing. """ from __future__ import print_function from crochet import setup, run_in_reactor setup() @run_in_reactor def add(x, y): return x + y if __name__ == '__main__': print("add() returns EventualResult:") print(" ", add(1, 2)) print("add.wrapped_function() returns result of underlying function:") print(" ", add.wrapped_function(1, 2)) crochet-1.4.0/examples/blockingdns.py0000664000175000017500000000167312516754774021355 0ustar itamarstitamarst00000000000000#!/usr/bin/python """ Do a DNS lookup using Twisted's APIs. """ from __future__ import print_function # The Twisted code we'll be using: from twisted.names import client from crochet import setup, wait_for setup() # Crochet layer, wrapping Twisted's DNS library in a blocking call. @wait_for(timeout=5.0) def gethostbyname(name): """Lookup the IP of a given hostname. Unlike socket.gethostbyname() which can take an arbitrary amount of time to finish, this function will raise crochet.TimeoutError if more than 5 seconds elapse without an answer being received. """ d = client.lookupAddress(name) d.addCallback(lambda result: result[0][0].payload.dottedQuad()) return d if __name__ == '__main__': # Application code using the public API - notice it works in a normal # blocking manner, with no event loop visible: import sys name = sys.argv[1] ip = gethostbyname(name) print(name, "->", ip) crochet-1.4.0/requirements-dev.txt0000664000175000017500000000002712217622714020702 0ustar itamarstitamarst00000000000000Twisted>=11.1.0 sphinx crochet-1.4.0/versioneer.py0000664000175000017500000010367312342431670017406 0ustar itamarstitamarst00000000000000 # Version: 0.10 """ The Versioneer ============== * like a rocketeer, but for versions! * https://github.com/warner/python-versioneer * Brian Warner * License: Public Domain * Compatible With: python2.6, 2.7, and 3.2, 3.3 [![Build Status](https://travis-ci.org/warner/python-versioneer.png?branch=master)](https://travis-ci.org/warner/python-versioneer) This is a tool for managing a recorded version number in distutils-based python projects. The goal is to remove the tedious and error-prone "update the embedded version string" step from your release process. 
Making a new release should be as easy as recording a new tag in your version-control system, and maybe making new tarballs. ## Quick Install * `pip install versioneer` to somewhere to your $PATH * run `versioneer-installer` in your source tree: this installs `versioneer.py` * follow the instructions below (also in the `versioneer.py` docstring) ## Version Identifiers Source trees come from a variety of places: * a version-control system checkout (mostly used by developers) * a nightly tarball, produced by build automation * a snapshot tarball, produced by a web-based VCS browser, like github's "tarball from tag" feature * a release tarball, produced by "setup.py sdist", distributed through PyPI Within each source tree, the version identifier (either a string or a number, this tool is format-agnostic) can come from a variety of places: * ask the VCS tool itself, e.g. "git describe" (for checkouts), which knows about recent "tags" and an absolute revision-id * the name of the directory into which the tarball was unpacked * an expanded VCS variable ($Id$, etc) * a `_version.py` created by some earlier build step For released software, the version identifier is closely related to a VCS tag. Some projects use tag names that include more than just the version string (e.g. "myproject-1.2" instead of just "1.2"), in which case the tool needs to strip the tag prefix to extract the version identifier. For unreleased software (between tags), the version identifier should provide enough information to help developers recreate the same tree, while also giving them an idea of roughly how old the tree is (after version 1.2, before version 1.3). Many VCS systems can report a description that captures this, for example 'git describe --tags --dirty --always' reports things like "0.7-1-g574ab98-dirty" to indicate that the checkout is one revision past the 0.7 tag, has a unique revision id of "574ab98", and is "dirty" (it has uncommitted changes. The version identifier is used for multiple purposes: * to allow the module to self-identify its version: `myproject.__version__` * to choose a name and prefix for a 'setup.py sdist' tarball ## Theory of Operation Versioneer works by adding a special `_version.py` file into your source tree, where your `__init__.py` can import it. This `_version.py` knows how to dynamically ask the VCS tool for version information at import time. However, when you use "setup.py build" or "setup.py sdist", `_version.py` in the new copy is replaced by a small static file that contains just the generated version data. `_version.py` also contains `$Revision$` markers, and the installation process marks `_version.py` to have this marker rewritten with a tag name during the "git archive" command. As a result, generated tarballs will contain enough information to get the proper version. ## Installation First, decide on values for the following configuration variables: * `versionfile_source`: A project-relative pathname into which the generated version strings should be written. This is usually a `_version.py` next to your project's main `__init__.py` file. If your project uses `src/myproject/__init__.py`, this should be `src/myproject/_version.py`. This file should be checked in to your VCS as usual: the copy created below by `setup.py versioneer` will include code that parses expanded VCS keywords in generated tarballs. The 'build' and 'sdist' commands will replace it with a copy that has just the calculated version string. 
* `versionfile_build`: Like `versionfile_source`, but relative to the build directory instead of the source directory. These will differ when your setup.py uses 'package_dir='. If you have `package_dir={'myproject': 'src/myproject'}`, then you will probably have `versionfile_build='myproject/_version.py'` and `versionfile_source='src/myproject/_version.py'`. * `tag_prefix`: a string, like 'PROJECTNAME-', which appears at the start of all VCS tags. If your tags look like 'myproject-1.2.0', then you should use tag_prefix='myproject-'. If you use unprefixed tags like '1.2.0', this should be an empty string. * `parentdir_prefix`: a string, frequently the same as tag_prefix, which appears at the start of all unpacked tarball filenames. If your tarball unpacks into 'myproject-1.2.0', this should be 'myproject-'. This tool provides one script, named `versioneer-installer`. That script does one thing: write a copy of `versioneer.py` into the current directory. To versioneer-enable your project: * 1: Run `versioneer-installer` to copy `versioneer.py` into the top of your source tree. * 2: add the following lines to the top of your `setup.py`, with the configuration values you decided earlier: import versioneer versioneer.versionfile_source = 'src/myproject/_version.py' versioneer.versionfile_build = 'myproject/_version.py' versioneer.tag_prefix = '' # tags are like 1.2.0 versioneer.parentdir_prefix = 'myproject-' # dirname like 'myproject-1.2.0' * 3: add the following arguments to the setup() call in your setup.py: version=versioneer.get_version(), cmdclass=versioneer.get_cmdclass(), * 4: now run `setup.py versioneer`, which will create `_version.py`, and will modify your `__init__.py` to define `__version__` (by calling a function from `_version.py`). It will also modify your `MANIFEST.in` to include both `versioneer.py` and the generated `_version.py` in sdist tarballs. * 5: commit these changes to your VCS. To make sure you won't forget, `setup.py versioneer` will mark everything it touched for addition. ## Post-Installation Usage Once established, all uses of your tree from a VCS checkout should get the current version string. All generated tarballs should include an embedded version string (so users who unpack them will not need a VCS tool installed). If you distribute your project through PyPI, then the release process should boil down to two steps: * 1: git tag 1.0 * 2: python setup.py register sdist upload If you distribute it through github (i.e. users use github to generate tarballs with `git archive`), the process is: * 1: git tag 1.0 * 2: git push; git push --tags Currently, all version strings must be based upon a tag. Versioneer will report "unknown" until your tree has at least one tag in its history. This restriction will be fixed eventually (see issue #12). ## Version-String Flavors Code which uses Versioneer can learn about its version string at runtime by importing `_version` from your main `__init__.py` file and running the `get_versions()` function. From the "outside" (e.g. in `setup.py`), you can import the top-level `versioneer.py` and run `get_versions()`. Both functions return a dictionary with different keys for different flavors of the version string: * `['version']`: condensed tag+distance+shortid+dirty identifier. For git, this uses the output of `git describe --tags --dirty --always` but strips the tag_prefix. 
For example "0.11-2-g1076c97-dirty" indicates that the tree is like the "1076c97" commit but has uncommitted changes ("-dirty"), and that this commit is two revisions ("-2-") beyond the "0.11" tag. For released software (exactly equal to a known tag), the identifier will only contain the stripped tag, e.g. "0.11". * `['full']`: detailed revision identifier. For Git, this is the full SHA1 commit id, followed by "-dirty" if the tree contains uncommitted changes, e.g. "1076c978a8d3cfc70f408fe5974aa6c092c949ac-dirty". Some variants are more useful than others. Including `full` in a bug report should allow developers to reconstruct the exact code being tested (or indicate the presence of local changes that should be shared with the developers). `version` is suitable for display in an "about" box or a CLI `--version` output: it can be easily compared against release notes and lists of bugs fixed in various releases. In the future, this will also include a [PEP-0440](http://legacy.python.org/dev/peps/pep-0440/) -compatible flavor (e.g. `1.2.post0.dev123`). This loses a lot of information (and has no room for a hash-based revision id), but is safe to use in a `setup.py` "`version=`" argument. It also enables tools like *pip* to compare version strings and evaluate compatibility constraint declarations. The `setup.py versioneer` command adds the following text to your `__init__.py` to place a basic version in `YOURPROJECT.__version__`: from ._version import get_versions __version = get_versions()['version'] del get_versions ## Updating Versioneer To upgrade your project to a new release of Versioneer, do the following: * install the new Versioneer (`pip install -U versioneer` or equivalent) * re-run `versioneer-installer` in your source tree to replace `versioneer.py` * edit `setup.py`, if necessary, to include any new configuration settings indicated by the release notes * re-run `setup.py versioneer` to replace `SRC/_version.py` * commit any changed files ## Future Directions This tool is designed to make it easily extended to other version-control systems: all VCS-specific components are in separate directories like src/git/ . The top-level `versioneer.py` script is assembled from these components by running make-versioneer.py . In the future, make-versioneer.py will take a VCS name as an argument, and will construct a version of `versioneer.py` that is specific to the given VCS. It might also take the configuration arguments that are currently provided manually during installation by editing setup.py . Alternatively, it might go the other direction and include code from all supported VCS systems, reducing the number of intermediate scripts. ## License To make Versioneer easier to embed, all its code is hereby released into the public domain. The `_version.py` that it creates is also in the public domain. """ import os, sys, re from distutils.core import Command from distutils.command.sdist import sdist as _sdist from distutils.command.build import build as _build versionfile_source = None versionfile_build = None tag_prefix = None parentdir_prefix = None VCS = "git" LONG_VERSION_PY = ''' # This file helps to compute a version number in source trees obtained from # git-archive tarball (such as those provided by githubs download-from-tag # feature). Distribution tarballs (build by setup.py sdist) and build # directories (produced by setup.py build) will contain a much shorter file # that just contains the computed version number. # This file is released into the public domain. 
Generated by # versioneer-0.10 (https://github.com/warner/python-versioneer) # these strings will be replaced by git during git-archive git_refnames = "%(DOLLAR)sFormat:%%d%(DOLLAR)s" git_full = "%(DOLLAR)sFormat:%%H%(DOLLAR)s" import subprocess import sys import errno def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False): assert isinstance(commands, list) p = None for c in commands: try: # remember shell=False, so use git.cmd on windows, not just git p = subprocess.Popen([c] + args, cwd=cwd, stdout=subprocess.PIPE, stderr=(subprocess.PIPE if hide_stderr else None)) break except EnvironmentError: e = sys.exc_info()[1] if e.errno == errno.ENOENT: continue if verbose: print("unable to run %%s" %% args[0]) print(e) return None else: if verbose: print("unable to find command, tried %%s" %% (commands,)) return None stdout = p.communicate()[0].strip() if sys.version >= '3': stdout = stdout.decode() if p.returncode != 0: if verbose: print("unable to run %%s (error)" %% args[0]) return None return stdout import sys import re import os.path def get_expanded_variables(versionfile_abs): # the code embedded in _version.py can just fetch the value of these # variables. When used from setup.py, we don't want to import # _version.py, so we do it with a regexp instead. This function is not # used from _version.py. variables = {} try: f = open(versionfile_abs,"r") for line in f.readlines(): if line.strip().startswith("git_refnames ="): mo = re.search(r'=\s*"(.*)"', line) if mo: variables["refnames"] = mo.group(1) if line.strip().startswith("git_full ="): mo = re.search(r'=\s*"(.*)"', line) if mo: variables["full"] = mo.group(1) f.close() except EnvironmentError: pass return variables def versions_from_expanded_variables(variables, tag_prefix, verbose=False): refnames = variables["refnames"].strip() if refnames.startswith("$Format"): if verbose: print("variables are unexpanded, not using") return {} # unexpanded, so not in an unpacked git-archive tarball refs = set([r.strip() for r in refnames.strip("()").split(",")]) # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of # just "foo-1.0". If we see a "tag: " prefix, prefer those. TAG = "tag: " tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)]) if not tags: # Either we're using git < 1.8.3, or there really are no tags. We use # a heuristic: assume all version tags have a digit. The old git %%d # expansion behaves like git log --decorate=short and strips out the # refs/heads/ and refs/tags/ prefixes that would let us distinguish # between branches and tags. By ignoring refnames without digits, we # filter out many common branch names like "release" and # "stabilization", as well as "HEAD" and "master". tags = set([r for r in refs if re.search(r'\d', r)]) if verbose: print("discarding '%%s', no digits" %% ",".join(refs-tags)) if verbose: print("likely tags: %%s" %% ",".join(sorted(tags))) for ref in sorted(tags): # sorting will prefer e.g. "2.0" over "2.0rc1" if ref.startswith(tag_prefix): r = ref[len(tag_prefix):] if verbose: print("picking %%s" %% r) return { "version": r, "full": variables["full"].strip() } # no suitable tags, so we use the full revision id if verbose: print("no suitable tags, using full revision id") return { "version": variables["full"].strip(), "full": variables["full"].strip() } def versions_from_vcs(tag_prefix, root, verbose=False): # this runs 'git' from the root of the source tree. 
This only gets called # if the git-archive 'subst' variables were *not* expanded, and # _version.py hasn't already been rewritten with a short version string, # meaning we're inside a checked out source tree. if not os.path.exists(os.path.join(root, ".git")): if verbose: print("no .git in %%s" %% root) return {} GITS = ["git"] if sys.platform == "win32": GITS = ["git.cmd", "git.exe"] stdout = run_command(GITS, ["describe", "--tags", "--dirty", "--always"], cwd=root) if stdout is None: return {} if not stdout.startswith(tag_prefix): if verbose: print("tag '%%s' doesn't start with prefix '%%s'" %% (stdout, tag_prefix)) return {} tag = stdout[len(tag_prefix):] stdout = run_command(GITS, ["rev-parse", "HEAD"], cwd=root) if stdout is None: return {} full = stdout.strip() if tag.endswith("-dirty"): full += "-dirty" return {"version": tag, "full": full} def versions_from_parentdir(parentdir_prefix, root, verbose=False): # Source tarballs conventionally unpack into a directory that includes # both the project name and a version string. dirname = os.path.basename(root) if not dirname.startswith(parentdir_prefix): if verbose: print("guessing rootdir is '%%s', but '%%s' doesn't start with prefix '%%s'" %% (root, dirname, parentdir_prefix)) return None return {"version": dirname[len(parentdir_prefix):], "full": ""} tag_prefix = "%(TAG_PREFIX)s" parentdir_prefix = "%(PARENTDIR_PREFIX)s" versionfile_source = "%(VERSIONFILE_SOURCE)s" def get_versions(default={"version": "unknown", "full": ""}, verbose=False): # I am in _version.py, which lives at ROOT/VERSIONFILE_SOURCE. If we have # __file__, we can work backwards from there to the root. Some # py2exe/bbfreeze/non-CPython implementations don't do __file__, in which # case we can only use expanded variables. variables = { "refnames": git_refnames, "full": git_full } ver = versions_from_expanded_variables(variables, tag_prefix, verbose) if ver: return ver try: root = os.path.abspath(__file__) # versionfile_source is the relative path from the top of the source # tree (where the .git directory might live) to this file. Invert # this to find the root from __file__. for i in range(len(versionfile_source.split("/"))): root = os.path.dirname(root) except NameError: return default return (versions_from_vcs(tag_prefix, root, verbose) or versions_from_parentdir(parentdir_prefix, root, verbose) or default) ''' import subprocess import sys import errno def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False): assert isinstance(commands, list) p = None for c in commands: try: # remember shell=False, so use git.cmd on windows, not just git p = subprocess.Popen([c] + args, cwd=cwd, stdout=subprocess.PIPE, stderr=(subprocess.PIPE if hide_stderr else None)) break except EnvironmentError: e = sys.exc_info()[1] if e.errno == errno.ENOENT: continue if verbose: print("unable to run %s" % args[0]) print(e) return None else: if verbose: print("unable to find command, tried %s" % (commands,)) return None stdout = p.communicate()[0].strip() if sys.version >= '3': stdout = stdout.decode() if p.returncode != 0: if verbose: print("unable to run %s (error)" % args[0]) return None return stdout import sys import re import os.path def get_expanded_variables(versionfile_abs): # the code embedded in _version.py can just fetch the value of these # variables. When used from setup.py, we don't want to import # _version.py, so we do it with a regexp instead. This function is not # used from _version.py. 
variables = {} try: f = open(versionfile_abs,"r") for line in f.readlines(): if line.strip().startswith("git_refnames ="): mo = re.search(r'=\s*"(.*)"', line) if mo: variables["refnames"] = mo.group(1) if line.strip().startswith("git_full ="): mo = re.search(r'=\s*"(.*)"', line) if mo: variables["full"] = mo.group(1) f.close() except EnvironmentError: pass return variables def versions_from_expanded_variables(variables, tag_prefix, verbose=False): refnames = variables["refnames"].strip() if refnames.startswith("$Format"): if verbose: print("variables are unexpanded, not using") return {} # unexpanded, so not in an unpacked git-archive tarball refs = set([r.strip() for r in refnames.strip("()").split(",")]) # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of # just "foo-1.0". If we see a "tag: " prefix, prefer those. TAG = "tag: " tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)]) if not tags: # Either we're using git < 1.8.3, or there really are no tags. We use # a heuristic: assume all version tags have a digit. The old git %d # expansion behaves like git log --decorate=short and strips out the # refs/heads/ and refs/tags/ prefixes that would let us distinguish # between branches and tags. By ignoring refnames without digits, we # filter out many common branch names like "release" and # "stabilization", as well as "HEAD" and "master". tags = set([r for r in refs if re.search(r'\d', r)]) if verbose: print("discarding '%s', no digits" % ",".join(refs-tags)) if verbose: print("likely tags: %s" % ",".join(sorted(tags))) for ref in sorted(tags): # sorting will prefer e.g. "2.0" over "2.0rc1" if ref.startswith(tag_prefix): r = ref[len(tag_prefix):] if verbose: print("picking %s" % r) return { "version": r, "full": variables["full"].strip() } # no suitable tags, so we use the full revision id if verbose: print("no suitable tags, using full revision id") return { "version": variables["full"].strip(), "full": variables["full"].strip() } def versions_from_vcs(tag_prefix, root, verbose=False): # this runs 'git' from the root of the source tree. This only gets called # if the git-archive 'subst' variables were *not* expanded, and # _version.py hasn't already been rewritten with a short version string, # meaning we're inside a checked out source tree. if not os.path.exists(os.path.join(root, ".git")): if verbose: print("no .git in %s" % root) return {} GITS = ["git"] if sys.platform == "win32": GITS = ["git.cmd", "git.exe"] stdout = run_command(GITS, ["describe", "--tags", "--dirty", "--always"], cwd=root) if stdout is None: return {} if not stdout.startswith(tag_prefix): if verbose: print("tag '%s' doesn't start with prefix '%s'" % (stdout, tag_prefix)) return {} tag = stdout[len(tag_prefix):] stdout = run_command(GITS, ["rev-parse", "HEAD"], cwd=root) if stdout is None: return {} full = stdout.strip() if tag.endswith("-dirty"): full += "-dirty" return {"version": tag, "full": full} def versions_from_parentdir(parentdir_prefix, root, verbose=False): # Source tarballs conventionally unpack into a directory that includes # both the project name and a version string. dirname = os.path.basename(root) if not dirname.startswith(parentdir_prefix): if verbose: print("guessing rootdir is '%s', but '%s' doesn't start with prefix '%s'" % (root, dirname, parentdir_prefix)) return None return {"version": dirname[len(parentdir_prefix):], "full": ""} import os.path import sys # os.path.relpath only appeared in Python-2.6 . Define it here for 2.5. 
def os_path_relpath(path, start=os.path.curdir): """Return a relative version of a path""" if not path: raise ValueError("no path specified") start_list = [x for x in os.path.abspath(start).split(os.path.sep) if x] path_list = [x for x in os.path.abspath(path).split(os.path.sep) if x] # Work out how much of the filepath is shared by start and path. i = len(os.path.commonprefix([start_list, path_list])) rel_list = [os.path.pardir] * (len(start_list)-i) + path_list[i:] if not rel_list: return os.path.curdir return os.path.join(*rel_list) def do_vcs_install(manifest_in, versionfile_source, ipy): GITS = ["git"] if sys.platform == "win32": GITS = ["git.cmd", "git.exe"] files = [manifest_in, versionfile_source, ipy] try: me = __file__ if me.endswith(".pyc") or me.endswith(".pyo"): me = os.path.splitext(me)[0] + ".py" versioneer_file = os_path_relpath(me) except NameError: versioneer_file = "versioneer.py" files.append(versioneer_file) present = False try: f = open(".gitattributes", "r") for line in f.readlines(): if line.strip().startswith(versionfile_source): if "export-subst" in line.strip().split()[1:]: present = True f.close() except EnvironmentError: pass if not present: f = open(".gitattributes", "a+") f.write("%s export-subst\n" % versionfile_source) f.close() files.append(".gitattributes") run_command(GITS, ["add", "--"] + files) SHORT_VERSION_PY = """ # This file was generated by 'versioneer.py' (0.10) from # revision-control system data, or from the parent directory name of an # unpacked source archive. Distribution tarballs contain a pre-generated copy # of this file. version_version = '%(version)s' version_full = '%(full)s' def get_versions(default={}, verbose=False): return {'version': version_version, 'full': version_full} """ DEFAULT = {"version": "unknown", "full": "unknown"} def versions_from_file(filename): versions = {} try: f = open(filename) except EnvironmentError: return versions for line in f.readlines(): mo = re.match("version_version = '([^']+)'", line) if mo: versions["version"] = mo.group(1) mo = re.match("version_full = '([^']+)'", line) if mo: versions["full"] = mo.group(1) f.close() return versions def write_to_version_file(filename, versions): f = open(filename, "w") f.write(SHORT_VERSION_PY % versions) f.close() print("set %s to '%s'" % (filename, versions["version"])) def get_root(): try: return os.path.dirname(os.path.abspath(__file__)) except NameError: return os.path.dirname(os.path.abspath(sys.argv[0])) def get_versions(default=DEFAULT, verbose=False): # returns dict with two keys: 'version' and 'full' assert versionfile_source is not None, "please set versioneer.versionfile_source" assert tag_prefix is not None, "please set versioneer.tag_prefix" assert parentdir_prefix is not None, "please set versioneer.parentdir_prefix" # I am in versioneer.py, which must live at the top of the source tree, # which we use to compute the root directory. py2exe/bbfreeze/non-CPython # don't have __file__, in which case we fall back to sys.argv[0] (which # ought to be the setup.py script). We prefer __file__ since that's more # robust in cases where setup.py was invoked in some weird way (e.g. pip) root = get_root() versionfile_abs = os.path.join(root, versionfile_source) # extract version from first of _version.py, 'git describe', parentdir. # This is meant to work for developers using a source checkout, for users # of a tarball created by 'setup.py sdist', and for users of a # tarball/zipball created by 'git archive' or github's download-from-tag # feature. 
variables = get_expanded_variables(versionfile_abs) if variables: ver = versions_from_expanded_variables(variables, tag_prefix) if ver: if verbose: print("got version from expanded variable %s" % ver) return ver ver = versions_from_file(versionfile_abs) if ver: if verbose: print("got version from file %s %s" % (versionfile_abs,ver)) return ver ver = versions_from_vcs(tag_prefix, root, verbose) if ver: if verbose: print("got version from git %s" % ver) return ver ver = versions_from_parentdir(parentdir_prefix, root, verbose) if ver: if verbose: print("got version from parentdir %s" % ver) return ver if verbose: print("got version from default %s" % ver) return default def get_version(verbose=False): return get_versions(verbose=verbose)["version"] class cmd_version(Command): description = "report generated version string" user_options = [] boolean_options = [] def initialize_options(self): pass def finalize_options(self): pass def run(self): ver = get_version(verbose=True) print("Version is currently: %s" % ver) class cmd_build(_build): def run(self): versions = get_versions(verbose=True) _build.run(self) # now locate _version.py in the new build/ directory and replace it # with an updated value target_versionfile = os.path.join(self.build_lib, versionfile_build) print("UPDATING %s" % target_versionfile) os.unlink(target_versionfile) f = open(target_versionfile, "w") f.write(SHORT_VERSION_PY % versions) f.close() if 'cx_Freeze' in sys.modules: # cx_freeze enabled? from cx_Freeze.dist import build_exe as _build_exe class cmd_build_exe(_build_exe): def run(self): versions = get_versions(verbose=True) target_versionfile = versionfile_source print("UPDATING %s" % target_versionfile) os.unlink(target_versionfile) f = open(target_versionfile, "w") f.write(SHORT_VERSION_PY % versions) f.close() _build_exe.run(self) os.unlink(target_versionfile) f = open(versionfile_source, "w") f.write(LONG_VERSION_PY % {"DOLLAR": "$", "TAG_PREFIX": tag_prefix, "PARENTDIR_PREFIX": parentdir_prefix, "VERSIONFILE_SOURCE": versionfile_source, }) f.close() class cmd_sdist(_sdist): def run(self): versions = get_versions(verbose=True) self._versioneer_generated_versions = versions # unless we update this, the command will keep using the old version self.distribution.metadata.version = versions["version"] return _sdist.run(self) def make_release_tree(self, base_dir, files): _sdist.make_release_tree(self, base_dir, files) # now locate _version.py in the new base_dir directory (remembering # that it may be a hardlink) and replace it with an updated value target_versionfile = os.path.join(base_dir, versionfile_source) print("UPDATING %s" % target_versionfile) os.unlink(target_versionfile) f = open(target_versionfile, "w") f.write(SHORT_VERSION_PY % self._versioneer_generated_versions) f.close() INIT_PY_SNIPPET = """ from ._version import get_versions __version__ = get_versions()['version'] del get_versions """ class cmd_update_files(Command): description = "install/upgrade Versioneer files: __init__.py SRC/_version.py" user_options = [] boolean_options = [] def initialize_options(self): pass def finalize_options(self): pass def run(self): print(" creating %s" % versionfile_source) f = open(versionfile_source, "w") f.write(LONG_VERSION_PY % {"DOLLAR": "$", "TAG_PREFIX": tag_prefix, "PARENTDIR_PREFIX": parentdir_prefix, "VERSIONFILE_SOURCE": versionfile_source, }) f.close() ipy = os.path.join(os.path.dirname(versionfile_source), "__init__.py") try: old = open(ipy, "r").read() except EnvironmentError: old = "" if 
INIT_PY_SNIPPET not in old: print(" appending to %s" % ipy) f = open(ipy, "a") f.write(INIT_PY_SNIPPET) f.close() else: print(" %s unmodified" % ipy) # Make sure both the top-level "versioneer.py" and versionfile_source # (PKG/_version.py, used by runtime code) are in MANIFEST.in, so # they'll be copied into source distributions. Pip won't be able to # install the package without this. manifest_in = os.path.join(get_root(), "MANIFEST.in") simple_includes = set() try: for line in open(manifest_in, "r").readlines(): if line.startswith("include "): for include in line.split()[1:]: simple_includes.add(include) except EnvironmentError: pass # That doesn't cover everything MANIFEST.in can do # (http://docs.python.org/2/distutils/sourcedist.html#commands), so # it might give some false negatives. Appending redundant 'include' # lines is safe, though. if "versioneer.py" not in simple_includes: print(" appending 'versioneer.py' to MANIFEST.in") f = open(manifest_in, "a") f.write("include versioneer.py\n") f.close() else: print(" 'versioneer.py' already in MANIFEST.in") if versionfile_source not in simple_includes: print(" appending versionfile_source ('%s') to MANIFEST.in" % versionfile_source) f = open(manifest_in, "a") f.write("include %s\n" % versionfile_source) f.close() else: print(" versionfile_source already in MANIFEST.in") # Make VCS-specific changes. For git, this means creating/changing # .gitattributes to mark _version.py for export-time keyword # substitution. do_vcs_install(manifest_in, versionfile_source, ipy) def get_cmdclass(): cmds = {'version': cmd_version, 'versioneer': cmd_update_files, 'build': cmd_build, 'sdist': cmd_sdist, } if 'cx_Freeze' in sys.modules: # cx_freeze enabled? cmds['build_exe'] = cmd_build_exe del cmds['build'] return cmds crochet-1.4.0/docs/0000775000175000017500000000000012522534160015567 5ustar itamarstitamarst00000000000000crochet-1.4.0/docs/Makefile0000664000175000017500000001270012215713056017231 0ustar itamarstitamarst00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . # the i18n builder cannot share the environment and doctrees with the others I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . 
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " text to make text files" @echo " man to make manual pages" @echo " texinfo to make Texinfo files" @echo " info to make Texinfo files and run them through makeinfo" @echo " gettext to make PO message catalogs" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: -rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Crochet.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Crochet.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/Crochet" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Crochet" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." $(MAKE) -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." 
man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." texinfo: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." @echo "Run \`make' in that directory to run these through makeinfo" \ "(use \`make info' here to do that automatically)." info: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo "Running Texinfo files through makeinfo..." make -C $(BUILDDIR)/texinfo info @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." gettext: $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale @echo @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." crochet-1.4.0/docs/index.rst0000664000175000017500000000122512522533740017433 0ustar itamarstitamarst00000000000000Use Twisted Anywhere! ===================== Crochet is an MIT-licensed library that makes it easier for blocking and threaded applications like Flask or Django to use the Twisted networking framework. Here's an example of a program using Crochet. Notice that you get a completely blocking interface to Twisted and do not need to run the Twisted reactor, the event loop, yourself. .. literalinclude:: ../examples/blockingdns.py Run on the command line:: $ python blockingdns.py twistedmatrix.com twistedmatrix.com -> 66.35.39.66 Table of Contents ^^^^^^^^^^^^^^^^^ .. toctree:: :maxdepth: 3 introduction api using workarounds news crochet-1.4.0/docs/api.rst0000664000175000017500000001704112522533740017100 0ustar itamarstitamarst00000000000000The API ------- Using Crochet involves three parts: reactor setup, defining functions that call into Twisted's reactor, and using those functions. Setup ^^^^^ Crochet does a number of things for you as part of setup. Most significantly, it runs Twisted's reactor in a thread it manages. Doing setup is easy, just call the ``setup()`` function: .. code-block:: python from crochet import setup setup() Since Crochet is intended to be used as a library, multiple calls work just fine; if more than one library does ``crochet.setup()`` only the first one will do anything. @wait_for: Blocking Calls into Twisted ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Now that you've got the reactor running, the next stage is defining some functions that will run inside the Twisted reactor thread. Twisted's APIs are not thread-safe, and so they cannot be called directly from another thread. Moreover, results may not be available immediately. The easiest way to deal with these issues is to decorate a function that calls Twisted APIs with ``crochet.wait_for``. * When the decorated function is called, the code will not run in the calling thread, but rather in the reactor thread. * The function blocks until a result is available from the code running in the Twisted thread. The returned result is the result of running the code; if the code throws an exception, an exception is thrown. 
* If the underlying code returns a ``Deferred``, it is handled transparently; its results are extracted and passed to the caller. * ``crochet.wait_for`` takes a ``timeout`` argument, a ``float`` indicating the number of seconds to wait until a result is available. If the given number of seconds pass and the underlying operation is still unfinished, a ``crochet.TimeoutError`` exception is raised, and the wrapped ``Deferred`` is canceled. If the underlying API supports cancellation, this might free up any unused resources, close outgoing connections etc., but cancellation is not guaranteed and should not be relied on. .. note :: ``wait_for`` was added to Crochet in v1.2.0. Prior releases provided a similar API called ``wait_for_reactor`` which did not provide timeouts. This older API still exists but is deprecated, since waiting indefinitely is a bad idea. To see what this means, let's return to the first example in the documentation: .. literalinclude:: ../examples/blockingdns.py Twisted's ``lookupAddress()`` call returns a ``Deferred``, but the code calling the decorated ``gethostbyname()`` doesn't know that. As far as the caller is concerned, it is just calling a blocking function that returns a result or raises an exception. Run on the command line with a valid domain, we get:: $ python blockingdns.py twistedmatrix.com twistedmatrix.com -> 66.35.39.66 If we try to call the function with an invalid domain, we get back an exception:: $ python blockingdns.py doesnotexist Traceback (most recent call last): File "examples/blockingdns.py", line 33, in <module> ip = gethostbyname(name) File "/home/itamar/crochet/crochet/_eventloop.py", line 434, in wrapper return eventual_result.wait(timeout) File "/home/itamar/crochet/crochet/_eventloop.py", line 216, in wait result.raiseException() File "<string>", line 2, in raiseException twisted.names.error.DNSNameError: ]> @run_in_reactor: Asynchronous Results ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ``wait_for`` is implemented using ``run_in_reactor``, a more sophisticated and lower-level API. Rather than waiting until a result is available, it returns a special object supporting multiple attempts to retrieve results, as well as manual cancellation. This can be useful for running tasks "in the background", i.e. asynchronously, as opposed to blocking and waiting for them to finish. Decorating a function that calls Twisted APIs with ``run_in_reactor`` has two consequences: * When the function is called, the code will not run in the calling thread, but rather in the reactor thread. * The return value of a decorated function is an ``EventualResult`` instance, wrapping the result of the underlying code, with particular support for ``Deferred`` instances. ``EventualResult`` has the following basic methods: * ``wait(timeout)``: Return the result when it becomes available; if the result is an exception it will be raised. The timeout argument is a ``float`` indicating a number of seconds; ``wait()`` will throw ``crochet.TimeoutError`` if the timeout is hit. * ``cancel()``: Cancel the operation tied to the underlying ``Deferred``. Many, but not all, ``Deferred`` results returned from Twisted allow the underlying operation to be canceled. Even if implemented, cancellation may not be possible for a variety of reasons, e.g. it may be too late. Its main purpose is to free up resources that are no longer needed, and it should not be relied on otherwise.
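Here is a minimal sketch of ``run_in_reactor`` and these basic methods in use. (The ``fetch_page`` function is a hypothetical helper written for this illustration; only ``setup``, ``run_in_reactor``, ``TimeoutError``, ``wait()`` and ``cancel()`` are Crochet APIs, and the download mirrors the ``getPage`` usage shown elsewhere in these docs.)

.. code-block:: python

    from crochet import setup, run_in_reactor, TimeoutError
    setup()

    @run_in_reactor
    def fetch_page(url):
        # Runs in the reactor thread; returning a Deferred is fine,
        # the EventualResult will wrap its eventual value.
        from twisted.web.client import getPage
        return getPage(url)

    result = fetch_page("http://twistedmatrix.com")  # an EventualResult
    try:
        body = result.wait(timeout=10)  # block for up to 10 seconds
        print("Downloaded %d bytes" % (len(body),))
    except TimeoutError:
        # Too slow; try to free any resources tied to the request.
        # As noted above, cancellation is best-effort only.
        result.cancel()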
There are also some more specialized methods: * ``original_failure()`` returns the underlying Twisted `Failure`_ object if your result was a raised exception, allowing you to print the original traceback that caused the exception. This is necessary because the default exception you will see raised from ``EventualResult.wait()`` won't include the stack from the underlying Twisted code where the exception originated. * ``stash()``: Sometimes you want to store the ``EventualResult`` in memory for later retrieval. This is specifically useful when you want to store a reference to the ``EventualResult`` in a web session like Flask's (see the example below). ``stash()`` stores the ``EventualResult`` in memory, and returns an integer uid that can be used to retrieve the result using ``crochet.retrieve_result(uid)``. Note that retrieval works only once per uid. You will need the stash the ``EventualResult`` again (with a new resulting uid) if you want to retrieve it again later. In the following example, you can see all of these APIs in use. For each user session, a download is started in the background. Subsequent page refreshes will eventually show the downloaded page. .. literalinclude:: ../examples/downloader.py .. _Failure: https://twistedmatrix.com/documents/current/api/twisted.python.failure.Failure.html Using Crochet from Twisted Applications ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ If your application is already planning on running the Twisted reactor itself (e.g. you're using Twisted as a WSGI container), Crochet's default behavior of running the reactor in a thread is a problem. To solve this, Crochet provides the ``no_setup()`` function, which causes future calls to ``setup()`` to do nothing. Thus, an application that will run the Twisted reactor but also wants to use a Crochet-using library must run it first: .. code-block:: python from crochet import no_setup no_setup() # Only now do we import libraries that might run crochet.setup(): import blockinglib # ... setup application ... from twisted.internet import reactor reactor.run() Unit Testing ^^^^^^^^^^^^ Both ``@wait_for`` and ``@run_in_reactor`` expose the underlying Twisted function via a ``wrapped_function`` attribute. This allows unit testing of the Twisted code without having to go through the Crochet layer. .. literalinclude:: ../examples/testing.py When run, this gives the following output:: add() returns EventualResult: add.wrapped_function() returns result of underlying function: 3 crochet-1.4.0/docs/using.rst0000664000175000017500000000442012522533740017451 0ustar itamarstitamarst00000000000000Best Practices -------------- Hide Twisted and Crochet ^^^^^^^^^^^^^^^^^^^^^^^^ Consider some synchronous do-one-thing-after-the-other application code that wants to use event-driven Twisted-using code. We have two threads at a minimum: the application thread(s) and the reactor thread. There are also multiple layers of code involved in this interaction: * **Twisted code:** Should only be called in reactor thread. This may be code from the Twisted package itself, or more likely code you have written that is built on top of Twisted. * **@wait_for/@run_in_reactor wrappers:** The body of the functions runs in the reactor thread... but the caller should be in the application thread. * **The application code:** Runs in the application thread(s), expects synchronous/blocking calls. Sometimes the first two layers suffice, but there are some issues with only having these. 
First, if you're using ``@run_in_reactor``, it requires the application code to understand Crochet's API, i.e. ``EventualResult`` objects. Second, if the wrapped function returns an object that expects to interact with Twisted, the application code will not be able to use that object since it will be called in the wrong thread. A better solution is to have an additional layer in-between the application code and ``@wait_for/@run_in_reactor`` wrappers. This layer can hide the details of the Crochet API and wrap returned Twisted objects if necessary. As a result, the application code simply sees a normal API, with no need to understand ``EventualResult`` objects or Twisted. In the following example the different layers of the code demonstrate this separation: ``_ExchangeRate`` is just Twisted code, and ``ExchangeRate`` provides a blocking wrapper using Crochet. .. literalinclude:: ../examples/scheduling.py Minimize Decorated Code ^^^^^^^^^^^^^^^^^^^^^^^ It's best to have as little code as possible in the ``@wait_for/@run_in_reactor`` wrappers. As this code straddles two worlds (or at least, two threads), it is more difficult to unit test. Having an extra layer between this code and the application code is also useful in this regard: Twisted code can be pushed into the lower-level Twisted layer, and code hiding the Twisted details from the application code can be pushed into the higher-level layer. crochet-1.4.0/docs/introduction.rst0000664000175000017500000000152512342417332021046 0ustar itamarstitamarst00000000000000Introduction ------------ .. include:: ../README.rst Examples ======== Background Scheduling ^^^^^^^^^^^^^^^^^^^^^ You can use Crochet to schedule events that will run in the background without slowing down the page rendering of your web applications: .. literalinclude:: ../examples/scheduling.py SSH into your Server ^^^^^^^^^^^^^^^^^^^^ You can SSH into your Python process and get a Python prompt, allowing you to poke around in the internals of your running program: .. literalinclude:: ../examples/ssh.py DNS Query ^^^^^^^^^ Twisted also has a fully featured DNS library: .. literalinclude:: ../examples/mxquery.py Using Crochet in Normal Twisted Code ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ You can use Crochet's APIs for calling into the reactor thread from normal Twisted applications: .. literalinclude:: ../examples/fromtwisted.py crochet-1.4.0/docs/make.bat0000664000175000017500000001175212215713056017204 0ustar itamarstitamarst00000000000000@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=_build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . set I18NSPHINXOPTS=%SPHINXOPTS% . if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^` where ^ is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. texinfo to make Texinfo files echo.
gettext to make PO message catalogs echo. changes to make an overview over all changed/added/deprecated items echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\Crochet.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\Crochet.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "texinfo" ( %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo if errorlevel 1 exit /b 1 echo. echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. goto end ) if "%1" == "gettext" ( %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale if errorlevel 1 exit /b 1 echo. echo.Build finished. The message catalogs are in %BUILDDIR%/locale. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. 
echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) :end crochet-1.4.0/docs/conf.py0000664000175000017500000001722512516747054017110 0ustar itamarstitamarst00000000000000# -*- coding: utf-8 -*- # # Crochet documentation build configuration file, created by # sphinx-quickstart on Mon Sep 16 19:37:18 2013. # # This file is execfile()d with the current directory set to its containing dir. # # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. import sys, os # Make sure local crochet is used when importing: sys.path.insert(0, os.path.abspath('..')) # -- General configuration ----------------------------------------------------- # If your documentation needs a minimal Sphinx version, state it here. #needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be extensions # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. extensions = [] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. #source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # General information about the project. project = u'Crochet' copyright = u'2013, Itamar Turner-Trauring' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The short X.Y version. import crochet version = crochet.__version__ # Versioneer adds -dirty suffix to version if checkout is dirty, and # therefore ReadTheDocs somehow ends up with this in its version, so strip # it out. if version.endswith("-dirty"): version = version[:-len("-dirty")] # The full version, including alpha/beta/rc tags. release = version # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. #language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: #today = '' # Else, today_fmt is used as the format for a strftime call. #today_fmt = '%B %d, %Y' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns = ['_build'] # The reST default role (used for this markup: `text`) to use for all documents. #default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. #add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). #add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. #show_authors = False # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' # A list of ignored prefixes for module index sorting. #modindex_common_prefix = [] # -- Options for HTML output --------------------------------------------------- # The theme to use for HTML and HTML Help pages. 
See the documentation for # a list of builtin themes. html_theme = 'default' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. #html_theme_options = {} # Add any paths that contain custom themes here, relative to this directory. #html_theme_path = [] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". #html_title = None # A shorter title for the navigation bar. Default is the same as html_title. #html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. #html_logo = None # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. #html_favicon = None # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. #html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. #html_use_smartypants = True # Custom sidebar templates, maps document names to template names. #html_sidebars = {} # Additional templates that should be rendered to pages, maps page names to # template names. #html_additional_pages = {} # If false, no module index is generated. #html_domain_indices = True # If false, no index is generated. html_use_index = False # If true, the index is split into individual pages for each letter. #html_split_index = False # If true, links to the reST sources are added to the pages. #html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. #html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. #html_show_copyright = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. #html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). #html_file_suffix = None # Output file base name for HTML help builder. htmlhelp_basename = 'crochetdoc' # -- Options for LaTeX output -------------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). #'papersize': 'letterpaper', # The font size ('10pt', '11pt' or '12pt'). #'pointsize': '10pt', # Additional stuff for the LaTeX preamble. #'preamble': '', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, documentclass [howto/manual]). latex_documents = [ ('index', 'Crochet.tex', u'Crochet Documentation', u'Itamar Turner-Trauring', 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. #latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. #latex_use_parts = False # If true, show page references after internal links. #latex_show_pagerefs = False # If true, show URL addresses after external links. #latex_show_urls = False # Documents to append as an appendix to all manuals. 
#latex_appendices = [] # If false, no module index is generated. #latex_domain_indices = True # -- Options for manual page output -------------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ ('index', 'crochet', u'Crochet Documentation', [u'Itamar Turner-Trauring'], 1) ] # If true, show URL addresses after external links. #man_show_urls = False # -- Options for Texinfo output ------------------------------------------------ # Grouping the document tree into Texinfo files. List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ ('index', 'Crochet', u'Crochet Documentation', u'Itamar Turner-Trauring', 'Crochet', 'One line description of project.', 'Miscellaneous'), ] # Documents to append as an appendix to all manuals. #texinfo_appendices = [] # If false, no module index is generated. #texinfo_domain_indices = True # How to display URL addresses: 'footnote', 'no', or 'inline'. #texinfo_show_urls = 'footnote' crochet-1.4.0/docs/news.rst0000664000175000017500000001071412522533740017303 0ustar itamarstitamarst00000000000000What's New ========== 1.4.0 ^^^^^ New features: * Added support for Python 3.4. Documentation: * Added a section on known issues and workarounds. Bug fixes: * Main thread detection (used to determine when Crochet should shutdown) is now less fragile. This means Crochet now supports more environments, e.g. uWSGI. Thanks to Ben Picolo for the patch. 1.3.0 ^^^^^ Bug fixes: * It is now possible to call ``EventualResult.wait()`` (or functions wrapped in ``wait_for``) at import time if another thread holds the import lock. Thanks to Ken Struys for the patch. 1.2.0 ^^^^^ New features: * ``crochet.wait_for`` implements the timeout/cancellation pattern documented in previous versions of Crochet. ``crochet.wait_for_reactor`` and ``EventualResult.wait(timeout=None)`` are now deprecated, since lacking timeouts they could potentially block forever. * Functions wrapped with ``wait_for`` and ``run_in_reactor`` can now be accessed via the ``wrapped_function`` attribute, to ease unit testing of the underlying Twisted code. API changes: * It is no longer possible to call ``EventualResult.wait()`` (or functions wrapped with ``wait_for``) at import time, since this can lead to deadlocks or prevent other threads from importing. Thanks to Tom Prince for the bug report. Bug fixes: * ``warnings`` are no longer erroneously turned into Twisted log messages. * The reactor is now only imported when ``crochet.setup()`` or ``crochet.no_setup()`` are called, allowing daemonization if only ``crochet`` is imported (http://tm.tl/7105). Thanks to Daniel Nephin for the bug report. Documentation: * Improved motivation, added contact info and news to the documentation. * Better example of using Crochet from a normal Twisted application. 1.1.0 ^^^^^ Bug fixes: * ``EventualResult.wait()`` can now be used safely from multiple threads, thanks to Gavin Panella for reporting the bug. * Fixed reentrancy deadlock in the logging code caused by http://bugs.python.org/issue14976, thanks to Rod Morehead for reporting the bug. * Crochet now installs on Python 3.3 again, thanks to Ben Cordero. * Crochet should now work on Windows, thanks to Konstantinos Koukopoulos. * Crochet tests can now run without adding its absolute path to PYTHONPATH or installing it first. Documentation: * ``EventualResult.original_failure`` is now documented. 
1.0.0 ^^^^^ Documentation: * Added section on use cases and alternatives. Thanks to Tobias Oberstein for the suggestion. Bug fixes: * Twisted does not have to be pre-installed to run ``setup.py``, thanks to Paul Weaver for bug report and Chris Scutcher for patch. * Importing Crochet does not have side-effects (installing reactor event) any more. * Blocking calls are interrupted earlier in the shutdown process, to reduce scope for deadlocks. Thanks to rmorehead for bug report. 0.9.0 ^^^^^ New features: * Expanded and much improved documentation, including a new section with design suggestions. * New decorator ``@wait_for_reactor`` added, a simpler alternative to ``@run_in_reactor``. * Refactored ``@run_in_reactor``, making it a bit more responsive. * Blocking operations which would otherwise never finish due to reactor having stopped (``EventualResult.wait()`` or ``@wait_for_reactor`` decorated call) will be interrupted with a ``ReactorStopped`` exception. Thanks to rmorehead for the bug report. Bug fixes: * ``@run_in_reactor`` decorated functions (or rather, their generated wrapper) are interrupted by Ctrl-C. * On POSIX platforms, a workaround is installed to ensure processes started by `reactor.spawnProcess` have their exit noticed. See `Twisted ticket 6378`_ for more details about the underlying issue. .. _Twisted ticket 6378: http://tm.tl/6738 0.8.1 ^^^^^ * ``EventualResult.wait()`` now raises error if called in the reactor thread, thanks to David Buchmann. * Unittests are now included in the release tarball. * Allow Ctrl-C to interrupt ``EventualResult.wait(timeout=None)``. 0.7.0 ^^^^^ * Improved documentation. 0.6.0 ^^^^^ * Renamed ``DeferredResult`` to ``EventualResult``, to reduce confusion with Twisted's ``Deferred`` class. The old name still works, but is deprecated. * Deprecated ``@in_reactor``, replaced with ``@run_in_reactor`` which doesn't change the arguments to the wrapped function. The deprecated API still works, however. * Unhandled exceptions in ``EventualResult`` objects are logged. * Added more examples. * ``setup.py sdist`` should work now. 0.5.0 ^^^^^ * Initial release. crochet-1.4.0/docs/workarounds.rst0000664000175000017500000000634012522533740020705 0ustar itamarstitamarst00000000000000Known Issues and Workarounds ---------------------------- Preventing Deadlocks on Shutdown ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ To ensure a timely process exit, during reactor shutdown Crochet will try to interrupt calls to ``EventualResult.wait()`` or functions decorated with ``@wait_for`` with a ``crochet.ReactorStopped`` exception. This is still not a complete solution, unfortunately. If you are shutting down a thread pool as part of Twisted's reactor shutdown, this will wait until all threads are done. If you're blocking indefinitely, this may rely on Crochet interrupting those blocking calls... but Crochet's shutdown may be delayed until the thread pool finishes shutting down, depending on the ordering of shutdown events. The solution is to interrupt all blocking calls yourself. You can do this by firing or canceling any ``Deferred`` instances you are waiting on as part of your application shutdown, and do so before you stop any thread pools. Reducing Twisted Log Messages ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Twisted can be rather verbose with its log messages. If you wish to reduce the message flow you can limit them to error messages only: .. 
code-block:: python import logging logging.getLogger('twisted').setLevel(logging.ERROR) Missing Tracebacks ^^^^^^^^^^^^^^^^^^ In order to prevent massive memory leaks, Twisted currently wipes out the traceback from exceptions it captures (see https://tm.tl/7873 for ideas on improving this). This means that often exceptions re-raised by Crochet will be missing their tracebacks. You can however get access to a string version of the traceback, suitable for logging, from ``EventualResult`` objects returned by ``@run_in_reactor``\-wrapped functions: .. code-block:: python from crochet import run_in_reactor, TimeoutError @run_in_reactor def download_page(url): from twisted.web.client import getPage return getPage(url) result = download_page("https://github.com") try: page = result.wait(timeout=1000) except TimeoutError: # Handle timeout ... except: # Something else happened: print(result.original_failure().getTraceback()) uWSGI, multiprocessing, Celery ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ uWSGI, the standard library ``multiprocessing.py`` library and Celery by default use ``fork()`` without ``exec()`` to create child processes on Unix systems. This means they effectively clone a running parent Python process, preserving all existing imported modules. This is a fundamentally broken thing to do, e.g. it breaks the standard library's ``logging`` package. It also breaks Crochet. You have two options for dealing with this problem. The ideal solution is to avoid this "feature": uWSGI Use the ``--lazy-apps`` command-line option. ``multiprocessing.py`` Use the ``spawn`` (or possibly ``forkserver``) start methods when using Python 3. See https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods for more details. Alternatively, you can ensure you only start Crochet inside the child process: uWSGI Only run ``crochet.setup()`` inside the WSGI application function. ``multiprocessing.py`` Only run ``crochet.setup()`` in the child process. Celery Only run ``crochet.setup()`` inside tasks. crochet-1.4.0/setup.py0000664000175000017500000000271412516747054016370 0ustar itamarstitamarst00000000000000import os try: from setuptools import setup except ImportError: from distutils.core import setup import versioneer versioneer.versionfile_source = 'crochet/_version.py' versioneer.versionfile_build = 'crochet/_version.py' versioneer.tag_prefix = '' # tags are like 1.2.0 versioneer.parentdir_prefix = 'crochet-' def read(path): """ Read the contents of a file. """ with open(path) as f: return f.read() setup( classifiers=[ 'Intended Audience :: Developers', 'License :: OSI Approved :: MIT License', 'Operating System :: OS Independent', 'Programming Language :: Python', 'Programming Language :: Python :: 2.6', 'Programming Language :: Python :: 2.7', 'Programming Language :: Python :: 3.3', 'Programming Language :: Python :: 3.4', 'Programming Language :: Python :: Implementation :: CPython', 'Programming Language :: Python :: Implementation :: PyPy', ], name='crochet', version=versioneer.get_version(), cmdclass=versioneer.get_cmdclass(), description="Use Twisted anywhere!", install_requires=[ "Twisted>=11.1", ], keywords="twisted threading", license="MIT", packages=["crochet", "crochet.tests"], url="https://github.com/itamarst/crochet", maintainer='Itamar Turner-Trauring', maintainer_email='itamar@itamarst.org', long_description=read('README.rst') + '\n' + read('docs/news.rst'), )