Module documentation

Helper functions for dealing with Twisted deferreds

Async Function aiter_errback Wraps an async iterable, calling an errback if an error is caught while iterating it. Similar to scrapy.utils.defer.iter_errback().
Function defer_fail Same as twisted.internet.defer.fail, but delays calling the errback until the next reactor loop.
Function defer_result Undocumented
Function defer_succeed Same as twisted.internet.defer.succeed, but delays calling the callback until the next reactor loop.
Function deferred_f_from_coro_f Converts a coroutine function into a function that returns a Deferred.
Function deferred_from_coro Converts a coroutine into a Deferred, or returns the object as-is if it isn't a coroutine.
Function deferred_to_future Return an asyncio.Future object that wraps d. (New in version 2.6.0.)
Function iter_errback Wraps an iterable, calling an errback if an error is caught while iterating it.
Function maybe_deferred_to_future Return d as an object that can be awaited from a Scrapy callable defined as a coroutine. (New in version 2.6.0.)
Function maybeDeferred_coro Copy of defer.maybeDeferred that also converts coroutines to Deferreds.
Function mustbe_deferred Same as twisted.internet.defer.maybeDeferred, but delays calling the callback/errback until the next reactor loop.
Function parallel Execute a callable over the objects in the given iterable, in parallel, using no more than ``count`` concurrent calls.
Function parallel_async Like parallel, but for async iterables.
Function process_chain Return a Deferred built by chaining the given callbacks.
Function process_chain_both Return a Deferred built by chaining the given callbacks and errbacks.
Function process_parallel Return a Deferred with the output of all successful calls to the given callbacks.
Class _AsyncCooperatorAdapter A class that wraps an async iterable into a normal iterator suitable for use in Cooperator.coiterate(). As it's only needed for parallel_async(), it calls the callable directly in the callback, instead of providing a more generic interface.
async def aiter_errback(aiterable: AsyncIterable, errback: Callable, *a, **kw) -> AsyncGenerator: (source)

Wraps an async iterable, calling an errback if an error is caught while iterating it. Similar to scrapy.utils.defer.iter_errback().
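A minimal, self-contained sketch of this wrapping pattern, simplified to pass the caught exception itself to the errback rather than a twisted.python.failure.Failure (names with a ``_sketch`` suffix are illustrative, not the actual implementation)::

```python
import asyncio
from typing import AsyncGenerator, AsyncIterable, Callable


async def aiter_errback_sketch(
    aiterable: AsyncIterable, errback: Callable, *a, **kw
) -> AsyncGenerator:
    # Pull items manually so an exception raised mid-iteration is routed
    # to the errback instead of escaping the generator.
    it = aiterable.__aiter__()
    while True:
        try:
            yield await it.__anext__()
        except StopAsyncIteration:
            break
        except Exception as exc:
            # Simplified: the real helper passes a twisted Failure here.
            errback(exc, *a, **kw)
            # The loop resumes; a generator that raised is now closed, so
            # the next __anext__() ends the loop via StopAsyncIteration.


async def main():
    async def numbers():
        yield 1
        yield 2
        raise RuntimeError("boom")

    errors = []
    results = [x async for x in aiter_errback_sketch(numbers(), errors.append)]
    return results, errors


results, errors = asyncio.run(main())
```

Items produced before the error are still yielded; only the failure itself is diverted to the errback.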

def defer_fail(_failure: Failure) -> Deferred: (source)

Same as twisted.internet.defer.fail, but delays calling the errback until the next reactor loop. The delay is 100ms, so the reactor has a chance to go through its readers and writers before attending to pending delayed calls; do not set the delay to zero.

def defer_result(result) -> Deferred: (source)

Undocumented

def defer_succeed(result) -> Deferred: (source)

Same as twisted.internet.defer.succeed, but delays calling the callback until the next reactor loop. The delay is 100ms, so the reactor has a chance to go through its readers and writers before attending to pending delayed calls; do not set the delay to zero.

def deferred_f_from_coro_f(coro_f: Callable[..., Coroutine]) -> Callable: (source)

Converts a coroutine function into a function that returns a Deferred. The coroutine function will be called at the time the wrapper is called, and the wrapper's arguments will be passed to it. This is useful for callback chains, as callback functions are called with the result of the previous callback.

def deferred_from_coro(o) -> Any: (source)

Converts a coroutine into a Deferred, or returns the object as-is if it isn't a coroutine.

def deferred_to_future(d: Deferred) -> Future: (source)

.. versionadded:: 2.6.0

Return an :class:`asyncio.Future` object that wraps *d*.

When :ref:`using the asyncio reactor <install-asyncio>`, you cannot await on :class:`~twisted.internet.defer.Deferred` objects from :ref:`Scrapy callables defined as coroutines <coroutine-support>`; you can only await on ``Future`` objects. Wrapping ``Deferred`` objects into ``Future`` objects allows you to wait on them::

    class MySpider(Spider):
        ...

        async def parse(self, response):
            d = treq.get('https://example.com/additional')
            additional_response = await deferred_to_future(d)

def iter_errback(iterable: Iterable, errback: Callable, *a, **kw) -> Generator: (source)

Wraps an iterable, calling an errback if an error is caught while iterating it.
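The synchronous counterpart of the ``aiter_errback`` pattern; a simplified sketch that passes the exception itself rather than a twisted Failure (the ``_sketch`` name is illustrative)::

```python
from typing import Callable, Generator, Iterable


def iter_errback_sketch(
    iterable: Iterable, errback: Callable, *a, **kw
) -> Generator:
    # Pull items one at a time so an exception raised mid-iteration is
    # routed to the errback instead of escaping the generator.
    it = iter(iterable)
    while True:
        try:
            yield next(it)
        except StopIteration:
            break
        except Exception as exc:
            # Simplified: the real helper passes a twisted Failure here.
            errback(exc, *a, **kw)
            # The loop resumes; a generator that raised is now closed, so
            # the next next() call ends the loop via StopIteration.


def broken():
    yield 1
    yield 2
    raise KeyError("oops")


errors = []
items = list(iter_errback_sketch(broken(), errors.append))
```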

def maybe_deferred_to_future(d: Deferred) -> Union[Deferred, Future]: (source)

.. versionadded:: 2.6.0

Return *d* as an object that can be awaited from a :ref:`Scrapy callable defined as a coroutine <coroutine-support>`.

What you can await in Scrapy callables defined as coroutines depends on the value of :setting:`TWISTED_REACTOR`:

- When not using the asyncio reactor, you can only await on :class:`~twisted.internet.defer.Deferred` objects.
- When :ref:`using the asyncio reactor <install-asyncio>`, you can only await on :class:`asyncio.Future` objects.

If you want to write code that uses ``Deferred`` objects but works with any reactor, use this function on all ``Deferred`` objects::

    class MySpider(Spider):
        ...

        async def parse(self, response):
            d = treq.get('https://example.com/additional')
            extra_response = await maybe_deferred_to_future(d)

def maybeDeferred_coro(f: Callable, *args, **kw) -> Deferred: (source)

Copy of defer.maybeDeferred that also converts coroutines to Deferreds.

def mustbe_deferred(f: Callable, *args, **kw) -> Deferred: (source)

Same as twisted.internet.defer.maybeDeferred, but delays calling the callback/errback until the next reactor loop.

def parallel(iterable: Iterable, count: int, callable: Callable, *args, **named) -> DeferredList: (source)

Execute a callable over the objects in the given iterable, in parallel, using no more than ``count`` concurrent calls. Taken from: https://jcalderone.livejournal.com/24285.html
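The Twisted version (per the linked post) feeds one shared iterator of work to ``count`` cooperative tasks. The same bounded-concurrency idea expressed in plain asyncio, as an illustrative analogue rather than the actual implementation::

```python
import asyncio
from typing import Callable, Iterable


async def parallel_sketch(iterable: Iterable, count: int, func: Callable, *args):
    # `count` workers share one iterator, so at most `count` calls are
    # in flight at any moment; a worker exits when the iterator is empty.
    it = iter(iterable)
    results = []

    async def worker():
        for item in it:
            results.append(await func(item, *args))

    await asyncio.gather(*(worker() for _ in range(count)))
    return results


async def square(x):
    await asyncio.sleep(0)  # yield control, as real work would
    return x * x


results = asyncio.run(parallel_sketch(range(5), 2, square))
```

Because workers interleave, completion order is not guaranteed; only the set of results is.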

def parallel_async(async_iterable: AsyncIterable, count: int, callable: Callable, *args, **named) -> DeferredList: (source)

Like parallel, but for async iterables.

def process_chain(callbacks: Iterable[Callable], input, *a, **kw) -> Deferred: (source)

Return a Deferred built by chaining the given callbacks.

def process_chain_both(callbacks: Iterable[Callable], errbacks: Iterable[Callable], input, *a, **kw) -> Deferred: (source)

Return a Deferred built by chaining the given callbacks and errbacks.

def process_parallel(callbacks: Iterable[Callable], input, *a, **kw) -> Deferred: (source)

Return a Deferred with the output of all successful calls to the given callbacks.