Asynchronous Python HTTP Requests for Humans using Twisted

Overview

Asynchronous Python HTTP Requests for Humans


Small add-on for the Python requests HTTP library. It makes use of Twisted's ThreadPool, so that the requests API returns a Deferred.

The additional API and changes are minimal and strive to avoid surprises.

The following synchronous code:

from requests import Session

session = Session()
# first request starts and blocks until finished
response_one = session.get('http://httpbin.org/get')
# second request starts once first is finished
response_two = session.get('http://httpbin.org/get?foo=bar')
# both requests are complete
print('response one status: {0}'.format(response_one.status_code))
print(response_one.content)
print('response two status: {0}'.format(response_two.status_code))
print(response_two.content)

Can be translated to be asynchronous by creating a txrequests Session and receiving a Deferred in place of a Response. The Response can be retrieved by yielding the Deferred (for example inside an inlineCallbacks-decorated function) or by adding a callback to it:

from txrequests import Session
from twisted.internet import defer

@defer.inlineCallbacks
def main():
    # use the with statement to clean up the session's thread pool and connection pool after use
    # you can also call session.close() explicitly if you keep the session around for long-term use
    with Session() as session:
        # first request is started in background
        d1 = session.get('http://httpbin.org/get')
        # second request is started immediately
        d2 = session.get('http://httpbin.org/get?foo=bar')
        # wait for the first request to complete, if it hasn't already
        response_one = yield d1
        print('response one status: {0}'.format(response_one.status_code))
        print(response_one.content)
        # wait for the second request to complete, if it hasn't already
        response_two = yield d2
        print('response two status: {0}'.format(response_two.status_code))
        print(response_two.content)
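
The main() function above returns a Deferred, so it still needs to be run under the Twisted reactor. A minimal sketch of driving it with twisted.internet.task.react:

from twisted.internet import task

# react calls the given function with the reactor, runs until the
# returned Deferred fires, then shuts the reactor down
task.react(lambda reactor: main())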

By default a ThreadPool is created with a maximum of 4 worker threads. If you would like to adjust that value or share a thread pool across multiple sessions, you can provide one to the Session constructor.

from twisted.python.threadpool import ThreadPool
from txrequests import Session

session = Session(pool=ThreadPool(maxthreads=10))
# ...

As a shortcut, if you only want to change the number of workers, you can pass minthreads and/or maxthreads directly to the Session constructor:

from txrequests import Session
session = Session(maxthreads=10)

That's it. The API of requests.Session is preserved without any modification beyond returning a Deferred rather than a Response. Exceptions raised while performing the request are delivered to the Deferred's errback.
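
For example, a connection error raised while performing a request is delivered to the Deferred's errback and re-raised at the yield point, so it can be caught with an ordinary try/except. A minimal sketch (the function name and URL are illustrative):

from txrequests import Session
from twisted.internet import defer
from requests.exceptions import RequestException

@defer.inlineCallbacks
def fetch_with_error_handling():
    with Session() as session:
        try:
            # a connection failure here goes to the Deferred's errback
            response = yield session.get('http://localhost:1/unreachable')
            print(response.status_code)
        except RequestException as e:
            print('request failed: {0}'.format(e))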

Working in the Background

There is one additional parameter to the various request functions, background_callback, which allows you to work with the Response objects in the background thread. This can be useful for shifting work out of the foreground; a simple example is JSON parsing.

from pprint import pprint
from txrequests import Session
from twisted.internet import defer

@defer.inlineCallbacks
def main():
    with Session() as session:

        def bg_cb(sess, resp):
            # parse the json storing the result on the response object
            resp.data = resp.json()
            return resp

        d = session.get('http://httpbin.org/get', background_callback=bg_cb)
        # do some other stuff, send some more requests while this one works
        response = yield d
        print('response status {0}'.format(response.status_code))
        # data will have been attached to the response object in the background
        pprint(response.data)
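
Since the request methods return ordinary Twisted Deferreds, you can also attach callbacks directly instead of yielding inside inlineCallbacks. A minimal sketch, reusing the session and bg_cb from the example above:

d = session.get('http://httpbin.org/get', background_callback=bg_cb)
# the callback receives the Response with .data attached by bg_cb
d.addCallback(lambda response: pprint(response.data))
# errors (connection failures, exceptions raised in bg_cb) end up in the errback
d.addErrback(lambda failure: print(failure))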

Installation

pip install txrequests

Credits

txrequests is based on requests-futures, by Ross McFarland.
