Overview

fastapi-cache

Introduction

fastapi-cache is a tool to cache FastAPI responses and function results, with backend support for Redis, Memcached, and Amazon DynamoDB.

Features

  • Supports Redis, Memcached, DynamoDB, and in-memory backends.
  • Easy integration with FastAPI.
  • Supports HTTP caching headers such as ETag and Cache-Control.

Requirements

  • An asyncio environment.
  • redis if you use RedisBackend.
  • memcache if you use MemcacheBackend.
  • aiobotocore if you use DynamoBackend.

Install

> pip install fastapi-cache2

or

> pip install "fastapi-cache2[redis]"

or

> pip install "fastapi-cache2[memcache]"

or

> pip install "fastapi-cache2[dynamodb]"

Usage

Quick Start

import aioredis
from fastapi import FastAPI
from starlette.requests import Request
from starlette.responses import Response

from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend
from fastapi_cache.decorator import cache

app = FastAPI()


@cache()
async def get_cache():
    return 1


@app.get("/")
@cache(expire=60)
async def index(request: Request, response: Response):
    return dict(hello="world")


@app.on_event("startup")
async def startup():
    redis = aioredis.from_url("redis://localhost", encoding="utf8", decode_responses=True)
    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")

Initialization

First, you must call FastAPICache.init in the startup event of your FastAPI app; there are several global configuration options you can pass in.
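
Besides the backend and key prefix shown in the quick start, global defaults can be passed here as well. The sketch below reuses the app object from the quick start and assumes that init also accepts expire and coder keywords mirroring the decorator parameters described in the next section (verify against your installed version); key_builder can likewise be passed here, as noted under "Custom key builder".

import aioredis

from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend
from fastapi_cache.coder import JsonCoder


@app.on_event("startup")
async def startup():
    redis = aioredis.from_url("redis://localhost", encoding="utf8", decode_responses=True)
    # prefix namespaces every cache key; expire and coder are assumed to act
    # as global defaults that individual @cache() decorators can override.
    FastAPICache.init(
        RedisBackend(redis),
        prefix="fastapi-cache",
        expire=60,
        coder=JsonCoder,
    )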

Use cache decorator

If you want to cache FastAPI responses transparently, you can use cache as a decorator placed between the router decorator and the view function, and you must pass request as a parameter of the view function.

Parameters:

  • expire (int): caching time in seconds.
  • namespace (str): namespace used to store the cache items.
  • coder: which coder to use, e.g. JsonCoder.
  • key_builder: which key builder to use; defaults to the builtin one.

If you want to use the ETag and Cache-Control features, you must also pass the response parameter.

You can also use cache as a decorator, like other caching tools, to cache the result of an ordinary function, as in the sketch below.
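
For instance, an ordinary coroutine can be cached by its arguments; the function name and return value below are purely illustrative:

from fastapi_cache.decorator import cache


@cache(namespace="math", expire=30)
async def slow_square(n: int) -> int:
    # Cached per argument set for 30 seconds, so repeated calls with the
    # same n return the cached value instead of recomputing it.
    return n * n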

Custom coder

JsonCoder is used by default. You can write a custom coder to encode and decode cached results; it just needs to inherit from fastapi_cache.coder.Coder.

@app.get("/")
@cache(expire=60, coder=JsonCoder)
async def index(request: Request, response: Response):
    return dict(hello="world")
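
As an illustration, a pickle-based coder might look like the sketch below. It assumes Coder exposes encode/decode classmethods the same way JsonCoder does, so confirm the exact hook names in fastapi_cache.coder for your installed version:

import pickle
from typing import Any

from fastapi_cache.coder import Coder


class PickleCoder(Coder):
    # Using classmethods for encode/decode mirrors JsonCoder; treat the exact
    # signatures as an assumption and check fastapi_cache.coder.
    @classmethod
    def encode(cls, value: Any) -> bytes:
        return pickle.dumps(value)

    @classmethod
    def decode(cls, value: bytes) -> Any:
        return pickle.loads(value)

It can then be passed per route with @cache(expire=60, coder=PickleCoder).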

Custom key builder

The builtin key builder is used by default. If needed, you can override it and pass your own either to cache or to FastAPICache.init to take effect globally.

from typing import Optional

from starlette.requests import Request
from starlette.responses import Response

from fastapi_cache import FastAPICache


def my_key_builder(
    func,
    namespace: Optional[str] = "",
    request: Request = None,
    response: Response = None,
    *args,
    **kwargs,
):
    prefix = FastAPICache.get_prefix()
    cache_key = f"{prefix}:{namespace}:{func.__module__}:{func.__name__}:{args}:{kwargs}"
    return cache_key

@app.get("/")
@cache(expire=60, coder=JsonCoder, key_builder=my_key_builder)
async def index(request: Request, response: Response):
    return dict(hello="world")
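
To apply a key builder globally instead of per route, pass it to FastAPICache.init as mentioned above. A sketch, reusing the aioredis connection, RedisBackend, and my_key_builder from the examples above:

@app.on_event("startup")
async def startup():
    redis = aioredis.from_url("redis://localhost", encoding="utf8", decode_responses=True)
    # my_key_builder is now used by every @cache() decorator unless overridden.
    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache", key_builder=my_key_builder)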

InMemoryBackend

InMemoryBackend stores cache data in memory and uses lazy deletion, which means that if a cached item is not accessed after it expires, it will not be deleted automatically.
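
A minimal setup sketch, assuming the import path fastapi_cache.backends.inmemory (no external service is required):

from fastapi import FastAPI

from fastapi_cache import FastAPICache
from fastapi_cache.backends.inmemory import InMemoryBackend

app = FastAPI()


@app.on_event("startup")
async def startup():
    # Expired entries are only evicted lazily, i.e. when the key is
    # accessed again after its TTL has passed.
    FastAPICache.init(InMemoryBackend(), prefix="fastapi-cache")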

License

This project is licensed under the Apache-2.0 License.

Comments
  • Make request parameter optional in user code

    Hello, this PR makes the request: Request parameter optional in user code by forcibly adding it to the wrapped function's signature. I'd like to know whether that change would be welcome, since it seems a bit hacky and adds more complexity to the code.

    opened by ThirVondukr 7
  • how to check if the data in cache is empty and then load data into the indicated cache?

    For example, how can I check whether the cache "test" used by the following function section_data contains no data, and how can I load new data into it if it is empty?

    def my_key_builder(
        func,
        namespace: Optional[str] = "",
        request: Request = None,
        response: Response = None,
        *args,
        **kwargs,
    ):
        prefix = FastAPICache.get_prefix()
        cache_key = f"{prefix}:{namespace}:{func.__module__}:{func.__name__}:{args}:{kwargs}"
        return cache_key
    
    @app.get('/data', response_model=List[Dict])
    @cache(namespace="test", expire=60, key_builder=my_key_builder)
    async def section_data():
        data = func('section')
        return data
    

    Thanks a lot

    opened by marcusau 6
  • [question] how to cache 200 http responses only?

    I'm not sure how to keep 500 and 400 errors out of the cache when using InMemoryBackend. My use case depends on an external provider, but I want to cache only successful responses; if the provider fails, I want to keep asking for the data. I tried to inject a 'no-cache' header, but request headers are immutable.

    opened by PieroValdebenito 5
  • Add dynamodb backend support

    Add support for Amazon DynamoDB backends using aiobotocore.

    Usage:

        dynamodb = DynamoBackend(table_name="your-cache", region="eu-west-1")
        await dynamodb.init()
        FastAPICache.init(dynamodb)

    opened by geo-mathijs 4
  • Error thrown: TypeError: object list can't be used in 'await' expression

    File "/usr/local/lib/python3.8/site-packages/uvicorn/protocols/http/httptools_impl.py", line 398, in run_asgi result = await app(self.scope, self.receive, self.send) File "/usr/local/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in call return await self.app(scope, receive, send) File "/usr/local/lib/python3.8/site-packages/fastapi/applications.py", line 208, in call await super().call(scope, receive, send) File "/usr/local/lib/python3.8/site-packages/starlette/applications.py", line 112, in call await self.middleware_stack(scope, receive, send) File "/usr/local/lib/python3.8/site-packages/starlette/middleware/errors.py", line 181, in call raise exc from None File "/usr/local/lib/python3.8/site-packages/starlette/middleware/errors.py", line 159, in call await self.app(scope, receive, _send) File "/usr/local/lib/python3.8/site-packages/timing_asgi/middleware.py", line 68, in call await self.app(scope, receive, send_wrapper) File "/usr/local/lib/python3.8/site-packages/starlette/middleware/cors.py", line 86, in call await self.simple_response(scope, receive, send, request_headers=headers) File "/usr/local/lib/python3.8/site-packages/starlette/middleware/cors.py", line 142, in simple_response await self.app(scope, receive, send) File "/usr/local/lib/python3.8/site-packages/starlette/exceptions.py", line 82, in call raise exc from None File "/usr/local/lib/python3.8/site-packages/starlette/exceptions.py", line 71, in call await self.app(scope, receive, sender) File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 580, in call await route.handle(scope, receive, send) File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 241, in handle await self.app(scope, receive, send) File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 52, in app response = await func(request) File "/usr/local/lib/python3.8/site-packages/fastapi/routing.py", line 226, in app raw_response = await run_endpoint_function( File "/usr/local/lib/python3.8/site-packages/fastapi/routing.py", line 159, in run_endpoint_function return await dependant.call(**values) File "/usr/local/lib/python3.8/site-packages/fastapi_cache/decorator.py", line 47, in inner ret = await func(*args, **kwargs) TypeError: object list can't be used in 'await' expression

    opened by omshankar1 4
  • Add minimum dependencies version

    At the moment it's impossible to install this package via Poetry because of a minimum-version conflict with the extras specified in pyproject.toml. It's probably best to set minimum/explicit versions and then have Dependabot test against new updates instead of using *.

    opened by rushilsrivastava 4
  • fix bug by adding utf-8

    fix File "c:\Users\jackm\GitHub\pool-api\wvenv\lib\site-packages\fastapi_cache\key_builder.py", line 21, in default_key_builder + hashlib.md5( # nosec:B303 TypeError: Unicode-objects must be encoded before hashing #19

    opened by jack60612 3
  • [Errno 111] Connection refused

    My application is fully dockerized and was working fine with FastAPI. I'm trying to add caching, but there seems to be an error.

    I'm getting an error when trying to create_redis_pool. I tried using localhost and other ports, without any luck.

    @app.on_event("startup")
    async def startup():
        redis = await aioredis.create_redis_pool("redis://127.0.0.1", encoding="utf8")
        FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")
    

    Also, the ports are mapped in docker-compose.yml.

    opened by ollita7 3
  • fastapi-cache 0.1.4: TypeError: Unicode-objects must be encoded before hashing

    Hi,

    I updated fastapi-cache from GitHub to solve the "not JSON serializable" problem, but now I have a new error:

      File "/lib/python3.9/site-packages/fastapi_cache/decorator.py", line 40, in inner
        cache_key = key_builder(
      File "/lib/python3.9/site-packages/fastapi_cache/key_builder.py", line 21, in default_key_builder
        + hashlib.md5(  # nosec:B303
    TypeError: Unicode-objects must be encoded before hashing
    

    Before (with version 1.3.4) I was using something like the code below, but with the two lines commented and returning the json_updated_user:

    @router.get("/me/items/", response_model=List[schemas.User_Pydantic])
    @cache(namespace="test", expire=60, coder=JsonCoder)
    async def read_current_user_items(user: UserDB = Depends(fastapi_users.get_current_active_user)):
        user_cards = await schemas.Card_Pydantic.from_queryset(
            Card.filter(owner_id=user.id).all())
        for city in user_cards:
            await weather_api.get_api_info_by_id(city.id)
        # user_with_items = await schemas.User_Pydantic.from_queryset(UserModel.filter(id=user.id).all())
        # json_updated_user = jsonable_encoder(user_with_items)
        return await schemas.User_Pydantic.from_queryset(UserModel.filter(id=user.id).all())
    

    Without jsonable_encoder I get the error "not JSON serializable", but now I have the hashing error. If I comment out the @cache decorator, it works fine (and I don't need the jsonable_encoder).

    Thanks in advance!

    opened by diegocgaona 3
  • Default key builder does not work with memcached

    Memcached does not allow spaces in keys. Thus, when using the default key builder with an endpoint that has dependencies, a space can be introduced when an object is represented as a string. Problem example:

    @router.get("/",
                 status_code=200,
                 response_model=list)
    async def distinct_values(field: str,
                                  es: Elasticsearch = Depends(ElasticConnector)):
    

    This results in a ValidationError.

    Proposed solution:

    def mcache_key_builder(
        func,
        namespace: Optional[str] = "",
        request: Request = None,
        response: Response = None,
        *args,
        **kwargs,
    ):
        prefix = FastAPICache.get_prefix()
        cache_key = f"{prefix}:{namespace}:{func.__module__}:{func.__name__}:{args}:{kwargs}".replace(" ","")
        return cache_key
    opened by MrAngry 3
  • Encode args and kwargs

    The args and kwargs of the decorated function are converted to strings directly, which causes issues when using ORMs like SQLAlchemy, whose Session instances have a different string representation each time. These values should be encoded via the coder.

    opened by rushilsrivastava 3
  • Transparent passthrough in the event of cache backend connection issues

    Currently, if the cache decorator is used on a function with a FastAPI path operation decorator and there are any connection issues to a backend (e.g. Redis), an unhandled exception from the backend causes an HTTP 500, meaning cache backend uptime becomes a requirement for API functionality. This PR sets a better default: if cache backend connectivity fails, the request is passed through just as it would be if the decorator weren't there (reimplementation of #14).

    @long2ice @mkdir700 could you have a look please? Hopefully a quick merge for you but this failure mode is a big blocker for me to use and evangelise this library more widely. As per #99 a new release would be awesome too!

    opened by hackjammer 1
  • when request is present, cache is disabled

    Due to this line: https://github.com/long2ice/fastapi-cache/blob/8f0920d0d7f0a34bfb8987736cf794be5e3cc33f/fastapi_cache/decorator.py#L128

    When a request is present, the cache is disabled. Why do we do this? Isn't it quite normal for a user to want a custom key builder that takes in the request object?

    I can create a pr to fix this but just wondering why the design is so in the first place.

    opened by schwannden 1
  • Feat/add@cacheable

    #96

    add @cacheable

    example:

    from fastapi_cache.decorator import cacheable
    
    @cacheable(key="xxx:{0}:{b}", expire=60)
    async def test_func(a: int, b: int) -> int:
        return a + b
    
    async def main():
        await test_func(1, 2)
        # cache_key -> 'xxx:1:2'
    
    opened by mkdir700 0
  • How to cache `StreamingResponse`?

    Hi, is there a way to cache StreamingResponse?

    Currently I get:

    ValueError: [TypeError("'async_generator' object is not iterable"), TypeError('vars() argument must have __dict__ attribute')]
    
    opened by mglowinski93 5
Releases: v0.1.9