Request ID propagation for ASGI apps

Overview

ASGI Correlation ID middleware

Middleware for reading correlation IDs from request HTTP headers - or generating them when none are present - and making them available in application logs.

By default, the middleware reads correlation IDs from the Correlation-ID HTTP header, but the header name is configurable; on a platform like Heroku, for example, you'll probably want to set it to X-Request-ID instead.

In addition to adding correlation IDs to logs, the middleware supports propagating correlation IDs to Sentry events and Celery tasks. See the relevant sections below for more details.

Installation

pip install asgi-correlation-id

Setting up the middleware

Adding the middleware

The middleware can be added like this:

app = FastAPI(middleware=[Middleware(CorrelationIdMiddleware)])

or like this:

app = FastAPI()
app.add_middleware(CorrelationIdMiddleware)

For Starlette apps, just substitute Starlette for FastAPI in the examples.
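
For completeness, here is a minimal, self-contained sketch of the same setup with the imports spelled out (the Middleware wrapper comes from Starlette):

from asgi_correlation_id import CorrelationIdMiddleware
from fastapi import FastAPI
from starlette.middleware import Middleware

# Register the middleware at app construction time.
app = FastAPI(middleware=[Middleware(CorrelationIdMiddleware)])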

Middleware settings

The middleware has a few settings. These are the defaults:

class CorrelationIdMiddleware(
    header_name='Correlation-ID',
    validate_guid=True,
    uuid_length=32,
)

Each individual setting is described below:

header_name

The HTTP header key to read IDs from.

In addition to Correlation-ID, another popular choice of header name is X-Request-ID. Among other things, this is the standard request ID header on Heroku.

Defaults to Correlation-ID.

validate_guid

By default, the middleware validates that incoming correlation IDs are valid UUIDs. If validation is turned off, any string is accepted.

An invalid header value is discarded, and a fresh UUID is generated in its place.

Defaults to True.

uuid_length

Lets you optionally trim the length of correlation IDs. Probably not needed in most cases, but for local development, for example, 32-character UUIDs in every console log line can be excessive.

Defaults to 32.
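
As a hedged, illustrative example, a Heroku-style configuration combining these settings might look like this (the values are just examples):

from asgi_correlation_id import CorrelationIdMiddleware
from fastapi import FastAPI

app = FastAPI()
app.add_middleware(
    CorrelationIdMiddleware,
    header_name='X-Request-ID',  # read Heroku's request ID header instead of Correlation-ID
    uuid_length=8,  # trim IDs to 8 characters for less noisy local logs
)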

Configuring logging

To benefit from the middleware, you'll want to configure your logging setup to include the correlation ID in some form. That way, logs can be correlated to a single request - which is largely the point of the middleware.

To log the correlation ID, you simply have to add the log filter supplied by the package.

If your logging config looks something like this:

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'web': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
            'format': '%(levelname)s ... %(name)s %(message)s',
        },
    },
    'handlers': {
        'web': {
            'class': 'logging.StreamHandler',
            'formatter': 'web',
        },
    },
    'loggers': {
        'my_project': {
            'handlers': ['web'],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}

then you simply need to make these changes:

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
+   'filters': {
+       'correlation_id': {'()': 'asgi_correlation_id.CorrelationIdFilter'},
+   },
    'formatters': {
        'web': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
+           'format': '%(levelname)s ... [%(correlation_id)s] %(name)s %(message)s',
        },
    },
    'handlers': {
        'web': {
            'class': 'logging.StreamHandler',
+           'filters': ['correlation_id'],
            'formatter': 'web',
        },
    },
    'loggers': {
        'my_project': {
            'handlers': ['web'],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}

And your log output should go from this:

INFO ... project.views This is a DRF view log, and should have a GUID.
WARNING ... project.services.file Some warning in a function
INFO ... project.views This is a DRF view log, and should have a GUID.
INFO ... project.views This is a DRF view log, and should have a GUID.
WARNING ... project.services.file Some warning in a function
WARNING ... project.services.file Some warning in a function

to this:

INFO ... [773fa6885e03493498077a273d1b7f2d] project.views This is a DRF view log, and should have a GUID.
WARNING ... [773fa6885e03493498077a273d1b7f2d] project.services.file Some warning in a function
INFO ... [0d1c3919e46e4cd2b2f4ac9a187a8ea1] project.views This is a DRF view log, and should have a GUID.
INFO ... [99d44111e9174c5a9494275aa7f28858] project.views This is a DRF view log, and should have a GUID.
WARNING ... [0d1c3919e46e4cd2b2f4ac9a187a8ea1] project.services.file Some warning in a function
WARNING ... [99d44111e9174c5a9494275aa7f28858] project.services.file Some warning in a function
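
If you'd rather wire logging up in code than through dictConfig, the same filter can be attached directly to a handler. A minimal sketch, assuming the CorrelationIdFilter class is importable from the package root as shown in the v2.0.0 migration notes further down:

import logging

from asgi_correlation_id import CorrelationIdFilter

# Attaching the filter makes %(correlation_id)s available to the formatter.
handler = logging.StreamHandler()
handler.addFilter(CorrelationIdFilter(uuid_length=32))
handler.setFormatter(logging.Formatter('%(levelname)s ... [%(correlation_id)s] %(name)s %(message)s'))

logger = logging.getLogger('my_project')
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)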

Setting up Celery support

Features

What we call "Celery support" is really two distinct features. If you don't care about the details and just want IDs in your logs, feel free to skip ahead to the implementation section.

Feature 1: Passing a correlation ID to a Celery worker

It's pretty useful to be able to discern which HTTP request spawned which background task, if that's something your app might do.

Celery already has a concept of a correlation_id, but unfortunately it's not something users can hook into. Instead, we use the before_task_publish signal to pass our correlation ID to the receiving worker via the task headers.
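
Conceptually, the publish-side hook boils down to something like the sketch below. This is a simplified illustration of the idea rather than the package's exact implementation, and it assumes the default CORRELATION_ID task-header key described under the settings further down:

from celery.signals import before_task_publish

from asgi_correlation_id import correlation_id


@before_task_publish.connect(weak=False)
def transfer_correlation_id(headers=None, **kwargs):
    # Copy the current request's correlation ID into the outgoing task headers,
    # so the worker can restore it when the task runs.
    if headers is not None:
        headers['CORRELATION_ID'] = correlation_id.get()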

Feature 2: Keeping track of what spawned a Celery worker

In the case of an HTTP request spawning a background task, we have full information about the sequence of events.

But what happens if that background task spawns more background tasks, or retries and rejections are added to the mix? As soon as more than one task is spawned, the correlation ID is reduced to an "origin ID" - the ID of the HTTP request that spawned the first worker.

Just as correlation IDs are useful for tying logs to a single HTTP request, we would like something that shows the sequence of events when things get more complicated. For this purpose, the package supplies two extra IDs through its Celery tracing log filter:

  • The worker current_id, which is a generated UUID, unique to each new worker process
  • The worker parent_id, which is either the correlation ID (when a background task was spawned from an HTTP request) or the current_id of the worker process that spawned the current worker process

So to summarize:

  • correlation_id: The ID for the originating HTTP request
  • current_id: The ID of the current worker process
  • parent_id: The ID of the former HTTP/worker process (None if there was none, as in the case of scheduled tasks)

Adding Celery event hooks

Setting up the event hooks is simple: just import configure_celery and run it during startup.

from fastapi import FastAPI

from asgi_correlation_id import configure_celery

app = FastAPI()

app.add_event_handler('startup', configure_celery)

You can look over the event hooks here.

Celery event hook settings

The setup function has a few settings. These are the defaults:

def configure_celery(
        uuid_length=32,
        log_parent=True,
        parent_header='CELERY_PARENT_ID',
        correlation_id_header='CORRELATION_ID',
) -> None:

Each individual setting is described below, and a short usage sketch follows the list:

  • uuid_length

    Lets you optionally trim the length of IDs. Probably not needed in most cases, but for local development, for example, 32-character UUIDs in every console log line can be excessive.

    Defaults to 32.

  • log_parent

    If False, only the correlation ID is passed on to worker processes spawned by HTTP requests - nothing else.

    Defaults to True.

  • parent_header

    The task-header key to store a parent_id value in. There's no need to change this unless you have other Celery signal functions interacting with task headers.

    Defaults to CELERY_PARENT_ID.

  • correlation_id_header

    The task-header key the correlation ID is stored in. It works the same way as parent_header.

    Defaults to CORRELATION_ID.
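
Putting the settings together, a short (illustrative) usage sketch:

from asgi_correlation_id import configure_celery

# Shorter IDs and no parent-ID tracking - e.g. for local development.
configure_celery(uuid_length=8, log_parent=False)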

Configuring Celery logging

If this is your logging config:

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        'correlation_id': {'()': 'asgi_correlation_id.CorrelationIdFilter'},
    },
    'formatters': {
        'web': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
            'format': '%(levelname)s ... [%(correlation_id)s] %(name)s %(message)s',
        },
    },
    'handlers': {
        'web': {
            'class': 'logging.StreamHandler',
            'filters': ['correlation_id'],
            'formatter': 'web',
        },
    },
    'loggers': {
        'my_project': {
            'handlers': ['web'],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}

You simply need to add these lines to also log the current_id and parent_id (note that the handler selection below uses sys.argv, so the module needs to import sys):

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        'correlation_id': {'()': 'asgi_correlation_id.CorrelationIdFilter'},
+       'celery_tracing': {'()': 'asgi_correlation_id.CeleryTracingIdsFilter'},
    },
    'formatters': {
        'web': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
            'format': '%(levelname)s ... [%(correlation_id)s] %(name)s %(message)s',
        },
+       'celery': {
+           'class': 'logging.Formatter',
+           'datefmt': '%H:%M:%S',
+           'format': '%(levelname)s ... [%(correlation_id)s] [%(celery_parent_id)s-%(celery_current_id)s] %(name)s %(message)s',
+       },
    },
    'handlers': {
        'web': {
            'class': 'logging.StreamHandler',
            'filters': ['correlation_id'],
            'formatter': 'web',
        },
+       'celery': {
+           'class': 'logging.StreamHandler',
+           'filters': ['correlation_id', 'celery_tracing'],
+           'formatter': 'celery',
+       },
    },
    'loggers': {
        'my_project': {
+           'handlers': ['celery' if any('celery' in i for i in sys.argv) else 'web'],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}

Though at this point it might make more sense to use a JSON formatter, since the logs will become pretty cluttered. Using the uuid_length settings for local development can also be useful.

{
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        'correlation_id': {'()': 'asgi_correlation_id.CorrelationIdFilter'},
        'celery_tracing': {'()': 'asgi_correlation_id.CeleryTracingIdsFilter'},
    },
    'formatters': {
        'dev': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
            'format': '%(levelname)s:\t\b%(asctime)s %(name)s:%(lineno)d [%(correlation_id)s] %(message)s',
        },
        'dev-celery': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
            'format': (
                '%(levelname)s:\t\b%(asctime)s %(name)s:%(lineno)d [%(correlation_id)s]'
                ' [%(celery_parent_id)s-%(celery_current_id)s] %(message)s'
            ),
        },
        'json': {
            '()': 'pythonjsonlogger.jsonlogger.JsonFormatter',
            'format': """
                asctime: %(asctime)s
                created: %(created)f
                filename: %(filename)s
                funcName: %(funcName)s
                levelname: %(levelname)s
                level: %(levelname)s
                levelno: %(levelno)s
                lineno: %(lineno)d
                message: %(message)s
                module: %(module)s
                msec: %(msecs)d
                name: %(name)s
                pathname: %(pathname)s
                process: %(process)d
                processName: %(processName)s
                relativeCreated: %(relativeCreated)d
                thread: %(thread)d
                threadName: %(threadName)s
                exc_info: %(exc_info)s
                correlation-id: %(correlation_id)s
                celery-current-id: %(celery_current_id)s
                celery-parent-id: %(celery_parent_id)s
            """,
            'datefmt': '%Y-%m-%d %H:%M:%S',
        },
    },
    'handlers': {
        'dev': {
            'class': 'logging.StreamHandler',
            'stream': 'ext://sys.stdout',
            'filters': ['correlation_id'],
            'formatter': 'dev',
        },
        'dev-celery': {
            'class': 'logging.StreamHandler',
            'stream': 'ext://sys.stdout',
            'filters': ['correlation_id', 'celery_tracing'],
            'formatter': 'dev-celery',
        },
        'json': {
            'class': 'logging.StreamHandler',
            'stream': 'ext://sys.stdout',
            'filters': ['correlation_id'],
            'formatter': 'json',
        },
    },
    'loggers': {
        'my_project': {
            'handlers': [
                'json' if settings.ENVIRONMENT != 'dev'
                else 'dev-celery' if any('celery' in i for i in sys.argv)
                else 'dev'
            ],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}
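
Two notes on the example above: the json formatter relies on the third-party python-json-logger package, which provides pythonjsonlogger.jsonlogger.JsonFormatter, and the handler selection references sys.argv and a project-specific settings object, which your own config module would need to import or define.

pip install python-json-logger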

Sentry support

If your project has sentry-sdk installed, correlation IDs are automatically added to Sentry events as a transaction_id. This is also the case for Celery tasks, if Celery support is enabled.
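
If you ever need to attach the ID to Sentry yourself - for example from a custom exception handler, or under a different key - a minimal sketch using sentry-sdk's tag API could look like this (the tag name mirrors the automatic behaviour, but is otherwise just an illustration):

import sentry_sdk

from asgi_correlation_id import correlation_id

# Tag events reported in the current scope with the request's correlation ID.
sentry_sdk.set_tag('transaction_id', correlation_id.get())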

See this blog post for more details.

Comments
  • ID generator for celery extension

    Refer to #50

    Please review and I can address any requested changes :)

    Added customizable generator options to celery extension to better match capabilities of the correlation id middleware

    • Added optional "header_key" param to asgi_correlation_id.extensions.celery.load_correlation_id
    • Added optional "generator" param to asgi_correlation_id.extensions.celery.load_correlation_id
    • Added optional "generator" param to asgi_correlation_id.extensions.celery.load_celery_current_and_parent_ids
    opened by dapryor 16
  • Avoid using log filter to modify/enrich log record

    The current implementation provides log filter classes which can be used to enrich the log records with correlation IDs. This is a surprising use of a filter since it has a side effect and modifies a log record. For example, when attaching the filter to one handler, another handler may see the modified log record.

    According to the documentation (just below https://docs.python.org/3/library/logging.html?highlight=logrecord#logging.LogRecord.getMessage), the intended pattern for custom log record generation is to use a modified log record factory function like this:

    old_factory = logging.getLogRecordFactory()
    
    def record_factory(*args, **kwargs):
        record = old_factory(*args, **kwargs)
        record.custom_attribute = 0xdecafbad
        return record
    
    logging.setLogRecordFactory(record_factory)
    

    The only annoying thing here is that this one-off setting of the factory cannot be done using logging.config.

    Short of asking app developers to bootstrap the record factory, manually, there may be an alternative whereby the ASGI middleware could be responsible for this. When using a class-based middleware implementation, the middleware's __init__() constructor may be able to set the log record factory?

    opened by faph 11
  • The headers with same key are overwritten

    Starlette responses add cookies as headers with the same key: self.raw_headers.append((b"set-cookie", cookie_val.encode("latin-1")))

    In your middleware, handle_outgoing_request overwrites headers with the same key, so only the last value is kept.

            async def handle_outgoing_request(message: Message) -> None:
                if message['type'] == 'http.response.start':
                    headers = {k.decode(): v.decode() for (k, v) in message['headers']}
                    headers[self.header_name] = correlation_id.get()
                    headers['Access-Control-Expose-Headers'] = self.header_name
                    response_headers = Headers(headers=headers)
                    message['headers'] = response_headers.raw
                await send(message)
    
    opened by lakshaythareja 11
  • correlation_id.get got empty in fastapi exception_handler

    I want to use correlation_id.get() to get a trace_id in my exception_handler, but it comes back empty. Is something wrong?

    full code:

    import uvicorn
    
    from asgi_correlation_id import CorrelationIdMiddleware, correlation_id
    from fastapi import FastAPI, Request, status
    from fastapi.responses import JSONResponse
    
    app = FastAPI()
    
    @app.exception_handler(Exception)
    async def unicorn_exception_handler(request: Request, exception: Exception):
        trace_id = correlation_id.get() or 'ccc'
        return JSONResponse(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            content={"code": config.errcode.INSIDE_ERROR, "msg": "error.", "data": [], "trace": trace_id}
        )
    
    app.add_middleware(
        CorrelationIdMiddleware,
        header_name="X-Trace"
    )
    
    @app.get("/items")
    async def read_item(request: Request):
        trace_id = correlation_id.get()
        print(trace_id)
        raise AttributeError
        return {"item_id": "100", "trace": trace_id}
    
    if __name__ == '__main__':
        uvicorn.run(app='main:app', host="0.0.0.0", port=8000, reload=True)
    

    request:

    curl   -XGET  http://<server-ip>:8000/items
    

    response:

    Http Code: 500
    {"code":99999,"msg":"error.","data":[],"trace":"ccc"}
    

    Thanks

    opened by liyongzhezz 9
  • Correlation-id header doesn't propagate to outgoing requests.

    As in a typical microservice architecture, I want the correlation ID to be automatically propagated to outbound requests made from one API to other APIs. I'm using the HTTPX AsyncClient for this. But unless I explicitly add the header to the client object, the other API services don't receive it and create their own IDs, which ultimately defeats the purpose of having a correlation ID.

    opened by prashantk1220 7
  • After asgi-correlation-id is used, the log cannot be printed when creating fastapi app

    I configured loguru in fastapi according to the example, and the code is as follows.

    import logging
    import sys
    
    import uvicorn
    from asgi_correlation_id import CorrelationIdMiddleware
    from asgi_correlation_id.context import correlation_id
    from fastapi import FastAPI
    from loguru import logger
    
    logger.remove()
    
    
    def correlation_id_filter(record):
        record['correlation_id'] = correlation_id.get()
        return record['correlation_id']
    
    
    fmt = "<green>{time:YYYY-MM-DD HH:mm:ss.SSS}</green> | <level>{level: <8}</level> | <red> {correlation_id} </red> | <cyan>{name}</cyan>:<cyan>{function}</cyan>:<cyan>{line}</cyan> - <level>{message}</level>"
    
    logger.add(sys.stderr, format=fmt, level=logging.DEBUG, filter=correlation_id_filter)
    
    
    def start_app():
        logger.info('start app....')
        print('start app....')
    
    
    app = FastAPI()
    
    app.add_middleware(CorrelationIdMiddleware)
    
    app.add_event_handler('startup', start_app)
    
    
    @app.get('/')
    def index():
        logger.info(f"Request with id ")
        return 'OK'
    
    
    if __name__ == '__main__':
        uvicorn.run(app='mainw:app', host='0.0.0.0', port=8000, reload=False, debug=True)
    

    After the project starts, logger does not print the log in start_app(); only print() produces output.

    /Users/slowchen/.virtualenvs/fastapi/bin/python /Users/slowchen/workspace/FastApi/tianya-fastapi/app/mainw.py
    INFO:     Will watch for changes in these directories: ['/Users/slowchen/workspace/FastApi/tianya-fastapi/app']
    INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
    INFO:     Started reloader process [31627] using statreload
    INFO:     Started server process [31629]
    INFO:     Waiting for application startup.
    INFO:     Application startup complete.
    start app....
    INFO:     127.0.0.1:52057 - "GET / HTTP/1.1" 200 OK
    2022-03-12 13:43:30.186 | INFO     |  048a508fb369425fb25a63631899b99c  | mainw:index:37 - Request with id 
    

    How can I configure it so the log prints correctly during FastAPI startup and when adding the middleware?

    opened by slowchen 7
  • Correlation header not present in response for 500 errors

    Responses that are returned without error, or responses that are returned when an HTTPException is raised, include the correlation header name and the correlation id.

    However, when an unhandled exception is raised, for example with this code:

    @app.get("/")
    async def root():
        a = 1/0
        return {"message": "Hello!"}
    

    then a 500 error is returned to the client, and the correlation header is not included.

    It seems like it would be especially nice to have the correlation id included in the response headers for such cases.

    opened by brki 7
  • Can you please give an example of how to add parent id and current process id in celery tasks for loguru?

    I use loguru for logging and saw your example for adding the correlation ID to requests, but I'm unable to add the parent and current process IDs for Celery tasks in loguru. Can you help?

    opened by lakshaythareja 6
  • How to get Correlation ID inside the view function in FastAPI?

    Hey author, thanks for the amazing work! Our team is planning to adopt this library in our projects. I've been trying to retrieve the generated correlation ID inside a view function in FastAPI, but I'm not able to find it. I've tried querying request.headers and request.body(), and still can't find it. How can I get it?

    opened by raghavsikaria 6
  • How to configure asgi-correlation with loguru ?

    Hello there!

    Loguru seems to be gaining popularity as time goes by. I've tried to set up a very minimal example to get your library working with it, but it seems it can't find the correlation ID.

    I've tried several things, and right now I'm doing this:

    import logging
    import sys
    
    import uvicorn
    from asgi_correlation_id import CorrelationIdMiddleware
    from fastapi import FastAPI
    from loguru import logger
    
    app = FastAPI()
    
    app.add_middleware(CorrelationIdMiddleware)
    
    # remove default handler
    logger.remove()
    
    fmt = "[{time}] [application_name] [{extra[correlation_id]}] [{level}] - {name}:{function}:{line} - {message}"
    logger.add(sys.stderr, format=fmt, level=logging.DEBUG)
    
    @app.get('/')
    def index():
        logger.info(f"Request with id ")
        return 'OK'
    
    
    
    if __name__ == '__main__':
        uvicorn.run(app)
    
    
    

    And I get:

    Record was: {'elapsed': datetime.timedelta(seconds=1, microseconds=1880), 'exception': None, 'extra': {}, 'file': (name='another_main.py', path='C:\\Users\\User\\PycharmProjects\\x\\y\\another_main.py'), 'function': 'index', 'level': (name='INFO', no=20, icon='ℹ️'), 'line': 21, 'message': 'Request with id ', 'module': 'another_main', 'name': '__main__', 'process': (id=3412, name='MainProcess'), 'thread': (id=9164, name='asyncio_0'), 'time': datetime(2021, 11, 17, 15, 7, 54, 443182, tzinfo=datetime.timezone(datetime.timedelta(seconds=3600), 'Paris, Madrid'))}
    Traceback (most recent call last):
      File "C:\Users\User\PycharmProjects\x\venv\lib\site-packages\loguru\_handler.py", line 155, in emit
        formatted = precomputed_format.format_map(formatter_record)
    KeyError: 'correlation_id'
    --- End of logging error ---
    

    Got any ideas? Thanks!

    opened by sorasful 6
  • How to pass correlation_id to tasks executed in a multithreaded environment?

    EDIT: Changed the name of the issue for better searchability; you can find the solution to the question here


    Hey there!

    I feel pretty stupid asking this question, but can you explain to me how I should create my logger instance to have a correlation_id?

    Currently I create my logger at the top of a router file:

    import logging
    from fastapi import APIRouter, HTTPException
    
    LOG = logging.getLogger(__name__)
    
    router = APIRouter(prefix="/my/route", responses={404: {"description": "Not found"}})
    
    @router.get("/")
    def handler():
       LOG.info("Hello!")
    

    And I get

    [2022-07-19T20:37:48] INFO [None] path.to.module | Hello
    

    when my logging configuration is as follows:

        "formatters": {
            "default": {
                "format": "[%(asctime)s] %(levelname)s [%(correlation_id)s] %(name)s | %(message)s",
                "datefmt": "%Y-%m-%dT%H:%M:%S",
            }
        },
    
        app.add_middleware(
            CorrelationIdMiddleware,
            header_name='X-Request-ID',
            generator=lambda: uuid4().hex,
            validator=is_valid_uuid4,
            transformer=lambda a: a,
        )
     
    

    -- I would like to have my correlation_id show up in my log like so:

    [2022-07-19T20:37:48] INFO [8fe9728a] path.to.module | Hello
    

    I can't find anything about it in either the Starlette or FastAPI documentation. It's like everybody knows this and it's not worth mentioning 🤔

    Can you give me an example of how I should get a logger instance to have the request id show up?

    Thanks for your help!

    opened by philippefutureboy 5
Releases(v3.2.1)
  • v3.2.1(Nov 18, 2022)

    What's Changed

    • chore: allow newer starlette versions by @JonasKs in https://github.com/snok/asgi-correlation-id/pull/58
      • thanks to @greenape for his contributions in #56

    Full Changelog: https://github.com/snok/asgi-correlation-id/compare/v3.2.0...v3.2.1

  • v3.2.0(Oct 14, 2022)

    What's Changed

    • Contextvars added to package __init__ by @Bobronium in https://github.com/snok/asgi-correlation-id/pull/54

    New Contributors

    • @Bobronium made their first contribution in https://github.com/snok/asgi-correlation-id/pull/54

    Full Changelog: https://github.com/snok/asgi-correlation-id/compare/v3.1.0...v3.2.0

  • v3.1.0(Sep 29, 2022)

    What's Changed

    • docs: Add docs for how to integrate with saq by @sondrelg in https://github.com/snok/asgi-correlation-id/pull/47
    • chore: Update workflows by @sondrelg in https://github.com/snok/asgi-correlation-id/pull/48
    • chore: Upgrade to Poetry 1.2.0 by @sondrelg in https://github.com/snok/asgi-correlation-id/pull/49
    • feat: Add ability to specify celery-integration generated IDs and fix celery log filter signature inconsistency by @dapryor in https://github.com/snok/asgi-correlation-id/pull/51

    New Contributors

    • @dapryor made their first contribution in https://github.com/snok/asgi-correlation-id/pull/51

    Full Changelog: https://github.com/snok/asgi-correlation-id/compare/v3.0.1...v3.1.0

  • v3.0.1(Jul 27, 2022)

  • v3.0.0(May 18, 2022)

    Changes

    Breaking changes

    • Reworked the middleware settings (#39)

    • Reworded a warning logger (https://github.com/snok/asgi-correlation-id/commit/1883b31b0b115b2e6706c03f9bf94fadfeebca7a). This could potentially break log filters or monitoring dashboards, though it is probably a non-issue for most.

    Migration guide

    The validate_header_as_uuid middleware argument was removed.

    If your project uses validate_header_as_uuid=False, this is how the middleware configuration should change:

    app.add_middleware(
        CorrelationIdMiddleware,
        header_name='X-Request-ID',
    -    validate_header_as_uuid=False
    +    validator=None,
    )
    

    Otherwise, just make sure to remove validate_header_as_uuid if used.

    Read more about the new configuration options here.

  • v3.0.0a1(May 16, 2022)

  • v2.0.0(Apr 30, 2022)

    v2.0.0 release

    Changes

    Breaking changes

    • Drops Python 3.6
    • Old log filter factories were removed. All users will need to follow the migration guide below to upgrade.

    Non-breaking changes

    • Adds 3.11 support

    Migration guide

    The celery_tracing_id_filter and correlation_id_filter callables have been removed in the latest release.

    To upgrade, change from this log filter implementation:

    LOGGING = {
        'version': 1,
        'disable_existing_loggers': False,
        'filters': {
            'correlation_id': {'()': correlation_id_filter(uuid_length=32)},
        'celery_tracing': {'()': celery_tracing_id_filter(uuid_length=32)},
        },
        ...
    }
    

    To this one:

    LOGGING = {
        'version': 1,
        'disable_existing_loggers': False,
        'filters': {
            'correlation_id': {
                '()': 'asgi_correlation_id.CorrelationIdFilter',
                'uuid_length': 32,
            },
            'celery_tracing': {
                '()': 'asgi_correlation_id.CeleryTracingIdsFilter',
                'uuid_length': 32,
            },
        },
        ...
    }
    

    When upgrading a project which only implemented correlation_id_filter, you should expect this diff:

    LOGGING = {
        'version': 1,
        'disable_existing_loggers': False,
        'filters': {
    -        'correlation_id': {'()': correlation_id_filter(uuid_length=32)},
    +        'correlation_id': {
    +            '()': 'asgi_correlation_id.CorrelationIdFilter',
    +            'uuid_length': 32,
    +        },
        },
        ...
    }
    

    See the repository README for updated documentation.

  • v1.1.4(Mar 22, 2022)

  • v1.1.3(Mar 21, 2022)

  • v1.1.2(Nov 30, 2021)

  • v1.1.1(Nov 20, 2021)

  • v1.1.0(Nov 9, 2021)

  • v1.0.1(Nov 2, 2021)

  • v1.0.0(Nov 1, 2021)
