Send logs to RabbitMQ from Python/Django.

Overview

python-logging-rabbitmq

Logging handler that ships logs to RabbitMQ. Compatible with Django.

Installation

Install using pip.

pip install python_logging_rabbitmq

Versions

| Version | Dependency |
| --- | --- |
| >= 2.x | pika == 0.13 |
| <= 1.1.1 | pika <= 0.10 |

Handlers

This package has two built-in handlers that you can import as follows:

from python_logging_rabbitmq import RabbitMQHandler

or (thanks to @wallezhang)

from python_logging_rabbitmq import RabbitMQHandlerOneWay
| Handler | Description |
| --- | --- |
| RabbitMQHandler | Basic handler for sending logs to RabbitMQ. Every record is delivered directly to RabbitMQ using the configured exchange. |
| RabbitMQHandlerOneWay | High-throughput handler. It keeps an internal queue where logs are stored temporarily, and a worker thread delivers them to RabbitMQ using the configured exchange. Your app doesn't need to wait until the log is delivered. Note that if the main thread dies you might lose logs. |
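
Because RabbitMQHandlerOneWay delivers records from a background thread, a short-lived script may exit before its internal queue is drained. A minimal sketch (assuming a broker on localhost and using a crude pause before exit):

import logging
import time
from python_logging_rabbitmq import RabbitMQHandlerOneWay

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)

# Records are queued in memory and delivered to RabbitMQ by a worker thread.
logger.addHandler(RabbitMQHandlerOneWay(host='localhost', port=5672))

logger.debug('queued and delivered asynchronously')
time.sleep(1)  # crude pause so the worker thread can flush before the process exits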

Standalone Python

To use it with standalone Python, first create a logger for your app, then create an instance of the handler and add it to that logger.

import logging
from python_logging_rabbitmq import RabbitMQHandler

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)

rabbit = RabbitMQHandler(host='localhost', port=5672)
logger.addHandler(rabbit)

logger.debug('test debug')

As a result, a message similar to the following will be sent to RabbitMQ:

{
	"relativeCreated":280.61580657958984,
	"process":13105,
	"args":[],
	"module":"test",
	"funcName":"<module>",
	"host":"albertomr86-laptop",
	"exc_text":null,
	"name":"myapp",
	"thread":140032818181888,
	"created":1482290387.454017,
	"threadName":"MainThread",
	"msecs":454.01692390441895,
	"filename":"test.py",
	"levelno":10,
	"processName":"MainProcess",
	"pathname":"test.py",
	"lineno":11,
	"msg":"test debug",
	"exc_info":null,
	"levelname":"DEBUG"
}

Sending logs

By default, logs are sent to RabbitMQ using the exchange 'log', which should be of type topic. The routing key is formed by concatenating the logger name and the log level. For example:

import logging
from python_logging_rabbitmq import RabbitMQHandler

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)
logger.addHandler(RabbitMQHandler(host='localhost', port=5672))

logger.info('test info')
logger.debug('test debug')
logger.warning('test warning')

The messages will be sent using the following routing keys:

  • myapp.INFO
  • myapp.DEBUG
  • myapp.WARNING

For an explanation of topics and routing keys, see https://www.rabbitmq.com/tutorials/tutorial-five-python.html
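
As a rough sketch of the consuming side (an illustration only, not part of this package, written against the pika >= 1.0 API), you could bind a queue to the 'log' exchange and receive everything routed as myapp.*:

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost', port=5672))
channel = connection.channel()

# Declare the exchange only if it doesn't exist yet; properties must match the producer's declaration.
channel.exchange_declare(exchange='log', exchange_type='topic')

# Temporary, exclusive queue bound to every 'myapp.<LEVEL>' routing key.
queue_name = channel.queue_declare(queue='', exclusive=True).method.queue
channel.queue_bind(exchange='log', queue=queue_name, routing_key='myapp.*')

def on_message(ch, method, properties, body):
	print(method.routing_key, body)

channel.basic_consume(queue=queue_name, on_message_callback=on_message, auto_ack=True)
channel.start_consuming()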

When creating the handler, you can specify different parameters to connect to RabbitMQ or to configure the handler's behavior.

Overriding routing-key creation

If you wish to override the routing-key format entirely, you can pass a routing_key_formatter function that takes a LogRecord object and returns the routing key. For example:

RabbitMQHandler(
	host='localhost',
	port=5672,
	routing_key_formatter=lambda r: (
		'some_exchange_prefix.{}'.format(r.levelname.lower())
	)
)
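
Alternatively, if you only need to tweak the default pattern, the routing_key_format parameter (see the table below) takes a template with the {name} and {level} placeholders. A minimal sketch (the prefix is illustrative):

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	routing_key_format='logs.{name}.{level}'
)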

Configuration

These are the allowed configuration parameters:

| Parameter | Description | Default |
| --- | --- | --- |
| host | RabbitMQ server hostname or IP address. | localhost |
| port | RabbitMQ server port. | 5672 |
| username | Username for authentication. | None |
| password | Password for the given username. | None |
| exchange | Name of the exchange used to publish the logs. This exchange is considered to be of type topic. | log |
| declare_exchange | Whether or not to declare the exchange. | False |
| routing_key_format | Customize how messages are routed to the queues. | {name}.{level} |
| routing_key_formatter | Customize how the routing key is constructed. | None |
| connection_params | Extra parameters used to connect to RabbitMQ. | None |
| formatter | Custom formatter for the logs. | python_logging_rabbitmq.JSONFormatter |
| close_after_emit | Close the active connection after sending a log. A new connection is opened for the next log. | False |
| fields | Dict of fields added to every log sent to RabbitMQ. Useful when you want the same fields in each log without passing them every time. | None |
| fields_under_root | When True, each key in 'fields' is added as a top-level entry in the log; otherwise they are nested under the key 'fields'. | True |
| message_headers | A dictionary of headers to be published with the message. | None |
| record_fields | A set of attributes of the record object that should be preserved. | None |
| exclude_record_fields | A set of attributes of the record object that should be ignored. | None |
| heartbeat | Lower bound for the heartbeat timeout (in seconds). | 60 |

Examples

RabbitMQ Connection

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	username='guest',
	password='guest',
	connection_params={
		'virtual_host': '/',
		'connection_attempts': 3,
		'socket_timeout': 5000
	}
)
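
Message headers

A sketch attaching AMQP headers to every published record (the header names and values are illustrative):

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	message_headers={
		'service': 'MyApp',
		'environment': 'production'
	}
)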

Custom fields

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	fields={
		'source': 'MyApp',
		'env': 'production'
	},
	fields_under_root=True
)
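
Record filtering

A sketch trimming the payload with record_fields (keep only these attributes) or exclude_record_fields (drop these attributes); the attribute sets are illustrative:

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	record_fields={'name', 'levelname', 'msg', 'created'}
	# or, instead, drop only the noisy attributes:
	# exclude_record_fields={'args', 'exc_info', 'relativeCreated'}
)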

Custom formatter

By default, python_logging_rabbitmq uses its own JSONFormatter, but if you prefer to format your own messages you can do it as follows:

import logging
from python_logging_rabbitmq import RabbitMQHandler

FORMAT = '%(asctime)-15s %(message)s'
formatter = logging.Formatter(fmt=FORMAT)
rabbit = RabbitMQHandler(formatter=formatter)

For a custom JSON formatter, take a look at https://github.com/madzak/python-json-logger
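
For instance, a rough sketch plugging python-json-logger's formatter into the handler (assuming that package is installed):

import logging
from pythonjsonlogger import jsonlogger  # pip install python-json-logger
from python_logging_rabbitmq import RabbitMQHandler

formatter = jsonlogger.JsonFormatter('%(name)s %(levelname)s %(asctime)s %(message)s')
rabbit = RabbitMQHandler(host='localhost', port=5672, formatter=formatter)

logger = logging.getLogger('myapp')
logger.setLevel(logging.INFO)
logger.addHandler(rabbit)
logger.info('formatted by python-json-logger')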

Django

To use it with Django, add the handler to the logging config.

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}
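
With this config in place, any module can grab the 'myapp' logger as usual; a minimal sketch (the view is hypothetical):

# myapp/views.py
import logging

from django.http import HttpResponse

logger = logging.getLogger('myapp')

def health(request):
	# Routed to the 'rabbit' handler declared in LOGGING.
	logger.info('health check from %s', request.META.get('REMOTE_ADDR'))
	return HttpResponse('ok')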

Configuration

Just as with standalone Python, you can configure the handler directly when declaring it in the config:

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'port': 5672,
			'username': 'guest',
			'password': 'guest',
			'exchange': 'log',
			'declare_exchange': False,
			'connection_params': {
				'virtual_host': '/',
				'connection_attempts': 3,
				'socket_timeout': 5000
			},
			'fields': {
				'source': 'MainAPI',
				'env': 'production'
			},
			'fields_under_root': True
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

Custom formatter

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'formatters': {
		'standard': {
			'format': '%(levelname)-8s [%(asctime)s]: %(message)s'
		}
	},
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'formatter': 'standard'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

JSON formatter

pip install python-json-logger

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'formatters': {
		'json': {
			'()': 'pythonjsonlogger.jsonlogger.JsonFormatter',
			'fmt': '%(name)s %(levelname)s %(asctime)s %(message)s'
		}
	},
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'formatter': 'json'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

Releases

| Date | Version | Notes |
| --- | --- | --- |
| Mar 10, 2019 | 1.1.1 | Removed direct dependency with Django. Integration with Travis CI. Configuration for tests. Using pipenv. |
| May 04, 2018 | 1.0.9 | Fixed exchange_type parameter in channel.exchange_declare (Thanks to @cklos). |
| Mar 21, 2018 | 1.0.8 | Allowing message headers (Thanks to @merretbuurman). |
| May 15, 2017 | 1.0.7 | Adding support to customize the routing_key (Thanks to @hansyulian). |
| Mar 30, 2017 | 1.0.6 | Fix compatibility with python3 in RabbitMQHandlerOneWay (by @sactre). |
| Mar 28, 2017 | 1.0.5 | Explicit local imports. |
| Mar 16, 2017 | 1.0.4 | Added new handler RabbitMQHandlerOneWay (by @wallezhang). |
| Mar 14, 2017 | 1.0.3 | Added config parameter close_after_emit. |
| Dec 21, 2016 | 1.0.2 | Minor fixes. |
| Dec 21, 2016 | 1.0.1 | Minor fixes. |
| Dec 21, 2016 | 1.0.0 | Initial release. |

What's next?

  • Let's talk about tests.
  • Issues, pull requests, suggestions are welcome.
  • Fork and improve it. Free for all.

Similar efforts

Comments
  • TypeError: unexpected kwargs: {'heartbeat_interval': 0}

    I always get this error.

    The error was from line 101 in handlers.py, but I think it is caused by line 62:
    self.connection_params.update(dict(host=host, port=port, heartbeat_interval=0))

    Just go through this Pika Documentation

    connection_params does not have heartbeat_interval

    bug 
    opened by raj-kiran-p 7
  • fix: handle thread shutdown

    Introduce two events (stopping, stopped) to interlock with the worker thread and cause a graceful shutdown.

    Add a timeout to the Queue get of 10s, this means that a graceful shutdown will not be instantaneous.

    Switch to del on the Pika blocking channels.

    opened by donbowman 4
  • RabbitMQ server closes the connection because not receiving heartbeat

    Hi Albert, this is similar to https://github.com/pika/pika/issues/1104. After digging into pika and RabbitMQ, I find that with BlockingConnection, pika will not automatically send out the heartbeat. The heartbeat event is only handled/sent in "start_consuming" and "process_data_events". For a consumer we use "start_consuming", so there is no such issue. But for a producer we normally won't call "process_data_events" explicitly; it is only called when we call "basic_publish". Let's say we set "heartbeat" to 20s: if we don't log any message within 3x10s, the server will close the connection. (Different versions of RabbitMQ might behave differently; some might take 3x20s.) I didn't see anyone report this issue or talk about it on the internet, so I'm not sure whether my understanding is correct. Looking forward to your response. Thanks in advance.

    bug wip 
    opened by yuanli-cn 4
  • Standalone not working

    Hello everybody,

    I'm trying to use your lib in my Python app. We're not using Django, and this error is raised:

    Traceback (most recent call last):
      File "/home/vgaugry/darwin/sms_v2_tools/sms_v2_tools/custom_logger/test.py", line 1, in <module>
        import DarwinLogger
      File "/home/vgaugry/darwin/sms_v2_tools/sms_v2_tools/custom_logger/DarwinLogger.py", line 4, in <module>
        from python_logging_rabbitmq import RabbitMQHandlerOneWay
      File "/home/vgaugry/.virtualenvs/sms-v2_env/local/lib/python2.7/site-packages/python_logging_rabbitmq/__init__.py", line 2, in <module>
        from .formatters import JSONFormatter  # noqa: F401
      File "/home/vgaugry/.virtualenvs/sms-v2_env/local/lib/python2.7/site-packages/python_logging_rabbitmq/formatters.py", line 5, in <module>
        from django.core.serializers.json import DjangoJSONEncoder
    ImportError: No module named django.core.serializers.json

    I simply followed the "standalone" part of the readme. Is this normal, or am I doing something wrong?

    Thx !

    bug 
    opened by Travincebarker 2
  • wait for logs to be sent in RabbitMQHandlerOneWay before exiting python ?

    Hi,

    Thank you for your great package.

    Is there any way to wait for logs to be sent in RabbitMQHandlerOneWay before exiting Python? A naive method would be to wait a few seconds (time.sleep(2)), but maybe there is a better approach.

    Thanks a lot.

    enhancement planning 
    opened by BenjaminSchmitt 2
  • Unconfigurable Routing Key Format

    I need to be able to change the routing key format in my system, so I would prefer that this file, python_logging_rabbitmq/handlers.py:

    line 115:

                routing_key ="{name}.{level}".format(name=record.name, level=record.levelname)
    

    to be changed to:

    line 14:

                ROUTING_KEY_FORMAT = "{name}.{level}"
    

    line 115:

                routing_key = self.ROUTING_KEY_FORMAT.format(name=record.name, level=record.levelname)
    

    so it will be configurable. Thank you!

    enhancement 
    opened by hansyulian 2
  • ImportError: No module named 'compat'

    When I use the library I see an Exception:

    File "/usr/local/lib/python3.4/dist-packages/python_logging_rabbitmq/init.py", line 2, in from .formatters import JSONFormatter # noqa: F401 File "/usr/local/lib/python3.4/dist-packages/python_logging_rabbitmq/formatters.py", line 4, in from compat import json ImportError: No module named 'compat'

    Is something wrong in __init__.py?

    Regards and thank you for your library.

    bug 
    opened by sactre 2
  • Add content_type in pika.BasicProperties parameters

    https://github.com/albertomr86/python-logging-rabbitmq/blob/5d3ce4cc0b86b7303a2097d6acb46972d334e213/python_logging_rabbitmq/handlers.py#L164 The safest way would be to add content_type='STRING', but it could also be exposed as a parameter of the class method.

    wip 
    opened by TopperBG 1
  • Fix in publish(): the body is already formatted.

    In emit(), the record is formatted and then queued. The worker gets the record to be published from the queue. In publish(), that record was formatted again (a second time).

    Try a simple app like this:

    import time
    import logging
    from python_logging_rabbitmq import RabbitMQHandlerOneWay

    logger = logging.getLogger('myapp')
    logger.setLevel(logging.DEBUG)

    rabbit = RabbitMQHandlerOneWay(host='localhost', port=5672)
    logger.addHandler(rabbit)

    logger.debug('test debug')
    time.sleep(3)

    Error:

    File "python-logging-rabbitmq/python_logging_rabbitmq/formatters.py", line 22, in format
      data = record.__dict__.copy()
    AttributeError: 'str' object has no attribute '__dict__'

    opened by ghost 1
  • Returning batch of changes to upstream

    Hi, I'm pleased to say that we've been using your library in our project and it turned out very helpful. We've made some changes to fit our needs and thought we'd return them upstream; you may find them useful. In summary, we've:

    • Updated .gitignore to include broader range of Python/Vim-related files
    • Made some stylistic tweaks; sorted imports, PEP8-ified some comments
    • Added a routing_key_formatter option which allows passing a lambda that overrides routing-key creation
    • Added support for serialization of Django's requests (this means that Rabbit handlers can handle errors logged to django.requests)
    • Added record_fields and exclude_record_fields options which allow including/excluding specified LogRecord attributes (sometimes fields such as levelno are just not helpful)
    • Imported DjangoJSONEncoder in the JSON formatter in order to handle a broader range of objects (such as Decimal)
    • Updated README
    opened by IwoHerka 1
  • call of channel.exchange_declare modified

    According to the Pika source at https://github.com/pika/pika/blob/master/pika/channel.py#L658, the channel.exchange_declare method has no argument 'type'; the corresponding argument is 'exchange_type'.

    opened by cklos 1
  • fix: only mark task done when a task was dequeued

    task_done will fail if we mark a task as having finished when no task was dequeued. Since this can only happen after a task was retrieved from the queue, move the finally into an inner try so that we know task_done will work.

    Fixes #29 for the most part -- it does not address the leak regarding messages still in the queue when is_stopping is set.

    opened by klarose 0
  • Call queue.task_done() only after a successful get()

    queue.task_done() should be called only when an item was actually returned by get(). If get() raises an Empty exception, task_done() should not be called.

    Also, close the Pika connection only if it was actually opened.

    wip 
    opened by kmorwath 1
  • self.queue.task_done() can be called when no message was get due to continue executing finally block anyway leading to ValueError exception

    The changes in version 2.2 for fix #25 in python_logging_rabbitmq/handlers_oneway.py may have introduced an issue. Before, the Queue.Empty exception was never raised because record, routing_key = self.queue.get() had no timeout. Now the exception is raised if no message arrives within 10s; the exception handler calls "continue", but the "finally" block is executed anyway, so queue.task_done() can be called more times than put(), which leads to a ValueError exception.

    queue.task_done() should be called in an inner "try..finally" block, only after a message has actually been dequeued, for example:

    record, routing_key = self.queue.get(block=True, timeout=10)
    try:
        # actually got a message: try to send it
        ...
    finally:
        queue.task_done()

    Moreover, when is_stopping is set, the loop is exited before queue.task_done() is called, and messages still in the queue are not processed. If something on the other side of the queue attempts to call queue.join(), it may never return.

    opened by kmorwath 0
  • `ujson` does not support `.dumps(cls=SomeEncoder)` `cls` parameter

    As per https://github.com/esnme/ultrajson/issues/124

    If you have a package that requires ujson, it is automatically picked up by compat.py and used in JSONFormatter thereafter. Unfortunately, ujson is not fully compatible with the built-in json.dump and it does not understand the cls parameter.

    opened by EivV 1
  • SSL configuration isn't working automatically

    As a workaround I initialize the following:

    SSLOptions(ssl.SSLContext(protocol=ssl.PROTOCOL_TLSv1_2))

    and pass it in connection_params under ssl_options.

    Without a workaround I get a connection reset error.

    bug wip 
    opened by Ghost93 2