
Overview

logo

>> A generic service to send, retry, and manage webhooks. <<


Description

What?

Hook Slinger acts as a simple service that lets you send, retry, and manage event-triggered POST requests, aka webhooks. It provides a fully self-contained Docker image that is easy to orchestrate, manage, and scale.

Why?

Technically, a webhook is a mere POST request—triggered by a system—when a particular event occurs. The following diagram shows how a simple POST request takes the webhook nomenclature when invoked by an event trigger.

Webhook Concept
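Conceptually, it boils down to this: when an event fires, make a POST request to a subscriber's URL. Here's a toy sketch of that idea (the receiver URL and event name below are made up):

# A toy sketch of the webhook concept; the receiver URL and event name are made up.
import httpx


def on_order_created(order: dict) -> None:
    # The event trigger: some system event occurred, so notify a subscriber URL.
    httpx.post(
        "https://example.com/webhook-receiver",
        json={"event": "order.created", "data": order},
    )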

However, there are a few factors that make it tricky to manage the life cycle of a webhook, such as:

  • Dealing with server failures on both the sending and the receiving end.
  • Managing HTTP timeouts.
  • Retrying the requests gracefully without overloading the recipients.
  • Avoiding retry loops on the sending side.
  • Monitoring and providing scope for manual interventions.
  • Scaling them quickly, either vertically or horizontally.
  • Decoupling webhook management logic from your primary application logic.

Properly dealing with these concerns can be cumbersome, especially when sending webhooks is just another small part of your application and you just want it to work without having to deal with all the hairy details every time. Hook Slinger aims to alleviate this pain point.

How?

Hook Slinger exposes a single endpoint where you can post your webhook payload, destination URL, and auth details; it'll then make the POST request for you asynchronously in the background. Under the hood, the service uses:

  • FastAPI to provide a Uvicorn driven ASGI server.

  • Redis and RQ for implementing message queues that provide asynchrony and a robust failure-handling mechanism.

  • Rqmonitor to provide a dashboard for monitoring the status of the webhooks and manually retrying the failed jobs.

  • Rich to make the container logs colorful and more human friendly.

The simplified app architecture looks something like this:

Topology

In the above image, the webhook payload is first sent to the app and the app leverages the worker instance to make the POST request. Redis DB is used for fast bookkeeping and async message queue implementation. The monitor instance provides a GUI to monitor and manage the webhooks. Multiple worker instances can be spawned to achieve linear horizontal scale-up.
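The app/worker split follows the usual FastAPI + RQ pattern: the web process only enqueues a delivery job onto a Redis-backed queue, and a worker process picks it up and makes the actual POST request. The snippet below is a minimal sketch of that pattern, not Hook Slinger's actual code; the queue name, connection details, and function names are illustrative:

# A minimal sketch of the FastAPI + RQ pattern described above -- not Hook Slinger's
# actual code. The queue name, connection details, and function names are illustrative.
import httpx
from fastapi import FastAPI
from redis import Redis
from rq import Queue

app = FastAPI()
queue = Queue("webhooks", connection=Redis(host="redis", port=6379))


def deliver_webhook(to_url: str, payload: dict) -> None:
    # Runs inside a worker container, not in the web process.
    httpx.post(to_url, json=payload, timeout=30)


@app.post("/hook_slinger/", status_code=202)
async def enqueue_webhook(body: dict) -> dict:
    # The web process only registers the job; delivery happens asynchronously.
    job = queue.enqueue(deliver_webhook, body["to_url"], body["payload"])
    return {"status": "queued", "job_id": job.id}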

Installation

  • Make sure you've got Docker and Docker Compose installed on your system.

  • Clone the repository and head over to the root directory.

  • To start the orchestra, run:

    make start_servers
    

    This will:

    • Start an app server that can be accessed from port 5000.

    • Start an Alpine-based Redis server that exposes port 6380.

    • Start a single worker that will carry out the actual tasks.

    • Start a rqmonitor instance that opens port 8899.

  • To shut down everything, run:

    make stop_servers
    

TODO: Generalize it more before making it installable with a docker pull command.

Usage

Exploring the Interactive API Docs

To try out the entire workflow interactively, head over to the following URL in your browser:

http://localhost:5000/docs

You should see a panel like this:

API Docs

This app implements a rudimentary token-based authentication system where you're expected to send an API token by adding an Authorization: Token field to your request headers. To do that here, click the POST /hook_slinger/ ribbon, and that will reveal the API description like this:

API Description

Copy the default token value from the description corpus, then click the green button on the top right that says Authorize, and paste the value in the prompt box. Click the Authorize button again and that'll conclude the login step. In your production application, you should implement a robust authentication system or at least change this default token.
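For reference, a rudimentary header-token check like this usually boils down to a small FastAPI dependency. The sketch below is illustrative only, not Hook Slinger's actual implementation, and the API_TOKEN environment variable name is hypothetical:

# A minimal sketch of a header-token check in FastAPI -- not Hook Slinger's actual
# implementation. The API_TOKEN environment variable name is hypothetical.
import os

from fastapi import Depends, FastAPI, Header, HTTPException

app = FastAPI()
API_TOKEN = os.environ.get("API_TOKEN", "change-me")


def verify_token(authorization: str = Header(...)) -> None:
    # Expects a header of the form: "Authorization: Token <value>".
    scheme, _, token = authorization.partition(" ")
    if scheme != "Token" or token != API_TOKEN:
        raise HTTPException(status_code=401, detail="Invalid or missing token.")


@app.post("/hook_slinger/", dependencies=[Depends(verify_token)], status_code=202)
async def hook_slinger() -> dict:
    return {"ok": True}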

To send a webhook, you'll need a URL where you'll be able to make the POST request. For this demonstration, let's pick this webhook site service to monitor the received webhooks. It gives you a unique URL against which you'll be able to make the POST requests and monitor them in a dashboard like this:

Webhook Site

On the API docs page, click the Try it out button near the request body section:

API Request

This should reveal a panel like the following one where you can make your request:

API Request

Notice that the section is prefilled with an example request payload. You can use this exact payload to make a request. Go ahead and click the Execute button. If you scroll down a little, you'll notice the HTTP response:

API Response

Now, if you head over to the webhook site URL, you should be able to see your API payload:

API Response

To monitor the webhook tasks, head over to the following URL:

http://localhost:8899/

You should be presented with a GUI like this:

RQ Monitor

If you click Workers on the left panel, you'll be presented with a panel where you can monitor all the workers:

RQ Monitor

The Jobs panel lists all the tasks, and from there you'll be able to requeue a failed job. By default, Hook Slinger retries a failed job 3 times with a 5-second linear backoff. However, this can be configured using environment variables in the .env file.

RQ Monitor
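Under the hood, this kind of retry policy maps onto RQ's Retry helper. The sketch below shows how a "3 retries with linear backoff" policy is expressed in RQ; it's illustrative only, and the queue name and stand-in function are not Hook Slinger's actual code:

# A sketch of a "3 retries with linear backoff" policy using RQ's Retry helper --
# illustrative, not Hook Slinger's actual code.
from redis import Redis
from rq import Queue, Retry

queue = Queue("webhooks", connection=Redis())

# interval=[5, 10, 15] retries after 5s, then 10s, then 15s.
queue.enqueue(
    print,  # stand-in for the real delivery function
    "Hello, world!",
    retry=Retry(max=3, interval=[5, 10, 15]),
)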

Sending A Webhook Via cURL

Run the following command on your terminal; this assumes that you haven't changed the auth token (you should):

curl -X 'POST' \
  'http://localhost:5000/hook_slinger/' \
  -H 'accept: application/json' \
  -H 'Authorization: Token $5$1O/inyTZhNvFt.GW$Zfckz9OL.lm2wh3IewTm8YJ914wjz5txFnXG5XW.wb4' \
  -H 'Content-Type: application/json' \
  -d '{
  "to_url": "https://webhook.site/37ad9530-59c3-430d-9db6-e68317321a9f",
  "to_auth": "",
  "tag": "Dhaka",
  "group": "Bangladesh",
  "payload": {
    "greetings": "Hello, world!"
  }
}' | python -m json.tool

You should expect the following output:

{
    "status": "queued",
    "ok": true,
    "message": "Webhook registration successful.",
    "job_id": "Bangladesh_Dhaka_a07ca786-0b7a-4029-bac0-9a7c6eb68a98",
    "queued_at": "2021-07-23T20:15:04.389690"
}

Sending A Webhook Via Python

For this purpose, you can use an HTTP library like httpx.

Make the request with the following script:

import asyncio
from http import HTTPStatus
from pprint import pprint

import httpx


async def send_webhook() -> None:

    wh_payload = {
        "to_url": "https://webhook.site/aa7e2e7e-a62d-4505-8879-13bd806da6d5",
        "to_auth": "",
        "tag": "Dhaka",
        "group": "Bangladesh",
        "payload": {"greetings": "Hello, world!"},
    }

    async with httpx.AsyncClient(http2=True) as session:
        headers = {
            "Content-Type": "application/json",
            "Authorization": (
                "Token $5$1O/inyTZhNvFt.GW$Zfckz9OL.lm2wh3IewTm8YJ914wjz5txFnXG5XW.wb4"
            ),
        }

        response = await session.post(
            "http://localhost:5000/hook_slinger",
            headers=headers,
            json=wh_payload,
        )

        # Hook Slinger returns http code 202, accepted, for a successful request.
        if response.status_code == HTTPStatus.ACCEPTED:
            result = response.json()
            pprint(result)


asyncio.run(send_webhook())

This should return a similar response as before:

{
    'job_id': 'Bangladesh_Dhaka_139fc35a-d2a5-4d01-a6af-e980c52f55bc',
    'message': 'Webhook registration successful.',
    'ok': True,
    'queued_at': '2021-07-23T20:15:04.389690',
    'status': 'queued'
}

Exploring the Container Logs

Hook Slinger overrides the Python root logger to give you a colorized and user-friendly logging experience. To explore the logging messages of the application server, run:

make app_logs

Notice the colorful logs cascading down from the app server:

App Logs

Now, to explore the worker instance logs, in a separate terminal, run:

make worker_logs

You should see something like this:

Worker Logs
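The colorized output comes from routing the root logger through Rich's RichHandler; a minimal sketch of that kind of setup (not Hook Slinger's exact configuration) looks like this:

# A minimal sketch of root-logger colorization with Rich -- not Hook Slinger's
# exact configuration.
import logging

from rich.logging import RichHandler

logging.basicConfig(
    level="INFO",
    format="%(message)s",
    handlers=[RichHandler()],  # replaces the default handler on the root logger
)

logging.getLogger().info("Webhook delivery queued.")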

Scaling Up the Service

Hook Slinger offers easy horizontal scale-up, powered by the docker-compose --scale command. In this case, scaling up means spawning new workers in separate containers. Let's spawn 3 worker containers this time. To do so, first shut down the orchestra by running:

make stop_servers

Now, run:

make worker_scale n=3

This will start the App server, Redis DB, RQmonitor, and 3 Worker instances. Spawning multiple worker instances is a great way to achieve job concurrency with the least amount of hassle.

Philosophy & Limitations

Hook Slinger is designed to be simple, transparent, upgradable, and easily extensible to cater to your specific needs. It's not built around AMQP compliant message queues with all the niceties and complexities that come with them—this is intentional.

Also, if you scrutinize the end-to-end workflow, you'll notice that it requires making HTTP requests from the sending service to the Hook Slinger. This inevitably adds another point of failure. However, from the sending service's POV, it's sending the HTTP requests to a single service, and the target service is responsible for fanning out the webhooks to the destinations. The developers are expected to have control over both services, which theoretically should mitigate the failures. The goal is to transfer some of the code complexity around managing webhooks from the sending service over to the Hook Slinger. Also, I'm playing around with some of the alternatives to using HTTP POST requests to send the payloads from the sending end to the Hook Slinger. Suggestions are always appreciated.
