Paprika is a Python library that reduces boilerplate. Heavily inspired by Project Lombok.

Overview

A plate filled with paprika spice. Image courtesy of Anna Quaglia (Photographer).

Paprika

Paprika is a Python library that reduces boilerplate. It is heavily inspired by Project Lombok.

Table of Contents

  • Installation
  • Usage
  • Features and Examples
  • Contributing
  • Authors
  • License

Installation

paprika is available on PyPI.

$ pip install paprika

Usage

paprika is a decorator-only library and all decorators are exposed at the top-level of the module. If you want to use shorthand notation (i.e. @data), you can import all decorators as follows:

from paprika import *

Alternatively, you can opt to use the longhand notation (i.e. @paprika.data) by importing paprika as follows:

import paprika
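
With the longhand notation, the decorators below are simply prefixed with the module name. For example, the @data decorator shown later in this README could be applied as follows (a brief sketch):

import paprika

@paprika.data
class Person:
    name: str
    age: int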

Features and Examples

Object-oriented decorators

@to_string

The @to_string decorator automatically overrides __str__

Python

class Person:
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age

    def __str__(self):
        return f"{self.__name__}@[name={self.name}, age={self.age}]"

Python with paprika

@to_string
class Person:
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age
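
For illustration, a short usage sketch; the exact string produced by @to_string is assumed here to mirror the plain-Python version above and may differ slightly:

p = Person(name="Rayan", age=19)
print(p)  # e.g. Person@[name=Rayan, age=19]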

@equals_and_hashcode

The @equals_and_hashcode decorator automatically overrides __eq__ and __hash__

Python

class Person:
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age

    def __eq__(self, other):
        return (self.__class__ == other.__class__
                and
                self.__dict__ == other.__dict__)

    def __hash__(self):
        return hash((self.name, self.age))

Python with paprika

@equals_and_hashcode
class Person:
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age
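
A brief usage sketch (assuming, as in the plain-Python version above, that equality and hashing are derived from the instance attributes):

p1 = Person(name="Rayan", age=19)
p2 = Person(name="Rayan", age=19)
print(p1 == p2)              # True
print(hash(p1) == hash(p2))  # True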

@data

The @data decorator creates a dataclass by combining @to_string and @equals_and_hashcode and automatically creating a constructor!

Python

class Person:
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age

    def __str__(self):
        return f"{self.__name__}@[name={self.name}, age={self.age}]"

    def __eq__(self, other):
        return (self.__class__ == other.__class__
                and
                self.__dict__ == other.__dict__)

    def __hash__(self):
        return hash((self.name, self.age))

Python with paprika

@data
class Person:
    name: str
    age: int
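
The generated constructor accepts the annotated fields as arguments. A quick usage sketch (keyword arguments are assumed to work here, as in the NonNull example below):

p = Person(name="Rayan", age=19)
print(p.name, p.age)                      # Rayan 19
print(p == Person(name="Rayan", age=19))  # True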

On @data and NonNull

paprika exposes a NonNull generic type that can be used in conjunction with the @data decorator to enforce that certain arguments passed to the constructor are not None. The following snippet will raise a ValueError:

@data
class Person:
    name: NonNull[str]
    age: int

p = Person(name=None, age=42)  # ValueError ❌

@singleton

The @singleton decorator can be used to enforce that a class only gets instantiated once within the lifetime of a program. Any subsequent instantiation will return the original instance.

@singleton
class Person:
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age

p1 = Person(name="Rayan", age=19)
p2 = Person()
print(p1 == p2 and p1 is p2)  # True ✅

@singleton can be seamlessly combined with @data!

@singleton
@data
class Person:
    name: str
    age: int

p1 = Person(name="Rayan", age=19)
p2 = Person()
print(p1 == p2 and p1 is p2)  # True ✅

Important note on combining @data and @singleton

When combining @singleton with @data, @singleton should come before @data. Combining them the other way around will work in most cases but is not thoroughly tested and relies on assumptions that might not hold.

General utility decorators

@threaded

The @threaded decorator will run the decorated function in a separate thread by submitting it to a ThreadPoolExecutor. When the decorated function is called, it immediately returns a Future object. The result can be extracted by calling .result() on that Future.

import threading
import time

@threaded
def waste_time(sleep_time):
    thread_name = threading.current_thread().name
    time.sleep(sleep_time)
    print(f"{thread_name} woke up after {sleep_time}s!")
    return 42

t1 = waste_time(5)
t2 = waste_time(2)

print(t1)           # <Future ...>
print(t1.result())  # 42
ThreadPoolExecutor-0_1 woke up after 2s!
ThreadPoolExecutor-0_0 woke up after 5s!

@repeat

The @repeat decorator will run the decorated function consecutively, as many times as specified.

@repeat(n=5)
def hello_world():
    print("Hello world!")

hello_world()
Hello world!
Hello world!
Hello world!
Hello world!
Hello world!

Benchmark decorators

timeit

The @timeit decorator times the total execution time of the decorated function. It uses time.perf_counter by default, but that can be replaced by any callable of type Callable[[], float].

import time

def time_waster1():
    time.sleep(2)

def time_waster2():
    time.sleep(5)

@timeit
def test_timeit():
    time_waster1()
    time_waster2()

test_timeit()
test_timeit executed in 7.002189894999999 seconds

Here's how you can replace the default timer:

@timeit(timer=lambda: 0) # Or something actually useful like time.time()
def test_timeit():
    time_waster1()
    time_waster2()

test_timeit()
test_timeit executed in 0 seconds

@access_counter

The @access_counter decorator displays a summary of how many times each of the structures passed to the decorated function is accessed (number of reads and number of writes).

@access_counter
def test_access_counter(list, dict, person, tuple):
    for i in range(500):
        list[0] = dict["key"]
        dict["key"] = person.age
        person.age = tuple[0]


test_access_counter([1, 2, 3, 4, 5], {"key": 0}, Person(name="Rayan", age=19),
                    (0, 0))
data access summary for function: test
+------------+----------+-----------+
| Arg Name   |   nReads |   nWrites |
+============+==========+===========+
| list       |        0 |       500 |
+------------+----------+-----------+
| dict       |      500 |       500 |
+------------+----------+-----------+
| person     |      500 |       500 |
+------------+----------+-----------+
| tuple      |      500 |         0 |
+------------+----------+-----------+

@hotspots

The @hotspots decorator automatically runs cProfile on the decorated function and displays the top_n (default: 10) most expensive function calls, sorted by cumulative time taken (this metric will be customisable in the future). Sampling error can be reduced by using a higher n_runs (default: 1) parameter.

def time_waster1():
    time.sleep(2)

def time_waster2():
    time.sleep(5)

@hotspots(top_n=5, n_runs=2)  # You can also do just @hotspots
def test_hotspots():
    time_waster1()
    time_waster2()

test_hotspots()
   11 function calls in 14.007 seconds

   Ordered by: cumulative time

   ncalls  tottime  percall  cumtime  percall filename:lineno(function)
        2    0.000    0.000   14.007    7.004 main.py:27(test_hot)
        4   14.007    3.502   14.007    3.502 {built-in method time.sleep}
        2    0.000    0.000   10.004    5.002 main.py:23(time_waster2)
        2    0.000    0.000    4.003    2.002 main.py:19(time_waster1)
        1    0.000    0.000    0.000    0.000 {method 'disable' of '_lsprof.Profiler' objects}

@profile

The @profile decorator is simply syntactic sugar that allows you to perform both hotspot analysis and data access analysis. Under the hood, it simply applies @access_counter followed by @hotspots.
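
A minimal sketch of how this might be used, reusing the arguments from the @access_counter example above (the function name here is illustrative, not taken from the paprika docs):

@profile
def test_profile(list, dict, person, tuple):
    for i in range(500):
        list[0] = dict["key"]
        dict["key"] = person.age
        person.age = tuple[0]

test_profile([1, 2, 3, 4, 5], {"key": 0}, Person(name="Rayan", age=19), (0, 0))
# Expected output: the data access summary table followed by the cProfile hotspot report.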

Error-handling decorators

@catch

The @catch decorator can be used to wrap a function inside a try/except block. @catch expects to receive at least one exception type that we want to catch via its exception argument.

If no exception is provided, @catch will by default catch all exceptions (excluding SystemExit, KeyboardInterrupt and GeneratorExit, since they do not subclass the generic Exception class).

@catch can take a custom exception handler as a parameter. If no handler is supplied, a stack trace is logged to stderr and the program continues executing.

@catch(exception=ValueError)
def test_catch1():
    raise ValueError

@catch(exception=[EOFError, KeyError])
def test_catch2():
    raise ValueError

test_catch1()
print("Still alive!")  # This should get printed since we're catching the ValueError.

test_catch2()
print("Still alive?")  # This will not get printed since we're not catching ValueError in this case.
Traceback (most recent call last):
  File "/Users/rayan/Desktop/paprika/paprika/__init__.py", line 292, in wrapper_catch
    return func(*args, **kwargs)
  File "/Users/rayan/Desktop/paprika/main.py", line 29, in test_exception1
    raise ValueError
ValueError

Still alive!

Traceback (most recent call last):
  File "/Users/rayan/Desktop/paprika/main.py", line 40, in 
    test_exception2()
  File "/Users/rayan/Desktop/paprika/paprika/__init__.py", line 292, in wrapper_catch
    return func(*args, **kwargs)
  File "/Users/rayan/Desktop/paprika/main.py", line 37, in test_exception2
    raise ValueError
ValueError

Using a custom exception handler

If provided, a custom exception handler must be of type Callable[[Exception], T]. In other words, its signature must take one parameter of type Exception.

@catch(exception=ValueError,
       handler=lambda x: print(f"Ohno, a {repr(x)} was raised!"))
def test_custom_handler():
    raise ValueError

test_custom_handler()
Ohno, a ValueError() was raised!

@silent_catch

The @silent_catch decorator is very similar to the @catch decorator in its usage. It takes one or more exceptions but then simply catches them silently.

@silent_catch(exception=[ValueError, TypeError])
def test_silent_catch():
    raise TypeError

test_silent_catch()
print("Still alive!")
Still alive!

Contributing

Encountered a bug? Have an idea for a new feature? This project is open to all sorts of contributions! Feel free to head to the Issues tab and describe your request!

Authors

See also the list of contributors who participated in this project.

License

This project is licensed under the MIT License - see the LICENSE.md file for details.

Comments
  • Rename `@serial` to `@pickled`

    Summary

    @serial is misleading and may be misconstrued as meaning JSON serialization. Use @pickled instead.

    While we're at it, make hypothesis generate bytes of min_size=100 during randomised testing.

    Test Plan

    Unit tests pass.

    opened by lhl2617 2
  • Tests: Revamp Github Actions

    Carried over from #11; Codecov was unhappy with that one.

    Summary

    • [x] Fetch only latest commit
    • [x] Test on both Python 3.8 and 3.9
    • [x] Test on macos-latest, windows-latest, ubuntu-20.04
    • [x] Use poetry to install dependencies

    Issues

    Unable to get coverage working. Leaving for now to solve later. See #15

    Test Plan

    See checks below.

    opened by lhl2617 2
  • Add `@serial` decorator

    Summary

    • See #10 for context
    • See changes to README.md for example.
    • Added hypothesis for property based testing

    Test plan

    • Added property-based unit tests.
    opened by lhl2617 2
  • Tests: add some tolerance in sleep_after tests

    Summary

    sleep_after tests may fail with

    AssertionError: 1.9999143000000004 not greater than or equal to 2
    

    Add 10ms of tolerance to the test. Sleeping on non-RTOSes is best-effort anyway.

    Test Plan

    python -m unittest
    

    Passes on Windows on multiple tries.

    opened by lhl2617 2
  • Tests: Revamp Github Actions

    Summary

    • [x] Fetch only latest commit
    • [x] Test on both Python 3.8 and 3.9
    • [x] Test on macos-latest, windows-latest, ubuntu-20.04
    • [x] Separate unittest and coverage workflows
    • [x] Use poetry to install dependencies

    Test Plan

    See checks below.

    opened by lhl2617 2
  • `pickle` decorator for `class`

    Is your feature request related to a problem? Please describe. Often it is required to serialise some data and later deserialise it for use. An example is during machine learning, where it would be convenient to persist train/val/test splits.

    Describe the solution you'd like A @pickle decorator for a data class.

    @pickle
    @data
    class Foo:
        x: int
    

    This should add two methods (naming tentative for now):

    def __serialize__(self, file_name: str):
        # Serializes self into a pickled format then save into file_name
        pass
    
    @staticmethod
    def __deserialize__(file_name: str):
        # pickle.load the pickle file at `file_name`.
        pass
    

    Describe alternatives you've considered Doing it manually.

    Additional context Docs: https://docs.python.org/3/library/pickle.html It would be nice to be able to specify the protocol in the decorator as well, e.g. @pickle(protocol=5). Other pickle parameters as per the docs are desirable.

    enhancement 
    opened by lhl2617 1
  • Re-structure the package

    Issue

    All decorators are implemented in __init__.py. This will be bad for maintainability once the file gets bigger.

    Mitigation

    We should probably split the decorators into different python files (potentially following the structure from the documentation?) and import those inside __init__.py

    opened by rayanht 1
  • Tests + Coverage

    Issue

    We're writing code that leverages highly unusual (maybe even suspicious in some instances) python features with no assurance of correctness.

    Mitigation

    Unit tests + automated coverage reports (codecov?). We need to test the behaviour of single decorators as well as common combinations.

    opened by rayanht 0
  • Performance metrics decorators

    Feature Description

    We need a family of decorators that allow us to seamlessly benchmark a function with respect to various metrics (number of data accesses, wall time...). Stretch goal: automated hotspot/hot path analysis

    opened by rayanht 0
  • @to_string: decorated_class.__dict__[attr], "__call__")) KeyError: 'xxx'

    When I decorated my class with the @to_string decorator, I got a KeyError. After debugging, I found a problem in the source code.

      attributes = [
          attr
          for attr in dir(self)
          if not attr.startswith("_") and not (
              hasattr(self.__dict__[attr], "__call__") if attr in self.__dict__ else hasattr(
                  decorated_class.__dict__[attr], "__call__"))
      ]
    

    After printing dir(self) and self.__dict__, I found that dir(self) contains more attributes than self.__dict__, which results in the self.__dict__[attr] KeyError.

    bug 
    opened by lifefossil 1
  • Additional decorators

    Is your feature request related to a problem? Please describe. N/A

    Describe the solution you'd like

    This blog post has a few decorators that might be useful to integrate into paprika.

    Describe alternatives you've considered N/A

    Additional context N/A

    opened by rayanht 0
  • Github Actions: Use poetry + Coverage without breaking codecov

    The right way to set up a poetry CI env is shown in #14 in .github/workflows/unittest.yml.

    The below was tried for .github/workflows/unittest.yml but does not work due to Codecov failures.

    name: Test and collect coverage
    on: [push, pull_request]
    jobs:
      run:
        runs-on: ubuntu-latest
        env:
          OS: ubuntu-latest
          PYTHON: "3.9"
        steps:
          - uses: actions/[email protected]
            with:
              fetch-depth: 2
          - name: Setup Python
            uses: actions/[email protected]
            with:
              python-version: 3.9
          - name: Setup Poetry
            uses: abatilo/[email protected]
            with:
              poetry-version: 1.1.4
          - name: Install dependencies
            run: poetry install
          - name: Collect coverage
            run: poetry run coverage run -m unittest
          - name: Upload Coverage to Codecov
            uses: codecov/[email protected]
            with:
              verbose: true
    

    The codecov step apparently passes, but the codecov website showed that it failed without much reason.

    My guess is that the previous workflow.yml file has a faulty .env in which the env entry gives 3.9 for python, but the Setup Python step uses a 3.7 version instead. This makes the env mismatch problem persist, but that doesn't make sense to be honest.

    The difference in running the test poetry run coverage run -m unittest vs. coverage run -m unittest may be an issue as well, but running test -f .coverage for the former does not fail.

    opened by lhl2617 2
  • Fix the behaviour of combining @data and @singleton

    Describe the bug

    When combining @singleton with @data, @singleton should come before @data. Combining them the other way around will work in most cases but is not thoroughly tested and relies on assumptions that might not hold.

    Expected behavior There should be a single correct way of emulating this behaviour. We should raise an immediately actionable error when they are combined in a way that isn't guaranteed to function.

    opened by rayanht 0