msgspec

Overview
msgspec is a fast and friendly implementation of the MessagePack protocol for Python 3.8+. In addition to serialization/deserialization, it supports runtime message validation using schemas defined via Python's type annotations.

from typing import Optional, List
import msgspec

# Define a schema for a `User` type
class User(msgspec.Struct):
    name: str
    groups: List[str] = []
    email: Optional[str] = None

# Create a `User` object
alice = User("alice", groups=["admin", "engineering"])

# Serialize `alice` to `bytes` using the MessagePack protocol
serialized_data = msgspec.encode(alice)

# Deserialize and validate the message as a User type
user = msgspec.Decoder(User).decode(serialized_data)

assert user == alice

msgspec is designed to be as performant as possible, while retaining some of the niceties of validation libraries like pydantic. For supported types, serializing a message with msgspec can be ~2-4x faster than alternative libraries.

Benchmark results: https://github.com/jcrist/msgspec/raw/master/docs/source/_static/bench-1.png

See the documentation for more information.

LICENSE

New BSD. See the License File.

Comments
  • [FEA] A fallback mechanism

    Feature request: make it possible to specify a fallback function that is called when the encoder encounters an unsupported object. The fallback function can then handle the unsupported object using only msgspec-compatible types. When decoding, a corresponding fallback function is then called to handle decoding of the object.

    Motivation

    In Dask/Distributed we are discussing a replacement for msgpack. The library should:

    • be fast
    • be secure (no arbitrary code execution when decoding)
    • support basic Python types
    • not convert lists to tuples (this can be disabled in msgpack, but with a performance penalty)
    • not convert numpy scalars to Python scalars
    • support a fallback mechanism (as sketched below)
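
    msgspec's eventual answer to this shape of request is the enc_hook/dec_hook pair (custom-object support landed in 0.3.0; see the release notes below). A minimal sketch of a fallback pair, assuming a post-0.4 msgspec where MessagePack lives in the msgspec.msgpack submodule (Point is a made-up example type):

    import msgspec

    class Point:
        def __init__(self, x, y):
            self.x, self.y = x, y

    def enc_hook(obj):
        # Called only for objects the encoder doesn't support natively;
        # return a representation built from msgspec-compatible types.
        if isinstance(obj, Point):
            return {"__point__": [obj.x, obj.y]}
        raise NotImplementedError(f"Unsupported type: {type(obj)!r}")

    def dec_hook(type, obj):
        # Called when decoding into an annotated type msgspec doesn't know;
        # obj is the already-decoded builtin representation.
        if type is Point:
            return Point(*obj["__point__"])
        raise NotImplementedError(f"Unsupported type: {type!r}")

    data = msgspec.msgpack.encode(Point(1, 2), enc_hook=enc_hook)
    point = msgspec.msgpack.decode(data, type=Point, dec_hook=dec_hook)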
    opened by madsbk 16
  • Expand `datetime` support

    Currently msgspec supports encoding and decoding only timezone-aware datetime.datetime objects, in strict conformance with RFC 3339. Naive datetime.datetime objects can be encoded using a custom enc_hook, but there's no way to decode a naive datetime.datetime object.

    I would like to expand our builtin support for datetime types to include:

    • datetime.datetime (both aware and naive)
    • datetime.date
    • datetime.time (both aware and naive)

    Here's the plan I've come up with:

    Encoding

    To support encoding, we add a support_naive_datetimes keyword argument to msgspec.*.encode and msgspec.*.Encoder to configure the treatment of naive datetimes. This would take one of:

    • False: the default. Naive datetime and time objects error on encoding.
    • True: allow encoding naive datetime and time objects. These will be encoded as their RFC3339 compatible counterparts, just missing the offset component
    • "UTC": naive datetime and time objects will be treated as if they have a UTC timezone.
    • a tzinfo object: naive datetime and time objects will be treated as if they have this timezone.

    I'm not attached to the keyword name (or boolean options), so if someone can think of a nicer spelling I'd be happy. I think this supports all the common options.

    One benefit of supporting these options builtin is that we no longer have the weird behavior of enc_hook only being called for naive datetime.datetime objects. This would admittedly be less weird if Python had different types for aware and naive datetimes.
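
    As a stopgap under the current behavior, routing naive values through enc_hook looks roughly like this (a sketch, assuming naive datetimes should be interpreted as UTC):

    import datetime
    import msgspec

    def enc_hook(obj):
        # enc_hook is only invoked for naive datetimes (aware ones are
        # encoded natively); attach an explicit UTC offset and let msgspec
        # encode the now-aware value as an RFC 3339 string.
        if isinstance(obj, datetime.datetime) and obj.tzinfo is None:
            return obj.replace(tzinfo=datetime.timezone.utc)
        raise NotImplementedError

    msgspec.json.encode(datetime.datetime(2022, 1, 1, 12, 0), enc_hook=enc_hook)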

    I could hear an argument that the default should be True (encoding naive datetimes/times by default), but I'm hesitant to make that change. Having an error by default if you're using a naive datetime will force users to think about timezones early on; if they really want a naive datetime they can explicitly opt into it. Supporting naive datetimes/times by default could let programming errors slip by, since most of the time the user wants an aware datetime rather than a naive one.

    Decoding

    To support decoding, we want to handle the following use cases:

    • Only decode RFC3339 compatible datetimes and times (requiring a timezone)
    • Only decode naive datetimes and times (require no timezone)
    • Decode any datetime or time object (naive or aware)

    Since msgspec will only ever decode an object into a datetime if type information is provided, the natural place to enable this configuration is through our existing type annotations system. The question then is: what does an unannotated datetime.datetime mean?

    I want msgspec to make it easy to do the right thing, and (within reason) possible to do the flexible thing. As such, I'd argue that raw datetime.datetime and datetime.time annotations should only decode timezone-aware objects. This means that by default APIs built with msgspec are compatible with json-schema (which lacks a naive datetime/time format), and common web languages like golang (which requires RFC3339 compatible strings in JSON by default).

    To support naive-datetime or any-datetime types, we'd add a new config to Meta annotations. Something like:

    from msgspec import Struct, Meta
    from typing import Annotated
    import datetime
    
    class Example(Struct):
        aware_only_datetime: datetime.datetime
        aware_only_time: datetime.time
        date: datetime.date  # date objects have no timezone
        naive_only_datetime: Annotated[datetime.datetime, Meta(timezone=False)]
        naive_only_time: Annotated[datetime.time, Meta(timezone=False)]
        any_datetime: Annotated[datetime.datetime, Meta(timezone=None)]
        any_time: Annotated[datetime.time, Meta(timezone=None)]
    

    Like above, I don't love the timezone=True (aware), timezone=False (naive), timezone=None (aware or naive) syntax; if anyone can think of a better API spelling, please let me know.

    We could also add type aliases in a new submodule msgspec.types to make this easier to spell (since datetimes are common):

    from msgspec import Struct
    from msgspec.types import NaiveDatetime
    
    # NaiveDatetime = Annotated[datetime.datetime, Meta(timezone=False)]
    
    class Example(Struct):
        naive_only_datetime: NaiveDatetime
    

    Msgpack Complications

    Currently we use msgpack's timestamp extension (https://github.com/msgpack/msgpack/blob/master/spec.md#timestamp-extension-type) when encoding datetimes to msgpack. This extension by design only supports timezone-aware datetimes. msgpack has no standard representation for naive datetimes (or time/date objects in general). To handle this, I plan to encode naive datetimes as strings in the same format as JSON. This is an edge case that I don't expect to affect most users. I think the main benefit of supporting it is parity between types supported by both protocols.

    opened by jcrist 11
  • Support for `dataclass`es

    opened by jakirkham 11
  • Dataclasses bug: Encoding includes unspecified instance attributes

    When encoding a dataclass, instance attributes that aren't dataclass-fields are included in the result:

    import dataclasses
    import msgspec
    
    
    @dataclasses.dataclass
    class Foo:
        id: int
    
    
    foo = Foo(id=1)
    foo.bar = "test"
    
    
    print(msgspec.json.encode(foo))
    print(msgspec.json.encode(dataclasses.asdict(foo)))
    
    # This assertion fails: encode(foo) also includes the non-field attribute `bar`
    assert msgspec.json.encode(foo) == msgspec.json.encode(dataclasses.asdict(foo))
    

    This is problematic because it differs from dataclasses.asdict, and in turn makes msgspec incompatible with pydantic.dataclasses:

    from dataclasses import asdict
    
    import msgspec
    from pydantic.dataclasses import dataclass
    
    
    @dataclass
    class Foo:
        id: int
    
    
    foo = Foo(id=1)
    
    
    print(msgspec.json.encode(foo))
    print(msgspec.json.encode(asdict(foo)))
    
    opened by provinzkraut 10
  • Non-string tag values

    I have an existing encoding that almost-but-not-quite fits msgspec. The "almost" is that tags are integers rather than strings. Would it make sense to accept non-string tags? I think the only non-string case that would be valuable is integers.
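
    For reference, integer tags did land later (0.8.0, #135; see the release notes below); usage looks roughly like this sketch:

    from typing import Union
    import msgspec

    class Get(msgspec.Struct, tag=1):
        key: str

    class Put(msgspec.Struct, tag=2):
        key: str
        value: str

    # The default tag field is "type"; here the tag value is an int, not a str.
    msg = msgspec.json.decode(
        b'{"type": 2, "key": "a", "value": "b"}', type=Union[Get, Put]
    )
    assert msg == Put("a", "b")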

    opened by occasionallydavid 10
  • Support json-schema generation

    It would be helpful if JSON schema generation were supported:

    class User(msgspec.Struct):
        """A new type describing a User"""
        name: str
        groups: Set[str] = set()
        email: Optional[str] = None
    
    schema = User.json_schema()
    

    Similar to the functionality seen in https://github.com/s-knibbs/dataclasses-jsonschema

    opened by old-ocean-creature 10
  • json.Decoder memory leak?

    Hello,

    I've got a service written in Python that reads data from ElasticSearch frequently 24/7.

    Recently I've migrated it from orjson to msgspec.json. Since then, the service runs out of memory pretty quickly.

    Following https://docs.python.org/3/library/tracemalloc.html#pretty-top I'm able to capture the top 10 lines contributing the largest memory usage, which turn out to be dominated by the decode(...) method from msgspec.json.Decoder:

    Top 10 lines
    #1: /app/elasticsearch_repository.py:69: 26821.3 KiB
        self.__decoder.decode(
    #2: /app/prepare_data_service.py:280: 3649.2 KiB
        cache_info_rt = decoder.decode(cache_info_data)
    #3: /usr/local/lib/python3.10/multiprocessing/reduction.py:51: 1568.1 KiB
        cls(buf, protocol).dump(obj)
    #4: /usr/local/lib/python3.10/linecache.py:137: 628.2 KiB
        lines = fp.readlines()
    #5: /usr/local/lib/python3.10/json/decoder.py:353: 82.7 KiB
        obj, end = self.scan_once(s, idx)
    #6: /usr/local/lib/python3.10/multiprocessing/queues.py:122: 40.2 KiB
        return _ForkingPickler.loads(res)
    #7: /usr/local/lib/python3.10/tracemalloc.py:67: 11.9 KiB
        return (self.size, self.count, self.traceback)
    #8: /app/elasticsearch_repository.py:68: 3.7 KiB
        return [
    #9: /usr/local/lib/python3.10/http/client.py:1293: 3.6 KiB
        self.putrequest(method, url, **skips)
    #10: /app/venv/lib/python3.10/site-packages/elasticsearch/client/utils.py:347: 1.8 KiB
        return func(*args, params=params, headers=headers, **kwargs)
    193 other: 70.2 KiB
    Total allocated size: 32880.9 KiB
    

    Here are the structs for decoding:

    '''
    structs for msgspec
    '''
    from typing import Dict, List, Optional
    
    from msgspec import Struct
    
    
    class Query(Struct):
        """
        Struct for Query
        """
        date: str
        depAirport: str
        arrAirport: str
    
    
    class Request(Struct):
        """
        Struct for Request
        """
        supplier: str
        tripType: str
        fareClass: str
        adultAmount: int
        childAmount: int
        infantAmount: int
        queries: List[Query]
        timestamp: int
    
    
    class Segment(Struct):
        """
        Struct for Segment
        """
        fareClass: str
        depDate: str
        depTime: str
        flightNo: str
        carrier: str
        orgAirport: str
        arriveDate: str
        arriveTime: str
        dstAirport: str
    
    
    class Flight(Struct):
        """
        Struct for Flight
        """
        segments: List[Segment]
    
    
    class Price(Struct):
        """
        Struct for Price
        """
        price: float
        tax: float
        totalPrice: float
        seatsStatus: Optional[str] = None
        currencyCode: Optional[str] = None
    
    
    class Trip(Struct):
        """
        Struct for Trip
        """
        flights: List[Flight]
        prices: Dict[str, Price]
        extended: Dict[str, str]
    
    
    class Result(Struct):
        """
        Struct of Result
        """
        trips: List[Trip]
    
    
    class CacheInfo(Struct):
        """
        Struct of CacheInfo
        """
        request: Request
        result: Result
    

    I read from https://jcristharif.com/msgspec/structs.html#struct-nogc that

    structs referencing only scalar values (ints, strings, bools, …) won’t contribute to GC load, but structs referencing containers (lists, dicts, structs, …) will.

    Is this related? What's the recommendation to resolve this issue?

    Thanks!
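
    If the growth turns out to be GC pressure from many container-holding structs rather than a true leak, one knob worth trying is the gc=False struct option (renamed from nogc in 0.7.0; see the release notes below). A sketch, safe only if the decoded data can never contain reference cycles:

    from typing import List
    import msgspec

    class Segment(msgspec.Struct, gc=False):
        fareClass: str
        flightNo: str

    class Flight(msgspec.Struct, gc=False):
        segments: List[Segment]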

    opened by tigerinus 10
  • Support annotated-types for metadata and constraint specification

    My understanding is that msgspec currently only supports basic type/schema validation, but not more complex validations like a regex pattern or arbitrary user functions, which Pydantic and other validators do support.

    I think it would be really interesting if msgspec adopted annotated-types (maybe as an optional dependency), which would not only enable an API for these sorts of constraints but also:

    • Improve interoperability with other libraries (Pydantic V2 will support annotated-types, so users could go back and forth or share types between the two libraries)
    • Get automatic support for other parts of the ecosystem like Hypothesis
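
    For context, first-class constraints did ship later via Meta annotations (0.9.0, #176; see the release notes below). A minimal sketch of that API:

    from typing import Annotated
    import msgspec
    from msgspec import Meta

    NonNegative = Annotated[int, Meta(ge=0)]
    Username = Annotated[str, Meta(pattern="^[a-z0-9_]+$")]

    class Account(msgspec.Struct):
        name: Username
        karma: NonNegative = 0

    msgspec.json.decode(b'{"name": "alice", "karma": 3}', type=Account)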
    opened by adriangb 9
  • Inconsistent support of subclassed types

    When trying to encode subclasses of supported types, some will result in a TypeError, while others work as expected. It would be great to either support subclasses in general, or at least have an explanation of these limitations in the docs.

    For comparison, the below example is supported by json, orjson and ujson.

    import msgspec
    
    
    class CustomList(list):
        pass
    
    
    class CustomStr(str):
        pass
    
    
    msgspec.json.encode(CustomList())  # this works
    msgspec.json.encode(CustomStr())  # this is a TypeError
    
    opened by provinzkraut 8
  • Generate JSON Schemas from types

    This adds two functions:

    • msgspec.json.schema for generating a single JSON Schema from a given type.
    • msgspec.json.schema_components for generating multiple schemas (and a corresponding components mapping) from multiple types.

    ~Still needs docs and tests.~

    Fixes #125.
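
    A minimal usage sketch of the added functions, reusing the User struct from the earlier feature request:

    from typing import Optional, Set
    import msgspec

    class User(msgspec.Struct):
        """A new type describing a User"""
        name: str
        groups: Set[str] = set()
        email: Optional[str] = None

    print(msgspec.json.schema(User))  # a dict containing the JSON Schema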

    opened by jcrist 8
  • Adding Linux and Mac ARM wheels

    This resolves #103 by adding steps for Linux and Mac ARM wheels in the CI build. I also bumped the version of pypa/cibuildwheel to v2.4.0.

    Mac support: https://cibuildwheel.readthedocs.io/en/stable/faq/#apple-silicon
    Linux support: https://cibuildwheel.readthedocs.io/en/stable/faq/#emulation

    I don't have a great way of testing this. :smile:

    opened by cjermain 8
  • Should we support querystring / `x-www-form-urlencoded` messages?

    URL querystrings/x-www-form-urlencoded forms are structured but untyped messages. The python standard library has a few tools for encoding/decoding these:

    In [1]: import urllib.parse

    In [2]: urllib.parse.parse_qs("x=1&y=true&z=a&z=b")
    Out[2]: {'x': ['1'], 'y': ['true'], 'z': ['a', 'b']}
    

    This is annoying to work with manually because the output is always of type dict[str, list[str]]. This means that:

    • The string values have to be manually cast to the expected types
    • Fields where you expect a single value have to be validated (or only the last value used)
    • Missing required fields and default values have to be manually handled

    A library like Pydantic may be used to ease some of the ergonomic issues here, but adds extra overhead.

    Since msgspec is already useful for parsing JSON payloads into typed & structured objects, we might support a new querystring encoding/decoding that makes use of msgspec's existing type system to handle the decoding and validation. A lot of the code needed to handle this parsing already exists in msgspec, it's mostly just plumbing needed to hook things together. For performance, I'd expect this to be ~as fast as our existing JSON encoder/decoder.

    Proposed interface:

    # msgspec/querystring.py
    
    def encode(obj: Any) -> bytes:
        """Encode an object as a querystring.
    
        This returns `bytes` not `str`, since that's what `msgspec` returns for other encodings.
        """
        ...
    
    def decode(buf: bytes | str, type: Type[T] = dict[str, list[str]]) -> T:
        """Decode a querystring.
    
        If `type` is passed, a value of that type is returned (or an error is raised).
    
        If `type` is not passed, a `dict[str, list[str]]` is returned containing all passed query parameters.
        This matches the behavior of `urllib.parse.parse_qs`.
        """
        ...
    

    Proposed encoding/decoding scheme:

    • Nested objects are not supported due to querystring restrictions. We don't try to do anything complicated like rails or sinatra do (i.e. no foo[][key]=bar stuff).
    • A valid type must be a top-level object-like (struct, dataclass, ...) type, mapping fields to value types

    The following value types are supported

    • int, float, str, and str-like types (datetimes, ...) map to/from their str representations, quoting as needed
    • bool serializes to "true"/"false". When deserializing, "", "1" and "0" are also accepted (are there other common values?)
    • None serializes as "". When decoding "null" is also accepted.
    • Sequences of the above (e.g. list/tuple/...) map to/from multiple values set for a field. So a field a with value ("x", None, True, 3) would be "a=x&a=&a=true&a=3"
    • All builtin constraints are also supported
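
    To make the scheme concrete, here's a hypothetical round-trip under this proposal (msgspec.querystring doesn't exist; Search and its fields are made-up names):

    import msgspec

    class Search(msgspec.Struct):
        q: str
        page: int = 1
        safe: bool = True
        tags: list[str] = []

    # Hypothetical, per the proposed interface above:
    # msgspec.querystring.encode(Search("cats", page=2, tags=["a", "b"]))
    #   -> b"q=cats&page=2&safe=true&tags=a&tags=b"
    # msgspec.querystring.decode(b"q=cats&page=2", type=Search)
    #   -> Search(q="cats", page=2, safe=True, tags=[])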

    Questions:

    • Do the above encodings make sense?
    • Do the restrictions on supported types make sense? In particular, note the no-nested-objects/sequences restriction
    • Are there other options we'd want to expose on encode/decode? The stdlib also exposes a few options that I've never needed to change:
      • max_num_fields to limit the number of fields when decoding
      • separator to change the character used for separating fields (defaults to &).
    • Is msgspec.querystring the best namespace for this format, or is there a better name we could use?
    • Does this seem like something that would be useful to have in msgspec? The intent here is for msgspec to handle much of the parsing/validation that a typical web server needs in a performant and useful way.
    opened by jcrist 2
  • Support for `Field` and `extra`

    Hi again,

    In pydantic I can pass any kwargs to a Field and these will be set as a dictionary under ModelField.field_info.extra (if extra is allowed). I can also pass an extra dict directly. This is useful for library authors such as myself because it allows us to pass metadata as part of a field definition.

    For example, we have stuff like this: https://starlite-api.github.io/starlite/usage/3-parameters/3-the-parameter-function/.

    At present the closest thing possible with msgspec is to use the Meta object with Annotated and hijack the extra_json_schema dictionary. This though is not a proper solution: first, because this is not about an extra JSON schema but rather about passing kwargs, and secondly because this is not necessarily about constraints, so the usage of Annotated might be redundant in this case.

    opened by Goldziher 3
  • Access fields data

    Hi again,

    So I encountered an issue when trying to integrate msgspec into the internals of our libraries, namely that there is no way for me to easily access the field information on a struct.

    The __struct_fields__ attribute only offers a tuple of field names. If I want to access the metadata regarding each field, I have to resort to some rather inefficient methods, such as inspecting the attributes, which is very slow.

    I would like a simple and fast way to get the following information about a field: its type, any default value defined for it, and any other available metadata.
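
    A workaround sketch with what exists today: field names come from __struct_fields__, and types can be pulled from the standard typing machinery rather than inspected attribute by attribute (defaults are left out, since no documented accessor for them is assumed here):

    import typing
    import msgspec

    class User(msgspec.Struct):
        name: str
        email: typing.Optional[str] = None

    hints = typing.get_type_hints(User)
    field_types = {name: hints[name] for name in User.__struct_fields__}
    print(field_types)  # {'name': <class 'str'>, 'email': typing.Optional[str]}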

    opened by Goldziher 13
  • `functools.cached_property` support?

    from msgspec import Struct
    from functools import cached_property

    class Foo(Struct):
        bar: tuple[str]

        @cached_property
        def bar_inner(self) -> str:
            return self.bar[0]
    

    will fail because Struct does not implement __dict__. Could cached_property support be implemented for Struct?

    Traceback (most recent call last):
      File "/home/scarf/repo/cata/tileset-tools/learnpillow.py", line 26, in <module>
        print(meta.fallback.ascii)
              ^^^^^^^^^^^^^
      File "/home/scarf/.asdf/installs/python/3.11.0rc2/lib/python3.11/functools.py", line 994, in __get__
        raise TypeError(msg) from None
    TypeError: No '__dict__' attribute on 'TileConfig' instance to cache 'fallback' property.
    

    The @property decorator does work, but it would be nice to have a cached one too (if possible).
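
    One possible escape hatch, assuming your msgspec version supports the dict=True struct option (which gives instances a __dict__ for cached_property to write to; verify against your version's docs):

    from functools import cached_property
    from msgspec import Struct

    class Foo(Struct, dict=True):  # dict=True is assumed available here
        bar: tuple[str]

        @cached_property
        def bar_inner(self) -> str:
            return self.bar[0]

    assert Foo(("a",)).bar_inner == "a"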

    opened by scarf005 0
  • question: array of different types?

    Case 1

    [
      { "height": 10, "width": 10 },                     // <- type 'A'
      { "ASCIITiles.png": { "//": "indices 0 to 79" } }, // <- type 'B'
      { "fallback.png": { "fallback": true } }
      ...
    ]
    

    it's guaranteed that the array will be shaped like [A, B, B, ...]. I'm not sure how to make msgspec understand that only the first element has a different type.

    Case 2

    [
            { "id": "lighting_hidden" },
            {
              "id": "explosion_weak",
              "multitile": true,
              "additional_tiles": "(...)"
            }
    ]
    

    the first and second elements differ only in multitile and additional_tiles; additional_tiles only exists when multitile exists and is true. I wasn't able to turn both objects into tagged unions. Could you help me correctly distinguish the two objects?
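
    For Case 1, one workable approach is to defer per-element decoding with msgspec.Raw (added in 0.6.0; see the release notes below), then decode the head and the tail with different types. A sketch, with Size standing in for type 'A':

    from typing import List
    import msgspec

    class Size(msgspec.Struct):
        height: int
        width: int

    data = b'[{"height": 10, "width": 10}, {"fallback.png": {"fallback": true}}]'

    raw = msgspec.json.decode(data, type=List[msgspec.Raw])
    head = msgspec.json.decode(raw[0], type=Size)                # the 'A' element
    tail = [msgspec.json.decode(r, type=dict) for r in raw[1:]]  # the 'B' elements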

    opened by scarf005 0
  • Apache Arrow Support

    It would be great to have an efficient way to serialize msgspec structs to Apache Arrow, which would also open them up to Parquet and other tools in the Arrow ecosystem, like DuckDB.

    opened by michalwols 2
Releases(0.11.0)
  • 0.11.0(Dec 19, 2022)

    • Improve performance of constructors for Struct types when using keyword arguments (#237).

    • Support constraints on dict keys for JSON (#239).

    • Add support for keyword-only arguments in Struct types, matching the behavior of kw_only for dataclasses (#242); see the sketch after this list.

    • BREAKING: Change the parameter ordering rules used by Struct types to match the behavior of dataclasses. For most users this change shouldn't break anything. However, if your struct definitions have required fields after optional fields, you'll now get an error on import. This error can be fixed by either:

      • Reordering your fields so all required fields are before all optional fields
      • Using keyword-only parameters (by passing the kw_only=True option).

      See Field Ordering for more information (#242).

    • Support encoding/decoding dictionaries with integer keys for JSON (#243).
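
    A minimal sketch of the kw_only option mentioned above:

    import msgspec

    class Point(msgspec.Struct, kw_only=True):
        x: int
        y: int

    Point(x=1, y=2)  # fine
    # Point(1, 2)    # TypeError: all fields are keyword-only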

  • 0.10.1(Dec 8, 2022)

  • 0.10.0(Dec 8, 2022)

    • Add forbid_unknown_fields configuration option to Struct types (#210)
    • BREAKING: Encode all enum types by value, rather than name (#211)
    • Fix a bug in the JSON encoder when base64 encoding binary objects (#217)
    • Add support for encoding/decoding dataclasses (#218)
    • Add support for encoding/decoding datetime.date objects (#221)
    • Add support for encoding/decoding uuid.UUID objects (#222)
    • BREAKING: support encoding/decoding datetime.datetime values without timezones by default (#224).
    • Add a tz constraint to require aware or naive datetime/time objects when decoding (#224).
    • Add support for encoding/decoding datetime.time objects (#225)
    • Add a msgspec.json.format utility for efficiently pretty-printing already encoded JSON documents (#226).
    • Support decoding JSON from strings instead of just bytes-like objects (#229)
  • 0.9.1(Oct 28, 2022)

    • Enable Python 3.11 builds (#205)
    • Support greater than microsecond resolution when parsing JSON timestamps (#201)
    • Work around a limitation in mypy for typed decoders (#191)
  • 0.9.0(Sep 14, 2022)

    • Support for constraints during validation. For example, this allows ensuring a field is an integer >= 0. (#176)
    • New utilities for generating JSON Schema from type definitions (#181)
    • Support for pretty printing using rich (#183)
    • Improve integer encoding performance (#170)
    • Builtin support for renaming fields using kebab-case (#175)
    • Support for passing a mapping when renaming fields (#185)
  • 0.8.0(Aug 2, 2022)

    • Support integer tag values when using tagged unions (#135).
    • Support decoding into typing.TypedDict types (#142).
    • Support encoding/decoding typing.NamedTuple types (#161).
    • Test against CPython 3.11 prelease builds (#146).
    • Add ValidationError (a subclass of DecodeError) to allow differentiating between errors due to a message not matching the schema from those due to the message being invalid JSON (#155).
    • Support encoding subclasses of list/dict (#160).
    • Fix a bug preventing decoding custom types wrapped in a typing.Optional (#162).
  • 0.7.1(Jun 28, 2022)

    • Further reduce the size of packaged wheels (#130).
    • Add weakref support for Struct types through a new weakref configuration option (#131).
    • Fix a couple unlikely (but possible) bugs in the deallocation routine for Struct types (#131).
  • 0.7.0(Jun 20, 2022)

    • Dramatically speedup JSON string decoding, up to 2x speedup in some cases (#118).
    • Adds a cache for decoding short (< 32 character) ASCII dict keys. This results in up to a 40% speedup when decoding many dicts with common keys using an untyped decoder. It's still recommended to define Struct types when your messages have a common structure, but in cases where no type is provided decoding is now much more performant (#120, #121).
    • Adds order and eq configuration options for Struct types, mirroring the dataclasses options of the same name. Order comparisons for Struct types are very performant, roughly 10x to 70x faster than alternative libraries (#122).
    • Speedup Struct decoding for both JSON and MessagePack, on average 20% faster (#119).
    • Various additional performance improvements, mostly to the JSON implementation (#100, #101, #102).
    • Add defstruct method for dynamically defining new Struct types at runtime (#105).
    • Fix ARM support and publish ARM wheels for Linux and Mac (#104).
    • Reduce published wheel sizes by stripping debug symbols (#113).
    • Fix a memory leak in Struct.__reduce__ (#117).
    • Rename nogc struct option to gc (a breaking change). To disable GC on a Struct instance you now want to specify gc=False instead of nogc=True (#124).
  • 0.6.0(Apr 6, 2022)

    • Add a new msgspec.Raw type for delayed decoding of message fields / serializing already encoded fields.
    • Add omit_defaults option to Struct types (docs). If enabled, fields containing their respective default value will be omitted from the serialized message. This improves both encode and decode performance.
    • Add rename option to Struct types (docs) for altering the field names used for encoding. A major use of this is supporting camelCase JSON field names, while letting Python code use the more standard snake_case field names; see the sketch after this list.
    • Improve performance of nogc=True structs. GC is now avoided in more cases, and nogc=True structs use 16 fewer bytes per instance. Also added a benchmark for how msgspec can interact with application GC usage.
    • Cache creation of tagged union lookup tables, reducing memory usage for applications making heavy use of tagged unions.
    • Support encoding and decoding frozenset instances
    • A smattering of other performance improvements.
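
    A minimal sketch of the omit_defaults and rename options described above:

    import msgspec

    class User(msgspec.Struct, rename="camel", omit_defaults=True):
        first_name: str
        last_name: str
        is_admin: bool = False

    msgspec.json.encode(User("Jane", "Doe"))
    # b'{"firstName":"Jane","lastName":"Doe"}' (the default is_admin is omitted)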
  • 0.5.0(Mar 9, 2022)

    • Support tagged unions for encoding/decoding a Union of msgspec.Struct types.
    • Further improve encoding performance of enum.Enum by 20-30%
    • Reduce overhead of calling msgspec.json.decode/msgspec.msgpack.decode with type=SomeStructType. It's still faster to create a Decoder once and call decoder.decode multiple times, but for struct types the overhead of calling the top-level function is decreased significantly.
    • Rename the Struct option asarray to array_like (a breaking change)
  • 0.4.2(Feb 28, 2022)

  • 0.4.1(Feb 23, 2022)

    • Optimize decoding of Enum types, ~10x faster
    • Optimize decoding of IntEnum types, ~12x faster
    • Support decoding typing.Literal types
    • Add nogc option for Struct types, disabling the cyclic garbage collector for their instances
  • 0.4.0(Feb 8, 2022)

    This is a major release with several large changes:

    • Moved MessagePack support to the msgspec.msgpack submodule
    • New JSON support available in msgspec.json
    • Improved error message generation to provide the full path to mistyped values
    • Renamed the immutable kwarg in msgspec.Struct to frozen, to better match Python conventions
    • Renamed EncodingError to EncodeError and DecodingError to DecodeError, to better match Python conventions
    • Improved pyright support, allowing more errors to be statically caught by its type checker
    • Adds support for Python 3.10 pattern matching on msgspec.Struct types
    • Adds support for decoding into typing.Union types (with a few restrictions)
    • General performance improvements across all encoders/decoders
  • 0.3.2(Jul 23, 2021)

  • 0.3.1(Jul 13, 2021)

  • 0.3.0(Jul 7, 2021)

    • Add Encoder.encode_into api, for encoding into an existing buffer without copying
    • Add support for encoding/decoding MessagePack extensions
    • Add support for encoding/decoding datetime objects
    • Add support for encoding/decoding custom objects without relying on MessagePack extensions
    • Add support for marking Struct types as hashable
    • Add support for serializing Struct types as MessagePack array objects rather than map objects.
    • Several performance improvements. On average 50% faster encoding and 30% faster decoding.
Owner

Jim Crist-Harif