The comprehensive WSGI web application library.

Overview

Werkzeug

werkzeug German noun: "tool". Etymology: werk ("work"), zeug ("stuff")

Werkzeug is a comprehensive WSGI web application library. It began as a simple collection of various utilities for WSGI applications and has become one of the most advanced WSGI utility libraries.

It includes:

  • An interactive debugger that allows inspecting stack traces and source code in the browser with an interactive interpreter for any frame in the stack.
  • A full-featured request object with objects to interact with headers, query args, form data, files, and cookies.
  • A response object that can wrap other WSGI applications and handle streaming data.
  • A routing system for matching URLs to endpoints and generating URLs for endpoints, with an extensible system for capturing variables from URLs (a short standalone sketch follows this list).
  • HTTP utilities to handle entity tags, cache control, dates, user agents, cookies, files, and more.
  • A threaded WSGI server for use while developing applications locally.
  • A test client for simulating HTTP requests during testing without requiring running a server.
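
The routing system mentioned above can also be used on its own. A minimal sketch (the endpoints and URLs here are illustrative):

from werkzeug.routing import Map, Rule

url_map = Map([
    Rule('/', endpoint='index'),
    Rule('/user/<int:id>', endpoint='user'),
])

# Bind the map to a host to get an adapter for matching and building URLs.
urls = url_map.bind('example.com')
print(urls.match('/user/1'))           # ('user', {'id': 1})
print(urls.build('user', {'id': 2}))   # '/user/2'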

Werkzeug doesn't enforce any dependencies. It is up to the developer to choose a template engine, database adapter, and even how to handle requests. It can be used to build all sorts of end user applications such as blogs, wikis, or bulletin boards.

Flask wraps Werkzeug, using it to handle the details of WSGI while providing more structure and patterns for defining powerful applications.

Installing

Install and update using pip:

pip install -U Werkzeug

A Simple Example

from werkzeug.wrappers import Request, Response

@Request.application
def application(request):
    return Response('Hello, World!')

if __name__ == '__main__':
    from werkzeug.serving import run_simple
    run_simple('localhost', 4000, application)
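
The test client can exercise this application without starting a server. A minimal sketch, assuming Werkzeug 2.x, where the client wraps responses in a test response object by default:

from werkzeug.test import Client

client = Client(application)
response = client.get('/')
print(response.status)                  # '200 OK'
print(response.get_data(as_text=True))  # 'Hello, World!'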

Comments
  • Werkzeug crashes after writing to closed pipe

    I have a Werkzeug server running behind NGINX. When a client disconnects while waiting for the Werkzeug server to respond, NGINX closes the pipe to Werkzeug. When the Python program then writes the response, the following exception occurs and Werkzeug crashes:

    Traceback (most recent call last):
      File "server.py", line 81, in <module>
        app.run(host=args.host, port=args.port, debug=False)
      File "/usr/local/lib/python2.7/dist-packages/flask/app.py", line 843, in run
        run_simple(host, port, self, **options)
      File "/usr/local/lib/python2.7/dist-packages/werkzeug/serving.py", line 694, in run_simple
        inner()
      File "/usr/local/lib/python2.7/dist-packages/werkzeug/serving.py", line 659, in inner
        srv.serve_forever()
      File "/usr/local/lib/python2.7/dist-packages/werkzeug/serving.py", line 499, in serve_forever
        HTTPServer.serve_forever(self)
      File "/usr/lib/python2.7/SocketServer.py", line 238, in serve_forever
        self._handle_request_noblock()
      File "/usr/lib/python2.7/SocketServer.py", line 297, in _handle_request_noblock
        self.handle_error(request, client_address)
      File "/usr/lib/python2.7/SocketServer.py", line 295, in _handle_request_noblock
        self.process_request(request, client_address)
      File "/usr/lib/python2.7/SocketServer.py", line 321, in process_request
        self.finish_request(request, client_address)
      File "/usr/lib/python2.7/SocketServer.py", line 334, in finish_request
        self.RequestHandlerClass(request, client_address, self)
      File "/usr/lib/python2.7/SocketServer.py", line 651, in __init__
        self.finish()
      File "/usr/lib/python2.7/SocketServer.py", line 710, in finish
        self.wfile.close()
      File "/usr/lib/python2.7/socket.py", line 279, in close
        self.flush()
      File "/usr/lib/python2.7/socket.py", line 303, in flush
        self._sock.sendall(view[write_offset:write_offset+buffer_size])
    socket.error: [Errno 32] Broken pipe

    Is there some configuration option I'm missing to keep it from crashing? Normally all exceptions are caught and a 500 error returned, with the server remaining alive.

    opened by alexandres 41
  • Werkzeug stat reloader problems

    So far two people have run into serious performance issues with the stat reloader. For anybody affected, the fix is to install watchdog from PyPI.

    If you're interested in fixing this issue, uninstall watchdog again and:

    • How does setting reloader_interval on werkzeug.run_simple or app.run affect performance? Try various values:

      app.run(reloader_interval=1)  # the default
      
    • Run this code inside your virtualenv, post the returned number:

      from werkzeug._reloader import _iter_module_files
      print(len(list(_iter_module_files())))
      
    bug 
    opened by untitaker 37
  • werkzeug.formparser is really slow with large binary uploads

    When I perform a multipart/form-data upload of any large binary file in Flask, those uploads are very easily CPU bound (with Python consuming 100% CPU) instead of I/O bound on any reasonably fast network connection.

    A little bit of CPU profiling reveals that almost all CPU time during these uploads is spent in werkzeug.formparser.MultiPartParser.parse_parts(). The reason is that the method parse_lines() yields a lot of very small chunks, sometimes even just single bytes:

    # we have something in the buffer from the last iteration.
    # this is usually a newline delimiter.
    if buf:
        yield _cont, buf
        buf = b''
    

    So parse_parts() goes through a lot of small iterations (more than 2 million for a 100 MB file) processing single "lines", always writing just very short chunks or even single bytes into the output stream. This adds a lot of overhead, slowing down the whole process and making it CPU bound very quickly.

    A quick test shows that a speed-up is very easily possible by first collecting the data in a bytearray in parse_lines() and only yielding that data back into parse_parts() when self.buffer_size is exceeded. Something like this:

    buf = b''
    collect = bytearray()
    for line in iterator:
        if not line:
            self.fail('unexpected end of stream')
    
        if line[:2] == b'--':
            terminator = line.rstrip()
            if terminator in (next_part, last_part):
                # yield remaining collected data
                if collect:
                    yield _cont, collect
                break
    
        if transfer_encoding is not None:
            if transfer_encoding == 'base64':
                transfer_encoding = 'base64_codec'
            try:
                line = codecs.decode(line, transfer_encoding)
            except Exception:
                self.fail('could not decode transfer encoded chunk')
    
        # we have something in the buffer from the last iteration.
        # this is usually a newline delimiter.
        if buf:
            collect += buf
            buf = b''
    
        # If the line ends with windows CRLF we write everything except
        # the last two bytes.  In all other cases however we write
        # everything except the last byte.  If it was a newline, that's
        # fine, otherwise it does not matter because we will write it
        # the next iteration.  this ensures we do not write the
        # final newline into the stream.  That way we do not have to
        # truncate the stream.  However we do have to make sure that
        # if something else than a newline is in there we write it
        # out.
        if line[-2:] == b'\r\n':
            buf = b'\r\n'
            cutoff = -2
        else:
            buf = line[-1:]
            cutoff = -1
    
        collect += line[:cutoff]
    
        if len(collect) >= self.buffer_size:
            yield _cont, collect
            collect.clear()
    

    This change alone reduces the upload time for my 34 MB test file from 4200 ms to around 1100 ms over localhost on my machine; that's almost a 4x speed-up. All tests were done on Windows (64-bit Python 3.4); I'm not sure if it's as much of a problem on Linux.

    It's still mostly CPU bound, so I'm sure there is even more potential for optimization. I think I'll look into it when I find a bit more time.
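
    For anyone who wants to reproduce or profile this locally, here is a minimal sketch using Werkzeug's test utilities; the payload size and field name are made up for illustration and this is not code from the issue:

    import cProfile
    import io

    from werkzeug.test import EnvironBuilder
    from werkzeug.wrappers import Request

    # Build a fake multipart/form-data upload entirely in memory.
    payload = io.BytesIO(b"x" * (34 * 1024 * 1024))
    builder = EnvironBuilder(method="POST", data={"file": (payload, "test.bin")})
    environ = builder.get_environ()

    # Accessing request.files triggers werkzeug.formparser; profile that step.
    cProfile.runctx("Request(environ).files['file']", globals(), locals(),
                    sort="cumulative")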

    bug 
    opened by sekrause 33
  • Switch locals to be based on ContextVars

    ContextVar was introduced in Python 3.7 and is effectively the same as Local in Werkzeug (although with a different API). The locals should work in greenlet and threading contexts as before, but now also in asyncio or other async/await contexts.

    The __storage__ attribute has been kept for backwards compatibility (even though it is a dunder, it seems to be used, e.g. in tests).

    The ident_func, though, must be deprecated, as this is now handled by the ContextVar. This may cause some backwards incompatibility.
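
    For illustration, a rough sketch of the idea (not the actual implementation in this PR) could look like this:

    from contextvars import ContextVar

    class Local:
        """A Local-like object whose storage lives in a single ContextVar,
        isolating state per thread, greenlet, or asyncio task."""

        def __init__(self):
            object.__setattr__(self, "_storage", ContextVar("local_storage"))

        def __getattr__(self, name):
            values = self._storage.get({})
            try:
                return values[name]
            except KeyError:
                raise AttributeError(name)

        def __setattr__(self, name, value):
            # Copy-on-write so other contexts never see partial updates.
            values = dict(self._storage.get({}))
            values[name] = value
            self._storage.set(values)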

    opened by pgjones 32
  • Add uWSGI caching backend

    Since version 1.9 uWSGI includes an in-memory cache. Documentation on that can be found at the uWSGI docs.

    This PR contains a new UwsgiCache class using uWSGI's uwsgi.cache_* functions.

    I'm not sure yet about a few things:

    1. Testing; as import uwsgi only works when the app is actually run by uWSGI, writing tests with the actual uwsgi module is basically impossible as far as I know. Does anyone have an idea of how this could be done? (One possible approach is sketched after this list.)

    2. uWSGI has also had cache_inc and cache_dec functions since 1.9.9, but they are not documented, so I'm a bit hesitant about adding them.

    3. Naming; is the convention here to use uWSGICache, or UwsgiCache?

    4. Documentation; because I check for the existence of the uwsgi module and only define the actual UwsgiCache class when it exists, Sphinx's automodule does not pick up on the "real" UwsgiCache class. What's the best way of handling this?
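
    On point 1, one possibility (just a sketch, not something from this PR; the fake signatures below only approximate the real uwsgi cache API) is to register a fake uwsgi module in sys.modules before importing the cache backend, so the cache_* calls hit an in-memory dict:

    import sys
    import types

    fake = types.ModuleType("uwsgi")
    _store = {}
    fake.cache_get = lambda key, cache=None: _store.get(key)
    fake.cache_set = lambda key, value, expires=0, cache=None: _store.update({key: value})
    fake.cache_del = lambda key, cache=None: _store.pop(key, None)
    fake.cache_update = fake.cache_set
    sys.modules["uwsgi"] = fake  # must run before the backend does "import uwsgi"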

    opened by jaapz 32
  • now works with 2to3, passes all the tests in py25-27,31,32

    This branch enables Werkzeug to work with 2to3, passing all the tests on py25-27 and py31-32. (Werkzeug's web page says it supports Python 2.4, but it already depends on newer libraries like hashlib, so I thought 2.5 would be fair.)

    Small incompatibilities that might affect Python 2 users:

    • MapAdapter.match() and MapAdapter.dispatch() now have separate arguments for the path: path and path_info.
      • This is due to the newer WSGI standard for Python 3: WSGI servers give environ['PATH_INFO'] to applications as bytestrings decoded as latin1 (hence unicode strings). Web applications, on the other hand, internally just use "strings" (which are also unicode strings) to pass their path information around. So we use two kinds of (unicode) strings to specify web locations: one in a bytes-like representation and one in normal form. To distinguish them, we name them path_info and path respectively.
      • There should be no problem in the average case, since most existing code (including tutorials) uses the library with path_info (in the older sense) as a positional argument, and path_info (in the newer sense) is used almost exclusively inside Werkzeug itself.
    • Some functions may be grumpier about whether str or unicode is passed to their arguments.
    opened by puzzlet 30
  • Hook for a setup function when using run_simple and autoreloader

    Hi,

    I want to be able to run some arbitrary code every time the Werkzeug reloader decides to reload my application, such that if that code fails, all I have to do is change the code and it will auto-reload and try again.

    As far as I can tell, the nicest way to achieve this is to run that code before the make_server().serve_forever() call in the inner() function in run_simple.

    So, for example, add a setup_func argument to run_simple and call it in the inner() function.

    So, in werkzeug/serving.py

    def run_simple(hostname, port, application, setup_func=None, [..]):
        [..]
    
        def inner():
            if setup_func:
                setup_func()
            make_server(hostname, port, application, threaded,
                        processes, request_handler,
                        passthrough_errors, ssl_context).serve_forever()
    
        [..]
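
    A caller could then do something like this (hypothetical, since setup_func is only a proposed argument; initialize_database and application are placeholders):

    def setup_func():
        initialize_database()  # placeholder for any per-reload setup

    run_simple('localhost', 5000, application, setup_func=setup_func,
               use_reloader=True)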
    
    opened by delfick 26
  • Change to cache expiration

    I made these changes in response to a Stack Overflow question where a user wanted to be able to have a non-expiring cache: http://stackoverflow.com/questions/29561011/how-to-disable-memcached-timeout-in-werkzeug

    In the process I also fixed a potential Redis bug that was highlighted in issue #550.

    opened by lionbee 25
  • regex URL converter matches incorrect value

    I have the following code in a Flask app using Werkzeug:

    
    # Redirect requests for the older image URLs to new URLs
    @app.route(
        '/static/images/2019/<regex("(privacy|jamstack|capabilities)"):folder>/<image>'
    )
    def redirect_old_hero_images(folder, image):
        return redirect("/static/images/2020/%s/%s" % (folder, image)), 301
    

    Prior to 2.2.0, this would allow the following redirect:

    /static/images/2019/jamstack/random.png
    

    to

    /static/images/2020/jamstack/random.png
    

    Now, however, it incorrectly sends us to:

    /static/images/2020/jamstack/jamstack
    

    Pinning werkzeug to 2.1.2 fixes the issue.
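
    For context, the custom regex converter isn't shown in the report; it is presumably defined along the lines of the common BaseConverter recipe, roughly like this (app is the Flask app from the snippet above):

    from werkzeug.routing import BaseConverter

    class RegexConverter(BaseConverter):
        """Converter whose pattern is supplied inline in the rule,
        e.g. <regex("(privacy|jamstack|capabilities)"):folder>."""

        def __init__(self, url_map, *items):
            super().__init__(url_map)
            self.regex = items[0]

    app.url_map.converters["regex"] = RegexConverter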

    Environment:

    • Python version: 3.8
    • Werkzeug version: 2.2.0 and 2.2.1
    docs routing 
    opened by tunetheweb 22
  • Version 1.0.0 removed previously deprecated code

    The 1.0.0 release that was pushed 2 hours ago introduced breaking changes for people using Flask. I'm not sure whether this was the desired effect and Flask should pin its version, or whether changes need to happen in this project.

    opened by evanlurvey 22
  • Reloader doesn't add ".exe" to file on Windows

    If I run a Flask app under 0.15.5 with FLASK_ENV=development, the dev server throws an error.

    With FLASK_ENV=production the server starts OK.

    After reinstalling 0.15.4, the same app works OK (both in dev and prod mode).

    Windows/py3.6.7/pipenv

    Kind regards

    reloader 
    opened by peppobon 22
  • added support for io.BufferedIOBase

    Added support for an already-open file when using send_file: a check for io.BufferedIOBase was missing, so the size attribute was not set correctly. Setting it enables support for Range request headers and correct Accept-Ranges and Content-Range response headers.
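
    For illustration, the kind of usage this change is meant to support looks roughly like this (a sketch against werkzeug.utils.send_file, not code from the PR):

    from werkzeug.wrappers import Request
    from werkzeug.utils import send_file

    @Request.application
    def application(request):
        f = open("big.bin", "rb")  # an io.BufferedReader, no filesystem path passed on
        # With the size known, conditional/Range handling can kick in.
        return send_file(f, request.environ, download_name="big.bin",
                         as_attachment=True, conditional=True)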

    • fixes #2566

    This is my first pull request, so I'm a bit unsure about the unchecked items in the checklist below.

    Checklist:

    • [x] Add tests that demonstrate the correct behavior of the change. Tests should fail without the change.
    • [x] Add or update relevant docs, in the docs folder and in code.
    • [ ] Add an entry in CHANGES.rst summarizing the change and linking to the issue.
    • [ ] Add .. versionchanged:: entries in any relevant code docs.
    • [x] Run pre-commit hooks and fix any issues.
    • [x] Run pytest and tox, no tests failed.
    opened by kwismann 0
  • send_file should support range request with io.BufferedReader

    When using send_file and passing an already-open file (an io.BufferedReader) instead of a file path, Range headers are not supported. Do you see this as a possible feature?

    The code below shows that the Range HTTP request header is ignored:

    file = open(filename, "rb")
    return send_file(file, download_name=filename, as_attachment=True)
    

    This is due to the code below: https://github.com/pallets/werkzeug/blob/3115aa6a6276939f5fd6efa46282e0256ff21f1a/src/werkzeug/utils.py#L428-L444

    When size is not set (line 441), line 504 (rv.content_length) is skipped, range support is not reported back to the browser, and the complete file is sent: https://github.com/pallets/werkzeug/blob/3115aa6a6276939f5fd6efa46282e0256ff21f1a/src/werkzeug/utils.py#L503-L504

    My suggestion would be to add something like this:

        # existing code in send_file: a filesystem path was passed
        stat = os.stat(path)
        size = stat.st_size
        mtime = stat.st_mtime
    else:
        file = path_or_file
        # suggested addition: stat the already-open file's descriptor
        stat = os.fstat(file.fileno())
        size = stat.st_size
        mtime = stat.st_mtime
    

    I have done some testing with the above code and it seems to work fine. NB: this is obviously not a complete test, and the added code would need to be refactored properly into utils.py.

    Environment:

    • Python version: 3.8.10
    • Werkzeug version: 2.1.2
    opened by kwismann 1
  • reloader exclude_patterns behavior differs for WatchdogReloaderLoop and StatReloaderLoop

    Setting exclude_patterns to an absolute path, e.g. /home/me/bar/*, works fine with StatReloaderLoop but fails to exclude paths with WatchdogReloaderLoop. The root cause is that WatchdogReloaderLoop uses watchdog, which relies on pathlib.PurePath.match, while StatReloaderLoop relies on fnmatch. Here's a simple demonstration:

    import fnmatch
    assert fnmatch.fnmatch("/foo/bar/baz", "/foo/*")  # passes

    import pathlib
    # fails: pathlib requires an absolute pattern to match the full path
    assert pathlib.PurePosixPath("/foo/bar/baz").match("/foo/*"), "FAIL"
    

    Pathlib's behavior is too strict for this particular problem, since it matches from the right for relative patterns and requires an exact match for absolute patterns; fnmatch matches the pattern against the whole path from the left.

    The workaround is to disable the watchdog by passing reloader_type="stat" to run_simple.
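
    For example (application, host, and port are placeholders):

    from werkzeug.serving import run_simple

    run_simple("localhost", 5000, application, use_reloader=True,
               reloader_type="stat", exclude_patterns=["/home/me/bar/*"])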

    Environment:

    • Python version: 3.7.2
    • Werkzeug version: 2.0.3
    opened by robnagler 0
  • fix LimitedStream.read method to work with raw IO streams

    This PR adds a test and a fix for LimitedStream to work with raw IO streams. Raw IO streams can return fewer bytes than the size passed to a read call, and previously LimitedStream.read was not equipped to handle these cases. I had to:

    • add a method to deal with cases where LimitedStream.read(-1) is called, that would exhaust the remainder of the stream and actually return the result. It uses a common "read into buffer" pattern.
    • change the condition on what the read method considers as a client disconnect. The added tests should clarify further what is considered a disconnect and what isn't.
    • fixes #2558

    Checklist:

    • [X] Add tests that demonstrate the correct behavior of the change. Tests should fail without the change.
    • [X] Add or update relevant docs, in the docs folder and in code.
    • [ ] Add an entry in CHANGES.rst summarizing the change and linking to the issue.
    • [ ] Add .. versionchanged:: entries in any relevant code docs.
    • [X] Run pre-commit hooks and fix any issues.
    • [ ] Run pytest and tox, no tests failed.
    opened by thrau 1
  • LimitedStream not respecting `read` contract of RawIOBase.

    In the read implementation of LimitedStream, self._read refers to a read method of an underlying RawIOBase stream, and is called here: https://github.com/pallets/werkzeug/blob/3115aa6a6276939f5fd6efa46282e0256ff21f1a/src/werkzeug/wsgi.py#L957-L963

    According to the contract of RawIOBase, read can return fewer bytes than passed in the size argument:

    Fewer than size bytes may be returned if the operating system call returns fewer than size bytes.

    This does not, however, mean that the underlying stream won't return more bytes when read is called again (size is an upper bound). Raising an error on the condition in line 962, if to_read and len(read) != to_read, breaks the contract of IO.read.

    To add some context: This is a pretty significant problem for our server implementation in LocalStack that builds on Hypercorn, where we expose an ASGIReceiveCallable as a readable stream and plug it into Werkzeug. Our stream behaves according to IO.read, but Werkzeug will raise unexpected ClientDisconnect errors, even though there is still data in the stream.

    It seems that the condition could be safely removed, or at least weakened to if to_read and not len(read), which would indicate an unexpected EOF.
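
    A minimal sketch of the mismatch, using a hypothetical raw stream that returns at most a few bytes per read (as a socket-backed stream may do):

    import io

    from werkzeug.wsgi import LimitedStream

    class ShortReadStream(io.RawIOBase):
        """Returns at most 3 bytes per read call, even though more data remains."""

        def __init__(self, data):
            self._buf = io.BytesIO(data)

        def readable(self):
            return True

        def readinto(self, b):
            chunk = self._buf.read(min(3, len(b)))
            b[:len(chunk)] = chunk
            return len(chunk)

    stream = LimitedStream(ShortReadStream(b"hello world"), 11)
    # On the affected versions this is reported to raise ClientDisconnect,
    # even though repeated reads would eventually return all 11 bytes.
    stream.read(11)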

    Environment:

    • Python version: 3.10
    • Werkzeug version: 2.2.2, but the code has been affected for a while: https://github.com/pallets/werkzeug/commit/b677ecc9e089a2251f800d9ff26548eb3040d884
    opened by thrau 0
  • The existing means of obtaining docker container IDs no longer valid

    The default use of cgroups v2 in new kernels and new versions of Docker makes the existing means of obtaining container IDs no longer valid.

    /proc/self/cgroup now in my Docker container:

    root@<container-id>:/proc/self# cat /proc/self/cgroup
    0::/
    root@<container-id>:/proc/self#
    
    opened by mrjesen 0
Releases(2.2.2)
  • 2.2.2(Aug 8, 2022)

    This is a fix release for the 2.2.0 feature release.

    • Changes: https://werkzeug.palletsprojects.com/en/2.2.x/changes/#version-2-2-2
    • Milestone: https://github.com/pallets/werkzeug/milestone/25?closed=1
  • 2.2.1(Jul 27, 2022)

    This is a fix release for the 2.2.0 feature release.

    • Changes: https://werkzeug.palletsprojects.com/en/2.2.x/changes/#version-2-2-1
    • Milestone: https://github.com/pallets/werkzeug/milestone/24?closed=1
  • 2.2.0(Jul 23, 2022)

    This is a feature release, which includes new features and removes previously deprecated features. The 2.2.x branch is now the supported bugfix branch, the 2.1.x branch will become a tag marking the end of support for that branch. We encourage everyone to upgrade, and to use a tool such as pip-tools to pin all dependencies and control upgrades.

    • Changes: https://werkzeug.palletsprojects.com/en/2.2.x/changes/#version-2-2-0
    • Milestone: https://github.com/pallets/werkzeug/milestone/20?closed=1
  • 2.1.2(Apr 28, 2022)

    This is a fix release for the 2.1.0 feature release.

    • Changes: https://werkzeug.palletsprojects.com/en/2.1.x/changes/#version-2-1-2
    • Milestone: https://github.com/pallets/werkzeug/milestone/22?closed=1
  • 2.1.1(Apr 1, 2022)

    This is a fix release for the 2.1.0 feature release.

    • Changes: https://werkzeug.palletsprojects.com/en/2.1.x/changes/#version-2-1-1
    • Milestone: https://github.com/pallets/werkzeug/milestone/19?closed=1
  • 2.1.0(Mar 28, 2022)

    This is a feature release, which includes new features and removes previously deprecated features. The 2.1.x branch is now the supported bugfix branch, the 2.0.x branch will become a tag marking the end of support for that branch. We encourage everyone to upgrade, and to use a tool such as pip-tools to pin all dependencies and control upgrades.

    • Changes: https://werkzeug.palletsprojects.com/en/2.1.x/changes/#version-2-1-0
    • Milestone: https://github.com/pallets/werkzeug/milestone/16?closed=1
  • 2.0.3(Feb 7, 2022)

    • Changes: https://werkzeug.palletsprojects.com/en/2.0.x/changes/#version-2-0-3
    • Milestone: https://github.com/pallets/werkzeug/milestone/18?closed=1
  • 2.0.2(Oct 6, 2021)

  • 2.0.1(May 17, 2021)

  • 2.0.0(May 12, 2021)

    New major versions of all the core Pallets libraries, including Werkzeug 2.0, have been released! :tada:

    • Read the announcement on our blog: https://palletsprojects.com/blog/flask-2-0-released/
    • Read the full list of changes: https://werkzeug.palletsprojects.com/changes/#version-2-0-0
    • Retweet the announcement on Twitter: https://twitter.com/PalletsTeam/status/1392266507296514048
    • Follow our blog, Twitter, or GitHub to see future announcements.

    This represents a significant amount of work, and there are quite a few changes. Be sure to carefully read the changelog, and use tools such as pip-compile and Dependabot to pin your dependencies and control your updates.

  • 2.0.0rc5(May 3, 2021)

  • 2.0.0rc4(Apr 16, 2021)

  • 2.0.0rc3(Mar 17, 2021)

    • Changes: https://werkzeug.palletsprojects.com/en/master/changes/#version-2-0-0

    Use the --pre flag to install this pre-release:

    pip install --pre Werkzeug==2.0.0rc3
    
  • 2.0.0rc2(Mar 3, 2021)

    • Changes: https://werkzeug.palletsprojects.com/en/master/changes/#version-2-0-0

    Use the --pre flag to install this pre-release:

    pip install --pre Werkzeug==2.0.0rc2
    
  • 2.0.0rc1(Feb 9, 2021)

    • Changes: https://werkzeug.palletsprojects.com/en/master/changes/#version-2-0-0

    Use the --pre flag to install this pre-release:

    pip install --pre Werkzeug==2.0.0rc1
    
  • 1.0.1(Mar 31, 2020)

  • 1.0.0(Mar 31, 2020)

    After 13 years of development, we're finally 1.0!

    Note that previously deprecated code has been removed in this release. Use 0.16.1 as an intermediate step to see deprecation warnings and upgrade.

    • Blog: https://palletsprojects.com/blog/werkzeug-1-0-0-released
    • Changelog: https://werkzeug.palletsprojects.com/en/1.0.x/changes/#version-1-0-0
    • Twitter: https://twitter.com/PalletsTeam/status/1225561504004689920
  • 1.0.0rc1(Jan 31, 2020)

    • Changes: https://werkzeug.palletsprojects.com/en/master/changes/#version-1-0-0

    Use the --pre flag to install this pre-release:

    pip install --pre Werkzeug==1.0.0rc1
    
  • 0.16.1(Jan 27, 2020)

  • 0.16.0(Sep 19, 2019)

    Most of the top-level attributes in the werkzeug module are now deprecated, and will be removed in 1.0.0.

    For example, instead of import werkzeug; werkzeug.url_quote, do from werkzeug.urls import url_quote. A deprecation warning will show the correct import to use. werkzeug.exceptions and werkzeug.routing should also be imported instead of accessed, but for technical reasons can’t show a warning.

    • Blog: https://palletsprojects.com/blog/werkzeug-0-16-0-released
    • Changelog: https://werkzeug.palletsprojects.com/en/0.16.x/changes/#version-0-16-0
  • 0.15.6(Sep 4, 2019)

    The issue causing the reloader to fail when running from a setuptools entry point (like flask run) on Windows has been fixed.

    • Changelog: http://werkzeug.palletsprojects.com/en/0.15.x/changes/#version-0-15-6
  • 0.15.5(Jul 17, 2019)

  • 0.15.4(May 15, 2019)

    • Blog: https://palletsprojects.com/blog/werkzeug-0-15-3-released/
    • Changes: https://werkzeug.palletsprojects.com/en/0.15.x/changes/#version-0-15-4
  • 0.15.3(May 15, 2019)

    • Blog: https://palletsprojects.com/blog/werkzeug-0-15-3-released/
    • Changes: https://werkzeug.palletsprojects.com/en/0.15.x/changes/#version-0-15-3
  • 0.15.2(Apr 2, 2019)

    • Blog: https://palletsprojects.com/blog/werkzeug-0-15-2-released/
    • Changes: https://werkzeug.palletsprojects.com/en/0.15.x/changes/#version-0-15-2
  • 0.15.1(Mar 21, 2019)

  • 0.15.0(Mar 19, 2019)
