
sklearn-docker-api

 _______________________________________
/ A minimalistic example of preparing a \
| model for (synchronous) inference in  |
\ production.                           /
 ---------------------------------------
        \   ^__^
         \  (oo)\_______
            (__)\       )\/\
                ||----w |
                ||     ||

Getting started

  1. Clone or download this repository: git clone https://github.com/crocopie/sklearn-docker-api

  2. Install docker-ce and docker-compose

  3. Train an example iris classifier: python experiments/train.py

  4. Build a Docker image and deploy the model: docker-compose -f dev.compose.yml up --build

  5. This starts a uvicorn web server inside the Docker container and exposes a FastAPI app with a /classify endpoint (see the example request after this list)

  6. Enjoy the autogenerated API docs at http://127.0.0.1:9090/docs. Press the Try it out button for an interactive example.
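
A quick way to exercise the deployed model outside of the interactive docs is to POST to the /classify endpoint directly. The sketch below is a minimal client, assuming the request body carries the four iris measurements under the field names shown; the actual schema is defined in service/api.py, so check http://127.0.0.1:9090/docs and adjust the names if they differ.

```python
# Minimal client sketch for the /classify endpoint.
# Assumes the service is running locally on port 9090 (see dev.compose.yml)
# and that it accepts these (illustrative) field names as JSON.
import requests  # third-party: pip install requests

payload = {
    "sepal_length": 5.1,
    "sepal_width": 3.5,
    "petal_length": 1.4,
    "petal_width": 0.2,
}

response = requests.post("http://127.0.0.1:9090/classify", json=payload)
response.raise_for_status()
print(response.json())  # e.g. the predicted iris species
```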

How to build your own project based on this example

Directory structure overview

.
├── data  # training data
│   └── iris.csv
├── experiments  # here you can keep the training workflows and notebooks
│   └── train.py
├── models  # a place for your trained model dumps
│   └── svm.joblib
├── service  # api service directory
│   ├── api.py  # the main script for our model server
│   └── utils.py  # utility functions, e.g. a logging helper
├── .dockerignore  # list of files that aren't needed inside the docker container
├── conda-env.yml  # conda environment configuration
├── Dockerfile  # docker image configuration
├── dev.compose.yml  # docker-compose configuration for a DEVELOPMENT deployment
└── prod.compose.yml  # docker-compose configuration for a PRODUCTION deployment
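
To make the layout concrete, here is an illustrative sketch of the two scripts referenced above. Neither is the repository's actual code: the CSV column names, SVM settings and request field names are assumptions; only the file paths, the /classify route and the model dump name come from this README.

```python
# experiments/train.py (sketch) -- train an iris classifier and dump it for the service.
import pandas as pd
from joblib import dump
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

df = pd.read_csv("data/iris.csv")
X = df.drop(columns=["species"])   # "species" as the label column is an assumption
y = df["species"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

clf = SVC()                        # svm.joblib suggests an SVM; hyperparameters left at defaults
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))

dump(clf, "models/svm.joblib")     # the API service loads this dump at startup
```

```python
# service/api.py (sketch) -- FastAPI model server exposing POST /classify.
from fastapi import FastAPI
from joblib import load
from pydantic import BaseModel

app = FastAPI()
model = load("models/svm.joblib")  # loaded once when uvicorn starts the app


class IrisFeatures(BaseModel):     # illustrative request schema
    sepal_length: float
    sepal_width: float
    petal_length: float
    petal_width: float


@app.post("/classify")
def classify(features: IrisFeatures):
    row = [[features.sepal_length, features.sepal_width,
            features.petal_length, features.petal_width]]
    prediction = model.predict(row)[0]
    return {"prediction": str(prediction)}
```

Loading the joblib dump once at import time keeps each /classify call a cheap, synchronous in-memory prediction, which is the point of this example.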

Where to look first

To be continued
