SatExtractor

Build, deploy and extract satellite public constellations with one command line.

Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Contributing
  5. License

About The Project

  • tldr: SatExtractor gets all revisits in a date range from a given geojson region for any public satellite constellation and stores them in a cloud-friendly format.

The sheer volume of satellite image data makes it difficult to create training datasets quickly and reliably. Existing methods for extracting satellite images take a long time to process and impose user quotas that restrict access.

Therefore, we created SatExtractor, an open-source extraction tool that performs worldwide dataset extractions using serverless providers such as Google Cloud Platform or AWS, built on a common existing standard: STAC.

The tool scales horizontally as needed, extracting revisits and storing them in zarr format so they can easily be consumed by deep learning models.

It is fully configurable using Hydra.
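
As a quick illustration of the Hydra setup, the composed configuration can be inspected programmatically with Hydra's compose API before launching a run. This is a minimal sketch, not part of SatExtractor itself; the override keys mirror the config.yaml fields described in the Usage section below and may differ in your checkout.

    # Minimal sketch using Hydra's compose API (pip install hydra-core).
    # Assumes it is run from a script placed in the repo root, next to conf/.
    # The override keys (dataset_name, start_date, end_date) are assumptions
    # based on the example config.yaml shown below, not a fixed spec.
    from hydra import compose, initialize
    from omegaconf import OmegaConf

    with initialize(config_path="conf"):
        cfg = compose(
            config_name="config",
            overrides=[
                "dataset_name=my_test_region",
                "start_date=2020-01-01",
                "end_date=2020-02-01",
            ],
        )
        print(OmegaConf.to_yaml(cfg))  # inspect the fully composed config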

(back to top)

Getting Started

SatExtractor needs a cloud provider to work. Before you start using it, you'll need to create and configure a cloud provider account.

We provide an implementation for Google Cloud, but SatExtractor is designed to be easily extensible to other providers.

Structure

The package is structured in a modular and configurable way. It is essentially a pipeline of six steps, each separated into its own module.

  • Builder: contains the logic to build the container that will run the extraction.

    more info: SatExtractor runs inside a Docker container. The Dockerfile in the root dir builds the core package, and a reference to the specific provider extraction logic has to be added to it explicitly (see the gcp example in the providers/gcp directory).

    This is done by setting the ENV PROVIDER variable to point to the provider directory. In the default Dockerfile it is set to gcp: ENV PROVIDER providers/gcp.

  • Stac: converts a public constellation to the STAC standard.
    more info: If the original constellation is not already in the STAC standard it has to be converted. To do so, you have to implement the constellation-specific STAC converter. Sentinel-2 and Landsat 7/8 examples can be found in src/satextractor/stac. The function that is actually called to perform the conversion to the STAC standard is set in the stac Hydra config file (conf/stac/gcp.yaml).
  • Tiler: creates tiles of the given region to perform the extraction.
    more info: The Tiler splits the region into UTM tiles using the SentinelHub splitter. There will be one extraction task per tile. The tiler config can be found in conf/tiler/utm.yaml, where the size of the tiles can be specified. Note that these tiles are not the actual patches that are later stored in your cloud provider; they are just the unit from which the (smaller) patches will be extracted (a toy sketch of this tile-to-patch split follows this list).
  • Scheduler: decides how those tiles are going to be scheduled by creating extraction tasks.

    more info: The Scheduler takes the resulting tiles from the Tiler and creates the actual patches (also called tiles) to be extracted.

    For example, if the Tiler split the region into 10000x10000 tiles, the Scheduler can be set to extract smaller patches from each tile of, say, 1000x1000. The Scheduler also calculates the intersection between the patches and the constellation STAC assets. At the end, you'll have an object called ExtractionTask with the information to extract one revisit, one band and one tile split into multiple patches. This ExtractionTask is sent to the cloud provider to perform the actual extraction.

    The scheduler config can be found in conf/scheduler/utm.yaml.

  • Preparer: prepares the files in cloud storage.

    more info: The Preparer creates the cloud file structure. It creates the zarr groups and arrays needed to later store the extracted patches.

    The gcp preparer config can be found in conf/preparer/gcp.yaml.

  • Deployer: deploys the extraction tasks created by the scheduler to perform the extraction.
    more info: The Deployer sends one message per ExtractionTask to the cloud provider to perform the actual extraction. It works by publishing messages to a PubSub queue that the extraction service is subscribed to. When a new message (an ExtractionTask) arrives, it is automatically run on the cloud with autoscaling (a minimal Pub/Sub publishing sketch follows below). The gcp deployer config can be found in conf/deployer/gcp.yaml.
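
As a rough illustration of the tile-to-patch split described in the Tiler and Scheduler steps above, the sketch below cuts a 10000x10000 m UTM tile into 1000x1000 m patches and bundles them into a task-like record. The PatchTask class and its fields are hypothetical stand-ins for illustration only; the real ExtractionTask produced by the Scheduler may look different.

    # Hypothetical sketch: split one UTM tile into patches for a single
    # revisit and band. PatchTask is a made-up stand-in; the real
    # ExtractionTask in SatExtractor may have different fields.
    from dataclasses import dataclass
    from typing import List, Tuple

    BBox = Tuple[float, float, float, float]  # (minx, miny, maxx, maxy) in UTM metres

    @dataclass
    class PatchTask:
        tile_id: str
        constellation: str
        band: str
        revisit_date: str
        patch_bounds: List[BBox]

    def split_tile(minx: float, miny: float,
                   tile_size: int = 10_000, patch_size: int = 1_000) -> List[BBox]:
        """Enumerate patch bounding boxes covering one square tile."""
        return [
            (minx + dx, miny + dy, minx + dx + patch_size, miny + dy + patch_size)
            for dx in range(0, tile_size, patch_size)
            for dy in range(0, tile_size, patch_size)
        ]

    task = PatchTask(
        tile_id="32N_500000_4600000",   # made-up identifier
        constellation="sentinel-2",
        band="B04",
        revisit_date="2020-01-03",
        patch_bounds=split_tile(500_000, 4_600_000),
    )
    print(len(task.patch_bounds))  # 100 patches of 1000x1000 m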

All the steps are optional and the user decides which ones to run in the main config file.
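
To make the Deployer step concrete, here is a minimal, hedged sketch of publishing one serialized task per message to a Pub/Sub topic with the official google-cloud-pubsub client. The project id, topic name and payload are placeholders; the actual deployer in SatExtractor may serialize and route ExtractionTasks differently.

    # Minimal Pub/Sub publishing sketch (pip install google-cloud-pubsub,
    # GCP credentials required). Project id, topic name and payload are
    # placeholders, not SatExtractor's actual message format.
    import json
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-gcp-project", "extraction-tasks")

    tasks = [{"tile_id": "32N_500000_4600000", "band": "B04", "revisit_date": "2020-01-03"}]
    for task in tasks:
        future = publisher.publish(topic_path, data=json.dumps(task).encode("utf-8"))
        print("published message id:", future.result())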

Prerequisites

To run SatExtractor we recommend using a virtual environment, and a cloud provider user should already have been created.

Installation

  1. Clone the repo
    git clone https://github.com/FrontierDevelopmentLab/sat-extractor
  2. Install python packages
    pip install .

(back to top)

Usage

πŸ”΄ πŸ”΄ πŸ”΄

- WARNING!!!!:
Running SatExtractor will use your billable cloud provider services.
We strongly recommend testing it with a small region to check that everything is working correctly.
Be sure you are running all your cloud provider services in the same region to avoid extra costs.

πŸ”΄ πŸ”΄ πŸ”΄

Once a cloud provider user is set up and the package is installed, you'll need to grab the geojson region you want (you can get it from the super-cool tool geojson.io) and change the config files.

  1. Save the region as a .geojson file and store it in the outputs folder (you can change your output dir in config.yaml).
  2. Open config.yaml and you'll see something like this:

[screenshot of the example config.yaml]

The important thing here is to set the dataset_name, define the start_date and end_date for your revisits, choose your constellations, and choose the tasks to be run (you'll probably want to run the build task only once and then comment it out).

Important: the token.json contains the credentials needed to access your cloud provider. In this example it contains the gcp credentials. You'll need to provide it.
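
For step 1 above, any tool that writes valid GeoJSON works. A minimal sketch, assuming the default outputs folder and a made-up file name (use whatever output directory your config.yaml points to):

    # Write a small test polygon (e.g. drawn on geojson.io) to the outputs
    # folder. Folder and file name are examples only.
    import json
    from pathlib import Path

    region = {
        "type": "FeatureCollection",
        "features": [{
            "type": "Feature",
            "properties": {},
            "geometry": {
                "type": "Polygon",
                "coordinates": [[  # small area, WGS84 lon/lat
                    [-1.28, 51.74], [-1.24, 51.74], [-1.24, 51.77],
                    [-1.28, 51.77], [-1.28, 51.74],
                ]],
            },
        }],
    }

    out = Path("outputs") / "my_region.geojson"
    out.parent.mkdir(parents=True, exist_ok=True)
    out.write_text(json.dumps(region))
    print(f"wrote {out}")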

  1. Open the cloud/ .yaml and add your account info there, as in the default provided file. (Optional): you can choose different configurations by changing the module configs: builder, stac, tiler, scheduler, preparer, etc. There you can change things like patch_size and chunk_size.

  2. Run python src/satextractor/cli.py and enjoy!
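
Once an extraction has finished, the revisits live as zarr groups and arrays in your cloud bucket (created by the Preparer). Here is a hedged sketch of inspecting them with gcsfs and zarr; the bucket and dataset path below are placeholders, so check your storage bucket for the actual layout:

    # Open the extracted archive straight from Google Cloud Storage
    # (pip install zarr gcsfs). Bucket and path are placeholders.
    import gcsfs
    import zarr

    fs = gcsfs.GCSFileSystem()                     # uses your GCP credentials
    store = fs.get_mapper("my-bucket/my_dataset")  # placeholder path
    root = zarr.open_group(store, mode="r")

    print(root.tree())  # browse the group/array hierarchy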

(back to top)

See the open issues for a full list of proposed features (and known issues).

(back to top)

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

License

Distributed under the BSD 2 License. See LICENSE.txt for more information.

(back to top)

Acknowledgments

This work is the result of the 2021 ESA Frontier Development Lab World Food Embeddings team. We are grateful to all organisers, mentors and sponsors for providing us this opportunity. We thank Google Cloud for providing computing and storage resources to complete this work.

Comments
  • Dockerfile path not found

    In the gcp builder the dockerfile path is set to dockerfile_path = Path(__file__).parents[3], which returns '/home/fran/miniconda3/envs/sat-extractor/lib/python3.9'.

    It should be changed to pass the path as a parameter.

    bug 
    opened by frandorr 3
  • Loosen version locking

    I think it might be necessary to loosen the dependency version locking in setup.py a bit. Quite difficult to e.g. pip install prefect[google] sat-extractor because of the conflicts.

    opened by carderne 3
  • DLQ, Pyarrow, backoff

    Without pyarrow installed, I got the following error:

    2021-11-05 16:10:20.389 | INFO     | __main__:stac:33 - using satextractor.stac.gcp_region_to_item_collection stac creator.
    Error executing job with overrides: []
    Traceback (most recent call last):
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 62, in _call_target
        return _target_(*args, **kwargs)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 47, in gcp_region_to_item_collection
        df = get_sentinel_2_assets_df(client, region, start_date, end_date)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 168, in get_sentinel_2_assets_df
        dfs.append(query_job.to_dataframe())
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/job/query.py", line 1644, in to_dataframe
        return query_result.to_dataframe(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1938, in to_dataframe
        record_batch = self.to_arrow(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1713, in to_arrow
        raise ValueError(_NO_PYARROW_ERROR)
    ValueError: The pyarrow library is not installed, please install pyarrow to use the to_arrow() function.
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "sat-extractor/src/satextractor/cli.py", line 185, in main
        stac(cfg)
      File "sat-extractor/src/satextractor/cli.py", line 44, in stac
        item_collection = hydra.utils.call(
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 180, in instantiate
        return instantiate_node(config, *args, recursive=_recursive_, convert=_convert_)
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 249, in instantiate_node
        return _call_target(_target_, *args, **kwargs)
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 64, in _call_target
        raise type(e)(
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 62, in _call_target
        return _target_(*args, **kwargs)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 47, in gcp_region_to_item_collection
        df = get_sentinel_2_assets_df(client, region, start_date, end_date)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 168, in get_sentinel_2_assets_df
        dfs.append(query_job.to_dataframe())
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/job/query.py", line 1644, in to_dataframe
        return query_result.to_dataframe(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1938, in to_dataframe
        record_batch = self.to_arrow(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1713, in to_arrow
        raise ValueError(_NO_PYARROW_ERROR)
    ValueError: Error instantiating 'satextractor.stac.stac.gcp_region_to_item_collection' : The pyarrow library is not installed, please install pyarrow to use the to_arrow() function.
    
    
    opened by carderne 3
  • Fix lazy scheduler

    This is much smaller than it looks, basically just:

    1. Add back support for passing item_collection as an already-loaded ItemCollection.
    2. Rename item to it on L132 to avoid shadowing item from the main loop.
    opened by carderne 2
  • satextractor.schedler causing circular import

    The first line of this imports gcp_scheduler: https://github.com/FrontierDevelopmentLab/sat-extractor/blob/d50713143c12c62b56927b526826fedfb52e77b4/src/satextractor/scheduler/__init__.py#L1-L2

    Which itself imports from the same __init__.py again, causing a circular loop: https://github.com/FrontierDevelopmentLab/sat-extractor/blob/d50713143c12c62b56927b526826fedfb52e77b4/src/satextractor/scheduler/gcp_scheduler.py#L8

    opened by carderne 2
  • Get DLQ working

    @frandorr @Lkruitwagen

    From here, the service account for PubSub looks like this:

    PUBSUB_SERVICE_ACCOUNT="service-${project-number}@gcp-sa-pubsub.iam.gserviceaccount.com"
    

    i.e. instead of using the service account from the token.json.

    From CLI:

    PROJ_NUMBER=$(gcloud projects list \
    --filter="$(gcloud config get-value project)" \
    --format="value(PROJECT_NUMBER)")
    
    PUBSUB_SERVICE_ACCOUNT="s[email protected]"
    

    And then bind the account as already done:

    gcloud pubsub topics add-iam-policy-binding "$DLQ_TOPIC" \
      --member="serviceAccount:$PUBSUB_SERVICE_ACCOUNT" \
      --role=roles/pubsub.publisher
    
    gcloud pubsub subscriptions add-iam-policy-binding "$MAIN_SUBSCRIPTION" \
      --member="serviceAccount:$PUBSUB_SERVICE_ACCOUNT" \
      --role=roles/pubsub.subscriber
    
    opened by carderne 2
  • GCP dead-letter-queue not properly being created

    When the pubsub Cloud Run subscription is created, it also creates a DLQ, but it doesn't assign the correct roles and permissions and doesn't create the topic for the DLQ.

    As the DLQ doesn't exist, the extraction task messages that fail will loop forever in the main queue, restarting the Cloud Run service until the messages are manually purged.

    We should add these permissions and create the DLQ topic automatically to avoid an infinite loop.

    bug 
    opened by frandorr 2
  • mask and percentiles

    These don't seem to be used for anything?

    https://github.com/FrontierDevelopmentLab/sat-extractor/blob/7a0821360ffdab8403563ca651b5bd43ecca3dc4/src/satextractor/preparer/preparer.py#L35-L51

    opened by carderne 1
  • Succeeds but get error

    @frandorr just sharing this here from last week

    [2021-10-29 13:17:48,746][grpc._plugin_wrapping][ERROR] - AuthMetadataPluginCallback "<google.auth.transport.grpc.AuthMetadataPlugin object at 0x7fc4d86812b0>" raised exception!
    Traceback (most recent call last):
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/grpc/_plugin_wrapping.py", line 89, in __call__
        self._metadata_plugin(
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/transport/grpc.py", line 101, in __call__
        callback(self._get_authorization_headers(context), None)
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/transport/grpc.py", line 87, in _get_authorization_headers
        self._credentials.before_request(
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/credentials.py", line 134, in before_request
        self.apply(headers)
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/credentials.py", line 110, in apply
        _helpers.from_bytes(token or self.token)
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/_helpers.py", line 129, in from_bytes
        raise ValueError("{0!r} could not be converted to unicode".format(value))
    ValueError: None could not be converted to unicode
    
    opened by carderne 1
  • Only works with pip install -e .

    This:

    cd sat-extractor
    pyenv global 3.9.7
    mkvirtualenv test
    pip install .
    python ./src/satextractor/cli.py
    

    Fails with:

    ImportError: Encountered error: `No module named 'satextractor.builder'` when
    loading module 'satextractor.builder.gcp_builder.build_gcp'
    

    However, installing with pip install -e . (what I did initially, which is why I didn't notice this) works fine.

    Maybe because when using a non-editable install, cli.py gets confused about whether it should be looking for modules in its directory or in the somewhere-else/site-packages/satextractor directory...

    opened by carderne 1
  • Feat/append data

    • add overwrite to main config
    • create a hash string for the config spec (for unique date-range & constellation combination)
    • if the extraction archive already exists, resize any existing data and masks for the union of the new timeseries. Overwrite the timeseries with the union prior to extraction.
    opened by Lkruitwagen 0
  • For consideration: Bring main extract function into package

    @Lkruitwagen @frandorr

    Moving all the extract_patches logic into the main package under satextractor.extractor, leaving only the HTTP and BQ stuff in the Flask app.

    Seems cleaner, and gives the option to do the extractions in-process if wanted! Could make it easier to try out sat-extractor on a local machine with a local output directory, no need to spin up PubSub, CloudRun etc.

    opened by carderne 0
  • Store bands info

    Current implementation doesn't store the bands for each constellation. It would be nice to have that info stored. Some ideas:

    • Store a simple metadata json at constellation level (easiest)
    • At the end of each extraction create a STAC catalog that contains metadata info (maybe better but would take longer to implement)
    • Store at array level the info, something like xarray coordinates.
    enhancement 
    opened by frandorr 0
Releases (v0.3.3)
  • v0.3.3(Dec 7, 2021)

    What's Changed

    • fix COG reader by @Lkruitwagen in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/27
    • Remove download for jp2. Now using same function for geotiff and jp2 with rio. Remove gdal by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/28

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.3.2...v0.3.3

    Source code(tar.gz)
    Source code(zip)
  • v0.3.2(Dec 7, 2021)

    What's Changed

    • Remove COG downloader and use same as jp2k because it was buggy by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/26

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.3.1...v0.3.2

    Source code(tar.gz)
    Source code(zip)
  • v0.3.1(Dec 6, 2021)

    What's Changed

    • Fix Dockerfile path regression by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/24
    • Fix rescaling bug by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/25

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.3.0...v0.3.1

    Source code(tar.gz)
    Source code(zip)
  • v0.3.0(Dec 2, 2021)

    What's Changed

    • Make tile IDs globally unique by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/17
    • Improve Band common_names and Tile properties by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/21
    • Add task id to vsimem to avoid multiple tasks using the same in-memory file by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/22

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.1.1...v0.3.0

    Source code(tar.gz)
    Source code(zip)
  • v0.1.1(Nov 18, 2021)

    What's Changed

    • Feat/typos+readme by @Lkruitwagen in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/7
    • README clarifications, add pyarrow, specify platform in gcloud run by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/8
    • DLQ, Pyarrow, backoff by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/11
    • Fix constellations bug. by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/12
    • Compatible deps; refactor build_gcp; explicit Dockerfile; ItemCollection as object by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/14
    • Deployer returns job_id so callers can track monitor tables by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/15
    • Fix jp2 lossy compression bug

    New Contributors

    • @Lkruitwagen made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/7

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.1.0...v0.1.1

    Source code(tar.gz)
    Source code(zip)
  • v0.1.0(Oct 28, 2021)

    What's Changed

    • Small README and config improvements by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/2
    • small README modif by @rramosp in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/3
    • Add init file fixes #4 by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/5

    New Contributors

    • @carderne made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/2
    • @rramosp made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/3
    • @frandorr made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/5

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/commits/v0.1.0

    Source code(tar.gz)
    Source code(zip)
Owner
Frontier Development Lab
For more projects, see: https://gitlab.com/frontierdevelopmentlab