Turn a STAC catalog into a dask-based xarray

Overview

StackSTAC


Turn a list of STAC items into a 4D xarray DataArray (dims: time, band, y, x), including reprojection to a common grid. The array is a lazy Dask array, so loading and processing the data in parallel—locally or on a cluster—is just a compute() call away.

For more information and examples, please see the documentation.

import stackstac
import satsearch

stac_items = satsearch.Search(
    url="https://earth-search.aws.element84.com/v0",
    intersects=dict(type="Point", coordinates=[-105.78, 35.79]),
    collections=["sentinel-s2-l2a-cogs"],
    datetime="2020-04-01/2020-05-01"
).items()

stack = stackstac.stack(stac_items)
print(stack)
<xarray.DataArray 'stackstac-f350f6bfc3213d7eee2e6cb159246d88' (time: 13, band: 17, y: 10980, x: 10980)>
dask.array<fetch_raster_window, shape=(13, 17, 10980, 10980), dtype=float64, chunksize=(1, 1, 1024, 1024), chunktype=numpy.ndarray>
Coordinates: (12/23)
  * time                        (time) datetime64[ns] 2020-04-01T18:04:04 ......
    id                          (time) <U24 'S2B_13SDV_20200401_0_L2A' ... 'S...
  * band                        (band) <U8 'overview' 'visual' ... 'WVP' 'SCL'
  * x                           (x) float64 4e+05 4e+05 ... 5.097e+05 5.098e+05
  * y                           (y) float64 4e+06 4e+06 ... 3.89e+06 3.89e+06
    eo:cloud_cover              (time) float64 29.24 1.16 27.26 ... 87.33 5.41
    ...                          ...
    data_coverage               (time) object 33.85 100 33.9 ... 32.84 100 34.29
    platform                    (time) <U11 'sentinel-2b' ... 'sentinel-2b'
    sentinel:sequence           <U1 '0'
    proj:epsg                   int64 32613
    sentinel:data_coverage      (time) float64 33.85 100.0 33.9 ... 100.0 34.29
    title                       (band) object None ... 'Scene Classification ...
Attributes:
    spec:        RasterSpec(epsg=32613, bounds=(399960.0, 3890220.0, 509760.0...
    crs:         epsg:32613
    transform:   | 10.00, 0.00, 399960.00|\n| 0.00,-10.00, 4000020.00|\n| 0.0...
    resolution:  10.0

Once in xarray form, many operations become easy. For example, we can compute a low-cloud weekly mean-NDVI timeseries:

lowcloud = stack[stack["eo:cloud_cover"] < 40]
nir, red = lowcloud.sel(band="B08"), lowcloud.sel(band="B04")
ndvi = (nir - red) / (nir + red)
weekly_ndvi = ndvi.resample(time="1w").mean(dim=("time", "x", "y")).rename("NDVI")
# Call `weekly_ndvi.compute()` to process ~25GiB of raster data in parallel. Might want a dask cluster for that!
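
To actually run that computation, a Dask cluster helps. A minimal sketch (assuming the `weekly_ndvi` array from above, with a local `dask.distributed` cluster standing in for whatever cluster you have):

# A hedged sketch: start a local Dask cluster (or connect to an existing scheduler)
# so the reads happen in parallel across workers, then materialize the result.
from dask.distributed import Client

client = Client()  # or Client("tcp://<scheduler-address>:8786") for a remote cluster
timeseries = weekly_ndvi.compute()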

Installation

pip install stackstac

Things stackstac does for you:

  • Figure out the geospatial parameters from the STAC metadata (if possible): a coordinate reference system, resolution, and bounding box. These can also be set explicitly; see the sketch after this list.
  • Transfer the STAC metadata into xarray coordinates for easy indexing, filtering, and provenance of metadata.
  • Efficiently generate a Dask graph for loading the data in parallel.
  • Mediate between Dask's parallelism and GDAL's aversion to it, allowing for fast, multi-threaded reads when possible, and at least preventing segfaults when not.
  • Mask nodata and rescale by dataset-level scales/offsets.
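
When the automatic choices aren't what you want, the same parameters can be passed explicitly. A hedged sketch (the values here are arbitrary):

# A hedged sketch: override the grid stackstac would otherwise infer from STAC metadata.
stack = stackstac.stack(
    stac_items,
    epsg=32613,                                   # force this output CRS
    resolution=20,                                # in the units of that CRS (meters here)
    bounds_latlon=(-106.2, 35.6, -105.6, 36.0),   # clip to a WGS84 bounding box
    fill_value=0,                                 # value used where there's no data
    rescale=False,                                # skip dataset-level scales/offsets
)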

Limitations:

  • Raster data only! We are currently ignoring other types of assets you might find in a STAC (XML/JSON metadata, point clouds, video, etc.).
  • Single-band raster data only! Each band has to be a separate STAC asset—a separate red, green, and blue asset on each Item is great, but a single RGB asset containing a 3-band GeoTIFF is not supported yet.
  • COGs work best. "Normal" GeoTIFFs that aren't internally tiled, or don't have overviews, will see much worse performance. Sidecar files (like .msk files) are ignored for performance. JPEG2000 will probably work, but will probably be slow unless you buy Kakadu. Formats make a big difference.
  • BYOBlocksize. STAC doesn't offer any metadata about the internal tiling scheme of the data. Knowing it can make IO more efficient, but actually reading the data to figure it out is slow. So it's on you to set this parameter (see the sketch after this list). (But if you don't, things should be fine for any reasonable COG.)
  • Doesn't make geospatial data any easier to work with in xarray. Common operations (picking bands, clipping to bounds, etc.) are tedious to type out. Real geospatial operations (shapestats on a GeoDataFrame, reprojection, etc.) aren't supported at all. rioxarray might help with some of these, but it has limited support for Dask, so be careful you don't kick off a huge computation accidentally.
  • I haven't even written tests yet! Don't use this in production. Or do, I guess. Up to you.
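
For the blocksize point above, a hedged sketch (2048 is an arbitrary choice; ideally it matches, or is a multiple of, the COGs' internal tile size):

# A hedged sketch: set the dask chunk size explicitly instead of relying on the default.
stack = stackstac.stack(stac_items, chunksize=2048)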

Roadmap:

Short-term:

  • Write tests and add CI (including typechecking)
  • Support multi-band assets
  • Easier access to s3://-style URIs (right now, you'll need to pass in gdal_env=stackstac.DEFAULT_GDAL_ENV.updated(always=dict(session=rio.session.AWSSession(...))); see the sketch after this list)
  • Utility to guess blocksize (open a few assets)
  • Support item assets to provide more useful metadata with collections that use it (like S2 on AWS)
  • Rewrite dask graph generation once the Blockwise IO API settles
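
The s3:// workaround mentioned above looks roughly like this. A hedged sketch; the AWSSession options are placeholders for whatever credentials or settings your bucket needs:

# A hedged sketch: hand stackstac a GDAL environment whose rasterio session can reach S3.
import rasterio as rio
import stackstac

gdal_env = stackstac.DEFAULT_GDAL_ENV.updated(
    always=dict(session=rio.session.AWSSession(aws_unsigned=True))  # placeholder session options
)
stack = stackstac.stack(stac_items, gdal_env=gdal_env)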

Long term (if anyone uses this thing):

  • Support other readers (aiocogeo?) that may perform better than GDAL for specific formats
  • Interactive mapping with xarray_leaflet, made performant with some Dask graph-rewriting tricks to do the initial IO at coarser resolution for lower zoom levels (otherwise zooming out could process terabytes of data)
  • Improve ergonomics of xarray for raster data (in collaboration with rioxarray)
  • Implement core geospatial routines (warp, vectorize, vector stats, GeoPandas/spatialpandas interop) in Dask
Comments
  • Incorrect bounds and shape when stacking 3DEP items from Planetary Computer


    Hi @gjoseph92, I'm using stackstac.stack with two adjacent COGs from the Planetary Computer 3dep-seamless collection, and the output bounds and shape are incorrect. If I load them manually with xarray and concat them, the bounds are correct.

    Reproducing

    import planetary_computer
    from pystac_client import Client
    import stackstac
    import xarray as xr
    
    catalog = Client.open("https://planetarycomputer.microsoft.com/api/stac/v1")
    collection = "3dep-seamless"
    # This bbox spans two items that are adjacent in x dimension
    bbox = [-124.210979, 43.336502, -123.657496, 43.798309]
    
    search = catalog.search(
        collections=[collection],
        bbox=bbox,
        # 3DEP contains both 10m and 30m items
        filter={"eq": [{"property": "gsd"}, 30]},
    )
    
    signed = planetary_computer.sign(search)
    
    # Stack the two adjacent items with stackstac (doesn't work as expected)
    da_stackstac = stackstac.stack(signed, assets=["data"])
    
    # Stack the two items manually by opening and concatenating with xarray (works as expected)
    da_list = [xr.open_rasterio(item.assets["data"].href, chunks=1024) for item in signed]
    da_xarray = xr.concat(da_list, dim="time")
    

    Issues

    The bounds of the two arrays are quite different, and the stackstac version doesn't cover the search bbox:

    # Bounds don't cover the bbox: (-125.00167, 43.965540000000004, -123.96554, 44.001670000000004)
    print(stackstac.array_bounds(da_stackstac))
    
    # Bounds do cover the bbox: (-125.00152777777778, 42.99819444444364, -122.99819444444364, 44.001527777777774)
    print(stackstac.array_bounds(da_xarray))
    

    The array shapes (ignoring the time and band dims) are also much different:

    # (3613, 103613)
    print(da_stackstac.shape[2:])
    
    # (3612, 7224)
    print(da_xarray.shape[2:])
    

    Checking the proj:shape property for the items indicates their shape should be 3612 x 3612, so it looks like the stackstac version is off by 1 in the y dimension and 100k in the x dimension.

    Any thoughts on what's going on here? Thanks!

    For reference, I'm running stackstac=0.4.1 and xarray=2022.3.0.

    EDIT: This looked like it might be related to #132, but the proj:transform of the items seems to be in the correct order:

    >> signed[0].properties
    
    {'gsd': 30,
     'datetime': '2013-01-01T00:00:00Z',
     'proj:epsg': 5498,
     'proj:shape': [3612, 3612],
     'end_datetime': '2013-11-01T00:00:00Z',
     'proj:transform': [1e-05,
      0.0,
      -125.0016666667,
      0.0,
      -1e-05,
      44.00166666666,
      0.0,
      0.0,
      1.0],
     'start_datetime': '1999-02-01T00:00:00Z',
     'threedep:region': 'n40w130'}
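
    For comparison, a hedged diagnostic sketch (using the `signed` items from the snippet above) that derives the bounds implied by an item's own proj metadata:

    # A hedged diagnostic sketch: compute the footprint implied by proj:transform and
    # proj:shape, to compare against the bounds of the stacked array.
    from affine import Affine
    from rasterio.transform import array_bounds

    item = signed[0]
    transform = Affine(*item.properties["proj:transform"][:6])
    height, width = item.properties["proj:shape"]
    print(array_bounds(height, width, transform))  # (west, south, east, north)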
    
    opened by aazuspan 12
  • Remove calls to rio.parse_path (maintain compat with rasterio==1.3)


    opened by carderne 10
  • Fix for Pystac ItemCollections


    Follow on to #64 to work with Pystac ItemCollection https://pystac.readthedocs.io/en/latest/api.html#itemcollection.

    Partially addresses #65, but did not update all the docs and examples

    Not sure if I should add the poetry.lock file that I updated to test this out locally

    opened by scottyhq 10
  • Time coordinates are sometimes integers, not datetime64


    Noticed this in https://gist.github.com/rmg55/b144cb273d9ccfdf979e9843fdf5e651, and I've had it happen before myself:

    Coordinates:
      * time            (time) object 1594404158627000000 ... 1614276155393000000
    

    Pretty sure stackstac is correctly making it into a pandas DatetimeIndex: https://github.com/gjoseph92/stackstac/blob/b652a07f9b2ae27235aea4db4ef0f1f594fd8941/stackstac/prepare.py#L354-L358

    but something is going weird when xarray receives that, and it reinterprets it as an object array.
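
    A hedged client-side workaround in the meantime (assuming `stack` is the returned DataArray and the object values are nanosecond-since-epoch integers, as in the repr above):

    import pandas as pd

    # Re-coerce the object-dtype time coordinate back into a proper datetime64[ns] index.
    stack = stack.assign_coords(time=pd.to_datetime(stack.time.values.astype("int64")))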

    needs-future-test 
    opened by gjoseph92 9
  • 500 Internal Server Error when requesting tiles from within docker container


    This is a separate issue to discuss the separate problem I was running into last week (partially discussed in #96 and #97). I am running stackstac.show from within a JupyterLab notebook running in a docker container.

    When running show, I can see the checkerboard pattern, but not the data itself. Looking at developer tools, I can see that the tiles for the data itself are returning a 500 error:

    (screenshot: browser developer tools showing 500 responses for the tile requests)

    Unfortunately, the content of the response is just:

    500 Internal Server Error
    Server got itself in trouble
    

    Do you have any suggestions as to how we could debug this further, and get more information out of the server?

    needs-future-test 
    opened by robintw 8
  • Missing file handling


    While looking at data in various areas, I have come across missing bands, etc. Yesterday I hit this while testing stackstac:

    File "/home/ubuntu/anaconda3/envs/richard/lib/python3.8/site-packages/stackstac/rio_reader.py", line 393, in read reader = self.dataset File "/home/ubuntu/anaconda3/envs/richard/lib/python3.8/site-packages/stackstac/rio_reader.py", line 389, in dataset self._dataset = self._open() File "/home/ubuntu/anaconda3/envs/richard/lib/python3.8/site-packages/stackstac/rio_reader.py", line 330, in _open ds = SelfCleaningDatasetReader(rio.parse_path(self.url), sharing=False) File "rasterio/_base.pyx", line 262, in rasterio._base.DatasetBase.init rasterio.errors.RasterioIOError: HTTP response code: 404

    So perhaps stackstac could wrap this and report which asset it failed on, so the user can remove that one and try again, and find out whether it's a bad URL or the file is missing on the data provider's end?
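
    Until stackstac handles this itself, a hedged sketch of a pre-flight check (plain HTTP HEAD requests; `items` and the asset key are placeholders for your own search results):

    import requests

    def asset_ok(item, asset_key):
        """Return True if the asset's URL responds successfully (i.e. no 404)."""
        resp = requests.head(item.assets[asset_key].href, allow_redirects=True)
        return resp.ok

    good_items = [item for item in items if asset_ok(item, "B04")]  # placeholder asset key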

    good first issue 
    opened by RichardScottOZ 8
  • Coordinate computation does not take into account inverted axis in `center` mode


    To be compatible with rioxarray, one needs to use stackstac.stack(..., xy_coords="center") when computing X/Y coordinate values. When using this mode on data with an "inverted Y axis" (the most common scenario), the Y-axis coordinates are offset by one pixel size in the positive direction.

    I have made a small reproducer. The data is a global synthetic image with 1 degree per pixel in EPSG:4326; when loading it with xy_coords="center" you would expect the Y coordinate to span from -89.5 to 89.5, but instead it goes from 90.5 to -88.5.

    https://nbviewer.org/gist/Kirill888/b3dad8afdc10b37cd21af4aea8f417e3/stackstac-xy_coords-error-report.ipynb https://gist.github.com/Kirill888/b3dad8afdc10b37cd21af4aea8f417e3

    This causes issue reported earlier here: #68

    The code that computes the coordinates just offsets the "top-left" coordinate by a positive half pixel size, but it should instead offset by sign(coord[1] - coord[0]) * abs(resolution) * 0.5.
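
    A minimal sketch of that suggested fix (not the actual stackstac code):

    import numpy as np

    def edge_to_center(coord: np.ndarray, resolution: float) -> np.ndarray:
        # Shift pixel-edge coordinates half a pixel *toward* the next coordinate,
        # so the offset has the right sign for inverted (decreasing) axes too.
        return coord + np.sign(coord[1] - coord[0]) * abs(resolution) * 0.5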

    needs-future-test 
    opened by Kirill888 7
  • Cannot pick a common CRS, since assets have multiple CRSs: asset 'overview'


    lib/python3.9/site-packages/stackstac/prepare.py in prepare_items(items, assets, epsg, resolution, bounds, bounds_latlon, snap_bounds)
        142             out_epsg = asset_epsg
        143         elif out_epsg != asset_epsg:
    --> 144             raise ValueError(
        145                 f"Cannot pick a common CRS, since assets have multiple CRSs: asset {id!r} of item "
        146                 f"{item_i} {item['id']!r} is in EPSG:{asset_epsg}, "

    ValueError: Cannot pick a common CRS, since assets have multiple CRSs: asset 'overview' of item 1 'S2B_11UQS_20210501_0_L2A' is in EPSG:32611, but assets before it were in EPSG:32612.
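
    A hedged workaround sketch (`items` stands for the searched STAC items; the EPSG code and resolution are illustrative): pin a single output grid and stackstac will reproject the rest to it.

    # Force one output CRS so items/assets from different UTM zones can be combined.
    stack = stackstac.stack(items, epsg=32612, resolution=10)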

    opened by goriliukasbuxton 7
  • "Assets must have exactly 1 band"

    I tried stackstac for the first time today. I started by just running the Basic example from the documentation. Here is the code I ran

    import pystac_client
    import stackstac
    
    URL = "https://earth-search.aws.element84.com/v0"
    catalog = pystac_client.Client.open(URL)
    lon, lat = -105.78, 35.79
    
    items = catalog.search(
        intersects=dict(type="Point", coordinates=[lon, lat]),
        collections=["sentinel-s2-l2a-cogs"],
        datetime="2020-04-01/2020-05-01"
    ).get_all_items()
    
    stack = stackstac.stack(items)
    

    So far so good. Then I tried to actually compute something

    stack[0, 0].compute()
    

    This gave the following error

    ---------------------------------------------------------------------------
    RuntimeError                              Traceback (most recent call last)
    Cell In [4], line 1
    ----> 1 stack[0, 0].compute()
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataarray.py:993, in DataArray.compute(self, **kwargs)
        974 """Manually trigger loading of this array's data from disk or a
        975 remote source into memory and return a new array. The original is
        976 left unaltered.
       (...)
        990 dask.compute
        991 """
        992 new = self.copy(deep=False)
    --> 993 return new.load(**kwargs)
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataarray.py:967, in DataArray.load(self, **kwargs)
        949 def load(self: T_DataArray, **kwargs) -> T_DataArray:
        950     """Manually trigger loading of this array's data from disk or a
        951     remote source into memory and return this array.
        952 
       (...)
        965     dask.compute
        966     """
    --> 967     ds = self._to_temp_dataset().load(**kwargs)
        968     new = self._from_temp_dataset(ds)
        969     self._variable = new._variable
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py:733, in Dataset.load(self, **kwargs)
        730 import dask.array as da
        732 # evaluate all the dask arrays simultaneously
    --> 733 evaluated_data = da.compute(*lazy_data.values(), **kwargs)
        735 for k, data in zip(lazy_data, evaluated_data):
        736     self.variables[k].data = data
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py:600, in compute(traverse, optimize_graph, scheduler, get, *args, **kwargs)
        597     keys.append(x.__dask_keys__())
        598     postcomputes.append(x.__dask_postcompute__())
    --> 600 results = schedule(dsk, keys, **kwargs)
        601 return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/threaded.py:89, in get(dsk, keys, cache, num_workers, pool, **kwargs)
         86     elif isinstance(pool, multiprocessing.pool.Pool):
         87         pool = MultiprocessingPoolExecutor(pool)
    ---> 89 results = get_async(
         90     pool.submit,
         91     pool._max_workers,
         92     dsk,
         93     keys,
         94     cache=cache,
         95     get_id=_thread_get_id,
         96     pack_exception=pack_exception,
         97     **kwargs,
         98 )
        100 # Cleanup pools associated to dead threads
        101 with pools_lock:
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py:511, in get_async(submit, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, chunksize, **kwargs)
        509         _execute_task(task, data)  # Re-execute locally
        510     else:
    --> 511         raise_exception(exc, tb)
        512 res, worker_id = loads(res_info)
        513 state["cache"][key] = res
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py:319, in reraise(exc, tb)
        317 if exc.__traceback__ is not tb:
        318     raise exc.with_traceback(tb)
    --> 319 raise exc
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py:224, in execute_task(key, task_info, dumps, loads, get_id, pack_exception)
        222 try:
        223     task, data = loads(task_info)
    --> 224     result = _execute_task(task, data)
        225     id = get_id()
        226     result = dumps((result, id))
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py:119, in _execute_task(arg, cache, dsk)
        115     func, args = arg[0], arg[1:]
        116     # Note: Don't assign the subtask results to a variable. numpy detects
        117     # temporaries by their reference count and can execute certain
        118     # operations in-place.
    --> 119     return func(*(_execute_task(a, cache) for a in args))
        120 elif not ishashable(arg):
        121     return arg
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py:119, in <genexpr>(.0)
        115     func, args = arg[0], arg[1:]
        116     # Note: Don't assign the subtask results to a variable. numpy detects
        117     # temporaries by their reference count and can execute certain
        118     # operations in-place.
    --> 119     return func(*(_execute_task(a, cache) for a in args))
        120 elif not ishashable(arg):
        121     return arg
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py:119, in _execute_task(arg, cache, dsk)
        115     func, args = arg[0], arg[1:]
        116     # Note: Don't assign the subtask results to a variable. numpy detects
        117     # temporaries by their reference count and can execute certain
        118     # operations in-place.
    --> 119     return func(*(_execute_task(a, cache) for a in args))
        120 elif not ishashable(arg):
        121     return arg
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/optimization.py:990, in SubgraphCallable.__call__(self, *args)
        988 if not len(args) == len(self.inkeys):
        989     raise ValueError("Expected %d args, got %d" % (len(self.inkeys), len(args)))
    --> 990 return core.get(self.dsk, self.outkey, dict(zip(self.inkeys, args)))
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py:149, in get(dsk, out, cache)
        147 for key in toposort(dsk):
        148     task = dsk[key]
    --> 149     result = _execute_task(task, cache)
        150     cache[key] = result
        151 result = _execute_task(out, cache)
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py:119, in _execute_task(arg, cache, dsk)
        115     func, args = arg[0], arg[1:]
        116     # Note: Don't assign the subtask results to a variable. numpy detects
        117     # temporaries by their reference count and can execute certain
        118     # operations in-place.
    --> 119     return func(*(_execute_task(a, cache) for a in args))
        120 elif not ishashable(arg):
        121     return arg
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/stackstac/to_dask.py:185, in fetch_raster_window(reader_table, slices, dtype, fill_value)
        178 # Only read if the window we're fetching actually overlaps with the asset
        179 if windows.intersect(current_window, asset_window):
        180     # NOTE: when there are multiple assets, we _could_ parallelize these reads with our own threadpool.
        181     # However, that would probably increase memory usage, since the internal, thread-local GDAL datasets
        182     # would end up copied to even more threads.
        183 
        184     # TODO when the Reader won't be rescaling, support passing `output` to avoid the copy?
    --> 185     data = reader.read(current_window)
        187     if all_empty:
        188         # Turn `output` from a broadcast-trick array to a real array, so it's writeable
        189         if (
        190             np.isnan(data)
        191             if np.isnan(fill_value)
        192             else np.equal(data, fill_value)
        193         ).all():
        194             # Unless the data we just read is all empty anyway
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/stackstac/rio_reader.py:385, in AutoParallelRioReader.read(self, window, **kwargs)
        384 def read(self, window: Window, **kwargs) -> np.ndarray:
    --> 385     reader = self.dataset
        386     try:
        387         result = reader.read(
        388             window=window,
        389             masked=True,
       (...)
        392             **kwargs,
        393         )
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/stackstac/rio_reader.py:381, in AutoParallelRioReader.dataset(self)
        379 with self._dataset_lock:
        380     if self._dataset is None:
    --> 381         self._dataset = self._open()
        382     return self._dataset
    
    File /srv/conda/envs/notebook/lib/python3.9/site-packages/stackstac/rio_reader.py:340, in AutoParallelRioReader._open(self)
        338 if ds.count != 1:
        339     ds.close()
    --> 340     raise RuntimeError(
        341         f"Assets must have exactly 1 band, but file {self.url!r} has {ds.count}. "
        342         "We can't currently handle multi-band rasters (each band has to be "
        343         "a separate STAC asset), so you'll need to exclude this asset from your analysis."
        344     )
        346 # Only make a VRT if the dataset doesn't match the spatial spec we want
        347 if self.spec.vrt_params != {
        348     "crs": ds.crs.to_epsg(),
        349     "transform": ds.transform,
        350     "height": ds.height,
        351     "width": ds.width,
        352 }:
    
    RuntimeError: Assets must have exactly 1 band, but file 'https://sentinel-cogs.s3.us-west-2.amazonaws.com/sentinel-s2-l2a-cogs/13/S/DV/2020/4/S2B_13SDV_20200401_0_L2A/L2A_PVI.tif' has 3. We can't currently handle multi-band rasters (each band has to be a separate STAC asset), so you'll need to exclude this asset from your analysis.
    

    The relevant code is quite old:

    https://github.com/gjoseph92/stackstac/blob/d53f3dc94e49b95b950201f9e53539a92b1458b6/stackstac/rio_reader.py#L338-L344

    so this doesn't seem like a recent change. It is very common for STAC assets (e.g. COGs) to have multiple bands, so I really don't understand this limitation. All the other examples I tried fail in the same way.

    What am I missing here?


    I ran this on https://us-central1-b.gcp.pangeo.io/ with the following packages:

    • stackstac: 0.4.3
    • rasterio: 1.3.2
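
    A hedged workaround sketch, per the error message: list only single-band assets so the 3-band L2A_PVI.tif preview is skipped (the asset keys below are the earth-search Sentinel-2 band names):

    # Request only single-band assets; multi-band 'overview'/'visual' previews are excluded.
    stack = stackstac.stack(items, assets=["B02", "B03", "B04", "B08"])
    stack[0, 0].compute()
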
    opened by rabernat 6
  • Using DigitalEarthAfrica data


    Hi @gjoseph92

    (I solved some of my issues myself, that's why I'm editing again.)

    I would like to use Sentinel 1 RTC from Digital Earth Africa on AWS.

    I try the following:

    import stackstac
    import pystac_client
    import rasterio
    
    URL = "https://explorer.digitalearth.africa/stac/"
    catalog = pystac_client.Client.open(URL)
    
    bbox = (-3, 15.25, -2.9, 15.35)
    
    items = catalog.search(
        bbox = bbox,
        collections=["s1_rtc"],
        datetime="2021-11-01/2021-12-01"
    ).get_all_items()
    
    with rasterio.Env(aws_unsigned = True,AWS_S3_ENDPOINT= 's3.af-south-1.amazonaws.com'):
         stack = stackstac.stack(items)
         stack.compute()
    
    

    When I do this I get the error

    NotImplementedError: Cannot automatically compute the resolution, since asset 'vh' on item 0 '620fe0bb-2e29-5ac3-86c1-7bcc9698da29' has a non-rectilinear geotrans (its data is is not axis-aligned, or "north-up"): [-4.0, 0.0002, 0.0, 16.0, 0.0, -0.0002, 0.0, 0.0, 1.0]. We should be able to handle this but just don't want to deal with it right now.
    
    Please specify the `resolution=` argument.
    

    However, when opening such a Sentinel-1 tif manually:

    test = xr.open_rasterio("https://deafrica-sentinel-1.s3.af-south-1.amazonaws.com/s1_rtc/N14W009/2018/01/06/022245/s1_rtc_022245_N14W009_2018_01_06_VH.tif")
    

    I get test.transform == (0.0002, 0.0, -9.0, 0.0, -0.0002, 15.0), which to me seems like a legit geotransform.

    Do you have any idea?

    Thanks(:
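
    A hedged workaround sketch, following the error's suggestion to pass `resolution=` explicitly (0.0002 degrees matches the geotransform above; EPSG:4326 is an assumption based on the degree units):

    with rasterio.Env(aws_unsigned=True, AWS_S3_ENDPOINT="s3.af-south-1.amazonaws.com"):
        stack = stackstac.stack(items, epsg=4326, resolution=0.0002)
        stack.compute()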

    opened by vitusbenson 6
  • stackstac.show gives empty map


    I'm running into a problem with stackstac.show. I've got a simple notebook that grabs some data from Planetary Computer, mosaics it, and then shows it on a map using stackstac.show. This works fine when I run it on Microsoft Planetary Computer, but when I run it locally, I don't get any data (or any checkerboard pattern) on the map - I just get the background OpenStreetMap data.

    The key parts of the code are below, and the full ipynb file is available here.

    search = stac_client.search(
            collections=["cop-dem-glo-30"],
            bbox=bounds)
    
    items = list(search.get_items())
    
    data = stackstac.stack([item.to_dict() for item in items], bounds_latlon=bounds)
    
    mosaic = data.max(dim='time').squeeze()
    
    mosaic = mosaic.persist()
    
    stackstac.show(mosaic, checkerboard=True)
    

    Am I doing something wrong here when running locally, or have I discovered a bug? When running the show call I can see various Dask jobs being run when looking at the Dask dashboard - which suggests that something is happening - but the result never seems to be displayed.

    In case it is relevant, I'm running on Windows 10, with the following in my conda environment:

    # packages in environment at C:\Users\rwilson3\Documents\mambaforge\envs\anglo:
    #
    # Name                    Version                   Build  Channel
    abseil-cpp                20210324.2           h0e60522_0    conda-forge
    affine                    2.3.0                      py_0    conda-forge
    aiohttp                   3.8.1                    pypi_0    pypi
    aiosignal                 1.2.0                    pypi_0    pypi
    alabaster                 0.7.12                     py_0    conda-forge
    alembic                   1.7.3                    pypi_0    pypi
    altair                    4.1.0                    pypi_0    pypi
    anyio                     3.3.2                    pypi_0    pypi
    appdirs                   1.4.4              pyh9f0ad1d_0    conda-forge
    argon2-cffi               20.1.0           py39hb82d6ee_2    conda-forge
    arrow-cpp                 5.0.0           py39he0f88eb_8_cpu    conda-forge
    asciitree                 0.3.3                      py_2    conda-forge
    asgiref                   3.4.1                    pypi_0    pypi
    astor                     0.8.1                    pypi_0    pypi
    async-timeout             4.0.1                    pypi_0    pypi
    async_generator           1.10                       py_0    conda-forge
    asyncio                   3.4.3                    pypi_0    pypi
    asyncpg                   0.22.0                   pypi_0    pypi
    atomicwrites              1.4.0                    pypi_0    pypi
    attrs                     21.2.0             pyhd8ed1ab_0    conda-forge
    aws-c-cal                 0.5.11               he19cf47_0    conda-forge
    aws-c-common              0.6.2                h8ffe710_0    conda-forge
    aws-c-event-stream        0.2.7               h70e1b0c_13    conda-forge
    aws-c-io                  0.10.5               h2fe331c_0    conda-forge
    aws-checksums             0.1.11               h1e232aa_7    conda-forge
    aws-sdk-cpp               1.8.186              hb0612c5_3    conda-forge
    azure-core                1.20.1                   pypi_0    pypi
    azure-storage-blob        12.9.0                   pypi_0    pypi
    babel                     2.9.1              pyh44b312d_0    conda-forge
    backcall                  0.2.0              pyh9f0ad1d_0    conda-forge
    backports                 1.0                        py_2    conda-forge
    backports-entry-points-selectable 1.1.0                    pypi_0    pypi
    backports.functools_lru_cache 1.6.4              pyhd8ed1ab_0    conda-forge
    base58                    2.1.0                    pypi_0    pypi
    basemap                   1.2.2            py39h381b4b0_3    conda-forge
    beautifulsoup4            4.10.0                   pypi_0    pypi
    black                     21.9b0                   pypi_0    pypi
    blas                      2.111                       mkl    conda-forge
    blas-devel                3.9.0              11_win64_mkl    conda-forge
    bleach                    4.1.0              pyhd8ed1ab_0    conda-forge
    blinker                   1.4                      pypi_0    pypi
    blosc                     1.21.0               h0e60522_0    conda-forge
    bokeh                     2.4.0            py39hcbf5309_0    conda-forge
    boost-cpp                 1.74.0               h5b4e17d_4    conda-forge
    boto3                     1.18.53                  pypi_0    pypi
    botocore                  1.21.53                  pypi_0    pypi
    braceexpand               0.1.7                    pypi_0    pypi
    branca                    0.4.2              pyhd8ed1ab_0    conda-forge
    brotli                    1.0.9                    pypi_0    pypi
    brotli-asgi               1.1.0                    pypi_0    pypi
    brotli-bin                1.0.9                h8ffe710_5    conda-forge
    brotlipy                  0.7.0           py39hb82d6ee_1001    conda-forge
    bs4                       0.0.1                    pypi_0    pypi
    buildpg                   0.3                      pypi_0    pypi
    bzip2                     1.0.8                h8ffe710_4    conda-forge
    c-ares                    1.17.2               h8ffe710_0    conda-forge
    ca-certificates           2021.10.8            h5b45459_0    conda-forge
    cachetools                4.2.4              pyhd8ed1ab_0    conda-forge
    cachey                    0.2.1              pyh9f0ad1d_0    conda-forge
    cairo                     1.16.0            hb19e0ff_1008    conda-forge
    cartopy                   0.20.0           py39h381b4b0_0    conda-forge
    certifi                   2021.10.8        py39hcbf5309_1    conda-forge
    cffi                      1.14.6           py39h0878f49_1    conda-forge
    cfgv                      3.3.1                    pypi_0    pypi
    cfitsio                   3.470                h0af3d06_7    conda-forge
    cftime                    1.5.1            py39h5d4886f_0    conda-forge
    chardet                   4.0.0            py39hcbf5309_1    conda-forge
    charls                    2.2.0                h39d44d4_0    conda-forge
    charset-normalizer        2.0.6                    pypi_0    pypi
    click                     7.1.2              pyh9f0ad1d_0    conda-forge
    click-plugins             1.1.1                      py_0    conda-forge
    cligj                     0.6.0              pyh9f0ad1d_0    conda-forge
    cloudpickle               2.0.0              pyhd8ed1ab_0    conda-forge
    cogeo-mosaic              3.0.2                    pypi_0    pypi
    colorama                  0.4.4              pyh9f0ad1d_0    conda-forge
    colorcet                  2.0.6              pyhd8ed1ab_0    conda-forge
    colorlog                  6.6.0                    pypi_0    pypi
    coverage                  6.0.2                    pypi_0    pypi
    cramjam                   2.4.0                    pypi_0    pypi
    cryptography              3.4.7            py39hd8d06c1_0    conda-forge
    cudatoolkit               10.2.89              hb195166_9    conda-forge
    curl                      7.79.1               h789b8ee_1    conda-forge
    cycler                    0.10.0                     py_2    conda-forge
    cython                    0.29.24                  pypi_0    pypi
    cytoolz                   0.11.0           py39hb82d6ee_3    conda-forge
    dask                      2021.9.1           pyhd8ed1ab_0    conda-forge
    dask-core                 2021.9.1           pyhd8ed1ab_0    conda-forge
    dask-image                0.6.0                    pypi_0    pypi
    datacube                  1.8.6              pyhd8ed1ab_0    conda-forge
    datashader                0.13.0             pyh6c4a22f_0    conda-forge
    datashape                 0.5.4                      py_1    conda-forge
    debugpy                   1.4.1            py39h415ef7b_0    conda-forge
    decorator                 5.1.0              pyhd8ed1ab_0    conda-forge
    defusedxml                0.7.1              pyhd8ed1ab_0    conda-forge
    distlib                   0.3.3                    pypi_0    pypi
    distributed               2021.9.1         py39hcbf5309_0    conda-forge
    docstring-parser          0.7.3                    pypi_0    pypi
    docstring_parser          0.12               pyhd8ed1ab_0    conda-forge
    docutils                  0.17.1           py39hcbf5309_0    conda-forge
    easyprocess               0.3                      pypi_0    pypi
    entrypoint2               0.2.4                    pypi_0    pypi
    entrypoints               0.3             py39hde42818_1002    conda-forge
    expat                     2.4.1                h39d44d4_0    conda-forge
    falcon                    2.0.0                    pypi_0    pypi
    fastapi                   0.67.0                   pypi_0    pypi
    fastapi-utils             0.2.1                    pypi_0    pypi
    fasteners                 0.16               pyhd8ed1ab_0    conda-forge
    filelock                  3.3.0                    pypi_0    pypi
    fiona                     1.8.20           py39hea8b339_1    conda-forge
    folium                    0.12.1                   pypi_0    pypi
    font-ttf-dejavu-sans-mono 2.37                 hab24e00_0    conda-forge
    font-ttf-inconsolata      3.000                h77eed37_0    conda-forge
    font-ttf-source-code-pro  2.038                h77eed37_0    conda-forge
    font-ttf-ubuntu           0.83                 hab24e00_0    conda-forge
    fontconfig                2.13.1            h1989441_1005    conda-forge
    fonts-conda-ecosystem     1                             0    conda-forge
    fonts-conda-forge         1                             0    conda-forge
    freetype                  2.10.4               h546665d_1    conda-forge
    freetype-py               2.2.0              pyh9f0ad1d_0    conda-forge
    freexl                    1.0.6                ha8e266a_0    conda-forge
    frozenlist                1.2.0                    pypi_0    pypi
    fsspec                    2021.10.0          pyhd8ed1ab_0    conda-forge
    gdal                      3.3.2            py39h7c9a9b1_2    conda-forge
    geoalchemy2               0.7.0                    pypi_0    pypi
    geojson-pydantic          0.3.1                    pypi_0    pypi
    geopandas                 0.10.0             pyhd8ed1ab_0    conda-forge
    geopandas-base            0.10.0             pyha770c72_0    conda-forge
    geos                      3.9.1                h39d44d4_2    conda-forge
    geotiff                   1.7.0                ha8a8a2d_0    conda-forge
    gettext                   0.19.8.1          ha2e2712_1008    conda-forge
    gflags                    2.2.2             ha925a31_1004    conda-forge
    ghp-import                2.0.2                    pypi_0    pypi
    giflib                    5.2.1                h8d14728_2    conda-forge
    gitdb                     4.0.7                    pypi_0    pypi
    gitpython                 3.1.24                   pypi_0    pypi
    glog                      0.5.0                h4797de2_0    conda-forge
    greenlet                  1.1.2            py39h415ef7b_0    conda-forge
    grpc-cpp                  1.40.0               h2431d41_2    conda-forge
    h11                       0.12.0                   pypi_0    pypi
    hdf4                      4.2.15               h0e5069d_3    conda-forge
    hdf5                      1.12.1          nompi_h2a0e4a3_100    conda-forge
    heapdict                  1.0.1                      py_0    conda-forge
    holoviews                 1.14.6             pyhd8ed1ab_0    conda-forge
    hsluv                     5.0.2              pyh44b312d_0    conda-forge
    httpcore                  0.13.7                   pypi_0    pypi
    httpx                     0.20.0                   pypi_0    pypi
    hug                       2.6.1                    pypi_0    pypi
    hvplot                    0.7.3              pyh6c4a22f_0    conda-forge
    icu                       68.1                 h0e60522_0    conda-forge
    identify                  2.3.0                    pypi_0    pypi
    idna                      3.2                      pypi_0    pypi
    imagecodecs               2021.6.8         py39h166567b_0    conda-forge
    imageio                   2.9.0                      py_0    conda-forge
    imagesize                 1.2.0                      py_0    conda-forge
    importlib-metadata        4.8.1            py39hcbf5309_0    conda-forge
    importlib_metadata        4.8.1                hd8ed1ab_0    conda-forge
    iniconfig                 1.1.1                    pypi_0    pypi
    intel-openmp              2021.3.0          h57928b3_3372    conda-forge
    ipykernel                 6.4.1            py39h832f523_0    conda-forge
    ipyleaflet                0.13.6                   pypi_0    pypi
    ipyspin                   0.1.5                    pypi_0    pypi
    ipython                   7.28.0           py39h832f523_0    conda-forge
    ipython_genutils          0.2.0                      py_1    conda-forge
    ipyurl                    0.1.2                    pypi_0    pypi
    ipywidgets                7.6.5              pyhd8ed1ab_0    conda-forge
    iso-639                   0.4.5                    pypi_0    pypi
    iso3166                   2.0.2                    pypi_0    pypi
    isodate                   0.6.0                    pypi_0    pypi
    jbig                      2.1               h8d14728_2003    conda-forge
    jedi                      0.18.0           py39hcbf5309_2    conda-forge
    jinja2                    2.11.3                   pypi_0    pypi
    jmespath                  0.10.0             pyh9f0ad1d_0    conda-forge
    joblib                    1.0.1              pyhd8ed1ab_0    conda-forge
    jpeg                      9d                   h8ffe710_0    conda-forge
    json5                     0.9.6                    pypi_0    pypi
    jsonschema                4.0.1              pyhd8ed1ab_0    conda-forge
    jupyter                   1.0.0            py39hcbf5309_6    conda-forge
    jupyter-server            1.11.1                   pypi_0    pypi
    jupyter_client            6.1.12             pyhd8ed1ab_0    conda-forge
    jupyter_console           6.4.0              pyhd8ed1ab_1    conda-forge
    jupyter_core              4.8.1            py39hcbf5309_0    conda-forge
    jupyterlab                3.1.17                   pypi_0    pypi
    jupyterlab-server         2.8.2                    pypi_0    pypi
    jupyterlab_pygments       0.1.2              pyh9f0ad1d_0    conda-forge
    jupyterlab_widgets        1.0.2              pyhd8ed1ab_0    conda-forge
    jxrlib                    1.1                  h8ffe710_2    conda-forge
    kealib                    1.4.14               h8995ca9_3    conda-forge
    kiwisolver                1.3.2            py39h2e07f2f_0    conda-forge
    krb5                      1.19.2               hbae68bd_2    conda-forge
    lark-parser               0.12.0             pyhd8ed1ab_0    conda-forge
    lcms2                     2.12                 h2a16943_0    conda-forge
    lerc                      2.2.1                h0e60522_0    conda-forge
    libaec                    1.0.6                h39d44d4_0    conda-forge
    libblas                   3.9.0              11_win64_mkl    conda-forge
    libbrotlicommon           1.0.9                h8ffe710_5    conda-forge
    libbrotlidec              1.0.9                h8ffe710_5    conda-forge
    libbrotlienc              1.0.9                h8ffe710_5    conda-forge
    libcblas                  3.9.0              11_win64_mkl    conda-forge
    libclang                  11.1.0          default_h5c34c98_1    conda-forge
    libcurl                   7.79.1               h789b8ee_1    conda-forge
    libdeflate                1.7                  h8ffe710_5    conda-forge
    libffi                    3.4.2                h0e60522_4    conda-forge
    libgdal                   3.3.2                hfb14b67_2    conda-forge
    libglib                   2.68.4               h3be07f2_1    conda-forge
    libiconv                  1.16                 he774522_0    conda-forge
    libkml                    1.3.0             h9859afa_1014    conda-forge
    liblapack                 3.9.0              11_win64_mkl    conda-forge
    liblapacke                3.9.0              11_win64_mkl    conda-forge
    libnetcdf                 4.8.1           nompi_h1cc8e9d_101    conda-forge
    libpng                    1.6.37               h1d00b33_2    conda-forge
    libpq                     13.3                 hfcc5ef8_0    conda-forge
    libprotobuf               3.18.1               h7755175_0    conda-forge
    librttopo                 1.1.0                hb340de5_6    conda-forge
    libsodium                 1.0.18               h8d14728_1    conda-forge
    libspatialindex           1.9.3                h39d44d4_4    conda-forge
    libspatialite             5.0.1                h762a7f4_6    conda-forge
    libssh2                   1.10.0               h680486a_2    conda-forge
    libthrift                 0.15.0               h636ae23_1    conda-forge
    libtiff                   4.3.0                h0c97f57_1    conda-forge
    libutf8proc               2.6.1                hcb41399_0    conda-forge
    libuv                     1.42.0               h8ffe710_0    conda-forge
    libwebp-base              1.2.1                h8ffe710_0    conda-forge
    libxml2                   2.9.12               hf5bbc77_0    conda-forge
    libzip                    1.8.0                hfed4ece_1    conda-forge
    libzlib                   1.2.11            h8ffe710_1013    conda-forge
    libzopfli                 1.0.3                h0e60522_0    conda-forge
    llvmlite                  0.37.0           py39ha0cd8c8_0    conda-forge
    locket                    0.2.0                      py_2    conda-forge
    loguru                    0.5.3                    pypi_0    pypi
    lxml                      4.6.4                    pypi_0    pypi
    lz4-c                     1.9.3                h8ffe710_1    conda-forge
    m2w64-gcc-libgfortran     5.3.0                         6    conda-forge
    m2w64-gcc-libs            5.3.0                         7    conda-forge
    m2w64-gcc-libs-core       5.3.0                         7    conda-forge
    m2w64-gmp                 6.1.0                         2    conda-forge
    m2w64-libwinpthread-git   5.0.0.4634.697f757               2    conda-forge
    magicgui                  0.3.2              pyhd8ed1ab_0    conda-forge
    mako                      1.1.5                    pypi_0    pypi
    mapclassify               2.4.3              pyhd8ed1ab_0    conda-forge
    markdown                  3.3.4              pyhd8ed1ab_0    conda-forge
    markupsafe                2.0.1            py39hb82d6ee_0    conda-forge
    matplotlib-base           3.4.3            py39h581301d_1    conda-forge
    matplotlib-inline         0.1.3              pyhd8ed1ab_0    conda-forge
    mercantile                1.2.1                    pypi_0    pypi
    mergedeep                 1.3.4                    pypi_0    pypi
    mistune                   0.8.4           py39hb82d6ee_1004    conda-forge
    mkdocs                    1.2.2                    pypi_0    pypi
    mkdocs-material           7.3.2                    pypi_0    pypi
    mkdocs-material-extensions 1.0.3                    pypi_0    pypi
    mkl                       2021.3.0           hb70f87d_564    conda-forge
    mkl-devel                 2021.3.0           h57928b3_565    conda-forge
    mkl-include               2021.3.0           hb70f87d_564    conda-forge
    monotonic                 1.5                        py_0    conda-forge
    morecantile               2.1.4                    pypi_0    pypi
    msgpack-python            1.0.2            py39h2e07f2f_1    conda-forge
    msrest                    0.6.21                   pypi_0    pypi
    mss                       6.1.0                    pypi_0    pypi
    msys2-conda-epoch         20160418                      1    conda-forge
    multidict                 5.2.0                    pypi_0    pypi
    multipledispatch          0.6.0                      py_0    conda-forge
    munch                     2.5.0                      py_0    conda-forge
    mutagen                   1.45.1                   pypi_0    pypi
    mypy-extensions           0.4.3                    pypi_0    pypi
    napari                    0.4.12             pyhd8ed1ab_0    conda-forge
    napari-console            0.0.4              pyhd8ed1ab_0    conda-forge
    napari-plugin-engine      0.2.0            py39hcbf5309_0    conda-forge
    napari-svg                0.1.5              pyhd8ed1ab_0    conda-forge
    nbclassic                 0.3.2                    pypi_0    pypi
    nbclient                  0.5.4              pyhd8ed1ab_0    conda-forge
    nbconvert                 6.2.0            py39hcbf5309_0    conda-forge
    nbformat                  5.1.3              pyhd8ed1ab_0    conda-forge
    nest-asyncio              1.5.1              pyhd8ed1ab_0    conda-forge
    netcdf4                   1.5.7           nompi_py39hf113b1f_103    conda-forge
    networkx                  2.5                        py_0    conda-forge
    nodeenv                   1.6.0                    pypi_0    pypi
    notebook                  6.4.4              pyha770c72_0    conda-forge
    numba                     0.54.0           py39hb8cd55e_0    conda-forge
    numcodecs                 0.9.1            py39h415ef7b_1    conda-forge
    numexpr                   2.7.3                    pypi_0    pypi
    numpy                     1.20.0           py39h6635163_0    conda-forge
    numpydoc                  1.1.0                      py_1    conda-forge
    oauthlib                  3.1.1                    pypi_0    pypi
    odc-algo                  0.2.0a4                  pypi_0    pypi
    odc-io                    0.2.0a1                  pypi_0    pypi
    odc-stac                  0.2.0a8                  pypi_0    pypi
    olefile                   0.46               pyh9f0ad1d_1    conda-forge
    opencv-python             4.5.4.58                 pypi_0    pypi
    openjpeg                  2.4.0                hb211442_1    conda-forge
    openssl                   1.1.1l               h8ffe710_0    conda-forge
    orjson                    3.6.4                    pypi_0    pypi
    packaging                 21.0               pyhd8ed1ab_0    conda-forge
    pafy                      0.5.5                    pypi_0    pypi
    pandas                    1.2.5                    pypi_0    pypi
    pandoc                    2.14.2               h8ffe710_0    conda-forge
    pandocfilters             1.5.0              pyhd8ed1ab_0    conda-forge
    panel                     0.12.4             pyhd8ed1ab_0    conda-forge
    param                     1.11.1             pyh6c4a22f_0    conda-forge
    parquet-cpp               1.5.1                         1    conda-forge
    parso                     0.8.2              pyhd8ed1ab_0    conda-forge
    partd                     1.2.0              pyhd8ed1ab_0    conda-forge
    pathspec                  0.9.0                    pypi_0    pypi
    pcre                      8.45                 h0e60522_0    conda-forge
    pdocs                     1.1.1                    pypi_0    pypi
    pickleshare               0.7.5           py39hde42818_1002    conda-forge
    pillow                    8.3.2            py39h916092e_0    conda-forge
    pims                      0.5                      pypi_0    pypi
    pint                      0.18               pyhd8ed1ab_0    conda-forge
    pip                       21.2.4             pyhd8ed1ab_0    conda-forge
    pixman                    0.40.0               h8ffe710_0    conda-forge
    platformdirs              2.4.0                    pypi_0    pypi
    plotext                   2.3.1                    pypi_0    pypi
    pluggy                    1.0.0                    pypi_0    pypi
    pooch                     1.5.2              pyhd8ed1ab_0    conda-forge
    poppler                   21.09.0              h24fffdf_3    conda-forge
    poppler-data              0.4.11               hd8ed1ab_0    conda-forge
    postgresql                13.3                 h1c22c4f_0    conda-forge
    pre-commit                2.15.0                   pypi_0    pypi
    proj                      8.0.1                h1cfcee9_0    conda-forge
    prometheus_client         0.11.0             pyhd8ed1ab_0    conda-forge
    prompt-toolkit            3.0.20             pyha770c72_0    conda-forge
    prompt_toolkit            3.0.20               hd8ed1ab_0    conda-forge
    protobuf                  3.18.1                   pypi_0    pypi
    psutil                    5.8.0            py39hb82d6ee_1    conda-forge
    psycopg2                  2.9.1            py39h0878f49_0    conda-forge
    psycopg2-binary           2.9.1                    pypi_0    pypi
    psygnal                   0.1.4            py39h2e07f2f_0    conda-forge
    py                        1.10.0                   pypi_0    pypi
    pyarrow                   5.0.0           py39hf9247be_8_cpu    conda-forge
    pycparser                 2.20               pyh9f0ad1d_2    conda-forge
    pycryptodome              3.11.0                   pypi_0    pypi
    pycryptodomex             3.11.0                   pypi_0    pypi
    pyct                      0.4.6                      py_0    conda-forge
    pyct-core                 0.4.6                      py_0    conda-forge
    pydantic                  1.8.2            py39hb82d6ee_0    conda-forge
    pydeck                    0.7.0                    pypi_0    pypi
    pyee                      8.2.2                    pypi_0    pypi
    pygeos                    0.10.2                   pypi_0    pypi
    pygments                  2.10.0             pyhd8ed1ab_0    conda-forge
    pymdown-extensions        9.0                      pypi_0    pypi
    pyopengl                  3.1.5                      py_0    conda-forge
    pyopenssl                 21.0.0             pyhd8ed1ab_0    conda-forge
    pyparsing                 2.4.7              pyh9f0ad1d_0    conda-forge
    pypgstac                  0.3.4                    pypi_0    pypi
    pyppeteer                 0.2.6                    pypi_0    pypi
    pyproj                    3.2.1            py39ha996c60_2    conda-forge
    pyqt                      5.12.3           py39hcbf5309_7    conda-forge
    pyqt-impl                 5.12.3           py39h415ef7b_7    conda-forge
    pyqt5-sip                 4.19.18          py39h415ef7b_7    conda-forge
    pyqtchart                 5.12             py39h415ef7b_7    conda-forge
    pyqtwebengine             5.12.1           py39h415ef7b_7    conda-forge
    pyrsistent                0.17.3           py39hb82d6ee_2    conda-forge
    pyscreenshot              3.0                      pypi_0    pypi
    pyshp                     2.1.3              pyh44b312d_0    conda-forge
    pysocks                   1.7.1            py39hcbf5309_3    conda-forge
    pystac                    1.2.0                    pypi_0    pypi
    pystac-client             0.3.0                    pypi_0    pypi
    pytest                    6.2.5                    pypi_0    pypi
    pytest-asyncio            0.16.0                   pypi_0    pypi
    pytest-cov                3.0.0                    pypi_0    pypi
    python                    3.9.7           h7840368_3_cpython    conda-forge
    python-dateutil           2.8.2              pyhd8ed1ab_0    conda-forge
    python-dotenv             0.19.0                   pypi_0    pypi
    python-snappy             0.6.0            py39h1d87f24_0    conda-forge
    python_abi                3.9                      2_cp39    conda-forge
    pytorch                   1.10.0          py3.9_cuda10.2_cudnn7_0    pytorch
    pytorch-mutex             1.0                        cuda    pytorch
    pytz                      2021.1             pyhd8ed1ab_0    conda-forge
    pytz-deprecation-shim     0.1.0.post0              pypi_0    pypi
    pyviz_comms               2.1.0              pyhd8ed1ab_0    conda-forge
    pywavelets                1.1.1            py39h5d4886f_3    conda-forge
    pywin32                   301              py39hb82d6ee_0    conda-forge
    pywinpty                  1.1.4            py39h99910a6_0    conda-forge
    pyyaml                    5.4.1            py39hb82d6ee_1    conda-forge
    pyyaml-env-tag            0.1                      pypi_0    pypi
    pyzmq                     22.3.0           py39he46f08e_0    conda-forge
    qt                        5.12.9               h5909a2a_4    conda-forge
    qtconsole                 5.1.1              pyhd8ed1ab_0    conda-forge
    qtpy                      1.11.2             pyhd8ed1ab_0    conda-forge
    rasterio                  1.2.8            py39h85efae1_0    conda-forge
    re2                       2021.09.01           h0e60522_0    conda-forge
    regex                     2021.9.30                pypi_0    pypi
    requests                  2.26.0             pyhd8ed1ab_0    conda-forge
    requests-oauthlib         1.3.0                    pypi_0    pypi
    requests-unixsocket       0.2.0                    pypi_0    pypi
    retrying                  1.3.3                      py_2    conda-forge
    rfc3986                   1.5.0                    pypi_0    pypi
    rio-cogeo                 2.3.1                    pypi_0    pypi
    rio-color                 1.0.4                    pypi_0    pypi
    rio-mucho                 1.0.0                    pypi_0    pypi
    rio-stac                  0.3.1                    pypi_0    pypi
    rio-tiler                 2.1.3                    pypi_0    pypi
    rio-tiler-pds             0.5.2                    pypi_0    pypi
    rio-toa                   0.3.0                    pypi_0    pypi
    rio-viz                   0.7.2                    pypi_0    pypi
    rioxarray                 0.7.1              pyhd8ed1ab_0    conda-forge
    rtree                     0.9.7            py39h09fdee3_2    conda-forge
    s3transfer                0.5.0              pyhd8ed1ab_0    conda-forge
    scikit-image              0.18.3                   pypi_0    pypi
    scikit-learn              1.0              py39h74df8f2_1    conda-forge
    scipy                     1.7.1            py39hc0c34ad_0    conda-forge
    send2trash                1.8.0              pyhd8ed1ab_0    conda-forge
    setuptools                58.0.4           py39hcbf5309_2    conda-forge
    shapely                   1.7.1            py39haadaec5_5    conda-forge
    simplejpeg                1.6.2                    pypi_0    pypi
    simplejson                3.17.5                   pypi_0    pypi
    six                       1.16.0             pyh6c4a22f_0    conda-forge
    slicerator                1.0.0                    pypi_0    pypi
    smart-open                4.2.0                    pypi_0    pypi
    smmap                     4.0.0                    pypi_0    pypi
    snappy                    1.1.8                ha925a31_3    conda-forge
    sniffio                   1.2.0                    pypi_0    pypi
    snowballstemmer           2.1.0              pyhd8ed1ab_0    conda-forge
    snuggs                    1.4.7                      py_0    conda-forge
    sortedcontainers          2.4.0              pyhd8ed1ab_0    conda-forge
    soupsieve                 2.2.1                    pypi_0    pypi
    spatialpandas             0.4.3              pyhd8ed1ab_0    conda-forge
    sphinx                    4.2.0              pyh6c4a22f_0    conda-forge
    sphinxcontrib-applehelp   1.0.2                      py_0    conda-forge
    sphinxcontrib-devhelp     1.0.2                      py_0    conda-forge
    sphinxcontrib-htmlhelp    2.0.0              pyhd8ed1ab_0    conda-forge
    sphinxcontrib-jsmath      1.0.1                      py_0    conda-forge
    sphinxcontrib-qthelp      1.0.3                      py_0    conda-forge
    sphinxcontrib-serializinghtml 1.1.5              pyhd8ed1ab_0    conda-forge
    sqlakeyset                1.0.1629029818           pypi_0    pypi
    sqlalchemy                1.3.23                   pypi_0    pypi
    sqlite                    3.36.0               h8ffe710_2    conda-forge
    stac-fastapi-api          2.1.1                     dev_0    <develop>
    stac-fastapi-extensions   2.1.1                     dev_0    <develop>
    stac-fastapi-pgstac       2.1.1                     dev_0    <develop>
    stac-fastapi-sqlalchemy   2.1.1                     dev_0    <develop>
    stac-fastapi-types        2.1.1                     dev_0    <develop>
    stac-nb                   0.4.0                    pypi_0    pypi
    stac-pydantic             2.0.1                    pypi_0    pypi
    stackstac                 0.2.1              pyhd8ed1ab_0    conda-forge
    stacterm                  0.1.0                    pypi_0    pypi
    starlette                 0.14.2                   pypi_0    pypi
    starlette-cramjam         0.1.0                    pypi_0    pypi
    streamlink                2.4.0                    pypi_0    pypi
    streamlit                 1.0.0                    pypi_0    pypi
    streamlit-folium          0.4.0                    pypi_0    pypi
    supermercado              0.2.0                    pypi_0    pypi
    superqt                   0.2.4              pyhd8ed1ab_0    conda-forge
    tbb                       2021.3.0             h2d74725_0    conda-forge
    tblib                     1.7.0              pyhd8ed1ab_0    conda-forge
    terminado                 0.12.1           py39hcbf5309_0    conda-forge
    termtables                0.2.4                    pypi_0    pypi
    testpath                  0.5.0              pyhd8ed1ab_0    conda-forge
    threadpoolctl             3.0.0              pyh8a188c0_0    conda-forge
    tifffile                  2021.8.30                pypi_0    pypi
    tiledb                    2.3.4                h78dabda_0    conda-forge
    titiler-application       0.3.11                    dev_0    <develop>
    titiler-core              0.3.11                    dev_0    <develop>
    titiler-mosaic            0.3.11                    dev_0    <develop>
    titiler-pgstac            0.1.0a1                   dev_0    <develop>
    tk                        8.6.11               h8ffe710_1    conda-forge
    toml                      0.10.2                   pypi_0    pypi
    tomli                     1.2.1                    pypi_0    pypi
    toolz                     0.11.1                     py_0    conda-forge
    torchvision               0.11.1               py39_cu102    pytorch
    tornado                   6.1              py39hb82d6ee_1    conda-forge
    tqdm                      4.62.3             pyhd8ed1ab_0    conda-forge
    traitlets                 5.1.0              pyhd8ed1ab_0    conda-forge
    traittypes                0.2.1                    pypi_0    pypi
    typer                     0.3.2                    pypi_0    pypi
    typing-extensions         3.10.0.2             hd8ed1ab_0    conda-forge
    typing_extensions         3.10.0.2           pyha770c72_0    conda-forge
    tzdata                    2021.2.post0             pypi_0    pypi
    tzlocal                   4.0                      pypi_0    pypi
    ucrt                      10.0.20348.0         h57928b3_0    conda-forge
    urllib3                   1.26.7             pyhd8ed1ab_0    conda-forge
    uvicorn                   0.15.0                   pypi_0    pypi
    validators                0.18.2                   pypi_0    pypi
    vc                        14.2                 hb210afc_5    conda-forge
    vidgear                   0.2.3                    pypi_0    pypi
    virtualenv                20.8.1                   pypi_0    pypi
    vispy                     0.9.2            py39h5d4886f_0    conda-forge
    vs2015_runtime            14.29.30037          h902a5da_5    conda-forge
    watchdog                  2.1.6                    pypi_0    pypi
    wcwidth                   0.2.5              pyh9f0ad1d_2    conda-forge
    webencodings              0.5.1                      py_1    conda-forge
    websocket-client          1.2.1                    pypi_0    pypi
    websockets                9.1                      pypi_0    pypi
    wheel                     0.37.0             pyhd8ed1ab_1    conda-forge
    widgetsnbextension        3.5.1            py39hcbf5309_4    conda-forge
    win32-setctime            1.0.3                    pypi_0    pypi
    win_inet_pton             1.1.0            py39hcbf5309_2    conda-forge
    winpty                    0.4.3                         4    conda-forge
    wrapt                     1.13.2           py39hb82d6ee_0    conda-forge
    xarray                    0.19.0             pyhd8ed1ab_1    conda-forge
    xarray-leaflet            0.1.15                   pypi_0    pypi
    xarray-spatial            0.2.9              pyhd8ed1ab_0    conda-forge
    xerces-c                  3.2.3                h0e60522_2    conda-forge
    xyzservices               2021.9.1           pyhd8ed1ab_0    conda-forge
    xz                        5.2.5                h62dcd97_1    conda-forge
    yaml                      0.2.5                he774522_0    conda-forge
    yarl                      1.7.2                    pypi_0    pypi
    yt-dlp                    2021.11.10.1             pypi_0    pypi
    zarr                      2.10.1             pyhd8ed1ab_0    conda-forge
    zeromq                    4.3.4                h0e60522_1    conda-forge
    zfp                       0.5.5                h0e60522_7    conda-forge
    zict                      2.0.0                      py_0    conda-forge
    zipp                      3.6.0              pyhd8ed1ab_0    conda-forge
    zlib                      1.2.11            h8ffe710_1013    conda-forge
    zstd                      1.5.0                h6255e5f_0    conda-forge
    
    opened by robintw 6
  • Issues with numpy 1.24.0

    Issues with numpy 1.24.0

    Hi @gjoseph92, I am getting an error when trying to stack data with the latest numpy version (1.24.0) installed. Everything works fine with 1.23.5.

    ---------------------------------------------------------------------------
    ValueError                                Traceback (most recent call last)
    Cell In[48], line 1
    ----> 1 stack = stackstac.stack(itms.get_all_items(), resolution=10)
    
    File ~/anaconda3/envs/ag-ml/lib/python3.8/site-packages/stackstac/stack.py:311, in stack(items, assets, epsg, resolution, bounds, bounds_latlon, snap_bounds, resampling, chunksize, dtype, fill_value, rescale, sortby_date, xy_coords, properties, band_coords, gdal_env, errors_as_nodata, reader)
        287 asset_table, spec, asset_ids, plain_items = prepare_items(
        288     plain_items,
        289     assets=assets,
       (...)
        294     snap_bounds=snap_bounds,
        295 )
        296 arr = items_to_dask(
        297     asset_table,
        298     spec,
       (...)
        306     errors_as_nodata=errors_as_nodata,
        307 )
        309 return xr.DataArray(
        310     arr,
    --> 311     *to_coords(
        312         plain_items,
        313         asset_ids,
        314         spec,
        315         xy_coords=xy_coords,
        316         properties=properties,
        317         band_coords=band_coords,
        318     ),
        319     attrs=to_attrs(spec),
        320     name="stackstac-" + dask.base.tokenize(arr),
        321 )
    
    File ~/anaconda3/envs/ag-ml/lib/python3.8/site-packages/stackstac/prepare.py:500, in to_coords(items, asset_ids, spec, xy_coords, properties, band_coords)
        496     except KeyError:
        497         pass
        499 coords.update(
    --> 500     accumulate_metadata.metadata_to_coords(
        501         flattened_metadata_by_asset,
        502         "band",
        503         skip_fields={"href"},
        504         # skip_fields={"href", "title", "description", "type", "roles"},
        505     )
        506 )
        507 if any(eo_by_asset):
        508     coords.update(
        509         accumulate_metadata.metadata_to_coords(
        510             eo_by_asset,
       (...)
        513         )
        514     )
    
    File ~/anaconda3/envs/ag-ml/lib/python3.8/site-packages/stackstac/accumulate_metadata.py:29, in metadata_to_coords(items, dim_name, fields, skip_fields)
         23 def metadata_to_coords(
         24     items: Iterable[Mapping[str, object]],
         25     dim_name: str,
         26     fields: Union[str, Sequence[str], Literal[True]] = True,
         27     skip_fields: Container[str] = (),
         28 ) -> Dict[str, xr.Variable]:
    ---> 29     return dict_to_coords(
         30         accumulate_metadata(
         31             items,
         32             fields=[fields] if isinstance(fields, str) else fields,
         33             skip_fields=skip_fields,
         34         ),
         35         dim_name,
         36     )
    
    File ~/anaconda3/envs/ag-ml/lib/python3.8/site-packages/stackstac/accumulate_metadata.py:168, in dict_to_coords(metadata, dim_name)
        163     except TypeError:
        164         # if it's not set-able, just give up
        165         break
        167 props_arr = np.squeeze(
    --> 168     np.array(
        169         props,
        170         # Avoid DeprecationWarning creating ragged arrays when elements are lists/tuples of different lengths
        171         dtype="object"
        172         if (
        173             isinstance(props, _ourlist)
        174             and len(set(len(x) for x in props if isinstance(x, (list, tuple))))
        175             > 1
        176         )
        177         else None,
        178     )
        179 )
        181 if (
        182     props_arr.ndim > 1
        183     or props_arr.ndim == 1
       (...)
        187     # our "bands", "y", and "x" dimensions, and xarray won't let us use unrelated
        188     # dimensions. so just skip it for now.
        189     continue
    
    ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (17,) + inhomogeneous part.
    
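    A minimal sketch of the NumPy change involved (my own illustration based on the traceback, not stackstac's code): starting with numpy 1.24, building an array from a ragged nested sequence without an explicit dtype="object" raises this ValueError, where 1.23 and earlier only emitted a deprecation warning.

    import numpy as np

    # Hypothetical ragged metadata values (e.g. per-band lists of differing lengths).
    ragged = [["a", "b"], ["c"], "d"]

    try:
        # NumPy < 1.24: object array plus a VisibleDeprecationWarning.
        # NumPy >= 1.24: ValueError about an "inhomogeneous shape", as in the traceback above.
        np.array(ragged)
    except ValueError as err:
        print("NumPy >= 1.24 raises:", err)

    # Requesting dtype="object" explicitly works on both old and new NumPy:
    arr = np.array(ragged, dtype="object")
    print(arr.shape, arr.dtype)  # (3,) object
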
    opened by julianblue 5
  • Enable opening datasets with gcps even if no valid crs is present

    Enable opening datasets with gcps even if no valid crs is present

    Some datasets like Sentinel-1 GRD files don't have a coordinate reference system (crs), but they may have ground control points (gcps) and can still be opened by rasterio.

    Test this branch using pip install git+https://github.com/weiji14/[email protected]_vrt_with_gcps

    Fixes #181.

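    A rough sketch of the idea (my own illustration, not the code in this branch; href is a hypothetical GCP-only asset URL): when a dataset carries ground control points but no CRS, rasterio's WarpedVRT can still reproject it by taking the source CRS from the GCPs.

    import rasterio
    from rasterio.vrt import WarpedVRT

    # Hypothetical URL to a raster that has GCPs but no CRS (e.g. a Sentinel-1 GRD asset).
    href = "https://example.com/gcp-only.tif"

    with rasterio.open(href) as ds:
        gcps, gcp_crs = ds.gcps  # the GCPs carry their own CRS even when ds.crs is None
        print("dataset crs:", ds.crs, "| gcp crs:", gcp_crs, "| gcp count:", len(gcps))

        # Taking src_crs from the GCPs lets the warper georeference the data anyway.
        with WarpedVRT(ds, src_crs=gcp_crs, crs="EPSG:32647") as vrt:
            print(vrt.crs, vrt.shape)
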
    opened by weiji14 6
  • AttributeError: 'NoneType' object has no attribute 'to_epsg'

    AttributeError: 'NoneType' object has no attribute 'to_epsg'

    Hi there, just encountering some issues while trying to stack Sentinel-1 GRD data from Planetary Computer at https://github.com/weiji14/zen3geo/pull/62, and I'm wondering if it's an issue on the STAC metadata side (c.f. #152), or if it's something to fix in stackstac.

    Here's an MCWE, using pystac=1.4.0, planetary-computer=0.4.7, stackstac=0.4.3:

    import stackstac
    import pystac
    import planetary_computer
    import xarray as xr
    
    # %%
    item_urls: list = [
        "https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-1-grd/items/S1A_IW_GRDH_1SDV_20220320T230514_20220320T230548_042411_050E99",
        "https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-1-grd/items/S1A_IW_GRDH_1SDV_20220308T230513_20220308T230548_042236_0508AF",
    ]
    
    # Load the individual item metadata and sign the assets
    items: list[pystac.Item] = [pystac.Item.from_file(item_url) for item_url in item_urls]
    signed_items: list[pystac.Item] = [planetary_computer.sign(item) for item in items]
    
    # Stack Sentinel-1 GRD files
    dataarray: xr.DataArray = stackstac.stack(
        items=signed_items,
        epsg=32647,
        resolution=30,
        bounds_latlon=[99.933681, -0.009951, 100.065765, 0.147054],  # W, S, E, N
    )
    assert dataarray.crs == "epsg:32647"
    print(dataarray)
    

    The output xarray repr looks OK, like so; dimensions are (time: 2, band: 2, y: 579, x: 491).

    <xarray.DataArray 'stackstac-950602eb423dd1d439106f6794699f05' (time: 2,
                                                                    band: 2,
                                                                    y: 579, x: 491)>
    dask.array<fetch_raster_window, shape=(2, 2, 579, 491), dtype=float64, chunksize=(1, 1, 579, 491), chunktype=numpy.ndarray>
    Coordinates: (12/37)
      * time                                   (time) datetime64[ns] 2022-03-08T2...
        id                                     (time) <U62 'S1A_IW_GRDH_1SDV_2022...
      * band                                   (band) <U2 'vh' 'vv'
      * x                                      (x) float64 6.039e+05 ... 6.186e+05
      * y                                      (y) float64 1.626e+04 ... -1.08e+03
        end_datetime                           (time) <U32 '2022-03-08 23:05:48.2...
        ...                                     ...
        s1:resolution                          <U4 'high'
        s1:product_timeliness                  <U8 'Fast-24h'
        sat:relative_orbit                     int64 164
        description                            (band) <U145 'Amplitude of signal ...
        title                                  (band) <U41 'VH: vertical transmit...
        epsg                                   int64 32647
    Attributes:
        spec:        RasterSpec(epsg=32647, bounds=(603870, -1110, 618600, 16260)...
        crs:         epsg:32647
        transform:   | 30.00, 0.00, 603870.00|\n| 0.00,-30.00, 16260.00|\n| 0.00,...
        resolution:  30
    

    but when I try to run dataarray.compute(), an AttributeError: 'NoneType' object has no attribute 'to_epsg' message pops up. Here's the full traceback:

    ---------------------------------------------------------------------------
    AttributeError                            Traceback (most recent call last)
    Input In [11], in <cell line: 1>()
    ----> 1 dataarray.compute()
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/xarray/core/dataarray.py:993, in DataArray.compute(self, **kwargs)
        974 """Manually trigger loading of this array's data from disk or a
        975 remote source into memory and return a new array. The original is
        976 left unaltered.
       (...)
        990 dask.compute
        991 """
        992 new = self.copy(deep=False)
    --> 993 return new.load(**kwargs)
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/xarray/core/dataarray.py:967, in DataArray.load(self, **kwargs)
        949 def load(self: T_DataArray, **kwargs) -> T_DataArray:
        950     """Manually trigger loading of this array's data from disk or a
        951     remote source into memory and return this array.
        952 
       (...)
        965     dask.compute
        966     """
    --> 967     ds = self._to_temp_dataset().load(**kwargs)
        968     new = self._from_temp_dataset(ds)
        969     self._variable = new._variable
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/xarray/core/dataset.py:733, in Dataset.load(self, **kwargs)
        730 import dask.array as da
        732 # evaluate all the dask arrays simultaneously
    --> 733 evaluated_data = da.compute(*lazy_data.values(), **kwargs)
        735 for k, data in zip(lazy_data, evaluated_data):
        736     self.variables[k].data = data
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/dask/base.py:602, in compute(traverse, optimize_graph, scheduler, get, *args, **kwargs)
        599     keys.append(x.__dask_keys__())
        600     postcomputes.append(x.__dask_postcompute__())
    --> 602 results = schedule(dsk, keys, **kwargs)
        603 return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/dask/threaded.py:89, in get(dsk, keys, cache, num_workers, pool, **kwargs)
         86     elif isinstance(pool, multiprocessing.pool.Pool):
         87         pool = MultiprocessingPoolExecutor(pool)
    ---> 89 results = get_async(
         90     pool.submit,
         91     pool._max_workers,
         92     dsk,
         93     keys,
         94     cache=cache,
         95     get_id=_thread_get_id,
         96     pack_exception=pack_exception,
         97     **kwargs,
         98 )
        100 # Cleanup pools associated to dead threads
        101 with pools_lock:
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/dask/local.py:511, in get_async(submit, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, chunksize, **kwargs)
        509         _execute_task(task, data)  # Re-execute locally
        510     else:
    --> 511         raise_exception(exc, tb)
        512 res, worker_id = loads(res_info)
        513 state["cache"][key] = res
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/dask/local.py:319, in reraise(exc, tb)
        317 if exc.__traceback__ is not tb:
        318     raise exc.with_traceback(tb)
    --> 319 raise exc
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/dask/local.py:224, in execute_task(key, task_info, dumps, loads, get_id, pack_exception)
        222 try:
        223     task, data = loads(task_info)
    --> 224     result = _execute_task(task, data)
        225     id = get_id()
        226     result = dumps((result, id))
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/dask/core.py:119, in _execute_task(arg, cache, dsk)
        115     func, args = arg[0], arg[1:]
        116     # Note: Don't assign the subtask results to a variable. numpy detects
        117     # temporaries by their reference count and can execute certain
        118     # operations in-place.
    --> 119     return func(*(_execute_task(a, cache) for a in args))
        120 elif not ishashable(arg):
        121     return arg
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/dask/optimization.py:990, in SubgraphCallable.__call__(self, *args)
        988 if not len(args) == len(self.inkeys):
        989     raise ValueError("Expected %d args, got %d" % (len(self.inkeys), len(args)))
    --> 990 return core.get(self.dsk, self.outkey, dict(zip(self.inkeys, args)))
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/dask/core.py:149, in get(dsk, out, cache)
        147 for key in toposort(dsk):
        148     task = dsk[key]
    --> 149     result = _execute_task(task, cache)
        150     cache[key] = result
        151 result = _execute_task(out, cache)
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/dask/core.py:119, in _execute_task(arg, cache, dsk)
        115     func, args = arg[0], arg[1:]
        116     # Note: Don't assign the subtask results to a variable. numpy detects
        117     # temporaries by their reference count and can execute certain
        118     # operations in-place.
    --> 119     return func(*(_execute_task(a, cache) for a in args))
        120 elif not ishashable(arg):
        121     return arg
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/stackstac/to_dask.py:185, in fetch_raster_window(reader_table, slices, dtype, fill_value)
        178 # Only read if the window we're fetching actually overlaps with the asset
        179 if windows.intersect(current_window, asset_window):
        180     # NOTE: when there are multiple assets, we _could_ parallelize these reads with our own threadpool.
        181     # However, that would probably increase memory usage, since the internal, thread-local GDAL datasets
        182     # would end up copied to even more threads.
        183 
        184     # TODO when the Reader won't be rescaling, support passing `output` to avoid the copy?
    --> 185     data = reader.read(current_window)
        187     if all_empty:
        188         # Turn `output` from a broadcast-trick array to a real array, so it's writeable
        189         if (
        190             np.isnan(data)
        191             if np.isnan(fill_value)
        192             else np.equal(data, fill_value)
        193         ).all():
        194             # Unless the data we just read is all empty anyway
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/stackstac/rio_reader.py:385, in AutoParallelRioReader.read(self, window, **kwargs)
        384 def read(self, window: Window, **kwargs) -> np.ndarray:
    --> 385     reader = self.dataset
        386     try:
        387         result = reader.read(
        388             window=window,
        389             masked=True,
       (...)
        392             **kwargs,
        393         )
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/stackstac/rio_reader.py:381, in AutoParallelRioReader.dataset(self)
        379 with self._dataset_lock:
        380     if self._dataset is None:
    --> 381         self._dataset = self._open()
        382     return self._dataset
    
    File ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/stackstac/rio_reader.py:348, in AutoParallelRioReader._open(self)
        340     raise RuntimeError(
        341         f"Assets must have exactly 1 band, but file {self.url!r} has {ds.count}. "
        342         "We can't currently handle multi-band rasters (each band has to be "
        343         "a separate STAC asset), so you'll need to exclude this asset from your analysis."
        344     )
        346 # Only make a VRT if the dataset doesn't match the spatial spec we want
        347 if self.spec.vrt_params != {
    --> 348     "crs": ds.crs.to_epsg(),
        349     "transform": ds.transform,
        350     "height": ds.height,
        351     "width": ds.width,
        352 }:
        353     with self.gdal_env.open_vrt:
        354         vrt = WarpedVRT(
        355             ds,
        356             sharing=False,
        357             resampling=self.resampling,
        358             **self.spec.vrt_params,
        359         )
    
    AttributeError: 'NoneType' object has no attribute 'to_epsg'
    

    So I verified that the stacked Sentinel-1 dataarray does have a crs attribute like 'epsg:32647', but it seems like ds is something else? I did try to step through the code at https://github.com/gjoseph92/stackstac/blob/9106708bcc20daec4c5975e9c1240de38c38f2f1/stackstac/rio_reader.py#L347-L362

    but am having trouble understanding what ds represents, or how ds.crs can be set to a proper coordinate reference system value. My guess is that the crs needs to be set in the STAC Item metadata? Or perhaps the if-statement needs to be revised.
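
    In case it narrows things down, here's a small diagnostic sketch (illustrative only, reusing signed_items from the MCWE above): it checks whether the asset exposes a regular CRS at all, or only ground control points, which would explain ds.crs being None inside _open.

    import rasterio

    # Reuse a signed asset href from the MCWE above.
    url = signed_items[1].assets["vv"].href

    with rasterio.open(url) as ds:
        print("crs:", ds.crs)    # None here, judging by the to_epsg() failure above
        gcps, gcp_crs = ds.gcps  # georeferencing may live in ground control points instead
        print("gcp count:", len(gcps), "| gcp crs:", gcp_crs)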

    Oh, and just to make sure the Sentinel-1 GRD STAC Items are readable, I did try reading one using rioxarray=0.11.1:

    import rioxarray
    
    url: str = signed_items[1].assets["vv"].href
    dataarray = rioxarray.open_rasterio(filename=url, overview_level=5)
    dataarray = dataarray.compute()
    print(dataarray)
    # <xarray.DataArray (band: 1, y: 361, x: 396)>
    # array([[[  0, 224, 259, ...,  45,   0,   0],
    #         [  0, 243, 286, ...,  44,   0,   0],
    #         [  0, 274, 248, ...,  43,   0,   0],
    #         ...,
    #         [  0,   0,   0, ...,  34,  36,  36],
    #         [  0,   0,   0, ...,  33,  35,  34],
    #         [  0,   0,   0, ...,  17,  17,  17]]], dtype=uint16)
    # Coordinates:
    #   * band         (band) int64 1
    #     spatial_ref  int64 0
    # Dimensions without coordinates: y, x
    # Attributes:
    #     _FillValue:    0.0
    #     scale_factor:  1.0
    #     add_offset:    0.0
    

    Here's the output from rioxarray.show_version() for completeness to show my GDAL and other geo-library versions:

    rioxarray (0.11.1) deps:
      rasterio: 1.3.0
        xarray: 2022.3.0
          GDAL: 3.5.0
          GEOS: 3.10.2
          PROJ: 9.0.0
     PROJ DATA: ~/mambaforge/envs/zen3geo/lib/python3.10/site-packages/rasterio/proj_data
     GDAL DATA: None
    
    Other python deps:
         scipy: 1.9.0
        pyproj: 3.3.1
    
    System:
        python: 3.10.6 | packaged by conda-forge | (main, Aug 22 2022, 20:36:39) [GCC 10.4.0]
    executable: ~/mambaforge/envs/zen3geo/bin/python
       machine: Linux-5.17.0-1016-oem-x86_64-with-glibc2.35
    

    P.S. Thanks for this amazing library, really like the design and it's been a pleasure to use :smile:

    opened by weiji14 7
  • Unrecognized STAC collection type <class 'pystac.item_collection.ItemCollection'>

    Unrecognized STAC collection type

    I was following the basic example from the docs https://stackstac.readthedocs.io/en/latest/basic.html but it falls over at stackstac.stack(items):

    TypeError: Unrecognized STAC collection type <class 'pystac.item_collection.ItemCollection'>: <pystac.item_collection.ItemCollection object at 0x000001C57CB0BCD0>

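    pystac.ItemCollection support was added in stackstac v0.2.2 (see the release notes below), so this likely indicates an older stackstac or a pystac version mismatch. A workaround sketch that may get past the type check (assuming items is the ItemCollection from the docs example) is to pass plain GeoJSON dicts instead:

    import stackstac

    # `items` is the pystac ItemCollection returned by the search in the docs example.
    plain_items = [item.to_dict() for item in items]
    stack = stackstac.stack(plain_items)
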
    opened by letmaik 4
  • Possibly switch back to Poetry?

    Possibly switch back to Poetry?

    I just switched to PDM, because Poetry was taking multiple hours to lock dependencies (due to Coiled being a dependency with way too many transitive deps, and https://github.com/python-poetry/poetry/issues/5121).

    Testing with Poetry 1.2.0 though, the situation doesn't seem as bad.

    Poetry is still roughly 4x slower than PDM—170 seconds instead of 42. But 170s is way, way better than 3h. It's within the range of tolerability. And since Poetry is more widely used and mature than PDM, and a bit more straightforward to work with in virtualenvs, I'm tempted to stick with it for now, even with the longer lock times. It still doesn't have https://github.com/python-poetry/poetry/issues/697, which could end up being critical at some point, but so far hasn't been too much of an issue.

    Probably won't take any action on this for now, just noting this in case other compelling reasons come up one way or another.

    gabe dev/stackstac ‹fc326d0› » time pdm add -dG docs sphinx-paramlinks
    Adding packages to docs dev-dependencies: sphinx-paramlinks
    Virtualenv /Users/gabe/dev/stackstac/.venv is reused.
    🔒 Lock successful
    Changes are written to pdm.lock.
    Changes are written to pyproject.toml.
    All packages are synced to date, nothing to do.
    
    🎉 All complete!
    
    pdm add -dG docs sphinx-paramlinks  42.10s user 1.82s system 105% cpu 41.601 total
    gabe dev/stackstac ‹fc326d0*› » 
    gabe dev/stackstac ‹fc326d0*› » 
    gabe dev/stackstac ‹c67944b› » time poetry add sphinx-paramlinks
    Using version ^0.5.4 for sphinx-paramlinks
    
    Updating dependencies
    Resolving dependencies... Downloading https://files.pythonhosted.org/packages/2f/be/7d6e073a3eb740ebeba43a69f5de2b141fea50b801e24e0ae024ac94d4ac/matplotlib-3.5.2.tar.gz  28% (2
    Resolving dependencies... Downloading https://files.pythonhosted.org/packages/2f/be/7d6e073a3eb740ebeba43a69f5de2b141fea50b801e24e0ae024ac94d4ac/matplotlib-3.5.2.tar.gz  58% (2
    Resolving dependencies... Downloading https://files.pythonhosted.org/packages/2f/be/7d6e073a3eb740ebeba43a69f5de2b141fea50b801e24e0ae024ac94d4ac/matplotlib-3.5.2.tar.gz  88% (2
    Resolving dependencies... Downloading https://files.pythonhosted.org/packages/03/c6/14a17e10813b8db20d1e800ff9a3a898e65d25f2b0e9d6a94616f1e3362c/numpy-1.23.0.tar.gz  28% (42.5s
    Resolving dependencies... (76.1s)
    
    Writing lock file
    
    Package operations: 1 install, 49 updates, 0 removals
    
      • Updating attrs (22.1.0 -> 21.4.0)
      • Updating fastjsonschema (2.16.1 -> 2.15.3)
      • Updating jsonschema (4.15.0 -> 4.6.1)
      • Updating jupyter-core (4.11.1 -> 4.10.0)
      • Updating pyzmq (23.2.1 -> 23.2.0)
      • Updating tornado (6.1 -> 6.2)
      • Updating mistune (2.0.4 -> 0.8.4)
      • Updating nbclient (0.6.7 -> 0.6.6)
      • Updating pygments (2.13.0 -> 2.12.0)
      • Updating sniffio (1.3.0 -> 1.2.0)
      • Updating matplotlib-inline (0.1.6 -> 0.1.3)
      • Updating nbconvert (7.0.0 -> 6.5.0)
      • Updating prompt-toolkit (3.0.31 -> 3.0.30)
      • Updating websocket-client (1.4.1 -> 1.3.3)
      • Updating charset-normalizer (2.1.1 -> 2.1.0)
      • Updating debugpy (1.6.3 -> 1.6.0)
      • Updating frozenlist (1.3.1 -> 1.3.0)
      • Updating psutil (5.9.2 -> 5.9.1)
      • Updating pytz (2022.2.1 -> 2022.1)
      • Updating urllib3 (1.26.12 -> 1.26.9)
      • Updating zipp (3.8.1 -> 3.8.0)
      • Updating ipykernel (6.15.2 -> 6.15.0)
      • Updating json5 (0.9.10 -> 0.9.8)
      • Updating toolz (0.12.0 -> 0.11.2)
      • Updating yarl (1.8.1 -> 1.7.2)
      • Updating cloudpickle (2.2.0 -> 2.1.0)
      • Updating docutils (0.19 -> 0.18.1)
      • Updating fsspec (2022.8.2 -> 2022.5.0)
      • Updating jupyterlab-server (2.15.1 -> 2.15.0)
      • Updating nbclassic (0.4.3 -> 0.4.2)
      • Updating numpy (1.23.2 -> 1.23.0)
      • Updating partd (1.3.0 -> 1.2.0)
      • Updating dask (2022.9.0 -> 2022.6.1)
      • Updating jupyterlab (3.4.6 -> 3.4.3)
      • Updating pystac (1.6.1 -> 1.4.0)
      • Updating sphinx (5.1.1 -> 5.0.2)
      • Updating distributed (2022.9.0 -> 2022.6.1)
      • Updating exceptiongroup (1.0.0rc9 -> 1.0.0rc8)
      • Updating keyring (23.9.1 -> 23.6.0)
      • Updating pathspec (0.10.1 -> 0.9.0)
      • Updating py-spy (0.3.13 -> 0.3.12)
      • Updating readme-renderer (37.1 -> 35.0)
      • Installing tqdm (4.64.0)
      • Updating viztracer (0.15.4 -> 0.12.3)
      • Updating xarray (2022.6.0 -> 2022.3.0)
      • Updating black (22.8.0 -> 22.6.0)
      • Updating graphviz (0.20.1 -> 0.16)
      • Updating hypothesis (6.54.5 -> 6.49.1)
      • Updating rasterio (1.3.2 -> 1.3.0)
      • Updating twine (4.0.1 -> 3.8.0)
    poetry add sphinx-paramlinks  170.55s user 25.80s system 160% cpu 2:02.03 total
    
    opened by gjoseph92 4
Releases(v0.4.3)
  • v0.4.3(Sep 14, 2022)

    Changelog: https://github.com/gjoseph92/stackstac/blob/main/CHANGELOG.md#043-2022-09-14

    What's Changed

    • Support sequence of Items by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/164
    • Switch to PDM for dependency management by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/169
    • Add param links to docs by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/170
    • Automatically hide pdm.lock in GitHub diffs by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/171
    • Fix docs build for visualization functions by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/172
    • Fix docs build for PDM again by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/175
    • Remove deprecated skip_equivalent pyproj arg by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/174
    • Release v0.4.3 by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/176

    Full Changelog: https://github.com/gjoseph92/stackstac/compare/v0.4.2...v0.4.3

    Source code(tar.gz)
    Source code(zip)
  • v0.4.2(Jul 7, 2022)

    Changelog: https://github.com/gjoseph92/stackstac/blob/main/CHANGELOG.md#042-2022-07-06

    What's Changed

    • Ignore axis if dim is given by @aazuspan in https://github.com/gjoseph92/stackstac/pull/149
    • Update sphinx by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/158
    • Use poetry lockfile in docs build by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/159
    • Remove calls to rio.parse_path (maintain compat with rasterio==1.3) by @carderne in https://github.com/gjoseph92/stackstac/pull/155
    • Remove rasterio windows.from_bounds workaround by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/160
    • Update notebooks for release by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/161
    • Release 0.4.2 by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/162

    New Contributors

    • @aazuspan made their first contribution in https://github.com/gjoseph92/stackstac/pull/149
    • @carderne made their first contribution in https://github.com/gjoseph92/stackstac/pull/155

    Full Changelog: https://github.com/gjoseph92/stackstac/compare/v0.4.1...v0.4.2

    Source code(tar.gz)
    Source code(zip)
  • v0.4.1(Apr 15, 2022)

    Changelog: https://github.com/gjoseph92/stackstac/blob/main/CHANGELOG.md#041-2022-04-15

    What's Changed

    • Pandas compat by @TomAugspurger in https://github.com/gjoseph92/stackstac/pull/143
    • Better error when forgetting mosaic custom nodata by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/144
    • Release 0.4.1 by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/145

    Full Changelog: https://github.com/gjoseph92/stackstac/compare/v0.4.0...v0.4.1

    Source code(tar.gz)
    Source code(zip)
  • v0.4.0(Mar 17, 2022)

    Changelog: https://github.com/gjoseph92/stackstac/blob/main/CHANGELOG.md#040-2022-03-16

    What's Changed

    • Support chunking the time/band dimensions by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/116
    • Better errors for pystac/satstac import issues by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/125
    • Use tree-reduction in mosaic by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/126
    • Update Coiled senv script and dev dependencies by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/127
    • Allow new releases of xarray using "calver" versioning by @scottyhq in https://github.com/gjoseph92/stackstac/pull/137
    • Pillow 9.0.1 by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/141
    • Release 0.4.0 by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/142

    Full Changelog: https://github.com/gjoseph92/stackstac/compare/v0.3.1...v0.4.0

    Source code(tar.gz)
    Source code(zip)
  • v0.3.1(Jan 20, 2022)

    Changelog: https://github.com/gjoseph92/stackstac/blob/main/CHANGELOG.md#031-2022-01-20

    What's Changed

    • Support nodata= in mosaic by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/121
    • Catch some mosaic usage errors with nodata=nan by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/123
    • Release 0.3.1 by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/122

    Full Changelog: https://github.com/gjoseph92/stackstac/compare/v0.3.0...v0.3.1

    Source code(tar.gz)
    Source code(zip)
  • v0.3.0(Jan 20, 2022)

    Changelog: https://github.com/gjoseph92/stackstac/blob/main/CHANGELOG.md#030-2022-01-20

    What's Changed

    • Relax NumPy requirement by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/101
    • Update base URL regex to make show work with paths with 'notebook' in them by @robintw in https://github.com/gjoseph92/stackstac/pull/104
    • No fill_value=None; use fill value out-of-bounds by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/115
    • Support 2022 Dask by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/119
    • Release 0.3.0 by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/120

    New Contributors

    • @robintw made their first contribution in https://github.com/gjoseph92/stackstac/pull/104

    Full Changelog: https://github.com/gjoseph92/stackstac/compare/v0.2.2...v0.3.0

    Source code(tar.gz)
    Source code(zip)
  • v0.2.2(Dec 3, 2021)

    Changelog: https://github.com/gjoseph92/stackstac/blob/main/CHANGELOG.md#022-2021-12-03

    What's Changed

    • fixed upper right corner bounds transformation by @g2giovanni in https://github.com/gjoseph92/stackstac/pull/60
    • Support pystac.ItemCollection by @TomAugspurger in https://github.com/gjoseph92/stackstac/pull/64
    • Ignore assets that don't have a type, if filtering by mimetype by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/71
    • Fix for Pystac ItemCollections by @scottyhq in https://github.com/gjoseph92/stackstac/pull/69
    • Scripts: only make coiled env in one region by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/72
    • Add __iter__ to PystacItemCollection stub by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/73
    • Allow reading .tiff extensions from storage by @JamesOConnor in https://github.com/gjoseph92/stackstac/pull/75
    • Switch examples back to satsearch by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/80
    • Nit: remove default args from Reader protocol by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/82
    • Switch examples back to pystac again by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/81
    • Bump planetary-computer requirement by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/89
    • Fix accumulate_metadata replacing coordinates with Nones by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/88
    • Fix one-pixel shift with xy_coords="center" by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/94
    • Remove thread-unsafe event logging code by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/95
    • Switch examples to Binder by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/98
    • Release 0.2.2 by @gjoseph92 in https://github.com/gjoseph92/stackstac/pull/99

    New Contributors

    • @g2giovanni made their first contribution in https://github.com/gjoseph92/stackstac/pull/60
    • @TomAugspurger made their first contribution in https://github.com/gjoseph92/stackstac/pull/64
    • @JamesOConnor made their first contribution in https://github.com/gjoseph92/stackstac/pull/75

    Full Changelog: https://github.com/gjoseph92/stackstac/compare/v0.2.1...v0.2.2

    Source code(tar.gz)
    Source code(zip)