Spatial Single-Cell Analysis Toolkit

Overview

Single-Cell Image Analysis Package





Scimap is a scalable toolkit for analyzing spatial molecular data. The underlying framework is generalizable to any spatial dataset mapped to XY coordinates. The package uses the anndata framework, making it easy to integrate with other popular single-cell analysis toolkits. It includes preprocessing, phenotyping, visualization, clustering, spatial analysis, and differential spatial testing. The Python-based implementation efficiently handles large datasets of millions of cells.

Installation

We strongly recommend installing scimap in a fresh virtual environment.

# If you have conda installed
conda create --name scimap python=3.7
conda activate scimap

Install scimap directly into an activated virtual environment:

$ pip install scimap

After installation, the package can be imported as:

$ python
>>> import scimap as sm

Get Started

Detailed documentation of scimap functions and tutorials is available here.
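
A minimal usage sketch (illustrative only: the file names and column names below are assumptions, and the spatial_lda parameters mirror those shown in the issues further down). Marker intensities go into the AnnData X matrix, while cell coordinates, image IDs, and phenotype labels live in adata.obs:

    import anndata as ad
    import pandas as pd
    import scimap as sm

    # Hypothetical input tables: a cells-x-markers intensity matrix and a
    # metadata table with X, Y, imageid, and celltype columns.
    intensities = pd.read_csv('marker_intensities.csv', index_col=0)
    metadata = pd.read_csv('cell_metadata.csv', index_col=0)

    adata = ad.AnnData(X=intensities.values, obs=metadata)
    adata.var_names = intensities.columns

    # scimap tools then operate on the AnnData object, for example:
    adata = sm.tl.spatial_lda(adata, x_coordinate='X', y_coordinate='Y',
                              phenotype='celltype', imageid='imageid')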

SCIMAP development is led by Ajit Johnson Nirmal at the Laboratory of Systems Pharmacology, Harvard Medical School.

Funding

This work is supported by NIH grant K99-CA256497.

Comments
  • sm.tl.spatial_lda stores result in adata.uns not adata.obs


    Hello,

I am trying to run sm.tl.spatial_lda with my adata object, but for some reason the spatial_lda results are stored in adata.uns and not adata.obs. This is causing errors downstream when clustering with sm.tl.cluster. Any help would be appreciated!

    I am working in Python v3.9.

    After running the code:

    adata= sm.tl.spatial_lda(adata, x_coordinate='X', y_coordinate='Y', phenotype='celltype', method='radius', radius=30, knn=10, imageid='UniqueID', num_motifs=10, random_state=0, subset=None, label='spatial_lda')

    adata

AnnData object with n_obs × n_vars = 79308 × 34
    obs: 'Unnamed: 0', 'X', 'Y', 'Area', 'celltype', 'TLSType', 'UniqueID'
    uns: 'spatial_lda', 'spatial_lda_probability', 'spatial_lda_model'
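
A workaround worth trying before clustering (an assumption: this presumes adata.uns['spatial_lda'] holds a per-cell DataFrame of motif weights indexed like adata.obs, which should be checked first) is to copy the weights into adata.obs:

    import pandas as pd

    # Copy the per-cell motif weights from uns into obs so downstream tools
    # that expect obs columns can see them.
    weights = adata.uns['spatial_lda']
    weights = weights.reindex(adata.obs.index)
    adata.obs = adata.obs.join(weights.add_prefix('lda_'))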

    Scimap

    opened by marinabroz 12
  • strange spatial_LDA -> spatial_cluster results


    Hello again!

    When clustering (spatial_cluster) on spatial_LDA results, I am getting strange results, as below.

    I always get reasonable spatial_cluster results when training on a single ROI. But with as few as 2 ROIs, I start to get this artifactual-seeming result, visible as clusters forming vertical stripes in one or more of the ROIs.

I have tried both 'knn' and 'radius' as spatial_LDA methods with varying values of motifs, knn, and radius. The clustering method was always kmeans (leiden and phenograph always gave me 99 clusters even with resolution set to 0.1, so I am actually not sure whether it's spatial_LDA or the clustering that is contributing to this).

    Conditions which promote the appearance of this "artifact":

    • more than one ROI trained together
    • radius larger than 30
    • smaller/more numerous cells

    Example of a "sensible" spatial clustering result:

    image

When one additional ROI is trained together with it, with all the same spatial_LDA and spatial_cluster parameters, that ROI becomes:

    image

    Some real structure is retained in the lower left corner, while the right side no longer makes sense...

    Any idea what could be causing this, or parameters to try which could mitigate?

    Thank you again!!
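
One way to narrow down whether the LDA step or the clustering step is at fault is to cluster the motif weights directly with scikit-learn and check how stable the result is across k (a diagnostic sketch; it assumes the per-cell weights are stored in adata.uns['spatial_lda']):

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    weights = np.asarray(adata.uns['spatial_lda'])  # assumed location of the motif weights
    for k in range(3, 9):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(weights)
        score = silhouette_score(weights, labels, sample_size=5000, random_state=0)
        print(f"k={k}  silhouette={score:.3f}")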

    opened by yerahko 5
  • Segmentation mask error in pl.image_viewer


I am getting an error when I include a segmentation mask file:

    # Pre- subset the anndata to prevent indexing error. Also, don't use `imageid` and `subset` args to image_viewer().
    #selected = adata[adata.obs['Unique_ID'].isin(['PT7.ROI_1']), :]          # This doesn't work
    selected = adata[adata.obs['Unique_ID'].isin(['PT7.ROI_1']), :].copy() # Making a copy makes it work
    
    sm.pl.image_viewer(image_path=img_path,
                       seg_mask=seg_mask,   # seg_mask=None,
                       #adata=adata, imageid='Unique_ID', subset = 'PT7.ROI_1',  # Don't do this, Use pre-subsetted data instead
                       adata=selected,
                       overlay='spatial_kmeans', 
                       channel_names=layers_order,
                       x_coordinate='X', y_coordinate='Y', flip_y=False, point_size=4)
    

The segmentation mask file is readable; I am able to display it in napari by running this chunk of code from sm.pl.image_viewer (minus the last two lines, which I commented out because they break):

    # Load the segmentation mask
    if seg_mask is not None:
        seg_m = tiff.imread(seg_mask)
        # if seg_m.shape[0] > 1:
        #     seg_m = seg_m[0]
    

    followed by:

    napari.view_image(seg_m)
    

    However, I get the error below when trying to open it with image_viewer().

    ---------------------------------------------------------------------------
    ValueError                                Traceback (most recent call last)
    Input In [198], in <cell line: 5>()
          1 # Pre- subset the anndata to prevent indexing error. Also, don't use imageid and subset args to image_viewer().
          2 #selected = adata[adata.obs['Unique_ID'].isin(['PT7.ROI_1'])]          # This doesn't work
          3 selected = adata[adata.obs['Unique_ID'].isin(['PT7.ROI_1']), :].copy() # Making a copy makes it work
    ----> 5 viewer=sm.pl.image_viewer(image_path=img_path,
          6                    seg_mask=seg_path, 
          7                    #adata=adata, imageid='Unique_ID', subset = 'PT7.ROI_1',  # Don't do this, Use pre-subsetted data instead
          8                    adata=selected,
          9                    overlay='cellsimple2', # 'spatial_kmeans', # 
         10                    channel_names=layers_order,
         11                    x_coordinate='X', y_coordinate='Y', flip_y=False, point_size=4)
    
    File ~/anaconda3/envs/napari-imc/lib/python3.9/site-packages/scimap/plotting/_image_viewer.py:184, in image_viewer(image_path, adata, overlay, flip_y, overlay_category, markers, channel_names, x_coordinate, y_coordinate, point_size, point_color, subset, imageid, seg_mask, **kwargs)
        182 # Add the seg mask
        183 if seg_mask is not None:
    --> 184     viewer.add_labels(seg_m, name='segmentation mask', visible=False)
        186 # Add phenotype layer function
        187 def add_phenotype_layer (adata, overlay, phenotype_layer,x,y,viewer,point_size,point_color):
    
    File ~/anaconda3/envs/napari-imc/lib/python3.9/site-packages/napari/components/viewer_model.py:4, in add_labels(self, data, num_colors, features, properties, color, seed, name, metadata, scale, translate, rotate, shear, affine, opacity, blending, rendering, depiction, visible, multiscale, cache, plane, experimental_clipping_planes)
          1 from __future__ import annotations
          3 import inspect
    ----> 4 import itertools
          5 import os
          6 import warnings
    
    File ~/anaconda3/envs/napari-imc/lib/python3.9/site-packages/napari/layers/labels/labels.py:259, in Labels.__init__(self, data, num_colors, features, properties, color, seed, name, metadata, scale, translate, rotate, shear, affine, opacity, blending, rendering, depiction, visible, multiscale, cache, plane, experimental_clipping_planes)
        256 self._show_selected_label = False
        257 self._contour = 0
    --> 259 data = self._ensure_int_labels(data)
        260 self._color_lookup_func = None
        262 super().__init__(
        263     data,
        264     rgb=False,
       (...)
        284     experimental_clipping_planes=experimental_clipping_planes,
        285 )
    
    File ~/anaconda3/envs/napari-imc/lib/python3.9/site-packages/napari/layers/labels/labels.py:554, in Labels._ensure_int_labels(self, data)
        552 def _ensure_int_labels(self, data):
        553     """Ensure data is integer by converting from bool if required, raising an error otherwise."""
    --> 554     looks_multiscale, data = guess_multiscale(data)
        555     if not looks_multiscale:
        556         data = [data]
    
    File ~/anaconda3/envs/napari-imc/lib/python3.9/site-packages/napari/layers/image/_image_utils.py:76, in guess_multiscale(data)
         72 consistent = bool(np.all(sizes[:-1] > sizes[1:]))
         73 if np.all(sizes == sizes[0]):
         74     # note: the individual array case should be caught by the first
         75     # code line in this function, hasattr(ndim) and ndim > 1.
    ---> 76     raise ValueError(
         77         trans._(
         78             'Input data should be an array-like object, or a sequence of arrays of decreasing size. Got arrays of single shape: {shape}',
         79             deferred=True,
         80             shape=shapes[0],
         81         )
         82     )
         83 if not consistent:
         84     raise ValueError(
         85         trans._(
         86             'Input data should be an array-like object, or a sequence of arrays of decreasing size. Got arrays in incorrect order, shapes: {shapes}',
       (...)
         89         )
         90     )
    
    ValueError: Input data should be an array-like object, or a sequence of arrays of decreasing size. Got arrays of single shape: ()
    
    

    Thank you!
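
For anyone hitting the same error: napari's add_labels expects a plain integer array, so it can help to inspect what tifffile actually returns for the mask before handing it to image_viewer (a diagnostic sketch, not a confirmed fix):

    import numpy as np
    import tifffile as tiff

    seg_m = tiff.imread(seg_mask)
    print(type(seg_m), getattr(seg_m, 'dtype', None), getattr(seg_m, 'shape', None))

    # If the mask loads as a stack or with extra singleton dimensions, squeezing
    # it down to a single 2D integer plane is one thing to try:
    seg_m = np.asarray(seg_m).squeeze()
    if seg_m.ndim > 2:
        seg_m = seg_m[0]
    seg_m = seg_m.astype(np.int32)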

    opened by yerahko 3
  • sm.pl.foldchange runs into KeyError


    Dear scimap developer,

I am Jose, a biologist who now needs to perform multiplex image analysis. I have run MCMICRO and loaded the output CSV as an AnnData object. While trying to visualize the data, the foldchange function runs into an error; I am probably doing something wrong. Attached: foldchange_KeyError.txt and conda list.txt.

    opened by josenimo 3
  • sm.pl.image_viewer works but the napari visualization won't pop up


Hello everyone,

I am new to bioinformatic analysis. Currently, I am working on spatial CODEX imaging analysis and I have found this package very useful. However, I have encountered a problem using sm.pl.image_viewer: the code seems to run, but the napari visualization won't pop up. This is the code I ran:

    import sys
    import os
    import anndata as ad
    import pandas as pd
    import scanpy as sc
    import seaborn as sns; sns.set(color_codes=True)

    # Import scimap
    import scimap as sm

    # Set the working directory
    os.chdir("/Users/admin/Desktop/scimap package.CODEX.analysis/")

    # Load data
    adata = ad.read('tutorial_data_D20LN1_setting5.h5ad')
    adata
    AnnData object with n_obs × n_vars = 8957 × 24
        obs: 'X_centroid', 'Y_centroid', 'Area', 'MajorAxisLength', 'MinorAxisLength', 'Eccentricity', 'Solidity', 'Extent', 'Orientation', 'CellID', 'imageid'
        uns: 'all_markers', 'pca'
        obsm: 'X_pca'
        varm: 'PCs'

    image_path = '/Users/admin/Desktop/scimap package.CODEX.analysis/image.ome.tif'
    sm.pl.image_viewer(image_path, adata, overlay='leiden', overlay_category=None,
                       markers=['CD8'], imageid='imageid', seg_mask=None,
                       point_size=7, point_color='white')

The data provided for adata is the h5ad file loaded above.
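
One thing worth checking when the viewer never appears (an assumption, not a confirmed cause for this report): napari windows only show up once a Qt event loop is running.

    import napari

    # In a Jupyter notebook, run the IPython magic "%gui qt" in a cell first so
    # the Qt event loop is active; from a plain Python script, call napari.run()
    # after building the viewer so the window stays open until it is closed.
    sm.pl.image_viewer(image_path, adata, overlay='leiden', overlay_category=None,
                       markers=['CD8'], imageid='imageid', seg_mask=None,
                       point_size=7, point_color='white')
    napari.run()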

Any help would be really appreciated.

    Best regards,

    Bugie

    opened by bugie19 2
  • image_viewer and gate_finder do not work: type object 'SubControl' has no attribute 'SC_None'


    Hi,

and thank you for developing scimap! Unfortunately, I have had problems with sm.pl.image_viewer and sm.pl.gate_finder, where I keep getting the same error: type object 'SubControl' has no attribute 'SC_None'. All the previous steps in the scimap tutorials have worked smoothly. Apparently the problem is in opening napari. Two collaborators have kindly tried image_viewer and gate_finder on the same data and have not had any problems. However, I am the only one using a Mac (currently Monterey 12.4), so could this be a Mac-napari thing?

    Here is a link to a jupyter notebook example, error message and to example image.

    Any help will be much appreciated.

    Best, Joona

    opened by SarkkinenJ 2
  • spatial_LDA with KNN neighborhoods


    Hi there,

    I am running into trouble when I try to run the function spatial_LDA using method='knn', i.e.: https://github.com/labsyspharm/scimap/blob/b957d8d771260947c4f783b62533c0d0c3c48e5a/scimap/tools/_spatial_lda.py#L96-L112

    I am getting the following error at line 112:

    ---------------------------------------------------------------------------
    ValueError                                Traceback (most recent call last)
    
          [1] for i in range(len(ind)):
    ----> [2]     ind[i] = [phenomap[letter] for letter in ind[i]]
    
    ValueError: invalid literal for int() with base 10: 'B'
    

    Using method='radius' works.

    Thank you very much for your help!
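
The error itself is easy to reproduce outside scimap: assigning string phenotype labels back into an integer NumPy index array forces a cast to int, which is presumably what the highlighted line runs into (an illustration of the failure mode, not scimap's actual code):

    import numpy as np

    ind = np.array([[0, 1], [1, 0]])        # integer neighbour indices from a KNN query
    phenomap = {0: 'B', 1: 'T'}
    ind[0] = [phenomap[i] for i in ind[0]]  # ValueError: invalid literal for int() with base 10: 'B'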

    opened by yerahko 2
  • install errors


    Dear developer,

I am just wondering whether there is any way to install scimap on an M1 MacBook. I got errors when running pip install scimap.

Thanks in advance

    opened by sailseem 2
  • Bump nltk from 3.5 to 3.6.6


    Bumps nltk from 3.5 to 3.6.6.

    Changelog

    Sourced from nltk's changelog.

    Version 3.7 2022-02-09

    • Improve and update the NLTK team page on nltk.org (#2855, #2941)
    • Drop support for Python 3.6, support Python 3.10 (#2920)

    Version 3.6.7 2021-12-28

    • Resolve IndexError in sent_tokenize and word_tokenize (#2922)

    Version 3.6.6 2021-12-21

    • Refactor gensim.doctest to work for gensim 4.0.0 and up (#2914)
    • Add Precision, Recall, F-measure, Confusion Matrix to Taggers (#2862)
    • Added warnings if .zip files exist without any corresponding .csv files. (#2908)
    • Fix FileNotFoundError when the download_dir is a non-existing nested folder (#2910)
    • Rename omw to omw-1.4 (#2907)
    • Resolve ReDoS opportunity by fixing incorrectly specified regex (#2906)
    • Support OMW 1.4 (#2899)
    • Deprecate Tree get and set node methods (#2900)
    • Fix broken inaugural test case (#2903)
    • Use Multilingual Wordnet Data from OMW with newer Wordnet versions (#2889)
    • Keep NLTKs "tokenize" module working with pathlib (#2896)
    • Make prettyprinter to be more readable (#2893)
    • Update links to the nltk book (#2895)
    • Add CITATION.cff to nltk (#2880)
    • Resolve serious ReDoS in PunktSentenceTokenizer (#2869)
    • Delete old CI config files (#2881)
    • Improve Tokenize documentation + add TokenizerI as superclass for TweetTokenizer (#2878)
    • Fix expected value for BLEU score doctest after changes from #2572
    • Add multi Bleu functionality and tests (#2793)
    • Deprecate 'return_str' parameter in NLTKWordTokenizer and TreebankWordTokenizer (#2883)
    • Allow empty string in CFG's + more (#2888)
    • Partition tree.py module into tree package + pickle fix (#2863)
    • Fix several TreebankWordTokenizer and NLTKWordTokenizer bugs (#2877)
    • Rewind Wordnet data file after each lookup (#2868)
    • Correct init call for SyntaxCorpusReader subclasses (#2872)
    • Documentation fixes (#2873)
    • Fix levenstein distance for duplicated letters (#2849)
    • Support alternative Wordnet versions (#2860)
    • Remove hundreds of formatting warnings for nltk.org (#2859)
    • Modernize nltk.org/howto pages (#2856)
    • Fix Bleu Score smoothing function from taking log(0) (#2839)
    • Update third party tools to newer versions and removing MaltParser fixed version (#2832)
    • Fix TypeError: _pretty() takes 1 positional argument but 2 were given in sem/drt.py (#2854)
    • Replace http with https in most URLs (#2852)

    Thanks to the following contributors to 3.6.6 Adam Hawley, BatMrE, Danny Sepler, Eric Kafe, Gavish Poddar, Panagiotis Simakis, RnDevelover, Robby Horvath, Tom Aarsen, Yuta Nakamura, Mohaned Mashaly

    ... (truncated)

    Commits
    • 4862b09 updates for 3.6.6
    • 6b60213 Refactor gensim.doctest to work for gensim 4.0.0 and up (#2914)
    • 59aa3fb Fix decode error for bllip parser (#2897)
    • a28d256 Add Precision, Recall, F-measure, Confusion Matrix to Taggers (#2862)
    • 72d9885 Added warnings if .zip files exist without any corresponding .csv files. (#2908)
    • dea7b44 Fix FileNotFoundError when the download_dir is a non-existing nested fold...
    • abbe86b Undo #2909 due to unexpected test failure
    • c075dab Allow commits with /nocache to not use the cache (#2909)
    • d6d513d Renamed omw to omw-1.4 (#2907)
    • 2a50a3e Resolve ReDoS opportunity by fixing incorrectly specified regex (#2906)
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 1
  • Bump ipython from 7.19.0 to 7.31.1


    Bumps ipython from 7.19.0 to 7.31.1.

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    dependencies 
    opened by dependabot[bot] 1
  • Bump pillow from 8.0.1 to 9.0.1


    Bumps pillow from 8.0.1 to 9.0.1.

    Release notes

    Sourced from pillow's releases.

    9.0.1

    https://pillow.readthedocs.io/en/stable/releasenotes/9.0.1.html

    Changes

    • In show_file, use os.remove to remove temporary images. CVE-2022-24303 #6010 [@​radarhere, @​hugovk]
    • Restrict builtins within lambdas for ImageMath.eval. CVE-2022-22817 #6009 [radarhere]

    9.0.0

    https://pillow.readthedocs.io/en/stable/releasenotes/9.0.0.html

    Changes

    ... (truncated)

    Changelog

    Sourced from pillow's changelog.

    9.0.1 (2022-02-03)

    • In show_file, use os.remove to remove temporary images. CVE-2022-24303 #6010 [radarhere, hugovk]

    • Restrict builtins within lambdas for ImageMath.eval. CVE-2022-22817 #6009 [radarhere]

    9.0.0 (2022-01-02)

    • Restrict builtins for ImageMath.eval(). CVE-2022-22817 #5923 [radarhere]

    • Ensure JpegImagePlugin stops at the end of a truncated file #5921 [radarhere]

    • Fixed ImagePath.Path array handling. CVE-2022-22815, CVE-2022-22816 #5920 [radarhere]

    • Remove consecutive duplicate tiles that only differ by their offset #5919 [radarhere]

    • Improved I;16 operations on big endian #5901 [radarhere]

    • Limit quantized palette to number of colors #5879 [radarhere]

    • Fixed palette index for zeroed color in FASTOCTREE quantize #5869 [radarhere]

    • When saving RGBA to GIF, make use of first transparent palette entry #5859 [radarhere]

    • Pass SAMPLEFORMAT to libtiff #5848 [radarhere]

    • Added rounding when converting P and PA #5824 [radarhere]

    • Improved putdata() documentation and data handling #5910 [radarhere]

    • Exclude carriage return in PDF regex to help prevent ReDoS #5912 [hugovk]

    • Fixed freeing pointer in ImageDraw.Outline.transform #5909 [radarhere]

    ... (truncated)

    Commits
    • 6deac9e 9.0.1 version bump
    • c04d812 Update CHANGES.rst [ci skip]
    • 4fabec3 Added release notes for 9.0.1
    • 02affaa Added delay after opening image with xdg-open
    • ca0b585 Updated formatting
    • 427221e In show_file, use os.remove to remove temporary images
    • c930be0 Restrict builtins within lambdas for ImageMath.eval
    • 75b69dd Dont need to pin for GHA
    • cd938a7 Autolink CWE numbers with sphinx-issues
    • 2e9c461 Add CVE IDs
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    dependencies 
    opened by dependabot[bot] 1
  • Bump certifi from 2021.10.8 to 2022.12.7


    Bumps certifi from 2021.10.8 to 2022.12.7.

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    dependencies 
    opened by dependabot[bot] 0
  • supplying additional groupings


Just wondering if there is support for adding additional arguments to some functions to change how the data is grouped, other than by 'imageid'. For example, in tl.foldchange, 'from_group' could be another categorical variable in adata.obs (e.g. 'mutation1') and 'to_group' could be 'mutation2'. Similarly, tl.spatial_interaction is performed by 'imageid'; can these scores be further grouped by another adata.obs variable?
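
In the meantime, one post-hoc option is to regroup per-image results with pandas (a sketch under the assumption that the scores end up in a DataFrame with an 'imageid' column; adjust the names to the actual output):

    import pandas as pd

    # Hypothetical per-image results table with an 'imageid' column and numeric score columns.
    scores = pd.DataFrame({'imageid': ['img1', 'img2'], 'score': [0.4, 0.7]})

    # Map each imageid to its higher-level grouping (e.g. a mutation status kept in adata.obs).
    image_to_group = (adata.obs[['imageid', 'mutation1']]
                      .drop_duplicates()
                      .set_index('imageid')['mutation1'])

    scores['mutation1'] = scores['imageid'].map(image_to_group)
    grouped = scores.groupby('mutation1').mean(numeric_only=True)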

    thanks

    opened by jamesMo84 1
  • Bump pillow from 9.1.1 to 9.3.0


    Bumps pillow from 9.1.1 to 9.3.0.

    Release notes

    Sourced from pillow's releases.

    9.3.0

    https://pillow.readthedocs.io/en/stable/releasenotes/9.3.0.html

    Changes

    ... (truncated)

    Changelog

    Sourced from pillow's changelog.

    9.3.0 (2022-10-29)

    • Limit SAMPLESPERPIXEL to avoid runtime DOS #6700 [wiredfool]

    • Initialize libtiff buffer when saving #6699 [radarhere]

    • Inline fname2char to fix memory leak #6329 [nulano]

    • Fix memory leaks related to text features #6330 [nulano]

    • Use double quotes for version check on old CPython on Windows #6695 [hugovk]

    • Remove backup implementation of Round for Windows platforms #6693 [cgohlke]

    • Fixed set_variation_by_name offset #6445 [radarhere]

    • Fix malloc in _imagingft.c:font_setvaraxes #6690 [cgohlke]

    • Release Python GIL when converting images using matrix operations #6418 [hmaarrfk]

    • Added ExifTags enums #6630 [radarhere]

    • Do not modify previous frame when calculating delta in PNG #6683 [radarhere]

    • Added support for reading BMP images with RLE4 compression #6674 [npjg, radarhere]

    • Decode JPEG compressed BLP1 data in original mode #6678 [radarhere]

    • Added GPS TIFF tag info #6661 [radarhere]

    • Added conversion between RGB/RGBA/RGBX and LAB #6647 [radarhere]

    • Do not attempt normalization if mode is already normal #6644 [radarhere]

    ... (truncated)

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    dependencies 
    opened by dependabot[bot] 0
  • Bump jupyter-core from 4.9.2 to 4.11.2


    Bumps jupyter-core from 4.9.2 to 4.11.2.

    Release notes

    Sourced from jupyter-core's releases.

    4.11.1

    What's Changed

    Full Changelog: https://github.com/jupyter/jupyter_core/compare/4.11.0...4.11.1

    4.11.0

    What's Changed

    New Contributors

    Full Changelog: https://github.com/jupyter/jupyter_core/compare/4.10.0...4.11.0

    4.10.0

    What's Changed

    New Contributors

    Full Changelog: https://github.com/jupyter/jupyter_core/compare/4.9.2...4.10.0

    Changelog

    Sourced from jupyter-core's changelog.

    Changes in jupyter-core

    5.0.0

    (Full Changelog)

    Major Changes

    Prefer Environment Level Configuration

    We now make the assumption that if we are running in a virtual environment, we should prioritize the environment-level sys.prefix over the user-level paths. Users can opt out of this behavior by setting JUPYTER_PREFER_ENV_PATH, which takes precedence over our autodetection.

    Migrate to Standard Platform Directories

    In version 5, we introduce a JUPYTER_PLATFORM_DIRS environment variable to opt in to using more appropriate platform-specific directories. We raise a deprecation warning if the variable is not set. In version 6, JUPYTER_PLATFORM_DIRS will be opt-out. In version 7, we will remove the environment variable checks and old directory logic.

    Drop Support for Python 3.7

    We are dropping support for Python 3.7 ahead of its official end of life, to reduce maintenance burden as we add support for Python 3.11.

    Enhancements made

    Bugs fixed

    Maintenance and upkeep improvements

    Documentation

    Contributors to this release

    ... (truncated)

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    dependencies 
    opened by dependabot[bot] 0
  • Bump joblib from 1.1.0 to 1.2.0


    Bumps joblib from 1.1.0 to 1.2.0.

    Changelog

    Sourced from joblib's changelog.

    Release 1.2.0

    • Fix a security issue where eval(pre_dispatch) could potentially run arbitrary code. Now only basic numerics are supported. joblib/joblib#1327

    • Make sure that joblib works even when multiprocessing is not available, for instance with Pyodide joblib/joblib#1256

    • Avoid unnecessary warnings when workers and main process delete the temporary memmap folder contents concurrently. joblib/joblib#1263

    • Fix memory alignment bug for pickles containing numpy arrays. This is especially important when loading the pickle with mmap_mode != None as the resulting numpy.memmap object would not be able to correct the misalignment without performing a memory copy. This bug would cause invalid computation and segmentation faults with native code that would directly access the underlying data buffer of a numpy array, for instance C/C++/Cython code compiled with older GCC versions or some old OpenBLAS written in platform specific assembly. joblib/joblib#1254

    • Vendor cloudpickle 2.2.0 which adds support for PyPy 3.8+.

    • Vendor loky 3.3.0 which fixes several bugs including:

      • robustly forcibly terminating worker processes in case of a crash (joblib/joblib#1269);

      • avoiding leaking worker processes in case of nested loky parallel calls;

      • reliability spawn the correct number of reusable workers.

    Release 1.1.1

    • Fix a security issue where eval(pre_dispatch) could potentially run arbitrary code. Now only basic numerics are supported. joblib/joblib#1327
    Commits
    • 5991350 Release 1.2.0
    • 3fa2188 MAINT cleanup numpy warnings related to np.matrix in tests (#1340)
    • cea26ff CI test the future loky-3.3.0 branch (#1338)
    • 8aca6f4 MAINT: remove pytest.warns(None) warnings in pytest 7 (#1264)
    • 067ed4f XFAIL test_child_raises_parent_exits_cleanly with multiprocessing (#1339)
    • ac4ebd5 MAINT add back pytest warnings plugin (#1337)
    • a23427d Test child raises parent exits cleanly more reliable on macos (#1335)
    • ac09691 [MAINT] various test updates (#1334)
    • 4a314b1 Vendor loky 3.2.0 (#1333)
    • bdf47e9 Make test_parallel_with_interactively_defined_functions_default_backend timeo...
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.



    dependencies 
    opened by dependabot[bot] 0
Releases(0.22.0)
  • 0.19.0(Apr 3, 2022)

    • Included support for Apple M1 machines
    • Included support for native rendering of Zarr stored images using Napari: pl.image_viewer and pl.gate_finder

Temporary workaround for installing on Apple M1 machines

    # create and activate a new environment
    conda create --name scimap python=3.8 -y
    conda activate scimap
    
    # if you do not have xcode please install it
    xcode-select --install
    
    # if you do not have homebrew please install it
    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
    
    # if you do not have cmake install it
    brew install cmake
    
    # install h5py
    brew install hdf5
    export HDF5_DIR=/opt/homebrew/Cellar/hdf5/1.12.1_1/
    pip install --no-binary=h5py h5py
    
    # install llvmlite
    conda install llvmlite -y
    
    # install leidenalg
    pip install git+https://github.com/vtraag/leidenalg.git
    
    # install scimap
    pip install -U scimap
    
    # uninstall 
    conda remove llvmlite -y
    pip uninstall numba -y
    pip uninstall numpy -y
    
    # reinstall this specific version of llvmlite (ignore errors/warning)
    pip install -i https://pypi.anaconda.org/numba/label/wheels_experimental_m1/simple llvmlite
    
    # reinstall this specific version of numpy (ignore errors/warning)
    pip install numpy==1.22.3
    
    # reinstall this specific version of numba (ignore errors/warning)
    pip install -i https://pypi.anaconda.org/numba/label/wheels_experimental_m1/simple numba
    
  • 0.17.7(Aug 5, 2021)

Testing the new GitHub Action that automatically builds, tags, and pushes Docker container images. The action is triggered by new releases.

  • 0.17.2(Jul 1, 2021)

  • 0.1.10(Jun 29, 2020)

Owner
Laboratory of Systems Pharmacology @ Harvard
Reinventing the fundamental science underlying the development of new medicines and their use in individual patients.