A meta plugin for processing timelapse data timepoint by timepoint in napari

Overview

napari-time-slicer


A meta plugin for processing timelapse data timepoint by timepoint. It enables a list of napari plugins to process 2D+t or 3D+t data step by step as the user moves through the timelapse. Currently, these plugins use napari-time-slicer:

napari-time-slicer enables inter-plugin communication, e.g. allowing the plugins listed above to be combined into a single image processing workflow for segmenting a timelapse dataset:

If you want to convert a 3D dataset into a 2D + time dataset, use the menu Tools > Utilities > Convert 3D stack to 2D timelapse (time-slicer). It will turn the 3D dataset into a 4D dataset where the Z-dimension (index 1) has only one element, which napari displays with a time slider. Note: It is recommended to remove the original 3D dataset after this conversion.
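Conceptually, this conversion only reshapes the data. The following is a minimal sketch of the equivalent array operation using numpy directly; the variable names are illustrative and this is not the plugin's actual implementation:

import numpy as np

# hypothetical 3D stack with shape (Z, Y, X)
stack_3d = np.random.random((50, 256, 256))

# treat the former Z axis as time and insert a singleton Z axis at index 1,
# yielding shape (T, 1, Y, X); napari then shows a time slider for axis 0
timelapse = stack_3d[:, np.newaxis, :, :]
print(timelapse.shape)  # (50, 1, 256, 256)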

Usage for plugin developers

Plugins which implement the napari_experimental_provide_function hook can make use of the @time_slicer decorator. At the moment, only functions which take napari.types.ImageData, napari.types.LabelsData and basic Python types such as int and float are supported. If you annotate such a function with @time_slicer, it will internally convert any 4D dataset into a 3D dataset according to the timepoint currently selected in napari. Furthermore, when the napari user changes the current timepoint or the input data of the function changes, a re-computation is invoked. Thus, it is recommended to use the time_slicer only for functions which can provide [almost] real-time performance. Another constraint is that these annotated functions must have a viewer parameter. This is necessary to read the current timepoint from the viewer when invoking the re-computation.

Example

import napari
from napari_time_slicer import time_slicer

@time_slicer
def threshold_otsu(image: napari.types.ImageData, viewer: napari.Viewer = None) -> napari.types.LabelsData:
    # ...

You can see full implementations of this concept in the napari plugins listed above.
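For illustration, a filled-in version of the example above could delegate the per-timepoint work to scikit-image. This is only a sketch under the assumption that an Otsu threshold is the desired operation; it is not the code of any of the plugins listed above:

import napari
from napari_time_slicer import time_slicer
from skimage.filters import threshold_otsu as sk_threshold_otsu

@time_slicer
def threshold_otsu(image: napari.types.ImageData, viewer: napari.Viewer = None) -> napari.types.LabelsData:
    # the decorator hands over the 2D/3D image of the currently selected
    # timepoint, so the body only has to process a single timepoint
    binary = image > sk_threshold_otsu(image)
    return binary.astype(int)

Because the decorator re-runs the function whenever the timepoint or the input data changes, the function body should stay lightweight enough for near real-time use.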


This napari plugin was generated with Cookiecutter using @napari's cookiecutter-napari-plugin template.

Installation

You can install napari-time-slicer via pip:

pip install napari-time-slicer

To install the latest development version:

pip install git+https://github.com/haesleinhuepf/napari-time-slicer.git

Contributing

Contributions are very welcome. Tests can be run with tox; please ensure that the coverage at least stays the same before you submit a pull request.

License

Distributed under the terms of the BSD-3 license, "napari-time-slicer" is free and open source software.

Issues

If you encounter any problems, please file an issue along with a detailed description.

Comments
  • pyqt5 dependency

    The dependency on pyqt5, which gets installed via pip, can create trouble if napari has been installed via conda (see https://napari.org/plugins/best_practices.html#don-t-include-pyside2-or-pyqt5-in-your-plugin-s-dependencies). Is there any reason for this dependency? As this plugin is itself a dependency of other plugins like napari-segment-blobs-and-things-with-membranes, this can create trouble down the chain.

    opened by guiwitz 7
  • PyQt5 version requirement breaks environment

    Hi @haesleinhuepf ,

    I wanted to ask whether it is really strictly necessary to use the current PyQt5 requirement?

    pyqt5>=5.15.0
    

    It collides with current Spyder versions that only support PyQt up to 5.13:

    spyder 5.1.5 requires pyqtwebengine<5.13, which is not installed.
    spyder 5.1.5 requires pyqt5<5.13, but you have pyqt5 5.15.6 which is incompatible.
    

    Since the time slicer is used downstream in quite a few plugins of yours (e.g., segment-blobs-and-things-with-membranes), this is quite a restriction.

    opened by jo-mueller 5
  • Bug report: `KeyError: 'viewer'`

    Hi @haesleinhuepf ,

    I am getting an error in this notebook in the 5th cell on this command:

    surface = nppas.largest_label_to_surface(labels)
    

    where nppas is napari-process-points-and-surfaces. Labels is a regular label image as made with skimage.measure.label().

    Thanks for looking at it!

    opened by jo-mueller 2
  • Make dask arrays instead of computing slice for slice

    Hey @haesleinhuepf! This is the first implementation of the time slicer wrapper using dask instead of computing the time slices based on the current time index. I could re-use a little of the previous code, but the wrappers start to differ from each other pretty soon. At the moment I'm also unsure whether this wrapper can replace the original time slicer function as a substitute, so I kept both your old version and the dask version. An idea that I had which could be useful for saving the dask images is a function which processes each time slice and saves it as a separate image (if images are saved one by one, it's really easy to load them as dask arrays!). (A conceptual sketch of this lazy-evaluation approach is shown after the comments section below.)

    opened by Cryaaa 1
  • Tests failing

    source:

        if sys.platform.startswith('linux') and running_as_bundled_app():
    .tox/py37-linux/lib/python3.7/site-packages/napari/utils/misc.py:65: in running_as_bundled_app
        metadata = importlib_metadata.metadata(app_module)
    .tox/py37-linux/lib/python3.7/site-packages/importlib_metadata/__init__.py:1005: in metadata
        return Distribution.from_name(distribution_name).metadata
    .tox/py37-linux/lib/python3.7/site-packages/importlib_metadata/__init__.py:562: in from_name
        raise ValueError("A distribution name is required.")
    E   ValueError: A distribution name is required.
    

    See also:

    https://github.com/napari/napari/issues/4797

    opened by haesleinhuepf 0
  • Have 4D dask arrays as result of time-sliced functions

    This turns the results of time-slicer-annotated functions into 4D delayed dask arrays, as proposed by @Cryaaa in #5.

    This PR doesn't fully work yet in the interactive napari user interface. After setting up a workflow and then going through time, it sometimes crashes with a KeyError while saving the duration of an operation. This is related to a computation finishing after its result has already been replaced, basically multiple threads writing to the same result. It's this error: https://github.com/dask/dask/issues/896

    Reproduce:

    • Start napari
    • Open the Example dataset clEsperanto > CalibZapwfixed
    • Turn it into a 2D+t dataset using Tools > Utilities
    • Open the assistant
    • Setup a workflow, e.g. Denoise, Threshold, Label
    • Move the time-bar a couple of times until it crashes.

    I'm not sure yet how to solve this.

    opened by haesleinhuepf 8
  • Aggregate points and surfaces in 4D

    Hi Robert @haesleinhuepf ,

    I am seeing some issues with using the timeslicer on 4D points/surface data in napari. For instance, using the label_to_surface() function from napari-process-points-and-surfaces throws an error:

    ValueError: Input volume should be a 3D numpy array.
    

    which comes from the marching_cubes function under the hood. Here is a small example script to reproduce the error:

    import napari
    import numpy as np
    import napari_process_points_and_surfaces as nppas
    from skimage import filters

    # Make a sphere
    s = 100
    data = np.zeros((s, s, s), dtype=float)
    x0 = 50
    radius = 15

    for x in range(s):
        for y in range(s):
            for z in range(s):
                if np.sqrt((x-x0)**2 + (y-x0)**2 + (z-x0)**2) < radius:
                    data[x, y, z] = 1.0

    viewer = napari.Viewer()
    viewer.add_image(data)

    segmentation = data > filters.threshold_otsu(data)
    viewer.add_labels(segmentation)

    surf = nppas.label_to_surface(segmentation.astype(int))
    viewer.add_surface(surf)
    

    When introspecting the call to marching_cubes within the time_slicer function, it is also evident that the image is somehow still a 4D image.

    opened by jo-mueller 4
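Several of the comments above revolve around computing results lazily with dask instead of recomputing only the currently shown timepoint. As a reference for that discussion, here is a minimal, hypothetical sketch of the idea: a per-timepoint function is wrapped into a delayed 4D dask array, so each timepoint is only computed when it is actually requested. The helper name lazy_timelapse and its signature are illustrative and not part of the napari-time-slicer API:

import dask.array as da
from dask import delayed

def lazy_timelapse(func, timelapse_4d):
    # compute one timepoint eagerly to learn the output shape and dtype
    sample = func(timelapse_4d[0])
    # wrap every timepoint as a delayed computation with the same shape/dtype
    lazy_timepoints = [
        da.from_delayed(delayed(func)(timepoint),
                        shape=sample.shape, dtype=sample.dtype)
        for timepoint in timelapse_4d
    ]
    # stack into a (T, ...) dask array; slices are computed on demand
    return da.stack(lazy_timepoints, axis=0)

In an interactive viewer, several such slices may be computed concurrently, which is exactly the situation in which the race condition described in the 4D-dask-array comment above can appear.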