Deep universal probabilistic programming with Python and PyTorch

Overview

Getting Started | Documentation | Community | Contributing

Pyro is a flexible, scalable deep probabilistic programming library built on PyTorch. It was designed with these principles in mind:

  • Universal: Pyro is a universal PPL - it can represent any computable probability distribution.
  • Scalable: Pyro scales to large data sets with little overhead compared to hand-written code.
  • Minimal: Pyro is agile and maintainable. It is implemented with a small core of powerful, composable abstractions.
  • Flexible: Pyro aims for automation when you want it, control when you need it. This is accomplished through high-level abstractions to express generative and inference models, while allowing experts easy access to customize inference.
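
As a taste of the modeling language, here is a minimal sketch in the spirit of the intro tutorial's weather example. Models are ordinary Python functions, so stochastic control flow such as the if-branch below is fine:

import pyro
import pyro.distributions as dist

def weather():
    cloudy = pyro.sample("cloudy", dist.Bernoulli(0.3))
    # ordinary Python control flow on a sampled value
    mean_temp = 55.0 if cloudy.item() else 75.0
    temp = pyro.sample("temp", dist.Normal(mean_temp, 10.0))
    return cloudy.item(), temp.item()

print(weather())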

Pyro was originally developed at Uber AI and is now actively maintained by community contributors, including a dedicated team at the Broad Institute. In 2019, Pyro became a project of the Linux Foundation, a neutral space for collaboration on open source software, open standards, open data, and open hardware.

For more information about the high level motivation for Pyro, check out our launch blog post. For additional blog posts, check out work on experimental design and time-to-event modeling in Pyro.

Installing

Installing a stable Pyro release

Install using pip:

Pyro supports Python 3.6+.

pip install pyro-ppl

Install from source:

git clone [email protected]:pyro-ppl/pyro.git
cd pyro
git checkout master  # master is pinned to the latest release
pip install .

Install with extra packages:

To install the dependencies required to run the probabilistic models included in the examples/tutorials directories, please use the following command:

pip install pyro-ppl[extras] 

Make sure that the models come from the same release version of the Pyro source code as you have installed.

Installing Pyro dev branch

For recent features you can install Pyro from source.

Install Pyro using pip:

pip install git+https://github.com/pyro-ppl/pyro.git

or, with the extras dependency to run the probabilistic models included in the examples/tutorials directories:

pip install git+https://github.com/pyro-ppl/pyro.git#egg=pyro-ppl[extras]

Install Pyro from source:

git clone https://github.com/pyro-ppl/pyro
cd pyro
pip install .  # pip install .[extras] for running models in examples/tutorials

Running Pyro from a Docker Container

Refer to the instructions here.

Citation

If you use Pyro, please consider citing:

@article{bingham2019pyro,
  author    = {Eli Bingham and
               Jonathan P. Chen and
               Martin Jankowiak and
               Fritz Obermeyer and
               Neeraj Pradhan and
               Theofanis Karaletsos and
               Rohit Singh and
               Paul A. Szerlip and
               Paul Horsfall and
               Noah D. Goodman},
  title     = {Pyro: Deep Universal Probabilistic Programming},
  journal   = {J. Mach. Learn. Res.},
  volume    = {20},
  pages     = {28:1--28:6},
  year      = {2019},
  url       = {http://jmlr.org/papers/v20/18-403.html}
}
Comments
  • Wishart / InverseWishart / LKJ priors

    Would it be possible to implement Wishart / InverseWishart / LKJ priors?

    gpytorch has them already, but when I tried mixing in TorchDistributionMixin to get something usable in Pyro, I realized that they don't have a .sample method.

    I don't think it's easy to get efficient samplers for the InverseWishart (I think trying to build it via a TransformedDistribution might be too slow), but curious to see other approaches. There's a Tensorflow Probability tutorial on this here, which explains some of the underlying ideas, as well as a note here.

    Great work by the way! Pyro is so easy to use it's incredible.
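
    Since Pyro 1.6, the upstream LKJCholesky distribution is exposed as pyro.distributions.LKJCholesky (see the 1.6.0 release notes below) and does have a working .sample method. A minimal sketch of an LKJ correlation prior, with arbitrary site names and a toy 3-dimensional likelihood:

    import torch
    import pyro
    import pyro.distributions as dist

    def model(y):
        # Cholesky factor of a 3x3 correlation matrix under an LKJ(1) prior
        L = pyro.sample("L", dist.LKJCholesky(3, concentration=1.0))
        sigma = pyro.sample("sigma", dist.HalfCauchy(torch.ones(3)).to_event(1))
        scale_tril = sigma.unsqueeze(-1) * L  # diag(sigma) @ L
        with pyro.plate("data", y.shape[0]):
            pyro.sample("obs", dist.MultivariateNormal(torch.zeros(3), scale_tril=scale_tril), obs=y)

    y = torch.randn(10, 3)
    print(pyro.poutine.trace(model).get_trace(y).log_prob_sum())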

    opened by rachtsingh 42
  • Alternative design of variable scoping and naming

    Curious why all the params are auto-magically global with a shared namespace. Seems like a very different design choice than the nice cleanly scoped pytorch variables.
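
    For context, pyro.param registers parameters by name in one process-global ParamStore; a minimal illustration (names here are arbitrary):

    import torch
    import pyro

    pyro.param("layer1.weight", torch.zeros(2))
    pyro.param("layer1.bias", torch.zeros(2))
    # both names live in a single global store, not in any module scope
    print(list(pyro.get_param_store().keys()))

    pyro.nn.PyroModule (see the 1.0.0 release notes below) later added a module-scoped way to declare parameters.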

    discussion 
    opened by srush 42
  • Supporting multi-chain mcmc with cuda tensors

    This PR is just used for discussion with @neerajprad .

    Hi Neeraj, I added events to workers in ParallelSampler so we can solve the problem at https://discuss.pytorch.org/t/cuda-tensors-on-multiprocessing-queue/28626.

    However, there is also another problem: Invalid device pointer.

    After debugging, I noticed that if we clone args and kwargs in the returned trace of each mcmc step, then this error will go away. Of course, if we only put z (instead of trace) to the Queue (by returning z at the sample() method of HMC/NUTS), and get_trace from z retrieved from Queue, things will be fine (note that because we cache (z, pe, grad), we don't need to use previous trace for the next sampling step). However, it will add overhead to the workers: paused workers have to wait for get_trace from z of the current worker.

    Maybe I am missing something so I make PR to discuss with you about solutions. :)

    awaiting review 
    opened by fehiepsi 28
  • Improve documentation for LKJCorrCholesky and point to example usage

    Issue Description

    LKJCorrCholesky seems to give the same log_prob irrespective of input. log_prob should probably also have a type check that ensures the correct input is given.

    Environment

    • MacOS 10.14.3 and Python 3.7.3
    • PyTorch 1.1
    • Pyro 0.3.3

    Code Snippet

    import torch as t
    import pyro.distributions as pd
    X = pd.LKJCorrCholesky(2,t.tensor(1.))
    
    smp = X.sample()
    C = smp @ t.transpose(smp,0,1) # seems to sample correlation matrices correctly
    
    print(X.log_prob(t.tensor([[.1,.2,.3]]).reshape(-1,1)))
    print(X.log_prob(t.tensor([[.1,0],[.1,.3]])))
    print(X.log_prob(C))
    

    Results in:

    tensor(-0.6931) tensor(-0.6931) tensor(-0.6931)

    opened by robsalomone 27
  • Mass Matrix Adaptation for HMC and NUTS

    I'd like to contribute the mass matrix adaptation mentioned in #1093 . I actually have some preliminary work on it already, but I've run into an issue and I'm not sure how you'd like to proceed:

    Currently the MCMC kernels have no access to the number of warmup iterations (since the transition between warmup and sampling is handled by the MCMC wrapper), but if we want to adapt the mass matrix during warmup using a block scheme similar to Stan's (see section 34.2 in the Stan reference manual), then the kernel needs access to the number of warmup iterations to divide the warmup into blocks.

    I thought about putting this scheduling logic into the MCMC wrapper, but since adapting a mass matrix in this way is specific to certain kernels (HMC and NUTS) it feels like it belongs in the kernels themselves.

    However, giving the kernels access to the number of warmup iterations is not quite straightforward: since they are initialized by the user (not the MCMC wrapper) we can't pass it to the constructor, and currently all of the arguments to HMC.setup() are passed directly into get_trace().

    Some options that come to mind are:

    1. Add the number of warmup iterations as a parameter in HMC.setup(). (Seems reasonably clean to me, but maybe I'm missing something?)
    2. Implement something like kernel.configure_adaptation(warmup_duration=200), which would be called by the wrapper. (This feels less clean because not every kernel will need parameter adaptation, but it won't modify the existing kernel.setup() signature.)
    3. Directly set a property on the kernel from the MCMC wrapper: self.kernel.warmup_duration = 200. (Simple, but feels more like a hack.)

    Any thoughts on which of these options (or others?) would be the best approach?
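
    For concreteness, here is a standalone sketch of the Stan-style block scheme mentioned above, using assumed default buffer sizes (75-step initial buffer, 50-step terminal buffer, 25-step base window, following Stan's defaults):

    def adaptation_windows(warmup_steps, init_buffer=75, term_buffer=50, base_window=25):
        # split warmup into an initial buffer, doubling mass-matrix windows,
        # and a terminal buffer (a simplified version of Stan's schedule)
        if warmup_steps < init_buffer + term_buffer + base_window:
            return [(0, warmup_steps)]  # too short for the full scheme
        windows, start, size = [], init_buffer, base_window
        end_of_slow = warmup_steps - term_buffer
        while start < end_of_slow:
            size = min(size, end_of_slow - start)
            windows.append((start, start + size))
            start += size
            size *= 2  # each successive window doubles in length
        return windows

    print(adaptation_windows(1000))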

    enhancement discussion 
    opened by LoganWalls 27
  • DMM / DVBF examples

    Deep time-series models such as DMM and DVBF will be anchor models for the first release. We need to implement them in Pyro, implement any training tricks needed, and replicate a few results.

    @karalets has some pytorch code for DMM and @null-a has a webppl implementation, so it should be straightforward to implement.

    high priority Examples 
    opened by ngoodman 26
  • Add a new intro tutorial

    Derived from Bayesian regression tutorials

    Tasks:

    • [x] Draft tutorial
    • [x] Mention pyro.condition in sample section
    • [x] Add better introduction and conclusion/next steps
    • [x] Support CI smoke testing
    • [x] Table of contents
    • [x] Fix seaborn warnings
    • ~~Add images separately if necessary~~
    • ~~Fix model rendering~~
    • [x] Check all links in notebook and add any missing ones e.g. to SVI tutorials
    • [x] Update examples homepage sphinx
    • [x] Deprecate existing intro tutorials
    • [x] Update links to intro tutorial(s) in other documentation
    • [x] Rename file to intro_long.ipynb
    • [x] Fix Sphinx rendering of links around code markdown
    • [x] Check and fix HTML rendering locally
    Examples awaiting review 
    opened by eb8680 25
  • Reorganization of IAF + helper functions to create transforms

    Renamed IAF transforms to match other transform names: pyro.distributions.transforms.AffineAutoregressive. Combined two very similar versions of IAF into one class with a switch keyword argument.
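
    A minimal sketch of the renamed transform, using the affine_autoregressive helper and current module paths:

    import torch
    import pyro.distributions as dist
    from pyro.distributions.transforms import affine_autoregressive

    base = dist.Normal(torch.zeros(4), torch.ones(4)).to_event(1)
    flow = dist.TransformedDistribution(base, [affine_autoregressive(4, hidden_dims=[32])])
    x = flow.sample()
    print(flow.log_prob(x))  # the inverse pass is sequential, fine at low dimension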

    opened by stefanwebb 24
  • Implement AIR model

    The attend-infer-repeat (AIR) model is a great example of stochastic recursion in deep probabilistic models. It is going to be one of the anchor examples for the first Pyro release.

    We need to implement it, implement any extensions that training relies on, and replicate (some) results from the paper.

    Getting acceptably low variance for the gradient estimator term from the discrete RV will likely take some amount of Rao-Blackwellization and data-dependent baselines. @martinjankowiak has started or planned these.

    high priority Examples 
    opened by ngoodman 24
  • Improvements to GP documentation

    1. Added xlabel and ylabel to loss vs. iterations plots
    2. Added animation for fitting GP
    3. Added plot of inducing points
    4. Added animation for fitting inducing point GP
    5. Added example showing impact of lengthscale
    Examples awaiting review 
    opened by nipunbatra 23
  • Lift Poutine

    Addresses #80, #40 and possibly #77

    Allows the user to specify a prior or a dict of priors over parameters, where the keys correspond to the parameter names. See the linked issues for more details.

    examples shortly to follow
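
    In released Pyro this behavior is available as poutine.lift; a minimal sketch (site and parameter names are arbitrary):

    import torch
    import pyro
    import pyro.distributions as dist
    from pyro import poutine

    def model():
        w = pyro.param("w", torch.zeros(2))
        return pyro.sample("obs", dist.Normal(w, 1.0).to_event(1))

    # lift the param "w" into a latent variable drawn from the given prior
    lifted = poutine.lift(model, prior={"w": dist.Normal(torch.zeros(2), 1.0).to_event(1)})
    trace = poutine.trace(lifted).get_trace()
    print(trace.nodes["w"]["type"])  # 'sample' rather than 'param'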

    awaiting review 
    opened by jpchen 22
  • Allow registering of custom exception handlers for potential_fn computations

    In some cases, evaluating the potential function may result in numerical issues. Currently the code hard-codes the handling of a RuntimeError raised when matrices are (numerically) singular. This PR adds the ability to register custom exception handlers. This allows other code depending on Pyro to register custom exception handlers without having to modify core Pyro code.

    There are some other places in which potential_fn is called that could benefit from being guarded by these handlers (one is HMC._find_reasonable_step_size). I'm not sure what the right thing to do there is when encountering numerical issues, but happy to add this in as needed.

    awaiting review 
    opened by Balandat 7
  • Update base learning rate scheduler

    Resolves #3166

    PyTorch recently changed the name of the base class used in pyro.optim to identify and wrap the learning rate schedulers in torch.optim from _LRScheduler to LRScheduler, leading to a silent failure to create wrappers.

    This PR adds a backwards-compatible check to the wrapping logic to handle newer PyTorch releases.
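
    For illustration, the kind of version-tolerant check described here might look like the following sketch (not the exact code from the PR):

    try:
        from torch.optim.lr_scheduler import LRScheduler  # newer PyTorch releases
    except ImportError:
        from torch.optim.lr_scheduler import _LRScheduler as LRScheduler  # older name

    def is_lr_scheduler(cls):
        return isinstance(cls, type) and issubclass(cls, LRScheduler)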

    bug awaiting review easy 
    opened by eb8680 1
  • bug with OneCycleLR on Apple Silicon

    Guidelines

    NOTE: Issues are for bugs and feature requests only. If you have a question about using Pyro or general modeling questions, please post it on the forum.

    If you would like to address any minor bugs in the documentation or source, please feel free to contribute a Pull Request without creating an issue first.

    Please tag the issue appropriately in the title e.g. [bug], [feature request], [discussion], etc.

    Please provide the following details:

    Issue Description

    Provide a brief description of the issue.

    Environment

    For any bugs, please provide the following:

    • OS and python version.
    • PyTorch version, or if relevant, output of pip freeze.
    • Pyro version: output of python -c 'import pyro; print(pyro.__version__)'

    Code Snippet

    Provide any relevant code snippets and commands run to replicate the issue.

    bug 
    opened by jopeptid 2
  • [FR] a Renderer module that allows passing the model arguments directly instead of wrapping them into dict/tuples

    Issue Description

    Currently the render_model function requires storing the model arguments in a dict before passing them:

    pyro.render_model(model, model_kwargs={'data':data, ...})

    It would be nice to have a Renderer class that instead allows passing the rendering options into a constructor and the model arguments as parameters of a render method, just like poutine.trace(model).get_trace(*args, **kwargs). For instance, the following class:

    class Renderer:
        def __init__(self, model, *args, **kwargs):
            self.model = model
            self.args = args
            self.kwargs = kwargs

        def render(self, *model_args, **model_kwargs):
            return pyro.render_model(
                self.model,
                *self.args,
                model_args=model_args,
                model_kwargs=model_kwargs,
                **self.kwargs,
            )
    

    allows rendering the models

    def model(x=0, y=1, z=2):
        ...
        return   
    

    using

    Renderer(model, render_params=True, render_distributions=True).render(x=0,y=1,z=2)
    

    instead of pyro.render_model(model, render_params=True, render_distributions=True, model_kwargs={'x':0,'y':1,'z':2})

    Thanks!

    enhancement usability 
    opened by rapharomero 0
  • Document error or misunderstanding in bayesian_regression [discussion]

    In the last paragraph

    Finally, let us revisit our earlier question of how robust the relationship between terrain ruggedness and GDP is against any uncertainty in the parameter estimates from our model. For this, we plot the distribution of the slope of the log GDP given terrain ruggedness for nations within and outside Africa. As can be seen below, the probability mass for African nations is largely concentrated in the positive region and vice-versa for other nations, lending further credence to the original hypothesis.

    weight = samples["linear.weight"]
    weight = weight.reshape(weight.shape[0], 3)
    
    # this part confuses me
    gamma_within_africa = weight[:, 1] + weight[:, 2]
    gamma_outside_africa = weight[:, 1]
    
    
    fig = plt.figure(figsize=(10, 6))
    sns.distplot(gamma_within_africa, kde_kws={"label": "African nations"},)
    sns.distplot(gamma_outside_africa, kde_kws={"label": "Non-African nations"})
    fig.suptitle("Density of Slope : log(GDP) vs. Terrain Ruggedness");
    
    

    Since the data is created by:

    df["cont_africa_x_rugged"] = df["cont_africa"] * df["rugged"]
    data = torch.tensor(df[["cont_africa", "rugged", "cont_africa_x_rugged", "rgdppc_2000"]].values,
                            dtype=torch.float)
    x_data, y_data = data[:, :-1], data[:, -1]
    
    

    So:

    • weight[:, 1]: "rugged" (all nations)
    • weight[:, 2]: "cont_africa_x_rugged" (Africa)
    • weight[:, 1] + weight[:, 2]: "rugged" (all nations) + "cont_africa_x_rugged" (Africa)

    In my view, the variables should be named:

    slope_all_plus_africa = weight[:, 1] + weight[:, 2]
    slope_all = weight[:, 1]
    

    These are not "within" or "outside" Africa. Distinguishing within from outside Africa requires filtering first:

    african_nations = predictions[predictions["cont_africa"] == 1]
    non_african_nations = predictions[predictions["cont_africa"] == 0]
    
    Examples 
    opened by eromoe 0
  • feat: add type hints for param_store.py in `params` module

    opened by willtai 0
Releases
  • 1.8.3(Nov 22, 2022)

    What's Changed

    • rename custom_objectives_training.ipynb -> custom_objectives.ipynb by @martinjankowiak in https://github.com/pyro-ppl/pyro/pull/3141
    • Expose Gaussian algorithms by @fritzo in https://github.com/pyro-ppl/pyro/pull/3145
    • fix_reverse_out_bound_quadratic_spline by @LiaoShiqi97 in https://github.com/pyro-ppl/pyro/pull/3140
    • Improve sequential_gaussian_filter_sample() by @fritzo in https://github.com/pyro-ppl/pyro/pull/3146
    • Add jitter to Cholesky factorization in Gaussian ops by @fritzo in https://github.com/pyro-ppl/pyro/pull/3151
    • Clean up handling of global settings by @fritzo in https://github.com/pyro-ppl/pyro/pull/3152
    • Add an option to stop PyroModules from sharing parameters by @eb8680 in https://github.com/pyro-ppl/pyro/pull/3149
    • Ensure compatibility with torch>=1.13 torchvision>=0.14 by @fritzo in https://github.com/pyro-ppl/pyro/pull/3155
    • Adds min_stepsize and max_stepsize to HMC, avoiding infinite loop by @fritzo in https://github.com/pyro-ppl/pyro/pull/3157

    New Contributors

    • @LiaoShiqi97 made their first contribution in https://github.com/pyro-ppl/pyro/pull/3140

    Full Changelog: https://github.com/pyro-ppl/pyro/compare/1.8.2...1.8.3

  • 1.8.2(Sep 6, 2022)

    What's Changed

    • Fix nbshpinx rendering of tutorials by @fritzo in https://github.com/pyro-ppl/pyro/pull/3055
    • Modified docker file to use wget instead of curl by @Jayanth-kumar5566 in https://github.com/pyro-ppl/pyro/pull/3058
    • Updating Docker file to use Ubuntu 18.04 by @Jayanth-kumar5566 in https://github.com/pyro-ppl/pyro/pull/3059
    • Updated CUDA and base OS version by @Jayanth-kumar5566 in https://github.com/pyro-ppl/pyro/pull/3060
    • Fix contrib.funsor.Trace_EnumELBO model enumeration by @ordabayevy in https://github.com/pyro-ppl/pyro/pull/3063
    • Fix docstrings of distributions rendered on sphinx by @fehiepsi in https://github.com/pyro-ppl/pyro/pull/3064
    • Add test applying AutoNormal to two different models by @fritzo in https://github.com/pyro-ppl/pyro/pull/3065
    • Fix link in AIR tutorial by @dilaragokay in https://github.com/pyro-ppl/pyro/pull/3066
    • Numerically stabilize ProjectedNormal.log_prob() via erfc by @fritzo in https://github.com/pyro-ppl/pyro/pull/3071
    • fix documentation class names to VariationalSparseGP by @ivetasarfyova in https://github.com/pyro-ppl/pyro/pull/3076
    • fix Nystrom approximation equation indices by @ivetasarfyova in https://github.com/pyro-ppl/pyro/pull/3077
    • remove extraneous svi step call in documentation by @ivetasarfyova in https://github.com/pyro-ppl/pyro/pull/3078
    • Add type assertion for model_args and model_kwargs of render_model by @dilaragokay in https://github.com/pyro-ppl/pyro/pull/3083
    • Remove batch_expand helper function in air example by @ordabayevy in https://github.com/pyro-ppl/pyro/pull/3086
    • Fix a typo in air tutorial by @ordabayevy in https://github.com/pyro-ppl/pyro/pull/3087
    • Fix Nyström Typo by @adiehl96 in https://github.com/pyro-ppl/pyro/pull/3084
    • Use provenance tracking to compute downstream costs in TraceGraph_ELBO by @ordabayevy in https://github.com/pyro-ppl/pyro/pull/3081
    • Use : for param domain in render_model by @fehiepsi in https://github.com/pyro-ppl/pyro/pull/3097
    • store batch_shape as torch.Size object in MultivariateStudentT distribution by @flo-schu in https://github.com/pyro-ppl/pyro/pull/3099
    • Fix bound partial for python 3.10 by @fehiepsi in https://github.com/pyro-ppl/pyro/pull/3101
    • Fix _logmatmulexp comment by @fritzo in https://github.com/pyro-ppl/pyro/pull/3105
    • Fixing typos in enumeration tutorial by @ordabayevy in https://github.com/pyro-ppl/pyro/pull/3109
    • Expose docs for pyro.contrib.funsor.infer.infer_discrete by @fritzo in https://github.com/pyro-ppl/pyro/pull/3111
    • Update signature of sample in tutorial intro_long by @fraterenz in https://github.com/pyro-ppl/pyro/pull/3112
    • Allow autocorrelation() to run without MKL by @fritzo in https://github.com/pyro-ppl/pyro/pull/3113
    • Fix multinomial_goodness_of_fit printing by @fritzo in https://github.com/pyro-ppl/pyro/pull/3115
    • Numerically stabilize autocorrelation() by @fritzo in https://github.com/pyro-ppl/pyro/pull/3114
    • Implement NanMaskedNormal, NanMaskedMultivariateNormal by @fritzo in https://github.com/pyro-ppl/pyro/pull/3116
    • Fix links in workflow tutorial by @fritzo in https://github.com/pyro-ppl/pyro/pull/3119
    • Implement Resamper for interactive prior tuning by @fritzo in https://github.com/pyro-ppl/pyro/pull/3118
    • fix entropy term in distributions.py (wrong isinstance check) by @martinjankowiak in https://github.com/pyro-ppl/pyro/pull/3120
    • Custom loss documentation by @e-pet in https://github.com/pyro-ppl/pyro/pull/3122
    • Fix model_7 in examples/contrib/funsor/hmm.py by @ordabayevy in https://github.com/pyro-ppl/pyro/pull/3126
    • Fix memory leak in TraceEnumELBO by @fehiepsi in https://github.com/pyro-ppl/pyro/pull/3131
    • Bump version to 1.8.2 by @fritzo in https://github.com/pyro-ppl/pyro/pull/3135

    New Contributors

    • @Jayanth-kumar5566 made their first contribution in https://github.com/pyro-ppl/pyro/pull/3058
    • @dilaragokay made their first contribution in https://github.com/pyro-ppl/pyro/pull/3066
    • @ivetasarfyova made their first contribution in https://github.com/pyro-ppl/pyro/pull/3076
    • @adiehl96 made their first contribution in https://github.com/pyro-ppl/pyro/pull/3084
    • @flo-schu made their first contribution in https://github.com/pyro-ppl/pyro/pull/3099
    • @fraterenz made their first contribution in https://github.com/pyro-ppl/pyro/pull/3112
    • @e-pet made their first contribution in https://github.com/pyro-ppl/pyro/pull/3122

    Full Changelog: https://github.com/pyro-ppl/pyro/compare/1.8.1...1.8.2

  • 1.8.1(Mar 24, 2022)

    Breaking changes

    • Update to PyTorch 1.11.0 in https://github.com/pyro-ppl/pyro/pull/3045
    • Drop support for Python 3.6 in https://github.com/pyro-ppl/pyro/pull/3047

    New features

    • New tutorial on Bayesian workflow, with a SARS-CoV-2 running example in https://github.com/pyro-ppl/pyro/pull/2977
    • Render parameters in render_model() by @karm-patel in https://github.com/pyro-ppl/pyro/pull/3039
    • Distributions:

    Misc changes

    • fix docstring in repeated_matmul by @martinjankowiak in https://github.com/pyro-ppl/pyro/pull/2999
    • use more conservative learning rate in bart.py by @martinjankowiak in https://github.com/pyro-ppl/pyro/pull/3002
    • Clarified comments in minipyro.py to fix #3003 by @luiarthur in https://github.com/pyro-ppl/pyro/pull/3004
    • Update black formatting by @fritzo in https://github.com/pyro-ppl/pyro/pull/3020
    • Closes #3016 by @nipunbatra in https://github.com/pyro-ppl/pyro/pull/3017
    • Fix timeseries tutorial link by @fritzo in https://github.com/pyro-ppl/pyro/pull/3021
    • Some documentation improvements to MLE_MAP and SVI_P by @nipunbatra in https://github.com/pyro-ppl/pyro/pull/3022
    • Improvements to GP tutorial by @nipunbatra in https://github.com/pyro-ppl/pyro/pull/3027
    • change sample to param statements in guides by @dhudsmith in https://github.com/pyro-ppl/pyro/pull/3042
    • Added type assertion for better code clarity by @GautamV234 in https://github.com/pyro-ppl/pyro/pull/3036
    • add GitHub URL for PyPi by @andriyor in https://github.com/pyro-ppl/pyro/pull/3038

    New Contributors

    • @luiarthur made their first contribution in https://github.com/pyro-ppl/pyro/pull/3004
    • @nipunbatra made their first contribution in https://github.com/pyro-ppl/pyro/pull/3017
    • @dhudsmith made their first contribution in https://github.com/pyro-ppl/pyro/pull/3042
    • @GautamV234 made their first contribution in https://github.com/pyro-ppl/pyro/pull/3036
    • @karm-patel made their first contribution in https://github.com/pyro-ppl/pyro/pull/3039
    • @andriyor made their first contribution in https://github.com/pyro-ppl/pyro/pull/3038

    Full Changelog: https://github.com/pyro-ppl/pyro/compare/1.8.0...1.8.1

  • 1.8.0(Dec 14, 2021)

    New features and improvements

  • 1.7.0(Jul 6, 2021)

    New features

    • Update to PyTorch 1.9 #2887
    • A StreamingMCMC class for high-dimensional Bayesian inference using NUTS or HMC, thanks to @mtsokol #2857. StreamingMCMC is a drop-in replacement for MCMC that avoids storing samples during inference by computing statistics such as mean, variance, and r_hat in a streaming fashion. You can define your own statistics using the pyro.ops.streaming module, either by composing existing statistics or by defining your own subclass of StreamingStats #2856 (see the sketch after this list).
    • Make poutine.reparam compatible with initialization logic in autoguides and MCMC #2876. Previously you needed to manually transform the value in init_to_value() when using a reparametrizer. In Pyro 1.7 you can specify a single init_to_value() output that should work regardless of whether your model is transformed by a reparametrizer. Note this involves a major refactoring of the Reparam interface, namely replacing .__call__() with .apply(). If you have defined custom reparametrizers using .__call__() you should refactor them before the next Pyro release.
    • Add an AutoStructured guide with flexible distributions, sparse flexible dependencies among latent variables, and support for reparametrization #2812 . This autoguide is somewhere between AutoNormal and AutoIAFNormal+NeuTraReparam. Like AutoNormal this guide is interpretable and structured. Like NeuTraReparam this guide is flexible and can be used to improve geometry for subsequent inference via HMC or NUTS.
    • New directional distributions thanks to @OlaRonning
    • New distributions over genetic and amino acid sequences, thanks to @EWeinstein #2728 (see https://www.biorxiv.org/content/10.1101/2020.07.31.231381v2)
    • New distributions with exponential tails
    • Add AutoGuideList.quantiles() thanks to @vitkl #2896
    • Allow saving a subset of variables in MCMC via the save_params option, which can save memory #2793
    • Add a sample option to pyro.contrib.funsor.infer_discrete #2789
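
    A minimal sketch of the StreamingMCMC usage described above, with a toy model and arbitrary names (method names as per this release):

    import torch
    import pyro
    import pyro.distributions as dist
    from pyro.infer import NUTS, StreamingMCMC

    def model(data):
        loc = pyro.sample("loc", dist.Normal(0.0, 1.0))
        with pyro.plate("data", len(data)):
            pyro.sample("obs", dist.Normal(loc, 1.0), obs=data)

    data = torch.randn(100) + 3.0
    mcmc = StreamingMCMC(NUTS(model), num_samples=500, warmup_steps=200)
    mcmc.run(data)
    print(mcmc.get_statistics())  # streamed statistics instead of stored samples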

    New tutorials

    Bug fixes

    • Fix poutine.do to avoid duplicate entries in cond_indep_stack #2846
    • Fix infer.csis to ignore unused gradients, thanks to @fshipy #2828
    • Catch NAN values in TraceENUM_ELBO grads #2782

    Breaking changes

    • Simplify param names in callable optim configs #2814
    • Switch to softplus transform when using autoguide scales, thanks to experiments performed by @vitkl #2823

    Cleanup

    • Start using mypy for type checking, thanks to @kamathhrishi #2853 #2858
    • Start using black code formatter #2891
  • 1.6.0(Mar 4, 2021)

    Breaking changes

    • Update to PyTorch 1.8 release (required).
    • Enable validation by default #2701. To disable globally call pyro.enable_validation(False); or disable locally to one distribution via e.g. Normal(loc, scale, validate_args=False).
    • Switch from LKJCorrCholesky distribution to upstream LKJCholesky distribution #2771.

    New Tutorials

    New features

    Bugfixes

  • 1.5.2(Feb 1, 2021)

    This patch release merely:

    • Pins requirements to torch<1.8 to avoid breaking changes in torch 1.8.0 (introduced in pytorch/pytorch#50547 and pytorch/pytorch#50581).
    • Fixes an FFT bug #2731
  • 1.5.1(Nov 17, 2020)

    New features

    Bug fixes

    • #2683 Support PyTorch 1.7
    • #2682 Fix help(MyDistribution)
    • #2679 Fix TraceEnum_ELBO.compute_marginals()
    • #2677 Warn if infer_discrete() finds no discrete sites
  • 1.5.0(Oct 15, 2020)

    New features

    Breaking changes

    • Require PyTorch 1.6.
    • Drop support for Python 3.5; require Python 3.6+.
    • Zero-inflated distributions changed their interface. #2643

    Bug fixes & performance tweaks

    • pyro.factor statements are now allowed in guides without warning. #2664
    • Fix model-directed subsampling in autoguides. #2638
    • Fix sample shape bug in LKJCorrCholesky distribution. #2617
    • Speed up log-matmul-exp operations in discrete enumeration and DiscreteHMM. #2640
    • Fix potential_fn issues in MCMC. #2591
  • 1.4.0(Jul 20, 2020)

    New features

    Bug fixes

    • Support sequential plates in RenyiELBO #2541
    • Fixes to AffineAutoregressive #2504
    • Fixes to BatchNorm TransformModule #2459
    • Fixes to how some transforms handle parameters #2544
    • Fixes to reraising logic that clean up error reporting during inference #2494
    • Many other fixes to documentation and code
  • 1.3.1(Apr 7, 2020)

    New features

    • A new Spline transform which implements element-wise rational spline bijections of linear order.
    • A new ConditionalAffineCoupling transform which implements the affine coupling layer of RealNVP that conditions on an additional context variable.

    Enhancements to the pyro.contrib.forecast module

    • Support drawing samples in batches.
    • Add walltime to backtest to measure performance of model training and forecasting.
    • Support more likelihood distributions: Geometric, NegativeBinomial, ZeroInflatedNegativeBinomial.

    Bug fixes

    • #2399 raises an error when HMC/NUTS is used for a model with subsampling.
    • #2390 makes PyroModule compatible with torch.nn.RNN.
    • #2388 allows unused params in CSIS inference.
    • #2384 fixes some caching issues in calculation of log_abs_det_jacobian of TransformModules
    • #2365 fixes a naming bug in LocScaleReparam whereby all loc-scale reparameterized sites shared a single centeredness parameter.
    • #2355 makes jit_compile=True flag in HMC/NUTS work for models with pyro.param statements.
  • 1.3.0(Mar 8, 2020)

    New features

    Bug fixes

    • #2345 remove pillow-simd dependency
    • #2327 Make pyro.deterministic not warn when called outside of inference
    • #2321 Support plates in RenyiELBO
    • #2266 Fixes to transform handling in MCMC api
  • 1.2.1(Jan 23, 2020)

    Patches 1.2.0 with the following bug fixes:

    • Fix for MCMC with parallel chains using multiprocessing, where transforms to the latent sites' support were not being correctly stored.
    • Other minor rendering related fixes for tutorials.
  • 1.2.0(Jan 17, 2020)

    Misc changes

    • Updated to PyTorch 1.4.0 and torchvision 0.5.0.
    • Changed license from MIT to Apache 2.0 and removed the Uber CLA as part of Pyro's move to the Linux Foundation.

    Reparameterization

    This release adds a new effect handler and a collection of strategies that reparameterize models to improve geometry. These tools are largely orthogonal to other inference tools in Pyro, and can be used with SVI, MCMC, and other inference algorithms.

    Other new features

    Bug fixes

    • #2263 fixes MCMC api to allow implementations other than HMC and NUTS.
    • #2244 fixes an event_dim issue in ConditionedPlanar flow.
    • #2243 fixes a bug in AffineCoupling.
    • #2227 fixes device placement of the MultivariateStudentT.df param.
    • #2226 fixes an edge case bug in discrete enumeration.
  • 1.1.0(Dec 7, 2019)

    New Features

    New distributions and transforms

    Other Changes / Bug Fixes

    • pyro.util.save_visualization has been deprecated, and the dependency on graphviz has been removed.
    • #2197 fixed a naming bug in PyroModule that affected multiple sub-PyroModules with conflicting names.
    • #2192 Bug fix in Planar normalizing flow implementation
    • #2188 Make error messages for incorrect arguments to effect handlers more informative
  • 1.0.0(Nov 16, 2019)

    The objective of this release is to stabilize Pyro's interface and thereby make it safer to build high level components on top of Pyro.

    Stability statement

    • Behavior of documented APIs will remain stable across minor releases, except for bug fixes and features marked EXPERIMENTAL or DEPRECATED.
    • Serialization formats will remain stable across patch releases, but may change across minor releases (e.g. if you save a model in 1.0.0, it will be safe to load it in 1.0.1, but not in 1.1.0).
    • Undocumented APIs, features marked EXPERIMENTAL or DEPRECATED, and anything in pyro.contrib may change at any time (though we aim for stability).
    • All deprecated features throw a FutureWarning and specify possible work-arounds. Features marked as deprecated will not be maintained, and are likely to be removed in a future release.
    • If you want more stability for a particular feature, contribute a unit test.

    New features

    • pyro.infer.Predictive is a new utility for serving models, supporting jit tracing and serialization (see the sketch after this list).
    • pyro.distributions.transforms has many new transforms, and includes helper functions to easily create a variety of normalizing flows. The transforms library has also been reorganized.
    • pyro.contrib.timeseries is an experimental new module with fast Gaussian Process inference for univariate and multivariate time series and state space models.
    • pyro.nn.PyroModule is an experimental new interface that adds Pyro effects to an nn.Module. PyroModule is already used internally by AutoGuide, EasyGuide, pyro.contrib.gp, pyro.contrib.timeseries, and elsewhere.
    • FoldedDistribution is a new distribution factory, essentially equivalent to TransformedDistribution(-, AbsTransform()) but providing a .log_prob() method.
    • A new tutorial illustrates the usage of pyro.contrib.oed in the context of adaptive election polling.
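
    A minimal sketch of serving a fitted model with Predictive, using a toy model and current module paths:

    import torch
    import pyro
    import pyro.distributions as dist
    from pyro.infer import SVI, Trace_ELBO, Predictive
    from pyro.infer.autoguide import AutoDiagonalNormal
    from pyro.optim import Adam

    def model(data=None):
        loc = pyro.sample("loc", dist.Normal(0.0, 1.0))
        with pyro.plate("data", 100):
            return pyro.sample("obs", dist.Normal(loc, 1.0), obs=data)

    guide = AutoDiagonalNormal(model)
    svi = SVI(model, guide, Adam({"lr": 0.01}), loss=Trace_ELBO())
    data = torch.randn(100) + 2.0
    for _ in range(200):
        svi.step(data)

    predictive = Predictive(model, guide=guide, num_samples=50)
    print(predictive()["obs"].shape)  # posterior predictive draws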

    Breaking changes

    • Autoguides have slightly changed interfaces:
      • AutoGuide and EasyGuide are now nn.Modules and can be serialized separately from the param store. This enables serving via torch.jit.trace_module.
      • The Auto*Normal family of autoguides now has init_scale arguments, and init_loc_fn has better support. Autoguides no longer support initialization by writing directly to the param store.
    • Many transforms have been renamed to enforce a consistent interface, such as the renaming of InverseAutoregressiveFlow to AffineAutoregressive.
    • pyro.generic has been moved to a separate project pyroapi.
    • poutine.do has slightly changed to follow Single World Intervention Graph semantics.
    • pyro.contrib.glmm has been moved to pyro.contrib.oed.glmm and will eventually be replaced by BRMP.
    • Existing DeprecationWarnings have been promoted to FutureWarnings.

    Deprecated features

    • pyro.random_module: The pyro.random_module primitive has been deprecated in favor of PyroModule which can be used to create Bayesian modules from torch.nn.Module instances.
    • SVI.run: The SVI.run method is deprecated and users are encouraged to use the .step method directly to run inference. For drawing samples from the posterior distribution, we recommend using the Predictive utility class, or directly by using the trace and replay effect handlers.
    • TracePredictive: The TracePredictive class is deprecated in favor of Predictive, that can be used to gather samples from the posterior and predictive distributions in SVI and MCMC.
    • mcmc.predictive: This utility function has been absorbed into the more general Predictive class.
  • 0.5.1(Oct 24, 2019)

    Patches 0.5.0 with the following bug fixes:

    • Removes an f-string, which is only supported in Python 3.6+, so that Python 3.5 remains supported.
    • Fix incompatibility with recent tqdm releases, which broke multiple progress bars in notebook environments (for MCMC with multiple chains).
  • 0.5.0(Oct 23, 2019)

    New features

    Code changes and bug fixes

    • Moved pyro.generic to a separate pyro-api package.
    • Minor changes to ensure compatibility with pyro-api, a generic modeling and inference API for dispatch to different Pyro backends.
    • Improve numerical stability of the MixtureOfDiagNormals distribution using the logsumexp operation.
    • Improved U-Turn check condition in NUTS for better sampling efficiency.
    • Reorganized constraints and transforms module to match torch.distributions.
    • Fixed AutoGuide initialization strategies, resolving a bug in init_to_median.
  • 0.4.1(Aug 19, 2019)

    New Features:

    • *HMM.filter() methods for forecasting.
    • Support for Independent(Normal) observations in GaussianHMM.

    Fixes:

    • Fix for HMC / NUTS to handle errors arising from numerical issues when computing Cholesky decomposition.
  • 0.4.0(Aug 10, 2019)

    This release drops support for Python 2. Additionally, it includes a few fixes to enable Pyro to use the latest PyTorch release, version 1.2.

    Some other additions / minor changes:

    • Add option for sequential predictions for MCMC predictive.
    • Move pyro.contrib.autoguide to the core Pyro repo.
    • Additional inference algorithms
      • SMCFilter for filtering via Sequential Monte Carlo
      • Stein Variational Gradient Descent (SVGD).
    • Add a GaussianHMM distribution for fast tuning of Gaussian state space models / Kalman filters
  • 0.3.4(Jul 16, 2019)

    New features

    • A more flexible easyguide module. Refer to the tutorial for usage instructions.
    • Different initialization methods for autoguides.
    • More normalizing flows - Block Neural Autoregressive Flow, Sum of Squares, Sylvester Flow, DeepELUFlow, Householder Flow, RealNVP.
    • Support ReduceLROnPlateau scheduler.
    • New interface for MCMC inference:
      • Ability to specify a potential function directly instead of Pyro model in HMC/NUTS kernels.
      • MCMC.summary() method that provides site level summary and diagnostic information.
      • Utility function for predictive that replaces the TracePredictive class.
      • Add divergence information to MCMC.diagnostics().
    • A DiscreteHMM distribution for fast parallel training of discrete-state Hidden Markov Models with arbitrary observation distributions. See examples/hmm.py for example usage in a neural HMM.

    Code changes and bug fixes

    • Addresses a pickling issue with Pyro handlers that makes it possible to pickle a much larger class of models.
    • Multiple fixes for multiprocessing bugs with MCMC. With the new interface, memory consumption is low, thereby allowing many more samples to be collected.
    • Performance enhancements for models with many sample sites.
  • 0.3.3(May 3, 2019)

    Updates code for compatibility with PyTorch's latest release, version 1.1.0: mostly function renaming, and using alternate tensor constructors in the JIT to avoid DeprecationWarnings.

  • 0.3.2(Apr 23, 2019)

    New features

    • A capture-recapture example using stochastic variational inference.
    • ELBO with trace adaptive f-divergence - TraceTailAdaptive_ELBO.
    • New distribution classes - LKJCorrCholesky, SpanningTree.
    • Distribution transforms - RadialFlow, DeepSigmoidalFlow, BatchNormTransform.
    • Pareto-smoothed Importance Sampling (PSIS) diagnostic for Variational Inference.
    • Vectorized indexing with Vindex. Refer to the enumeration tutorial for more details on usage.
    • pyro.contrib.minipyro now supports constrained parameters.
    • pyro.generic module to support an API for backend-agnostic Pyro models. This makes it easier to switch between full Pyro and Minipyro. New backends like funsor and numpyro are under active development.
    • pyro.contrib.conjugate that provides utilities for exact inference on a small subset of conjugate models.

    Code changes and bug fixes

    • Optimized Categorical.log_prob so that evaluation on the distribution's support is much faster leading to almost 2X faster inference on models with enumerated discrete random variables.
    • pyro.module ignores params with requires_grad=False.
    • Correct shape inference in MaskedDistribution when run under torch.jit.trace.
    • Fixed infer_discrete to support plates of size 1 and variable dependencies across plate contexts.
  • 0.3.1(Feb 13, 2019)

    Dependency changes

    • Removes dependency on networkx.
    • Uses opt_einsum version 2.3.2.

    New features

    Minor changes and bug fixes

    • Renamed ubersum(..., batch_dims=...) (deprecated) to einsum(..., plates=...).
    • HMC - fix bug in initial trace setter and diagnostics, resolve slowness issues during adaptation, expose target_accept_prob and max_tree_depth as arguments to the constructor to allow finer-grained control over hyper-parameters.
    • Many small fixes to the tutorials.
  • 0.3.0(Dec 7, 2018)

    New features

    Parallel sampling

    Building on poutine.broadcast, Pyro's SVI and HMC inference algorithms now support parallel sampling. For example, to use parallel sampling in SVI, create an ELBO instance and configure its num_particles and vectorize_particles options, e.g.

    elbo = Trace_ELBO(num_particles=100,
                      vectorize_particles=True)
    

    Dependent enumeration

    TraceEnum_ELBO, HMC, NUTS, and infer_discrete can now exactly marginalize out discrete latent variables. For dependency structures with narrow treewidth, Pyro performs cheap marginalization via message-passing algorithms, enabling classic learning algorithms such as Baum-Welch for HMMs, DBNs, and CRFs. See our enumeration tutorial for details.

    HMC/NUTS enhancements

    • Mass matrix can be learned along with step size during the warmup phase. This feature significantly improves the performance of HMC/NUTS.
    • Multiple parallel chains are supported (on the CPU), together with various chain diagnostics such as R-hat and effective sample size.
    • Models with discrete latent variables will be enumerated over in parallel.
    • In NUTS, there are two ways of choosing a candidate from a trajectory: multinomial sampling and slice sampling. We have implemented multinomial sampling and use it by default.

    New distributions

    • MaskedMixture interleaves two distributions element-wise, as a succinct alternative to multiple sample statements inside multiple poutine.mask contexts.
    • RelaxedBernoulliStraightThrough and RelaxedOneHotCategoricalStraightThrough: discrete distributions that have been relaxed to continuous space and thus are equipped with pathwise gradients. These distributions can be useful in the context of variational inference, where they can provide lower-variance (but biased) gradient estimates. Note that these distributions may be numerically finicky, so please let us know if you run into any problems.
    • VonMises and VonMises3D are likelihood-only distributions that are useful for observing 2D or 3D angle data.
    • AVFMultivariateNormal is a multivariate normal distribution that comes equipped with an adaptive gradient estimator that can lead to reduced gradient variance.
    • MixtureOfDiagNormals, MixtureOfDiagNormalsSharedCovariance and GaussianScaleMixture are three families of mixture distributions that come equipped with pathwise gradient estimators (which tend to yield low variance gradients).
    • PlanarFlow and PermutationFlow are two transforms useful for constructing normalizing flows.
    • InverseAutoregressiveFlow improvements such as an explicit inversion operator.

    GPyTorch integration

    This isn't really a feature of Pyro, but we'd like to point out a new feature of the excellent GPyTorch library: GPyTorch can now use Pyro for variational inference, and GPyTorch models can now be used in Pyro models. We recommend the new TraceMeanField_ELBO loss for GPyTorch models.

    Analytic KL in ELBO

    TraceMeanField_ELBO can take advantage of analytic KL divergence expressions in ELBO computations, when available. This ELBO implementation places some restrictions on variable dependency structure. It is especially useful for GPyTorch models.

    IWELBO

    An implementation of the Importance Weighted ELBO objective (pyro.infer.RenyiELBO) is now included. This implementation also includes the generalization of IWELBO to the alpha-divergence (or Rényi divergence of order α) case.

    Automatic max_plate_nesting

    Pyro's ELBO implementations can now automatically determine max_plate_nesting (formerly known as max_iarange_nesting) in models with static plate nesting structure.

    Autoguide

    Some new autoguides are implemented: AutoIAFNormal and AutoLaplaceApproximation.

    Support for the PyTorch JIT

    The PyTorch JIT compiler currently has only partial support for ops used in Pyro programs. If your model has static structure and you're lucky enough to use ops supported by the JIT, you can sometimes get an order-of-magnitude speedup. To enable the JIT in SVI, simply replace Trace_ELBO, TraceGraph_ELBO, or TraceEnum_ELBO classes with their JIT wrappers JitTrace_ELBO, JitTraceGraph_ELBO, or JitTraceEnum_ELBO. To enable the JIT in HMC or NUTS pass the jit_compile kwarg. See our JIT tutorial for details.
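
    To illustrate, a minimal sketch (written with current module paths); the only change from an un-jitted SVI loop is the loss class:

    import torch
    import pyro
    import pyro.distributions as dist
    from pyro.infer import SVI, JitTrace_ELBO
    from pyro.infer.autoguide import AutoNormal
    from pyro.optim import Adam

    def model(data):
        loc = pyro.sample("loc", dist.Normal(0.0, 1.0))
        with pyro.plate("data", len(data)):
            pyro.sample("obs", dist.Normal(loc, 1.0), obs=data)

    guide = AutoNormal(model)
    svi = SVI(model, guide, Adam({"lr": 0.01}), loss=JitTrace_ELBO())  # jitted loss
    data = torch.randn(100)
    for _ in range(10):
        svi.step(data)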

    Stats

    pyro.ops.stats contains many popular statistics functions such as resample, quantile, pi (percentile interval), hpdi (highest posterior density interval), autocorrelation, autocovariance, etc.
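
    A minimal sketch (function names as documented in pyro.ops.stats):

    import torch
    from pyro.ops.stats import autocorrelation, hpdi, quantile

    samples = torch.randn(1000)
    print(hpdi(samples, prob=0.9))               # highest posterior density interval
    print(quantile(samples, [0.05, 0.5, 0.95]))  # quantile summaries
    print(autocorrelation(samples)[:5])          # leading autocorrelations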

    Better error messages

    Pyro now provides more validation checks and better error messages, including shape logging using the Trace.format_shapes() method. This is very useful for debugging shape errors. See the tensor shapes tutorial for help in reading the shapes table.

    New Tutorials

    Language

    Examples

    and additional examples in the examples directory.

    Breaking changes

    • pyro.plate replaces all of pyro.irange, pyro.iarange, and poutine.broadcast. You should no longer need to use poutine.broadcast manually.
    • independent() has been renamed to to_event() (see the sketch below).
    • poutine.mask was separated from poutine.scale. Now you should use poutine.mask with ByteTensors and poutine.scale for positive tensors (usually just scalars).
    • .enumerate_support(expand=False)
    • Some distributions interfaces changed, e.g. LowRankMultivariateNormal and HalfNormal
    • Pyro Gaussian Process module has been revised:
      • Variational inference now works with PyTorch parameters directly instead of interacting with Pyro ParamStoreDict as before.
      • Methods .get_param(name) and .fix_param(name, value) are removed.
      • Autoguides are supported through the method .autoguide(name, ...), and we have implemented a Bayesian GPLVM model to illustrate autoguide functionality.
      • Kernel methods .sum() and .product() are removed. Instead, we encourage a compositional paradigm: Sum(kern0, kern1), Product(kern0, kern1).

    Note also that Pyro 0.3 now uses PyTorch 1.0, which makes a number of breaking changes.
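
    A minimal sketch of the new spellings, combining pyro.plate and .to_event():

    import torch
    import pyro
    import pyro.distributions as dist

    def model(data):
        # .to_event(1) replaces .independent(1)
        loc = pyro.sample("loc", dist.Normal(torch.zeros(2), 1.0).to_event(1))
        # pyro.plate replaces pyro.irange / pyro.iarange / poutine.broadcast
        with pyro.plate("data", data.shape[0]):
            pyro.sample("obs", dist.Normal(loc, 1.0).to_event(1), obs=data)

    pyro.poutine.trace(model).get_trace(torch.randn(10, 2))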

    Experimental features

    We are releasing the following features early, intended for experimental use only. Pyro provides no backward-compatibility guarantees for these features.

    Tracking and data association

    pyro.contrib.tracking provides some experimental components for data association and multiple-object tracking. See the object tracking and Kalman Filter tutorials for examples of using the library.

    Optimal experimental design

    pyro.contrib.oed provides some support for Bayesian Optimal Experimental Design (OED) in Pyro. In particular, it provides support for estimating the expected information gain, which is one of the key quantities required for Bayesian OED. This package is in active development and is expected to undergo significant changes. See the docs for more details.

    Automatic naming and scoping

    pyro.contrib.autoname provides some automatic naming utilities that can ease the burden of subscripting like "x_{}".format(t).

  • 0.2.1(May 22, 2018)

  • 0.2.0(Apr 25, 2018)

    Support for PyTorch 0.4

    Pyro 0.2 supports PyTorch 0.4. See PyTorch release notes for comprehensive changes. The most important change is that Variable and Tensor have been merged, so you can now simplify

    - pyro.param("my_param", Variable(torch.ones(1), requires_grad=True))
    + pyro.param("my_param", torch.ones(1))
    

    PyTorch distributions

    PyTorch's torch.distributions library is now Pyro’s main source for distribution implementations. The Pyro team helped create this library by collaborating with Adam Paszke, Alican Bozkurt, Vishwak Srinivasan, Rachit Singh, Brooks Paige, Jan-Willem Van De Meent, and many other contributors and reviewers. See the Pyro wrapper docs for wrapped PyTorch distributions and the Pyro distribution docs for Pyro-specific distributions.

    Constrained parameters

    Parameters can now be constrained easily using notation like

    from torch.distributions import constraints
    
    pyro.param("sigma", torch.ones(10), constraint=constraints.positive)
    

    See the torch.distributions.constraints library and all of our Pyro tutorials for example usage.

    Arbitrary tensor shapes

    Arbitrary tensor shapes and batching are now supported in Pyro. This includes support for nested batching via iarange and support for batched multivariate distributions. The iarange context and irange generator are now much more flexible and can be combined freely. With power comes complexity, so check out our tensor shapes tutorial (hint: you’ll need to use .expand_by() and .independent()).

    Parallel enumeration

    Discrete enumeration can now be parallelized. This makes it especially easy and cheap to enumerate out discrete latent variables. Check out the Gaussian Mixture Model tutorial for example usage. To use parallel enumeration, you'll need to first configure sites, then use the TraceEnum_ELBO loss:

    def model(...):
        ...
    
    @config_enumerate(default="parallel")  # configures sites
    def guide(...):
        with pyro.iarange("foo", 10):
            x = pyro.sample("x", dist.Bernoulli(0.5).expand_by([10]))
            ...
    
    svi = SVI(model, guide, Adam({}),
              loss=TraceEnum_ELBO(max_iarange_nesting=1))  # specify loss
    svi.step()
    

    Markov Chain Monte Carlo via HMC and NUTS

    This release adds experimental support for gradient-based Markov Chain Monte Carlo inference via Hamiltonian Monte Carlo pyro.infer.HMC and the No U-Turn Sampler pyro.infer.NUTS. See the docs and example for details.
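
    A minimal sketch of NUTS on a toy model (written with current module paths):

    import torch
    import pyro
    import pyro.distributions as dist
    from pyro.infer import MCMC, NUTS

    def model(data):
        loc = pyro.sample("loc", dist.Normal(0.0, 10.0))
        with pyro.plate("data", len(data)):
            pyro.sample("obs", dist.Normal(loc, 1.0), obs=data)

    data = torch.randn(50) + 3.0
    mcmc = MCMC(NUTS(model), num_samples=300, warmup_steps=100)
    mcmc.run(data)
    print(mcmc.get_samples()["loc"].mean())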

    Gaussian Processes

    A new Gaussian Process module pyro.contrib.gp provides a framework for learning with Gaussian Processes. To get started, take a look at our Gaussian Process Tutorial. Thanks to Du Phan for this extensive contribution!
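
    A minimal sketch of GP regression with pyro.contrib.gp (toy data; RBF kernel):

    import torch
    import pyro.contrib.gp as gp

    X = torch.linspace(0.0, 5.0, 20)
    y = torch.sin(X) + 0.1 * torch.randn(20)
    kernel = gp.kernels.RBF(input_dim=1)
    gpr = gp.models.GPRegression(X, y, kernel, noise=torch.tensor(0.1))
    # posterior mean and covariance at new inputs
    mean, cov = gpr(torch.linspace(0.0, 5.0, 100), full_cov=True)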

    Automatic guide generation

    Guides can now be created automatically with the pyro.contrib.autoguide library. These work only for models with simple structure (no irange or iarange), and are easy to use:

    from pyro.contrib.autoguide import AutoDiagonalNormal

    def model(...):
        ...

    guide = AutoDiagonalNormal(model)
    svi = SVI(model, guide, ...)
    

    Validation

    Model validation is now available via three toggles:

    pyro.enable_validation()
    pyro.infer.enable_validation()
    # Turns on validation for PyTorch distributions.
    pyro.distributions.enable_validation()
    

    These can also be used temporarily as context managers

    # Run with validation in first step.
    with pyro.validation_enabled(True):
        svi.step()
    # Avoid validation on subsequent steps (may miss NAN errors).
    with pyro.validation_enabled(False):
        for i in range(1000):
            svi.step()
    

    Rejection sampling variational inference (RSVI)

    We've added support for vectorized rejection sampling in a new Rejector distribution. See docs or RejectionStandardGamma class for example usage.

  • 0.1.2(Nov 10, 2017)

    • #533 Fix for bug in gradient scaling of subsampled non-reparameterized sites
    • Misc improvements in documentation
    • Fixes to tests in prep for PyTorch v0.3.0 release
    • #530 Split LambdaPoutine into ScalePoutine + IndepPoutine
  • 0.1.1(Nov 5, 2017)
