The unified machine learning framework, enabling framework-agnostic functions, layers and libraries.

Overview

https://github.com/unifyai/ivy/blob/master/docs/partial_source/logos/logo.png?raw=true



Contents

Overview

What is Ivy?

Ivy is a unified machine learning framework that maximizes the portability of machine learning codebases. Ivy wraps the functional APIs of existing frameworks, so framework-agnostic functions, libraries and layers can be written in Ivy with simultaneous support for all frameworks. Ivy currently supports JAX, TensorFlow, PyTorch, MXNet and NumPy. Check out the docs for more info!

Ivy Libraries

There are a host of derived libraries written in Ivy, in the areas of mechanics, 3D vision, robotics, gym environments, neural memory, pre-trained models + implementations, and builder tools with trainers, data loaders and more. Check out the docs to learn more about each of them!


Quick Start

Ivy can be installed like so:

pip install ivy-core

You can immediately use Ivy to train a neural network, using your favourite framework in the background, like so:

import ivy

class MyModel(ivy.Module):
    def __init__(self):
        # two dense layers: 3 -> 64 -> 1
        self.linear0 = ivy.Linear(3, 64)
        self.linear1 = ivy.Linear(64, 1)
        ivy.Module.__init__(self)

    def _forward(self, x):
        x = ivy.relu(self.linear0(x))
        return ivy.sigmoid(self.linear1(x))

ivy.set_framework('torch')  # change to any framework!
model = MyModel()
optimizer = ivy.Adam(1e-4)
x_in = ivy.array([1., 2., 3.])
target = ivy.array([0.])

def loss_fn(v):
    # forward pass with variables v, then mean squared error
    out = model(x_in, v=v)
    return ivy.reduce_mean((out - target)**2)[0]

for step in range(100):
    # compute loss and gradients w.r.t. the model variables, then update
    loss, grads = ivy.execute_with_gradients(loss_fn, model.v)
    model.v = optimizer.step(model.v, grads)
    print('step {} loss {}'.format(step, ivy.to_numpy(loss).item()))

print('Finished training!')

This example uses PyTorch as a backend framework, but the backend can easily be changed to your favourite framework, such as TensorFlow, JAX or MXNet.
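
For example, the quick-start code above can run on a different backend just by changing the framework string; a minimal sketch, with everything else left unchanged:

import ivy

ivy.set_framework('tensorflow')  # or 'jax', 'mxnet', 'numpy'
x = ivy.array([1., 2., 3.])      # now backed by a TensorFlow tensor
print(type(ivy.to_numpy(x)))     # numpy.ndarray, regardless of backend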

Framework Agnostic Functions

The example below shows how Ivy's concatenation function is compatible with tensors from different frameworks. The same is true for all Ivy functions: they can accept tensors from any framework and return the correct result.

import jax.numpy as jnp
import tensorflow as tf
import numpy as np
import mxnet as mx
import torch

import ivy

jax_concatted = ivy.concatenate((jnp.ones((1,)), jnp.ones((1,))), -1)
tf_concatted = ivy.concatenate((tf.ones((1,)), tf.ones((1,))), -1)
np_concatted = ivy.concatenate((np.ones((1,)), np.ones((1,))), -1)
mx_concatted = ivy.concatenate((mx.nd.ones((1,)), mx.nd.ones((1,))), -1)
torch_concatted = ivy.concatenate((torch.ones((1,)), torch.ones((1,))), -1)

To see a list of all Ivy methods, type ivy. into a Python prompt and press tab. You should then see output like the following:

https://github.com/unifyai/ivy/blob/master/docs/partial_source/images/ivy_tab.png?raw=true
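
If tab completion is not available in your environment, a plain-Python alternative (using only the standard dir built-in, nothing Ivy-specific beyond the import) is to list the module's public names:

import ivy

# print every public name exposed by the ivy module
print(sorted(name for name in dir(ivy) if not name.startswith('_')))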

Based on this short code sample alone, you may wonder why this is helpful. Don't most developers stick to just one framework for a project? This is indeed the case, and the benefit of Ivy is not the ability to combine different frameworks in a single project.

So what is the benefit of Ivy?

In a Nutshell

Ivy's strength arises when we want to maximize the usability of our code.

We can write a set of functions once in Ivy, and share these with the community so that all developers can use them, irrespective of their personal choice of framework. TensorFlow? PyTorch? JAX? With Ivy code it doesn't matter!

This makes it very simple to create highly portable machine learning codebases. The core idea behind Ivy is captured by the example of the ivy.clip function below.

https://github.com/unifyai/ivy/blob/master/docs/partial_source/images/a_unified_framework.png?raw=true
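
As a rough sketch of the same idea in code (assuming ivy.clip takes the tensor followed by minimum and maximum values, as the diagram suggests), a function written once in Ivy can be called with tensors from different frameworks, just like ivy.concatenate above:

import numpy as np
import torch

import ivy

def clamp_unit(x):
    # written once in Ivy; accepts tensors from any supported framework
    return ivy.clip(x, 0., 1.)

print(clamp_unit(np.array([-0.5, 0.3, 1.7])))      # NumPy input
print(clamp_unit(torch.tensor([-0.5, 0.3, 1.7])))  # PyTorch input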

On its own this may not seem very exciting; there are more interesting things to do in machine learning than clip tensors. But Ivy is a building block for more interesting applications.

For example, the Ivy libraries for mechanics, 3D vision, robotics, and differentiable environments are all written in pure Ivy. These libraries provide fully differentiable implementations of various applied functions, primed for integration in end-to-end networks, for users of any machine-learning framework.

Another benefit of Ivy is user flexibility. Because the Ivy abstraction is lightweight and fully functional, you remain in full control of your code. The schematic below emphasizes that you can choose to develop at any abstraction level.

https://github.com/unifyai/ivy/blob/master/docs/partial_source/images/abstraction_hierarchy.png?raw=true

You can code entirely in Ivy, or mainly in your native DL framework with a small amount of Ivy code. This is entirely up to you, depending on how many functions you need from existing Ivy libraries, and how much new Ivy code you add to your own project to maximize its audience when sharing online.
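
As a minimal sketch of mixing the two levels (assuming the torch backend is set, as in the quick-start example), native framework code and Ivy calls can operate on the same tensors:

import torch

import ivy

ivy.set_framework('torch')

x = torch.rand(2, 3)                 # native torch tensor
h = torch.nn.functional.relu(x)      # native torch op
loss = ivy.reduce_mean(h ** 2)       # drop-in Ivy call on the same tensor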

Where Next?

So, now that you've got the gist of Ivy and why it's useful, where next?

This depends on whether you see yourself in the short term as more likely to be an Ivy library user or an Ivy library contributor.

If you would like to use the existing set of Ivy libraries, dragging and dropping key functions into your own project, then we suggest you dive into some of the demos for the various Ivy libraries currently on offer. Simply open up the main docs, then open the library-specific docs linked on the bottom left, and check out the demos folder in the library repo.

On the other hand, if you have your own new library in mind, or if you would like to implement parts of your own project in Ivy to maximize its portability, then we recommend checking out the page Using Ivy in the docs. Here, we dive a bit deeper into the Ivy framework, and into the best coding practices to get the most out of Ivy for your own codebases and libraries.

Citation

@article{lenton2021ivy,
  title={Ivy: Templated deep learning for inter-framework portability},
  author={Lenton, Daniel and Pardo, Fabio and Falck, Fabian and James, Stephen and Clark, Ronald},
  journal={arXiv preprint arXiv:2102.02886},
  year={2021}
}
Comments
  • Create numpy diagonal

    diagonal #6616. Kindly mark a green circle on it, so there will be no conflict in the future; I have already experienced that. https://github.com/unifyai/ivy/issues/6616.

    TensorFlow Frontend NumPy Frontend Array API Ivy Functional API 
    opened by hrak99 59
  • Add Statistical functions mean numpy frontend #2546

    Greetings, I think I did everything: I did the frontend and the tests as well, and changed the init files. I implemented the mean function according to the NumPy documentation. Waiting for your reply. Best regards.

    opened by Emperor-WS 26
  • Isin extension

    #5716

    Added most backend implementations. There is only a problem with TensorFlow that I'm still trying to solve, since it doesn't have the function isin; once I'm able to do that I will add tests.

    Array API Function Reformatting Ivy Functional API Ivy API Experimental 
    opened by pillarxyz 20
  • reformat shape_to_tuple

    Hi, I've got a question on testing. I was getting errors, so I checked the logs and found out that some of those tests aren't ready yet (e.g. shape_to_tuple). Not sure if I'm right, but it would be awesome if you could give some information about this. Thank you.

    opened by mcandemir 19
  • feat: add is_tensor to tensorflow frontend general functions

    Closes #7584. Need help with PyTest; I am unable to wrap my head around the testing helpers yet.

    Essentially, when I run these tests, I get the same error, despite trying various combinations of the parameters passed to test_frontend_function.

    TensorFlow Frontend 
    opened by chtnnh 18
  • argmax function: general.py

    Test Cases:

    • 42 passed for pytest ./ivy/ivy_tests/test_functional/test_core/test_general.py::test_argmax --disable-warnings -rs
    • 6 skipped for conftest.py
    • No errors

    Implemented for

    • [x] jax
    • [x] numpy
    • [x] mxnet
    • [x] tensorflow
    • [x] torch
    Array API Single Function 
    opened by 7wikd 18
  • reformatting task coding challenge

    Please consider this my PR for the task 'vairable_data' in the 'gradients' submodule. The task number is #9283 and the ToDo list is #776.

    https://github.com/unifyai/ivy/issues/9283

    Array API Ivy Functional API 
    opened by waheeduddin 0
Releases(v1.1.9)
  • v1.1.5(Jul 26, 2021)

    Version 1.1.5.

    Added some new methods and classes, and improved the ivy.Module and ivy.Container classes. ivy.Container now overrides more built-in methods, and has more flexible nested methods such as gather_nd, repeat, stop_gradients, etc.

    This version was tested against: JAX 0.2.17, JAXLib 0.1.69, TensorFlow 2.5.0, TensorFlow Addons 0.13.0, TensorFlow Probability 0.13.0, PyTorch 1.9.0, MXNet 1.8.0, and NumPy 1.19.5.

    However, Ivy 1.1.5 likely supports many previous and future backend versions, due to the stability of the core APIs for each backend framework.

    Source code(tar.gz)
    Source code(zip)
  • v1.1.4(Apr 12, 2021)

    Version 1.1.4.

    Added some new methods, fixed some small bugs, improved unit testing, and tested against the latest backend versions.

    This version was tested against: JAX 0.2.12, TensorFlow 2.4.1, PyTorch 1.8.1, MXNet 1.8.0, and NumPy 1.20.2.

    However, Ivy 1.1.4 likely supports many previous and future backend versions, due to the stability of the core APIs for each backend framework.

    Source code(tar.gz)
    Source code(zip)
  • v1.1.3(Mar 19, 2021)

    Version 1.1.3.

    Added some new methods, fixed some small bugs, improved unit testing, and tested against the latest backend versions.

    This version was tested against: JAX 0.2.10, TensorFlow 2.4.1, PyTorch 1.8.0, MXNet 1.7.0, and NumPy 1.19.5.

    However, Ivy 1.1.3 likely supports many previous and future backend versions, due to the stability of the core APIs for each backend framework.

    Source code(tar.gz)
    Source code(zip)
  • v1.1.2(Feb 27, 2021)

    Version 1.1.2.

    Added adam update, changed gradient methods to operate on gradient dicts instead of lists, and added a new container chain method, among other small changes.

    This version was tested against: JAX 0.2.9, TensorFlow 2.4.1, PyTorch 1.7.1, MXNet 1.7.0, and NumPy 1.19.5.

    However, Ivy 1.1.2 likely supports many previous and future backend versions, due to the stability of the core APIs for each backend framework.

    Source code(tar.gz)
    Source code(zip)
  • v1.1.1(Feb 10, 2021)
