Does MAML Only Work via Feature Re-use? A Data Set Centric Perspective

Overview


Installing

Standard pip install [Recommended]

TODO

If you are going to use a GPU, then do this first before continuing (or check the official website: https://pytorch.org/get-started/locally/):

pip3 install torch==1.9.1+cu111 torchvision==0.10.1+cu111 torchaudio==0.9.1 -f https://download.pytorch.org/whl/torch_stable.html
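To confirm that the CUDA-enabled build is actually the one in your environment, an optional quick check along these lines should work (the +cu111 suffix is specific to the command above):

python -c "import torch; print(torch.__version__); print(torch.cuda.is_available())"

If PyTorch can see the GPU, this prints a version string ending in +cu111 followed by True.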

Otherwise, just doing the following should work.

pip install automl

If that worked, then you should be able to import it as follows:

import automl
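If you want to double-check which installation the import resolves to (useful when juggling several environments), an optional check like this should work, assuming the package installs a regular top-level automl module:

python -c "import automl; print(automl.__file__)"

which prints the path the automl package was loaded from.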

Manual installation [Development]

To use the library, first get the code from this repo (e.g. fork it on GitHub):

git clone git@github.com:brando90/automl-meta-learning.git

Then install it in development mode in your python env with python >=3.9 (read modules_in_python.md to learn about python envs in uutils). E.g. create your env with conda:

conda create -n metalearning python=3.9
conda activate metalearning
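To confirm the environment is using the expected interpreter:

python --version

should report Python 3.9.x once the metalearning env is active.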

Then install it in editable mode along with all its dependencies with pip in the currently activated conda environment:

pip install -e ~/automl-meta-learning/automl-proj-src/

Since the dependencies have not been written into the package requirements yet, install them manually:

pip install -e ~/ultimate-utils/ultimate-utils-proj-src

Then test as follows:

python -c "import uutils; print(uutils); uutils.hello()"
python -c "import meta_learning; print(meta_learning)"
python -c "import meta_learning; print(meta_learning); meta_learning.hello()"

The output should be something like this:

(metalearning) brando~/automl-meta-learning/automl-proj-src ❯ python -c "import uutils; print(uutils); uutils.hello()"
hello from uutils __init__.py in:

(metalearning) brando~/automl-meta-learning/automl-proj-src ❯ python -c "import meta_learning; print(meta_learning)"

(metalearning) brando~/automl-meta-learning/automl-proj-src ❯ python -c "import meta_learning; print(meta_learning); meta_learning.hello()"
hello from torch_uu __init__.py in:

Reproducing Results

TODO

Citation

B. Miranda, Y. Wang, O. Koyejo.
Does MAML Only Work via Feature Re-use? A Data Set Centric Perspective. 
(Planned Release Date December 2021).
https://drive.google.com/file/d/1cTrfh-Tg39EnbI7u0-T29syyDp6e_gjN/view?usp=sharing

