Model-Agnostic Meta-Learning

Documentation

An implementation of Model-Agnostic Meta-Learning (MAML) in PyTorch with Torchmeta.

Getting started

To avoid any conflict with your existing Python setup, it is suggested to work in a virtual environment with `virtualenv`. To install `virtualenv`:
```bash
pip install --upgrade virtualenv
```
Create a virtual environment, activate it, and install the requirements in `requirements.txt`:
```bash
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt
```

Requirements

  • Python 3.6 or above
  • PyTorch 1.5
  • Torchvision 0.6
  • Torchmeta 1.4.6

Usage

You can use `train.py` to meta-train your model with MAML. For example, to run MAML on Omniglot 1-shot 5-way with the default parameters from the original paper:
```bash
python train.py /path/to/data --dataset omniglot --num-ways 5 --num-shots 1 --use-cuda --step-size 0.4 --batch-size 32 --num-workers 8 --num-epochs 600 --output-folder /path/to/results
```
The meta-training script creates a configuration file you can use to meta-test your model with `test.py`:
```bash
python test.py /path/to/results/config.json
```
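The `--step-size` flag above corresponds to the inner-loop learning rate from the paper: MAML adapts the initialization to each task with a few gradient steps (inner loop), then updates the initialization so that adapted parameters perform well on held-out query data (outer loop). As a rough illustration only (this is a first-order toy sketch on a 1-D problem, not the repository's implementation), the two-level update looks like:

```python
# Toy first-order MAML sketch. Each "task" fits a scalar theta to a
# task-specific target t under squared loss L(theta) = (theta - t)^2.

def loss_grad(theta, target):
    # d/dtheta of (theta - target)^2
    return 2.0 * (theta - target)

def maml_step(theta, tasks, step_size=0.4, meta_lr=0.1):
    """One meta-update; step_size plays the role of --step-size (inner alpha)."""
    meta_grad = 0.0
    for support_t, query_t in tasks:
        # Inner loop: one gradient step on the task's support set.
        adapted = theta - step_size * loss_grad(theta, support_t)
        # Outer loop (first-order approximation): evaluate the gradient of
        # the query loss at the adapted parameters.
        meta_grad += loss_grad(adapted, query_t)
    return theta - meta_lr * meta_grad / len(tasks)

theta = 2.0
tasks = [(1.0, 1.0), (-1.0, -1.0)]  # (support target, query target) pairs
for _ in range(50):
    theta = maml_step(theta, tasks)
# theta drifts toward 0.0, the initialization from which one inner step
# adapts well to both (symmetric) tasks.
```

The real implementation differs in the important ways: it uses a convolutional network, computes the outer update by backpropagating through the inner-loop steps (second-order), and samples few-shot tasks via Torchmeta's data loaders.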

References

The code in this repository is mainly based on the following paper:

Chelsea Finn, Pieter Abbeel, and Sergey Levine. Model-agnostic meta-learning for fast adaptation of deep networks. International Conference on Machine Learning (ICML), 2017 [ArXiv]

If you want to cite this paper:

@article{finn17maml,
  author    = {Chelsea Finn and Pieter Abbeel and Sergey Levine},
  title     = {Model-{A}gnostic {M}eta-{L}earning for {F}ast {A}daptation of {D}eep {N}etworks},
  journal   = {International Conference on Machine Learning (ICML)},
  year      = {2017},
  url       = {http://arxiv.org/abs/1703.03400}
}
