
Slot Attention

Implementation of Slot Attention from the paper 'Object-Centric Learning with Slot Attention' in PyTorch. Here is a video that describes what this network can do.

Update: The official repository has been released here
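For orientation, below is a minimal sketch of the iterative attention procedure described in the paper: slots are sampled from a learned Gaussian, dot-product attention is normalized with a softmax over the slots (so slots compete for inputs), inputs are aggregated by a weighted mean, and each slot is refined by a GRU update plus a residual MLP. This is illustrative only; the names and details below are assumptions and do not necessarily match this package's internals or the official code.

import torch
import torch.nn as nn

class SlotAttentionSketch(nn.Module):
    # A simplified rendering of the slot attention iteration from the paper.
    def __init__(self, num_slots, dim, iters = 3, eps = 1e-8):
        super().__init__()
        self.num_slots, self.iters, self.eps = num_slots, iters, eps
        self.scale = dim ** -0.5

        # learned Gaussian used to initialize the slots
        self.slots_mu = nn.Parameter(torch.randn(1, 1, dim))
        self.slots_log_sigma = nn.Parameter(torch.zeros(1, 1, dim))

        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)

        self.gru = nn.GRUCell(dim, dim)
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

        self.norm_inputs = nn.LayerNorm(dim)
        self.norm_slots = nn.LayerNorm(dim)
        self.norm_mlp = nn.LayerNorm(dim)

    def forward(self, inputs):
        b, n, d = inputs.shape
        inputs = self.norm_inputs(inputs)
        k, v = self.to_k(inputs), self.to_v(inputs)

        # sample the initial slots from the learned Gaussian
        slots = self.slots_mu + self.slots_log_sigma.exp() * torch.randn(
            b, self.num_slots, d, device = inputs.device)

        for _ in range(self.iters):
            slots_prev = slots
            q = self.to_q(self.norm_slots(slots))

            # softmax over the slot dimension: slots compete for input tokens
            attn = (torch.einsum('bnd,bkd->bnk', k, q) * self.scale).softmax(dim = -1)
            # weighted mean over the inputs assigned to each slot
            attn = attn / (attn.sum(dim = 1, keepdim = True) + self.eps)
            updates = torch.einsum('bnk,bnd->bkd', attn, v)

            # GRU update per slot, followed by a residual MLP
            slots = self.gru(updates.reshape(-1, d), slots_prev.reshape(-1, d)).reshape(b, -1, d)
            slots = slots + self.mlp(self.norm_mlp(slots))

        return slots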

Install

$ pip install slot_attention

Usage

import torch
from slot_attention import SlotAttention

slot_attn = SlotAttention(
    num_slots = 5,
    dim = 512,
    iters = 3   # iterations of attention, defaults to 3
)

inputs = torch.randn(2, 1024, 512)
slot_attn(inputs) # (2, 5, 512)

After training, the network is reported to generalize to a slightly different number of slots (clusters). You can override the number of slots used with the num_slots keyword argument to forward.

slot_attn(inputs, num_slots = 8) # (2, 8, 512)
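For context, a typical end-to-end use (as in the paper) flattens CNN feature maps into a set of tokens before grouping them with slot attention. The sketch below assumes a toy one-layer encoder and a made-up image size purely for illustration; only SlotAttention comes from this package.

import torch
from torch import nn
from slot_attention import SlotAttention

encoder = nn.Conv2d(3, 512, kernel_size = 5, padding = 2)  # toy stand-in for a CNN encoder
slot_attn = SlotAttention(num_slots = 5, dim = 512)

images = torch.randn(2, 3, 32, 32)
features = encoder(images)                    # (2, 512, 32, 32)
tokens = features.flatten(2).transpose(1, 2)  # (2, 1024, 512) - one token per spatial location
slots = slot_attn(tokens)                     # (2, 5, 512)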

Citation

@misc{locatello2020objectcentric,
    title = {Object-Centric Learning with Slot Attention},
    author = {Francesco Locatello and Dirk Weissenborn and Thomas Unterthiner and Aravindh Mahendran and Georg Heigold and Jakob Uszkoreit and Alexey Dosovitskiy and Thomas Kipf},
    year = {2020},
    eprint = {2006.15055},
    archivePrefix = {arXiv},
    primaryClass = {cs.LG}
}
