### entmax

#### by deep-spin

This package provides a PyTorch implementation of entmax and entmax losses: a sparse family of probability mappings and corresponding loss functions, generalizing softmax / cross-entropy.

Features:

- Exact partial-sort algorithms for 1.5-entmax and 2-entmax (sparsemax).
- A bisection-based algorithm for generic alpha-entmax.
- Gradients w.r.t. alpha for adaptive, learned sparsity!

Requirements: Python 3, PyTorch >= 1.0 (and pytest for unit tests).

## Example

```
In : import torch

In : from torch.nn.functional import softmax
In : from entmax import sparsemax, entmax15, entmax_bisect
In : x = torch.tensor([-2, 0, 0.5])
In : softmax(x, dim=0)
Out: tensor([0.0486, 0.3592, 0.5922])
In : sparsemax(x, dim=0)
Out: tensor([0.0000, 0.2500, 0.7500])
In : entmax15(x, dim=0)
Out: tensor([0.0000, 0.3260, 0.6740])
```
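For intuition, the sparsemax output above can be reproduced with a short pure-PyTorch sketch of the sort-based threshold algorithm: find the support size, derive the threshold tau, and clip. This is an educational sketch, not the library's optimized exact partial-sort implementation, and `sparsemax_sketch` is a name invented here for illustration.

```python
import torch

def sparsemax_sketch(z: torch.Tensor) -> torch.Tensor:
    """Sort-based sparsemax for a 1-D tensor (educational sketch).

    Computes p = [z - tau]_+ with tau chosen so that p sums to 1.
    """
    z_sorted, _ = torch.sort(z, descending=True)
    k = torch.arange(1, z.numel() + 1, dtype=z.dtype)
    cumsum = torch.cumsum(z_sorted, dim=0)
    # support size: the largest k with 1 + k * z_(k) > cumulative sum of top-k
    support = (1 + k * z_sorted > cumsum).sum()
    tau = (cumsum[support - 1] - 1) / support
    return torch.clamp(z - tau, min=0)

print(sparsemax_sketch(torch.tensor([-2.0, 0.0, 0.5])))
# matches sparsemax(x, dim=0) above: tensor([0.0000, 0.2500, 0.7500])
```

Coordinates outside the support get exactly zero probability, which is what distinguishes sparsemax from the always-dense softmax.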

```
In : from torch.autograd import grad

In : x = torch.tensor([[-1, 0, 0.5], [1, 2, 3.5]])
In : alpha = torch.tensor(1.33, requires_grad=True)
In : p = entmax_bisect(x, alpha)
In : p
Out: tensor([[0.0460, 0.3276, 0.6264],
             ...])

In : grad(p[0, 0], alpha)
Out: (tensor(-0.2562),)
```
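The bisection algorithm behind `entmax_bisect` can be sketched in a few lines of PyTorch: search for the threshold tau at which the entmax probabilities sum to one. This is a simplified 1-D sketch assuming alpha > 1; `entmax_bisect_sketch` is a name invented here, and the library's version also handles batched inputs and provides the exact backward pass (including gradients w.r.t. alpha).

```python
import torch

def entmax_bisect_sketch(z: torch.Tensor, alpha: float = 1.5,
                         n_iter: int = 50) -> torch.Tensor:
    """Bisection sketch for alpha-entmax on a 1-D tensor (alpha > 1).

    Solves for tau in: sum_i [(alpha - 1) * z_i - tau]_+^(1/(alpha-1)) = 1.
    """
    z = (alpha - 1) * z
    # The solution lies in [max(z) - 1, max(z)]: the sum is >= 1 at the
    # left endpoint, 0 at the right endpoint, and decreasing in tau.
    tau_lo, tau_hi = z.max() - 1, z.max()
    for _ in range(n_iter):
        tau = (tau_lo + tau_hi) / 2
        p = torch.clamp(z - tau, min=0) ** (1 / (alpha - 1))
        if p.sum() < 1:
            tau_hi = tau  # threshold too high, too much mass removed
        else:
            tau_lo = tau
    return p / p.sum()  # renormalize away the residual bisection error

print(entmax_bisect_sketch(torch.tensor([-2.0, 0.0, 0.5]), alpha=1.5))
# close to entmax15(x, dim=0) above: roughly tensor([0.0000, 0.3260, 0.6740])
```

Fifty bisection steps halve the search interval each time, so tau is located to near machine precision; alpha = 1.5 recovers entmax15 and alpha = 2 recovers sparsemax.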

## Installation

```
pip install entmax
```

## Citations

Sparse Sequence-to-Sequence Models

```
@inproceedings{entmax,
author    = {Peters, Ben and Niculae, Vlad and Martins, Andr{\'e} FT},
title     = {Sparse Sequence-to-Sequence Models},
booktitle = {Proc. ACL},
year      = {2019},
url       = {https://www.aclweb.org/anthology/P19-1146}
}
```

Adaptively Sparse Transformers

```
@inproceedings{correia19adaptively,