





Implementation of Dynamic ReLU (types A and B) in PyTorch.


import torch.nn as nn
from dyrelu import DyReLUB

class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.conv1 = nn.Conv2d(3, 10, 5)
        # DyReLUB takes the number of input channels and the conv type
        self.relu = DyReLUB(10, conv_type='2d')

    def forward(self, x):
        x = self.conv1(x)
        x = self.relu(x)
        return x
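For intuition, Dynamic ReLU replaces the fixed activation max(x, 0) with a maximum over K learned linear functions, y = max_k(a_k·x + b_k), where the coefficients a_k, b_k are predicted per input (in type B, per channel, shared across spatial positions) by a small hyper-network. The NumPy sketch below shows only that final activation step, assuming the per-channel coefficients have already been computed; the function name `dyrelu` and the shapes are illustrative, not part of this library's API. Note that with K=2 and coefficients (1, 0) and (0, 0), it reduces to a plain ReLU.

```python
import numpy as np

def dyrelu(x, a, b):
    """Type-B Dynamic ReLU activation step (illustrative sketch).

    x: input feature map of shape (C, H, W)
    a: slopes of shape (K, C), one set per channel (shared over H, W)
    b: intercepts of shape (K, C)
    Returns max over the K linear functions, element-wise.
    """
    # Broadcast to (K, C, H, W): each of the K linear functions applied everywhere
    candidates = a[:, :, None, None] * x[None] + b[:, :, None, None]
    # Element-wise maximum over the K candidates
    return candidates.max(axis=0)
```

In the paper these coefficients come from an SE-style branch (global average pooling, two fully connected layers), so the shape of the activation adapts to each input rather than being fixed.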
