activations

Mish

class Mish

Applies the Mish activation function element-wise.

\(\text{Mish}(x) = x \tanh(\text{softplus}(x))\), where \(\text{softplus}(x) = \ln(1 + e^{x})\)

[Plot of the Mish activation function.]
Reference:

Diganta Misra: Mish: A Self Regularized Non-Monotonic Activation Function

Example

>>> import torch
>>> from hearth.activations import Mish
>>>
>>> activation = Mish()
>>> x = torch.linspace(-2, 2, 8)
>>> activation(x)
tensor([-0.2525, -0.3023, -0.2912, -0.1452,  0.1969,  0.7174,  1.3256,  1.9440])
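
The values above follow directly from the formula; as a quick sanity check, the same result can be reproduced with plain torch primitives (a hand-rolled sketch for illustration only, not part of the hearth API):

>>> import torch.nn.functional as F
>>>
>>> torch.allclose(activation(x), x * torch.tanh(F.softplus(x)))
True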
forward(x)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the forward-pass computation must be defined in this method, you should call the Module instance itself rather than forward() directly: calling the instance runs any registered hooks, while calling forward() silently ignores them.

Return type

Tensor
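
In practice this simply means calling the module like a function instead of invoking forward() yourself; a short sketch (with no hooks registered, both calls compute the same values, but only the first would run them):

>>> import torch
>>> from hearth.activations import Mish
>>>
>>> activation = Mish()
>>> x = torch.linspace(-2, 2, 8)
>>> torch.equal(activation(x), activation.forward(x))
True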


get_activation

get_activation(name, **kwargs)

Get a new instance of an activation by its name, optionally passing any extra kwargs.

Parameters

name (str) – string name of the activation; it will be normalized on a best-effort basis (see the examples below).

kwargs – any extra keyword arguments are forwarded to the activation's constructor.

Returns

an instance of the requested activation module

Return type

nn.Module

Example

>>> from torch import nn
>>> from hearth.activations import get_activation
>>>
>>> get_activation('relu')
ReLU()

You can also pass kwargs, although this is slightly silly since you may as well use the actual module at that point:

>>> get_activation('prelu', num_parameters=6)
PReLU(num_parameters=6)

The name will be normalized, so don't worry about casing:

>>> get_activation('mish')
Mish()
>>> get_activation('MISH')
Mish()
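
Since activations can be looked up from plain strings, this is handy when wiring models from config values; a minimal sketch (make_block is a hypothetical helper, not part of hearth):

>>> from torch import nn
>>> from hearth.activations import get_activation
>>>
>>> def make_block(in_features, out_features, activation='mish'):
...     # build a linear layer followed by an activation chosen by name
...     return nn.Sequential(nn.Linear(in_features, out_features),
...                          get_activation(activation))
>>>
>>> make_block(4, 8, activation='relu')
Sequential(
  (0): Linear(in_features=4, out_features=8, bias=True)
  (1): ReLU()
)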

mish

mish(x)

Functional version of the mish activation; see Mish for more info.

Return type

Tensor
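
A short usage sketch, mirroring the Mish module example above (same input, so the same values are expected):

>>> import torch
>>> from hearth.activations import mish
>>>
>>> x = torch.linspace(-2, 2, 8)
>>> mish(x)
tensor([-0.2525, -0.3023, -0.2912, -0.1452,  0.1969,  0.7174,  1.3256,  1.9440])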