Mish

class torch.nn.Mish(inplace=False) [source]

Applies the Mish function, element-wise. Mish is a self-regularized, non-monotonic neural activation function:

\text{Mish}(x) = x * \text{Tanh}(\text{Softplus}(x))

Note: See Mish: A Self Regularized Non-Monotonic Neural Activation Function (arXiv:1908.08681).

Parameters:
    inplace (bool): can optionally do the operation in-place. Default: False

Shape:
    Input: (*), where * means any number of dimensions.
    Output: (*), same shape as the input.

Examples:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.Mish()
>>> input = torch.randn(2)
>>> output = m(input)
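
Because Mish is exactly the composition above, the module can be cross-checked against the primitives torch.tanh and torch.nn.functional.softplus. A minimal sketch follows; the helper name mish_manual is hypothetical, chosen for illustration, and agreement holds up to floating-point tolerance:

>>> import torch
>>> import torch.nn as nn
>>> import torch.nn.functional as F
>>> def mish_manual(x):
...     # hypothetical helper: compose Mish from primitives as x * tanh(softplus(x))
...     return x * torch.tanh(F.softplus(x))
>>> x = torch.randn(5)
>>> torch.allclose(nn.Mish()(x), mish_manual(x))
True

The functional counterpart torch.nn.functional.mish(input) applies the same activation without constructing a module, which is convenient inside a custom forward() method.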