Softshrink

class torch.nn.Softshrink(lambd=0.5)[source]

Applies the soft shrinkage function element-wise.

\text{SoftShrinkage}(x) =
\begin{cases}
x - \lambda, & \text{if } x > \lambda \\
x + \lambda, & \text{if } x < -\lambda \\
0, & \text{otherwise}
\end{cases}

Parameters
    lambd (float) – the λ (must be no less than zero) value for the Softshrink formulation. Default: 0.5

Shape:
    Input: (*), where * means any number of dimensions.
    Output: (*), same shape as the input.

Examples:

>>> m = nn.Softshrink()
>>> input = torch.randn(2)
>>> output = m(input)

extra_repr()[source]

    Return the extra representation of the module.

    Return type
        str

forward(input)[source]

    Run forward pass.

    Return type
        Tensor
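
A minimal sketch (not part of the official docs) checking that the module's output matches the piecewise formula above computed directly with torch.where; the tensor values and lambd used here are illustrative:

>>> import torch
>>> from torch import nn
>>> lambd = 0.5
>>> x = torch.tensor([-1.2, -0.3, 0.0, 0.4, 2.0])
>>> # piecewise formula: x - λ if x > λ, x + λ if x < -λ, else 0
>>> manual = torch.where(x > lambd, x - lambd,
...                      torch.where(x < -lambd, x + lambd, torch.zeros_like(x)))
>>> m = nn.Softshrink(lambd)
>>> torch.allclose(m(x), manual)
True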