
Activations

simplegrad.functions.activations.relu(x: Tensor) -> Tensor

Apply ReLU activation element-wise: max(0, x).

Parameters:

Name  Type    Description    Default
x     Tensor  Input tensor.  required

Returns:

Type    Description
Tensor  Tensor with negative values replaced by zero.

Source code in simplegrad/functions/activations.py
def relu(x: Tensor) -> Tensor:
    """Apply ReLU activation element-wise: max(0, x).

    Args:
        x: Input tensor.

    Returns:
        Tensor with negative values replaced by zero.
    """
    return _Relu.apply(x)
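
For orientation, a minimal NumPy sketch of the same element-wise math (the numpy dependency and the relu_ref name are illustrative, not part of simplegrad):

import numpy as np

def relu_ref(x: np.ndarray) -> np.ndarray:
    # Same math as relu above: negative entries become zero.
    return np.maximum(0.0, x)

print(relu_ref(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]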

simplegrad.functions.activations.tanh(x: Tensor) -> Tensor

Apply hyperbolic tangent element-wise.

Parameters:

Name  Type    Description    Default
x     Tensor  Input tensor.  required

Returns:

Type    Description
Tensor  Tensor with values in (-1, 1).

Source code in simplegrad/functions/activations.py
def tanh(x: Tensor) -> Tensor:
    """Apply hyperbolic tangent element-wise.

    Args:
        x: Input tensor.

    Returns:
        Tensor with values in (-1, 1).
    """
    return _Tanh.apply(x)
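
The underlying math is tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x)), whose standard derivative is 1 - tanh(x)^2 (presumably what _Tanh uses in its backward pass; the class body is not shown here). A minimal NumPy sketch, independent of simplegrad:

import numpy as np

def tanh_ref(x: np.ndarray) -> np.ndarray:
    # tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x)); saturates toward -1 and 1.
    e_pos, e_neg = np.exp(x), np.exp(-x)
    return (e_pos - e_neg) / (e_pos + e_neg)

x = np.array([-1.0, 0.0, 1.0])
print(tanh_ref(x))             # ≈ [-0.7616  0.      0.7616]
print(1.0 - tanh_ref(x) ** 2)  # standard tanh gradient: 1 - tanh(x)^2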

simplegrad.functions.activations.sigmoid(x: Tensor) -> Tensor

Apply sigmoid activation element-wise: 1 / (1 + exp(-x)).

Parameters:

Name  Type    Description    Default
x     Tensor  Input tensor.  required

Returns:

Type    Description
Tensor  Tensor with values in (0, 1).

Source code in simplegrad/functions/activations.py
def sigmoid(x: Tensor) -> Tensor:
    """Apply sigmoid activation element-wise: 1 / (1 + exp(-x)).

    Args:
        x: Input tensor.

    Returns:
        Tensor with values in (0, 1).
    """
    return _Sigmoid.apply(x)
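
Sigmoid maps the real line into (0, 1), with sigmoid(0) == 0.5; its standard derivative is sigmoid(x) * (1 - sigmoid(x)) (presumably what _Sigmoid uses in its backward pass; the class body is not shown here). A minimal NumPy sketch, independent of simplegrad:

import numpy as np

def sigmoid_ref(x: np.ndarray) -> np.ndarray:
    # Same math as sigmoid above: 1 / (1 + exp(-x)).
    return 1.0 / (1.0 + np.exp(-x))

s = sigmoid_ref(np.array([-1.0, 0.0, 1.0]))
print(s)              # ≈ [0.2689 0.5    0.7311]
print(s * (1.0 - s))  # standard sigmoid gradient: sigmoid(x) * (1 - sigmoid(x))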

simplegrad.functions.activations.softmax(x: Tensor, dim: int | None = None) -> Tensor

Apply softmax along the given dimension.

Parameters:

Name  Type        Description                            Default
x     Tensor      Input tensor.                          required
dim   int | None  Dimension to normalize over. If None,  None
                  normalizes over all elements.

Returns:

Type    Description
Tensor  Tensor where values along dim sum to 1.

Source code in simplegrad/functions/activations.py
@compound_op
def softmax(x: Tensor, dim: int | None = None) -> Tensor:
    """Apply softmax along the given dimension.

    Args:
        x: Input tensor.
        dim: Dimension to normalize over. If None, normalizes over all elements.

    Returns:
        Tensor where values along ``dim`` sum to 1.
    """
    exps = exp(x)
    return exps / sum(exps, dim)
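
The compound op is exactly exp(x) normalized by its sum along dim. Note that, as written, it exponentiates without first subtracting the per-dim maximum, so very large inputs can overflow exp in floating point. A minimal NumPy sketch of the same math (independent of simplegrad; softmax_ref is an illustrative name):

import numpy as np

def softmax_ref(x: np.ndarray, dim: int | None = None) -> np.ndarray:
    # Mirrors the compound op above: exp(x) normalized by its sum along dim.
    # Like the source, this skips the usual max-subtraction trick.
    e = np.exp(x)
    return e / e.sum(axis=dim, keepdims=dim is not None)

print(softmax_ref(np.array([1.0, 2.0, 3.0])))  # ≈ [0.0900 0.2447 0.6652]
# With dim=1, each row of a 2-D input sums to 1:
print(softmax_ref(np.array([[1.0, 2.0], [3.0, 4.0]]), dim=1).sum(axis=1))  # [1. 1.]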