Loss Functions

simplegrad.functions.losses.ce_loss(z: Tensor, y: Tensor, dim: int = -1, reduction: str = 'mean') -> Tensor

Compute cross-entropy loss with built-in softmax.

Numerically stable: uses the log-sum-exp trick internally.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `z` | `Tensor` | Logits (raw unnormalized scores), shape `(..., num_classes)`. | *required* |
| `y` | `Tensor` | Target probability distribution, same shape as `z`. | *required* |
| `dim` | `int` | Class dimension to apply softmax over. Defaults to -1 (last dim). | `-1` |
| `reduction` | `str` | How to reduce the per-sample losses. One of `"mean"`, `"sum"`, or `None` (return per-sample losses). | `'mean'` |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | Scalar loss tensor (or per-sample if `reduction=None`). |

Raises:

| Type | Description |
| --- | --- |
| `ValueError` | If `reduction` is not a valid option. |

Source code in simplegrad/functions/losses.py
def ce_loss(z: Tensor, y: Tensor, dim: int = -1, reduction: str = "mean") -> Tensor:
    """Compute cross-entropy loss with built-in softmax.

    Numerically stable: uses the log-sum-exp trick internally.

    Args:
        z: Logits (raw unnormalized scores), shape ``(..., num_classes)``.
        y: Target probability distribution, same shape as ``z``.
        dim: Class dimension to apply softmax over. Defaults to -1 (last dim).
        reduction: How to reduce the per-sample losses. One of ``"mean"``,
            ``"sum"``, or ``None`` (return per-sample losses).

    Returns:
        Scalar loss tensor (or per-sample if ``reduction=None``).

    Raises:
        ValueError: If ``reduction`` is not a valid option.
    """
    if dim >= 0:
        # Normalize to a negative index; `dim > 0` would leave dim=0 unconverted.
        dim = dim - len(z.shape)
    out = _CELoss.apply(z, y, dim, oper=f"CELoss(dim={dim})")
    if reduction == "mean":
        return mean(out)
    elif reduction == "sum":
        return sum(out)
    elif reduction is None:
        return out
    else:
        raise ValueError(f"Invalid reduction: {reduction}")
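The log-sum-exp trick mentioned above can be sketched in plain NumPy. This is an illustrative stand-in for the documented semantics, not simplegrad's actual `_CELoss` implementation: subtracting the per-row maximum before exponentiating keeps `exp()` from overflowing even for very large logits.

```python
import numpy as np

def ce_loss_reference(z, y, dim=-1, reduction="mean"):
    """Cross-entropy with soft targets via the log-sum-exp trick (NumPy sketch)."""
    # Stable log-softmax: log p = (z - max) - log(sum(exp(z - max)))
    z_max = np.max(z, axis=dim, keepdims=True)
    log_probs = z - z_max - np.log(np.sum(np.exp(z - z_max), axis=dim, keepdims=True))
    per_sample = -np.sum(y * log_probs, axis=dim)
    if reduction == "mean":
        return per_sample.mean()
    elif reduction == "sum":
        return per_sample.sum()
    elif reduction is None:
        return per_sample
    raise ValueError(f"Invalid reduction: {reduction}")

# Logits this large would overflow a naive exp(); the shifted form stays finite.
z = np.array([[1000.0, 1001.0], [3.0, 1.0]])
y = np.array([[0.0, 1.0], [1.0, 0.0]])
loss = ce_loss_reference(z, y)
```

With `reduction=None` the function returns one loss per row instead of a scalar, matching the table above.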

simplegrad.functions.losses.mse_loss(p: Tensor, y: Tensor, reduction: str = 'mean') -> Tensor

Compute mean squared error loss: mean((p - y)^2).

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `p` | `Tensor` | Predictions tensor. | *required* |
| `y` | `Tensor` | Targets tensor, same shape as `p`. | *required* |
| `reduction` | `str` | One of `"mean"`, `"sum"`, or `None`. | `'mean'` |

Returns:

| Type | Description |
| --- | --- |
| `Tensor` | Scalar loss tensor (or element-wise if `reduction=None`). |

Raises:

| Type | Description |
| --- | --- |
| `ValueError` | If `reduction` is not a valid option. |

Source code in simplegrad/functions/losses.py
@compound_op
def mse_loss(p: Tensor, y: Tensor, reduction: str = "mean") -> Tensor:
    """Compute mean squared error loss: mean((p - y)^2).

    Args:
        p: Predictions tensor.
        y: Targets tensor, same shape as ``p``.
        reduction: One of ``"mean"``, ``"sum"``, or ``None``.

    Returns:
        Scalar loss tensor (or element-wise if ``reduction=None``).

    Raises:
        ValueError: If ``reduction`` is not a valid option.
    """
    if reduction == "mean":
        return mean((p - y) ** 2)
    elif reduction == "sum":
        return sum((p - y) ** 2)
    elif reduction is None:
        return (p - y) ** 2
    else:
        raise ValueError(f"Invalid reduction: {reduction}")
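The three reduction modes can be checked with a small NumPy stand-in (the real function operates on simplegrad `Tensor`s, but the arithmetic is identical):

```python
import numpy as np

p = np.array([1.0, 2.0, 3.0])  # predictions
y = np.array([1.0, 0.0, 5.0])  # targets

sq = (p - y) ** 2      # reduction=None: element-wise squared errors [0., 4., 4.]
mean_loss = sq.mean()  # reduction="mean": 8/3
sum_loss = sq.sum()    # reduction="sum": 8.0
```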