## Activations
`simplegrad.functions.activations.relu(x: Tensor) -> Tensor`

Apply the rectified linear unit elementwise: `max(x, 0)`.

`simplegrad.functions.activations.tanh(x: Tensor) -> Tensor`

Apply the hyperbolic tangent elementwise.

`simplegrad.functions.activations.sigmoid(x: Tensor) -> Tensor`

Apply the logistic sigmoid elementwise: `1 / (1 + exp(-x))`.
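simplegrad's own `Tensor` implementation isn't shown here, so as a reference for the math these three activations compute, here is a minimal NumPy sketch (the function names mirror the API above, but these operate on plain arrays and do not track gradients):

```python
import numpy as np

def relu(x):
    # Elementwise max(x, 0).
    return np.maximum(x, 0.0)

def tanh(x):
    # Elementwise hyperbolic tangent.
    return np.tanh(x)

def sigmoid(x):
    # Elementwise logistic function 1 / (1 + exp(-x)).
    return 1.0 / (1.0 + np.exp(-x))
```

All three map each element independently, so the output shape always matches the input shape.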
`simplegrad.functions.activations.softmax(x: Tensor, dim: int | None = None) -> Tensor`
Apply softmax along the given dimension.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `x` | `Tensor` | Input tensor. | *required* |
| `dim` | `int \| None` | Dimension to normalize over. If `None`, normalizes over all elements. | `None` |
Returns:

| Type | Description |
|---|---|
| `Tensor` | Tensor where values along `dim` are non-negative and sum to 1. |
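The `dim=None` behavior above can be illustrated with a NumPy sketch (an assumption about the semantics, not simplegrad's actual implementation): with no dimension given, the exponentials are normalized by the sum over every element; otherwise the sum runs along `dim` only. Subtracting the maximum first is the standard numerical-stability trick and does not change the result.

```python
import numpy as np

def softmax(x, dim=None):
    if dim is None:
        # Normalize over all elements of the tensor.
        e = np.exp(x - x.max())
        return e / e.sum()
    # Subtract the max along `dim` for numerical stability,
    # then normalize so slices along `dim` sum to 1.
    e = np.exp(x - x.max(axis=dim, keepdims=True))
    return e / e.sum(axis=dim, keepdims=True)
```

For a 2-D input, `softmax(x, dim=1)` makes each row sum to 1, while `softmax(x)` makes the whole array sum to 1.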