# Activation Layers
## simplegrad.nn.activation_layers.ReLU
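No docstring survives for this entry. As an illustration only, here is the rectified-linear function such a layer computes, sketched in plain numpy (this is not the simplegrad implementation itself):

```python
import numpy as np

def relu(x):
    # ReLU zeroes negative entries and passes non-negative ones through.
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]
```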
## simplegrad.nn.activation_layers.Softmax

Bases: `Module`

Softmax activation layer.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `dim` | `int \| None` | Dimension to normalize over. Defaults to `None` (all elements). | `None` |
Source code in `simplegrad/nn/activation_layers.py`
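To make the documented `dim` semantics concrete, here is a numerically stable softmax sketched in plain numpy: `dim=None` normalizes over all elements, matching the documented default, while an integer normalizes along that axis. This illustrates the math only, not the simplegrad source:

```python
import numpy as np

def softmax(x, dim=None):
    # Subtracting the max before exp() avoids overflow without
    # changing the result (softmax is shift-invariant).
    shifted = x - x.max(axis=dim, keepdims=True)
    e = np.exp(shifted)
    # dim=None sums over every element; an int sums along that axis.
    return e / e.sum(axis=dim, keepdims=True)

logits = np.array([[1.0, 2.0, 3.0],
                   [1.0, 1.0, 1.0]])
print(softmax(logits, dim=1))  # each row sums to 1
print(softmax(logits))         # all six entries sum to 1
```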
## simplegrad.nn.activation_layers.Tanh
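No docstring survives for this entry either. Such a layer applies the hyperbolic tangent elementwise, squashing inputs into (-1, 1); a minimal numpy illustration of that function (not the simplegrad implementation):

```python
import numpy as np

x = np.array([-1.0, 0.0, 1.0])
out = np.tanh(x)  # elementwise; tanh(0) == 0, outputs lie in (-1, 1)
print(out)
```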
## simplegrad.nn.activation_layers.Sigmoid

Bases: `Module`

Sigmoid activation layer: `1 / (1 + exp(-x))`.
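The formula above can be sketched directly in numpy; again, this illustrates the math rather than the simplegrad source:

```python
import numpy as np

def sigmoid(x):
    # Maps any real input into (0, 1), with sigmoid(0) == 0.5.
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-2.0, 0.0, 2.0])))
```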