ReLU Calculator
Calculate the ReLU activation function and its variants for neural networks.
ReLU Parameters

f(x) = max(0, x)

Example: ReLU(-0.5) = 0.000000
Derivative: 0.000000
Status: Inactive (the neuron does not fire for a negative input)

Variant: ReLU — standard ReLU, which outputs 0 for all negative inputs.
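The calculation above can be sketched in a few lines of Python. This is a minimal illustration, not the calculator's actual implementation; the function names `relu` and `relu_derivative` are chosen here for clarity, and the derivative at exactly x = 0 is set to 0 by convention.

```python
def relu(x):
    """Standard ReLU: f(x) = max(0, x)."""
    return max(0.0, x)

def relu_derivative(x):
    """ReLU derivative: 1 for x > 0, else 0 (0 at x = 0 by convention)."""
    return 1.0 if x > 0 else 0.0

# Reproduce the example above: a negative input yields 0,
# a zero derivative, and an inactive neuron.
x = -0.5
print(f"ReLU({x}) = {relu(x):.6f}")          # 0.000000
print(f"Derivative = {relu_derivative(x):.6f}")  # 0.000000
print("Status:", "Active" if relu(x) > 0 else "Inactive")
```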
Batch Results

x       ReLU(x)   Status
-2.0    0.0000    Off
-1.0    0.0000    Off
 0.0    0.0000    Off
 1.0    1.0000    Active
 2.0    2.0000    Active
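The batch results can be reproduced with a simple loop. This is a sketch assuming the calculator applies standard ReLU element-wise and marks a neuron "Active" only when the output is strictly positive (note that x = 0 is reported as Off above).

```python
def relu(x):
    """Standard ReLU: f(x) = max(0, x)."""
    return max(0.0, x)

# Inputs from the batch table above.
inputs = [-2.0, -1.0, 0.0, 1.0, 2.0]

for x in inputs:
    y = relu(x)
    # "Active" only for strictly positive output, matching the table.
    status = "Active" if y > 0 else "Off"
    print(f"x = {x:5.1f}   ReLU(x) = {y:.4f}   {status}")
```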