LeakyReLU layer
- Original link: https://keras.io/api/layers/activation_layers/leaky_relu/
- Last verified: 2024-11-25
LeakyReLU class

```python
keras.layers.LeakyReLU(negative_slope=0.3, **kwargs)
```
Leaky version of a Rectified Linear Unit activation layer.
This layer allows a small gradient when the unit is not active.
Formula:

```
f(x) = negative_slope * x if x < 0
f(x) = x if x >= 0
```
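To make the formula concrete, here is a minimal NumPy sketch of the same piecewise function (an illustration only, not the Keras implementation):

```python
import numpy as np

def leaky_relu(x, negative_slope=0.3):
    # Negative inputs are scaled by negative_slope; non-negative inputs pass through.
    return np.where(x < 0, negative_slope * x, x)

print(leaky_relu(np.array([-2.0, 0.0, 3.0])))  # [-0.6  0.   3. ]
```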
Example
```python
import numpy as np
from keras.layers import LeakyReLU

leaky_relu_layer = LeakyReLU(negative_slope=0.5)
input = np.array([-10, -5, 0.0, 5, 10])
result = leaky_relu_layer(input)  # [-5.0, -2.5, 0.0, 5.0, 10.0]
```
Arguments
- `negative_slope`: Float >= 0.0. Negative slope coefficient. Defaults to `0.3`.
- `**kwargs`: Base layer keyword arguments, such as `name` and `dtype`.
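As an illustration of the base layer keyword arguments, the sketch below passes `name` and `dtype` through `**kwargs` when stacking the layer in a model (the layer name and input shape here are arbitrary, assumed for the example):

```python
import keras
from keras import layers

# name and dtype are base Layer arguments forwarded via **kwargs.
model = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(16),
    layers.LeakyReLU(negative_slope=0.1, name="lrelu_1", dtype="float32"),
])
model.summary()
```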