Description
A list of potential `weight_decay` values for AutoML to explore.

Weight decay, also known as L2 regularization, is a technique that penalizes large weights in neural networks. It encourages smaller, more stable weight values, which can help the model generalize better and avoid overfitting.

Each value in `weight_decay` must be >= 0.0.
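As a rough illustration of what weight decay does during training, the sketch below applies one SGD update with an L2 penalty. The function and parameter names (`sgd_step`, `lr`, `wd`) are purely illustrative and not part of any AutoML API; with `wd=0.0` (the first default value) the update reduces to plain SGD.

```python
# Illustrative sketch only: names here are hypothetical, not an AutoML API.
# One SGD update with weight decay: w <- w - lr * (g + wd * w).
# The wd * w term pulls every weight toward zero, penalizing large weights.

def sgd_step(weights, grads, lr=0.01, wd=5e-7):
    """Apply one gradient step with L2 weight decay to a list of weights."""
    return [w - lr * (g + wd * w) for w, g in zip(weights, grads)]

weights = [1.0, -2.0]
grads = [0.5, 0.5]
print(sgd_step(weights, grads))          # small wd: barely differs from SGD
print(sgd_step(weights, grads, wd=0.0))  # wd=0.0: plain SGD, no decay
```

Larger `wd` values shrink weights more aggressively each step, which is why the search space ranges from 0.0 (no regularization) up through progressively stronger penalties.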
Supported Task Types
- All
Default Values
| run_mode | Default Value |
|---|---|
| FAST | [0.0, 5e-8, 5e-7, 5e-6] |
| NORMAL | [0.0, 5e-8, 5e-7, 5e-6] |
| BEST | [0.0, 5e-8, 5e-7, 5e-6] |