2.10 Normalization
Normalization is a data pre-processing technique that scales numerical features into a standard range. It brings large and small values to a similar level so that all input features contribute equally to the model, and it improves the convergence of machine learning algorithms.

Need for Normalization
- Features may have very different ranges.
- Large-scale values can dominate small-scale values.
- Some algorithms depend on distance calculations.

Normalization helps:
- Improve model accuracy
- Speed up the training process
- Improve convergence in gradient-based algorithms

Example:
Imagine a dataset with:
- Age = 18 to 60
- Salary = 10,000 to 1,00,000

Salary values are much bigger than age values. If we train a model on this data, it may give more importance to salary, and age may get ignored. So we normalize the features to a common scale.
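As a sketch of the idea above, min-max normalization rescales each feature with x' = (x - min) / (max - min) so it lies in [0, 1]. The sample Age and Salary values below are illustrative, chosen to match the ranges in the example:

```python
# Min-max normalization: x' = (x - min) / (max - min)
# Maps every value of a feature into the range [0, 1].

def min_max_normalize(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Illustrative sample data matching the Age/Salary example.
ages = [18, 25, 40, 60]
salaries = [10_000, 30_000, 55_000, 100_000]

norm_ages = min_max_normalize(ages)
norm_salaries = min_max_normalize(salaries)

# After normalization both features lie in [0, 1],
# so neither dominates distance or gradient computations.
print(norm_ages)
print(norm_salaries)
```

After this step the smallest value of each feature maps to 0 and the largest to 1, so Age and Salary are on the same scale before training.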