Min-Max Normalization Calculator
Scale any dataset to a custom range instantly
Min-max normalization (also called min-max scaling or feature rescaling) transforms a dataset so that all values fall within a chosen target range, most commonly [0, 1]. It is one of the most widely used preprocessing steps in machine learning, ensuring that features with larger numerical ranges do not dominate those with smaller ones during model training.
This calculator applies the min-max formula to your dataset instantly, showing a preview of the normalized values alongside the original statistics.
How to Use This Calculator
- Enter your dataset — comma-separated or newline-separated numbers.
- Set the target minimum and maximum — the desired output range. Defaults are 0 and 1.
- Choose round digits — how many decimal places to show in the output (default 4).
- Review the results — the calculator shows the original min/max, the number of values normalized, and a preview of all transformed values.
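The input-handling step above (accepting comma- or newline-separated numbers) can be sketched in a few lines of Python. The `parse_dataset` name is hypothetical, not the calculator's actual code:

```python
import re

def parse_dataset(text):
    """Split input on commas and newlines, skip blanks, convert to floats.
    (Hypothetical helper mirroring the calculator's input handling.)"""
    tokens = [t for t in re.split(r"[,\n]", text) if t.strip()]
    return [float(t) for t in tokens]

print(parse_dataset("1, 2.5\n3,4"))  # [1.0, 2.5, 3.0, 4.0]
```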
The Min-Max Formula
For each value x in a dataset with minimum x_min and maximum x_max, the normalized value is:

x' = (x - x_min) / (x_max - x_min) × (target_max - target_min) + target_min

For the default [0, 1] range, this simplifies to:

x' = (x - x_min) / (x_max - x_min)
The minimum of the original data always maps to target_min, and the maximum maps to target_max. Note that if every value in the dataset is identical, x_max - x_min is zero and the formula is undefined, so implementations must handle this case separately.
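The formula translates directly into Python. The `min_max` name and the zero-range convention (mapping a constant dataset to target_min) are my additions, not part of the calculator's specification:

```python
def min_max(x, x_min, x_max, target_min=0.0, target_max=1.0):
    """Apply x' = (x - x_min) / (x_max - x_min) * (target_max - target_min) + target_min."""
    if x_max == x_min:
        # Zero range: the formula is undefined, so map everything to target_min.
        return target_min
    return (x - x_min) / (x_max - x_min) * (target_max - target_min) + target_min
```

For example, with data spanning 10 to 50, the value 30 maps to 0.5 in [0, 1] and to 0.0 in [-1, 1], confirming that the midpoint of the data lands at the midpoint of the target range.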
When to Use Min-Max Normalization
- Neural networks: Gradient descent converges faster when inputs are scaled to [0, 1] or [-1, 1].
- k-NN and k-means: Distance-based algorithms are dominated by features with large ranges if not normalized.
- Image processing: Pixel values are normalized from [0, 255] to [0, 1].
- Custom ranges: Sometimes features need to be in [-1, 1] or [0, 100] for a specific model or visualization.
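A custom target range plugs straight into the general formula. Here is a sketch scaling illustrative sensor temperatures into [-1, 1], rounded to the calculator's default 4 digits; the sample values are made up for the example:

```python
data = [-10.0, 0.0, 25.0, 40.0]   # illustrative temperatures in °C
lo, hi = -1.0, 1.0                # custom target range [-1, 1]

d_min, d_max = min(data), max(data)
scaled = [round((x - d_min) / (d_max - d_min) * (hi - lo) + lo, 4) for x in data]
print(scaled)  # [-1.0, -0.6, 0.4, 1.0]
```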
Min-Max vs. Z-Score Normalization
Min-max normalization fits data into a bounded range but is sensitive to outliers — a single extreme maximum compresses all other values toward target_min (and an extreme minimum compresses them toward target_max). Z-score normalization (standardization) rescales data to have mean 0 and standard deviation 1; it produces unbounded output but distorts the bulk of the data less when outliers are present, and it suits approximately normally distributed features. For inputs with known, fixed bounds — such as pixel intensities — min-max scaling to [0, 1] is usually the preferred choice in deep learning.
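The outlier sensitivity is easy to demonstrate. In this sketch (invented data, population standard deviation via the standard library), one extreme value pins min-max's top of the range and squeezes the other four values near 0, while the z-scores stay more spread out:

```python
import statistics

data = [1.0, 2.0, 3.0, 4.0, 100.0]  # one extreme outlier

# Min-max: the outlier claims 1.0 and crowds everything else near 0.
d_min, d_max = min(data), max(data)
mm = [round((x - d_min) / (d_max - d_min), 3) for x in data]

# Z-score: centre on the mean, divide by the (population) standard deviation.
mu, sigma = statistics.mean(data), statistics.pstdev(data)
z = [round((x - mu) / sigma, 3) for x in data]

print(mm)  # [0.0, 0.01, 0.02, 0.03, 1.0]
print(z)
```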
Real-World Examples
House prices: Prices range from $120,000 to $2,000,000. Normalizing to [0, 1] allows a regression model to treat price on equal footing with other features like square footage.
Sensor readings: Temperature readings from -10°C to 40°C are normalized to [0, 1] for input into a classification model.
Survey responses: A Likert scale from 1–5 is normalized to [0, 1] to combine with other features in a recommendation engine.
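The house-price example above works out as follows; the two intermediate prices are illustrative values I chose, with only the $120,000–$2,000,000 bounds coming from the text:

```python
# Rescale house prices to [0, 1]; bounds from the example, middle values illustrative.
prices = [120_000, 350_000, 740_000, 2_000_000]
p_min, p_max = min(prices), max(prices)
normalized = [round((p - p_min) / (p_max - p_min), 4) for p in prices]
print(normalized)  # [0.0, 0.1223, 0.3298, 1.0]
```

The cheapest house maps exactly to 0 and the most expensive to 1, so the feature now occupies the same numeric scale as other normalized inputs like square footage.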