Tolerances define the acceptable range of variation in a manufactured dimension — the difference between a part that fits and one that must be scrapped. In mechanical engineering, tolerance is expressed as the difference between the maximum and minimum allowable dimension. For example, a shaft specified at 25.000 mm with a tolerance of ±0.025 mm must measure between 24.975 mm and 25.025 mm to pass inspection.

The ISO 286 standard (also known as the IT grades system) defines 20 tolerance grades from IT01 (finest, about 0.3 micrometers for small parts) to IT18 (coarsest, several millimeters), giving engineers a standardized language for specifying precision. Tighter tolerances dramatically increase manufacturing cost — achieving IT6 precision (about 13 micrometers for a 25 mm dimension) typically requires grinding, while IT11 (about 130 micrometers) can be achieved with standard milling.

Stack-up analysis, which calculates how individual part tolerances accumulate in an assembly, determines whether mating parts will fit correctly in the worst case. Whether you are designing a press-fit bearing housing, a sliding shaft, or a clearance hole for a bolt, selecting the right tolerance balances functional requirements against manufacturing feasibility and cost.
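The IT grade values above are not arbitrary: ISO 286-1 derives them from a tolerance unit *i* that grows with part size. A minimal sketch of that calculation for grades IT5 through IT18 (the function and table names here are my own; grades below IT5 use a different formula and are omitted):

```python
import math

# Multipliers of the tolerance unit i for each grade, per ISO 286-1
# (IT5 = 7i, IT6 = 10i, ..., each step roughly a factor of 1.6).
IT_MULTIPLIERS = {5: 7, 6: 10, 7: 16, 8: 25, 9: 40, 10: 64,
                  11: 100, 12: 160, 13: 250, 14: 400, 15: 640,
                  16: 1000, 17: 1600, 18: 2500}

def it_tolerance_um(range_low_mm, range_high_mm, grade):
    """Standard tolerance in micrometers for a nominal size range and IT grade."""
    # D is the geometric mean of the size-range bounds, in mm.
    d = math.sqrt(range_low_mm * range_high_mm)
    # Tolerance unit i in micrometers (valid for sizes up to 500 mm).
    i = 0.45 * d ** (1.0 / 3.0) + 0.001 * d
    return IT_MULTIPLIERS[grade] * i

# A 25 mm dimension falls in the 18-30 mm size range:
print(f"IT6:  {it_tolerance_um(18, 30, 6):.1f} um")   # ~13.1 um
print(f"IT11: {it_tolerance_um(18, 30, 11):.1f} um")  # ~130.7 um
```

The raw values come out near 13.1 μm and 130.7 μm; ISO rounds its tabulated entries, which is where the 13 μm (IT6) and 130 μm (IT11) figures for a 25 mm dimension come from.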
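Worst-case stack-up analysis can be sketched in a few lines: sum the nominals, then assume every tolerance lands in the unfavorable direction. The parts and function names below are hypothetical, purely for illustration:

```python
def worst_case_stack(dims):
    """Worst-case (min, max) of a stacked dimension.

    dims: list of (nominal_mm, plus_tol_mm, minus_tol_mm) tuples.
    """
    nominal = sum(n for n, _, _ in dims)
    # Worst case: all plus tolerances accumulate, or all minus tolerances do.
    total_plus = sum(p for _, p, _ in dims)
    total_minus = sum(m for _, _, m in dims)
    return nominal - total_minus, nominal + total_plus

def passes_inspection(measured_mm, nominal_mm, tol_mm):
    """True if a measurement falls inside a symmetric +/- tolerance band."""
    return nominal_mm - tol_mm <= measured_mm <= nominal_mm + tol_mm

# Hypothetical three-part stack: two 10 mm spacers and a 5 mm bearing race.
stack = [(10.0, 0.05, 0.05), (10.0, 0.05, 0.05), (5.0, 0.02, 0.02)]
lo, hi = worst_case_stack(stack)
print(f"assembly: {lo:.3f} mm to {hi:.3f} mm")  # 24.880 mm to 25.120 mm

# The shaft example from the text: 25.000 mm +/- 0.025 mm.
print(passes_inspection(25.010, 25.000, 0.025))  # True
print(passes_inspection(25.030, 25.000, 0.025))  # False
```

Note that the worst-case band (±0.120 mm here) is wider than any single part's tolerance, which is why assemblies with many parts in a stack often need tighter per-part tolerances, or a statistical (RSS) analysis instead of the worst-case one sketched here.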