Your view that there is one true way of spectrum gradation under restricted device domains is not universally agreed upon, neither among designers nor among the ‘normal’ population. No other way is the one true way, either! What works well for a PC sRGB display will be drab and lifeless on a laser cinema projection system. Printers have faced this problem for centuries and innovated to use glass (long ago) and brushed-metal prints (more recently) as a way to regain control of luminosity independent of saturation and to express their designs in less restrictive ways. The only ‘normal’ in this scenario is the shared agreements we make, and the primacy of sRGB’s desaturated brights as the sole exclusive agreement is ending.
(Since this reminded me of it, a random pro tip: For those using macOS Terminal.app, you can redefine the 16 ANSI colors using the full P3 colorspace, so long as you use the full GUI color picker rather than the sRGB-limited #rrggbb entry method. Access to improved saturation helps the eye distinguish different shades in the dim and bright color sets more effectively without having to alter their brightness. It won’t improve the limitations of ANSI color as a whole — the insistence on Luminosity = f(R,G,B) is baked into everyone’s assumptions quite deeply thanks to sRGB! — but it does at least mean you can have seven equidistant and non-desaturated, non-sRGB colors at two levels of brightness for syntax highlighting and other typical 16-color uses.)
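(As an aside on that “Luminosity = f(R,G,B)” point: this is the standard relative-luminance calculation for sRGB, where you undo the gamma encoding and take a weighted sum of the linear-light channels. A minimal Python sketch, using the well-known IEC 61966-2-1 transfer function and Rec. 709 luma weights:)

```python
def srgb_to_linear(c):
    # Undo the sRGB gamma encoding (IEC 61966-2-1 EOTF)
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(r, g, b):
    # Luminosity really is a pure function of (R, G, B):
    # a weighted sum of the linearized channels (Rec. 709 weights)
    rl, gl, bl = (srgb_to_linear(c) for c in (r, g, b))
    return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
```

(So with sRGB inputs, two colors with the same channel values can never differ in luminosity, which is exactly the baked-in assumption being complained about.)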
It's normal and expected for saturation to change, and for brightness to get clipped, but not for hue to change.
That's the critique of OKLCH here: that shifting hue during gamut mapping is a bizarre and undesirable choice.
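(A quick stdlib illustration of the distinction, using HSV hue as a cheap stand-in for perceptual hue; the out-of-gamut triple is a made-up over-bright orange. Naive per-channel clamping shifts the hue, whereas scaling the whole triple down into gamut, at the cost of brightness, does not:)

```python
import colorsys

def clamp_per_channel(rgb):
    # Naive gamut clip: clamp each channel to [0, 1] independently.
    # This changes the ratios between channels, so the hue drifts.
    return tuple(min(max(c, 0.0), 1.0) for c in rgb)

def scale_to_gamut(rgb):
    # Hue-preserving alternative: scale the whole triple down uniformly,
    # trading brightness for an unchanged channel ratio (and thus hue)
    m = max(rgb)
    return tuple(c / m for c in rgb) if m > 1.0 else rgb

def hue(rgb):
    # HSV hue in [0, 1) — colorsys tolerates out-of-range inputs
    return colorsys.rgb_to_hsv(*rgb)[0]

out_of_gamut = (1.2, 0.4, 0.2)  # hypothetical over-bright orange
```

(This is only a toy model of the trade-off being argued about, not how OKLCH-based gamut mapping actually works; real perceptual gamut mapping operates in a uniform color space, not on raw RGB ratios.)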