>>63939795
>slightly drunk right now but not quite sure i follow, man
Nothing can go wrong with a stoned programmer telling a tipsy programmer how to normalise data inputs on a weapons board, so here we go.
You don't want to feed inputs through as raw measurements, i.e. no units.
This is because the code that interprets inputs doesn't want to think about what a 215-pixel cursor offset or a 27° deflection means for the control, and whether that's different for a joystick versus a thumbstick.
Instead, you send a percentage. Whether it's a thumbstick, mouse cursor, joystick, touch pad or USB butt plug, the device just sends its input as a percentage of the range between its minimum and maximum values, probably as an X and a Y.
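So what every device driver actually hands to the flight code is just a pair of signed fractions with no units attached. Rough sketch in C (made-up names, shown here as -1..+1 instead of 0-100% but it's the same idea):

/* device-agnostic input: no pixels, no degrees, just fraction of maximum */
typedef struct {
    float x;   /* -1.0 = full left,  +1.0 = full right */
    float y;   /* -1.0 = full down,  +1.0 = full up    */
} control_input_t;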
How you do this with a mouse like in Freelancer (objectively the best mouse UI for FPV spaceflight simulators period) is to take the distance between the reticle at screen centre and the display edges and send the cursor offset as a percentage of that. E.g. the cursor is 25% of maximum below the reticle and 17% of maximum to the right of it.
The flight simulation can then interpret this as a request to pitch by 25% of maximum pitch power and yaw by 17% of maximum yaw power in the appropriate directions.
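Sketch of the mouse version in C (numbers and names are invented for illustration, not actual Freelancer code; assumes the reticle sits dead centre of a 1080p screen with a 20 px radius):

/* distance from reticle edge to screen edge, i.e. the "100%" point */
#define MAX_OFFSET_Y (540.0f - 20.0f)   /* 520 px on a 1080-high screen */
#define MAX_OFFSET_X (960.0f - 20.0f)   /* 940 px on a 1920-wide screen */

/* cursor offset from reticle centre in px -> signed fraction of maximum */
static float axis_fraction(float offset_px, float max_offset_px)
{
    float f = offset_px / max_offset_px;
    if (f >  1.0f) f =  1.0f;   /* clamp in case the cursor sits past the edge */
    if (f < -1.0f) f = -1.0f;
    return f;
}

/* cursor 25% below / 17% right of the reticle then becomes:
 *   pitch_cmd = axis_fraction(dy, MAX_OFFSET_Y) * max_pitch_rate;   -> 0.25 of max
 *   yaw_cmd   = axis_fraction(dx, MAX_OFFSET_X) * max_yaw_rate;     -> 0.17 of max */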
When units (° or pixels or whatever) are removed from the equation, you get a value that is "normal" across different paradigms or systems and still works; sometimes this is called a proportional value. In statistics, the process of converting a unit value to an "independent", unitless value like this is called Normalisation.
How you do this varies with context, but the general process is to take the value you've got (215 pixels), subtract the minimum value from it (say 5 pixels for a window border or something), and subtract it from the maximum as well. Now the input value and the maximum have been shifted from the 5-to-520 pixel range (assuming 1080p and a 20-pixel-radius reticle, so 540 − 20 = 520 from reticle edge to screen edge) down to a 0-515 range, which is slightly easier to deal with mathematically.
Now we convert this (210/515, roughly 41%) to a percentage by the usual method and we're done.
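In code the whole min-max thing is one line; here it is with the numbers from above (5 px minimum, 520 px maximum, 215 px input), everything else is just illustrative scaffolding:

#include <stdio.h>

/* min-max normalisation: map a value from [min, max] onto [0, 1] */
static float normalise(float value, float min, float max)
{
    return (value - min) / (max - min);   /* (215 - 5) / (520 - 5) = 210 / 515 */
}

int main(void)
{
    float pct = 100.0f * normalise(215.0f, 5.0f, 520.0f);
    printf("%.1f%%\n", pct);   /* prints roughly 40.8% */
    return 0;
}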