This is me coming from the component-supply side of the (mostly medical and sat-com) robotics/servo industry. Specifically, I worked on motor hardware.

It was common in super-precision applications such as semiconductor manufacturing for our applications team to calibrate a motion profile. That is, the customer would measure the motion stage we or they had built in its final assembled state (all bolts torqued, etc., nothing getting shifted or trammed afterward). This gave us a position-commanded vs. position-measured profile, which was then loaded into the calibration files for the motion controller. In this way, repeatability and zero backlash were the only concerns regarding the motion components; the applications team regularly achieved sub-micron positioning across large physical distances.
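To make the idea concrete, here is a minimal sketch of how a commanded-vs.-measured profile can be inverted into a correction lookup, assuming good repeatability and zero backlash. All names and numbers are illustrative; no real controller's calibration format is being described.

```python
import numpy as np

# Calibration run (made-up data): positions we commanded (mm) and what the
# metrology equipment actually measured at each point. The difference
# captures systematic error (screw pitch error, thermal growth, etc.).
commanded = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
measured = np.array([0.0, 10.002, 19.997, 30.005, 39.996])

def corrected_command(target, measured, commanded):
    """Return the command to send so the stage actually lands on `target`.

    We invert the profile by interpolating over the *measured* positions
    to find which command produced the desired position.
    """
    return np.interp(target, measured, commanded)

# This axis overshoots near 10 mm, so to land on exactly 10.000 mm
# we command slightly less than 10.
cmd = corrected_command(10.0, measured, commanded)
```

Because the stage is repeatable, applying this map every move turns precision into accuracy, which is the whole point of the calibration service.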

That said, it was a goddamned expensive service, BUT the real cost was the measuring equipment; the motion hardware itself was sometimes shockingly cheap. In every case our customers supplied the measuring hardware themselves, since they had a specific need for it and it was often a custom in-house design, but these are nanometer-scale manufacturers, so...

My thought though is: is anyone using motion controllers with this kind of capability? It's a simple 1D map per axis, or at worst a full spatial array per axis, with simplification possible so that a 5-axis system wouldn't need a full 5D calibration array, but rather something like a 2D+1D or 3D array plus a 2D sub-array (much less data). If so, I would think repeatability and zero backlash could be the basis I design a future build on, rather than chasing absolute accuracy in the components. It's the old precision vs. accuracy issue: you can have precision without accuracy, and if you have repeatability, then accuracy is just a matter of calibration.
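The data-size argument above can be sketched with some back-of-envelope numbers. This assumes axis errors are mostly independent, with one coupled pair (say X-Y flatness) kept as a small 2D table; the grid size and axis names are made up for illustration.

```python
import numpy as np

N = 100                            # calibration grid points per axis
grid = np.linspace(0.0, 500.0, N)  # commanded positions, mm (illustrative)

# Made-up error maps: mm of correction at each grid point.
rng = np.random.default_rng(0)
x_map = rng.normal(0, 1e-3, N)
y_map = rng.normal(0, 1e-3, N)
xy_coupling = rng.normal(0, 1e-4, (N, N))  # small X-Y cross term

def correction(x_cmd, y_cmd):
    """Total correction at one X-Y point: separable terms + cross term."""
    cx = np.interp(x_cmd, grid, x_map)
    cy = np.interp(y_cmd, grid, y_map)
    # Nearest-grid lookup into the small 2D coupling table.
    i = min(np.searchsorted(grid, x_cmd), N - 1)
    j = min(np.searchsorted(grid, y_cmd), N - 1)
    return cx + cy + xy_coupling[i, j]

# Storage comparison: a dense 5-axis grid vs. the separable form.
full_entries = N**5              # 10,000,000,000 values
separable_entries = 5 * N + N**2  # 10,500 values
```

Six orders of magnitude less data, which is why the separable form fits in an ordinary motion controller's memory while the dense array never could.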

All feedback is much appreciated!