This paper presents a novel collaborative haptic hand user interface that recognizes the geometric hand parameters of each individual user (e.g. finger lengths, joint locations, and hand posture) in real time. Unlike conventional user interfaces such as the keyboard or mouse, which permit human-machine information transfer only through one-dimensional interactions at the fingertips, this system can interpret the intent of a user's movement from additionally harvested hand gesture information. The high number of degrees of freedom (DOF) and the dexterity of the human hand make it well suited to controlling, or communicating the particularities of, multi-variable complex physical systems such as a nuclear reactor or its sub-systems. Although unconventional hand gesture monitoring systems exist (e.g. CyberGrasp and Leap Motion), they lack the full bilateral haptic communication provided by the developed system, and their gesture identification methods are restricted either by a glove, which is cumbersome, or by visual means, which suffer intermittent data loss due to shadowing effects. Key enabling features of the device include 3-DOF haptic feedback at the fingertips, high-rate (>1 kHz) FPGA-based position and force monitoring, and a patented, efficient user hand geometry calibration method. The calibration method, based on sphere mathematics and statistical confidence, requires only ~10 s of the user fluttering their fingers. The resulting gesture information was shown to directly communicate gestures to a robotic hand, and the system promises fuller bilateral haptic communication between the user's hand and their cognition than any other available device.
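To make the sphere-mathematics idea concrete, the sketch below shows one common formulation: a fingertip rotating about a (momentarily) fixed joint traces points on a sphere, so fitting a sphere to sampled fingertip positions recovers the joint location (the centre) and the effective link length (the radius). This is a minimal illustrative sketch, assuming a linear least-squares sphere fit; the function name, data, and formulation are the author's assumptions for exposition, not the paper's patented calibration method, and the least-squares residual is only a crude stand-in for the statistical-confidence test the abstract mentions.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit to Nx3 fingertip samples.

    From |p - c|^2 = r^2 we get the linear system
        2 c . p + (r^2 - |c|^2) = |p|^2,
    solved here for the centre c and the constant d = r^2 - |c|^2.
    """
    points = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = np.sum(points**2, axis=1)
    sol, residuals, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    # 'residuals' could feed a confidence check before accepting the fit.
    return center, radius

# Illustrative example: noisy fingertip samples fluttering about a
# hypothetical joint at (10, 0, 5) cm with a 4.2 cm link.
rng = np.random.default_rng(0)
true_center, true_radius = np.array([10.0, 0.0, 5.0]), 4.2
polar = rng.uniform(0.3, 1.3, 200)    # flutter must span a 2D patch,
azim = rng.uniform(0.0, 1.5, 200)     # else the sphere fit is degenerate
pts = true_center + true_radius * np.column_stack([
    np.sin(polar) * np.cos(azim),
    np.sin(polar) * np.sin(azim),
    np.cos(polar),
]) + rng.normal(0.0, 0.02, (200, 3))  # simulated sensor noise

center, radius = fit_sphere(pts)
print(f"joint centre ~ {center.round(2)}, link length ~ {radius:.2f} cm")
```

A few hundred samples at the reported >1 kHz monitoring rate would accumulate in well under a second per joint, which is consistent with a whole-hand calibration completing in roughly ten seconds of fluttering.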