Designing Natural Interaction in 3D Unity Applications
- Chris Burgess
- Jan 11, 2024
- 4 min read
Updated: Nov 3
Why flexible hand tracking tools are the key to building intuitive, human-feeling experiences in Unity.

Since the inception of virtual reality, gaming controllers have dominated as the primary means of interaction in the virtual world. Users manipulate virtual hands using buttons and joysticks, creating an experience that feels anything but natural. Despite the promise of hand tracking, the industry has been slow to fully embrace it, often retrofitting traditional controller inputs rather than pioneering hands-first interaction models.
If we want virtual worlds to feel truly human, interaction must evolve. Natural interaction is more than a design choice; it represents a shift toward experiences that mirror how we move, reach, and touch in real life.
Unity's introduction of the XR Interaction Toolkit (XRI) marked a significant step forward, offering a framework to support both controllers and hand tracking. However, this dual approach often results in compromises that reduce the potential of true natural interaction in Unity.
That tension between accessibility and authenticity was exactly where our work at Ultraleap began.
As the product lead on developer tooling at Ultraleap, my mission was clear: make it as intuitive as possible for developers to harness the power of Ultraleap hand tracking. Unity’s dominance in XR application development is undeniable, so we invested heavily in building a Unity plugin designed to integrate seamlessly with the engine.
Over the three years I was immersed in hand tracking development, it became evident that developers needed greater freedom to innovate: tools that empower, not constrain. To make natural interaction possible, developers must have the flexibility to design how hands behave, not just how they look.

To address these challenges, we developed a toolset we called Physical Hands. Unlike other tools on the market, it offers unprecedented flexibility in user interactions and virtual hand representation through three distinct Contact Modes: Hard Contact, Soft Contact, and No Contact. The following breaks these modes down to explain when and why you'd use them.
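In practice, choosing a mode is a per-scene (or even runtime) decision, which is useful when prototyping which interaction style fits your application. The sketch below is illustrative only: the enum and the idea of cycling modes for testing are assumptions on my part, not the plugin's confirmed API, so check the open-source repository for the actual class names.

```csharp
using UnityEngine;

// Hypothetical sketch: names here are assumptions, not the confirmed plugin API.
public class ContactModeSwitcher : MonoBehaviour
{
    // The three interaction styles described in this article.
    public enum ContactMode { HardContact, SoftContact, NoContact }

    [SerializeField] private ContactMode mode = ContactMode.HardContact;

    void Update()
    {
        // Example: let a tester cycle modes at runtime to compare how each feels.
        if (Input.GetKeyDown(KeyCode.Space))
        {
            mode = (ContactMode)(((int)mode + 1) % 3);
            Debug.Log($"Switched to {mode}");
            // In the real plugin you would apply the new mode to the hands manager here.
        }
    }
}
```

Being able to flip between modes live, rather than rebuilding, makes it much easier to judge which one suits a given scene.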
Hard Contact: Redefining Natural Interaction
The first step toward natural interaction is making virtual touch behave like physical touch.
Hard Contact negates the need for instructions. Users can flick, push, pull, grab, and throw objects just as they would in the real world. This mode's intuitive nature minimises the learning curve, making it ideal for enterprise training scenarios. The visual alignment of the virtual hand with physical interactions maintains VR immersion, ensuring that every movement feels authentic and seamless.
Yet, while Hard Contact looks simple and realistic, it does present challenges when the virtual hand does not perfectly map to the user's real hand movements. This slight disconnect can sometimes feel contrived, but the benefits often outweigh the drawbacks, especially in environments where users need to acclimatise quickly.
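The disconnect exists because a Hard Contact style hand is typically a physics rigidbody chasing the tracked pose, rather than a transform snapped directly onto it; that is what lets objects push back, and also why the virtual hand can lag behind the real one when blocked. A minimal Unity sketch of the general idea (this is not the plugin's actual implementation):

```csharp
using UnityEngine;

// Sketch: a physics hand that chases the tracked hand pose via velocity.
// Because it moves through the physics engine, it collides with and pushes
// objects instead of passing through them; if blocked, it lags behind the
// real hand, which is the visual disconnect described above.
[RequireComponent(typeof(Rigidbody))]
public class PhysicsHandFollower : MonoBehaviour
{
    public Transform trackedHand;   // pose supplied by the hand tracking source
    private Rigidbody rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody>();
        rb.useGravity = false;
    }

    void FixedUpdate()
    {
        // Velocity needed to reach the tracked position this physics step.
        rb.velocity = (trackedHand.position - rb.position) / Time.fixedDeltaTime;

        // Angular velocity derived from the rotation delta.
        Quaternion delta = trackedHand.rotation * Quaternion.Inverse(rb.rotation);
        delta.ToAngleAxis(out float angle, out Vector3 axis);
        if (angle > 180f) angle -= 360f;
        rb.angularVelocity = axis * (angle * Mathf.Deg2Rad / Time.fixedDeltaTime);
    }
}
```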
Soft Contact: Balancing Precision and Freedom
Realism isn’t always the same as usability. In some contexts, developers need more precision. It's a balance between natural movement and deliberate control.
Soft Contact introduces a nuanced interaction model. Users can still push, pull, and flick objects, but picking them up requires a specific pinch or grab pose. This balance prevents unintended manipulations while offering freedom in interaction, a perfect blend for training applications. Soft Contact's 1:1 mirroring of real-world hand poses allows for a natural visual experience, even if the hand penetrates virtual objects.
This mode enhances user satisfaction by reflecting real-world movements accurately, providing a sense of full agency. For developers and users alike, Soft Contact's balance between control and freedom can be transformative, fostering more engaging and effective VR experiences.
Soft Contact is in fact the interaction model pioneered in Leap Motion's earlier 'Interaction Engine'. We enhanced that code to align with our contact modes, resulting in a more streamlined workflow.
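The pose gate at the heart of Soft Contact can be as simple as a fingertip-distance check: objects only become grabbable while the hand is pinching. A hedged sketch of that idea follows; the thresholds and the way fingertip positions are obtained are assumptions, not the plugin's implementation.

```csharp
using UnityEngine;

// Sketch: gate grabbing on an explicit pinch pose, Soft Contact style.
public class PinchGate : MonoBehaviour
{
    public Transform thumbTip;   // fingertip transforms fed by hand tracking
    public Transform indexTip;

    // Hysteresis so the grab doesn't flicker at the threshold (assumed values).
    const float PinchStart = 0.025f; // metres
    const float PinchEnd   = 0.045f;

    public bool IsPinching { get; private set; }

    void Update()
    {
        float d = Vector3.Distance(thumbTip.position, indexTip.position);
        if (!IsPinching && d < PinchStart) IsPinching = true;
        else if (IsPinching && d > PinchEnd) IsPinching = false;
    }
}
```

The two-threshold hysteresis is a deliberate choice: a single cutoff makes grabs flicker on and off when the fingers hover near the boundary.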
No Contact: Simplifying Interaction for Specific Use Cases
Sometimes, simplicity wins. Not every environment needs full realism to feel natural.
No Contact streamlines interactions to their most basic form—grabbing and dropping objects. This mode excels in scenarios where users only need to pick up or drop items, such as procedural training. By only allowing for intended interactions, No Contact offers simplicity and a solution that works every time. Note that this option can frustrate users who expect to do more than grab and drop; with proper onboarding, however, users should have no issues.
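No Contact can be reduced to attach-and-detach: while grabbed, an object simply follows the hand, with no physics pushing at all. A rough sketch under those assumptions (the Grab/Drop calls would come from whatever hand script detects the grab pose):

```csharp
using UnityEngine;

// Sketch: No Contact style grab/drop - no physics interaction, just attach/detach.
[RequireComponent(typeof(Collider))]
public class SimpleGrabbable : MonoBehaviour
{
    private Transform holdingHand;

    // Called by a hand script when a grab pose starts near this object.
    public void Grab(Transform hand)
    {
        holdingHand = hand;
        transform.SetParent(hand, worldPositionStays: true);
    }

    // Called when the grab pose is released.
    public void Drop()
    {
        transform.SetParent(null, worldPositionStays: true);
        holdingHand = null;
    }
}
```

Because there is no physics in the loop, this path is deterministic, which is exactly why it "works every time" in procedural training.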
Developing Natural Interaction using Unity

Across these modes, one principle stands out: flexibility. The more freedom developers have to choose the right interaction model, the more natural the end experience becomes.
For hand tracking to achieve mainstream adoption, it must emulate real-world hand movements with accuracy and reliability. Physical Hands can be a catalyst for this transition, empowering developers to create hands-first interactions that transcend the limitations of controllers. This shift towards natural interaction lowers barriers to entry and elevates user expectations, particularly in kinematic learning and training applications.
I am promoting this tool because I am confident Physical Hands can be the go-to tool for VR and MR hand tracking development: not only is it flexible, it is OpenXR compliant, which means anyone developing for OpenXR can use it. Whilst Ultraleap is no longer actively developing this feature, it is open source on GitHub (along with the entire Unity plugin), so I am hopeful developers will pick it up, because it has real potential to change how hand tracking applications are built.
I help founders and product leaders explore how to apply natural interaction design in their products, keeping it practical, fast, and tailored to their team’s reality. Reach out at info@crwburgess.com to start turning natural interaction into user delight.
