👉 Stumbled here by accident? Start with the introduction!
👉 This article will teach you how to set up the left and right controllers for the Meta Quest 3 in Unity. This setup is essential for enabling interaction with elements in your Unity project, enhancing the user experience on the Meta Quest 3 platform.
ℹ️ If you find yourself facing any difficulties, remember that you can always refer to or download the code from our accompanying GitHub repository.
To keep the process straightforward, we'll use the starter assets from the XR Interaction Toolkit plugin. This approach simplifies the setup, letting us reach a practical outcome quickly and efficiently.
Import XR Interaction Toolkit Starter Assets
We will utilize some of the XR Interaction Toolkit starter assets in this article. Go to the Package Manager via Window → Package Manager and import the Starter Assets, which are located in the Samples tab, as seen in the next screenshot.
You can locate the XR Interaction Toolkit in the Package Manager by selecting Unity Registry and searching for 'interact'.
Enable the new Input System
ℹ️ The New Input System in Unity is a flexible and extensible system designed to replace the legacy Input Manager. It provides a unified and consistent way to handle different input devices like keyboards, mice, gamepads, and touchscreens. It also allows for more customized and complex input configurations, making it easier to develop games and applications that can be controlled across various platforms and devices.
Go to your Project Settings and enable the new Input System. You'll find it under Active Input Handling, in the Player section under the Other Settings tab.
After selecting the new input system, you will be asked to restart the Unity editor.
After restarting, the new Input System is active. No additional steps are required for now.
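If you want to sanity-check that the new Input System is working, you can read a controller value directly from a script. The following is only an illustrative sketch; the binding path below is an example, and in practice the XRI Default Input Actions imported with the starter assets already cover this for you.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Minimal sketch: reads the left controller's trigger via the new Input System.
// The binding path is illustrative; the XRI Default Input Actions normally handle this.
public class TriggerProbe : MonoBehaviour
{
    InputAction triggerAction;

    void OnEnable()
    {
        triggerAction = new InputAction(binding: "<XRController>{LeftHand}/trigger");
        triggerAction.Enable();
    }

    void OnDisable() => triggerAction.Disable();

    void Update()
    {
        float value = triggerAction.ReadValue<float>();
        if (value > 0.1f)
            Debug.Log($"Left trigger pressed: {value:F2}");
    }
}
```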
Adding the AR Input Manager
ℹ️ The AR Input Manager in Unity's AR Foundation is a component that manages input in AR applications. It serves as a central point for handling AR-specific input data, such as touch gestures, camera movement, and device orientation. This manager is essential for creating interactive and user-responsive AR experiences, as it efficiently processes and translates real-world user interactions into actions within the AR environment.
Select the Session GameObject in your hierarchy and add the AR Input Manager via Add Component.
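If you prefer to wire things up from code, the same component can be added at runtime. This is an optional sketch, assuming AR Foundation is installed in the project; the editor workflow above is all you actually need.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Optional sketch: attach the AR Input Manager from code,
// equivalent to clicking Add Component → AR Input Manager on the Session GameObject.
public class SessionSetup : MonoBehaviour
{
    void Awake()
    {
        if (GetComponent<ARInputManager>() == null)
            gameObject.AddComponent<ARInputManager>();
    }
}
```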
The Input Action Manager
The Input Action Manager in Unity is a component that handles input actions within the New Input System. It allows developers to define and manage complex input actions like button presses, gestures, or movement inputs. This manager makes it easier to organize and respond to various user inputs in a structured and efficient manner, crucial for creating responsive and interactive gaming or application experiences.
ℹ️ For those interested in exploring the Input System in greater detail, the official Input System documentation (version 1.7.0) offers a comprehensive look into its functionality and applications.
Select the XR Origin (XR Rig) GameObject in your hierarchy.
In the screenshot provided, you'll notice that we've added the XRI Default Input Actions to the list of Action Assets. These assets were included as part of the Starter Assets from the XR Interaction Toolkit plugin, highlighting how the toolkit simplifies the integration of default input actions into your project.
If you're curious about the specific Input Actions included in the Action Asset, simply double-click on the selected action asset. This action will reveal the details and configurations of the Input Actions, providing a clear understanding of the functionalities they encompass.
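Under the hood, the Input Action Manager simply enables every action in its assigned action assets when the component is enabled, and disables them again when it is disabled. As a hedged sketch, assigning an asset and enabling input from code could look like this (the `defaultActions` field is a placeholder you would fill with the XRI Default Input Actions asset in the Inspector):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.Interaction.Toolkit.Inputs;

// Sketch: configure an Input Action Manager from code.
// 'defaultActions' is a placeholder for the XRI Default Input Actions asset.
public class InputSetup : MonoBehaviour
{
    [SerializeField] InputActionAsset defaultActions;

    void Awake()
    {
        var manager = gameObject.AddComponent<InputActionManager>();
        manager.actionAssets = new List<InputActionAsset> { defaultActions };
        manager.EnableInput(); // enables all actions in the assigned assets
    }
}
```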
Adding XR Input Modality Manager
The XR Input Modality Manager in Unity dynamically switches between hands and controllers at runtime, depending on their tracking status:
If hands start being tracked, it switches to the hand interactors.
If motion controllers are activated, it switches to the controller interactors once they're tracked.
💡 This component is valuable even without hand tracking, as it deactivates the controller GameObjects until they are tracked, preventing them from appearing by default.
Select the XR Origin (XR Rig) GameObject in your hierarchy and add the XR Input Modality Manager via Add Component. Assign the Left Controller and Right Controller as seen in the screenshot.
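For reference, the same assignment can also be expressed in code. This is only a sketch, assuming XR Interaction Toolkit 2.3 or newer; `leftControllerGO` and `rightControllerGO` are hypothetical references to the two controller GameObjects in your rig.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.Inputs;

// Sketch: assign the controller GameObjects to the XR Input Modality Manager from code.
public class ModalitySetup : MonoBehaviour
{
    [SerializeField] GameObject leftControllerGO;  // placeholder reference
    [SerializeField] GameObject rightControllerGO; // placeholder reference

    void Awake()
    {
        var modality = gameObject.AddComponent<XRInputModalityManager>();
        modality.leftController = leftControllerGO;
        modality.rightController = rightControllerGO;
    }
}
```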
Choose a controller preset
For configuring the left and right controllers, we choose a controller preset imported through the starter assets. In your hierarchy, select the Left Controller GameObject. To select a preset, click on the two-sliders icon as depicted in the accompanying screenshot. This step customizes the controller settings in your project, aligning them with the imported presets for proper functionality.
Similarly, for the Right Controller GameObject, follow the same process and select the corresponding preset for the right controller. This ensures that both controllers are appropriately configured with their respective presets, maintaining consistency and functionality in your setup.
Choose an Interaction Manager
For the final step in setting up our controllers, we need to assign the Interaction Manager to both the left and right controllers. As shown in the next screenshot, select XR Interaction Manager as the Interaction Manager. This links the controllers to the central system that manages interactions, ensuring they function correctly within the XR environment.
ℹ️ Interaction Manager: The XR Interaction Manager that this interactor will communicate with.
Similarly, for the Right Controller GameObject, follow the same process and select XR Interaction Manager as the Interaction Manager.
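When only one XR Interaction Manager exists in the scene, interactors find it automatically, but you can also assign it explicitly from code. A minimal sketch, assuming the controller GameObjects carry interactor components derived from XRBaseInteractor:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: point every interactor in the scene at a specific XR Interaction Manager.
public class InteractionManagerWiring : MonoBehaviour
{
    [SerializeField] XRInteractionManager interactionManager; // assign in the Inspector

    void Awake()
    {
        foreach (var interactor in FindObjectsOfType<XRBaseInteractor>())
            interactor.interactionManager = interactionManager;
    }
}
```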
Testing the app
Run your Unity project via Build And Run as explained in the last article. You should now see the Unity project running in your headset as seen in the next screenshot. Move your Meta Quest 3 controllers to verify that everything is working as expected.
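If the controllers don't show up, a quick way to check whether the headset reports them as tracked is to query Unity's built-in XR input API. This is purely a diagnostic sketch:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Diagnostic sketch: log whether the left and right controllers are currently tracked.
public class TrackingProbe : MonoBehaviour
{
    void Update()
    {
        LogTracking(XRNode.LeftHand, "Left controller");
        LogTracking(XRNode.RightHand, "Right controller");
    }

    static void LogTracking(XRNode node, string label)
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(node);
        if (device.isValid && device.TryGetFeatureValue(CommonUsages.isTracked, out bool tracked))
            Debug.Log($"{label} tracked: {tracked}");
    }
}
```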
Next article
In our next article, we will delve into the aspect of detecting planes in MR environments. Plane detection is a cornerstone of MR, allowing virtual content to be realistically placed and interacted with in the physical world. We will explore how this technology works and the ways you can implement plane detection in your projects using Unity.