The Unity XR Interaction Toolkit (XRI) is a free official Unity package that gives you VR grabbing, teleportation, ray interaction, and UI interaction ready to use without writing interaction code from scratch. Install it via the Package Manager by searching for “XR Interaction Toolkit”, add an XR Origin to your scene, and your first VR interaction is ready to test in under 30 minutes. It supports Meta Quest, PSVR2, Valve Index, and any OpenXR headset.
The XR Interaction Toolkit is the fastest path to working VR interactions in Unity without paying for any tools. For developers in Pakistan building VR apps, games, or training simulations, XRI is the right starting point in 2026. This guide walks you from zero to a working VR grab interaction with concrete steps.
What Is the Unity XR Interaction Toolkit?
The XR Interaction Toolkit is an official package developed and maintained by Unity Technologies. It provides a complete, production-ready system for building VR interactions using a component-based approach. Instead of building grab mechanics, ray interaction, locomotion, and UI interaction from scratch, you add XRI components to GameObjects and they handle the behaviour.
Key systems included in XRI:
XR Grab Interactable: Makes any GameObject grabbable with one component
XR Ray Interactor: Pointer ray from controller tip for distant selection and UI interaction
XR Direct Interactor: Touch-based interaction for objects within reach
Teleportation Area and Anchor: Built-in locomotion with arc teleport indicator
Tracked Device Graphic Raycaster: Lets the ray interactor point at and click standard Unity UI canvases in VR
XRI uses the OpenXR standard underneath, which means your project works across Meta Quest, PSVR2, Valve Index, HTC Vive, Pico, and any future OpenXR-compatible headset without platform-specific code changes.
ℹ️ Which Unity Version to Use
Use Unity 2022.3 LTS for maximum stability with XRI 2.5+. For new projects, Unity 6 with XRI 3.0+ is the forward-looking choice. Avoid Unity 2021.x as XRI support there is minimal. The package works with the free Personal or Student editions of Unity.
Step 1: Installation via Package Manager
Open your Unity project and follow these steps exactly:
Go to Window in the menu bar, then click Package Manager
In Package Manager, click the + button in the top-left corner
Select Add package by name
Enter the exact package name: com.unity.xr.interaction.toolkit and press Add
Wait for the installation to complete. You will see the package appear in the list
With the package selected, click Samples in the right panel
Click Import next to Starter Assets and XR Device Simulator
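If you prefer editing files directly, the same install can be expressed in your project's Packages/manifest.json. The version numbers below are examples only; the Package Manager resolves the current verified release for your Unity version:

```json
{
  "dependencies": {
    "com.unity.xr.interaction.toolkit": "2.5.2",
    "com.unity.xr.openxr": "1.9.1"
  }
}
```

Unity detects the change and installs both packages the next time the project gains focus.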
Next, configure OpenXR. Go to Edit, then Project Settings, then XR Plug-in Management. Under the PC tab, check OpenXR. Under the Android tab (for Meta Quest), also check OpenXR. Then click OpenXR in the left sidebar and add an interaction profile for your controllers. For Meta Quest, add the Oculus Touch Controller Profile (covers Quest 2); for Quest 3 or Quest Pro controllers, also add the Meta Quest Touch Plus or Touch Pro Controller Profile.
💡 Meta Quest Setup Extra Step
For Meta Quest development, install the free Meta XR All-in-One SDK from the Unity Asset Store in addition to XRI. It adds Quest-specific features like hand tracking, passthrough, and performance optimizations that XRI alone does not provide.
Step 2: Setting Up Your XR Origin
The XR Origin is the player object in your VR scene. It represents your physical head and hands in the virtual world. Setting it up takes less than two minutes:
Right-click anywhere in the Scene Hierarchy panel
Navigate to XR in the menu, then select XR Origin (VR)
Unity creates an XR Origin with Camera Offset, Main Camera, and Left Controller plus Right Controller child objects
If your scene has an old Main Camera, delete it to avoid conflicts
Press Play and put on your headset. The view should track your head movement immediately
The Left Controller and Right Controller objects come pre-configured with both Ray Interactors (for pointing at distant objects) and Direct Interactors (for touching objects nearby). You do not need to configure these manually for basic use.
🇵🇰 Testing Without a Headset
If you do not yet have a VR headset, use the XR Device Simulator from the Starter Assets sample. It lets you simulate VR controller input using keyboard and mouse inside the Unity Editor. Press the left mouse button to simulate grip, use WASD for movement, and right-click for controller rotation. Essential for Pakistani developers starting out before purchasing hardware.
Step 3: Your First VR Grab Interaction
Making an object grabbable in VR takes three steps and about one minute:
Select any GameObject in your scene. A cube works perfectly for a first test
In the Inspector, click Add Component and add XR Grab Interactable
Add a second component: Rigidbody. Leave Use Gravity checked
Press Play, reach out with your controller, squeeze the grip (or trigger) button to grab the cube, and throw it
The XR Grab Interactable component handles everything: detecting when your controller intersects the object, attaching it to your hand on grip press, tracking its position while held, and releasing it with the correct velocity for a natural throw. Physics from the Rigidbody component makes it fall and bounce correctly after release.
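No scripting is required for the grab itself, but if you want game logic to react to it, XRI exposes select events you can subscribe to. The sketch below assumes XRI 2.x namespaces (XRI 3.0 moved interactables into sub-namespaces such as UnityEngine.XR.Interaction.Toolkit.Interactables); the component name GrabLogger is ours, not part of XRI:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Attach alongside an XR Grab Interactable to react to grab and release
// without modifying any XRI code.
[RequireComponent(typeof(XRGrabInteractable))]
public class GrabLogger : MonoBehaviour
{
    XRGrabInteractable grab;

    void OnEnable()
    {
        grab = GetComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(OnGrabbed);
        grab.selectExited.AddListener(OnReleased);
    }

    void OnDisable()
    {
        grab.selectEntered.RemoveListener(OnGrabbed);
        grab.selectExited.RemoveListener(OnReleased);
    }

    void OnGrabbed(SelectEnterEventArgs args)
    {
        // args.interactorObject is the hand or ray that grabbed us.
        Debug.Log($"{name} grabbed by {args.interactorObject.transform.name}");
    }

    void OnReleased(SelectExitEventArgs args)
    {
        Debug.Log($"{name} released");
    }
}
```

The same pattern works for hover events (hoverEntered, hoverExited) on any XRI interactable.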
For a socket-style snap interaction (where an object snaps to a fixed position, like placing an item on a shelf), add an XR Socket Interactor component to the slot GameObject. The object you place keeps its XR Grab Interactable and snaps into the socket when released nearby.
Key XRI Components Every VR Developer Should Know
| Component | What It Does | Add To |
| --- | --- | --- |
| XR Origin | Represents the player in VR space with head and hands | Scene root (once per scene) |
| XR Grab Interactable (most used) | Makes any object grabbable and throwable by controllers | Any object you want to grab |
| XR Ray Interactor | Laser pointer ray for selecting distant objects and clicking UI | Controller child objects on XR Origin |
| XR Direct Interactor | Touch-based interaction for objects within arm reach | Controller child objects on XR Origin |
| Teleportation Area | Defines a floor area the player can teleport to | Floor or any walkable surface |
| XR Socket Interactor | A fixed slot that objects snap into when released nearby | Target slot position GameObjects |
XRI vs Other Unity VR SDKs in 2026
| SDK | Cost | Platform Support | Best For |
| --- | --- | --- | --- |
| Unity XRI (recommended) | Free | All OpenXR headsets | Cross-platform VR, all skill levels |
| Meta XR SDK | Free | Meta Quest only | Quest-specific hand tracking, passthrough |
| SteamVR Plugin | Free | PC VR via Steam | PC-only VR games on Valve Index, Vive |
| Pico SDK | Free | Pico headsets only | Publishing to Pico store in China or EU |
| VRTK 4 | Free (Patreon for extras) | All OpenXR headsets | Advanced interactions, experienced developers |
Free Resources to Go Further with XRI
Unity Learn VR pathway: Free structured courses covering XRI from basics to shipping a project
Unity XRI Samples on GitHub: Official sample scenes with working grab, socket, teleport, and UI interaction
Valem Tutorials on YouTube: Best free XRI video tutorials updated for 2026, beginner to advanced
Meta Developer Hub: Free tools for sideloading builds to your Quest for testing during development
Unity Asset Store: Search “XR Interaction Toolkit samples” for free community scene packs
PakVR may earn a commission from qualifying purchases made through links in this article. This does not affect our recommendations or increase your price. We only recommend products we genuinely believe are useful.
Frequently Asked Questions
What is the Unity XR Interaction Toolkit?
The Unity XR Interaction Toolkit (XRI) is a free official Unity package that provides ready-made components for VR interactions including grabbing, throwing, teleportation, ray pointing, and UI interaction. It works across Meta Quest, PSVR2, Valve Index, and any OpenXR-compatible headset without platform-specific code.
Is the XR Interaction Toolkit free?
Yes, XRI is completely free. It is published by Unity Technologies and installed via the Unity Package Manager at no cost. It works with Unity Personal, Student, Plus, and Pro editions. There are no paid tiers, runtime fees, or subscription requirements.
Which headsets does XRI support?
XRI supports any headset that implements the OpenXR standard, which includes Meta Quest 2, Quest 3, Quest 3S, PSVR2, Valve Index, HTC Vive XR Elite, HP Reverb G2, and Pico 4. Because it uses OpenXR, a single project build targets all these platforms with minimal platform-specific changes.
How do I install the XR Interaction Toolkit?
Open Unity Package Manager from the Window menu, click the + button, select Add package by name, and enter com.unity.xr.interaction.toolkit. After it installs, go to the Samples tab and import Starter Assets and XR Device Simulator. Then configure OpenXR in Project Settings under XR Plug-in Management.
Which Unity version should I use with XRI?
XRI 2.5+ works best with Unity 2022.3 LTS. XRI 3.0+ works with Unity 6. For new projects starting in 2026, Unity 6 with XRI 3.0 is recommended. Avoid Unity 2021.x as XRI support there is minimal and several key features are missing.
What is the difference between XRI 2.5 and XRI 3.0?
XRI 3.0, designed for Unity 6, introduces a new interaction system with better performance, more flexible attachment points, improved socket interactions, and better hand tracking integration. XRI 2.5 for Unity 2022.3 LTS is stable and production-ready but lacks some of the newer architecture improvements.
How do I make an object grabbable in VR?
Select the object in the Scene Hierarchy, click Add Component in the Inspector, add XR Grab Interactable, then add a Rigidbody component. That is all. Press Play and the object is immediately grabbable with either controller. No scripting is required for basic grab functionality.
What is the XR Origin?
The XR Origin is the root player object in your VR scene. It contains the Camera Offset (representing head height), the Main Camera (what you see through the headset), and Left and Right Controller GameObjects. It is created by right-clicking in the Hierarchy and selecting XR then XR Origin (VR). Every VR scene needs exactly one.
How do I add teleportation in XRI?
Add a Teleportation Area component to your floor GameObject. The player can then point at the floor with their controller and press the thumbstick or joystick to teleport to that location. An arc trajectory indicator appears automatically during the teleport aim. The XR Origin must also have a Locomotion System with a Teleportation Provider for teleportation to work.
Can I test VR interactions without a headset?
Yes. Import the XR Device Simulator from the XRI Starter Assets sample. Enable it in your scene and you can simulate VR controller input using mouse and keyboard inside the Unity Editor: left mouse button for grip, right mouse button for trigger, WASD for movement, and mouse movement for controller rotation.
What is OpenXR and why does XRI use it?
OpenXR is an open royalty-free standard created by the Khronos Group that defines how VR applications communicate with VR hardware. Instead of separate SDKs for each headset, OpenXR provides one API for all. XRI uses OpenXR so that your interactions and input handling work on Meta Quest, PSVR2, Valve Index, and other headsets from a single codebase.
How do I set up XRI for Meta Quest?
Install XRI via Package Manager. In Project Settings, go to XR Plug-in Management, enable OpenXR on the Android platform tab, and add a Meta Quest controller interaction profile under OpenXR settings. Also install the Meta XR All-in-One SDK from the Unity Asset Store for Quest-specific features like passthrough and hand tracking.
Should I use XRI or the SteamVR plugin?
XRI is platform-agnostic and works on any OpenXR headset including Meta Quest and PC VR. The SteamVR plugin works only for PC VR via Steam and is optimized for Valve Index controllers. For new projects in 2026, XRI is the better choice as it targets all platforms. Use the SteamVR plugin only if you specifically target Steam distribution.
Does XRI support hand tracking?
Hand tracking on Meta Quest traditionally requires the Meta XR SDK in addition to XRI: enable hand tracking in the Meta XR settings and add a hand tracking manager to your scene. XRI 3.0 has improved built-in hand tracking support via the OpenXR hand tracking extension, which works on Quest 3 and Quest Pro without the full Meta SDK.
What is an XR Interactable?
An XR Interactable is any component that makes a GameObject respond to VR controller input. The main interactable components in XRI are XR Grab Interactable for grabbing and XR Simple Interactable for hover and select events. (The XR Socket Interactor, by contrast, is an interactor: it creates a snap point that interactable objects can be placed into.) Add an interactable to any object to make it respond to VR interaction.
How do I make Unity UI work in VR?
Set your Canvas's Render Mode to World Space and replace the default Graphic Raycaster with a Tracked Device Graphic Raycaster component. Make sure the scene's Event System uses the XR UI Input Module. The XR Ray Interactor on your controllers will then detect and interact with UI elements using the ray pointer, triggering button clicks via the controller trigger.
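As a sketch, the same canvas conversion can be done from code. This assumes XRI 2.x namespaces (TrackedDeviceGraphicRaycaster lives in UnityEngine.XR.Interaction.Toolkit.UI); the component name MakeCanvasXRReady is ours:

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Converts the Canvas on this GameObject for XR ray interaction at runtime.
// The same steps can be performed in the Inspector instead.
[RequireComponent(typeof(Canvas))]
public class MakeCanvasXRReady : MonoBehaviour
{
    void Awake()
    {
        var canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        // Remove the screen-space raycaster if present, then add the XR one.
        var old = GetComponent<GraphicRaycaster>();
        if (old != null) Destroy(old);
        if (GetComponent<TrackedDeviceGraphicRaycaster>() == null)
            gameObject.AddComponent<TrackedDeviceGraphicRaycaster>();
    }
}
```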
What is the XR Device Simulator?
The XR Device Simulator is a Unity tool included in the XRI Starter Assets sample that simulates a VR headset and two controllers using your keyboard and mouse. It lets you test VR interactions in the Unity Editor without wearing a physical headset. Essential for rapid iteration and for developers who do not yet own a VR headset.
How do I build and run on Meta Quest?
Switch the build target to Android in Build Settings. In Player Settings, set the Minimum API Level to Android 10 (API level 29) or higher. Enable developer mode on your physical Quest headset in the Meta mobile app. Connect via USB, click Build and Run in Unity, and the APK installs and launches on the headset automatically.
Does XRI work with Unity 6?
Yes, XRI 3.0 is designed specifically for Unity 6 and is the recommended version for new projects. Unity 6 with XRI 3.0 offers improved performance, better hand tracking integration, and the new interaction system architecture. Unity 6 is free to use under the Personal plan for revenue under $200,000 per year.
What are the best free companion assets for XRI?
The best free companion assets for XRI include the Meta XR All-in-One SDK for Quest features, the Unity VR Samples on GitHub for working scene examples, Starter Assets from the XRI package itself for ready-made prefabs, and free environment packs from Synty Studios or Kenney assets for quick prototyping environments.
How do I make grabbed objects collide with other physics objects?
XRI's XR Grab Interactable has a Movement Type setting in the Inspector. Set it to Velocity Tracking for physics-based movement where the grabbed object follows your hand using Rigidbody velocity, which collides properly with other physics objects. The default Kinematic setting moves the object directly to your hand without physics collisions.
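The same setting can be flipped from a script. A minimal sketch, assuming XRI 2.x namespaces (the component name UseVelocityTracking is ours):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Switches a grabbable to physics-based (Velocity Tracking) movement,
// the same option exposed as "Movement Type" in the Inspector.
[RequireComponent(typeof(XRGrabInteractable))]
public class UseVelocityTracking : MonoBehaviour
{
    void Start()
    {
        var grab = GetComponent<XRGrabInteractable>();
        grab.movementType = XRBaseInteractable.MovementType.VelocityTracking;
    }
}
```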
What is the Meta XR SDK and do I need it?
The Meta XR SDK is Meta's own Unity SDK that provides Quest-specific features not in XRI: passthrough cameras, scene understanding, hand mesh tracking, and Presence Platform capabilities. XRI and the Meta XR SDK are complementary, not competing. Most Meta Quest projects in 2026 use XRI for interaction logic and the Meta XR SDK for Quest hardware features.
How do I add locomotion to my VR scene?
Add a Locomotion System component to your XR Origin. Then add a Teleportation Provider and a Continuous Move Provider from the XRI locomotion system. The Teleportation Provider enables point-and-teleport movement. The Continuous Move Provider enables smooth joystick locomotion (which may cause motion sickness for some users). Both can be used together with a toggle.
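A toggle between the two can be sketched as below. This assumes XRI 2.x class names (ActionBasedContinuousMoveProvider was renamed in XRI 3.0) and that both providers already exist on the XR Origin; the component name LocomotionToggle is ours:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Switches between teleport and smooth locomotion at runtime, e.g. from
// a settings menu. Assign both providers in the Inspector.
public class LocomotionToggle : MonoBehaviour
{
    [SerializeField] TeleportationProvider teleport;
    [SerializeField] ActionBasedContinuousMoveProvider smoothMove;

    public void UseTeleport(bool useTeleport)
    {
        teleport.enabled = useTeleport;
        smoothMove.enabled = !useTeleport;
    }
}
```

Wire UseTeleport to a UI toggle so players prone to motion sickness can pick teleportation.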
Does XRI work with Pico headsets?
Yes. Pico headsets support OpenXR, so XRI works with Pico 4 and Neo 3 with minimal configuration. In XR Plug-in Management, enable OpenXR for Android and add the Pico controller profile under OpenXR settings. Build for Android and deploy to Pico the same way as Quest. Some Pico-specific features require the PICO Unity Integration SDK.
How do I debug XRI interactions?
Use the XR Interaction Debugger window under Window, then Analysis, then XR Interaction Debugger. It shows all interactors and interactables in the scene with real-time state updates showing which objects are hovered, selected, or activated. Also add the XR Device Simulator to test interactions without a headset. The Unity Console shows XRI warnings about misconfigured components.
What is the best free resource to learn XRI?
Valem Tutorials on YouTube is the most recommended free resource for Pakistani Unity VR developers. The channel covers XRI from basic grab interactions to full VR game mechanics with clear English narration. The Unity Learn VR Development pathway is the second best resource, fully free, and structured for beginners building their first complete VR experience.
What is the best free Unity VR locomotion asset?
The best free Unity VR locomotion option in 2026 is the XR Interaction Toolkit itself: it provides both teleport and smooth locomotion out of the box at no cost. For more advanced movement, the free Unity VR Starter Template and the OpenXR Plugin add extra locomotion options compatible with all major headsets.
What are the best free Unity VR UI assets?
The best free Unity VR UI assets in 2026 are Unity's built-in UGUI system with a World Space Canvas for basic VR panels, and the free Mixed Reality Toolkit (MRTK) for advanced VR interface components. Both are free, actively maintained, and compatible with all major VR platforms including Meta Quest.