
VIRTUAL REALITY:

[Image: VR_Categorizer.png]

OVERVIEW

XR development can be very different from traditional game or software development. I've learned how to program and design a variety of XR interactions, which I'll showcase here. The examples below were created in Unity, and I am in the process of building similar interactions in Unreal 5.

Additionally, the user experience for XR is built on several important tenets that I have learned over the last few years. Knowing when to use specific features, and how best to implement them, is a skill set in itself. As such, for each XR mechanic shown below, I'll include my thoughts on why the feature is important and in what circumstances it makes sense.

The following features were created in Unity, using its XR development packages, and tested on a Meta Quest 3.

Locomotion

 - Continuous movement and teleportation

Continuous movement is beneficial for projects that call for quick, responsive movement. For example, Asgard's Wrath II, from Sanzaru Games, is a melee-based RPG that makes use of fast, continuous motion.

On the other hand, most training simulations don't require this type of movement.

The primary drawback of this method of movement is the potential for motion sickness, which affects a large percentage of users, somewhere between 40% and 70% according to TechTarget.

Continuous Movement
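
To illustrate the idea, here's a minimal sketch of thumb-stick-driven continuous movement, written as a plain MonoBehaviour rather than the toolkit's built-in move provider; the field names and input binding are placeholders for whatever the rig actually uses.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch of thumb-stick continuous movement. Field names and the input
// binding (a Vector2 action on the left thumb stick) are assumptions.
[RequireComponent(typeof(CharacterController))]
public class SimpleContinuousMove : MonoBehaviour
{
    [SerializeField] InputActionReference moveInput; // Vector2 thumb-stick action (assumed)
    [SerializeField] Transform headTransform;        // the HMD / main camera
    [SerializeField] float moveSpeed = 2.5f;         // metres per second

    CharacterController controller;

    void Awake() => controller = GetComponent<CharacterController>();
    void OnEnable() => moveInput.action.Enable();
    void OnDisable() => moveInput.action.Disable();

    void Update()
    {
        Vector2 input = moveInput.action.ReadValue<Vector2>();

        // Move relative to where the user is looking, but keep the motion horizontal
        // so looking up or down doesn't lift the player off the ground.
        Vector3 forward = Vector3.ProjectOnPlane(headTransform.forward, Vector3.up).normalized;
        Vector3 right   = Vector3.ProjectOnPlane(headTransform.right, Vector3.up).normalized;

        // SimpleMove expects a velocity in m/s and applies gravity for us.
        controller.SimpleMove((forward * input.y + right * input.x) * moveSpeed);
    }
}
```

Projecting the head's forward vector onto the horizontal plane keeps movement grounded even when the user looks up or down, which also helps with the comfort issues mentioned above.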

Teleportation

The main alternative to continuous movement is teleportation. 


I placed various teleportation anchors around the map and included several object colliders as qualifying surfaces in a teleportation area. Areas are more convenient and allow freer movement, while anchors limit the user to a particular path.

For more linear experiences I prefer anchors for the added control, while in more open spaces an area works better. That said, I do think teleportation areas are more difficult to design for.
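
As a rough sketch of how the two destination types differ in setup, the snippet below configures an anchor to impose its own forward direction on arrival and an area to leave the user's orientation alone; the property and enum names follow my understanding of the XR Interaction Toolkit, and the scene references are assumptions.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch of configuring the two destination types. Property and enum names
// follow my understanding of the XR Interaction Toolkit; the references are
// assigned in the Inspector.
public class TeleportDestinationSetup : MonoBehaviour
{
    [SerializeField] TeleportationAnchor anchor; // fixed point along a linear path
    [SerializeField] TeleportationArea area;     // open floor built from several colliders

    void Start()
    {
        // Anchors land the user exactly on the anchor transform and can
        // impose its forward direction on arrival.
        anchor.matchOrientation = MatchOrientation.TargetUpAndForward;

        // Areas accept any qualifying point on their colliders; here the
        // user keeps their own (or thumb-stick chosen) facing instead.
        area.matchOrientation = MatchOrientation.WorldSpaceUp;
    }
}
```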

By assigning the teleportation function to the right thumb stick, I enabled the user to determine which direction they would face before committing to the action.

Teleportation anchors have a defined forward direction, but teleportation areas do not, so letting the user choose their facing gives them more control over their movement. While the teleport ray is active, the right thumb stick's usual action (turning) is disabled via a hierarchy of input actions: while teleport is active, it takes priority.
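
A minimal sketch of that priority rule, assuming the teleport aim and turn behaviours are driven by Input System actions (the action references here are placeholders for however the rig is wired):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch of "teleport takes priority": while the teleport aim action is held,
// the turn action on the same thumb stick is disabled, then restored.
public class TeleportInputPriority : MonoBehaviour
{
    [SerializeField] InputActionReference teleportAim; // e.g. right thumb stick pushed forward
    [SerializeField] InputActionReference turn;        // the usual right thumb stick turn action

    void OnEnable()
    {
        teleportAim.action.performed += OnAimStarted;
        teleportAim.action.canceled  += OnAimEnded;
    }

    void OnDisable()
    {
        teleportAim.action.performed -= OnAimStarted;
        teleportAim.action.canceled  -= OnAimEnded;
    }

    // Turning is suppressed while the teleport ray is active...
    void OnAimStarted(InputAction.CallbackContext _) => turn.action.Disable();

    // ...and comes back once the user commits to or cancels the teleport.
    void OnAimEnded(InputAction.CallbackContext _) => turn.action.Enable();
}
```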

Directional Input

Turning is assigned to the right thumb stick and is typically either continuous or snapped by a set number of degrees.

Continuous turning works well in faster-paced environments, while snap turning shines in more stationary experiences, but ultimately both can work in either scenario.

Which to use comes down to preference and comfort, and should be left up to the user to decide.
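
One simple way to hand that choice to the user is a settings toggle that enables one turn provider and disables the other. This sketch assumes the action-based snap and continuous turn providers from the XR Interaction Toolkit are both present on the rig.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch of exposing turn style as a user preference by toggling between the
// toolkit's two turn providers on the rig (component names per my understanding
// of the action-based XR Interaction Toolkit setup).
public class TurnPreference : MonoBehaviour
{
    [SerializeField] ActionBasedSnapTurnProvider snapTurn;
    [SerializeField] ActionBasedContinuousTurnProvider continuousTurn;

    // Hook this up to a settings-menu toggle; snap turn is the comfort-friendly default.
    public void SetSnapTurn(bool useSnap)
    {
        snapTurn.enabled = useSnap;
        continuousTurn.enabled = !useSnap;
    }
}
```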

Continuous Turning

Snap turning has the benefit of being more accessible for most users.

It is especially useful in experiences where the user is not expected to turn their head often, since a disconnect between virtual and physical movement is a common cause of the motion sickness some people experience in VR.

Snap Turning

Interactivity 

 - Affecting the game world 

Grabbing objects in the virtual space is a key interaction that grounds the user in the environment and opens up room for more gameplay elements.

Attach transform grabbing snaps the object to the player's hand at a specific point. This is ideal for tools, weapons, or any other items that have a designated handle. Several points can be specified at different locations if the object has multiple intended ways to be held.
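
Below is a rough sketch of how multiple grip points might be handled: when a hand grabs the object, the closest authored attach point becomes the attach transform. It assumes an XRGrabInteractable on the same object, and the attach point list is whatever the prefab author sets up.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch of supporting several grip points on one object: when a hand grabs,
// the closest authored attach point becomes the attach transform.
[RequireComponent(typeof(XRGrabInteractable))]
public class NearestAttachPoint : MonoBehaviour
{
    [SerializeField] Transform[] attachPoints; // e.g. main handle, pommel, barrel grip

    XRGrabInteractable grab;

    void Awake()
    {
        grab = GetComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(OnSelectEntered);
    }

    void OnDestroy() => grab.selectEntered.RemoveListener(OnSelectEntered);

    void OnSelectEntered(SelectEnterEventArgs args)
    {
        if (attachPoints == null || attachPoints.Length == 0)
            return;

        // Position of the hand (interactor) that started the grab.
        Vector3 handPosition = args.interactorObject.transform.position;

        Transform closest = attachPoints[0];
        float closestSqr = float.MaxValue;
        foreach (Transform point in attachPoints)
        {
            float sqr = (point.position - handPosition).sqrMagnitude;
            if (sqr < closestSqr) { closestSqr = sqr; closest = point; }
        }

        // The object snaps to the hand at this transform.
        grab.attachTransform = closest;
    }
}
```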

Attach transform grab

Dynamic grabbing lacks a specific transform point to snap to. 


In experiences that lean more heavily on the physics of the environment, this type of grabbing adds some complexity for the user to navigate, which isn't necessarily a bad thing.


It is also slightly more immersive than snap grabbing and encourages the user to play around with the physics of the experience. 
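
A sketch of how I'd configure that kind of physics-friendly grab; the dynamic-attach and velocity-tracking settings reflect my understanding of recent XR Interaction Toolkit versions and may be named differently in yours.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch of a physics-friendly "dynamic" grab: the object is gripped wherever
// the hand touches it and is moved by velocity tracking, so it keeps colliding
// realistically while held. Property names are assumptions about recent
// XR Interaction Toolkit versions.
[RequireComponent(typeof(XRGrabInteractable), typeof(Rigidbody))]
public class DynamicGrabSetup : MonoBehaviour
{
    void Awake()
    {
        var grab = GetComponent<XRGrabInteractable>();

        // No authored attach point: grab the object at the contact point instead.
        grab.attachTransform = null;
        grab.useDynamicAttach = true;

        // Velocity tracking drives the Rigidbody rather than teleporting the
        // transform, so the held object still pushes and bumps other objects.
        grab.movementType = XRBaseInteractable.MovementType.VelocityTracking;
        grab.throwOnDetach = true;
    }
}
```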

Dynamic grab

Levers and knobs are effective ways for the user to cause changes in the environment and trigger certain events. 


In this example, I assigned Z-axis movement of the cubes to the lever and X-axis movement to the knob. These interactors can serve as triggers or as ways to control the experience more dynamically; they expose several parameters that can be gamified and turned into more interesting elements.
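
As a rough illustration, a single mapping component like the one below can drive the cubes: the lever or knob emits a normalized 0-1 value, and the component translates it into movement along one axis. The event wiring and value range are assumptions for the sketch.

```csharp
using UnityEngine;

// Sketch of how a lever or knob can drive an object in the scene: the control
// emits a normalized 0-1 value, which is mapped onto one axis of the target.
// Wire OnValueChanged to the lever's or knob's value-changed event.
public class AxisDrivenTarget : MonoBehaviour
{
    [SerializeField] Transform target;               // the cube being moved
    [SerializeField] Vector3 axis = Vector3.forward; // Z for the lever, X for the knob
    [SerializeField] float range = 1.5f;             // metres of travel over the full throw

    Vector3 startPosition;

    void Awake() => startPosition = target.localPosition;

    // Called with a value between 0 (control at rest) and 1 (fully pulled/turned).
    public void OnValueChanged(float normalizedValue)
    {
        target.localPosition = startPosition + axis * (normalizedValue * range);
    }
}
```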

Lever and knob interactors

Interactivity 

- Affordance and other interactions

Placing things precisely in VR can be a hassle, which is why a snapping system for held objects is essential. Parameters such as the range and attach points are adjustable, which makes this feature useful for a variety of experiences.


The transparent material that appears makes it clear to the user where the object is meant to go, and also implies the existence of the snapping functionality. 
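
A minimal sketch of that setup, assuming the snap zone is an XR Interaction Toolkit socket interactor; the hover-mesh property names follow my understanding of the toolkit, and the ghost material is supplied by the designer.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch of the snap zone: a socket interactor whose hover mesh uses a
// transparent "ghost" material, so the user can see exactly where the object
// will land. Property names are my best understanding of the toolkit's socket.
[RequireComponent(typeof(XRSocketInteractor))]
public class SnapZoneSetup : MonoBehaviour
{
    [SerializeField] Material ghostMaterial; // transparent preview material

    void Awake()
    {
        var socket = GetComponent<XRSocketInteractor>();

        // Draw the hovered object's mesh at the socket's attach point...
        socket.showInteractableHoverMeshes = true;

        // ...using the transparent material, which both previews placement and
        // hints that snapping exists at all.
        socket.interactableHoverMeshMaterial = ghostMaterial;
    }
}
```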

Hover and Snap

For systems such as the ladder shown here, providing the user with proper feedback is essential. For example, the material change on hover conveys that the object is meant to be interacted with, not just a decorative element. The alternate color when the ladder is grabbed also tells the player that their interaction was successful and that they can continue to climb.

Snapping the player's hands to the rungs reinforces this feedback, but in busier environments it's essential that interactable elements stand out somewhat from the decorative ones.
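
Here's a rough sketch of that hover/grab feedback, assuming the rung (or the ladder's grab interactable) exposes the toolkit's hover and select events; the materials are placeholders.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch of the hover/grab feedback: the rung swaps materials as the hand
// hovers over it, grabs it, and lets go. Assumes an XRBaseInteractable-derived
// component and a MeshRenderer on this same GameObject.
public class RungFeedback : MonoBehaviour
{
    [SerializeField] Material normalMaterial;
    [SerializeField] Material hoverMaterial;   // "this can be interacted with"
    [SerializeField] Material grabbedMaterial; // "your grab registered, keep climbing"

    void Awake()
    {
        var meshRenderer = GetComponent<MeshRenderer>();
        var interactable = GetComponent<XRBaseInteractable>();

        interactable.hoverEntered.AddListener(_ => meshRenderer.material = hoverMaterial);
        interactable.hoverExited.AddListener(_ => meshRenderer.material = normalMaterial);
        interactable.selectEntered.AddListener(_ => meshRenderer.material = grabbedMaterial);
        interactable.selectExited.AddListener(_ => meshRenderer.material = hoverMaterial);
    }
}
```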

Affordance in movement
