Virtual reality (VR) has emerged as one of the most transformative mediums of the 21st century, finding applications in various industries, including gaming, entertainment, and education.
Enhancing Virtual Reality Experiences with Unity 2022 takes you into the fascinating realm of VR, where creativity meets cutting-edge technology to bring tangible real-world applications to life. This immersive exploration not only equips you with the essential skills needed to craft captivating VR environments using Unity's powerful game engine but also offers a deeper understanding of the philosophy behind creating truly immersive experiences.
Throughout the book, you’ll work with practical VR scene creation, interactive design, spatial audio, and C# programming and prepare to apply these skills to real-world projects spanning art galleries, interactive playgrounds, and beyond. To ensure your VR creations reach their full potential, the book also includes valuable tips on optimization, guaranteeing maximum immersion and impact for your VR adventures.
By the end of this book, you’ll have a solid understanding of VR’s versatility and how you can leverage the Unity game engine to create groundbreaking projects.




Enhancing Virtual Reality Experiences with Unity 2022

Use Unity’s latest features to level up your skills for VR games, apps, and other projects

Steven Antonio Christian

BIRMINGHAM—MUMBAI

Enhancing Virtual Reality Experiences with Unity 2022

Copyright © 2023 Packt Publishing

All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.

Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.

Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.

Group Product Manager: Rohit Rajkumar

Publishing Product Manager: Nitin Nainani

Senior Editors: Divya Anne Selvaraj and Mark D'Souza

Technical Editors: Joseph Aloocaran and Simran Ali

Book Project Manager: Aishwarya Mohan

Copy Editor: Safis Editing

Proofreader: Safis Editing

Indexer: Sejal Dsilva

Production Designer: Nilesh Mohite

DevRel Marketing Coordinator: Nivedita Pandey

First published: November 2023

Production reference: 1131023

Published by Packt Publishing Ltd.

Grosvenor House

11 St Paul’s Square

Birmingham

B3 1RB, UK.

ISBN 978-1-80461-953-7

www.packtpub.com

Two years have passed since I began this book, a time when every page written was a stepping stone in my personal and professional journey. I discovered, I faltered, and I stood tall, striking a fine balance between my medical pursuits and my passion for XR development. It was a journey that whispered the secrets of resilience, that painted a canvas of innovation, bright and bold, before my very eyes. Yet, this journey was not a solitary venture; it was nurtured by a community that stood by me, unwavering.

To my mother, Shalanda, and my partner, Tyne, your love was my beacon of hope; your trust, my stronghold. To my father, Steven Christian, your legacy is the wellspring from which I draw inspiration; your unyielding faith has laid the groundwork for my enduring aspirations.

This narrative of my life is enriched by those who’ve supported my dreams. From my humble beginnings (tinkering with phone modifications in high school, taking on the field as a college football player, and sharing my journey of app development and animated experiences on YouTube) to venturing into entrepreneurship and blossoming as a medical school technologist, your cheer, your applause, and your reassurances have kept me grounded.

To the loved ones who are no longer with us in person but continue to shine in my heart and spirit – my beloved grandmother, my cousin Marschell, and my dear friends Greg, Quint, and Alain – your memory is a guiding light, each one a luminous star on my life’s constellation. You push me to reach beyond the known, to innovate, and to evolve. This journey and the work encapsulated within these pages are a testament to your lasting influence on my life.

And so, to each of you, I dedicate this book. Your essence is woven into its narrative, echoing in the silence between the lines, making it more than just a chronicle of knowledge, but a monument to our shared history and an homage to our enduring bond.

– Steven Christian, MAIS

M.D., Ph.D. Candidate in Integrative Neuroscience at the University of Nevada, Reno '27

Foreword

In a world where the boundaries of reality are being redefined and reimagined, Steven Christian emerges on the world’s stage as a visionary author who joins the forefront of the technological revolution using virtual reality (VR). Prepare to embark on an extraordinary voyage where the power of imagination, creativity, and technology intertwine to illuminate Steven’s blueprint using the Unity game engine to further galvanize the development of extended reality (XR) experiences.

Enhancing Virtual Reality Experiences with Unity finds at center stage an author who courageously blends his life experiences as a college athlete, visual artist, animator, founder of Iltopia Studios, former surgical patient, and now M.D./Ph.D. student who is determined to make a global impact leveraging emerging technology and culture. His contributions to advancing medicine will undoubtedly be invaluable and mirror his voice that resonates on the numerous stages onto which he’s been invited to share his knowledge and passion.

The creation of immersive experiences using VR continues to impact every field of human endeavor. Beyond sharing knowledge of using Unity to create immersive experiences, Steven provides a window into the creative fuel that powers his imagination and curiosity.

Enhancing Virtual Reality Experiences with Unity is not just a book but a cornerstone that will empower, accelerate curiosity, and guide creators toward a future where they can craft more meaningful and impactful experiences within the realm of VR and beyond.

Patrick B. Thomas MD, MMCi, FACS

Director of Digital Innovation in Pediatric Surgery at The University of Nebraska Medical Center College of Medicine

Contributors

About the author

Steven Antonio Christian is a former football player turned creator, writer, animator, and the founder of Iltopia Studios. Holding a master’s degree from Oregon State University, he’s known for projects such as the Eyelnd Feevr AR immersive storytelling experience and the Analog-AR headset. An instructor at Portland Community College and a Unity Certified Instructor, he is currently pursuing an M.D./Ph.D. in Integrative Neuroscience at the University of Nevada as the first Black American M.D./Ph.D. student in Nevada’s state history. His work is celebrated on platforms such as The Wall Street Journal and Unity for Humanity. Passionate about community uplifting, Steven’s primary goal is to entertain, educate, and empower through his art and provide hope for African American youth to pursue careers in technology and medicine.

Dedicated to the pillars of my life, the memories that guide me, and the dreams we’ve built together.

About the reviewer

Vishal Pandey is a co-founder and CTO at MythyaVerse, an innovative start-up focusing on VR and AI solutions for wellness and productivity. He possesses a strong academic background, with a B.Tech. in computer science and an M.Tech. in data science, and is currently pursuing a Ph.D. at IIT Roorkee. Previously, as a research fellow at the Defence Research and Development Organization, Vishal played a key role in building XR products for cognitive training in defense research in India. He has multiple published papers in the field of human-computer interaction, solidifying his status as an XR thought leader and innovator. Vishal’s commitment to pushing the boundaries of technology continues to make a lasting impact in the industry.

Table of Contents

Preface

Part 1: Philosophy and Basics of Understanding Virtual Reality

1

Philosophy of Building Immersive Experiences

What is an immersive experience?

Understanding immersion through the senses

What makes something immersive?

How to make experiences immersive

What are the essential components of an immersive experience?

Skills required to build immersive experiences

Technological components of building immersive experiences

Understanding XR, AR, VR, and MR

Understanding the difference between AR, VR, and MR

Brief history of VR

How does VR work?

How do we experience VR?

Hardware, software, and platforms that support VR development and engagement

Approaching VR development

Setting expectations for projects

Navigating available resources

Developing an efficient workflow

Summary

Part 2: Technical Skills for Building VR Experiences in Unity (Assets, GameObjects, Scripts, and Components)

2

Building VR Scenes in Unity

Technical requirements

Setting up a Unity project

Installation

Unity Hub

Licenses

Unity Editor version

Modules

Project templates

Creating a new project

Navigating the Unity interface

The Scene view

The Game view

The Hierarchy window

The Project window

The Inspector window

Package Manager

Build Settings

Project Settings

Play mode

VR setup

Setting up the interaction profile

Android VR

Installing XR Interaction Toolkit

Headset setup

Setting up developer mode

VR scene setup

Testing in the Editor

Testing on a device

Summary

3

Working with Inputs and Interactions

Technical requirements

Why do interactions matter?

Setting up a demo scene using primitive shapes

Setting up the locomotion system

Setting up teleportation

Teleportation areas

Snap and continuous turning

Continuous locomotion

Adding interactor components

Ray interactors

Direct interactors

Gaze interactions

Multiple object interactions

Adding haptic feedback to our VR controllers

Adding attach points to virtual objects

Adding socket interactors to our demo scene

Extending XR Interaction Toolkit

Summary

4

Using Game Objects, Materials, and Prefabs

Technical requirements

Creating a grid with Primitive GOs

Creating a materials library

Creating a grid of spheres

Creating custom materials

Assigning custom materials and assets to GOs

Downloading assets from the Asset Store

Replacing VR hands with custom GOs

Replacing the shader and materials on VR hands

Replacing primitive VR objects with custom GOs

Creating a custom VR demo room with ProBuilder

Designing a demo room floor plan with Google Drawings

Converting the floor plan into a 3D room with ProBuilder

Decorating the VR demo room with Polybrush and vertex painting

Creating prefabs with our GOs

Organizing project assets with folders in the Project tab

Exporting project assets with Unity packages and FBX Export

Summary

5

Implementing Animation – Physics and Colliders

Technical requirements

Introducing Unity’s physics system

What is a Rigidbody component?

Adding physics using Rigidbodies

Introducing collider components

Adding colliders for interactions

Adding accurate human body physics to the VR rig

Using colliders as barriers

Introducing Unity’s animation system

Working with Unity Timeline

Creating an animation clip for GOs

Adding animations to objects with Timeline

Trigger animation with gaze and Raycast

Summary

6

Lighting Your Worlds and Experiences

Technical requirements

Getting to know the Unity lighting system

Directional lights

Lighting window

Baking lightmaps

Skybox and environment lights

Simulating indoor lighting

Spot lights

Point lights

Modifying quality settings for lighting

Emissive materials

Visualizing light rays

Area lights

Introducing post-processing

Global Post-Processing Volume

Local Post-Processing Volume

Summary

7

Creating Immersion with Sound

Technical requirements

Getting to know Unity’s audio and sound system

Adding background music with Global 2D audio sources

Adding audio events to our VR rig

Adding spatial audio with 3D audio sources

Summary

8

Working with C#, Unity Events, and Input Actions

Technical requirements

How does C# work in Unity?

C# syntax

C# structure

Public and private

Common functions and methods

Common statements

Object rotation script

Creating custom input actions

Getting to know Unity Events

Triggering events with input actions

Triggering events with colliders

Combining collider triggers with input actions

Creating portals with colliders and events

Switching between scenes with portals

Summary

9

Unlocking the Power of Render Pipelines

Getting to know Unity’s Scriptable Render Pipeline

What are the different render pipelines?

Which one is best for your project?

Exporting our built-in render pipeline project

Importing our package into Universal Render Pipeline

Importing our package into High Definition Render Pipeline

Enabling post-processing in URP and HDRP

Converting render pipelines within a project

Summary

Part 3: Projects: Putting Skills Together

10

Design Thinking for Virtual Reality Experiences

Technical requirements

An introduction to design thinking

How can design thinking be applied to VR development?

Developing a design document for a VR project

Components of a design document

Creating our design document

Empathy statement

Defining the project goal

Creating an ideation statement

Researching, finding references, and allocating resources

Roadmapping

Experience design

Designing the maps

Grayboxing the maps

Kitbashing the maps

Summary

11

Adding Audio to a Virtual Reality World

Technical requirements

Adding park audio

Adding car audio

Adding NPC audio

Summary

12

Building an Art Gallery

Technical requirements

Creating a gallery mesh

Adding indoor lights

Adding light switches

Creating a drawing system

Hanging a picture frame

Summary

13

Animating a Virtual Reality Experience

Technical requirements

Understanding the animation pipeline

Writing a synopsis

Writing a script

Drafting storyboards

Creating a sequence animatic

Adding audio

Previsualizing a sequence

Setting up cameras with Cinemachine

Downloading 3D characters

Downloading animations from Mixamo

Setting up animated characters

Adding animation clips to our timeline

Adding VFX

Adding background characters

Summary

14

Recording Virtual Reality Videos

Technical requirements

Working with the Unity Recorder

Building a VR theater experience

Creating a 360° video

Summary

15

Enhancing Virtual Reality Rigs

Technical requirements

Adding animated hands to the VR rig

Adding grip animations

Adding pinch animations

Hiding hands when holding objects

Adding and toggling between the ray interactor and the direct interactor

Adding interactors

Creating a flexible toggle for independent direct and ray interactors

Adding triggers for individual toggling

Adding an input manager

Making the VR rig run

Making our VR rig jump

Making the VR rig crouch

Summary

16

Triggering Actions in Virtual Reality

Technical requirements

Creating and configuring a VR settings menu

Creating sliders for our VR settings menu

Adding text indicators for the settings menu sliders

Controlling the jump, crouch, and speed values with sliders

Adding an on/off toggle to the VR settings menu

Triggering animations in the scene

Adding portals for navigation

Creating a portal between the art gallery and the virtual world

Creating a portal between the theatre scene and the virtual world

Creating a portal between the demo room and the virtual world

Creating and controlling a bouncing basketball

Creating a basketball object

Controlling the bounce of our basketball object

Summary

17

Destroying Objects in Virtual Reality

Technical requirements

Creating a grenade

Creating an explosion

Adding detonation visuals

Creating a grenade-spawning portal

Making a simple hand cannon

Attaching the cannon to a hand

Enabling cannon firing

Making a complex hand cannon

Making projectiles

Creating a simple projectile

Creating a complex projectile

Making projectile variants

Making a rapid-fire projectile

Making an energy shot projectile

Making a fireball projectile

Making a grenade projectile

Making a water drop projectile

Making a rock slide projectile

Making an ice shard projectile

Making a flamethrower projectile

Destroying GameObjects with projectiles

Adding health to an object

Adding health points

Adding damage points

Creating a shatter prefab

Instantiating a shatter prefab

Creating a shatter prefab variant

Summary

Part 4: Final Touches

18

Optimizing Your Virtual Reality Experiences

Technical requirements

Optimization in Unity

Biggest performance impacts

What can you do to improve performance?

Optimization tools

Working with the Unity Profiler

Using the Stats window

Using Frame Debugger

Optimizing draw calls

Static objects

Occlusion culling

Optimizing material settings

GPU instancing

Using Mobile shaders

Optimizing camera settings

FOV

Clipping planes

Render paths

Optimizing images and textures

Optimizing audio files

Optimizing animation files

Keyframe reduction

Optimizing the lighting system

Light baking

Shadows

Optimizing project settings

Build settings

Player settings

Rendering settings

Configuration settings

Quality settings

Stereo rendering settings

Screen scale resolution

Applying optimizations to our VR world

Assessing our scene

Running the Profiler

Running the Memory Profiler

Optimizing the memory profile

Optimizing our scene

Reviewing the optimizations

Summary

Index

Other Books You May Enjoy

Part 1: Philosophy and Basics of Understanding Virtual Reality

In this thought-provoking part of the book, we delve into the core philosophical underpinnings of virtual reality (VR), shedding light on the why before the how of VR creation. This part will not just prepare you technically but also mentally, equipping you with the right mindset and conceptual understanding necessary to create truly immersive experiences. This journey is about more than just understanding VR—it's about truly living it.

This part includes the following chapter:

Chapter 1, Philosophy of Building Immersive Experiences

1

Philosophy of Building Immersive Experiences

Welcome to Enhancing Virtual Reality Experiences with Unity! In this book, we will explore not only what it takes to build virtual reality (VR) experiences, but also how to expand on that knowledge to create innovative experiences with VR. You will be able to create amazing projects in VR on your own in no time as we progress through this book. We will follow a series of step-by-step tutorials to complete projects aimed at giving you the skills you need to be proficient at VR development.

VR can encompass many content and concept areas, but we will cover the major areas so that you have a good foundation and understanding as you continue your journey as a developer. Ultimately, the goal is to create VR experiences that are fun and engaging.

This chapter will explore some of the foundational concepts of immersive experiences, VR, and using the Unity game engine. The goal is to first understand some of the philosophy around what immersive experiences are and why we build experiences with this technology. Before we dive deep into VR, we will expand your concept of VR by first defining immersive experiences and introducing VR within that context. VR goes beyond making games for headsets. It is a medium that can be applied to a variety of industries and applications, such as healthcare, education, therapy, design, entertainment, and so on. We will break down the various components that comprise the experiences and introduce some of the hardware that is necessary to develop and participate in those experiences. In this way, before we begin developing, you will have a better idea of what to expect when you open Unity to start building your experiences.

In this chapter, we will cover the following topics:

What is an immersive experience?
What are the essential components of an immersive experience?
Understanding XR, AR, VR, and MR
How does VR work?
Approaching VR development

What is an immersive experience?

Immersion is a core concept of how we experience the world around us. It can be minimal (sitting in a park and reading a book) or maximal (going scuba diving in the ocean as you feel the weightlessness from the water pressure pushing against your body), but the fact remains that immersion is a constant in our lives. Quite frankly, we don’t have any concept of what a lack of immersion is because the experiences we have involve some level of immersion.

Medical students are trained to test the functions of the human body so that patients can have a fully immersive experience. As the body declines due to age and disease, we see that things become less immersive, and ultimately, quality of life diminishes. Immersion affects our perceptions and informs our reality to an extent. Nevertheless, there is no box that we can put the concept of immersion into because it is all-encompassing.

The Merriam-Webster Dictionary defines immersion as “a state of being deeply engaged or involved, deep mental involvement.” We can also use the more literal definition: “to plunge into something that surrounds or covers especially: to plunge or dip into a fluid.” Both definitions overlap in many ways because they allude to an ever-present stimulus. When we talk about immersive experiences, we are referring to the concept of how the surrounding environment provides stimuli that inform our perceptions. In those conversations, we often describe what we see, smell, feel, hear, and believe to be true based on what engaged our senses at that moment.

An immersive experience is an illusion that makes you feel like you are inside or part of an environment. We perceive the environment as tangible (real), but it is intangible. This environment engages your senses through the use of technology and feedback to mimic real-world phenomena: when you walk, you hear footsteps; running blurs your surroundings; and looking at lights disrupts your vision.

We are familiar with this notion as extended reality (XR) or mixed reality (MR) – that is, placing digital objects in the real world and directly interacting with them as if they were actually there. We can use hardware such as head-mounted displays (HMDs) and infrared sensors to augment physical spaces with digital objects and enhance the experience within the space.

However, before we learn more about MR, we must talk about the role of the senses in immersion. Senses are the focal point of our experiences. Without them, we are unable to interpret information or engage with the world around us. If we hear a loud noise, we will cover our ears and try to leave the source of the noise. When we are confronted with the source in the future, our negative experience will inform us how we should respond to that source. Let’s say we were immersed in an environment with a loud noise. That noise provided an unpleasant experience for our ears, and we responded by removing the stimulus. We care about what appeals to our experiences because, in this example, the noise shaped the experience. If the noise wasn’t as loud, it would likely have improved the experience.

Understanding immersion through the senses

We should all be familiar with the major senses: touch, sight, hearing, smell, and taste. We often associate those senses with experience and actions. If you want to taste, you will eat food; if you want to see, you will watch TV; if you want to smell, you will breathe in an aroma; if you want to touch, you will hold something; and if you want to hear, you will listen to music. Those actions toward a stimulus engage the senses and provide us with an experience. Each sense alone provides us with a different way to experience the world around us.

If our senses are tied to our experiences, they become the anchors of an immersive experience. We can say that an immersive experience is something that incorporates multiple senses into one experience. Think of immersion as being on a spectrum, rather than it being all or nothing. You can’t remove immersion completely because it is tied to our senses. Unless you lose all of your ability to sense, you can’t completely remove immersion. Rather, it is on a scale of less immersive to more immersive.

For example, imagine you are walking down a busy street in the middle of rush hour in a major city. You are rushing from a lunch date back to work, so you are holding part of your lunch on your way back. While doing so, someone dumps spoiled milk on you from their balcony. It reeks! In this scenario, you can imagine the type of experience you would have in that situation. Based on the description, you can also isolate each part of that experience into their respective senses:

Touch: You can feel the food in your hand as you hold your lunch. When the milk lands on you, you feel the liquid on your skin and clothes (fun fact – we can’t feel wetness, but we can feel the difference in pressure and temperature of the liquid compared to the air on our skin).
Sight: You see tons of cars and people out in the city. Maybe you can even see some buildings and have the sun shining in your eyes.
Smell: You could be smelling the food you are eating, the smog and sewage of the city, or even the stench of the spoiled milk that was spilled onto you.
Taste: You could taste the food you had for lunch, or maybe some of the spoiled milk got into your mouth; you probably have a bad taste in your mouth now.
Hearing: In a busy city, you might hear car horns, engines revving, people talking, and even the sound of the spoiled milk hitting the ground as it also covers you.

When it comes to immersion, you can’t get more immersed than this! The experience I have described incorporates all the senses (Figure 1.1) and would leave you with a vivid and lasting memory:

Figure 1.1 – Human senses that focus on VR include sight, hearing, and touch

When we talk about applications of immersive technology, especially VR, we are in some ways trying to use technology to mimic what we would experience in real life. Research suggests that experiences are more memorable when more senses are tied to those experiences. If you can find ways to build experiences that incorporate the senses in believable ways, the user will feel more engaged, and they will walk away from the experience more informed. We will explore some techniques in future chapters to achieve these goals.

What makes something immersive?

When we talk about being less immersive, we simply mean that we start to remove our senses from the experience. When you remove touch, you can’t feel anything; remove sight, you can’t see the world around you; remove smell, you can’t enjoy aromas; remove hearing and everything is quiet; and remove taste and you cannot enjoy food. The more senses you remove, the less immersive the experience is. Going back to the idea of removing immersion completely, can you do that and still be alive?

Think about the experiences we enjoy and see if you can define what senses are used to make it immersive:

Reading a book involves seeing and touching
Listening to music involves listening
Watching TV involves seeing and listening
Talking on the phone involves talking and listening
Swimming involves moving and seeing
Playing video games involves touching, seeing, and hearing
Driving a car involves touching, seeing, and hearing

When we talk about these experiences, one crucial component is interaction. We aren’t in stasis: we are acting and reacting to the world around us. Even if the experience is passive, there is a level of interactivity that keeps us engaged. When the experience requires us to perform an action, that makes it interactive. Let’s not confuse interaction with immersion. They are two separate concepts involved in the same experience. You may listen to music, watch a movie, or read a book. Those are passive experiences, but the act of turning the page, changing the channel with a remote, or rewinding a song gives you a level of interaction that keeps you engaged.

To make VR experiences immersive, use the elements of immersion as a guide. We know that immersive experiences are more engaging for users, and the focal point of immersive experiences is our senses. By developing experiences focused on what we see, feel, and do, we can create applications that have a true impact on the VR industry and community. Note that this is independent of the content or industry. These approaches and philosophies can be applied to a variety of industry applications because they speak to the core component of what makes VR different than other mediums such as animation, cinema, or console games. It is the ability to immerse the user in an experience.

How to make experiences immersive

So, how do you make something more immersive? Simple – involve more senses in the experience. Compared to text, video is more immersive because it involves two senses rather than one. Reading incorporates sight and listening incorporates hearing, but watching a movie involves both seeing and listening. To make reading more immersive, add sound. To make listening to music more immersive, add haptic feedback to feel the sound vibrations. When you are thinking about immersion, think about building off the native experience rather than exchanging one element for another. I would not consider video to be the same as immersive reading because you are replacing text with images. Although you are adding sound, you are taking away the text-based visual.

Adding sound to a quiet reading experience such as background music or sound effects can make the reading experience more immersive without taking away the core element of the experience.
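
For instance, here is a minimal sketch of how a looping background track could be layered into a Unity scene (audio is covered in depth in Chapter 7). The class name and the idea of assigning the clip in the Inspector are illustrative assumptions, not code from this book:

using UnityEngine;

// Illustrative example: plays a looping ambient clip to layer sound onto an
// otherwise quiet experience without replacing its core element.
[RequireComponent(typeof(AudioSource))]
public class AmbientLoop : MonoBehaviour
{
    [SerializeField] private AudioClip ambientClip; // background music or room tone, assigned in the Inspector

    private void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.clip = ambientClip;
        source.loop = true;
        source.spatialBlend = 0f; // 0 = 2D background audio; values toward 1 make it positional 3D audio
        source.Play();
    }
}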

With VR, you can take the concept of immersive experiences and build on that framework using technology and digital assets. VR uses interactions in a completely virtual world to let you walk, run, jump, and navigate with motion. Even though you can’t touch the object in the world, haptics can provide limited vibrational feedback. Ultimately, you can see the objects, hear the objects, and orient yourself in spaces among the objects. Compared to being behind a computer screen or gamepad, VR is more immersive because you are in the very location you want to explore, not a proxy of it. You don’t control the character; you are the character.
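
As one concrete illustration, the following is a minimal sketch of how such a haptic pulse might be triggered with Unity's XR Interaction Toolkit (which we set up in Chapter 3). The class name, event wiring, and amplitude/duration values are illustrative assumptions rather than code from this book:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative example: plays a short rumble on a controller when called,
// reinforcing the sense of touch when a virtual object is grabbed.
public class GrabRumble : MonoBehaviour
{
    [SerializeField] private XRBaseController controller; // the hand's controller, assigned in the Inspector
    [SerializeField] private float amplitude = 0.5f;      // vibration strength, 0 to 1
    [SerializeField] private float duration = 0.1f;       // vibration length in seconds

    // Hook this up to an interactable's Select Entered event
    public void OnGrabbed()
    {
        if (controller != null)
        {
            controller.SendHapticImpulse(amplitude, duration);
        }
    }
}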

Now that we have introduced what an immersive experience is and its various components, we can explore what components are essential to achieving such experiences within VR. This will help simplify how to approach building immersive experiences and make the process seem less daunting.

What are the essential components of an immersive experience?

Explaining and defining immersion and immersive experiences in the preceding section was intentional. When we talk about VR or any other variation of XR, we are talking about different types of experiences that engage the user in distinct ways. It is important to understand this as fact rather than opinion because some technical aspects and elements make the experiences what they are. They can be clearly defined and formulaic. With most VR experiences, the user will have an experience with an HMD such as an Oculus Quest. With augmented reality (AR), the user will most likely have an experience through their smartphone. The list can go on.

The true impact as a creator and developer is taking the core elements of a formulaic experience and infusing abstract and creative elements into it so that people have a memorable experience they want to share with friends and colleagues or even promote to the world. At face value, all VR experiences are a variation of putting on a plastic headset and responding to stimuli that are not real, but the experiences people reflect on afterward with VR are a lot more formative and expressive. They will describe what they did, what they saw, and how the VR experience made them feel. The following are some examples of general (non-specific) experiences that can be enhanced using VR:

Entertainment through 360-degree videos: You can immerse yourself in the video as if you are there. 360 videos provide a passive VR experience that allows you to see in all directions rather than just at a monitor screen in front of you. Think about someone base jumping with a 360-camera attached to them. In VR, you can tag along with them as they go on an epic adventure.
Games: Instead of sitting on the couch with a controller, you can be the player in the game, dodging all the obstacles and scoring all the points. Such games include the following:
  Story/role-playing games: You take on the persona of a character in a story and evolve as the story progresses.
  FPS games: First-person shooters allow you to go to battle with others in a game of survival. You can navigate environments to evade gunfire and take out opposing players.
  Foraging/exploration games: Games where you can traverse vast worlds, climb high peaks, and scavenge for resources. These games normally focus on puzzles and creating lighthearted experiences.
  Sports games: Instead of going to the field or the court in real life, you can play your favorite sport in VR.
  Artistic games: These games are abstract because they are all about using interactions to elicit a certain effect. This can mean shooting paintballs at a 100-foot canvas to make a painting or fishing in a field on another planet.
  Survival games: Much like first-person shooters, you are immersed in an environment where the goal is to think outside the box to increase your chance of survival.
Social/virtual meetings: You can meet up with friends, watch movies, and go to meetings in virtual environments with people across the globe. Geographic location won’t hinder you from connecting with others.
Medicine: You can improve patient outcomes in a variety of areas, from therapy to training. With VR, you can create simulation modules for healthcare professionals to improve their training and provide immersive learning experiences for patients to better understand their health.
Education: In the classroom, VR can give students the ability to learn in a more exploratory way, which helps them retain information better. Students can go on museum and gallery tours to places across the globe and interact with content beyond a textbook.
Military training: You can make training more accessible and cheaper with simulations of the tasks at hand. The military uses VR to simulate combat environments.
Utility/productivity: VR can be used to extend your office beyond its physical location. Instead of using a computer monitor, your headset can create countless virtual monitors so that you can multitask and work in a variety of locations. Want to work on a project on the beach with a 200-foot monitor screen? You can do that in VR.
Real estate: You can tour homes from your living room using digital replicas of the places you intend to learn more about. Digital twins allow for deep exploration of real-world locations without you having to physically be there.
Engineering: You can design and prototype before you move to manufacturing. This saves countless work hours and allows for rapid revisions and iterations, thus saving money.
Exercise: Instead of going to a workout class, you can bring the workout class to you. In VR, you can gamify your workouts with others and/or in fantastical ways using digital enhancements.
Content creation: You can ditch the keyboard and mouse to sculpt and paint content for projects in VR. If you like the kinetic experience of sculpting but still want to work digitally, you can put on a headset and do what you do best in the way that feels the most natural.

Skills required to build immersive experiences

Some technical and nontechnical skills are valuable in the XR industry and for developing VR experiences that will have an impact. In many cases, if you have developed skills and worked on projects in other industries, you can integrate those skills into making engaging immersive experiences. Understand that you do not exist within a vacuum. You have skills and ideas that can push the culture of XR forward in new and exciting ways. I can speak from personal experience. I was a Division 1 college football player who did software development for Windows Mobile in the early 2000s. When Windows Mobile went defunct, I shifted to comic illustration and visual storytelling. My creative endeavors evolved from newspaper comics to webcomics, to animation on YouTube, to live-action visual effects, and ultimately to XR creation. I did all that while playing football, retiring, and getting into medical school. Over this 10+ years’ creative and professional journey, I developed skills in a variety of areas that further informed my workflow and ideas to create and pursue.

The reason your skills are so valuable is because XR is just a medium. It is a manifestation of the ideas you think of and write on paper. Those ideas can become books, animated shows, live-action movies, training modules, mobile apps, and so on. You just so happen to want to create VR experiences. In many ways, there are projects and ideas only you can produce to a specific end because you ultimately infuse your skills and experiences into the work you do. Whether it is naming conventions or artistic style, the things you create will have a touch of you in them. If 10 developers and creators get the same prompt, which is a brief description of a project idea that a client or developer hopes to create, you will get 10 different projects. Some will be better than others based on the utilization of tools and execution of the prompt. Here are the skills that will prove most useful to you when you are trying to build immersive experiences:

Project management: Immersive experiences can be large in scale and require experience managing multiple elements. If you don’t know how to navigate both people and a variety of content sources, you can easily become overwhelmed. Although this is often lost on developers and creators, project management skills are crucial to completing projects. To build a portfolio and further your career, you must be able to complete projects.
Creative direction: XR has yet to reach its peak market value. As a result, grand ideas exist that are yet to be manifested and translated into experiences. The value of a creator and developer is not only measured by the technical skills you offer but also by the vision you present to explore the technology and push it to new heights with your ideas. Having experience in a variety of content creation workflows and pipelines can help you explore the possibilities of XR. Creative directors develop and manage projects from ideas to finished products. They typically have experience in a variety of areas, such as marketing, illustration, business, product development, and more. These skills serve as valuable assets on the creative journey.
Software development: Having technical skills is very valuable in developing XR experiences. Although Unity makes building experiences easier, to get the most out of the medium and the platform, you need to know how to open the hood and unlock certain features. Even if you aren’t a seasoned coder, being familiar with code structures and functions can lead to major growth and innovations in the space.

Technological components of building immersive experiences

Every immersive experience has the same core elements. The difference among all experiences is the degree to which each of the core elements is incorporated. Whether you are doing a simulation or playing a game, you will need animation, a user interface, lighting, and audio. Creating experiences is more about navigating the required elements to fit the scope of the project rather than redefining what it means to build an experience. Innovation is taking what already exists and improving upon it with ideas that show the true potential of the tools and the medium. The following comprise some of the core components of building immersive experiences:

Game engine: Most VR experiences are built on game engines because of their ability to render objects in real time rather than pre-rendering them, as animation and compositing software does. In this book, we will be using the Unity game engine, the most versatile engine used to build XR applications and games. Another popular engine is Unreal Engine.
Rendering: I mentioned that game engines use real-time rendering to provide a platform for building interactive experiences. Within Unity, there are render pipelines that determine the way objects are rendered within the experience. The difference between rendering pipelines is usually device-dependent. There are render pipelines for lower-end devices such as smartphones and high-definition pipelines for higher-end devices. Since VR headsets have fixed hardware specifications, knowing which pipeline to use for your project early in the development process is crucial to providing the best experience for your users.
3D content: You can’t have a virtual experience without 3D content (2D content on some occasions). In a virtual experience, you interact with the 3D objects in the world you are experiencing. This content can comprise characters, buildings, environments, and even digital twins. Content can make or break the experiences you build because the user chooses the VR experiences they want based on the content in the experience.
Shaders, materials, and textures: Often associated with 3D content, shaders, materials, and textures provide an element of variation to the 3D world that can elicit various emotions and responses. They provide the color and character to the polygons and pixels of the digital world. If 3D models are the architecture and foundation for the world, then materials and shaders are the paint and decorations. When the materials, shaders, and textures are used correctly, they become recognizable and familiar to the user.
Level design and architecture: Design is crucial to experiences because you need to give users environments to anchor their experiences to. Without a map to navigate or cities to traverse, they have no direction.
Audio: Sound is crucial to having an immersive experience. Hearing the sound of your feet on the pavement as you walk or increasing the volume of music as you walk closer to the source creates something subtle yet impactful. If you can create and integrate sounds, you can harness an important component of an immersive experience.
UX/UI: User interfaces are the blueprints of interactions within the experiences. If you design a world but people don’t know how to explore the world, then you need to find ways to design elements that feel natural and intuitive.
Animation: Animation brings life to a static world. In a real-time game engine, VR makes animation an integral part of the interactive experience. It is often the element people respond to the most because animation alters the world as time progresses. If we were playing a sports game, the location of the animated character or object can determine how the player would respond. In a still world, a player has no incentive to engage in the experience.
Lighting: Nothing is fun if you can’t see. Lighting 3D environments is a necessary skill because it gives you full control of every element that can improve visibility and influence mood.
Performance: Not every experience will work on every device. Being able to develop something vast but also performant can maximize your reach and leave a lasting impression on users.
Software development: Even though we interact with the content we see in a virtual world, those interactions depend on elements that you code and integrate using C#. Knowing how to utilize code so that it works for your project can unlock unlimited possibilities (see the minimal sketch after this list).
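
To make that last point concrete, here is a minimal sketch of the kind of C# behaviour script used to drive motion in a Unity scene (an object rotation script along these lines is built in Chapter 8). The class name and speed value are illustrative assumptions rather than code from this book:

using UnityEngine;

// Illustrative example: rotates the GameObject it is attached to,
// a simple way to add motion to an otherwise static scene.
public class ObjectRotator : MonoBehaviour
{
    [SerializeField] private float degreesPerSecond = 45f; // rotation speed around the Y axis

    private void Update()
    {
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}

Attach a component like this to any GameObject and it will spin in Play mode; this pattern of small, focused components underlies most of the interactions we will build later in the book.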

Now that we understand what immersive experiences are and the components that make up such experiences, let’s dive deeper into XR.

Understanding XR, AR, VR, and MR

Extended reality (XR) is the umbrella term used to describe technology that engages our senses. This includes providing information, recreating worlds, or enhancing the world in real time. It was developed to enable more immersive experiences using digital objects. When we look at how digital objects are used, it is often through a 2D experience. This experience can include animation, word processing, video games, and even training simulations. Incorporated digital content can include images and 3D designs that are rendered on a screen. But why should we spend hours building 3D content only to experience it in 2D? XR provides a way out of this limitation by creating a pathway for viewing 3D content in a 3D space. If we think in 3D and build in 3D, then we must have a way to experience our content in 3D.

XR is an umbrella term for augmented reality (AR), virtual reality (VR), and mixed reality (MR) (Figure 1.2). On the surface, people often confuse the three, but it would be valid to say that even if they are different from each other, they all comprise XR.

Within XR, we can think of AR, VR, and MR on a spectrum just like immersion. On one side, you have a completely digital world with digital objects, and on the other side, you have a physical world with digital objects. The consistent component across each experience is the digital objects, but the difference is how connected to the physical world the experience is. There are specific hardware and sensors that contribute to attaining these experiences, but we will get to that a bit later.

Figure 1.2 – Overview of VR, AR, MR, and XR

AR is when you have an experience that places digital objects in a completely physical world. This experience is dependent on sensors from a device that can scan the surrounding area to create a believable experience for the user. You are usually adding digital elements to the screen of a live camera feed. The camera feed is most likely from a smartphone or webcam. There are AR headsets such as HoloLens that create more immersive experiences, but they place those experiences into another category (MR). AR can also be audio-based, as with Bose glasses, which infuse audio into your environment without the need for headphones. Some popular AR experiences are found on smartphones: Pokémon Go, Snapchat face filters, and IKEA Place.

VR is when you have an experience in a completely digital world. In VR, you are not tied to the physical world. You can think of it as being inside a computer, like in Tron, or inside your favorite game. You can walk, run, and jump as an avatar in the digital world. Compared to AR, VR is not sensor-heavy, but it does require specific hardware to get the most out of the experience. At most, you would require a headset such as the Oculus Quest, but you can also use Google Cardboard, where you can use your phone with a low-cost headset case to have bite-sized experiences.

MR lies somewhere in the middle of that spectrum. It combines both AR and VR elements, allowing real-world and digital objects to interact seamlessly. Instead of removing yourself from the physical world to have more interactions with digital objects in the digital world, you are integrating more sensors to track your body so that you can interact with digital objects in the physical world. In this experience, you are combining camera sensors with HMDs to scan the world around you, scan your body, and build an immersive environment that combines the best of AR and VR. Devices such as HoloLens and Magic Leap allow you to do that.

Understanding the difference between AR, VR, and MR

AR uses interactions on a screen such as toggles, sliders, or buttons. You can think of this as playing a phone game. AR does not allow you to interact with digital objects outside of screen and button input. It is mainly used for rendering digital objects in a physical environment or adding digital elements such as animation and 3D models to print media. The major draw to AR is the fact that it doesn’t remove the user from the physical world they are in. It enhances the real environment with digital content.