Unity 2020 Virtual Reality Projects - Jonathan Linowes - E-Book


Description

This third edition of the Unity Virtual Reality (VR) development guide is updated to cover the latest features of Unity 2019.4 or later versions - the leading platform for building VR games, applications, and immersive experiences for contemporary VR devices.
Enhanced with more focus on growing components, such as Universal Render Pipeline (URP), extended reality (XR) plugins, the XR Interaction Toolkit package, and the latest VR devices, this edition will help you to get up to date with the current state of VR. With its practical and project-based approach, this book covers the specifics of virtual reality development in Unity. You'll learn how to build VR apps that can be experienced with modern devices from Oculus, VIVE, and others. This virtual reality book presents lighting and rendering strategies to help you build cutting-edge graphics, and explains URP and rendering concepts that will enable you to achieve realism for your apps. You'll build real-world VR experiences using world space user interface canvases, locomotion and teleportation, 360-degree media, and timeline animation, as well as learn about important VR development concepts, best practices, and performance optimization and user experience strategies.
By the end of this Unity book, you'll be fully equipped to use Unity to develop rich, interactive virtual reality experiences.

The e-book can be read in Legimi apps or in any other app that supports the following format:

EPUB

Publication year: 2020




Unity 2020 Virtual Reality Projects, Third Edition
Learn VR development by building immersive applications and games with Unity 2019.4 and later versions
Jonathan Linowes
BIRMINGHAM - MUMBAI

Unity 2020 Virtual Reality Projects, Third Edition

Copyright © 2020 Packt Publishing

All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.

Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.

Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.

Commissioning Editor: Ashwin Nair
Acquisition Editor: Larissa Pinto
Content Development Editor: Aamir Ahmed
Senior Editor: Hayden Edwards
Technical Editor: Deepesh Patel
Copy Editor: Safis Editing
Project Coordinator: Kinjal Bari
Proofreader: Safis Editing
Indexer: Tejal Daruwale Soni
Production Designer: Aparna Bhagat

First published: September 2015
Second edition: May 2018
Third edition: July 2020

Production reference: 1290720

Published by Packt Publishing Ltd. Livery Place 35 Livery Street Birmingham B3 2PB, UK.

ISBN 978-1-83921-733-3

www.packt.com

This book is dedicated to Lisa—my wife, best friend, and soul mate—and the amazing family we created together: Rayna, Jarrett, Steven, and Shira, who know in their hearts that the future is theirs to embrace.

- Jonathan Linowes

Packt.com

Subscribe to our online digital library for full access to over 7,000 books and videos, as well as industry-leading tools to help you plan your personal development and advance your career. For more information, please visit our website.

Why subscribe?

Spend less time learning and more time coding with practical eBooks and Videos from over 4,000 industry professionals

Improve your learning with Skill Plans built especially for you

Get a free eBook or video every month

Fully searchable for easy access to vital information

Copy and paste, print, and bookmark content

Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.packt.com and as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at [email protected] for more details.

At www.packt.com, you can also read a collection of free technical articles, sign up for a range of free newsletters, and receive exclusive discounts and offers on Packt books and eBooks.

Contributors

About the author

Jonathan Linowes is a long-time Unity developer and software engineer with a focus on VR and AR games and applications. He founded Parkerhill XR Studio and Reality Labs, an immersive indie studio and developer of products including the BridgeXR toolkit, the Power Solitaire VR game, and the Epoch Resources mobile game. He is a VR/AR evangelist, Unity developer, entrepreneur, and Certified Unity Instructor. Jonathan has a Bachelor of Fine Arts degree from Syracuse University, a Master of Science degree from the MIT Media Lab, and has held technical leadership positions at Autodesk and other companies. He has authored several books on VR and AR from Packt Publishing.

About the reviewer

Yash Gugale completed his Master's degree from the Department of Computer Graphics Technology at Purdue University where he specialized in VR, AR (Unity 3D), graphics programming (OpenGL and shaders), and data visualization (D3.js). He has a strong background in machine learning (Sklearn), deep learning (PyTorch), mobile (Android) and web development, photogrammetry (Meshroom), animation (Maya), UI/UX, and 360-degree videos. His work also involves volumetric videos and applying shader effects in Unity to create amazing volumetric experiences. He is currently working as a software engineer at Samsung to build TV applications and in their XR Volumetric Studios. In his free time, he enjoys salsa dancing, hiking, traveling, yoga, and scuba diving.

Packt is searching for authors like you

If you're interested in becoming an author for Packt, please visit authors.packtpub.com and apply today. We have worked with thousands of developers and tech professionals, just like you, to help them share their insight with the global tech community. You can make a general application, apply for a specific hot topic that we are recruiting an author for, or submit your own idea.

Table of Contents

Title Page

Copyright and Credits

Unity 2020 Virtual Reality Projects Third Edition

Dedication

About Packt

Why subscribe?

Contributors

About the author

About the reviewer

Packt is searching for authors like you

Preface

Who this book is for

What this book covers

To get the most out of this book

Download the example code files

Conventions used

Get in touch

Reviews

Virtually Everything for Everyone

What is virtual reality?

Differences between virtual reality and augmented reality

Applications versus games

Types of VR experience

Types of HMD

Desktop VR

Mobile VR

How virtual reality works

Stereoscopic 3D viewing

Head, hand, and body tracking

Technical skills that are important to VR

What this book covers

Who this book is for

Summary

Understanding Unity, Content, and Scale

Technical requirements

Installing Unity

Development system requirements

Installing Unity Hub

Installing the Unity Editor

Creating a new Unity project

Installing additional packages and assets

Getting started with Unity

Exploring the Unity Editor

Understanding the default new scene

Using grid and snap

A couple more options

Creating a simple diorama

Adding a cube and a plane

Adding a red ball

Changing the scene view

Making a Crate Material

Adding a photo

Using prefabs

Creating and instantiating a prefab

Editing and overriding a prefab

Importing content

Creating 3D content for VR

Importing from the Unity Asset Store

Using Unity Legacy Standard Assets

Importing models in supported formats

Round-trip geometry workflows

Summary

Setting Up Your Project for VR

Technical requirements

Introducing the Unity XR platform

Choosing your target VR platforms and toolkits

Enabling virtual reality for your platform

Setting your target platform

Installing XR Plugin Management

Installing the XR Interaction Toolkit

Adding the XR camera rig

Exploring the XR Rig objects and components

Building and running your project

Configuring the player settings

Building, running, and testing your VR project

Building for SteamVR

Setting up for OpenVR

Installing the SteamVR Unity Plugin toolkit

Building for Oculus Rift

Setting up for Oculus desktop

Installing the Oculus Integration toolkit

Building for Immersive Windows MR

Setting up for Immersive WMR

Installing Visual Studio workloads

Installing the Mixed Reality Toolkit (MRTK)

Building for Oculus Quest

Installing the Android tools

Setting up for Oculus mobile VR

Other Android and optimization settings

Installing the Oculus Integration toolkit

Using adb

Building for Google Cardboard

Setting up for Google Cardboard

Targeting Android for Cardboard

Targeting iOS for Cardboard

Summary

Using Gaze-Based Control

Technical requirements

Adding Ethan, the walker

Artificially intelligent Ethan

The NavMesh bakery

Scripting a random walk target

"Zombie-izing" Ethan!

Adding a zombie material

Painting models with Polybrush

Going where I'm looking

The LookMoveTo script

Adding a feedback cursor object

Observing through obstacles

Making a look-to-kill system

The KillTarget script

Adding particle effects

Introducing Unity C# programming

Summary

Interacting with Your Hands

Technical requirements

Setting up the scene

Defining a balloon game object

Making the balloon prefab

Creating a Balloon Controller

Using an Input Manager button

Polling the XRI_Right_Trigger button

Controlling balloons with the input trigger

Creating balloons

Releasing balloons

Inflating a balloon while pressing the trigger

Using Unity events for input

Invoking our input action events

Subscribing to input events

Tracking your hands

Parenting the balloon to your hand

Forcing balloons to float upright

Interacting with a balloon gun

Introducing the XRI Interactor/Interactable architecture

Creating a grabbable balloon gun

Handling Activate events

Using the XR Interaction Debugger

Popping balloons

Making the balloons poppable

Adding a popping explosion

Disabling rigid physics while in hand

Throwing a ball projectile

Resetting the ball position

Summary

Canvasing the World Space UI

Technical requirements

Studying VR design principles

Making a reusable default canvas

Creating a default canvas prefab

Initializing the default main camera

Including an Event System component with XRUI Input Module

Implementing a HUD

Creating a visor HUD

The windshield HUD

Hiding the panel using Canvas Group

The in-game world space UI

Making a scoreboard

Using TextMesh Pro

Info bubbles

The reticle cursor

Adding a canvas reticle to gaze-based interaction

Adding a reticle to the XR interactor hand controller

Adding a gaze-based reticle using XRI

Building an interactive dashboard

Adding a dynamic water hose

Creating a dashboard with a toggle button

Stopping the ray interactor at the canvas

Direct interaction with UI elements

Building a wrist-based menu palette

Summary

Teleporting, Locomotion, and Comfort

Technical requirements

Implementing basic glide locomotion

Moving forward with the thumbstick

Rotating with the thumbstick

Moving in the direction you're looking or pointing

Avoiding obstacles

Climbing a wall

Building a wall with grab holds

Adding the XRI Interactor and Interactable components

Adding a ClimbController script

Adding the GrabPull script and actions

Falling

Using the XRI Locomotion System

Understanding the Locomotion System

Turning in a snap

Integrating scripts with Locomotion System

Teleporting between locations

Installing the XRI examples

Adding teleportation

Restricting interaction to a specific layer

Ray interactors for teleportation

Switching between Interactors

Locomotion and comfort in VR

Other locomotion mechanics

Managing VR motion sickness

Summary

Lighting, Rendering, Realism

Technical requirements

Lighting and rendering strategies

Choosing a Render Pipeline

Choosing Lighting Settings and GameObjects

Setting up our demo scene

Using the SampleScene

Disabling baked lighting

Creating a menu panel

Using environment lighting

Environment lighting source

Adding Environment Light Intensity

Adding a Fog effect

Using PBR materials and URP Shaders

Using Light objects and Emission surfaces

Using Light Probes and Reflection Probes

Enhancing your scenes with post-processing effects

Summary

Playing with Physics and Fire

Technical requirements

Understanding Unity physics

Creating bouncy balls

Managing the GameObject life cycle

Removing fallen objects

Setting a limited lifetime

Implementing an object pool

Building a headshot game

Serving a ball

Adding sound effects

Hitting the target

Building a Paddleball game

Creating a hand paddle

Building a shooter ball game

Making a shooter wall

Shooting balls toward the player

Improving the ball

Juicing the scene

Great balls of fire

Skull environment

Audio synchronization

Summary

Exploring Interactive Spaces

Technical requirements

Using ProBuilder and ProGrids

Using the ProGrids editor interface

Using the ProBuilder editor interface

Constructing the art gallery building

Using a floor plan sketch

Creating the floor

Creating the walls

Making holes for entrances

Creating a roof and skylight

Assembling the scene

Replacing the building materials

Tuning the lighting

Creating the artwork rig

Defining an artwork rig

Adding a spotlight

The exhibition plan

Adding pictures to the gallery

Managing art info data

Using lists

Using data structures

Using scriptable objects

Displaying the art info

Adjusting for image aspect ratio

Teleporting around the gallery

Room-scale considerations

Summary

Using All 360 Degrees

Technical requirements

Exploring 360-degree media

Understanding equirectangular projections

VR is hacking your field of view

Stereo 360-degree media

Having fun with photo globes

Seeing crystal balls

Rendering globes

Handling magic orbs

Viewing 360-degree photos

Viewing 360 images from the web

Adding an image viewer UI

Playing 360-degree videos

Using Unity skyboxes

Six-sided or cubemap skyboxes

Spherical panoramic skyboxes

360-degree video skyboxes

Capturing 360-degrees in Unity

Capturing cubemaps and reflection probes

Using third-party capture tools

Summary

Animation and VR Storytelling

Technical requirements

Composing our story

Gathering the assets

Creating the initial scene

Timelines and Audio tracks

Using a Timeline to activate objects

Recording an Animation Track

A growing tree

A growing bird

Using the Animation editor

A wafting nest

Animating other properties

Animating lights

Animating a scripted component property

Using Animation clips

Shaking an egg

Using Animator Controllers

ThirdPersonController Animator

Living Birds Animator

Defining the fly-to targets

Using Animator hashes and DOTween to animate

Using a Signal Track in Timeline

Making the story interactive

Look to play

Resetting the initial scene's setup

Summary

Optimizing for Performance and Comfort

Technical requirements

Using the Unity Profiler and Stats windows

The Stats window

Overview of the Profiler window

Analyzing and diagnosing performance problems

Optimizing your art

Decimating models

Levels of detail

Optimizing your scene with static objects

Setting up the scene

Lighting and baking

Occlusion culling

Optimizing the rendering pipeline

Optimizing your code

Understanding the Unity life cycle

Writing efficient code

Runtime performance and debugging

Summary

Other Books You May Enjoy

Leave a review - let other readers know what you think

Virtually Everything for Everyone

This virtual reality thing calls into question, what does it mean to "be somewhere"? Before cell phones, you would call someone and it would make no sense to say, "Hey, where are you?" You know where they are, you called their house, that's where they are. So then cell phones come around and you start to hear people say, "Hello. Oh, I'm at Starbucks," because the person on the other end wouldn't necessarily know where you are because you became un-tethered from your house for voice communications. So when I saw a VR demo, I had this vision of coming home and my wife has got the kids settled down, she has a couple minutes to herself, and she's on the couch wearing goggles on her face. I come over and tap her on the shoulder, and I'm like, "Hey, where are you?" It's super weird. The person's sitting right in front of you, but you don't know where they are. - Jonathan Stark, mobile expert, and podcaster

Welcome to virtual reality (VR)! In this book, we will explore what it takes to create VR experiences on our own. We will take a walk through a series of hands-on projects, step-by-step tutorials, and in-depth discussions using the Unity 3D game engine and other free or open source resources. Though VR technology is rapidly advancing, we'll try to capture the basic principles and techniques that you can use to make your VR games and applications feel immersive and comfortable.

In this first chapter, we will define VR and illustrate how it can be applied not only to games, but also to many other areas of interest and work productivity. We'll see that VR is all about immersion and presence, seemingly transporting you to a different place and experience. VR is not just for gaming—it can be applied to a wide spectrum of personal, professional, and educational applications. This chapter discusses the following topics:

What is virtual reality?

Differences between virtual reality and augmented reality

How VR applications may differ from VR games

Types of VR experience

Types of VR device

Some psychological, physiological, and technical explanations of how VR works

Technical skills that are necessary for the development of VR

What is virtual reality?

Today, we are witnesses to burgeoning consumer-accessible VR, an exciting technology that promises to transform in a fundamental way how we interact with information, our friends, and the world at large.

What is virtual reality? In general, VR is the computer-generated simulation of a 3D environment, which seems very real to the person experiencing it, using special electronic equipment. The objective is to achieve a strong sense of being present (presence) in the virtual environment.

Today's consumer tech VR involves wearing HMD (head-mounted display) goggles to view stereoscopic 3D scenes. You can look around by moving your head, and walk around by using hand controls or motion sensors. You are engaged in a fully immersive experience. It's as if you're really there in some other virtual world. The following photo shows me, the author, experiencing an Oculus Rift Development Kit 2 (DK2) in 2015:
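The "separate camera for each eye" idea behind stereoscopic viewing can be sketched numerically. This is a minimal illustration in plain Python (not Unity C#, which the rest of this book uses); the function name and the 64 mm default interpupillary distance (IPD) are assumptions for illustration only:

```python
# Illustrative sketch: an HMD derives two camera positions from one head
# position by offsetting each eye half the interpupillary distance (IPD)
# along the head's "right" axis. Each eye then renders its own view,
# and the pair of images produces the stereoscopic depth effect.

def eye_positions(head_pos, right_axis, ipd_m=0.064):
    """Return (left, right) eye positions, each offset half the IPD
    from the head position along the head's right-pointing axis."""
    half = ipd_m / 2.0
    left = tuple(h - half * r for h, r in zip(head_pos, right_axis))
    right = tuple(h + half * r for h, r in zip(head_pos, right_axis))
    return left, right

# A head at standing eye height (1.7 m), facing forward:
left, right = eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
print(left, right)   # two camera positions 64 mm apart
```

In practice, the VR runtime computes these per-eye poses (and the matching projection matrices) for you every frame from the headset's tracking data; game engines such as Unity expose them as a single XR camera rig.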

VR is not new. It's been here for decades, albeit hidden away in academic research labs and high-end industrial and military facilities. It was big, clunky, and expensive. Ivan Sutherland invented the first HMD in 1965 (see https://amturing.acm.org/photo/sutherland_3467412.cfm). It was tethered to the ceiling with metal pipes! In the past, several failed attempts have been made to bring consumer-level VR products to the market:

Source: https://mashable.com/2012/09/24/augmented-reality/

In 2012, Palmer Luckey, the founder of Oculus VR LLC, gave a demonstration of a makeshift head-mounted VR display to John Carmack, the famed developer of the Doom, Wolfenstein 3D, and Quake classic video games. Together, they ran a successful Kickstarter campaign and released a developer kit called Oculus Rift Development Kit 1 (DK1) to an enthusiastic community. This caught the attention of investors, as well as Mark Zuckerberg (Facebook CEO), and in March 2014, Facebook bought the company for $2 billion. With no product, no customers, and infinite promise, the money and attention that it attracted helped fuel a new category of consumer products.

At the same time, others were also working on their own products, which were soon introduced to the market, including Steam's HTC VIVE, Google Daydream, Sony PlayStation VR, Samsung Gear VR, Microsoft's immersive Mixed Reality, and more. New innovations and devices that enhance the VR experience continue to be introduced.

Most of the basic research has already been done, and the technology is now affordable, thanks in large part to the mass adoption of devices that work on mobile technology. There is a huge community of developers with experience in building 3D games and mobile apps. Creative content producers are joining in and the media is talking it up. At last, virtual reality is real!

Say what? Virtual reality is real? Ha! If it's virtual, how can it be... Oh, never mind.

Eventually, we will get past the focus on the emerging hardware devices and recognize that content is king. The current generation of 3D development software (commercial, free, or open source) that has spawned a plethora of indie (independent) game developers can also be used to build nongame VR applications.

Though VR finds most of its enthusiasts in the gaming community, the potential applications reach well beyond that. Any business that presently uses 3D modeling and computer graphics will be more effective if it uses VR technology. The sense of immersive presence that is afforded by VR can enhance all common online experiences today, which includes engineering, social networking, shopping, marketing, entertainment, and business development. In the near future, viewing 3D websites with a VR headset may be as common as visiting ordinary flat websites today.

It's probably worthwhile to clarify what virtual reality is not by comparing VR with augmented reality.

Differences between virtual reality and augmented reality

A sister technology to VR is augmented reality (AR), which combines computer-generated imagery (CGI) with views of the real world. AR on smartphones has recently garnered widespread interest with the introduction of Apple's ARKit for iOS and Google ARCore for Android. Furthermore, the Vuforia AR toolkit is now integrated directly with the Unity game engine, helping to drive even more adoption of the technology. AR on a mobile device overlays the CGI on top of live video from a camera.

The latest innovations in AR are wearable AR headsets, such as Microsoft's HoloLens and Magic Leap. The computer graphics are shown directly in your field of view, not mixed into a video image. If VR headsets are like closed goggles, AR headsets are like translucent sunglasses that combine the real-world light rays with CGI.

A challenge for AR is to keep the CGI consistently aligned with and mapped onto the objects in the real-world space, and to eliminate latency while the user moves about, so that the CGI and the real-world objects stay aligned.

AR holds as much promise as VR for future applications, but it's different. Though AR intends to engage the user within their current surroundings, VR is fully immersive. In AR, you may open your hand and see a log cabin resting in your palm, but in VR, you're transported directly inside the log cabin and you can walk around inside it.

We are also beginning to see hybrid devices that combine features of VR and AR and let you switch between modes. For example, we're already seeing VR devices with pass-through video features, primarily used for setting up your play area bounds and floor level, and as a safety feature when the player goes out of bounds. The camera mounted on the HMD, generally used for spatial positioning, can be fed to the display. Be aware that the field of view of the video may be distorted, so it shouldn't be used for walking around.

If you are interested in developing applications for AR, please also refer to the author's book Augmented Reality for Developers from Packt Publishing (https://www.packtpub.com/web-development/augmented-reality-developers).

Next, we'll explore the ways in which VR can be used to improve our lives and entertainment.

Applications versus games

Consumer-level VR started with gaming. Video gamers are already accustomed to being engaged in highly interactive hyper-realistic 3D environments. VR just ups the ante.

Gamers are early adopters of high-end graphics technology. Mass production of gaming consoles and PC-based components in the tens of millions and competition between vendors leads to lower prices and higher performance. Game developers follow suit, often pushing the state of the art, squeezing every ounce of performance out of hardware and software. Gamers are a very demanding bunch, and the market has consistently stepped up to keep them satisfied. It's no surprise that many, if not most, of the current wave of VR hardware and software companies are first targeting the video gaming industry. A majority of the VR apps on the Oculus Store, such as Rift (https://www.oculus.com/experiences/rift/), GearVR (https://www.oculus.com/experiences/gear-vr/), and Google Play for Daydream (https://play.google.com/store/search?q=daydream&c=apps&hl=en), are for games. And of course, the Steam VR platform (http://store.steampowered.com/steamvr) is almost entirely about gaming. Gamers are the most enthusiastic VR advocates and seriously appreciate its potential.

Game developers know that the core of a game is the game mechanics, or the rules, which are largely independent of the skin, or the thematic topic, of the game. Game mechanics can include puzzles, chance, strategy, timing, or muscle memory. VR games can have the same mechanical elements but might need to be adjusted for the virtual environment. For example, a first-person character walking in a console video game is probably going about 1.5 times faster than their actual pace in real life. If this wasn't the case, the player would feel that the game was too slow and boring. Put the same character in a VR scene and they will feel that it is too fast; it could likely make the player feel nauseous. In VR, you want your characters to walk at a normal, earthly pace. Not all video games will map well to VR; it may not be fun to be in the middle of a hyperrealistic war zone when you're actually virtually there.
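The walking-speed adjustment described above can be expressed as a tiny sketch. This is plain illustrative Python rather than Unity C#; the 1.4 m/s real-world pace figure, the constant names, and the function are assumptions added for illustration (only the roughly 1.5x console factor comes from the text):

```python
# Illustrative sketch of tuning locomotion speed for the display medium:
# flat-screen games often move a first-person character ~1.5x faster than
# a real walking pace so the game doesn't feel slow, while VR should use
# a real-world pace to avoid making the player feel nauseous.

REAL_WALK_SPEED = 1.4    # assumed typical human walking pace, meters/second
CONSOLE_FACTOR = 1.5     # flat-screen speed-up mentioned in the text

def walk_speed(is_vr):
    """Return a walking speed (m/s) suited to the display medium."""
    return REAL_WALK_SPEED if is_vr else REAL_WALK_SPEED * CONSOLE_FACTOR

print(walk_speed(is_vr=False))   # ~2.1 m/s, snappy on a monitor
print(walk_speed(is_vr=True))    # 1.4 m/s, an earthly pace in the headset
```

The design point is that comfort parameters like this belong in one tunable place, so the same game logic can target both flat screens and headsets.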

That said, VR is also being applied in areas other than gaming. Though games will remain important, nongaming applications will eventually overshadow them. These applications may differ from games in a number of ways, with the most significant having much less emphasis on game mechanics and more emphasis on either the experience itself or application-specific goals. Of course, this doesn't preclude some game mechanics. For example, the application may be specifically designed to train the user in a specific skill. Sometimes, the gamification of a business or personal application makes it more fun and effective in driving the desired behavior through competition.

In general, nongaming VR applications are less about winning and more about the experience itself.

Here are a few examples of the nongaming application areas that are proving successful in VR:

Travel and tourism

: Visit faraway places without leaving your home. Visit art museums in Paris, New York, and Tokyo in one afternoon. Take a walk on Mars. You can even enjoy Holi, the spring festival of colors, in India while sitting in your wintery cabin in Vermont.

Mechanical engineering and industrial design

: Computer-aided design software, such as AutoCAD and SOLIDWORKS, pioneered three-dimensional modeling, simulation, and visualization. With VR, engineers and designers can directly experience the end product before it's actually built and play with what-if scenarios at a very low cost. Consider iterating a new automobile design. How does it look? How does it perform? How does it appear when sitting in the driver's seat?

Architecture and civil engineering

: Architects and engineers have always constructed scale models of their designs, if only to pitch the ideas to clients and investors or, more importantly, to validate the many assumptions about the design. Currently, modeling and rendering software is commonly used to build virtual models from architectural plans. With VR, the conversations with stakeholders can be so much more confident. Other personnel, such as interior designers, HVAC, and electrical engineers, can be brought into the process sooner.

Real estate

: Real-estate agents have been quick adopters of the internet and visualization technology to attract buyers and close sales. Real-estate search websites were some of the first successful uses of the web. Online panoramic video walkthroughs of for-sale properties have been commonplace for years. With VR, I can be in New York and gauge the feel of a place to live in Los Angeles.

Medicine

: The potential of VR for health and medicine may literally be a matter of life and death. Every day, hospitals use MRI and other scanning devices to produce models of our bones and organs that are used for medical diagnosis and possibly preoperative planning. Using VR to enhance visualization and measurement will provide a more intuitive analysis, for example. VR is especially being used for medical training, such as the simulation of surgery for medical students.

Mental health

: VR experiences have been shown to be effective in a therapeutic context for the treatment of post-traumatic stress disorder (PTSD) in what's called exposure therapy, where the patient, guided by a trained therapist, confronts their traumatic memories through the retelling of the experience. Similarly, VR is being used to treat arachnophobia (fear of spiders) and the fear of flying.

Education

: The educational opportunities for VR are almost too obvious to mention. One of the first successful VR experiences is Titans of Space, which lets you explore the solar system first hand. In science, history, the arts, and mathematics, VR will help students of all ages because, as they say, field trips are much more effective than textbooks.

Training

: Toyota has demonstrated a VR simulation of drivers' education to teach teenagers about the risks of distracted driving. In another project, vocational students got to experience the operating of cranes and other heavy construction equipment. Training for first responders, the police, and fire and rescue workers can be enhanced with VR by presenting highly risky situations and alternative virtual scenarios. The National Football League (NFL) and college teams are looking to VR for athletic training.

Entertainment and journalism

: Virtually attend rock concerts and sporting events or watch music videos. Re-experience news events as if you were personally present. Enjoy 360-degree cinematic experiences. The art of storytelling will be transformed by virtual reality.

Wow, that's quite a list! This is just the low-hanging fruit. Unity Technologies, the company behind the Unity 3D engine, appreciates this and is making an all-out push beyond gaming for its engine (you can learn more about Unity Solutions at https://unity.com/solutions).

The purpose of this book is not to dive too deeply into any of these applications. Rather, I hope that this brief look at the possibilities helps stimulate your thinking and provides an idea of how VR has the potential to be virtually anything for everyone. Next, we'll attempt to define the spectrum of the types of VR experience.

Types of VR experience

There is not just one kind of VR experience. In fact, there are many. Consider the following types of VR experiences:

Diorama

: In the simplest case, we build a 3D scene. You're observing from a third-person perspective. Your eye is the camera. Actually, each eye is a separate camera that gives you a stereoscopic view. You can look around.

First-person experience

: This time, you're immersed in the scene as a freely moving agent. Using an input controller (a hand controller or some other technique), you can "walk" around and explore the virtual scene.

Room scale

: The first-person experience combined with physical space. Given positional tracking, you can physically walk around a predefined area. A guardian system will show when you've reached unsafe boundaries.

Interactive virtual environment

: This is like the first-person experience, but you're more than an observer. While you are in the scene, you can interact with the objects in it. Physics is at play. Objects may respond to you. You may be given specific goals to achieve and challenges to face using the game mechanics. You might even earn points and keep score.

3D content creation

: In VR, you can create 3D content that can itself be experienced in VR.

Google Tilt Brush

is one of the first blockbuster experiences, as are

Oculus Medium

and

Google Blocks

, among others. Unity is working on

EditorXR

for Unity developers to work on their projects directly in the VR scene.

Riding on rails

: In this kind of experience, you're seated and being transported through the environment (or the environment changes around you). For example, you can ride a rollercoaster using this VR experience. However, it may not necessarily be an extreme thrill ride; it could be a simple real estate walk-through or even a slow, easy, and meditative experience.

360-degree media

: Think panoramic images that are projected on the inside of a sphere. You're positioned at the center of the sphere and can look all around. Some purists don't consider this

real

VR, because you're seeing a projection and not a model rendering. However, it can provide an effective sense of presence.

Social VR

: When multiple players enter the same VR space and can see and speak with each other's avatars, it becomes a remarkable social experience.

In this book, we will implement a number of projects that demonstrate how to build each of these types of VR experience. For brevity, we'll need to keep it pure and simple, with suggestions for areas for further investigation. Our focus will be on consumer-grade devices, described in the next section.

Types of HMD

Presently, there are two basic categories of HMDs for VR: desktop VR and mobile VR, although the distinctions are increasingly becoming blurred. Eventually, we might just talk about platforms as we do traditional computing, in terms of the operating system—for example, Windows, Android, and console VR. Let's look at each of these HMDs in more detail.

Desktop VR

With desktop VR (and console VR), your headset is peripheral to a more powerful computer that processes the heavy graphics. The computer may be a Windows PC, Mac, Linux, or a game console, although Windows PCs are by far the most prominent, and PlayStation is a bestseller in terms of console VR.

The headset is connected to the computer with physical wires (a tethered connection) or a near-zero-latency wireless connection. The game runs on the remote machine, and the HMD is a peripheral display device with motion-sensing input. The term desktop is an unfortunate misnomer since the computer is just as likely to be stationed in your living room or den.

The Oculus Rift (https://www.oculus.com/) is an example of a device where the goggles have an integrated display and sensors. The games run on a separate PC. Other desktop headsets include the HTC VIVE, Sony's PlayStation VR, and Microsoft Mixed Reality.

Desktop VR devices rely on a desktop computer for CPU (general processor) and GPU (graphics processing unit) power, where more is better. Please refer to the recommended specification requirements for your specific device.

However, for the purpose of this book, we won't have any heavy rendering in our projects, and you can get by with the minimum system specifications.

Mobile VR

Mobile VR originated with Google Cardboard (https://vr.google.com/cardboard/), a simple housing device for two lenses and a slot for your mobile phone. The phone's display shows twin stereoscopic views. It has rotational head tracking, but it has no positional tracking. Cardboard also provides the user with the ability to click or tap its side to make selections in a game. The complexity of the imagery is limited because it uses your phone's processor for rendering the views on the phone's display screen.

Google Daydream and the Samsung GearVR can be said to have progressed from Cardboard, requiring more performant minimum specifications in the Android phone, including greater processing power. The GearVR headsets include their own motion sensors to assist the phone, rather than relying on the phone's sensors alone. These devices also introduced a three-degrees-of-freedom (3DOF) hand controller that can be used as a laser pointer within VR experiences:

The term degrees of freedom (DoF) refers to the number of basic ways a rigid object can move through 3D space. There are six total degrees of freedom. Three correspond to rotational movement around the x, y, and z axes, commonly termed pitch, yaw, and roll. The other three correspond to translational movement along those axes, which can be thought of as moving forward or backward, moving left or right, and moving up or down.
– Google VR Concepts (https://developers.google.com/vr/discover/degrees-of-freedom)
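The six degrees of freedom can be sketched as a simple data structure. This is a hypothetical plain-Python illustration (not Unity or any SDK's API) of the pose a tracking system maintains:

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """A rigid-body pose: 3 translational + 3 rotational degrees of freedom."""
    x: float = 0.0      # left/right translation
    y: float = 0.0      # up/down translation
    z: float = 0.0      # forward/back translation
    pitch: float = 0.0  # rotation about the x axis (degrees)
    yaw: float = 0.0    # rotation about the y axis (degrees)
    roll: float = 0.0   # rotation about the z axis (degrees)

# A 3DOF headset (like Cardboard or GearVR) updates only the three
# rotational components; a 6DOF headset (like the Quest) updates all six.
head = Pose6DoF()
head.yaw += 30.0   # turning your head is tracked on any VR headset
head.z += 0.5      # stepping forward is only tracked on 6DOF devices
```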
Since the previous edition of this book, Google has discontinued the Daydream headset and has made the Cardboard software open source. Likewise, Samsung and Oculus have discontinued support for GearVR, supplanted by the Oculus Go and Quest devices.

The next generation of mobile VR devices includes all-in-one headsets, such as Oculus Go, with embedded screens and processors, eliminating the need for a separate mobile phone. The Oculus Quest further adds depth sensors and spatial mapping processors to track the user's location in 3D space, 6DOF hand controllers, and in some cases even hand tracking without hand controllers.

As of December 2020, Oculus is sunsetting the Oculus Go and will stop accepting new applications in its store.

The bottom line is that the projects in this book will explore features from the high end to the low end of the consumer VR device spectrum. But generally, our projects will not demand a lot of processing power, nor will they require high-end VR capability, so you can begin developing for VR on any of these types of devices, including Google Cardboard and an ordinary mobile phone.

Next, let's dive a little deeper into how this technology works.

How virtual reality works

So, what is it about VR that's got everyone so excited? With your headset on, you experience synthetic scenes. It appears 3D, it feels 3D, and maybe you will even have a sense of actually being there inside the virtual world. The strikingly obvious thing is that VR looks and feels really cool! But why?

Immersion and presence are the two words that are used to describe the quality of a VR experience. The Holy Grail is to increase both to the point where it seems so real that you forget you're in a virtual world. Immersion is the result of emulating the sensory input that your body receives (visual, auditory, motor, and so on). This can be explained technically. Presence is the visceral feeling that you get of being transported there—a deep emotional or intuitive feeling. You could say that immersion is the science of VR and presence is the art. And that, my friend, is cool.

A number of different technologies and techniques come together to make the VR experience work, which can be separated into two basic areas:

3D viewing

Head, hand, and body tracking

In other words, displays and sensors, such as those built into today's mobile devices, are a big reason why VR is possible and affordable today.

Suppose that the VR system knows exactly where your head and hands are positioned at any given moment in time. Suppose that it can immediately render and display the 3D scene for this precise viewpoint stereoscopically. Then, wherever and whenever you move, you'll see the virtual scene exactly as you should. You will have a nearly perfect visual VR experience. That's basically it. Ta-dah!

Well, not so fast. Literally. Let's dig deeper into some of the psychological, physiological, and technical aspects that make VR work.

Stereoscopic 3D viewing

Split-screen stereography was discovered not long after the invention of photography. Take a look at the popular stereograph viewer from 1876 shown in the following photo (B.W. Kilborn & Co, Littleton, New Hampshire; see http://en.wikipedia.org/wiki/Benjamin_W._Kilburn):

A stereo photograph has separate views for the left and right eyes, which are slightly offset to create a parallax effect. This fools the brain into thinking that it's a truly three-dimensional view. The device contains separate lenses for each eye, which lets you easily focus on the photo close up.

Similarly, rendering these side-by-side stereo views is the first job of the VR-enabled camera object in Unity.
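That stereo camera pair can be sketched numerically. The following is a hypothetical plain-Python illustration (not the Unity camera API; the function name and the facing/axis convention are assumptions) of deriving two eye positions from a single tracked head position:

```python
import math

def eye_positions(head_pos, yaw_deg, ipd=0.064):
    """Offset each eye half the IPD along the head's local 'right' vector.
    head_pos is an (x, y, z) tuple; yaw_deg rotates about the vertical axis.
    The 0.064 m default is a typical adult interpupillary distance."""
    yaw = math.radians(yaw_deg)
    # Right vector for a head facing -z at yaw 0 (an assumed convention)
    right = (math.cos(yaw), 0.0, -math.sin(yaw))
    half = ipd / 2.0
    left_eye = tuple(p - half * r for p, r in zip(head_pos, right))
    right_eye = tuple(p + half * r for p, r in zip(head_pos, right))
    return left_eye, right_eye

# A head at standing height, looking straight ahead:
left, right = eye_positions((0.0, 1.7, 0.0), yaw_deg=0.0)
# Each eye sits 32 mm to either side of the head's center.
```

The engine renders the scene once from each of these two positions every frame, which is what produces the parallax between the left and right views.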

Let's say that you're wearing a VR headset and you're holding your head very still so that the image looks frozen. It still appears better than a simple stereograph. Why?

The old-fashioned stereograph has relatively small twin images, each bounded by a rectangle. When your eye is focused on the center of the view, the 3D effect is convincing, but you will see the boundaries of the view. Move your eyes around (even with your head still), and any remaining sense of immersion is totally lost. You're just an observer on the outside peering into a diorama.

Now, consider what a VR screen looks like without the headset by using the following screenshot:

The first thing that you will notice is that each eye has a barrel-shaped view. Why is that? The headset lens is a very wide-angle lens, so when you look through it, you have a nice wide field of view. In fact, it is so wide (and tall) that it distorts the image (pincushion effect). The graphics software inverts that distortion by creating a barrel distortion so that it looks correct to us through the lenses. This is referred to as an ocular distortion correction. The result is an apparent field of view (FOV) that is wide enough to include a lot more of your peripheral vision. For example, the Oculus Rift has an FOV of about 100 degrees (we talk more about FOV in Chapter 11, Using All 360 Degrees).
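The barrel pre-distortion is commonly approximated with a radial polynomial. Here is an illustrative Python sketch; the function name and the coefficient values k1 and k2 are made up for demonstration (in practice they are calibrated per lens):

```python
def barrel_predistort(x, y, k1=-0.22, k2=-0.02):
    """Pre-warp a normalized image coordinate (origin at the lens center)
    so that the lens's pincushion distortion cancels it out.
    Negative coefficients pull points toward the center (barrel shape)."""
    r2 = x * x + y * y                      # squared distance from center
    scale = 1.0 + k1 * r2 + k2 * r2 * r2    # radial polynomial
    return x * scale, y * scale

# Points near the edge are pulled inward more than points near the center,
# which is exactly the barrel shape visible on the headset's raw screen:
cx, _ = barrel_predistort(0.1, 0.0)   # near center: barely moved
ex, _ = barrel_predistort(0.9, 0.0)   # near edge: pulled in noticeably
```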

Also, of course, the view angle from each eye is slightly offset, comparable to the distance between your eyes, or the interpupillary distance (IPD). The IPD is used to calculate the parallax and can vary from one person to the next.

To measure your IPD, hold a ruler (with millimeter markings) on your forehead in front of a mirror, as close to your eyes as possible. Open one eye and line up the 0 mark on the center of your pupil. Now, close that eye and open the other, and the distance to the center of your other pupil should be your IPD.
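The IPD drives how much parallax an object produces: the angle between the two eyes' lines of sight grows as an object gets closer. A quick illustrative Python sketch (the 64 mm default is a typical adult IPD, not a constant from any SDK):

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.064):
    """Angle between the two eyes' lines of sight when both fixate
    on a point straight ahead at the given distance (in meters)."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

near = vergence_angle_deg(0.5)    # object at arm's length: large angle
far = vergence_angle_deg(10.0)    # object across the room: small angle
# Nearer objects require a larger vergence angle, i.e. more parallax,
# which is why the stereo effect is strongest for close-up objects.
```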

It might be less obvious, but if you look closer at the VR screen, you will see color separations, like you'd get from a color printer whose print head is not aligned properly. This is intentional. Light passing through a lens is refracted at different angles based on the wavelength of the light. Again, the rendering software inverts the color separation so that it looks correct to us. This is referred to as a chromatic aberration correction. It helps make the image look really crisp.

The resolution of the screen is also important to get a convincing view. If it's too low res, you'll see the pixels, or what some refer to as a screen-door effect. The pixel width and height of the display is an oft-quoted specification when comparing the HMDs, but the pixels per inch (PPI) value may be more important. Other innovations in display technology, such as pixel smearing and foveated rendering (showing higher-resolution details exactly where the eyeball is looking) help improve the apparent resolution and reduce the screen-door effect.
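PPI follows directly from resolution and physical panel size. An illustrative Python calculation (the panel sizes below are hypothetical, chosen to make the comparison obvious):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from panel resolution and diagonal size (inches)."""
    diagonal_px = math.hypot(width_px, height_px)  # pixel count along the diagonal
    return diagonal_px / diagonal_in

# Two hypothetical panels with the same resolution but different sizes:
phone_ppi = ppi(2560, 1440, 5.7)     # small, dense phone-sized panel
monitor_ppi = ppi(2560, 1440, 27.0)  # same pixels spread over a big screen
# The smaller panel packs several times more pixels per inch, which is
# what matters when the display sits an inch or two from your eyes.
```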

When experiencing a 3D scene in VR, you must also consider the frames per second (FPS). If the FPS is too slow, the animation will look choppy. Things that affect FPS include the GPU performance and the complexity of the Unity scene (the number of polygons and lighting calculations), among other factors. This is compounded in VR because you need to draw the scene twice, once for each eye. Technology innovations, such as GPUs that are optimized for VR, frame interpolation, and other techniques, will improve the frame rates. For us developers, performance-tuning techniques in Unity that are often used by mobile game developers can be applied in VR (we will talk more about performance optimization in Chapter 13, Optimizing for Performance and Comfort). These techniques and optics help make the 3D scene appear realistic.
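The arithmetic behind the FPS requirement is simple but unforgiving. A small Python illustration of the render-time budget at common VR refresh rates:

```python
def frame_budget_ms(fps):
    """Milliseconds available to render one frame at a target frame rate."""
    return 1000.0 / fps

for target in (72, 90, 120):   # refresh rates typical of consumer VR headsets
    budget = frame_budget_ms(target)
    # The whole scene must be rendered twice (once per eye) within this
    # budget, leaving roughly half of it for each eye's view.
    print(f"{target} FPS -> {budget:.1f} ms per frame")
```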

Sound is also very important—more important than many people realize. VR should be experienced while wearing stereo headphones. In fact, you can still have a great experience when the audio is done well but the graphics are pretty crappy. We see this a lot in TV and cinema. The same holds true in VR. Binaural audio gives each ear its own stereo view of a sound source in such a way that your brain imagines its location in 3D space. True 3D audio provides an even more realistic spatial audio rendering, where sounds bounce off nearby walls and can be occluded by obstacles in the scene to enhance the first-person experience and realism.

For a fun example of binaural audio, put on your headphones and visit the classic Virtual Barber Shop at https://www.youtube.com/watch?v=IUDTlvagjJA. No special listening devices are needed. Regular headphones will work (speakers will not).

Lastly, the VR headset should fit your head and face comfortably so that it's easy to forget that you're wearing it, and it should block out light from the real environment around you.

Head, hand, and body tracking

So, we have a nice 3D picture that is viewable in a comfortable VR headset with a wide field of view. If we had this setup and you moved your head, it'd feel like you had a diorama box stuck to your face. Move your head and the box moves along with it, much like holding the antique stereograph device or the View-Master. Fortunately, VR is so much better.

The VR headset has a motion sensor (IMU) inside that detects spatial acceleration and rotation rates on all three axes, providing what's called the six degrees of freedom (6DOF). This is the same technology that is commonly found in mobile phones and some console game controllers, but perhaps with higher sensitivity and accuracy. With the IMU integrated into your headset, when you move your head, the current viewpoint is calculated and used when the next frame's image is drawn. This is referred to as motion detection.

The previous generation of mobile motion sensors was good enough for playing mobile games on a phone, but for VR, it's not accurate enough. These inaccuracies (rounding errors) accumulate over time, as the sensor is sampled thousands of times per second, and the system may eventually lose track of where you are in the real world. This drift was a major shortfall of the older, phone-based Google Cardboard VR. It could sense your head's motion, but it lost track of your head's orientation. The current generation of phones, such as the Google Pixel and Samsung Galaxy, which conform to the Daydream specifications, have upgraded sensors.
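Why these tiny errors matter can be shown in a few lines. This is an illustrative Python sketch with a made-up bias value, not data from any real sensor:

```python
# Integrating a gyroscope reading that carries a small constant bias:
# the heading error grows linearly with time, no matter how small the
# per-sample error is, because integration never forgets.
bias_deg_per_s = 0.01          # hypothetical constant sensor bias
sample_rate_hz = 1000          # samples per second
dt = 1.0 / sample_rate_hz      # time step per sample

heading_error = 0.0
for _ in range(sample_rate_hz * 60):   # one minute of tracking
    heading_error += bias_deg_per_s * dt

print(f"heading drift after 60 s: {heading_error:.2f} degrees")
# A separate tracking mechanism (such as the optical systems described
# next) is needed to periodically correct this accumulated error.
```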

High-end HMDs account for drift with a separate positional tracking mechanism:

Outside-in

: The original Oculus Rift CV1 used outside-in positional tracking, where an array of (invisible) infrared LEDs on the HMD is read by an external optical sensor (an infrared camera) to determine your position. You need to remain within the

view

of the camera for the head tracking to work.

Lighthouse

: Alternatively, the SteamVR VIVE

Lighthouse

technology places two or more dumb laser emitters in the room (much like the lasers in a barcode reader at the grocery checkout), and optical sensors on the headset read the sweeping rays to determine your position.

Spatial mapping