Complete Virtual Reality and Augmented Reality Development with Unity

Jesse Glover
Jonathan Linowes

Description

Get close and comfortable with Unity and build applications that run on HoloLens, Daydream, and Oculus Rift


Key Features:


Build fun augmented reality applications using ARKit, ARCore, and Vuforia

Explore virtual reality by developing more than 10 engaging projects

Learn how to integrate AR and VR concepts together in a single application


Book Description:


Unity is the leading platform to develop mixed reality experiences because it provides a great pipeline for working with 3D assets.


Using a practical and project-based approach, this Learning Path educates you about the specifics of AR and VR development using Unity 2018 and Unity 3D. You’ll learn to integrate, animate, and overlay 3D objects on your camera feed, before moving on to implement sensor-based AR applications. You’ll explore various concepts by creating an AR application using Vuforia, developing on both macOS and Windows and targeting Android and iOS devices. Next, you’ll learn how to develop VR applications that can be experienced with devices such as the Oculus Rift and HTC Vive. You’ll also explore various tools for VR development: gaze-based versus hand controller input, world space UI canvases, locomotion and teleportation, timeline animation, and multiplayer networking.


You’ll learn the Unity 3D game engine via the interactive Unity Editor and C# programming.


By the end of this Learning Path, you’ll be fully equipped to develop rich, interactive mixed reality experiences using Unity.


This Learning Path includes content from the following Packt products:


Unity Virtual Reality Projects - Second Edition by Jonathan Linowes

Unity 2018 Augmented Reality Projects by Jesse Glover

What you will learn:

Create 3D scenes to learn about world space and scale

Move around your scenes using locomotion and teleportation

Create filters or overlays that work with facial recognition software

Interact with virtual objects using eye gaze, hand controllers, and user input events

Design and build a VR storytelling animation with a soundtrack and timelines

Create social VR experiences with Unity networking


Who this book is for:


If you are a game developer familiar with 3D computer graphics and interested in building your own AR and VR games or applications, then this Learning Path is for you. Any prior experience in Unity and C# will be an advantage. In all, this course teaches you the tools and techniques to develop engaging mixed reality applications.


Jonathan Linowes is founder of Parkerhill Reality Labs, an immersive media indie studio and developer of the BridgeXR toolkit, Power Solitaire VR game, and upcoming Chess Or Die game. He is a VR/AR evangelist, Unity developer, entrepreneur, and teacher. Jonathan has a BFA degree from Syracuse University, an MS degree from the MIT Media Lab, and held technical leadership positions at Autodesk, among other companies. He has authored a number of books and videos by Packt, including Unity Virtual Reality Projects (first edition 2015), Cardboard VR Projects for Android, and Augmented Reality for Developers.

Jesse Glover is a self-taught software developer and indie game developer who has worked with multiple game engines and has written many tutorials on the subject of game development over the past 8 years. He maintains a YouTube channel dedicated to game development made easy and writes for Zenva in his spare time to teach the ins and outs of game development with Unity, CryEngine, and Unreal Engine, just to name a few. Jesse has also written Unity Programming for Human Beings.




Complete Virtual Reality and Augmented Reality Development with Unity

 

 

 

 

Leverage the power of Unity and become a pro at creating mixed reality applications

 

 

 

 

 

 

 

 

 

Jesse Glover 
Jonathan Linowes 

 

 

 

 

 

 

 

 

 

BIRMINGHAM - MUMBAI

Complete Virtual Reality and Augmented Reality Development with Unity

 

Copyright © 2019 Packt Publishing

 

 

All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.

Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the authors nor Packt Publishing or its dealers and distributors will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.

Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.

 

 

First Published: April 2019

 

Production Reference: 1170419

 

 

 

Published by Packt Publishing Ltd. Livery Place, 35 Livery Street Birmingham, B3 2PB, U.K.

 

ISBN 978-1-83864-818-3

www.packtpub.com

 
mapt.io

Mapt is an online digital library that gives you full access to over 5,000 books and videos, as well as industry-leading tools to help you plan your personal development and advance your career. For more information, please visit our website.

Why Subscribe?

Spend less time learning and more time coding with practical eBooks and Videos from over 4,000 industry professionals

Improve your learning with Skill Plans built especially for you

Get a free eBook or video every month

Mapt is fully searchable

Copy and paste, print, and bookmark content

Packt.com

Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.packt.com and as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at [email protected] for more details.

At www.packt.com, you can also read a collection of free technical articles, sign up for a range of free newsletters, and receive exclusive discounts and offers on Packt books and eBooks. 

Contributors

About the Authors

Jesse Glover is a self-taught software developer and indie game developer who has worked with multiple game engines and has written many tutorials on the subject of game development over the past 8 years. He maintains a YouTube channel dedicated to game development made easy and writes for Zenva in his spare time to teach the ins and outs of game development with Unity, CryEngine, and Unreal Engine, just to name a few. Jesse has also written Unity Programming for Human Beings.

 

 

 

 

 

Jonathan Linowes is a founder of Parkerhill Reality Labs, an immersive media indie studio and developer of the BridgeXR toolkit, Power Solitaire VR game, and upcoming Chess Or Die game. He is a VR/AR evangelist, Unity developer, entrepreneur, and teacher. Jonathan has a BFA degree from Syracuse University, an MS degree from the MIT Media Lab, and held technical leadership positions at Autodesk, among other companies. He has authored a number of books and videos by Packt, including Unity Virtual Reality Projects (first edition 2015), Cardboard VR Projects for Android, and Augmented Reality for Developers.

 

 

 

 

 

Packt Is Searching for Authors Like You

If you're interested in becoming an author for Packt, please visit authors.packtpub.com and apply today. We have worked with thousands of developers and tech professionals, just like you, to help them share their insight with the global tech community. You can make a general application, apply for a specific hot topic that we are recruiting an author for, or submit your own idea.

Table of Contents

Title Page

Copyright

Complete Virtual Reality and Augmented Reality Development with Unity

About Packt

Why Subscribe?

Packt.com

Contributors

About the Authors

Packt Is Searching for Authors Like You

Preface

Who This Book Is For

What This Book Covers

To Get the Most out of This Book

Download the Example Code Files

Conventions Used

Get in Touch

Reviews

Virtually Everything for Everyone

What is virtual reality to you?

Types of head-mounted displays

Desktop VR

Mobile VR

The difference between virtual reality and augmented reality

Applications versus games

How virtual reality really works

Stereoscopic 3D viewing

Head tracking

Types of VR experiences

Technical skills that are important to VR

Summary

Content, Objects, and Scale

Getting started with Unity

Creating a new Unity project

The Unity editor

The default world space

Creating a simple diorama

Adding a cube

Adding a plane

Adding a sphere and some material

Changing the scene view

Adding a photo

Coloring the ground plane

Measurement tools

Keeping a unit cube handy

Using a Grid Projector

Measuring the Ethan character

Using third-party content

Creating 3D content with Blender

An introduction to Blender

A unit cube

UV Texture image

Importing into Unity

A few observations

Creating 3D content in VR

Exporting and importing Tilt Brush models

Publishing and importing using Google Poly

Editing Unity in VR with EditorXR

Setting up EditorXR

Using EditorXR

Summary

VR Build and Run

Unity VR Support and Toolkits

Unity's built-in VR support

Device-specific toolkits

Application toolkits

Web and JavaScript-based VR

3D worlds

Enabling Virtual Reality for your platform

Setting your target platform

Setting your XR SDK

Installing your device toolkit

Creating the MeMyselfEye player prefab

Building for SteamVR

Building for Oculus Rift

Building for Windows Immersive MR

Setting up Windows 10 Developer mode

Installing UWP support in Visual Studio

UWP build

Setting up for Android devices

Installing the Java Development Kit (JDK)

Installing Android SDK

Via Command Line Tools

About your Android SDK root path location

Installing USB device debugging and connection

Configuring the Unity External Tools

Configuring Unity Player Settings for Android

Building for GearVR and Oculus Go

Building for Google VR

Google Daydream

Google Cardboard

Google VR Play Mode

Setting up for iOS devices

Have an Apple ID

Install Xcode

Configuring the Unity Player Settings for iOS 

Build And Run

Summary

Gaze-Based Control

Ethan, the walker

Artificially intelligent Ethan

The NavMesh bakery

A random walker in the town

The RandomPosition script

"Zombie-ize" Ethan!

Go where I'm looking

The LookMoveTo script

Adding a feedback cursor

Observing through obstacles

If looks could kill

The KillTarget script

Adding particle effects

Cleaning up

Short intro to Unity C# programming

Summary

Handy Interactables

Setting up the scene

Creating a balloon

Making it a prefab

Basic button input

Using the Fire1 button

OpenVR trigger button

Daydream controller clicks

Polling for clicks

Our own button interface functions

Creating and releasing balloons

Inflating a balloon while pressed

Using scriptable objects for input

Creating the scriptable object

Populating the input action object

Accessing the input action object

Simulation testing with scriptable objects

Using Unity events for input

Invoking our input action events

Subscribing to input events

Really using your hands

Parenting balloons to your hand

Popping balloons

Interactable items

Interactables using SteamVR Interaction System

Interactables using Daydream VR Elements

Summary

World Space UI

Studying VR design principles

A reusable default canvas

Visor HUD

The reticle cursor

The windshield HUD

The game element UI

Using TextMeshPro

Info bubble

An in-game dashboard with input events

Creating a dashboard with buttons

Linking the water hose to the buttons

Activating buttons from the script

Look to highlight a button

Looking and then clicking to select

Looking and starting to select

Pointing and clicking with VR components

Using Unity UI and SteamVR

Using Unity UI and Daydream

Building a wrist-based menu palette

Summary

Locomotion and Comfort

Understanding Unity characters

Unity components

The Camera component

The Rigidbody component

The Character Controller component

Unity Standard Assets

ThirdPersonController

AIThirdPersonController

First-person FPSController

RigidBodyFPSController

Using glide locomotion

Move in the direction you're looking

Keep your feet on the ground

Don't pass through solid objects

Don't fall off the edge of the world

Stepping over small objects and handling uneven terrain

Starting and stopping movement

Adding comfort mode locomotion

Other locomotion considerations

Techniques for teleportation

Looking to teleport

Teleporting between surfaces

Teleport spawn points

Other teleport considerations

Teleportation toolkits

Teleporting with SteamVR Interaction System

Teleporting with Daydream Elements

Resetting center and position 

Supporting room scale teleportation

Managing VR motion sickness

Summary

Playing with Physics and Fire

Unity physics

Bouncy balls

Managing game objects

Destroying fallen objects

Setting a limited lifetime

Implementing an object pool

Headshot game

Paddle ball game

Deflector versus paddle

Shooter ball game

Juicing the scene

Great balls of fire

Skull environment

Audio synchronization

Summary

Animation and VR Storytelling

Composing our story 

Gathering the assets

Creating the initial scene

Timelines and Audio tracks

Using a Timeline to activate objects

Recording an Animation Track

A growing tree

A growing bird

Using the Animation editor

A wafting nest

Animating other properties

Animating lights

Animating a scripted component property

Controlling particle systems

Using Animation clips

Shaking an egg

Using Animator Controllers

Definitions for Animation and Animator

ThirdPersonController Animator

Living Birds Animator

Learning to fly

Hacking the birds

Fly away!

Making the story interactive

Look to play

Resetting the initial scene setup

More interactivity ideas

Summary

What AR is and How to Get Set up

Available AR packages 

Defining AR

An incomplete list of AR devices

Advantages and disadvantages of the different AR toolkits available

ARCore

ARKit

Vuforia

ARToolKit

Building our first AR applications

Setting up Vuforia

Setting up ARToolKit

Setting up ARCore

Setting up ARKit

Building Hello World in ARKit

Summary

GIS Fundamentals - The Power of Mapping

What is GIS?

The history of GIS

GIS techniques and technologies

Ways to capture GIS

Converting from raster to vector

Projections and coordinate systems

Spatial analysis with GIS

Data analysis with GIS

GIS modeling

Geometric networks

Hydrological modeling

Cartographic modeling

Map overlays

Statistics used with GIS

Geocoding

Reverse geocoding

Open Geospatial Consortium Standards

Web mapping

GIS and adding dimension of time

Semantics

The implications of GIS in society

GIS in the real world

GIS in education

GIS in local governments

GIS and augmented reality

Applications of GIS

Gaming and GIS

Summary

Censored - Various Sensor Data and Plugins

Project overview

Getting started

Sensors

Leveraging sensors with plugins

Writing unity plugins

C# language plugin

C++ language plugin

Swift language plugin

Objective-C language plugin

Java language plugin

Creating a sensor driver in Java

Summary

The Sound of Flowery Prose

Project overview

Getting started

Conceptualizing the project

Basic idea/concept

Choosing the right programming language

Choosing your release platform

Choosing your game engine, libraries, and frameworks

Developing the game design and application design document

Bonus – UML design

Prototyping

Setting up the Unity project

Code implementation details

Working with XCode

Summary

Picture Puzzle - The AR Experience

Project background

Project overview

Getting started

Installing Vuforia

Differences between macOS and Windows setups

Windows project setup

Building the Windows project

macOS project setup

Building the macOS Project

Working with Xcode

Summary

Fitness for Fun - Tourism and Random Walking

Background information on Mapbox

Project overview

Getting started

Setting up Mapbox

Important items to note

Setting up the project

Scripting the project

Finalizing the project

Summary

Snap it! Adding Filters to Pictures

Project overview

Getting started

What is OpenCV?

Creating the project with paid assets

Installing and building OpenCV

Downloading OpenCV

Downloading CMake

Configuring the CMake and OpenCV source files

OpenCV with Unity

OpenCV and Unity

Summary

To the HoloLens and Beyond

What is Mixed Reality, and how does it work?

Urban Hunt

Smart Urban Golf

XR applications in media

XR with HoloLens

Getting Mixed Reality ready

Project overview

Playing with Mixed Reality

Setting up the camera

Performance and quality control

Targeting the Windows 10 SDK

Do the robot

Building and deploying from Visual Studio

Summary

Other Books You May Enjoy

Leave a review - let other readers know what you think

Preface

Unity is the leading platform to develop mixed reality experiences because it provides a great pipeline for working with 3D assets.

Using a practical and project-based approach, this Learning Path educates you about the specifics of AR and VR development using Unity 2018 and Unity 3D. You’ll learn to integrate, animate, and overlay 3D objects on your camera feed, before moving on to implement sensor-based AR applications. You’ll explore various concepts by creating an AR application using Vuforia, developing on both macOS and Windows and targeting Android and iOS devices. Next, you’ll learn how to develop VR applications that can be experienced with devices such as the Oculus Rift and HTC Vive. You’ll also explore various tools for VR development: gaze-based versus hand controller input, world space UI canvases, locomotion and teleportation, timeline animation, and multiplayer networking. You’ll learn the Unity 3D game engine via the interactive Unity Editor and C# programming.

By the end of this Learning Path, you’ll be fully equipped to develop rich, interactive mixed reality experiences using Unity.

Who This Book Is For

If you are a game developer familiar with 3D computer graphics and interested in building your own AR and VR games or applications, then this Learning Path is for you. Any prior experience in Unity and C# will be an advantage. In all, this course teaches you the tools and techniques to develop engaging mixed reality applications.

What This Book Covers

Chapter 1, Virtually Everything for Everyone, is an introduction to the new technologies and opportunities in consumer virtual reality in games and non-gaming applications, including an explanation of stereoscopic viewing and head tracking.

Chapter 2, Content, Objects, and Scale, introduces the Unity game engine as we build a simple diorama scene and reviews importing 3D content created with other tools such as Blender, Tilt Brush, Google Poly, and Unity EditorXR.

Chapter 3, VR Build and Run, helps you set up your system and Unity project to build and run on your target device(s), including SteamVR, Oculus Rift, Windows MR, GearVR, Oculus Go, and Google Daydream.

Chapter 4, Gaze-Based Control, explores the relationship between the VR camera and objects in the scene, including 3D cursors and gaze-based ray guns. This chapter also introduces Unity scripting in the C# programming language.

Chapter 5, Handy Interactables, looks at user input events using controller buttons and interactable objects, covering various software patterns including polling, scriptable objects, Unity events, and interactable components provided with toolkit SDKs.

Chapter 6, World Space UI, implements many examples of user interface (UI) for VR using a Unity world space canvas, including a heads-up display (HUD), info-bubbles, in-game objects, and a wrist-based menu palette.

Chapter 7, Locomotion and Comfort, dives into techniques for moving yourself around a VR scene, looking closely at the Unity first-person character objects and components, locomotion, teleportation, and room-scale VR.

Chapter 8, Playing with Physics and Fire, explores the Unity physics engine, physic materials, particle systems, and more C# scripting, as we build a paddle ball game to whack fireballs in time to your favorite music.

Chapter 9, Animation and VR Storytelling, builds a complete VR storytelling experience using imported 3D assets and soundtrack, and Unity timelines and animation.

Chapter 10, What AR is and How to Get Set Up, explains the processes of installing various SDKs and packages for enabling AR, and building a Hello World example with AR.

Chapter 11, GIS Fundamentals - The Power of Mapping, explores the history of GIS, GIS implications in applications and games, and GIS in education.

Chapter 12, Censored - Various Sensor Data and Plugins, looks at how to write plugins for Unity in C#, C++, Objective-C, and Java.

Chapter 13, The Sound of Flowery Prose, goes into details of the steps for designing an application, looks at conceptualizing the project, and explores how to create an AR application based on the perception of sound.

Chapter 14, Picture Puzzle - The AR Experience, helps you design an educational app, learn to use Vuforia, and develop an educational AR application with Vuforia.

Chapter 15, Fitness for Fun - Tourism and Random Walking, teaches about Mapbox, integrating Mapbox into Unity, and building a random walk-to-location app prototype.

Chapter 16, Snap it! Adding Filters to Pictures, helps you learn about OpenCV, incorporate OpenCV into Unity, build OpenCV from source, and build a facial detection app prototype with OpenCV.

Chapter 17, To the HoloLens and Beyond, gives you an insight into the difference between AR and Mixed Reality (MR), teaches you how to use the HoloLens simulator, and has you build a basic MR prototype using the simulator.

To Get the Most out of This Book

To get the most out of this book, you should have some knowledge of the Unity Editor, its UI, and its build processes. In addition, it is highly advised that you have some skill with C# beyond the beginner level, as this book does not teach you how to write C# code. Lastly, it is suggested that you have, at the very least, taken a look at other programming languages, such as Swift, Objective-C, C, C++, and Java, so that you can get the gist of the code you will encounter in this book at a glance.

The only requirements are basic knowledge of the Unity Game Engine and C#, as they are the primary focuses of this book.

Download the Example Code Files

You can download the example code files for this book from your account at www.packt.com. If you purchased this book elsewhere, you can visit www.packt.com/support and register to have the files emailed directly to you.

You can download the code files by following these steps:

1. Log in or register at www.packt.com.
2. Select the SUPPORT tab.
3. Click on Code Downloads & Errata.
4. Enter the name of the book in the Search box and follow the onscreen instructions.

Once the file is downloaded, please make sure that you unzip or extract the folder using the latest version of:

WinRAR/7-Zip for Windows

Zipeg/iZip/UnRarX for Mac

7-Zip/PeaZip for Linux

The code bundle for the book is also hosted on GitHub at https://github.com/PacktPublishing/Complete-Virtual-Reality-and-Augmented-Reality-Development-with-Unity. In case there's an update to the code, it will be updated on the existing GitHub repository.

We also have other code bundles from our rich catalog of books and videos available at https://github.com/PacktPublishing/. Check them out!

Conventions Used

There are a number of text conventions used throughout this book.

CodeInText: Indicates code words in text, database table names, folder names, filenames, file extensions, pathnames, dummy URLs, user input, and Twitter handles. Here is an example: "Create a brand new Unity Project. I will call mine Snap."

A block of code is set as follows:

struct Circle
{
    Circle(int x, int y, int radius) : X(x), Y(y), Radius(radius) {}
    int X, Y, Radius;
};

When we wish to draw your attention to a particular part of a code block, the relevant lines or items are set in bold:

extern "C" void __declspec(dllexport) __stdcall Close(){

_capture.release();

}

Bold: Indicates a new term, an important word, or words that you see onscreen. For example, words in menus or dialog boxes appear in the text like this. Here is an example: "Select System info from the Administration panel."

Warnings or important notes appear like this.
Tips and tricks appear like this.

Get in Touch

Feedback from our readers is always welcome.

General feedback: If you have questions about any aspect of this book, mention the book title in the subject of your message and email us at [email protected].

Errata: Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you have found a mistake in this book, we would be grateful if you would report this to us. Please visit www.packt.com/submit-errata, select your book, click on the Errata Submission Form link, and enter the details.

Piracy: If you come across any illegal copies of our works in any form on the Internet, we would be grateful if you would provide us with the location address or website name. Please contact us at [email protected] with a link to the material.

If you are interested in becoming an author: If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, please visit authors.packtpub.com.

Reviews

Please leave a review. Once you have read and used this book, why not leave a review on the site that you purchased it from? Potential readers can then see and use your unbiased opinion to make purchase decisions, we at Packt can understand what you think about our products, and our authors can see your feedback on their book. Thank you!

For more information about Packt, please visit packt.com.

Virtually Everything for Everyone

This virtual reality thing calls into question, what does it mean to "be somewhere"? Before cell phones, you would call someone and it would make no sense to say, "Hey, where are you?" You know where they are, you called their house, that's where they are. So then cell phones come around and you start to hear people say, "Hello. Oh, I'm at Starbucks," because the person on the other end wouldn't necessarily know where you are, because you became un-tethered from your house for voice communications. So when I saw a VR demo, I had this vision of coming home and my wife has got the kids settled down, she has a couple minutes to herself, and she's on the couch wearing goggles on her face. I come over and tap her on the shoulder, and I'm like, "Hey, where are you?" It's super weird. The person's sitting right in front of you, but you don't know where they are.                                                                     -Jonathan Stark, mobile expert, and podcaster

Welcome to virtual reality (VR)! In this book, we will explore what it takes to create virtual reality experiences on our own. We will take a walk through a series of hands-on projects, step-by-step tutorials, and in-depth discussions using the Unity 3D game engine and other free or open source software. Though virtual reality technology is rapidly advancing, we'll try to capture the basic principles and techniques that you can use to make your VR games and applications feel immersive and comfortable.

In this first chapter, we will define virtual reality and illustrate how it can be applied not only to games but also many other areas of interest and productivity. This chapter discusses the following topics:

What is virtual reality?

Differences between virtual reality and augmented reality

How VR applications may differ from VR games

Types of VR experiences

Technical skills that are necessary for the development of VR

What is virtual reality to you?

Today, we are witnessing the burgeoning of consumer virtual reality, an exciting technology that promises to fundamentally transform how we interact with information, our friends, and the world at large.

What is virtual reality? In general, VR is the computer-generated simulation of a 3D environment, which seems very real to the person experiencing it, using special electronic equipment. The objective is to achieve a strong sense of being present in the virtual environment.

Today's consumer tech VR involves wearing an HMD (head-mounted display goggles) to view stereoscopic 3D scenes. You can look around by moving your head, and walk around by using hand controls or motion sensors. You are engaged in a fully immersive experience. It's as if you're really there in some other virtual world. The following image shows me, the author, experiencing an Oculus Rift Development Kit 2 (DK2) in 2015:

Virtual reality is not new. It's been here for decades, albeit hidden away in academic research labs and high-end industrial and military facilities. It was big, clunky, and expensive. Ivan Sutherland invented the first HMD in 1965 (see https://amturing.acm.org/photo/sutherland_3467412.cfm). It was tethered to the ceiling! In the past, several failed attempts have been made to bring consumer-level virtual reality products to the market.

In 2012, Palmer Luckey, the founder of Oculus VR LLC, gave a demonstration of a makeshift head-mounted VR display to John Carmack, the famed developer of the classic video games Doom, Wolfenstein 3D, and Quake. Together, they ran a successful Kickstarter campaign and released a developer kit called Oculus Rift Development Kit 1 (DK1) to an enthusiastic community. This caught the attention of investors as well as Mark Zuckerberg, and in March 2014, Facebook bought the company for $2 billion. With no product, no customers, and infinite promise, the money and attention that it attracted helped fuel a new category of consumer products.

Concurrently, others were also working on their own products, which were soon introduced to the market, including the HTC VIVE (with Valve's SteamVR), Google Daydream, Sony PlayStation VR, Samsung Gear VR, Microsoft's immersive Mixed Reality headsets, and more. New innovations and devices that enhance the VR experience continue to be introduced.

Most of the basic research has already been done and the technology is now affordable thanks in large part to the mass adoption of devices that work on mobile technology. There is a huge community of developers with experience in building 3D games and mobile apps. Creative content producers are joining in and the media is talking it up. At last, virtual reality is real!

Say what? Virtual reality is real? Ha! If it's virtual, how can it be... Oh, never mind.

Eventually, we will get past the focus on the emerging hardware devices and recognize that content is king. The current generation of 3D development software (commercial, free, and open source) that has spawned a plethora of indie, or independent, game developers can also be used to build non-game VR applications.

Though VR finds most of its enthusiasts in the gaming community, the potential applications reach well beyond that. Any business that presently uses 3D modeling and computer graphics will be more effective if it uses VR technology. The sense of immersive presence that is afforded by VR can enhance all common online experiences today, which includes engineering, social networking, shopping, marketing, entertainment, and business development. In the near future, viewing 3D websites with a VR headset may be as common as visiting ordinary flat websites today.

Types of head-mounted displays

Presently, there are two basic categories of HMDs for virtual reality—desktop VR and mobile VR, although the distinctions are increasingly becoming blurred. Eventually, we might just talk about platforms as we do traditional computing, in terms of the operating system—Windows, Android, or console VR.

Desktop VR

With desktop VR (and console VR), your headset is a peripheral to a more powerful computer that processes the heavy graphics. The computer may be a Windows PC, Mac, Linux machine, or a game console, although Windows PCs are by far the most prominent and the PS4 is the bestselling console for VR.

Most likely, the headset is connected to the computer with wires. The game runs on the remote machine and the HMD is a peripheral display device with a motion sensing input. The term desktop is an unfortunate misnomer since it's just as likely to be stationed in either a living room or a den.

The Oculus Rift (https://www.oculus.com/) is an example of a device where the goggles have an integrated display and sensors. The games run on a separate PC. Other desktop headsets include the HTC VIVE, Sony's PlayStation VR, and Microsoft immersive Mixed Reality.

Desktop VR devices rely on a desktop computer (usually via video and USB cables) for CPU and graphics processing unit (GPU) power, where more is better. Please refer to the recommended specification requirements for your specific device.

However, for the purpose of this book, we won't have any heavy rendering in our projects, and you can get by with minimum system specifications.

Mobile VR

Mobile VR originated with Google Cardboard (https://vr.google.com/cardboard/), a simple housing device for two lenses and a slot for your mobile phone. The phone's display is used to show the twin stereoscopic views. It has rotational head tracking, but it has no positional tracking. The Cardboard also provides the user with the ability to click or tap its side to make selections in a game. The complexity of the imagery is limited because it uses your phone's processor for rendering the views on the phone display screen.

Google Daydream and Samsung GearVR improved the platforms by requiring more performant minimum specifications including greater processing power in the mobile phone. GearVR's headsets include motion sensors to assist the phone device. These devices also introduced a three-degrees-of-freedom (DOF) hand controller that can be used as a laser pointer within VR experiences.

The next generation of mobile VR devices includes all-in-one headsets, like Oculus Go, with embedded screens and processors, eliminating the need for a separate mobile phone. Newer models may include depth sensors and spatial mapping processors to track the user's location in 3D space.

The bottom line is, the projects in this book will explore features from the high end to the low end of the consumer VR device spectrum. But generally, our projects do not demand a lot of processing power nor do they require high-end VR capability, so you can begin developing for VR on any of these types of devices, including Google Cardboard and an ordinary mobile phone.

If you are interested in developing VR applications for Google Daydream on Android directly in Java rather than through the Unity game engine, please also refer to another of the author's books, Cardboard VR Projects for Android from Packt Publishing (https://www.packtpub.com/application-development/cardboard-vr-projects-android).

The difference between virtual reality and augmented reality

It's probably worthwhile to clarify what virtual reality is not.

A sister technology to VR is augmented reality (AR), which combines computer-generated imagery (CGI) with views of the real world. AR on smartphones has recently garnered widespread interest with the introduction of Apple's ARKit for iOS and Google ARCore for Android. Further, the Vuforia AR toolkit is now integrated directly with the Unity game engine, helping to drive even more adoption of the technology. AR on a mobile device overlays the CGI on top of live video from a camera.

The latest innovations in AR are wearable AR headsets, such as Microsoft's HoloLens and Magic Leap, which show the computer graphics directly in your field of view. The graphics are not mixed into a video image. If VR headsets are like closed goggles, AR headsets are like translucent sunglasses that combine the real-world light rays with CGI. A challenge for AR is ensuring that the CGI is consistently aligned with and mapped onto the objects in the real-world space and to eliminate latency while moving about so that they (the CGI and objects in the real-world space) stay aligned.

AR holds as much promise as VR for future applications, but it's different. Though AR intends to engage the user within their current surroundings, virtual reality is fully immersive. In AR, you may open your hand and see a log cabin resting in your palm, but in VR, you're transported directly inside the log cabin and you can walk around inside it.

We are also beginning to see hybrid devices that combine features of VR and AR and let you switch between modes.

If you are interested in developing applications for AR, please also refer to the author's book Augmented Reality for Developers from Packt Publishing (https://www.packtpub.com/web-development/augmented-reality-developers).

Applications versus games

Consumer-level virtual reality started with gaming. Video gamers are already accustomed to being engaged in highly interactive hyper-realistic 3D environments. VR just ups the ante.

Gamers are early adopters of high-end graphics technology. Mass production of gaming consoles and PC-based components in the tens of millions and competition between vendors leads to lower prices and higher performance. Game developers follow suit, often pushing the state of the art, squeezing every ounce of performance out of hardware and software. Gamers are a very demanding bunch, and the market has consistently stepped up to keep them satisfied. It's no surprise that many, if not most, of the current wave of VR hardware and software companies, are first targeting the video gaming industry. A majority of the VR apps on the Oculus Store such as Rift (https://www.oculus.com/experiences/rift/), GearVR (https://www.oculus.com/experiences/gear-vr/), and Google Play for Daydream (https://play.google.com/store/search?q=daydream&c=apps&hl=en), for example, are games. And of course, the Steam VR platform (http://store.steampowered.com/steamvr) is almost entirely about gaming. Gamers are the most enthusiastic VR advocates and seriously appreciate its potential.

Game developers know that the core of a game is the game mechanics, or the rules, which are largely independent of the skin, or the thematic topic of the game. Gameplay mechanics can include puzzles, chance, strategy, timing, or muscle memory. VR games can have the same mechanic elements but might need to be adjusted for the virtual environment. For example, a first-person character walking in a console video game is probably going about 1.5 times faster than their actual pace in real life. If this wasn't the case, the player would feel that the game was too slow and boring. Put the same character in a VR scene and they will feel that it is too fast; it could likely make the player feel nauseous. In VR, you want your characters to walk at a normal, earthly pace. Not all video games will map well to VR; it may not be fun to be in the middle of a war zone when you're actually there.
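
As a tiny illustration of the walking-speed point, a locomotion script might expose separate speeds and pick one based on the mode. The following is a minimal sketch, and the numbers are illustrative assumptions (a typical human walking pace is around 1.4 m/s), not values taken from any particular game:

using UnityEngine;

// Holds separate walking speeds for flat-screen and VR play.
public class WalkSpeedSettingsSketch : MonoBehaviour
{
    const float realWalkSpeed = 1.4f;                    // meters per second, typical human pace

    public float flatScreenSpeed = realWalkSpeed * 1.5f; // feels right on a monitor
    public float vrSpeed = realWalkSpeed;                // keep it earthly in VR

    public float SpeedFor(bool vrMode)
    {
        return vrMode ? vrSpeed : flatScreenSpeed;
    }
}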

That said, virtual reality is also being applied in areas other than gaming. Though games will remain important, non-gaming applications will eventually overshadow them. These applications may differ from games in a number of ways, with the most significant having much less emphasis on game mechanics and more emphasis on either the experience itself or application-specific goals. Of course, this doesn't preclude some game mechanics. For example, the application may be specifically designed to train the user in a specific skill. Sometimes, the gamification of a business or personal application makes it more fun and effective in driving the desired behavior through competition.

In general, non-gaming VR applications are less about winning and more about the experience itself.

Here are a few examples of the kinds of non-gaming applications that people are working on:

Travel and tourism: Visit faraway places without leaving your home. Visit art museums in Paris, New York, and Tokyo in one afternoon. Take a walk on Mars. You can even enjoy Holi, the spring festival of colors, in India while sitting in your wintery cabin in Vermont.

Mechanical engineering and industrial design: Computer-aided design software such as AutoCAD and SOLIDWORKS pioneered three-dimensional modeling, simulation, and visualization. With VR, engineers and designers can directly experience the end product before it's actually built and play with what-if scenarios at a very low cost. Consider iterating a new automobile design. How does it look? How does it perform? How does it appear when sitting in the driver's seat?

Architecture and civil engineering: Architects and engineers have always constructed scale models of their designs, if only to pitch the ideas to clients and investors or, more importantly, to validate the many assumptions about the design. Presently, modeling and rendering software is commonly used to build virtual models from architectural plans. With VR, the conversations with stakeholders can be so much more confident. Other personnel, such as the interior designers, HVAC, and electrical engineers, can be brought into the process sooner.

Real estate: Real estate agents have been quick adopters of the internet and visualization technology to attract buyers and close sales. Real estate search websites were some of the first successful uses of the web. Online panoramic video walkthroughs of for-sale properties are commonplace today. With VR, I can be in New York and find a place to live in Los Angeles.

Medicine: The potential of VR for health and medicine may literally be a matter of life and death. Every day, hospitals use MRI and other scanning devices to produce models of our bones and organs that are used for medical diagnosis and possibly pre-operative planning. Using VR to enhance visualization and measurement will provide a more intuitive analysis. Virtual reality is also being used for the simulation of surgery to train medical students.

Mental health: Virtual reality experiences have been shown to be effective in a therapeutic context for the treatment of post-traumatic stress disorder (PTSD) in what's called exposure therapy, where the patient, guided by a trained therapist, confronts their traumatic memories through the retelling of the experience. Similarly, VR is being used to treat arachnophobia (fear of spiders) and the fear of flying.

Education: The educational opportunities for VR are almost too obvious to mention. One of the first successful VR experiences is Titans of Space, which lets you explore the solar system first-hand. In science, history, arts, and mathematics, VR will help students of all ages because, as they say, field trips are much more effective than textbooks.

Training: Toyota has demonstrated a VR simulation of drivers' education to teach teenagers about the risks of distracted driving. In another project, vocational students got to experience the operating of cranes and other heavy construction equipment. Training for first responders, the police, and fire and rescue workers can be enhanced with VR by presenting highly risky situations and alternative virtual scenarios. The National Football League (NFL) and college teams are looking to VR for athletic training.

Entertainment and journalism: Virtually attend rock concerts and sporting events. Watch music videos. Erotica. Re-experience news events as if you were personally present. Enjoy 360-degree cinematic experiences. The art of storytelling will be transformed by virtual reality.

Wow, that's quite a list! This is just the low-hanging fruit.

The purpose of this book is not to dive too deeply into any of these applications. Rather, I hope that this survey helps stimulate your thinking and provides an idea of how virtual reality has the potential to be virtually anything for everyone.

How virtual reality really works

So, what is it about VR that's got everyone so excited? With your headset on, you experience synthetic scenes. It appears 3D, it feels 3D, and maybe you even have a sense of actually being there inside the virtual world. The strikingly obvious thing is: VR looks and feels really cool! But why?

Immersion and presence are the two words used to describe the quality of a VR experience. The Holy Grail is to increase both to the point where it seems so real, you forget you're in a virtual world. Immersion is the result of emulating the sensory input that your body receives (visual, auditory, motor, and so on). This can be explained technically. Presence is the visceral feeling that you get being transported there—a deep emotional or intuitive feeling. You could say that immersion is the science of VR and presence is art. And that, my friend, is cool.

A number of different technologies and techniques come together to make the VR experience work, which can be separated into two basic areas:

3D viewing

Head-pose tracking

In other words, displays and sensors, like those built into today's mobile devices, are a big reason why VR is possible and affordable today.

Suppose the VR system knows exactly where your head is positioned at any given moment in time. Suppose that it can immediately render and display the 3D scene for this precise viewpoint stereoscopically. Then, wherever and whenever you move, you'll see the virtual scene exactly as you should. You will have a nearly perfect visual VR experience. That's basically it. Ta-dah!

Well, not so fast. Literally.

Stereoscopic 3D viewing

Split-screen stereography was discovered not long after the invention of photography, like the popular stereograph viewer from 1876 shown in the following picture (B.W. Kilburn & Co, Littleton, New Hampshire; see http://en.wikipedia.org/wiki/Benjamin_W._Kilburn). A stereo photograph has separate views for the left and right eyes, which are slightly offset to create parallax. This fools the brain into thinking that it's a truly three-dimensional view. The device contains separate lenses for each eye, which let you easily focus on the photo close up:

Similarly, rendering these side-by-side stereo views is the first job of the VR-enabled camera in Unity.

Let's say that you're wearing a VR headset and you're holding your head very still so that the image looks frozen. It still appears better than a simple stereograph. Why?

The old-fashioned stereograph has relatively small twin images rectangularly bound. When your eye is focused on the center of the view, the 3D effect is convincing, but you will see the boundaries of the view. Move your eyes around (even with your head still), and any remaining sense of immersion is totally lost. You're just an observer on the outside peering into a diorama.

Now, consider what a VR screen looks like without the headset (see the following screenshot):

The first thing that you will notice is that each eye has a barrel-shaped view. Why is that? The headset lens is a very wide-angle lens. So, when you look through it, you have a nice wide field of view. In fact, it is so wide (and tall), it distorts the image (pincushion effect). The graphics software SDK does an inverse of that distortion (barrel distortion) so that it looks correct to us through the lenses. This is referred to as an ocular distortion correction. The result is an apparent field of view (FOV) that is wide enough to include a lot more of your peripheral vision. For example, the Oculus Rift has a FOV of about 100 degrees.
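
To make that correction concrete, here is a minimal sketch of a two-coefficient radial distortion model in C#. It is not the code of any actual headset SDK (real SDKs use their own calibrated meshes and shaders), and the k1 and k2 values are purely illustrative assumptions:

using UnityEngine;

// A simplified radial distortion model: points are pushed outward as a
// polynomial function of their distance from the lens center (barrel
// distortion), pre-compensating for the pincushion effect of the lens.
public static class LensDistortionSketch
{
    const float k1 = 0.22f; // hypothetical first radial coefficient
    const float k2 = 0.24f; // hypothetical second radial coefficient

    // uv is an eye-image coordinate centered at (0, 0), roughly in [-1, 1].
    public static Vector2 Distort(Vector2 uv)
    {
        float r2 = uv.sqrMagnitude;                 // r squared
        float scale = 1f + k1 * r2 + k2 * r2 * r2;  // 1 + k1*r^2 + k2*r^4
        return uv * scale;
    }
}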

Also, of course, the view angle from each eye is slightly offset, comparable to the distance between your eyes or the Inter Pupillary Distance (IPD). IPD is used to calculate the parallax and can vary from one person to the next. (The Oculus Configuration Utility comes with a utility to measure and configure your IPD. Alternatively, you can ask your eye doctor for an accurate measurement.)
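
The following is a minimal sketch of how the IPD produces that offset, assuming you were to build a stereo rig by hand: two child cameras are placed half the IPD to either side of the head. In practice, Unity's built-in VR support applies the device-reported IPD for you, so treat this only as an illustration (the 0.064 m default is an assumed average):

using UnityEngine;

// Offsets a left and right eye camera by half the IPD on each side of the head.
// Unity's VR camera normally handles this automatically; this just makes the
// parallax setup explicit.
public class SimpleStereoRigSketch : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;
    public float ipdMeters = 0.064f; // assumed average adult IPD

    void LateUpdate()
    {
        leftEye.transform.localPosition = new Vector3(-ipdMeters / 2f, 0f, 0f);
        rightEye.transform.localPosition = new Vector3(ipdMeters / 2f, 0f, 0f);
    }
}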

It might be less obvious, but if you look closer at the VR screen, you will see color separations, as you'd get from a color printer whose print head is not aligned properly. This is intentional. Light passing through a lens is refracted at different angles based on the wavelength of the light. Again, the rendering software does an inverse of the color separation so that it looks correct to us. This is referred to as a chromatic aberration correction. It helps make the image look really crisp.

The resolution of the screen is also important to get a convincing view. If it's too low-res, you'll see the pixels, or what some refer to as a screen-door effect. The pixel width and height of the display is an oft-quoted specification when comparing the HMDs, but the pixels per inch (PPI) value may be more important. Other innovations in display technology such as pixel smearing and foveated rendering (showing higher-resolution details exactly where the eyeball is looking) will also help reduce the screen-door effect.
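
As a rough worked example, PPI is just the diagonal pixel count divided by the diagonal size in inches. The sketch below uses a hypothetical 5.5-inch, 2560 x 1440 panel; the numbers are assumptions, not the spec of any particular headset:

using UnityEngine;

// Computes pixels per inch from a panel's resolution and diagonal size.
public static class DisplayMathSketch
{
    public static float Ppi(int widthPx, int heightPx, float diagonalInches)
    {
        float diagonalPx = Mathf.Sqrt(widthPx * widthPx + heightPx * heightPx);
        return diagonalPx / diagonalInches;
    }

    // Example: Ppi(2560, 1440, 5.5f) is roughly 534 PPI.
}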

When experiencing a 3D scene in VR, you must also consider the frames per second (FPS). If the FPS is too slow, the animation will look choppy. Things that affect FPS include the GPU performance and the complexity of the Unity scene (the number of polygons and lighting calculations), among other factors. This is compounded in VR because you need to draw the scene twice, once for each eye. Technology innovations, such as GPUs optimized for VR, frame interpolation, and other techniques will improve the frame rates. For us, developers, performance-tuning techniques in Unity, such as those used by mobile game developers, can be applied in VR. These techniques and optics help make the 3D scene appear realistic.
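
If you want to keep an eye on frame rate while developing, a simple monitor script like the following sketch can help. It assumes a 90 FPS target (adjust it to your device's recommended rate) and is meant for rough diagnostics, not as a replacement for the Unity Profiler:

using UnityEngine;

// Smooths the frame time and warns when the frame rate drops below a target.
public class FpsMonitorSketch : MonoBehaviour
{
    public float targetFps = 90f;   // assumed target; use your device's spec
    float smoothedDelta = 1f / 90f;

    void Update()
    {
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);
        float fps = 1f / Mathf.Max(smoothedDelta, 0.0001f);
        if (fps < targetFps)
        {
            Debug.LogWarning("FPS dropped to " + fps.ToString("F0") +
                             " (target " + targetFps + ")");
        }
    }
}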

Sound is also very important—more important than many people realize. VR should be experienced while wearing stereo headphones. In fact, when the audio is done well but the graphics are pretty crappy, you can still have a great experience. We see this a lot in TV and cinema. The same holds true in VR. Binaural audio gives each ear its own stereo view of a sound source in such a way that your brain imagines its location in 3D space. No special listening devices are needed. Regular headphones will work (speakers will not). For example, put on your headphones and visit the Virtual Barber Shop at https://www.youtube.com/watch?v=IUDTlvagjJA. True 3D audio provides an even more realistic spatial audio rendering, where sounds bounce off nearby walls and can be occluded by obstacles in the scene to enhance the first-person experience and realism.
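
In Unity, the first step toward spatial audio is simply making an AudioSource fully 3D, as in the minimal sketch below. True binaural rendering then comes from whichever audio spatializer plugin you enable in the project settings (for example, an Oculus or Resonance Audio spatializer, if installed); this component only configures the source:

using UnityEngine;

// Configures an AudioSource as a positional 3D sound.
[RequireComponent(typeof(AudioSource))]
public class PositionalSoundSketch : MonoBehaviour
{
    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;                          // 0 = 2D, 1 = fully 3D
        source.rolloffMode = AudioRolloffMode.Logarithmic; // attenuate with distance
        source.spatialize = true;                          // use the project's spatializer, if any
        source.Play();
    }
}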

Lastly, the VR headset should fit your head and face comfortably so that it's easy to forget that you're wearing it, and it should block out light from the real environment around you.

Head tracking

So, we have a nice 3D picture that is viewable in a comfortable VR headset with a wide field of view. If this was it and you moved your head, it'd feel like you had a diorama box stuck to your face. Move your head and the box moves along with it, much like holding the antique stereograph device or the childhood View-Master. Fortunately, VR is so much better.

The VR headset has a motion sensor (IMU) inside that detects spatial acceleration and rotation rates on all three axes, providing what's called the six degrees of freedom. This is the same technology that is commonly found in mobile phones and some console game controllers. Mounted on your headset, when you move your head, the current viewpoint is calculated and used when the next frame's image is drawn. This is referred to as motion detection.
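
To make the idea concrete, the following sketch reads the tracked head pose each frame and applies it to a transform, using the Unity 2018-era UnityEngine.XR API. Unity's built-in VR support already drives the main camera this way, so you would rarely write this yourself; it is shown only to illustrate the sensor-to-viewpoint loop:

using UnityEngine;
using UnityEngine.XR;

// Applies the headset's tracked rotation and position to this transform every frame.
public class HeadPoseFollowerSketch : MonoBehaviour
{
    void Update()
    {
        Quaternion headRotation = InputTracking.GetLocalRotation(XRNode.Head);
        Vector3 headPosition = InputTracking.GetLocalPosition(XRNode.Head);

        transform.localRotation = headRotation;
        transform.localPosition = headPosition;
    }
}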

The previous generation of mobile motion sensors was good enough for us to play mobile games on a phone, but for VR, it's not accurate enough. These inaccuracies (rounding errors) accumulate over time, as the sensor is sampled thousands of times per second and one may eventually lose track of where they were in the real world. This drift was a major shortfall of the older, phone-based Google Cardboard VR. It could sense your head's motion, but it lost track of your head's orientation. The current generation of phones, such as Google Pixel and Samsung Galaxy, which conform to the Daydream specifications, have upgraded sensors.

High-end HMDs account for drift with a separate positional tracking mechanism. The Oculus Rift does this with outside-in positional tracking, where an array of (invisible) infrared LEDs on the HMD is read by an external optical sensor (an infrared camera) to determine your position. You need to remain within the view of the camera for the head tracking to work.

Alternatively, the Steam VR VIVE Lighthouse technology does outside-in positional tracking, where two or more dumb laser emitters are placed in the room (much like the lasers in a barcode reader at the grocery checkout), and an optical sensor on the headset reads the rays to determine your position.

Windows MR headsets use no external sensors or cameras. Rather, they use integrated cameras and sensors to perform inside-out spatial mapping of the local environment around you, in order to locate and track your position in the real-world 3D space.

Either way, the primary purpose is to accurately find the position of your head and other similarly equipped devices, such as handheld controllers.

Together, the position, tilt, and the forward direction of your head—or the head pose—are used by the graphics software to redraw the 3D scene from this vantage point. Graphics engines such as Unity are really good at this.

Now, let's say that the screen is getting updated at 90 FPS, and you're moving your head. The software determines the head pose, renders the 3D view, and draws it on the HMD screen. However, you're still moving your head. So, by the time it's displayed, the image is a little out of date with respect to your current position. This is called latency, and it can make you feel nauseous.

Motion sickness caused by latency in VR occurs when you're moving your head and your brain expects the world around you to change exactly in sync. Any perceptible delay can make you uncomfortable, to say the least.

Latency can be measured as the time from reading a motion sensor to rendering the corresponding image, or the sensor-to-pixel delay. According to Oculus's John Carmack:

A total latency of 50 milliseconds will feel responsive, but still noticeable laggy. 20 milliseconds or less will provide the minimum level of latency deemed acceptable.
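
As a rough back-of-the-envelope check, at 90 FPS a single frame takes about 11.1 ms, which already consumes more than half of a 20 ms budget. The sketch below adds assumed sensor-read and display scan-out costs on top of one frame of rendering; the stage numbers are illustrative, not measurements of any particular headset:

// Estimates motion-to-photon latency with a simple additive model.
public static class LatencyBudgetSketch
{
    public static float EstimateMs(float fps, float sensorMs, float displayMs)
    {
        float renderMs = 1000f / fps;            // one frame of rendering time
        return sensorMs + renderMs + displayMs;
    }

    // Example: EstimateMs(90f, 2f, 5f) is about 18.1 ms, just under the 20 ms target.
}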

There are a number of very clever strategies that can be used to implement latency compensation. The details are outside the scope of this book and inevitably will change as device manufacturers improve on the technology. One of these strategies is what Oculus calls the timewarp, which tries to guess where your head will be by the time the rendering is done and uses that future head pose instead of the actual detected one. All of this is handled in the SDK, so as a Unity developer, you do not have to deal with it directly.

Meanwhile, as VR developers, we need to be aware of latency as well as the other causes of motion sickness. Latency can be reduced by rendering each frame faster (maintaining the recommended FPS). Motion sickness can also be mitigated by discouraging the user from moving their head too quickly and by using other techniques to help them feel grounded and comfortable.

Another thing that the Rift does to improve head tracking and realism is that it uses a skeletal representation of the neck so that all the rotations that it receives are mapped more accurately to the head rotation. For example, looking down at your lap creates a small forward translation since it knows it's impossible to rotate one's head downwards on the spot.
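
A simple neck model can be sketched as an eye offset above and in front of a neck pivot, so that a pure rotation of the head also yields a small translation of the eyes. The offsets below are rough anthropometric assumptions, not the values the Rift SDK actually uses:

using UnityEngine;

// Derives an eye position from a head rotation using a fixed neck-to-eye offset.
public class SimpleNeckModelSketch : MonoBehaviour
{
    // Eyes sit a little above and in front of the neck pivot (meters).
    public Vector3 eyeOffsetFromNeck = new Vector3(0f, 0.075f, 0.08f);

    // Returns the eye position relative to the neck pivot for a given head rotation.
    public Vector3 EyePosition(Quaternion headRotation)
    {
        return headRotation * eyeOffsetFromNeck;
    }
}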

Other than head tracking, stereography, and 3D audio, virtual reality experiences can be enhanced with body tracking, hand tracking (and gesture recognition), locomotion tracking (for example, VR treadmills), and controllers with haptic feedback. The goal of all of this is to increase your sense of immersion and presence in the virtual world.

Types of VR experiences

There is not just one kind of virtual reality experience. In fact, there are many. Consider the following types of virtual reality experiences:

Diorama: In the simplest case, we build a 3D scene. You're observing from a third-person perspective. Your eye is the camera. Actually, each eye is a separate camera that gives you a stereographic view. You can look around.

First-person experience: This time, you're immersed in the scene as a freely moving avatar. Using an input controller (keyboard, game controller, or some other technique), you can walk around and explore the virtual scene.

Interactive virtual environment: This is like the first-person experience, but it has an additional feature—while you are in the scene, you can interact with the objects in it. Physics is at play. Objects may respond to you. You may be given specific goals to achieve and challenges with the game mechanics. You might even earn points and keep score.

3D content creation: In VR, create content that can be experienced in VR. Google Tilt Brush is one of the first blockbuster experiences, as are Oculus Medium, Google Blocks, and others. Unity is working on EditorXR for Unity developers to work on their projects directly in the VR scene.

Riding on rails: In this kind of experience, you're seated and being transported through the environment (or the environment changes around you). For example, you can ride a rollercoaster via this virtual reality experience. However, it may not necessarily be an extreme thrill ride. It can be a simple real estate walk-through or even a slow, easy, and meditative experience.

360-degree media: Think panoramic images taken with GoPro on steroids that are projected on the inside of a sphere. You're positioned at the center of the sphere and can look all around. Some purists don't consider this real virtual reality, because you're seeing a projection and not a model rendering. However, it can provide an effective sense of presence.

Social VR: When multiple players enter the same VR space and can see and speak with each other's avatars, it becomes a remarkable social experience.

In this book, we will implement a number of projects that demonstrate how to build each of these types of VR experience. For brevity, we'll need to keep it pure and simple, with suggestions for areas for further investigation.

Technical skills that are important to VR

Each chapter of the book introduces new technical skills and concepts that are important if you wish to build your own virtual reality applications. You will learn about the following in this book:

World scale: When building for a VR experience, attention to the 3D space and scale is important. One unit in Unity is usually equal to one meter in the virtual world.

First-person controls: There are various techniques that can be used to control the movement of your avatar (first-person camera), gaze-based selection, tracked hand input controllers, and head movements.

User interface controls: Unlike conventional video (and mobile) games, all user interface components are in world coordinates in VR, not screen coordinates. We'll explore ways to present notices, buttons, selectors, and other user interface (UI) controls to the users so that they can interact and make selections.

Physics and gravity: Critical to the sense of presence and immersion in VR is the physics and gravity of the world. We'll use the Unity physics engine to our advantage.

Animations: Moving objects within the scene is called animation—duh! It can either be along predefined paths or it may use AI (artificial intelligence) scripting that follows a logical algorithm in response to events in the environment.

Multi-user services: Real-time networking and multi-user games are not easy to implement, but online services make it easy without you having to be a computer engineer.

Build, run, and optimize: Different HMDs use different developer kits (SDKs) and assets to build applications that target a specific device. We'll consider techniques that let you use a single interface for multiple devices. Understanding the rendering pipeline and how to optimize performance is a critical skill for VR development.

We will write scripts in the C# language and use features of Unity as and when they are needed to get things done.

However, there are technical areas that we will not cover, such as realistic rendering, shaders, materials, and lighting. We will not go into modeling techniques, terrains, or humanoid animations. We also won't discuss game mechanics, dynamics, and strategies. All of these are very important topics that may be necessary for you to learn (or for someone in your team), in addition to this book, to build complete, successful and immersive VR applications.

 

Summary

In this chapter, we looked at virtual reality and realized that it can mean a lot of things to different people and can have different applications. There's no single definition, and it's a moving target. We are not alone, as everyone's still trying to figure it out. The fact is that virtual reality is a new medium that will take years, if not decades, to reach its potential.

VR is not just for games; it can be a game changer for many different applications. We identified over a dozen. There are different kinds of VR experiences, which we'll explore in the projects in this book.

VR headsets can be divided into those that require a separate processing unit (such as a desktop PC or a console) that runs with a powerful GPU and the ones that use your mobile technologies for processing.

We're all pioneers living at an exciting time. Because you're reading this book, you're one, too. Whatever happens next is literally up to you. The best way to predict the future is to invent it.

So, let's get to it!

In the next chapter, we'll jump right into Unity and create our first 3D scene and learn about world coordinates, scaling, and importing 3D assets. Then, in Chapter 3, VR Build and Run, we'll build and run it on a VR headset, and we'll discuss how virtual reality really works.

Content, Objects, and Scale