Unreal Engine 4 is a powerful tool for developing VR games and applications. With its visual scripting language, Blueprint, and built-in support for all major VR headsets, it's a perfect tool for designers, artists, and engineers to realize their visions in VR.
This book will guide you step-by-step through a series of projects that teach essential concepts and techniques for VR development in UE4. You will begin by learning how to think about (and design for) VR and then proceed to set up a development environment. A series of practical projects follows, taking you through essential VR concepts. Through these exercises, you'll learn how to set up UE4 projects that run effectively in VR, how to build player locomotion schemes, and how to use hand controllers to interact with the world. You'll then move on to create user interfaces in 3D space, use the editor's VR mode to build environments directly in VR, and profile/optimize worlds you've built. Finally, you'll explore more advanced topics, such as displaying stereo media in VR, networking in Unreal, and using plugins to extend the engine.
Throughout, this book focuses on creating a deeper understanding of why the relevant tools and techniques work as they do, so you can use the techniques and concepts learned here as a springboard for further learning and exploration in VR.
Copyright © 2019 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the authors, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
Commissioning Editor: Kunal Chaudhari
Acquisition Editor: Karan Gupta
Content Development Editor: Arun Nadar
Technical Editor: Rutuja Vaze
Copy Editor: Safis Editing
Project Coordinator: Kinjal Bari
Proofreader: Safis Editing
Indexer: Priyanka Dhadke
Graphics: Alishon Mendonsa
Production Coordinator: Arvindkumar Gupta
First published: April 2019
Production reference: 1300419
Published by Packt Publishing Ltd., Livery Place, 35 Livery Street, Birmingham B3 2PB, UK.
ISBN 978-1-78913-287-8
www.packtpub.com
Mapt is an online digital library that gives you full access to over 5,000 books and videos, as well as industry-leading tools to help you plan your personal development and advance your career. For more information, please visit our website.
Spend less time learning and more time coding with practical eBooks and Videos from over 4,000 industry professionals
Improve your learning with Skill Plans built especially for you
Get a free eBook or video every month
Mapt is fully searchable
Copy and paste, print, and bookmark content
Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.packt.com and as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at [email protected] for more details.
At www.packt.com, you can also read a collection of free technical articles, sign up for a range of free newsletters, and receive exclusive discounts and offers on Packt books and eBooks.
Kevin Mack is a co-founder of Manic Machine, a Los Angeles-based development studio specializing in VR and virtual production development using Unreal Engine. Manic Machine designs and builds games in VR and provides development services to clients and partners in the film and visual effects industries. Prior to this, he co-founded WhiteMoon Dreams, which developed traditional and VR games and experiences. Earlier work includes design on the Medal of Honor series for EA, Fear Effect series for Kronos Digital Entertainment, and several titles for Disney Interactive. Kevin holds a BFA in film production from New York University and an MFA in film directing from the American Film Institute.
Robert Ruud is a co-founder of Manic Machine, where he focuses primarily on the design and development of Manic Machine's proprietary tech and gameplay experiences. Prior to this, he spent six years at WhiteMoon Dreams, where he designed and engineered gameplay for the successfully kickstarted game Warmachine: Tactics, which was one of the first games to be released to market using Unreal Engine 4, and where he also led the design exploration for the company's location-based VR experiences. Robert holds a BA in philosophy from California State Polytechnic University, Pomona, where his studies focused on cognitive science and philosophy of the mind.
Deepak Jadhav is a game developer based in Pune, India. Deepak holds a bachelor's degree in computer technology and a master's degree in game programming and project management. Currently, he is working as a game developer at a leading game development company in India. He has been involved in developing games on multiple platforms, such as PC, macOS, and mobile. With years of experience in game development, he has a strong background in C# and C++, and has refined his skills in engines including Unity and Unreal Engine, as well as in augmented and virtual reality development.
If you're interested in becoming an author for Packt, please visit authors.packtpub.com and apply today. We have worked with thousands of developers and tech professionals, just like you, to help them share their insight with the global tech community. You can make a general application, apply for a specific hot topic that we are recruiting an author for, or submit your own idea.
Title Page
Copyright and Credits
Unreal Engine 4 Virtual Reality Projects
About Packt
Why subscribe?
Packt.com
Contributors
About the authors
About the reviewer
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
Thinking in VR
What is virtual reality?
VR hardware
VR isn't just about hardware though
Presence is tough to achieve
What can we do in VR?
Games in VR
Interactive VR
VR cinema – movies, documentary, and journalism
Architecture, Engineering, and Construction (AEC) and real estate
Engineering and design
Education and training
Commerce, advertising, and retail
Medicine and mental health
So much else
Immersion and presence
Immersion
Using all the senses
Make sure sensory inputs match one another and match the user's expectations
Keep latency as low as possible
Make sure interactions with the world make sense
Build a consistent world
Be careful of contradicting the user's body awareness
Decide how immersive you intend your application to be and design accordingly
Presence
Simulator sickness
Safety
Best practices for VR
Maintain framerate
Tethered headsets
Standalone headsets
Never take control of the user's head
Do not put acceleration or deceleration on your camera
Do not override the field of view, manipulate depth of field, or use motion blur
Minimize vection
Avoid stairs
Use dimmer lights and colors than you normally would
Keep the scale of the world accurate
Be conscious of physical actions
Manage eyestrain
Make conscious choices about the content and intensity of your experience
Let players manage their own session duration
Keep load times short
Question everything we just told you
Planning your VR project
Clarify what you're trying to do
Is it a good fit for VR? Why?
What's important – what has to exist in this project for it to work? (MVP)
Break it down
Tackle things in the right order
Test early and often
Design is iterative
Summary
Setting Up Your Development Environment
Prerequisite – VR hardware
Setting up Unreal Engine
What it costs
Creating an Epic Games account
The Epic Games launcher
Installing the engine
Editing your vault cache location
Setting up a Derived Data Cache (DDC)
Setting up a local DDC
Launching the engine
Setting up for mobile VR
Creating or joining an Oculus developer organization
Setting your VR headset to developer mode in Oculus Go
Installing Android Debug Bridge (ADB)
Setting up NVIDIA CodeWorks for Android
Verifying that the HMD can communicate with your PC
Generating a signature file for Samsung Gear
Deploying a test project to the device
Setting up a test project
Checking that your OculusVR plugin is enabled
Setting a default map
Clearing the default mobile touch interface
Setting your Android SDK project settings
Setting your Android SDK locations
Launching the test project
Using the Epic Games launcher
The Unreal Engine Tab
Learn
The content examples project
Gameplay concepts and example games
Marketplace
Library
Setting up for C++ development
Installing Microsoft Visual Studio Community
Recommended settings
The UnrealVS plugin
Installing the UnrealVS plugin
Turning on the UnrealVS toolbar
Unreal debugging support
Test everything out 
Building Unreal from source code
Setting up a GitHub account and installing Git
Setting up or logging into your GitHub account
Installing Git for Windows
Installing Git Large File Storage
Installing a Git GUI
Connecting your GitHub account to your Epic Games account
Downloading the Unreal Engine source code
Choosing your source branch
Forking the repository
Cloning the repository to your local machine
Option 1 – Cloning using GitHub Desktop
Option 2 – Cloning from the command line
Downloading engine binary content
Generating project files
Opening and building the solution
Updating your fork with new changes from Epic
Option – Using the command line to sync changes
Setting the upstream repository
Syncing the fork
Reviewing the Git commands we just used
Option – Using the web GUI to sync changes
Creating a pull request
Merging the pull request
Pulling the origin to your local machine
Re-synchronizing your engine content and regenerating project files
Going further with source code on GitHub
Additional useful tools
A good robust text editor
3D modeling software
Image-editing software
Audio-editing software
Summary
Hello World - Your First VR Project
Creating a new project
Setting your hardware target
Setting your graphics target
Settings summary
Taking a quick look at your project's structure
The Content directory
The Config directory
The Source directory
The Project file
A summary of an Unreal project structure
Setting your project's settings for VR
Instanced Stereo
Round Robin Occlusions
Forward and deferred shading
Choosing the right rendering method for your project
Choosing your anti-aliasing method
Modifying MSAA settings
Starting in VR
Turning off other stray settings you don't need
Turning off default touch interface (Oculus Go/Samsung Gear)
Configuring your project for Android (Oculus Go/Samsung Gear)
Verifying your SDK locations
Making sure Mobile HDR is turned off (Oculus Go/Samsung Gear)
Mobile Multi-View (Oculus Go/Samsung Gear)
Monoscopic Far Field Rendering (Oculus Go/Samsung Gear)
Project Settings cheat-sheet
Decorating our project
Migrating content into a project
Cleaning up migrated content
Deleting assets safely
Moving assets and fixing up redirectors
Setting a default map
Testing our map on desktop
Testing our map on mobile (Oculus Go/Samsung Gear)
Setting up a game mode and player pawn
Creating a VR pawn
Creating a game mode
Assigning the game mode
Overriding a GameMode for a specific map
Placing a pawn directly in the world
Setting up the VR pawn
Adding a camera
Adding motion controllers
Setting our tracking origin
Adjusting our Player Start location to the map
Testing in the headset
Packaging a standalone build
Summary
Getting Around the Virtual World
Teleport locomotion
Creating a navigation mesh
Moving and scaling the Navmesh Bounds volume
Fixing collision problems
Excluding areas from the navmesh
Modifying your navmesh properties
Setting up the pawn Blueprint
Iterative development
Make it work
Make it right
Make it fast
Do things in order
Setting up a line trace from the right motion controller
Improving our Trace Hit Result
Using navmesh data
Changing from line trace to parabolic trace
Drawing the curved path
Drawing the endpoint after all the line segments have been drawn
Teleporting the player
Creating Input Mappings
Caching our teleport destination
Executing the teleport
Allowing the player to choose their landing orientation
Mapping axis inputs
Cleaning up our Tick event
Using thumbstick input to orient the player
Creating a teleport destination indicator
Giving it a material
Adding the teleport indicator to the pawn
Optimizing and refining our teleport
Displaying UI only when teleport input is pressed
Creating a deadzone for our input
Fading out and in on teleport
Teleport locomotion summary
Seamless locomotion
Setting up inputs for seamless locomotion
Changing the pawn's parent class
Fixing the collision component
Handling movement input
Fixing movement speed
Letting the player look around without constantly steering
Implementing snap-turning
Setting up inputs for snap turning
Executing the snap turn
Going further
Snap turn using analog input
Summary
Interacting with the Virtual World - Part I
Starting a new project from existing work
Migrating Blueprints to a new project
Copying input bindings
Setting up the new project to use the migrated game mode
Additional project settings for VR
Testing our migrated game mode and pawn
Adding scenery
Adding a NavMesh
Testing the map
Creating hands
Migrating hand meshes and animations from the VR Template project
Adding hand meshes to our motion controllers
Creating a new Blueprint Actor class
Adding motion controller and mesh components
Adding a Hand variable
Using a Construction Script to handle updates to the Hand variable
Adding BP_VRHand child actor components to your pawn
Fixing issues with Hand meshes
Replacing references to our old motion controller components in blueprints
Creating a function to get our hand mesh
Animating our hands
A quick word about access specifiers
Calling our grab functions from the pawn
Creating new input action mappings
Adding handlers for new action mappings
Implementing grab animations in the Hand blueprints
Creating an Animation Blueprint for the hand
Creating a blend space for our hand animations
Wiring the blend space into the animation blueprint
Connecting the animation blueprint to our hand blueprint
Creating a new enumerator for our grip
Smoothing out our grip animation
Summary
Interacting with the Virtual World - Part II
Creating an object we can pick up
Creating a Blueprint Interface for pickup objects
Implementing the Pickup and Drop functions
Setting up VRHand to pick up objects
Creating a function to find the nearest pickup object
Calling Find Nearest Pickup Object on the Tick event
Picking up an actor
Releasing an actor
Test grabbing and releasing
Fixing cube collision
Letting players know when they can pick something up
Adding haptic feedback
Creating a Haptic Feedback Effect Curve
Playing the haptic effect on command
Going further
Summary
Creating User Interfaces in VR
Getting started
Creating a new Unreal project from an existing project
We’re not alone – adding an AI character
Migrating the third-person character blueprint
Cleaning up the third-person character blueprint
Examining the animation blueprint
Creating a companion character subclass
Adding a follow behavior to our companion character
Examining the AI controller
Improving the companion's follow behavior
Adding a UI indicator to the companion pawn
Creating a UI widget using UMG
Adding a UI widget to an actor
Orienting the indicator widget to face the player
Implementing the Align UI function
Calling Align UI from the Tick event
Adding a new AI state to the companion pawn
Implementing a simple AI state
Indicating AI states using the UI indicator
Using events to update, rather than polling
Being careful of circular references
Ensuring that UI is updated when our state is changed
Adding an interactive UI
Adjusting the button colors 
Adding event handlers to our buttons
Attaching the UI element to the player pawn
Using widget interaction components
Sending input through widget interaction components
Making a better pointer for our interaction component
Creating an interaction beam material
Creating an impact effect
Summary
Building the World and Optimizing for VR
Setting up the project and collecting assets
Migrating blueprints into the new project
Verifying the migrated content
Using the VR editor
Entering and exiting VR Mode
Navigating in VR Mode
Moving through the world
Teleporting through the world
Rotating the world
Scaling the world
Practicing movement
Modifying the world in VR Mode
Moving, rotating, and scaling objects
Using both controllers to rotate and scale objects
Practicing moving objects
Composing a new scene in VR Mode
Navigating the radial menu
Gizmo
Snapping
Windows
Edit
Tools
Modes
Actions and System
Making changes to our scene
Optimizing scenes for VR
Testing your current performance
Stat FPS
Determining your frame time budget
Warnings about performance profiling
Stat unit
Profiling the GPU
Stat scenerendering
Draw calls
Stat RHI
Stat memory
Optimization view modes
CPU profiling
Turning things on and off
Addressing frame rate problems
Cleaning up Blueprint Tick events
Managing skeletal animations
Merging actors
Using mesh LODs
Static mesh instancing
Nativizing Blueprints
Summary
Displaying Media in VR
Setting up the project
Playing movies in Unreal Engine
Understanding containers and codecs
Finding a video file to test with
Adding a video file to an Unreal project
Creating a File Media Source asset
Creating a Media Player
Using Media Textures
Testing your Media Player
Adding video to an object in the world
Using a media playback material
Adding sound to our media playback
Playing media
Going deeper with the playback material
Adding additional controls to our video appearance
Displaying stereo video
Displaying half of the video
Displaying a different half of the video to each eye
Displaying over/under stereo video
Displaying 360 degree spherical media in VR
Finding 360 degree video 
Creating a spherical movie screen
Playing stereoscopic 360 degree video
Controlling your Media Player
Creating a Media Manager
Adding a Pause and Resume function
Assigning events to a media player
Summary
Creating a Multiplayer Experience in VR
Testing multiplayer sessions
Testing multiplayer from the editor
Understanding the client-server model
The server
Listen servers, dedicated servers, and clients
Listen servers
Dedicated servers
Clients
Testing multiplayer VR
Setting up our own test project
Adding an environment
Creating a network Game Mode
Objects on the network
Server-only objects
Server and client objects
Server and owning client objects
Owning client only objects
Creating our network game mode
Creating a network client HUD
Creating a widget for our HUD
Adding a widget to our HUD
Network replication
Creating a replicated actor
Spawning an actor on the server only
Replicating the actor to the client
Replicating a variable
Notifying clients that a value has changed using RepNotify
Creating network-aware pawns for multiplayer
Adding a first-person pawn
Setting collision response presets
Setting up a third-person character mesh
Adjusting the third-person weapon
Replicating player actions
Using remote procedure calls to talk to the server
Using multicast RPCs to communicate to clients
Client RPCs
Reliable RPCs
Going further
Summary
Taking VR Further - Extending Unreal Engine
Creating a project to house our plugin
Installing the VRExpansion plugin
Installing using precompiled binaries
Compiling your own plugin binaries
Verifying the plugins in your project
Understanding plugins
Where plugins live
Installing plugins from the Marketplace
What's inside a plugin?
About licenses
Inside a plugin directory
Finishing our brief tour
Exploring the VRExpansion example project
Finishing our project setup
Using VRExpansion classes
Adding navigation
Adding a game mode
Updating the PlayerStart class
Adding a VR character
Setting up input
Setting up your VR character using example assets
Making effective use of example assets
Migrating the example pawn
Making sense of complicated blueprints
Begin by checking the parent class
Looking at the components to see what they're made of
Look for known events and see what happens when they run
Using inputs as a way to find a starting point in your blueprint
Setting breakpoints and tracing execution
Viewing the execution trace
Managing breakpoints with the Debug window
Using the call stack
Finding variable references
Using more of the VRExpansion plugin
Summary
Where to Go from Here
Final word
Useful Mind Hacks
Rubber-duck debugging
Just the facts
Describing your solutions in positive terms
Plan how you're going to maintain and debug your code when you write it
Favor simple solutions
Look it up before you make it up
Research and Further Reading
Unreal Engine resources
VR resources
Other Books You May Enjoy
Leave a review - let other readers know what you think
Virtual reality (VR) isn't just the media we knew and loved from the twentieth century in a stereo headset. It's much more than that. VR doesn't simply show us images of the world around us in stereo 3D. In a literal sense, sure, that is what it does, but that's a little like saying that music just wiggles the air around our ears. Technically true, but too reductive to let us understand it. VR plays with our senses and dances with the cognitive mechanisms by which we think we understand the world. To get VR and learn how to create for it, we have to accept that it is an entirely new medium, and what we don't know about its language, rules, and methods far outweighs what we do know. This is powerful stuff, and, without question, VR or some variant of this technology is likely to be the defining art form of the twenty-first century.
You'd be right to greet this assertion with a bit of skepticism. Given the present state of the technology and of the industry, it takes some imagination to see beyond the horizon of where we are now. And you've probably seen by now that the public's expectations are in a race with the actual state of the technology and the art form. Sometimes, they lag behind its reality, and sometimes they jump ahead. Opinions about VR, therefore, are all over the place. If we're in one of those phases where the tech makes a leap forward, people get amazed and excited by the possibilities and the breathless blogs declare that the world has changed. If we're in one of those phases where the expectations have jumped ahead, suddenly everyone's disappointed that their first-generation Oculus Rift hasn't morphed overnight into the Holodeck and we see a lot of disillusionment on blogs. It's impossible to predict where the pendulum will be in its swing when you read this.
Here's the reality though, and why we believe this medium is worth learning now: VR is coming, it's inevitable, and it changes everything, even if this isn't yet obvious from the rudimentary state of the first-generation technology. This medium carries with it the potential to revolutionize the way we learn, play, engage the virtual world, and so much else. But it's going to take time and imagination.
VR is a medium at a crossroads. The decisions we make now are going to carry us far into the future. The developers working in this medium will be the ones to shape its language and methods for the next generation. To work in VR is to work on a frontier, and that's an exciting place to be.
In this book, we intend to give you a solid set of tools to begin your work on this frontier. This book uses a practical, hands-on approach to teach you how to build VR games and applications using the Unreal Engine. Each chapter walks you step-by-step through the process of building the essential building blocks of a VR application, and we pair these steps with in-depth explanations of what's really going on when you follow them and why things are done the way they are. It's this why that matters. Understanding how the underlying systems and ideas work is crucial to the work you'll do on your own after you've finished these tutorials, and, in this book, we've tried to give you both—an understanding of what to do to build a VR application, and the background you'll need in order to use this book as a springboard for your own work in VR.
You should come away from this book with a solid understanding of how VR applications are built, and what specifically you need to know and understand about the Unreal Engine to build them. It's our hope that the work we do together here will set you up to take your exploration into this new frontier wherever you want to go.
If you're interested in creating VR games or applications, interested in seeing how VR could augment the work you do in your current field, or are just interested in exploring VR and seeing what it can do, this book is for you. You don't have to be an experienced engineer or even deeply experienced with Unreal Engine to benefit from this book; we explain everything as we go. Readers who are entirely new to Unreal Engine will find it helpful to run through Epic's getting started tutorials before diving in here, just so you know where everything is, but this book is entirely appropriate for both experienced Unreal users who need to learn specifically how Unreal works with VR, and for new Unreal users just finding their way around.
Whether you're entirely new to VR development and to Unreal, you've already been working in VR in another engine, or you know your way around Unreal but are new to VR, this book should be able to provide a lot of value. (And we hope even those already well versed in VR creation using Unreal Engine find a few interesting new perspectives and techniques as well.)
Chapter 1, Thinking in VR, introduces VR as a medium and discusses a few of the many ways it can be used in a number of fields. We discuss the crucial concepts of immersion and presence, and outline practices for designing and building effective VR experiences.
Chapter 2, Setting Up Your Development Environment, takes you through the process of setting up Unreal Engine and setting up to develop for mobile VR, and talks about where to learn about using Unreal and where to get help. For those interested in working in C++, this chapter also shows how to set up your development environment to build C++ projects and to build Unreal Engine from source code.
Chapter 3, Hello World: Your First VR Project, shows you how to create a new VR project from scratch, what settings to use when creating for VR and why we use them, and what you need to do differently if you're building for mobile VR. This chapter also teaches you how to get content into your project and work with it, and how to set up a few of the basic blueprints you'll need for VR development.
Chapter 4, Getting Around the Virtual World, teaches you how to create and refine navigation meshes for character locomotion, how to build a player-controlled character and set up input handling, and then shows how to build a teleport-based locomotion scheme and how to implement seamless movement for a more immersive VR experience.
Chapter 5, Interacting with the Virtual World - Part I, shows you how to add hands to the player-controlled character and use hand-held motion controllers to drive them.
Chapter 6, Interacting with the Virtual World - Part II, shows how to set up an animation blueprint to animate the player's hands in response to input, and how to make it possible for players to pick up and manipulate objects in the world.
Chapter 7, Creating User Interfaces in VR, shows you how to create interactive 3D user interfaces for VR, and introduces an AI-controlled companion character to be controlled by this interface.
Chapter 8, Building the World and Optimizing for VR, teaches you how to use the Unreal Editor's VR Mode to build environments from within VR, and how to find performance bottlenecks in your environment and fix them.
Chapter 9, Displaying Media in VR, teaches you how to display video media on virtual screens in VR space, in both mono and stereo. You'll learn how to put 2D and 3D movies onto traditional virtual screens, how to surround the player with 360-degree mono and stereo video, and how to create a media manager to control its playback.
Chapter 10, Creating a Multiplayer Experience in VR, teaches you about Unreal's client-server network model, and shows you how to replicate actors, variables, and function calls from the server to connected clients, how to set up a player character to display differently to its owner and to other players, and how to set up remote procedure calls to trigger events on the server from clients.
Chapter 11, Taking VR Further - Extending Unreal Engine, shows you how to install and build plugins to extend the engine's capabilities, and how to use Blueprint's powerful debugging tools to dig into unfamiliar code and understand it.
Chapter 12, Where to Go from Here, shows you where to get further information as you dive deeper into VR development.
Appendix A, Useful Mind Hacks, leaves you with a number of useful mind hacks to make your development more effective.
Appendix B, Research and Further Reading, provides a few useful starting places for your search that will gradually help accelerate your learning enormously.
You don't need to be an expert Unreal developer to benefit from this book, but it is helpful to have a sense of where things are. If you haven't yet installed Unreal Engine, don't worry—we'll cover this in Chapter 2, Setting Up Your Development Environment. If you've never used the engine before, it may be helpful at that point to take the time to run through the Unreal getting started tutorials before diving back into this book, just so you know where everything is.
All of the projects in this book have been designed to work with the Oculus Rift and HTC Vive minimum specs, so whether you're on a desktop or a laptop, you should be fine provided your system meets these minimum specs. You should, of course, have a VR headset available, and if you're planning to develop for mobile VR, it's still recommended that you have a desktop VR headset available as well, since it will make testing dramatically easier. All of the software you'll be using through the course of this book is freely available online and we'll walk you through downloading and installing it, so there's nothing special you need to have installed on your system before we begin.
This book is primarily written with PC developers in mind. If you're working on a Mac, your development environment setup will be different, but everything we do in the engine will work the same way.
So that's it. If you have a VR headset, a system that can run it, and internet access (since we'll be downloading the engine and example content), you have everything you need.
You can download the example code files for this book from your account at www.packt.com. If you purchased this book elsewhere, you can visit www.packt.com/support and register to have the files emailed directly to you.
You can download the code files by following these steps:
1. Log in or register at www.packt.com.
2. Select the SUPPORT tab.
3. Click on Code Downloads & Errata.
4. Enter the name of the book in the Search box and follow the onscreen instructions.
Once the file is downloaded, please make sure that you unzip or extract the folder using the latest version of:
WinRAR/7-Zip for Windows
Zipeg/iZip/UnRarX for Mac
7-Zip/PeaZip for Linux
The code bundle for the book is also hosted on GitHub at https://github.com/PacktPublishing/Unreal-Engine-4-Virtual-Reality-Projects. In case there's an update to the code, it will be updated on the existing GitHub repository.
We also have other code bundles from our rich catalog of books and videos available at https://github.com/PacktPublishing/. Check them out!
We also provide a PDF file that has color images of the screenshots/diagrams used in this book. You can download it here: https://www.packtpub.com/sites/default/files/downloads/9781789132878_ColorImages.pdf.
There are a number of text conventions used throughout this book.
CodeInText: Indicates code words in text, database table names, folder names, filenames, file extensions, pathnames, dummy URLs, user input, and Twitter handles. Here is an example: "We should take a quick look at your project's .uproject file as well."
A block of code is set as follows:
html, body, #map { height: 100%; margin: 0; padding: 0}
When we wish to draw your attention to a particular part of a code block, the relevant lines or items are set in bold:
[default]
exten => s,1,Dial(Zap/1|30)
exten => s,2,Voicemail(u100)
exten => s,102,Voicemail(b100)
exten => i,1,Voicemail(s0)
Any command-line input or output is written as follows:
UE4Editor.exe ProjectName ServerIP -game
Bold: Indicates a new term, an important word, or words that you see onscreen. For example, words in menus or dialog boxes appear in the text like this. Here is an example: "Select Window | Developer Tools | Device Profiles to open the Device Profiles window."
Feedback from our readers is always welcome.
General feedback: If you have questions about any aspect of this book, mention the book title in the subject of your message and email us at [email protected].
Errata: Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you have found a mistake in this book, we would be grateful if you would report this to us. Please visit www.packt.com/submit-errata, select your book, click on the Errata Submission Form link, and enter the details.
Piracy: If you come across any illegal copies of our works in any form on the Internet, we would be grateful if you would provide us with the location address or website name. Please contact us at [email protected] with a link to the material.
If you are interested in becoming an author: If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, please visit authors.packtpub.com.
Please leave a review. Once you have read and used this book, why not leave a review on the site that you purchased it from? Potential readers can then see and use your unbiased opinion to make purchase decisions, we at Packt can understand what you think about our products, and our authors can see your feedback on their book. Thank you!
For more information about Packt, please visit packt.com.
Welcome to the virtual world. (It's bigger on the inside.)
In this book, we're going to explore the process of creating VR applications, games, and experiences using Unreal Engine 4. We'll spend some time looking at what VR is and what we can do to design effectively for the medium, and then, from there, we'll move on to demonstrate these concepts in depth using the Unreal Engine to craft VR projects that illustrate and explore these techniques and ideas.
Every chapter will revolve around a hands-on project, beginning with basics such as setting up your development environment and creating your first test applications in VR, and moving on from there into increasingly in-depth explorations of what you can do in VR and how you can use Unreal Engine 4 to do it. In each project, we'll walk you through the process of building a project that demonstrates a specific topic in VR and explain the methods used and, in some cases, demonstrate a few alternatives. It's important to us, as you build these projects, that you come away not just knowing how to do the things we describe, but also why we do them this way, so you can use what you've learned as a launchpad to plan and execute your own work.
In this first chapter, we'll look at what VR is and a few of the many ways it's currently used in a wide range of fields. We'll talk about the two most important concepts in VR: immersion and presence, and how understanding what these are and how they work will help you to make better experiences for your users. We'll lay out a collection of best practices for developing immersive and engaging VR experiences, and talk about some of the unique challenges posed by VR development. Finally, we'll pull this knowledge together and dig into a method for planning and executing a VR project's design.
In brief, this chapter is going to take us through the following topics:
What is virtual reality?
What can we do in VR?
Immersion and presence
Best practices for VR
Planning your VR project
Let's start at the beginning, and talk about virtual reality itself. VR, at its most basic level, is a medium that immerses users in a simulated world, allowing them to see, hear, and interact with an environment and things within it that don't actually exist in the physical world around them. Users are fully surrounded by this experience, an effect that VR developers call immersion. Users who are immersed in a space can look around and often move and interact without ever breaking the illusion that they're actually there. Immersion, as we're going to see shortly, is fundamental to the way VR works.
The most common way of immersing a user, and the one we'll be talking about in this book, is through the use of a Head-Mounted Display (HMD), often just referred to as a headset. (There are other ways of doing VR, such as projecting images on walls, but in this book, we focus on head-mounted VR.) The user's headset displays the virtual world and tracks the movement of their head to rotate and shift the view, creating the illusion that they're actually looking around and moving through physical space. Some headsets, though not all of them, include headphones that add to the illusion by making sounds in the environment seem to come from their sources in the virtual world, through a process called spatialized audio.
Some headsets only track the direction the user is looking, while others can track changes to the user's position as well. If you're using a headset that tracks rotation but not position, and you lean forward to try to look more closely at an object, nothing's going to happen. The object will seem as though it's moving away from you as you try to lean in toward it. If you do this on a headset that tracks position as well, your virtual head will move closer to the object. We use the term Degrees of Freedom (DoF) to describe the ways objects can move in space. (Yes, it's OK to pronounce it doff. All of the developers do.) Take a look at the following points:
3DoF: A device that tracks rotation but doesn't track position is commonly called a 3DoF device because it only tracks the three degrees of freedom that describe rotation: the degree to which the device is leaning to the side (roll), tilting forward (pitch), or turning sideways (yaw). Up until recently, all mobile VR headsets were 3DoF devices, as they used Inertial Measurement Units (IMUs) similar to those found in cellphones to detect rotation, but had no way to know where they were in space. The Oculus Go and Samsung Gear headsets are examples of 3DoF devices.
6DoF: A device that tracks position as well as rotation is a 6DoF device, because it's tracking all six degrees of freedom—roll, pitch, and yaw, but also up and down, side-to-side, and forward or backward movement. Tracking an object's position in space requires a fixed reference point from which you can describe its motion. Most first-generation systems needed additional hardware for this: the Lighthouse base stations for the HTC Vive and the Constellation cameras for the Oculus Rift provide this positional tracking on desktop systems. Windows Mixed Reality headsets and standalone headsets such as the Oculus Quest and Vive Focus use camera arrays on the headset to track the headset's position in the room (we call this inside-out tracking), so they don't require external cameras or base stations. The HTC Vive, Oculus Rift, HTC Vive Focus, Oculus Quest, and Windows Mixed Reality headsets are 6DoF devices.
Headsets can either be tethered to a computer, as is the case with the Oculus Rift and the HTC Vive, which allows the full computing power of the attached PC to drive the visuals, or they can be self-contained devices such as the Samsung Gear, Oculus Go, Oculus Quest, and HTC Vive Focus. At the time of this writing, wireless connections between PCs and VR headsets are beginning to enter the market.
Most headsets also come paired with input devices that allow users to interact with the world, which can act as pointers or as hands. Handheld devices, as with headsets, can be tracked in three or six degrees of freedom. 3DoF devices such as the Oculus Go's controller are essentially pointers—users can aim them but can't reach out and grab something. 6DoF devices act much more like virtual hands and allow users to interact with the world in a much greater variety of ways.
One of the major mistakes many new developers make when first approaching VR is that they try to apply the traditional designs they're used to creating in 2D space to the VR space and, for the most part, this doesn't work. VR is its own medium, and it doesn't follow the same rules as the media that came before it. It's worth it to take a moment to look at what this means.
When most people first consider VR, they see the headset and assume that it's primarily a visual experience—traditional flat-screen media shown in stereo. It's understandable that it would seem this way, but this perception misses the point. Yes, the VR headset is (depending on whether or not it includes integrated audio) either primarily or entirely a display device, but the experience it creates for the user is very different from the experience created by a traditional flat screen.
Let's imagine for a minute that you're looking at a photo or a 2D video shot looking down over the edge of a tall building. You see the streets far below, but they don't really feel as though they're far below you. They're just small in the image. Take the same image, but now present it in stereo through a VR headset, and you'll probably experience vertigo. Why is this?
First, as we mentioned a moment ago, you're immersed in the experience. There's nothing else in the surrounding world to remind you that it isn't real. Let's jump back to our previous example—the building edge on your television—turn around and look behind you. Oh. You're just in your living room. Even when you look directly at it, the largest television you could possibly buy still leaves you with lots of peripheral vision to remind you that what you're seeing there isn't real. Everything on a flat screen, even a 3D screen, takes place on the other side of a window. You're watching, but you're not really there. In VR, the window is gone. When you look to the right, the world is still there. Look behind you, and you're still in it. Your perception is completely overtaken by an experience that has become an environment, not just a frame you're looking at.
Second, the stereo image creates a sense of real depth. You can see how far down the drop really goes. The cars in the street below aren't just small, they're far away. In a 6DoF headset that allows motion tracking, your movements in the real world are mirrored in the virtual world. You can lean over the edge or step back. This mixture of immersion, real depth perception, and natural response to your movement comes together to convince your body that what you're perceiving is real. We call this phenomenon presence, and it's a sensation that's mostly experienced physically.
The mechanics of immersion and the resulting experience of presence are unique to VR. No other medium does this.
While we're on the topic of presence, it's worth pointing out that it's a fragile phenomenon, and the current state of VR technology still faces a few challenges to creating a sense of presence fully and reliably. Some of these are rooted in hardware and are almost certainly going to go away as the technology advances. Users can feel the headset on their face, for example, and on wired headsets, they can feel the cable running from the headset. The current generation of headsets offers a field of view that's too narrow to provide peripheral vision. (The desktop devices offer a 110° field of view, but your eyes, meanwhile, can perceive a field twice as wide.) Display resolutions aren't yet high enough to keep users from being able to see individual pixels (VR users call this the screen door effect), and finicky optics can blur the user's vision if they're not perfectly aligned. This means, in practice, that it's hard to read small text on a VR headset, and that users are sometimes reminded of the hardware when they have to adjust it to get back into the sweet spot for the lenses.
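To put rough numbers on the resolution problem: the first-generation desktop headsets (the original Rift and Vive) use displays of roughly 1080 x 1200 pixels per eye spread across an approximately 110-degree field of view, which works out to about 10 pixels per degree. Normal 20/20 vision resolves detail at roughly 60 pixels per degree, so these displays are still an order of magnitude away from matching the eye, and individual pixels and the gaps between them remain visible.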
Looking at the state of things, though, it's obvious that these hardware challenges won't last forever. Self-contained and wireless headsets are quickly entering the market, with increasingly reliable tracking that no longer relies on external equipment. Displays are getting wider, resolutions are getting higher, and optical waveguides show great promise for lighter displays with wider in-focus regions. VR works extremely well already, and it's easy to see how it's going to continue to improve.
There are a few other things that can break presence that we can't do as much about—hitting a desk accidentally with a controller, for example, or running into furniture, losing tracking, or hearing sounds from outside the experience. We can manage these when we have control over the user's space, but where we don't, there's not much we can do.
Even given these limitations, though, think about how profoundly the current generation of VR can create a sense of presence in a user, and realize that it only gets better from here. Users believe what they experience in VR to a degree that simply doesn't happen with other media. They explore and learn in ways that aren't possible in any other way. They empathize and connect with people and places more deeply than they could in any way, other than physically being there. Nothing else goes as deep. And we're only getting started.
So, what can we do with VR? Let's explore this, but before we begin, it's worth it to point out that this medium is still in its infancy. At the time of this writing, we're on the first generation of consumer VR hardware and the vast majority of the population hasn't even seen a VR headset yet, much less experienced it. Try this: the next time you're in a restaurant or a public space, ask yourself how many of the people around you have likely ever seen a VR headset—a handful at best. Now, how many of them have watched a movie (a century-old medium), watched television (three-quarters of a century), or played a video game (just shy of half a century)? VR is that new. We haven't come close to discovering everything we can do with it.
With that in mind, then, use these examples as a map of the current state of things and as fodder for ideas of your own, but realize that there's much, much more that we haven't even thought of yet. Why shouldn't you be the one to discover something new?
As we discussed a moment ago, VR at its core creates an experience of presence. If you're developing a game for VR, this means that designs that focus on giving the player an experience of being in a place are good candidates for the medium. Skyrim VR and Fallout 4 VR do a fantastic job of making players feel as though they're really in these expansive worlds. Myst-like games that put the player into a space they can explore and manipulate work well too.
The addition of motion controllers to simulate hands, such as those supplied with the HTC Vive, Oculus Rift, and Oculus Quest, enable developers to create simulations with complex interactions, such as Job Simulator and Vinyl Reality, which wouldn't be possible using traditional game controllers. Tender Claws' Virtual Virtual Reality provides a great example, meanwhile, of achieving 6DoF-like control with the Oculus Go's 3DoF controller.
The immersive aspect of VR means that games that surround you with the experience, such as Space Pirate Trainer, work well because the player can interact with actors all around them and not just what's in front. This need to watch all around you can be a focus of your design.
The sensation of motion that VR evokes in players turns fast-moving games such as Thumper and Ludicrous Speed into physically engaging experiences, and games such as Beat Saber capitalize on the player's physical movements to turn the game into a fitness tool as well.
Games in VR present a few challenges too, though. This same experience of presence and physical movement that makes the experience so engaging can mean that not every game design is a great candidate for VR. Simply porting a 2D game into VR isn't likely to work. A Heads-Up Display (commonly abbreviated as HUD) placed over the scene in 2D space won't work in VR, as there's no 2D plane to put it on. Fast movements that could be perfectly fine in 2D may make players motion-sick in VR. The decision to make a game for VR needs to be a conscious choice, and you'll need to design with the medium's strengths and challenges in mind.
Interactive VR experiences aren't just limited to games. 3D painting applications such as Tilt Brush allow users to sculpt and paint in room-scale 3D and share their creations with other users. Google Earth VR allows users to explore the earth, much of it in 3D. Interactive storytelling experiences such as Colosse, Allumette, Coco VR, and others immerse users in a story and allow them to interact with the world and characters. Interactive VR applications and experiences can be built for productivity or entertainment and can take almost any form imaginable.
It's worthwhile to keep a few considerations in mind when thinking about creating an interactive VR application. The mouse and keyboard aren't generally available to users in VR—they can't see these devices to use them, so interactions are usually best designed around the controllers provided with the VR system. Text can be difficult to read in VR—display resolutions are improving, but they're still low enough that small text may not be readable. The lack of a 2D HUD means that traditional menus don't work easily—usually, these need to be built into the world or attached to the player's virtual hands (see Tilt Brush for an excellent example of this).
Interactive VR offers incredible possibilities for entirely new ways of exploring and interacting, and it's likely that we haven't even begun to see the full range of possibilities yet.
The same experience of presence that makes VR so well-suited for certain types of games makes it a powerful medium for documentary and journalism applications. VR is able to immerse users in a circumstance or environment and can evoke empathy by allowing viewers to share an experience deeply. Chris Milk, a pioneering VR filmmaker, has referred to VR as the "ultimate empathy machine," and we think that's a fair description. Alejandro Iñárritu's CARNE y ARENA was awarded a special Oscar by the Academy of Motion Picture Arts and Sciences in 2017 to recognize its powerful use of the medium to tell a story with deep empathy. VR's capacity to create presence through immersion makes things possible that simply can't be done on a flat screen.
Film and video in VR can be presented in several ways, which generally boil down to the shape of the virtual screen on which the images are presented and whether those images are presented in monoscopic 2D or stereo 3D. Flat or curved surfaces are generally used to present media carried over from traditional film or television, while domes, panoramas, or spheres can be used to surround the viewer with a more immersive 2D or 3D experience.
Mono
