The drastic surge in the demand for XR development has led to an imminent need for comprehensive resources, learning material, and overall know-how in this area. This one-stop resource will ensure that professionals venturing into XR development can access all XR-related techniques to build appealing XR applications, without relying on Google every step of the way.
This book is your guide to developing XR applications with Unity 2021.3 or later versions, helping you to create VR, AR, and MR experiences of increasing complexity. The chapters cover the entire XR application development process from setting up an interactive XR scene using the XR Interaction Toolkit or AR Foundation, adding physics, animations, continuous movement, teleportation, sound effects, and visual effects, to testing and deploying to VR headsets, simulators, smartphones, and tablets. Additionally, this XR book takes you on a journey from the basics of Unity and C# to advanced techniques such as building multiplayer applications and incorporating hand- and gaze-tracking capabilities.
By the end of this book, you'll be fully equipped to create cutting-edge XR projects for engaging individual, academic, and industrial use cases that captivate your audience.
You can read this e-book in Legimi apps or in any app that supports the following format:
Page count: 465
Publication year: 2023
A beginner’s guide to creating virtual, augmented, and mixed reality experiences using Unity
Anna Braun
Raffael Rizzo
BIRMINGHAM—MUMBAI
Copyright © 2023 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the authors, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
Group Product Manager: Rohit Rajkumar
Publishing Product Manager: Kaustubh Manglurkar
Book Project Manager: Sonam Pandey
Senior Editors: Divya Anne Selvaraj and Mudita Sonar
Technical Editor: Simran Ali
Copy Editor: Safis Editing
Proofreader: Safis Editing
Indexer: Hemangini Bari
Production Designer: Alishon Mendonca
DevRel Marketing Coordinator: Nivedita Pandey
First published: November 2023
Production reference: 1021123
Published by Packt Publishing Ltd.
Grosvenor House
11 St Paul’s Square
Birmingham
B3 1RB, UK
ISBN 978-1-80512-812-0
www.packtpub.com
Life is a precious gift, my dear,
Embraced with those we hold near.
With endless love, beyond what I can pay back,
Thank you, mom and dad, for the support I never lack.
My brothers, my heart’s compass and guide,
Near or far, with me, they always reside.
With you, dear husband, each moment we share,
A bond of love, laughter, and support beyond compare.
To my loving aunts, uncles, and cousins so dear,
Philippe, Catherine, Noora, and Jussi, your warmth is ever near.
Childhood friends, through decades we steer,
Invaluable moments, held forever dear.
Never forget, the gift of life is so rare,
Spend it with love, and show the world you care.
To neighbors, animals, and nature, be kind,
Let’s foster love and care in our human mind.
Let’s ensure life on Earth does shine,
For all beings, through the threads of time.
Our vision, vast as the cosmic sea,
To spread love throughout eternity.
– Anna Braun
This book stands as a significant milestone in my journey, but it pales in comparison to the achievements represented by the wonderful souls who surround me. To my parents, who laid the foundation of my life with their unwavering love and support; to my cherished brother and sister, who have walked this journey alongside me; to my friends, each one a beacon of light and strength; to my dear grandmother, whose nurturing spirit continues to guide me like a second mother; and to my future wife, soon to be the mother of our children, who ceaselessly inspires me to evolve into the best version of myself – this work is a testament to the love, guidance, and resilience you’ve instilled in me. My heart holds nothing but boundless love for each and every one of you.
– Raffael Rizzo
Anna Braun is an experienced XR developer who has worked in the XR departments of esteemed institutions such as Fraunhofer IGD and Deutsche Telekom AG. She holds a profound interest in crafting Augmented Reality (AR) applications tailored for both smartphones and AR glasses. Beyond her master’s degree in extended reality, Anna is a distinguished author in the technological domain and a frequent speaker at academic conferences and events sponsored by non-profit organizations such as the Mozilla Foundation. Moreover, she is a co-founder of a firm specializing in XR consultancy and development.
To those who’ve shaped my journey – my parents, who offered endless support; my two brothers, who served as my compass; and my dear husband, with whom every shared moment is a testament to unparalleled love, laughter, and support. To family and friends whose warmth remained constant through time’s tide. May this work reflect our shared belief in the boundless capacity of love and the interconnectedness of all life.
Raffael Rizzo initiated his XR career by developing a VR training program for a soccer academy, aimed at assessing young athletes’ reaction times. He has worked at esteemed institutions such as Deutsche Telekom AG and Fraunhofer IGD, amassing a wealth of experience in XR with a focus on Unity. His master’s degree encompasses extended reality, computer vision, machine learning, and 3D visualization. Raffael has co-founded a company dedicated to XR consultations and development.
To the incredible tapestry of people who’ve shaped my life – my parents, siblings, friends, cherished grandmother, and future wife – this achievement pales beside the gift of having you all. My gratitude and love know no bounds.
Darren Delorme is a senior Unity XR developer who has established himself as a seasoned professional in the world of augmented and virtual reality. In 2011, Darren created a map and app with the use of Vuforia, publishing and distributing 70,000 AR-enabled Tourism Maps in Tofino, British Columbia, Canada. In 2018, Darren joined Oculus Start, and in 2019, he worked as a Unity Live Help expert. Working at ArborXR, Darren played a key role in developing a multi-platform input system for Quest, Pico, and Vive. He also designed the user interface for ArborXR Home and created the 3D home environment for all their supported platforms. Darren recently joined the Royal Astronomical Society of Canada (RASC) and started a VR space project, slated for release in 2024, merging his passion for astronomy with his love for virtual reality.
Denis Pineda, known as @dampheldev on social media, is a visionary video game developer specializing in creating XR experiences. A self-taught 3D game artist since the age of 12, and passionate about learning new technologies, Denis proudly holds a bachelor’s degree in game design, with a dedicated focus on developing Unity 3D and C# projects from scratch. His repertoire encompasses the creation of awe-inspiring 3D art, from modeling and texturing to animation and integration with real-time PC, mobile, and VR platforms. Denis has contributed his expertise to advergaming XR projects in Guatemala and Honduras. He has also worked as an indie game developer for his Rambutan Dog Games brand and as a sought-after freelancer.
Welcome to the first part of this book, where we’ll provide you with essential information about Extended Reality (XR) technologies and the Unity Engine. This part serves as a foundation for your journey in XR application development in the subsequent parts. Here, we will guide you through the process of installing Unity Hub and the Unity Editor, and we’ll introduce you to important concepts such as scene creation, asset downloading from Unity Asset Store, the various types of light sources available in Unity, and explanations of supported rendering types. No prior knowledge of Unity or XR technologies is required to follow along with this part. Instead, this part will equip you with all the necessary knowledge to start your hands-on XR development adventure.
This part comprises the following chapters:
Chapter 1, Introduction to XR and Unity
Chapter 2, The Unity Editor and Scene Creation
In this chapter, we will explore the extraordinary potential of XR development and how Unity, the leading game engine in the field, can help us unlock it. Through this journey, you will gain a deep understanding of the various forms of Extended Reality (XR) and their capabilities, paving the way for you to create immersive and unforgettable experiences. You will discover how Unity plays a crucial role in blending the physical and digital worlds. Let's dive into this fascinating world and discover the limitless possibilities that await us.
In this chapter, our discussion will encompass the following topics:
Understanding XR and its different forms (AR, MR, and VR)
How did Unity evolve as a platform for XR development?
XR is an all-encompassing term that includes a range of technologies designed to enhance our senses. Whether it’s providing additional information about the physical world or creating entirely new, simulated worlds for us to explore, XR comprises Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) technologies. You may be someone who has already put on multiple XR headsets, lost yourself in immersive VR games such as Beat Saber, or even own a VR headset. If so, you already have a pretty good idea of what XR is. You may know that while VR implies a complete immersion experience that shuts out the physical world, AR and MR combine the real and virtual worlds. However, can you pinpoint the differences between AR and MR?
Let’s explore this through an example – imagine you’re wearing a Meta Quest 3 headset. This headset comes equipped with cameras that allow passthrough capabilities – a feature enabling you to view your real-world surroundings while also being immersed in the virtual environment. When these cameras are active, you’re essentially in an MR experience. For instance, as you stand in your living room, you might spot an ancient vase on your coffee table. In actuality, your coffee table is real, but the vase is a virtual object seamlessly integrated through MR. The true magic of MR lies in its ability to intertwine the digital and physical worlds so intimately that it becomes challenging to differentiate between them. When the passthrough is disabled, however, the headset returns to a purely VR experience, detaching you from the physical world’s view.
In contrast, AR enhances our experience of the real world by projecting digital information onto it. When wearing AR glasses, such as HoloLens or Magic Leap, there’s a clear delineation between the real and virtual elements in your view. For instance, imagine wearing AR glasses and seeing interactive labels hovering next to objects in your room, such as a floating note reminding you to water a plant or arrows directing you to a misplaced item.
Additionally, there are mobile AR devices, where the augmentation is experienced through a smartphone or tablet screen. These devices provide a windowed experience of the augmented world – for example, games such as Pokémon Go, where you use your smartphone’s camera to spot virtual creatures in the real world, or an app that lets you visualize how a piece of furniture looks in your room before purchasing.
AR strives to enhance our current reality, but MR pushes the boundaries of imagination to envision a world where real and virtual objects merge seamlessly, while VR creates a completely alternative reality altogether. Many find it challenging to differentiate between MR headsets and AR glasses, but the display serves as a good indicator. AR glasses utilize transparent displays instead of cameras, granting them direct access to real-world lighting for seamless integration of digital overlays. In contrast, MR headsets let users view the environment through cameras and additional sensors, but they do not offer a direct optical view of the world and its natural light. It’s like a spectrum of reality – at one end, we have the familiar and the real, subtly enhanced with digital elements; at the other end, we have entirely invented worlds, with limitless potential for exploration and adventure. And somewhere in between, we have the magical space where the real and virtual merge, seamlessly blending the best of both worlds. Can you guess which idea dates back the earliest – VR, AR, or MR? And how did XR enter the picture?
Although VR certainly is the most advanced of these technologies to date, it was the author of The Wonderful Wizard of Oz, L. Frank Baum, who predicted AR in his 1901 novel The Master Key: An Electrical Fairy Tale and the Optimism of Its Devotees. Nevertheless, it was not until half a century had passed that the first aspirations of meandering through virtual realms and augmenting our reality truly took flight.
In 1962, Morton Heilig introduced the Sensorama – a device that plays with more senses than modern VR headsets. In one of the Sensorama’s experiences, you were teleported into a mythical jungle, and could hear the calls of tropical birds, smell the moist soil, and feel the mild breeze on your face as you wandered on a thin trail. In another experience, you rode a quad through a desert, inhaling the smell of hot rocks through your nose, feeling the wind hitting your cheeks from different angles as you made a turn, and listening to the symphony of the wind flowing down the dunes and the quad’s rusty motor. The Sensorama consisted of a stereoscopic color display, fans to simulate wind, odor emitters creating different smells with chemicals, a movable chair, and a stereo-sound system (https://www.youtube.com/watch?v=vSINEBZNCks).
The emergence of XR technology was a rocky and unpredictable journey. Following the Sensorama’s sensory immersion, which painted a splendid vision of blended realities within reach, the 1990s witnessed a VR terrain strewn with the debris of ill-fated contraptions, such as Sega VR and Nintendo Virtual Boy. They failed to ignite the consumer’s imagination. As a result, the momentum of Head-Mounted Displays (HMDs) began to wane.
This is why researchers at the University of Illinois at Chicago invented a completely different XR technology called the Cave Automatic Virtual Environment (CAVE). This technology enables viewers standing inside a cube and wearing 3D glasses to see 3D graphics projected onto the walls, ceiling, and floor as if they’re floating in space. Users wander through the scenes, observing the projected objects from any angle. CAVEs can usually host up to 10 people, all of whom might be standing in front of a virtual replica of Michelangelo’s David statue, viewing the masterpiece from any angle, plunging into the rollercoaster of evoked emotions, and capturing every slight edge with their eyes. As CAVEs enabled high-resolution XR experiences at a time when heavy HMDs with low-quality resolution were the only alternative, they seemed to be the most promising way to experience XR for a few years. So, which factors led to the widespread adoption of HMDs to experience virtual worlds, while the seemingly superior CAVEs failed to gain significant traction?
Installing a CAVE costs a pile of money, often hundreds of thousands of dollars, paired with high monthly maintenance and usage costs. Setting up a CAVE can be tiring. You need to properly adjust the projectors, add appropriate lighting and air conditioning, and coordinate a network of computers to ensure the projectors seamlessly synchronize with one another. Most importantly, CAVEs fall short when it comes to multiple individual experiences or immersive storylines that demand interactive engagement between the users and the application: https://sky-real.com/news/the-vr-cave-halfway-between-reality-and-virtuality/ and https://web.archive.org/web/20070109083006/http://inkido.indiana.edu/a100/handouts/cave_out.html.
After a bleak winter in XR development, marked by the dashed excitement over CAVEs and a slew of failed HMDs, a glimmer of hope emerged in the early 2000s. In 2005, the world witnessed the birth of the first-ever AR app called AR Tennis. This game could be played by two people, armed only with a Nokia phone and a sense of adventure. Using the phone’s camera, a marker was tracked, and voilà – a tennis court sprang to life on the screen. Both participants could interact with a virtual ball, court, and opponent, all simulated within the device. The mobile phone itself functions as the racquet, allowing for an immersive gameplay experience (https://www.imgawards.com/games/ar-tennis/).
In 2012, Palmer Luckey, often regarded as the father of modern VR for his pioneering work, marked another milestone in XR development with the introduction of a crowdfunded VR headset known as the Oculus Rift. Luckey founded Oculus VR, the project’s parent company, which was eventually acquired by Facebook in 2014 (https://history-computer.com/oculus-history/). The following year, Microsoft unveiled its groundbreaking HoloLens AR headset, which seamlessly blended high-definition holograms with the real world (https://news.microsoft.com/en-au/2016/10/12/microsoft-announces-global-expansion-for-hololens/). In 2016, Niantic released Pokémon Go, a game that would go on to capture the hearts of millions and usher in a new era of AR games. Like wildfire, the game’s popularity spread across the globe, igniting a new wave of excitement and enthusiasm for AR technology.
In 2017, IKEA introduced its famous AR app, allowing customers to transcend the limits of imagination and see how IKEA’s furniture would look in their homes, without ever leaving the comfort of their own couch.
Today, we have arrived at a pivotal moment in XR history, where the technology has never been more thrilling and the possibilities never greater. From its humble beginnings as a niche industry, XR is now poised to explode into every aspect of our lives, like a supernova of innovation and progress. In 2021, Mark Zuckerberg unveiled his bold vision for the future – a world where immersive, practical XR devices are not just a novelty but an essential tool to enhance our lives and make our everyday joys even more extraordinary. The year 2022 saw the launch of the Meta Quest Pro, marking Meta’s first foray into MR headsets, after capturing a global market share of 81% with its VR headsets, such as the Meta Quest 2 (https://www.counterpointresearch.com/global-xr-ar-vr-headsets-market-share/).
However, even as XR technology becomes more advanced and sophisticated, there are still some who believe in making it accessible to all. The founders of the Open Source Community for Augmented Reality (OpenAR), a group dedicated to democratizing AR, have created DIY Open AR Glasses, a budget-friendly project that empowers anyone to build their own AR glasses with just €20 (https://sites.uef.fi/openar/ and https://openar.fi/). It is beautiful to witness how technology can be used to bring people together and make the world a better place.
Unity Technologies, now a billion-dollar titan in the realms of gaming and XR, emerged from humble origins in a cozy Copenhagen flat in 2004. The company’s journey to dominating the market with its Unity game engine has been equally fraught with uncertainty and turbulence, like that of XR technologies.
In 2002, a Danish graphics student sought aid on a digital forum, where a Berliner high schooler with backend experience joined him. Serendipity united their game studio dreams. With only a father’s financial cushion and café wages, they created their first game in 2005 – an alien, trapped in a life-support sphere, navigating Earth. Players tilted the world, and the alien rolled, gathering gems and evading CIA capture, utilizing wall-sticking and jumping tricks. The game’s difficulty overshadowed its cutting-edge lighting, and it found little success. Recognizing their talent for building tools and prototypes, the duo refocused their energy on crafting a game engine for the Mac community, dubbing it Unity – a symbol of collaboration and compatibility (https://techcrunch.com/2019/10/17/how-unity-built-the-worlds-most-popular-game-engine/).
Today, Unity’s fame in the realm of mobile game creation is legendary, with good cause. As the architect behind nearly half the globe’s games, it stands as the model for countless developers to aspire to. Yet, Unity’s reach extends far beyond just gaming. It’s increasingly used for 3D design and simulations across other industries such as film, automotive, and architecture. In fact, Unity is used to create a stunning 60% of all AR and VR experiences worldwide (https://www.dailydot.com/debug/unity-deempind-ai/).
In the world of high-end filmmaking, VR and the Unity Game Engine are the hottest ingredients to create a cinematic masterpiece. Disney’s Oscar winner The Jungle Book, Spielberg’s Ready Player One, and Oscar winner Blade Runner 2049 all owe their stunning visuals to these breakthrough technologies (https://unity.com/madewith/virtual-cinematography). The Lion King, a technical wonder in the realm of photo-realistic animation, is no exception. From every bone and breath to each muscle and whisker, every aspect of the African lions was carefully observed in nature and simulated in Unity to mimic real-life lions as closely as possible, while retaining the distinguished features of the original animated movie (https://ai.umich.edu/blog-posts/how-disneys-the-lion-king-became-a-pioneer-in-the-use-of-virtual-reality/).
Beyond the realm of filmmaking, XR development in Unity has the potential to transform the very way we communicate and interact with each other. Through the use of immersive virtual environments and cutting-edge Artificial Intelligence (AI) and emotion systems, XR has the power to create non-linear narratives, personalized experiences, and dynamic interactions that blur the lines between reality and fantasy. From education and training to gaming and social media, the possibilities for XR are endless, and Unity is at the forefront of this exciting new frontier.
XR development in Unity enabled experiences unlike anything the world had ever seen before, such as the award-winning VR game Bonfire, where you’re stranded on a desolate planet with only a flickering bonfire. Your survival depends on building a rapport with an extraterrestrial who speaks no known language and relies solely on your non-verbal cues and gestures. The result is magical, non-verbal communication, accompanied by a completely organic, non-linear storyline, driven by sophisticated AI and emotion systems that adapt to your every move in real time.
Unity’s XR development not only transforms gaming but also revolutionizes healthcare, as exemplified by VirtaMed’s innovative applications that reshape the way surgeons acquire surgical skills. The MR simulator mimics every action of a real surgical operation, from the subtlest movements to complications such as unexpected bleeding and bile leakage. Every trainee’s move is recorded with millimeter accuracy, providing valuable feedback. Simulated training accelerates the learning curve for aspiring surgeons, allowing them to enter the operating room with greater confidence and skill.
This chapter introduced you to the different facets of XR, namely AR, MR, and VR. You learned how AR strives to enhance our current reality, how MR pushes the boundaries of imagination to envision a world where real and virtual objects merge seamlessly, and how VR creates a completely alternative reality altogether. You now understand how the early dreams of delving into completely virtual or augmented realities transformed into bold actions and how XR development was and still is far from a straightforward path. You learned about Unity, from its humble beginnings to its worldwide dominance in game development, simulations, and XR development, looking at some state-of-the-art use cases. The following chapter will familiarize you with the Unity Hub and Editor. You will learn how to import assets, create materials, and acquire all the other skills you need before you can start developing your first XR experience.
In this chapter, we’ll lay the groundwork for your Unity journey. You’ll familiarize yourself with the Unity Editor, create a basic scene, and explore essential lighting aspects. We’ll cover installing Unity, navigating the Editor, working with GameObjects, importing assets, and experimenting with various lighting settings. By the end, you’ll have a solid foundation to delve deeper into Unity and create increasingly complex and captivating scenes.
We’ll cover the following topics as we proceed:
Setting up the Unity development environment
Getting to know the Unity Editor and its interface
Understanding GameObjects and components
Creating a basic scene in Unity and adding objects
Before diving into the Unity Editor, it is important to ensure that your system meets the minimum requirements to run Unity. To successfully complete the exercises in this chapter, you will require a personal computer that has Unity 2021.3 LTS or a more recent version installed. To ensure your hardware meets the requirements, you can cross-check it on the Unity website (https://docs.unity3d.com/Manual/system-requirements.html).
First things first, let’s get Unity up and running on your development machine. Throughout this book, we’ll be harnessing the power of the Unity 3D game engine to create inspiring projects. Unity is an incredibly potent, cross-platform 3D development environment, complete with an intuitive and visually appealing editor.
If you have not yet installed Unity on your computer, we will guide you through the process. Following the installation, we’ll proceed to create our initial scene. Let’s begin the setup and exploration of Unity.
Over the course of this book, the Unity Hub will become your trusty command center for managing different Unity projects, Unity Editor versions, and modules. To initiate the installation process of the Unity Hub, follow these steps:
Head over to the official Unity website (https://unity3d.com/get-unity/download) and navigate to the latest version of the Unity Hub.
Follow the onscreen instructions to install Unity Hub.
With the Unity Hub installed, open it up and sign in using your Unity account. If you’re new to Unity, create an account to join the ranks of fellow creators.
Without having the Unity Editor installed, the Unity Hub is just as powerful as a CD player without a CD. The next section covers how you can install the Unity Editor within the Unity Hub.
The Unity Editor is where the magic happens—a workspace for designing, building, and testing your game projects. To install it, follow these steps:
Within the Unity Hub, navigate to the Installs tab and hit the Add button to add a new Unity Editor version.
Opt for the latest LTS version of the Unity Editor and click Next to kick off the installation process.
During installation, don’t forget to include the necessary platforms and modules tailored to your specific needs. Add the Windows/Mac/Linux Build Support, depending on the operating system of your PC. Likewise, select Android or iOS Build Support, depending on the nature of your smartphone, so that you can follow along with the AR tutorials in this book. Lastly, if you are using a VR headset that runs on Android, such as the Quest 2 or Quest Pro, be sure to add the Android Build Support module along with its sub-modules: OpenJDK and Android SDK & NDK Tools.
Now that we have the Unity Editor installed, it is time to create a project.
After installing the Unity Hub and the Unity Editor, it’s time to create a new Unity project. For the sake of simplicity, we will first use a sample scene. A sample scene in Unity is a pre-built scene created by Unity to show developers how various functions and techniques can be implemented. Unity offers a variety of sample scenes, ranging from simple 2D games to complex 3D environments, that can be used as a starting point for your own projects. The sample scenes can be downloaded via the Asset Store or directly from the Unity Hub. Here’s how to do that directly from the Unity Hub.
Figure 2.1 shows a project’s creation within the Unity Hub.
Figure 2.1 – How to create a project with a sample scene in the Unity Hub
To load a sample scene as a new project, just go through the following steps:
Open the Unity Hub and go to the Projects tab.
Click on the New button to create a new project.
Choose 3D Sample Scene (URP), give your project a name, and select a location to save it. For this project, we’ve opted for 3D Sample Scene (URP) as it provides a preconfigured environment showcasing the capabilities of the Universal Render Pipeline (URP), ideal for those new to Unity or seeking a reference. While the standard 3D (URP) template is typically favored by developers for its clean slate, allowing for a customized setup, they often enhance these projects by importing additional packages or assets via the Package Manager or Unity Asset Store.
Click on the Create button to create the project.
Now that we’ve got that out of the way, let’s look at how to choose the right render pipeline.
Upon examining the sample scene options, you might have noticed the choice between the URP and the High Definition Render Pipeline (HDRP). But which one is better, and what exactly is a render pipeline? In essence, a render pipeline is a sequence of steps and processes that dictate how the engine renders graphics. It transforms 3D assets, lighting, and other scene components into the final 2D image gracing your screen. While URP and HDRP share some low-level tasks, each pipeline is tailored to specific project needs and target platforms.
Table 2.1 shows how URP and HDRP stack up against one another:
Table 2.1 – A comparison between URP and HDRP
A comfortable VR experience demands high frame rates, typically exceeding 90 FPS. URP emphasizes performance, ensuring smooth frame rates across various VR devices, including standalone VR headsets, PC-based VR, and mobile VR. As the market sees a growing number of standalone VR headsets, URP proves invaluable for its adaptability and ease of project optimization.
While URP may not boast the visual prowess of HDRP, it strikes a balance between graphic quality and performance, making it suitable for the majority of VR projects where performance is crucial.
Unity’s versatility extends beyond HDRP and URP, allowing experienced graphics programmers to create custom render pipelines with the Scriptable Render Pipeline (SRP). However, developing a custom SRP requires deep knowledge of 3D graphics programming, rendering pipelines, and proficiency in C#. For those lacking these skills, HDRP and URP offer an optimal balance between flexibility and ease of use.
Given its many advantages, URP emerges as the go-to choice for most VR endeavors. Throughout this book, we’ll focus exclusively on URP. However, HDRP remains a worthy contender for those pursuing high-end PC VR experiences.
With Unity installed, the render pipeline selected, and your project at your fingertips, it’s time to acquaint ourselves with the Unity Editor.
If you’re new to Unity, the editor’s interface can be a bit overwhelming at first. But don’t worry—we’ll guide you through the Unity Editor and show you how to navigate its various menus and panels. Experienced users can also benefit from staying current with the latest best practices and techniques, as designing for VR presents unique challenges that may require a different approach than traditional game development.
Upon launching a new Unity project, you’ll be greeted by the Unity Editor. This multifaceted workspace is composed of several distinct windows known as panels.
Figure 2.2 shows the window layout for the sample scene project we just created.
Figure 2.2 – The window layout for the sample scene project
Figure 2.2 showcases a number of panels, namely: (1) Scene view, (2) Game view, (3) Hierarchy, (4) Inspector, (5) Project, and (6) Console.
Let’s explore these essential panels that make up Unity’s interface.
Imagine the Scene view (1) as your canvas, where you’ll bring your game world to life, creating mesmerizing landscapes and placing your characters in fantastic environments. This panel is the heart of the game, where every object is placed and arranged to tell a compelling story. For example, you can select the Safety Hat GameObject within our sample scene and move, rotate, scale, or remove it.
The Game view (2) is the place where you can experience your game from your player’s perspective. It provides a real-time preview of the gameplay, including the visual rendering and user interface elements you’ve implemented.
The Hierarchy panel (3) or Scene Hierarchy window is your game’s blueprint, showcasing the organized list of every GameObject that makes up your game world. It’s like an architectural plan that helps you navigate, manage, and visualize the relationships between game elements, ensuring a coherent and structured experience. Our sample scene demonstrates what a well-structured hierarchy looks like. All added GameObjects are subordinate to the Example Assets parent object. This includes the Props GameObject, which itself serves as a parent object for any GameObject or asset that decorates the scene and adds detail and context to the game world, for example, Jigsaw, Hammer, and Workbench. So, keep in mind that grouping related objects under a parent GameObject makes it easier to manage and manipulate them as a single entity. Additionally, consistent naming conventions make the hierarchy more readable and easier to understand.
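The same parent-child nesting you build by dragging objects in the Hierarchy panel can also be established from code. Here is a minimal, hypothetical sketch using Unity’s `Transform.SetParent`; the object names match our sample scene, but the script itself is an illustration rather than part of the project:

```csharp
using UnityEngine;

// Illustrative sketch: grouping a prop under a shared parent from code,
// mirroring what dragging objects in the Hierarchy panel does.
public class GroupProps : MonoBehaviour
{
    void Start()
    {
        GameObject parent = GameObject.Find("Props");
        GameObject hammer = GameObject.Find("Hammer");
        if (parent != null && hammer != null)
        {
            // Parenting in code is equivalent to nesting in the Hierarchy panel;
            // 'true' keeps the hammer's current world position.
            hammer.transform.SetParent(parent.transform, true);
        }
    }
}
```

Because child transforms are expressed relative to their parent, moving or rotating the Props object now carries the Hammer along with it.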
The Inspector panel (4) or Inspector window is the control center for fine-tuning your game elements. It’s where you can adjust every tiny detail of your GameObjects or assets, making sure your game world is precisely how you envisioned it. From position and scale to adding components and modifying scripts, the Inspector panel is your ticket to perfection. Let’s select the Safety Hat object in the Scene Hierarchy window by navigating to Example Assets | Props | Safety Hat. The Inspector window shows all the components defined on the Safety Hat object. The Transform component is responsible for positioning, rotating, and scaling the object. You will find a gizmo representation of it in the Scene view, which allows us to transform objects directly there. Two other components visible in the Inspector window are Mesh Filter and Mesh Renderer, which together create and display 3D models in a scene: the Mesh Filter component defines the model’s geometry, while the Mesh Renderer component applies visual properties such as materials and textures. Without these components, you wouldn’t see the object in the Scene view.
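Everything the Inspector shows can also be read from a script via `GetComponent`, Unity’s standard way of accessing a GameObject’s components at runtime. The following hypothetical sketch, imagined as attached to the Safety Hat object, logs the same Transform, Mesh Filter, and Mesh Renderer data discussed above:

```csharp
using UnityEngine;

// Illustrative sketch: reading the components the Inspector displays.
public class InspectSafetyHat : MonoBehaviour
{
    void Start()
    {
        // Every GameObject has a Transform; position, rotation, and scale live here.
        Debug.Log($"Position: {transform.position}, Scale: {transform.localScale}");

        // Mesh Filter holds the geometry; Mesh Renderer draws it.
        MeshFilter filter = GetComponent<MeshFilter>();
        MeshRenderer meshRenderer = GetComponent<MeshRenderer>();
        if (filter != null && meshRenderer != null)
        {
            Debug.Log($"Mesh: {filter.sharedMesh.name}, Material: {meshRenderer.sharedMaterial.name}");
        }
    }
}
```

Note that `GetComponent` returns `null` if the component is missing, so checking the result before using it is good practice.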
The Project panel (5) holds all the building blocks of your game, from textures and models to sounds and scripts. It’s like a library of resources, where every imported or created asset is at your fingertips, waiting to be used in your XR project.
Finally, the Console panel (6) is an essential tool, assisting in the identification and resolution of issues during development. It provides detailed logs, warnings, and error messages, allowing for efficient troubleshooting and ensuring the integrity of the game’s performance.
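The three kinds of Console messages map directly to three logging methods in Unity’s `Debug` class. This minimal sketch produces one entry of each severity; the message texts are placeholders:

```csharp
using UnityEngine;

// Illustrative sketch: the three message types the Console panel displays.
public class ConsoleDemo : MonoBehaviour
{
    void Start()
    {
        Debug.Log("Info: scene loaded.");                       // plain log entry
        Debug.LogWarning("Warning: placeholder material in use."); // yellow warning icon
        Debug.LogError("Error: required component is missing.");   // red error icon
    }
}
```

Clicking an entry in the Console highlights the script and line that produced it, which makes `Debug.Log` one of the quickest troubleshooting tools at your disposal.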
You are now familiar with the default panels in Unity. Next, we will get to know the Unity Grid and Snap system, which is a game changer when building scenes.
Tip
You can keep the default layout, or customize your panels using the dropdown at the top right of the Unity Editor under Layout. We usually prefer the 2x3 layout, but for simplicity, we will use the default layout in this book.
The Unity Grid and Snap system helps align and place objects in a more organized manner in the game environment. It allows you to snap objects to a grid and also to other objects for easier placement and arrangement.
Figure 2.3 shows you where to find the system.
Figure 2.3 – How to use the Grid and Snap system
To make use of this system, you will need to turn on the Grid Snapping button, which is represented by an icon with a grid and magnet (1). Furthermore, make sure to activate the Global handles by selecting the Global icon (2) positioned adjacent to the Grid Snapping field.
The Grid Size field in the Grid Snapping dropdown refers to the size of the grid squares in the Scene view. By default, it is set to 1 on all three axes (X, Y, and Z), meaning each grid square has a width, height, and depth of one unit. You can adjust this value to match the scale of the objects in your scene. The grid size determines the increment at which objects snap to the grid: a larger grid size results in coarser snapping, while a smaller grid size results in finer snapping.
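Under the hood, snapping is just rounding each coordinate to the nearest multiple of the grid size. This standalone C# sketch (plain arithmetic, not a Unity API) shows the idea:

```csharp
using System;

// Standalone sketch of the arithmetic behind grid snapping:
// each coordinate is rounded to the nearest multiple of the grid size.
public static class GridSnap
{
    public static double Snap(double value, double gridSize)
    {
        return Math.Round(value / gridSize) * gridSize;
    }

    public static void Main()
    {
        Console.WriteLine(GridSnap.Snap(2.3, 1.0));   // 2    (coarse: one-unit grid)
        Console.WriteLine(GridSnap.Snap(2.3, 0.25));  // 2.25 (finer grid, finer snapping)
    }
}
```

As the two calls show, the same position lands on different spots depending on the grid size, which is exactly the coarse-versus-fine behavior described above.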
Your turn
1. Select the same stud as in Figure 2.3 and move it two units in the Y direction. Activate Grid Snapping and try different grid sizes to place the stud back on the workbench.
2. Try to become familiar with navigation in the Scene view by moving, orbiting, and zooming through the scene. This documentation may help you: https://docs.unity3d.com/510/Documentation/Manual/SceneViewNavigation.html
3. Pick another two GameObjects and move, scale, and rotate them as you like.
GameObjects and components are essential building blocks of Unity projects, allowing developers to create interactive and dynamic content. Now, we’ll provide a comprehensive overview of GameObjects and components and how they work together.
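To preview how the two fit together, here is a small, hypothetical sketch: a GameObject is an empty container that gains behavior and appearance only through the components added to it, whether in the editor or, as here, via `AddComponent` in code:

```csharp
using UnityEngine;

// Illustrative sketch of the GameObject/component relationship:
// an empty GameObject gains functionality only through its components.
public class BuildProp : MonoBehaviour
{
    void Start()
    {
        GameObject prop = new GameObject("MyProp");   // empty container with only a Transform
        prop.AddComponent<MeshFilter>();              // holds the geometry
        prop.AddComponent<MeshRenderer>();            // makes the geometry visible
        prop.transform.position = new Vector3(0f, 1f, 0f);
    }
}
```

We will build on this pattern throughout the book, so a rough mental model of "container plus components" is enough for now.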
Unity empowers developers with the ability to create multiple scenes within the editor. This feature aids in managing complexity, enhancing performance, and fostering more modular and reusable game projects. To create a new default scene alongside our sample scene, access the menu at the uppermost section of the editor and choose File | New Scene | Standard (URP). Save it in your Scenes