With two decades of programming experience across multiple languages and platforms, expert game developer and console porting programmer Michael Dunsky guides you through the intricacies of character animation programming. This book tackles the common challenges developers face in creating sophisticated, efficient, and visually appealing character animations.
You’ll learn how to leverage the Open Asset Import Library for easy 3D model loading and optimize your 3D engine by offloading computations from the CPU to the GPU. The book covers visual selection, extended camera handling, and separating your application into edit and simulation modes. You’ll also master configuration storage to progressively build your virtual world piece by piece.
As you develop your engine-like application, you’ll implement collision detection, inverse kinematics, and expert techniques to bring your characters to life with realistic visuals and fluid movement. For more advanced animation and character behavior controls, you’ll design truly immersive and responsive NPCs, load real game maps, and use navigation algorithms, enabling the instances to roam freely in complex environments.
By the end of this book, you’ll be skilled at designing interactive virtual worlds inhabited by lifelike NPCs that exhibit natural, context-aware behaviors.
You can read this e-book in Legimi apps or in any app that supports the following format:
Page count: 679
Year of publication: 2025
Mastering C++ Game Animation Programming
Enhance your skills with advanced game animation techniques in C++, OpenGL, and Vulkan
Michael Dunsky
Mastering C++ Game Animation Programming
Copyright © 2025 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
Portfolio Director: Rohit Rajkumar
Relationship Lead: Neha Pande
Project Manager: Sandip Tadge
Content Engineer: Rashi Dubey
Technical Editor: Tejas Mhasvekar
Copy Editor: Safis Editing
Proofreader: Rashi Dubey
Indexer: Rekha Nair
Presentation Designer: Aparna Bhagat
Marketing Owner: Nivedita Pandey
Growth Lead: Namita Velgekar
First published: March 2025
Production reference: 1110325
Published by Packt Publishing Ltd.
Grosvenor House
11 St Paul’s Square
Birmingham
B3 1RB, UK.
ISBN 978-1-83588-192-7
www.packt.com
No bits were harmed during development.
Michael Dunsky is an electronics engineer, console porting programmer, and game developer with more than 20 years of programming experience. He started with BASIC at the young age of 14 and expanded his portfolio over the years to include assembly language, C, C++, Java, Python, VHDL, OpenGL, GLSL, and Vulkan. During his career, he also gained extensive knowledge of Linux, virtual machines, server operation, and infrastructure automation. Michael holds a Master of Science degree in Computer Science from the FernUniversität in Hagen with a focus on computer graphics, parallel programming, and software systems.
Thanks to Marco and Morris for giving me the green light to create my second book, completed again in my spare time. And thanks to the team at Packt for the great support.
Wanderson dos Santos Lopes is a seasoned software engineer with proven experience in both gameplay and server programming, specializing in C++ and Unreal Engine. With an extensive background in developing multiplayer mechanics and robust backend systems, he writes performance-first code and optimizes core software across platforms. His expertise in diagnosing and resolving bugs and implementing maintainable technologies has contributed to several high-profile gaming projects.
Bill Merrill acquired a BS in Computer Science from the University of California at Riverside and has been an engineer in the games industry for over 22 years, specializing in animation, AI, gameplay, and engine systems. In addition to several proprietary projects in robotics and simulation with Amazon, Bill has shipped numerous AAA games, from major franchises such as Silent Hill and Front Mission to original IPs such as Evolve and New World. He has also contributed to the industry through GDC talks and peer-reviewed contributions to the Game AI Pro book series. Bill now functions as a hands-on Technical Director working on a new, unannounced IP.
Preface
Who this book is for
What this book covers
To get the most out of this book
Get in touch
Share Your Thoughts
Part 1: Populating the World with the Game Character Models
Working with Open Asset Import Library
Technical requirements
Getting the source code and the basic tools
Getting the code using Git
Getting the code as a ZIP file
Installing the required tools and libraries for Windows
Installing Visual Studio 2022 on Windows
Enabling long path names on Windows
Downloading Open Asset Import Library
Configuring the build
Installing the Vulkan SDK on Windows
Compiling and starting the example code
Installing the required tools and libraries for Linux
Downloading Open Asset Import Library
Installing a C++ compiler and the required libraries on Linux
Compiling the examples via the command line on Linux
Installing Eclipse on Linux
Code organization in this book
Animating game characters – a primer
About nodes, bones, skeletal animation, and skinning
Preparing the data for efficient usage
Updating character data
What is Open Asset Import Library?
Loading a model file
Loading embedded textures
Parsing the node hierarchy
Adding vertex buffers for the meshes
Importing animations
Checking the code for all the details
Extending the UI with an Open File dialog
Integrating the file dialog into CMakeLists.txt
Using the ImGui file dialog
Adding a filter to show only supported file types
Adding a single filter
Adding a group of filters
Adding a regular expression-style filter
Loading the model file
Drawing the model to the screen
Adding and removing model instances dynamically
Reusing the bones for the sake of simplicity
Storing instance-specific settings
Dynamic model and instance management
Drawing all instances
Summary
Practical sessions
Additional resources
Moving Animation Calculations from CPU to GPU
Technical requirements
What are compute shaders and why should we love them?
The famous raster interrupt
The rise of multi-core machines
Hidden multi-core champions
Welcome to the wonderful world of compute shaders
Profiling animation performance
Locating the hotspots in the code
Analyzing the current data representation
Adjusting the data model
Adding missing data for the compute shader
Relocating data to another shader
Doing the last preparations
Moving the node computations to the GPU
Adding more shader storage buffers
Calculating the node transforms in a shader
Creating the final node matrices
Finalizing the compute relocation
Testing the implementation by scaling up
How to debug a compute shader
Summary
Practical sessions
Additional resources
Adding a Visual Selection
Technical requirements
Implementing a “move to instance” function
Adding coordinate arrows
Creating a button to center the selected instance
Adding a highlight to the selected instance
Preparing the renderer to support highlights
Adjusting logic to shaders and the UI
Selecting a model instance with point and click
Pros and cons of shooting virtual rays
Advantages of drawing the instance index into a texture
Adjusting the framebuffer
Creating a selection shader
Reading a pixel from a texture
Adding mouse button handling
Assigning an index to each instance
Selecting the instance at mouse positions
Implementing a null object to allow deselection
What is a null object?
Creating and using the AssimpModel null object
Adjusting the user interface
Summary
Practical sessions
Additional resources
Part 2: Transforming the Model Viewer into an Animation Editor
Enhancing Application Handling
Technical requirements
Switching between edit and view modes
Deciding what should be switched in view mode
Adding the state variable plus code
Toggling between the two modes and changing the title
An outlook for future changes
Reverting changes before applying
The basic idea
Adding code and User Interface elements
Drawbacks of the current solution
Implementing undo and redo functionality
What do we need to create undo and redo?
Creating a setting storage class
Hooking up the storage class to the renderer
Defining hotkeys for undo and redo
Adding an ImGui menu to allow direct access
Limits and enhancements of our undo/redo implementation
Summary
Practical sessions
Additional resources
Saving and Loading the Configuration
Technical requirements
Textual or binary file formats – pros and cons
Saving and loading binary data
Saving and loading textual data
Choosing a text format to save our data
The INI file format
The JSON file format
The YAML file format
Exploring the structure of a YAML file
The YAML node
The YAML map
The YAML sequence
Combinations of maps and sequences
Adding a YAML parser
Getting yaml-cpp
Integrating yaml-cpp into the CMake build
Adding the parser class
Using the node type of yaml-cpp
Accessing sequences and maps
Handling exceptions thrown by yaml-cpp
Saving and loading the configuration file
Deciding what to store in the configuration file
Overloading the output operator of the emitter
Creating and writing the configuration file
Adding a file dialog to the user interface
Loading the configuration file back and parsing the nodes
Cleaning up and recreating the scene from the saved values
Strict or relaxed configuration file loading
Common errors leading to corrupted files
Loading a default configuration file at startup
Summary
Practical sessions
Additional resources
Extending Camera Handling
Technical requirements
Adding multiple cameras
From a single camera to an array of cameras
Extracting the camera settings
Adjusting the renderer
Defining a free camera as the default camera
Adding and deleting cameras
Adjusting camera configuration load and save
Bumping the configuration file version
Creating different camera types
Implementing first- and third-person cameras
Retrieving the bone matrix for the first-person view
Computing first-person camera parameters
Moving the camera in a third-person view
Disabling manual camera movement
Limits of current first-/third-person cameras
Adding stationary cameras
Creating a stationary follow camera
Switching between cameras and configurations
Configuring keyboard shortcuts for camera selection
Adding orthogonal projection
User interface controls for the projection settings
Summary
Practical sessions
Additional resources
Part 3: Tuning Character Animations
Enhancing Animation Controls
Technical requirements
Blending between animations with style
The power of lookup tables
Creating the lookup tables
Uploading the data tables to the GPU
Adjusting the renderer code and the compute shader
Adding new states to the code
Using bit fields and plain enums
Extending model and instance settings
Adding the idle/walk/run logic
Using acceleration and deceleration
Linking states and animations
Mapping idle/walk/run animations
Mapping actions to animation clips
Defining allowed state changes
Using a finite state machine to control the animation flow
Saving and loading the states
Storing the new data types
Reading back the model settings in the renderer
Summary
Practical sessions
Additional resources
An Introduction to Collision Detection
Technical requirements
The complexities of collision detection
Avoiding the naive way
Using spatial partitioning to reduce complexity
Grid
Quadtree
Octree
Binary space partitioning
K-d tree
Bounding volume hierarchy
Simplifying the instances for faster collision checks
Axis-aligned bounding box
Oriented bounding box
Bounding circles and spheres
Capsule
Convex hull
Bounding volume hierarchy
Adding a quadtree to store nearby model instances
Adjusting the bounding box code
Rewriting the quadtree code to fit our needs
Calculating the instance bounding boxes
Adding a three-dimensional bounding cube class
Creating the AABB lookup tables
Using the AABB in the model and renderer
Creating a window to show the quadtree plus contents
Retrieving the colliding instances and reacting to collisions
Drawing the AABB debug lines
Implementing bounding spheres
Creating the data for the bounding spheres
Drawing bounding spheres
Using the bounding spheres for collision detection
Summary
Practical sessions
Additional resources
Adding Behavior and Interaction
Technical requirements
Structures to control instance behavior
Adding a visual node editor
Integrating imnodes by using CMake
Using imnodes to create UI elements
Creating the imnodes context
Setting default values for imnodes
Creating the node editor
Adding a simple node
Creating imnodes attributes and ImGui elements in a node
Maintaining links between nodes
Creating graph node classes
Exploring the base class for graph nodes
Creating the wait node
Using callbacks to propagate changes
Creating a behavior struct and a storage class for the instances
Adding a node factory
Extending the node editor
Saving and loading a node tree
Extending the code to support behavior changes
Creating a node tree copy for every instance
Connecting SingleInstanceBehavior and the renderer
Adding events
Limitations of the current implementation
Adding interaction between instances
Creating interaction control properties
Extending the handling code
Drawing debug information
Summary
Practical sessions
Additional resources
Advanced Animation Blending
Technical requirements
How to animate facial expressions
Adding face animations to code and GPU shaders
Loading morph meshes
Storing all morph meshes in a single buffer
Adding face morph settings to the code
Filling the per-instance buffer data in the renderer
Extending the shader to draw face animations
Finalizing the face animation code
Adding UI elements to control face animations
Saving and loading the new instance settings
Using face animations in node trees
Adjusting the code for the new FaceAnim node
Adding the FaceAnim node
Enabling instance and renderer to react to face animation changes
Limitations of morph target animations
Implementing additive blending
How additive blending works
Extending the code to support additive animations
Creating mappings for the new head animations
Adding a head animation node
Saving and loading the head animation settings
Summary
Practical sessions
Additional resources
Part 4: Enhancing Your Virtual World
Loading a Game Map
Technical requirements
Differences between map and model data
Level data does not move around
Using a separate collision detection for level data
Level data may contain additional data
Spatial division
Lightmaps
Navigation mesh
Hierarchical level-of-detail
Level data may be partial or incomplete
Choosing a file format for a map
Using levels in file formats supported by Assimp
Extending existing formats or creating a custom format
Importing a game map
Adding a C++ class to hold the level data
Adding callbacks and renderer code
Extending the UI with level property controls
Saving and loading the level configuration
Converting the quadtree to an octree
Creating an interactive octree view
Adding interactivity
Collecting the lines
Calculating the view and drawing the lines
Building an AABB for the level data
Using non-animated models as assets
Sending the level data to the GPU
Creating a new shader
Drawing the level AABB
Summary
Practical sessions
Additional resources
Advanced Collision Detection
Technical requirements
Enhancing collision detection for level data
Adding a new octree type
Filling the level data octree
Detecting instance/level collisions
Drawing debug lines
Extending the node tree to support level geometry collisions
Using gravity to keep the instances on the floor
Finding ground triangles in level data
Adding basic gravity
Keeping the instances on the ground triangles
Adding inverse kinematics
The two types of kinematics
Understanding the FABRIK basics
Implementing the FABRIK inverse kinematics algorithm
Defining the node chain for the instance’s feet
Adjusting the node positions
Keeping the node transformations separate
Adding the code for the animated instances
Creating the new world positions
Detecting feet-to-ground collisions
Running the FABRIK solver
Limitations of FABRIK
Summary
Practical sessions
Additional resources
Adding Simple Navigation
Technical requirements
An overview of different ways to navigate
Distance-based navigation
Graph-based navigation
DFS and BFS algorithms
Dijkstra’s algorithm
A* algorithm
Mesh-based navigation
Navigation meshes
Area awareness system
Using machine learning to generate navigation data
The A* path-finding algorithm
Estimating the distance to the target
Minimizing path costs
Exploring the A* algorithm
Implementing A*-based navigation
Preparing the mesh triangles
Adding the path-finding class
Generating ground triangles
Finding a path between two ground triangles
Preparing the data
Running the main loop
Extracting the best node
Backtracking the shortest path
Adding navigation targets to the map
Adjusting model and instance
Adding gravity for non-animated instances
Saving and loading the new model and instance data
Navigating instances to a target
Calculating the path to the target
Rotating the instance to reach the target
Adding debug lines for the path
Summary
Practical sessions
Additional resources
Creating Immersive Interactive Worlds
Technical requirements
Adding sound effects and background music
Using an audio library
Simple DirectMedia Layer
OpenAL
PortAudio
FMOD
Playing sound effects
The game character’s footsteps
Other character sounds
Local sound sources
Ambient sounds
Weather effects
Playing music
Menu music
Ambient music
Adaptive music play
Allowing custom music
Hands-on: Implementing an audio manager
Defining the high-level interface
Using SDL for the low-level layer
Controlling music replay
Adding a callback for continuous music playback
Playing sound effects
Using the footstep sound effects in the renderer
Extending the audio manager class
Using the music player in the UI
Alternative sound manager implementations
Enhancing visuals
Bringing colors to the world by using physically based rendering
Adding transparency
Looking up at a beautiful sky
Playing with light and shadows
Swimming in realistic water
Adding stunning post-processing effects
Upgrading to ray tracing
Diving into virtual reality
Hands-on: Adding a skybox to the virtual world
Exploring the technical details
Implementing the skybox
Drawing the skybox
Extending immersion with daytime and weather
Adding a day/night cycle
Allowing forward time travel
Playing in real time
Worshipping the weather god
Listening to the oracle of seasons
Hands-on: Adding day and night
Implementing light control
Adding a UI control
Summary
Practical sessions
Additional resources
Other Books You May Enjoy
Index
Once you’ve read Mastering C++ Game Animation Programming, we’d love to hear your thoughts! Please click here to go straight to the Amazon review page for this book and share your feedback.
Your review is important to us and the tech community and will help us make sure we’re delivering excellent quality content.
Thanks for purchasing this book!
Do you like to read on the go but are unable to carry your print books everywhere?
Is your eBook purchase not compatible with the device of your choice?
Don’t worry, now with every Packt book you get a DRM-free PDF version of that book at no cost.
Read anywhere, any place, on any device. Search, copy, and paste code from your favorite technical books directly into your application.
The perks don’t stop there: you can get exclusive access to discounts, newsletters, and great free content in your inbox daily.
Follow these simple steps to get the benefits:
Scan the QR code or visit the link below:

https://packt.link/free-ebook/978-1-83588-192-7
Submit your proof of purchase

That’s it! We’ll send your free PDF and other benefits to your email directly.

The first part of the book starts with an introduction to Assimp, the Open Asset Import Library. You will learn how to load a character model from a file, how to process the different elements of the character, such as meshes, textures, and nodes, and how to draw the model to the screen. You will also learn how to move the computational load to the GPU by using compute shaders, freeing the CPU for the features introduced in the book. Finally, you will explore the idea and implementation of a visual selection, enabling you to select a model instance on the screen with a click of the mouse.
This part has the following chapters:
Chapter 1, Working with Open Asset Import Library

Chapter 2, Moving Animation Calculations from CPU to GPU

Chapter 3, Adding a Visual Selection

Welcome to Mastering C++ Game Animation Programming! Are you the kind of person who looks at the animated models in a computer or console game, or in a 3D animation tool, and asks yourself questions like:
How does this work? How do they do this? Could I do this myself, too?
If so, this book will set you on the right path. Over the next 14 chapters, you will learn how to create your own little game character model viewer.
The book starts with loading a file using Open Asset Import Library, converting the data structures from the importer library into more efficient data structures for rendering, and rendering the character model with a simple OpenGL or Vulkan renderer. You will also learn how to optimize data updates and rendering by relocating computational load to the GPU in the form of GPU-based lookup tables and compute shaders.
For the character animations, you will not only dive into normal animation blending but also be introduced to state-based animation control, additive animation blending to move the head independently of the rest of the body, and facial animations. You will also learn how to control the behavior of the instances by using a simplified version of behavior trees and implement interaction between the instances on the screen.
To give a proper home to the game characters, you will learn how to load a game map into the application. Moving around in the game map will be enhanced by adding collision detection, inverse kinematics for the character feet, and simple navigation to let the instances run around fully on their own in the virtual world.
In addition to the animations, features such as interactive selection by using the mouse, saving and loading the configuration to a file to allow working on larger virtual worlds, and handling different cameras in the virtual world are introduced. Also, a graphical, node-based configuration will be implemented, enabling you to change the behavior of the instances in a non-programming way.
With all these steps combined, your virtual characters in the virtual world will come closer to real game characters.
Join our community on Discord
Join our community’s Discord space for discussions with the author and other readers: https://packt.link/cppgameanimation.
Every journey starts with the first step, so welcome to Chapter 1! This chapter will set the foundation for the animation application, as you will get an insight into how to load a model file from your computer into the program, position the instance in the vast emptiness of the virtual world, and play the animations that are included in the file. By the end of this chapter, your game character model will be able to jump, run, or walk on the screen, maybe surrounded by non-animated models or other static objects.
In this chapter, we will cover the following topics:
Animating game characters – a primer

What is Open Asset Import Library?

Loading a model file

Extending the UI with an Open File dialog

Adding and removing model instances dynamically

As we will use open-source software and platform-independent libraries in this book, you should be able to compile and run the code “out of the box” on Windows and Linux. You will find a detailed list of the required software and libraries, plus their installation instructions, in the following Technical requirements section.
For this chapter, you will need the following:
A PC with Windows or Linux, and the tools listed later in this section

Git for source-code management

A text editor (such as Notepad++ or Kate) or a full IDE (such as Visual Studio 2022 for Windows, or Eclipse/KDevelop for Linux)

Important note
A recent C++ compiler is required to compile the code. In the current CMake build system, C++17 is configured, but the code is known to work with newer C++ standards, up to and including C++26 (although the compiler must support those standards).
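As a hedged illustration of the note above, the language standard in CMake is controlled by two standard variables. Where exactly the book’s projects set them depends on the individual CMakeLists.txt files, so treat this as a sketch rather than a copy of the book’s build configuration:

```cmake
# Sketch: requesting a specific C++ standard in CMake.
# The book's projects configure C++17; a newer standard can be requested
# like this, provided your compiler supports it.
set(CMAKE_CXX_STANDARD 17)          # change to 20, 23, ... if desired
set(CMAKE_CXX_STANDARD_REQUIRED ON) # fail configuration if unsupported
```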
Now, let’s get the source code for this book and start unpacking the code.
The code for this book is hosted on GitHub, which you can find here:
https://github.com/PacktPublishing/Mastering-Cpp-Game-Animation-Programming
You need to install Git since the build system utilizes Git to download the third-party projects used in the examples.
On Linux systems, use your package manager. For Ubuntu, the following line installs Git:
sudo apt install git

On Windows, you can download Git here: https://git-scm.com/downloads.
To unpack the code, you can use either of the following two methods.
To get the code in the book, you should use Git. Using Git offers you additional features, such as creating a local branch for your changes, keeping track of your progress, and comparing your updates to the example code. Also, you can easily revert changes if you have broken the code during the exploration of the source code, or while working on the practical sessions at the end of each chapter.
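The Git workflow described above can be sketched with a few commands. This example uses a throwaway repository created in a temporary directory, so it is safe to try anywhere; the directory, branch, and file names are illustrative only:

```shell
# Sketch: a throwaway Git repository demonstrating branching and reverting.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "you@example.com"
git config user.name "You"
echo "original" > notes.txt
git add notes.txt
git commit -q -m "baseline"
# Create a local branch for your own experiments
git checkout -q -b my-experiments
# Simulate breaking a file while exploring the code...
echo "broken edit" > notes.txt
# ...and revert it to the last committed state
git checkout -q -- notes.txt
cat notes.txt
```

The same pattern applies to the book’s repository: work on a local branch, and `git checkout -- <file>` restores any file you have broken.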
You can get a local checkout of the code in a specific location on your system either through the Git GUI, by cloning the repository in Visual Studio 2022, or by executing the following command at a command prompt:
git clone https://github.com/PacktPublishing/Mastering-Cpp-Game-Animation-Programming

Please make sure that you use a path without spaces or special characters such as umlauts, as these might confuse some compilers and development environments.
Although Git is recommended, you can also download the code as a ZIP file from GitHub. You will need to unpack the ZIP file to a location of your choice on your system. Also, make sure that the path the ZIP file is unpacked to contains no spaces or special characters.
Before we can use the code from the book, some tools and libraries must be installed. We will start with the Windows installation, followed by the Linux installation.
To compile the example code on a Windows machine, I recommend using Visual Studio 2022 as the IDE since it contains all you need for a quick start. Using other IDEs like Eclipse, Rider, or KDevelop is no problem as the build is managed by CMake, but you may need to install a C++ compiler like MSYS2 plus the compiler packages as an additional dependency.
If you want to use Visual Studio for the example files and don’t have it installed yet, download the free Community Edition of Visual Studio at https://visualstudio.microsoft.com/de/downloads/.
Then, follow these steps:
Choose the Desktop development with C++ option so that the C++ compiler and the other required tools are installed on your machine:

Figure 1.1: Installing the C++ desktop development in Visual Studio 2022

Then, under Individual components, also check the C++ CMake tools for Windows option:

Figure 1.2: Check the box for CMake tools for Windows to be installed in Visual Studio 2022
Finish the installation of Visual Studio, start it, and skip the initial project selection screen.

On a fresh installation of Windows 10 or 11, the maximum path length for files is 260 characters. Depending on the location of the folder containing the code for the book, Visual Studio 2022 might run into errors caused by the paths of temporary build folders exceeding the 260-character limit.
To enable long path names, the Windows Registry needs to be adjusted. A simple way is to create a text file with the .reg extension, for instance, long-paths.reg, and copy the following content to the file:
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem]
"LongPathsEnabled"=dword:00000001

A double-click on the file will automatically start the Windows Registry Editor to import the settings into the Windows Registry. After confirming the UAC dialog and the subsequent warning dialogs by clicking Yes, the Registry Editor will import the new settings.
Now, reboot the PC to activate the long path names and continue with the installations.
For Windows, Open Asset Import Library must be built and installed from the source files. Clone the repository from https://github.com/assimp/assimp in a new Visual Studio 2022 project, as shown in Figure 1.3:
Figure 1.3: Cloning the asset importer GitHub repository within Visual Studio 2022
As an alternative, you can create a clone from a Git Bash, or via the Git GUI:
git clone https://github.com/assimp/assimp

We need to make a few adjustments to create a static library instead of a dynamic library. Using a static library makes the build process easier for us, as we don’t have to worry about an additional DLL file.
To change the CMake settings, choose the following option after right-clicking on the CMakeLists.txt file:
Figure 1.4: Changing the CMake settings for the asset importer
In the Configuration tab of Visual Studio 2022 that appears, change the configuration name to x64-RelWithDebInfo, and change the configuration type to RelWithDebInfo:
Figure 1.5: Modifying the current configuration of the asset importer
By using RelWithDebInfo, a release version with debug information will be created. The resulting executable will be optimized by the compiler, but the file still contains data to allow debugging the program in case of problems.
Next, change the following settings in the CMake settings. You can use the search field on the bottom left, named Filter variables..., to search for the specified setting:
Disable building a shared library:

Figure 1.6: Switching the setting to create a static library

Change the linking of the C runtime:

Figure 1.7: Linking the C runtime statically

Remove the library suffix to create a file name without the compiler version:

Figure 1.8: Removing the suffix of the created file
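The GUI settings above can also be expressed as CMake cache entries. This is a non-authoritative sketch: the option names below are taken from Assimp’s build system, so please verify them against the version you cloned before relying on them:

```cmake
# Sketch: cache entries for a static RelWithDebInfo build of Assimp.
#   BUILD_SHARED_LIBS=OFF     -> build a static library (Figure 1.6)
#   USE_STATIC_CRT=ON         -> link the C runtime statically (Figure 1.7)
#   ASSIMP_LIBRARY_SUFFIX=""  -> drop the compiler-version suffix (Figure 1.8)
set(BUILD_SHARED_LIBS OFF CACHE BOOL "Build a static Assimp library" FORCE)
set(USE_STATIC_CRT ON CACHE BOOL "Static C runtime on MSVC" FORCE)
set(ASSIMP_LIBRARY_SUFFIX "" CACHE STRING "No library name suffix" FORCE)
```

The same values can be passed on a cmake command line as `-D` options instead of being set in the GUI.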
Next, select Build and then Install in the context menu of the CMakeLists.txt file.
After the installation is finished, the following folder structure will be generated:
Figure 1.9: Asset importer library and includes
We have to make all the files discussed in this section available for all examples in the book. To do this, two options are available – copy the files to a fixed path or add an environment variable.
First, create this folder on your computer:
C:\Program Files\assimp

Then, copy the two folders, lib and include, into it:
Figure 1.10: The two folders have been copied to the Program Files folder
The CMake search script for Assimp will try to find the static library and the header files in this folder.
As an alternative solution, you can create a folder anywhere on your PC, for instance, D:\assimp. Then, copy the lib and include folders into it and set the environment variable ASSIMP_ROOT to the location of the created folder:
Figure 1.11: The environment variable ASSIMP_ROOT pointing to a folder on the PC
Please remember that you have to restart Visual Studio 2022 after setting the environment variable.
For Vulkan support, you also need to have the Vulkan SDK installed. Get it here: https://vulkan.lunarg.com/sdk/home.
Do a default installation, and make sure to add the GLM headers and the Vulkan Memory Allocator header, as the CMake search scripts will use them if the Vulkan SDK is installed:
Figure 1.12: Adding GLM and VMA during Vulkan SDK installation
Make sure to restart Visual Studio 2022 after installing the Vulkan SDK to allow detecting the Vulkan SDK header files and environment variables.
Running the examples can be done in two different ways: following the book example by example or compiling all the code at once to browse all the examples.
Compiling the code can be done using the following steps:
To open an example project, choose Open a local folder from the Visual Studio 2022 start screen or Open CMake from the File menu of Visual Studio 2022. Then navigate to the folder with the example code you want to compile, or to the top-level folder of the example code if you want to compile all examples at once. Visual Studio will automatically detect and configure CMake in the selected folder for you. The last line of the output window should be as follows:

1> CMake generation finished.

This confirms the successful run of the CMake file generation.

Now, set the startup item by right-clicking on the CMakeLists.txt file – this step is required to build and run the project:

Figure 1.13: Configuring the startup item in Visual Studio 2022
After setting the startup item, we can build the current project. Right-click on the CMakeLists.txt file and choose Build:

Figure 1.14: Build the project in Visual Studio 2022
After the compilation succeeds, start the program in a non-debug build by using the unfilled green arrow:

Figure 1.15: Start the compiled program without debugging in Visual Studio 2022
If you are a Linux user, you can follow the explanation in the following section to get all the tools and libraries onto your system.
Modern Linux distributions already contain most of the tools needed to compile the example code for the book.
For the common Linux distributions, Assimp should be available from the package manager. For Ubuntu, you need to install the Assimp development package:
sudo apt install libassimp-dev

If you use Ubuntu Linux, all required dependencies can be installed by using the integrated package manager. Use this command to install the packages for the OpenGL-based examples:
sudo apt install git gcc g++ cmake ninja-build libglew-dev libglm-dev libglfw3-dev zlib1g-dev

To use Clang as a compiler, instead of GCC, you can use this command:
sudo apt install git llvm clang cmake ninja-build libglew-dev libglm-dev libglfw3-dev zlib1g-dev

If you plan to build the Vulkan examples, these additional packages are required and should be installed to get the most out of the Vulkan code:
sudo apt install glslang-tools glslc libvulkan-dev vulkan-validationlayers

If you want to use the latest Vulkan SDK instead of the Ubuntu version, you can download the package from the LunarG website:
https://vulkan.lunarg.com/sdk/home#linux
For other Linux distributions, the package manager and the names of the packages may differ. For instance, on an Arch-based system, this command line will install all required packages to build the OpenGL examples:
sudo pacman -S git cmake gcc ninja glew glm glfw assimp zlib

For the Vulkan examples, these additional packages are required on Arch-based installations:
sudo pacman -S vulkan-devel glslang

The examples can be compiled directly on the command line, without using an IDE or editor. To build a single example, change into the chapter and example subfolders of the folder containing the cloned repository, create a new subfolder named build, and change into the new subfolder:
$ cd chapter01/01_assimp_opengl
$ mkdir build && cd build

To compile all examples at once, create the build folder in the top-level folder of the example code and then change into the new subfolder.
Then, run CMake to create the files required to build the code with the ninja build tool:
$ cmake -G Ninja ..

The two dots at the end are needed; CMake needs the path to the CMakeLists.txt file.
If you build a single example, let ninja compile the code and run the generated executable file:
$ ninja && ./Main

If all the required tools and libraries are installed and the compilation is successful, an application window should open.
When building all examples at once, a new folder named bin will be created inside the top-level folder. It contains a subfolder for every chapter, and every chapter’s folder contains subfolders for the examples of that chapter, mirroring the source-code structure.
In case of build errors, you need to check the requirements again.
If you want to use an IDE, you can continue with the installation of Eclipse.
If you want to compile the example code with the Eclipse IDE on Linux, some extra steps are required:
Download and install Eclipse IDE for C/C++ Developers from https://www.eclipse.org/downloads/packages/.

After installing Eclipse, head to the marketplace under Help:

Figure 1.16: Accessing the Eclipse marketplace
Install the cmake4eclipse and CMake Editor packages. The first one enables CMake support in Eclipse, with all the features we need, and the second one adds syntax coloring to the CMake files. The extra colors make it more convenient to edit the files:

Figure 1.17: Installing the CMake Editor and cmake4eclipse
Compiling and starting the example code can be done in the following steps:
Select Open Project from File System from the File menu.

Choose Directory... and navigate to the folder with the source code:

If you want to build all examples at once, select the top-level source folder, press Deselect All, and select only the first project.

To build only a single example, you can either use Deselect All on the top-level folder and select only the example you want to build, or you can descend into the folder for the specific example.

Click on Finish to open the project.

Next, choose Build Project from the context menu of the project folder.

You may need to switch the console output to show the current build messages. Use the small arrow with the tooltip Display Selected Console:

Figure 1.18: Selecting the right output to see the build messages
If Eclipse does not refresh the project content after the build, choose Refresh from the context menu of the project folder, or press F5.

Choose Run As, and select the second option, Local C/C++ application.

Select the Main executable from the window to run the program.

Figure 1.19: Choosing the Main executable to run the compiled application
As the last step of the preparations, we look at the organization of the code in the GitHub repository of the book.
The code for every chapter is stored in the GitHub repository, in a separate folder with the relevant chapter number. The number uses two digits to get the ordering right. Inside each folder, one or more subfolders can be found. These subfolders contain the code of the chapter, depending on the progress of that specific chapter.
For all chapters, we put the Main.cpp file and the CMake configuration file, CMakeLists.txt, into the project root folder. Inside the cmake folder, helper scripts for CMake are stored. These files are required to find additional header and library files.
All C++ classes are located inside subfolders that group the classes of the objects we create. The Window class will be stored in the window subfolder to hold all files related to the class itself, and the same applies to the tools – the logger, the model classes, and the renderer-related classes. Now that you have all the required code and tools installed, let’s get a general idea of what game character animations are about.
Moving a game character around in a virtual world with lots of different animations, changeable outfits, a collision detection system for other characters and the environment, and maybe even interaction with other characters looks nice and simple when playing a game.
But the mathematics and techniques behind the smoothly animated game characters are extensive and complex. Every movement, animation, action, or state change of the character involves a long journey until the final image is rendered to the screen.
Let’s look at a high-level explanation of animations first. If you already know the details, you can skip to the next section.
The building blocks of an animated three-dimensional character model are the so-called nodes. A node can be compared to a joint in the virtual body of the character, like the shoulder or hip.
All nodes in the character model are connected in the form of a virtual skeleton, forming the bones of the model. By attaching child nodes to a node, modeling an arm with a hand and fingers, or a leg with a foot and toes – or even an entire human-like skeleton – is no problem. The starting point of the virtual skeleton is the so-called root node. The root node has no parent node and is used as the starting point for animations.
Usually, the virtual skeleton does not reach the level of detail of a real-world skeleton, since many of the real bones are static or play only a minor role in muscle or pose changes during animation.
The virtual skeleton of the character model can be animated by rotating nodes around their center point – and thus rotating the bone to all attached child nodes around the center point of this node. Just imagine raising your arm a bit: your upper arm will rotate around your shoulder joint, and the lower arm, hand, and finger follow the shoulder rotation. This kind of animation is called a skeletal animation.
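The arm-raising example can be sketched in code. The following is a minimal, hypothetical 2D node hierarchy (not the book's classes – real engines use full 4x4 matrices per node), showing how a child node inherits its parent's rotation:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// A minimal node: a rotation in the plane plus an offset from the parent.
// This 2D sketch only illustrates how a child follows its parent's rotation,
// as when raising an arm rotates the hand around the shoulder.
struct Node {
  int parent;             // index of the parent node, -1 for the root
  float offsetX, offsetY; // bone offset in the parent's local space
  float localRot;         // local rotation in radians
};

struct WorldPos { float x, y, rot; };

// Walk the hierarchy from the root to the leaves and accumulate transforms.
// Nodes must be ordered so that parents appear before their children.
std::vector<WorldPos> computeWorldPositions(const std::vector<Node>& nodes) {
  std::vector<WorldPos> world(nodes.size());
  for (size_t i = 0; i < nodes.size(); ++i) {
    const Node& n = nodes[i];
    if (n.parent < 0) {
      world[i] = { n.offsetX, n.offsetY, n.localRot };
    } else {
      const WorldPos& p = world[n.parent];
      float c = std::cos(p.rot), s = std::sin(p.rot);
      world[i] = { p.x + c * n.offsetX - s * n.offsetY,
                   p.y + s * n.offsetX + c * n.offsetY,
                   p.rot + n.localRot };
    }
  }
  return world;
}
```

Rotating only the "shoulder" node by 90 degrees moves the attached "hand" node from (1, 0) to (0, 1) without touching the hand's own local data.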
A character needs to be stored in the file in a more or less natural pose, which is called the reference pose, or bind pose. You will find most models in a T-pose, where both arms form a horizontal line; sometimes you will see an A-pose, where the position of the arms of the skeleton resembles the uppercase letter A.
To animate a character, the transforms of each node between the position in the bind pose and the desired position in an animation pose need to be changed. Since the transformation of a node needs to be calculated in the local coordinates of that specific node, an inverse bind matrix per node exists to transform between local and world coordinates.
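The role of the inverse bind transform can be shown with a translation-only sketch: the composition is always "animated global transform times inverse bind transform". The function name and the reduction to translations are illustrative, not the book's API; with full 4x4 matrices, the same composition applies.

```cpp
#include <cassert>

struct Vec3 { float x, y, z; };

// skinned = animatedGlobalTransform * inverseBindTransform * bindPoseVertex
// The inverse bind transform expresses the bind-pose vertex in the node's
// local coordinates; the animated global transform then places it in the
// current pose.
Vec3 applySkinTransform(Vec3 bindPoseVertex, Vec3 nodeBindPos, Vec3 nodeAnimPos) {
  // inverse bind: make the vertex relative to the node's bind-pose position
  Vec3 local = { bindPoseVertex.x - nodeBindPos.x,
                 bindPoseVertex.y - nodeBindPos.y,
                 bindPoseVertex.z - nodeBindPos.z };
  // animated global transform: move the vertex to the node's current position
  return { local.x + nodeAnimPos.x,
           local.y + nodeAnimPos.y,
           local.z + nodeAnimPos.z };
}
```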
The animations themselves are stored in animation clips. An animation clip does not contain node transforms for every possible time of the animation but only for specific time points. Only the node transforms at so-called key frames are stored in the animation clip data, resulting in less data usage. Node positions between two key frames are interpolated using linear interpolation for translation and scaling, and spherical linear interpolation (SLERP) for rotations.
By using interpolation between two key frames, the skeleton can be brought into virtually any pose between the two stored poses. By interpolating between key frames or even interpolated poses of different animation clips, blending between the two poses can be achieved. Blending can be used to change the animation clip of a model without visual distortion, for instance, to create a smooth transition between a walking and a running animation clip.
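The two kinds of interpolation can be sketched as follows; the helper functions are illustrative, assuming unit quaternions for rotations, and blending between clips uses the same interpolation applied to two already-sampled poses:

```cpp
#include <cassert>
#include <cmath>

struct Quat { float w, x, y, z; };

// Linear interpolation, used for the translation and scaling channels.
float lerp(float a, float b, float t) { return a + (b - a) * t; }

// Spherical linear interpolation (SLERP) between two unit quaternions,
// used for the rotation channel between two key frames.
Quat slerp(Quat a, Quat b, float t) {
  float dot = a.w*b.w + a.x*b.x + a.y*b.y + a.z*b.z;
  if (dot < 0.f) { b = { -b.w, -b.x, -b.y, -b.z }; dot = -dot; }
  if (dot > 0.9995f) { // nearly identical rotations: lerp and re-normalize
    Quat q = { lerp(a.w,b.w,t), lerp(a.x,b.x,t), lerp(a.y,b.y,t), lerp(a.z,b.z,t) };
    float len = std::sqrt(q.w*q.w + q.x*q.x + q.y*q.y + q.z*q.z);
    return { q.w/len, q.x/len, q.y/len, q.z/len };
  }
  float theta = std::acos(dot);
  float sa = std::sin((1.f - t) * theta) / std::sin(theta);
  float sb = std::sin(t * theta) / std::sin(theta);
  return { sa*a.w + sb*b.w, sa*a.x + sb*b.x, sa*a.y + sb*b.y, sa*a.z + sb*b.z };
}
```

Halfway (t = 0.5) between the identity rotation and a 90-degree rotation, SLERP yields exactly the 45-degree rotation, with constant angular speed along the arc – which plain component-wise lerp does not guarantee.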
The virtual skin of a character model is called a mesh, and applying a mesh to a skeleton in the vertex shader of the rendering pipeline is called skinning. To give the virtual skin a natural appearance, every vertex of the mesh uses weights to handle the influence of surrounding nodes.
These weights are used as a factor for node transforms: the higher the node weight, the more transforms of that node will be applied to the vertex, and vice versa. By using the node weights, the effects of expanding and compressing the skin and underlying muscles of the virtual body can be modeled with good precision.
In the glTF file format, four weights per vertex are used, but other file formats with more weights also exist.
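The weighting can be sketched with four weights per vertex, as in glTF. To keep the example short, each node's transform is reduced to a translation; a real vertex shader blends full joint matrices in the same weighted way. The function name is illustrative only.

```cpp
#include <array>
#include <cassert>

struct Vec3 { float x, y, z; };

// Blend up to four node transforms (here: translations) into one vertex.
// The weights are expected to sum to 1, as in glTF's WEIGHTS_0 attribute.
Vec3 skinVertex(Vec3 v,
                const std::array<Vec3, 4>& nodeTranslations,
                const std::array<float, 4>& weights) {
  Vec3 out = v;
  for (int i = 0; i < 4; ++i) {
    out.x += weights[i] * nodeTranslations[i].x;
    out.y += weights[i] * nodeTranslations[i].y;
    out.z += weights[i] * nodeTranslations[i].z;
  }
  return out;
}
```

A vertex weighted 50/50 between two nodes follows each node only half as strongly, which is exactly what creates the smooth skin deformation around a joint.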
There is a special kind of animation called a morph animation. In a morph animation, parts of a mesh are replaced, and the vertex positions can be interpolated between the different meshes. Morph animations are used to model facial expressions, updating only parts of a character model’s face instead of the entire model. By replacing only parts of the mesh but keeping the skeletal information unchanged, morph animations can be easily added to skeletal animations.
Another form of animation is the so-called additive animation. Additive animations are some sort of mix between skeletal and morph animations: by adding the difference between the current pose and the bind pose to the skeletal animation clip, the animations of the additive clip are modeled on top of the skeletal animation, but only for the nodes that are changed in the additive animation clip.
Additive animations are used to move only specific parts of a character independently of the main skeletal animation. For instance, the skeletal animation contains only the walking or running part of the body, while the additive animation clip changes only the head or hands. Now the character can move the head to look around, without the need to create walking and running animations containing all possible head movements.
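The additive idea reduces to one line per animation channel. The following sketch on a single rotation angle (illustrative, not the book's implementation) shows why nodes the additive clip leaves at the bind pose remain unaffected:

```cpp
#include <cassert>

// The additive clip contributes only its difference from the bind pose,
// stacked on top of the base skeletal pose. If the additive clip keeps a
// node at the bind pose, the delta is zero and the node is unchanged.
float applyAdditive(float basePoseAngle, float additiveClipAngle,
                    float bindPoseAngle) {
  return basePoseAngle + (additiveClipAngle - bindPoseAngle);
}
```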
The combination of skeletal, morph, and additive animation enables us to build powerful and natural-looking characters for our virtual world, allowing the model to walk or run beside us, follow our movements with the head, and use facial morph animations to speak all at the same time.
Now let us look at the general workflow for creating character model animations. We can divide the animation workflow into two parts: preparation and updates. While the preparation part is needed only once while loading the model, updates are usually made for every frame drawn to the screen.
We will dive into the preparation process of the model first.
Game character models are stored in single files, or as a collection of files, each for a specific purpose. For instance, model and texture data could reside in separate files, allowing artists to change the images independently of the model vertices.
The following steps must be done in the application before the model data in the files can be used for animation and rendering:
As the first step in the preparation phase, these files must be loaded into the memory of the computer. Depending on the implementation in the game, partial loading is possible, adding only the elements of the character model that are needed at a specific level, or in a specific part of the virtual world.

Then, the data needs to be pre-processed. The representation of the data in the files on disk may be optimized in terms of saving space, but for efficient manipulation and rendering, a different kind of optimization is required.

For example, different rendering APIs, like OpenGL, Vulkan, and DirectX, may need slightly different representations of vertex or texture data, or shader code to be uploaded to the GPU. Instead of storing the different versions in the model file, a generic representation may be used. The required adjustments or conversions will be done after loading.
As the last step, static data like vertex data or textures will be uploaded to the GPU, and other static and variable data parts are stored in C++ classes and objects.

At this point, the model is ready to use. With the first appearance of that character on the screen, a continuously running task of data updates is needed. These per-frame tasks are required for states that change at runtime, such as positions or animation poses.
Since the data of the character is split between main memory and GPU memory, the game or animation program must sample, extract, convert, and upload data for every single frame the character is drawn to the screen.
For instance, the key-frame data of the animations needs to be updated according to the animation clip to be shown and the time since the last frame.
The following steps must be done in every frame to create the pixels of a single model instance for a specific time point of a selected animation clip:
Blending between different animation clips could be requested by the program flow, and additional animation parts may be needed, like additive blending for the head or the hands, or facial animations to allow a facial expression for the character. So, we extract rotation, translation, and scaling for all nodes at the specified replay time from all animation clips and combine the per-clip node transformations into a final transformation matrix for every node.

After the animation data is sampled and combined, the skeleton of the character needs to be adjusted. According to the animation data, every virtual bone must be translated, rotated, and scaled to reach the desired destination.

Also, the world position of the character may need to be updated. World position changes can occur in different forms, like running around, jumping, or falling down. Knowing the exact position of all characters is an important part of the remaining steps.

Once bone positions and the world position of the character have been determined, collision detection can run. The collision detection checks if the character intersects with other characters or environmental objects like the floor and walls, or even if the character was hit by a projectile. As a reaction to the collision detection results, adjustments to the character properties, like position or animation clip, may be triggered.

Having the collision data at hand, inverse kinematics adjustments may run. Adjusting the skeleton data of the character could be needed to avoid character limbs intersecting with the wall or the floor, or to level the feet position on uneven ground.
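The per-frame sequence can be outlined as a single update function. Everything here is a hypothetical stand-in for the book's classes; only the ordering of the steps matters:

```cpp
#include <cassert>

// Minimal per-instance state, just enough to show the order of operations.
struct Instance {
  float clipTime = 0.f;          // replay time into the current clip
  bool skeletonUpdated = false;
  bool collisionChecked = false;
};

void sampleAndBlendClips(Instance& i, float dt) { i.clipTime += dt; }
void updateSkeleton(Instance& i)        { i.skeletonUpdated = true; }
void updateWorldPosition(Instance&)     { /* running, jumping, falling */ }
void runCollisionDetection(Instance& i) { i.collisionChecked = true; }
void applyInverseKinematics(Instance& i) {
  // IK runs last, after the collision results are known
  assert(i.collisionChecked);
}

// Hypothetical per-frame update mirroring the steps above.
void updateInstance(Instance& inst, float deltaTime) {
  sampleAndBlendClips(inst, deltaTime); // key frames -> node transforms
  updateSkeleton(inst);                 // propagate matrices root to leaves
  updateWorldPosition(inst);            // position changes in the world
  runCollisionDetection(inst);          // other characters, floor, walls
  applyInverseKinematics(inst);         // foot placement, limb corrections
}
```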