Create next-generation Augmented Reality and Mixed Reality apps with the latest version of Google ARCore
Key Features

- Harness the power of Google's new augmented reality (AR) platform, ARCore, to build cutting-edge augmented reality apps
- Learn the core concepts of environmental understanding, immersive computing, and motion tracking with ARCore
- Extend your applications by combining ARCore with OpenGL, Machine Learning, and more

Book Description
Are you a mobile or web developer who wants to create immersive, engaging augmented reality apps with the latest Google ARCore platform? If so, this book will help you jump right into developing with ARCore and guide you, step by step, through building your own AR apps.
This book will teach you how to implement the core features of ARCore, starting with the fundamentals of 3D rendering and moving on to more advanced concepts such as lighting, shaders, and Machine Learning.
We’ll begin with the basics of building a project on three platforms: web, Android, and Unity. Next, we’ll go through the ARCore concepts of motion tracking, environmental understanding, and light estimation. For each core concept, you’ll work on a practical project to use and extend the ARCore feature, from learning the basics of 3D rendering and lighting to exploring more advanced concepts.
You’ll write custom shaders to light virtual objects in AR, then build a neural network to recognize the environment and explore even grander applications by using ARCore in mixed reality. At the end of the book, you’ll see how to implement motion tracking and environment learning, create animations and sounds, generate virtual characters, and simulate them on your screen.
What you will learn

- Build and deploy your augmented reality app to the Android, web, and Unity platforms
- Implement ARCore to identify and visualize objects as point clouds, planes, surfaces, and/or meshes
- Explore advanced concepts of environmental understanding using Google ARCore and OpenGL ES with Java
- Read light levels from ARCore and create a C# script to watch and propagate lighting changes in a scene
- Develop graphics shaders that react to changes in lighting and map the environment to place objects in Unity/C#
- Integrate motion tracking with the web ARCore API and Google Street View to create a combined AR/VR experience

Who this book is for
This book is for web and mobile developers with broad programming knowledge of Java, JavaScript, or C# who want to develop augmented reality applications with Google ARCore. No prior experience with AR development, 3D rendering, or 3D math is needed to follow this book.
Micheal Lanham is a proven software and tech innovator with 20 years of experience. He has developed a broad range of software applications, including games, graphics, web, desktop, engineering, artificial intelligence, GIS, and Machine Learning applications for a variety of industries. He was introduced to Unity in 2006 and has been an avid developer, consultant, manager, and author of multiple Unity games, graphics projects, and books since. Micheal lives in Calgary, Canada, with his family.
Copyright © 2018 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
Commissioning Editor: Amarabha Banerjee
Acquisition Editor: Reshma Raman
Content Development Editor: Onkar Wani
Technical Editor: Vaibhav Dwivedi
Copy Editor: Shaila Kusanale
Project Coordinator: Devanshi Doshi
Proofreader: Safis Editing
Indexer: Priyanka Dhadke
Graphics: Jason Monteiro
Production Coordinator: Shraddha Falebhai
First published: March 2018
Production reference: 1280318
Published by Packt Publishing Ltd., Livery Place, 35 Livery Street, Birmingham B3 2PB, UK.
ISBN 978-1-78883-040-9
www.packtpub.com
Mapt is an online digital library that gives you full access to over 5,000 books and videos, as well as industry-leading tools to help you plan your personal development and advance your career. For more information, please visit our website.
Spend less time learning and more time coding with practical eBooks and Videos from over 4,000 industry professionals
Improve your learning with Skill Plans built especially for you
Get a free eBook or video every month
Mapt is fully searchable
Copy and paste, print, and bookmark content
Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.PacktPub.com and as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at [email protected] for more details.
At www.PacktPub.com, you can also read a collection of free technical articles, sign up for a range of free newsletters, and receive exclusive discounts and offers on Packt books and eBooks.
Neil Alexander is a recent graduate from the University of North Carolina at Charlotte, where he earned a master's in computer science with a specialization in intelligent and interactive systems. As part of his education, he worked on developing several virtual reality demos and data visualization applications. He graduated from the Don Bosco Institute of Technology and has also worked as a research analyst at an IT publishing firm in Mumbai.
He currently works as a data scientist with several Blockchain and cryptocurrency startups in the Washington D.C. area.
If you're interested in becoming an author for Packt, please visit authors.packtpub.com and apply today. We have worked with thousands of developers and tech professionals, just like you, to help them share their insight with the global tech community. You can make a general application, apply for a specific hot topic that we are recruiting an author for, or submit your own idea.
Title Page
Copyright and Credits
Learn ARCore - Fundamentals of Google ARCore
Packt Upsell
Why subscribe?
PacktPub.com
Contributors
About the author
About the reviewer
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
Getting Started
Immersive computing
AR and ARCore
Motion tracking
Environmental understanding
Light estimation
The road ahead
Summary
ARCore on Android
Installing Android Studio
Installing ARCore
Installing the ARCore service on a device
Build and deploy
Exploring the code
Summary
ARCore on Unity
Installing Unity and ARCore
Building and deploying to Android
Remote debugging
Testing the connection
Remotely debugging a running app
Exploring the code
Unity Update method
Summary
ARCore on the Web
Installing WebARonARCore
Installing Node.js
The Node Package Manager
Exploring the samples
Debugging web apps on Android
Connecting Chrome Developer tools
Debugging with Chrome
3D and three.js
Understanding left- or right-handed coordinate systems
3D scale, rotation, and transformation
Summary
Real-World Motion Tracking
Motion tracking in depth
3D sound
Resonance Audio
A tracking service with Firebase
Setting up the database
Time to test the connection
Visualizing tracked motion
Exercises
Summary
Understanding the Environment
Tracking the point cloud
Meshing and the environment
Interacting with the environment
Touch for gesture detection
Drawing with OpenGL ES
Shader programming
Editing the shader
Exercises
Summary
Light Estimation
3D rendering
Building a test scene
Materials, shaders, and textures
3D lighting
Light estimation
Cg/HLSL shaders
Estimating light direction
Updating the environmental lighting
Exercises
Summary
Recognizing the Environment
Introduction to ML
Linear regression explained
Deep learning
Neural networks – the foundation of deep learning
Programming a neural network
Scripting the neural network
Training a neural network
Activating the warning
Adding the environmental scanner
Backward propagation explained
Gradient descent explained
Defining the network architecture
The network view of the world
Exercises
TensorFlow
Summary
Blending Light for Architectural Design
Setting up the project
Building the scene
Modifying the base scene
The environment and placing content
Building the UI
Scripting the buttons
Interacting with the virtual
Building the object outliner
Positioning the chair
Lighting and shadows
Turning the shadows on
Exercises
Summary
Mixing in Mixed Reality
Mixed reality and HoloKit
Setting up HoloKit
How does it work?
Introducing WRLD
Setting up WRLD for MR
Navigating the map
Switching from AR to MR
Building the SceneSwitcher
Creating the SceneSwitcher prefab
Modifying the Wrld map script
Mapping, GIS, and GPS
Making the Splash scene
Fixing the altitude issue
What's next?
Exercises
Summary
Performance Tips and Troubleshooting
Diagnosing performance
Chrome DevTools
Android Profiler
Unity Profiler
Tips for managing better performance
General troubleshooting
Troubleshooting code
Exercises
Troubleshooting tips
Summary
Other Books You May Enjoy
Leave a review - let other readers know what you think
Augmented reality applications have moved from novelty to reality, and with the release of ARKit and now ARCore, they have become accessible to the average developer. Now virtually anyone with a grasp of a programming language can quickly build an AR experience using a variety of platforms. With the release of ARCore, Google has made this even easier and provides support for multiple development platforms. This book will guide you through building AR applications for the web with JavaScript, for mobile Android with Java, and for mobile Unity with C#. Along the way, you will learn the fundamentals of building a quality AR experience for your users.
This book is for any developer who wants to dive into building an augmented reality app with ARCore, but has no background in game or graphic programming. Although the book only assumes the reader has basic high-school level math, the reader should still have a firm grasp of at least one of the following programming languages: JavaScript, Java, or C#.
Chapter 1, Getting Started, covers the fundamental concepts any modern AR app needs to tackle in order to provide a great experience to the user. We will learn the basic concepts of motion tracking, environmental understanding, and light estimation.
Chapter 2, ARCore on Android, is an introduction to Android development with Android Studio, where we show you how to install Android Studio and set up your first ARCore app.
Chapter 3, ARCore on Unity, discusses how to install and build an ARCore app with Unity. This chapter also shows you how to remotely debug an app using the Android development tools.
Chapter 4, ARCore on the Web, jumps into web development with JavaScript and focuses on how to set up your own simple web server with Node.js. Then, this chapter looks through the various sample ARCore templates and discusses how to extend those for further development.
Chapter 5, Real-World Motion Tracking, builds on our learning from the preceding chapter and extends one of the web examples to add real-world motion tracking. Not only will this showcase several fundamental 3D concepts, but it will also demonstrate how ARCore tracks a user's motion.
Chapter 6, Understanding the Environment, jumps back to the Android platform and deals with how ARCore understands the user's environment. We will grasp how ARCore identifies planes or surfaces in the environment and meshes them for user interaction and visualization. Here, we will also take a look at how to modify a shader in order to measure and colorize the points relative to the user.
Chapter 7, Light Estimation, explains the role that lighting and shadows play in selling the AR experience to the user. We will learn how ARCore estimates light and how that estimate is used to light the virtual models the user places in the AR world.
Chapter 8, Recognizing the Environment, is where we cover the basics of Machine Learning and how essential the technology is to the success of the AR revolution. We then build a simple neural network that learns through supervised training using a technique called backpropagation. After learning the basics of neural networks and deep learning, we look at a more complex example that demonstrates various forms of Machine Learning.
Chapter 9, Blending Light for Architectural Design, covers the building of an AR design app that allows the user to place virtual furniture in their living space or wherever else they need it. We also cover how to place and move an object in AR using touch and how to identify when an object is selected. Then, we will extend our lighting and shadow work from Chapter 7, Light Estimation, and provide real-time shadows on the virtual objects.
Chapter 10, Mixing in Mixed Reality, is where we introduce mixed reality through the use of inexpensive MR headsets. ARCore is ideally suited for use in these inexpensive headsets, since it already tracks the user and monitors their environment internally. We will see how to turn a traditional mapping app built with the 3D WRLD API for Unity into an AR mapping app, and we will also provide an option to switch to MR with an MR headset.
Chapter 11, Performance Tips and Troubleshooting, covers techniques for measuring an app's performance on all the development platforms we deal with. We then talk about the importance of performance and the impact it can have on the various systems. After that, we cover general debugging and troubleshooting tips, finishing off with a table that covers the most common errors you may encounter while working through this book.
To get the most out of this book, you will need the following:
The reader will need to be proficient in one of the following programming languages: JavaScript, Java, or C#
A memory of high-school mathematics
An Android device that supports ARCore; the following is the link to check the list:
https://developers.google.com/ar/discover/
A desktop machine that will run Android Studio and Unity; a dedicated 3D graphics card is not explicitly required
You can download the example code files for this book from your account at www.packtpub.com. If you purchased this book elsewhere, you can visit www.packtpub.com/support and register to have the files emailed directly to you.
You can download the code files by following these steps:
1. Log in or register at www.packtpub.com.
2. Select the SUPPORT tab.
3. Click on Code Downloads & Errata.
4. Enter the name of the book in the Search box and follow the onscreen instructions.
Once the file is downloaded, please make sure that you unzip or extract the folder using the latest version of:
WinRAR/7-Zip for Windows
Zipeg/iZip/UnRarX for Mac
7-Zip/PeaZip for Linux
The code bundle for the book is also hosted on GitHub at https://github.com/PacktPublishing/Learn-ARCore-Fundamentals-of-Google-ARCore. In case there's an update to the code, it will be updated on the existing GitHub repository.
We also have other code bundles from our rich catalog of books and videos available at https://github.com/PacktPublishing/. Check them out!
We also provide a PDF file that has color images of the screenshots/diagrams used in this book. You can download it here: https://www.packtpub.com/sites/default/files/downloads/LearnARCoreFundamentalsofGoogleARCore_ColorImages.pdf.
Feedback from our readers is always welcome.
General feedback: Email [email protected] and mention the book title in the subject of your message. If you have questions about any aspect of this book, please email us at [email protected].
Errata: Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you have found a mistake in this book, we would be grateful if you would report this to us. Please visit www.packtpub.com/submit-errata, selecting your book, clicking on the Errata Submission Form link, and entering the details.
Piracy: If you come across any illegal copies of our works in any form on the Internet, we would be grateful if you would provide us with the location address or website name. Please contact us at [email protected] with a link to the material.
If you are interested in becoming an author: If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, please visit authors.packtpub.com.
Please leave a review. Once you have read and used this book, why not leave a review on the site that you purchased it from? Potential readers can then see and use your unbiased opinion to make purchase decisions, we at Packt can understand what you think about our products, and our authors can see your feedback on their book. Thank you!
For more information about Packt, please visit packtpub.com.
Welcome to the world of immersive computing and augmented reality with Google ARCore. In this book, we will start with the basics: first, some important core concepts of augmented reality (AR), and then the installation and basics of the three development platforms (Android, web, and Unity) that we will use throughout the book. Next, we will take a more in-depth look at the technical challenges faced by AR developers, including various techniques and solutions for solving them. In the final chapters of the book, we will expand on those skills by developing three example AR and mixed reality (MR) apps, where we will build a Machine Learning object recognizer, an AR design app, and an app that transitions from AR to MR.
In this chapter, we will begin by quickly covering the fundamental concepts of immersive computing and augmented reality. Then, we will look at the core problems ARCore is designed to address (motion tracking, environmental understanding, and light estimation). Here's a quick look at the topics we will cover in this chapter:
Immersive computing
ARCore and AR
Motion tracking
Environmental understanding
Light estimation
The road ahead
Immersive computing is a new term used to describe applications that provide an immersive experience for the user. This may come in the form of an augmented or virtual reality experience. While our attention in this book will be primarily focused on building an augmented reality experience, we will highlight techniques that can be used for VR as well. In order to better understand the spectrum of immersive computing, let's take a look at this diagram:
The preceding diagram illustrates how the level of immersion affects the user experience, with the left-hand side of the diagram representing more traditional applications with little or no immersion, and the right representing fully immersive virtual reality applications. For us, we will stay in the middle sweet spot and work on developing augmented reality applications. In the next section, we will be introduced to AR and ARCore in more detail.
Augmented reality applications are unique in that they annotate or augment the reality of the user. This is typically done visually by having the AR app overlay a view of the real world with computer graphics. ARCore is designed primarily for providing this type of visual annotation for the user. An example of a demo ARCore application is shown here:
The screenshot is even more impressive when you realize that it was rendered in real time on a mobile device. It isn't the result of painstaking hours of using Photoshop or other media effects libraries. What you see in that image is the superposition of a virtual object, the lion, onto the user's reality. More impressive still is the quality of the immersion. Note the details, such as the lighting and shadows on the lion, the shadows on the ground, and the way the object maintains its position in reality even though it isn't really there. Without those visual enhancements, all you would see is a floating lion superimposed on the screen. It is those visual details that provide the immersion. Google developed ARCore as a way to help developers incorporate those visual enhancements when building AR applications.
ARCore has its origins in Tango, which is/was a more advanced AR toolkit that used special sensors built into the device. In order to make AR more accessible and mainstream, Google developed ARCore as an AR toolkit designed for Android devices not equipped with any special sensors. Where Tango depended on special sensors, ARCore uses software to try and accomplish the same core enhancements. For ARCore, Google has identified three core areas to address with this toolkit, and they are as follows:
Motion tracking
Environmental understanding
Light estimation
In the next three sections, we will go through each of those core areas in more detail and understand how they enhance the user experience.
Tracking a user's motion and ultimately their position in 2D and 3D space is fundamental to any AR application. ARCore allows us to track position changes by identifying and tracking visual feature points from the device's camera image. An example of how this works is shown in this figure:
In the figure, we can see how the user's position is tracked in relation to the feature points identified on the real couch. Previously, in order to successfully track motion (position), we needed to pre-register or pre-train our feature points. If you have ever used the Vuforia AR tools, you will be very familiar with having to train images or target markers. Now, ARCore does all this automatically for us, in real time, without any training. However, this tracking technology is very new and has several limitations. In the later part of the book, and specifically in Chapter 5, Real-World Motion Tracking, we will add a feature to our AR assistant that allows us to track multiple objects' positions from multiple devices in real time using GPS. Then, in Chapter 10, Mixing in Mixed Reality, we will extend our tracking to include augmented maps.
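To make the idea of position tracking concrete, here is a minimal, framework-free sketch in plain JavaScript (the language we use in the web chapters) of the math behind a pose: a translation paired with a rotation quaternion, used to transform a camera-space point into world space. This is essentially what ARCore's pose APIs do for us on every frame; all the function and variable names below are our own illustrations, not part of the ARCore API.

```javascript
// A pose pairs a translation with a rotation (unit quaternion),
// mirroring how ARCore reports the device camera's pose each frame.

// Rotate vector v = [x, y, z] by unit quaternion q = [x, y, z, w].
function rotateByQuaternion(v, q) {
  const [vx, vy, vz] = v;
  const [qx, qy, qz, qw] = q;
  // t = 2 * cross(q.xyz, v)
  const tx = 2 * (qy * vz - qz * vy);
  const ty = 2 * (qz * vx - qx * vz);
  const tz = 2 * (qx * vy - qy * vx);
  // v' = v + w * t + cross(q.xyz, t)
  return [
    vx + qw * tx + (qy * tz - qz * ty),
    vy + qw * ty + (qz * tx - qx * tz),
    vz + qw * tz + (qx * ty - qy * tx),
  ];
}

// Transform a camera-space point into world space using the device pose.
function poseTransformPoint(pose, point) {
  const rotated = rotateByQuaternion(point, pose.rotation);
  return [
    rotated[0] + pose.translation[0],
    rotated[1] + pose.translation[1],
    rotated[2] + pose.translation[2],
  ];
}

// Example: the device has moved 1 m along x and rotated 90 degrees
// about the y axis since tracking started.
const s = Math.sin(Math.PI / 4), c = Math.cos(Math.PI / 4);
const devicePose = { translation: [1, 0, 0], rotation: [0, s, 0, c] };
// A point 2 m straight ahead of the camera (-z in a right-handed
// system) ends up at roughly [-1, 0, 0] in world space.
console.log(poseTransformPoint(devicePose, [0, 0, -2]));
```

Representing rotation as a quaternion rather than a matrix is the convention ARCore and most 3D frameworks follow, since quaternions compose cheaply and avoid gimbal lock.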
The better an AR application understands the user's reality or the environment around them, the more successful the immersion. We already saw how ARCore uses feature identification in order to track a user's motion. Yet, tracking motion is only the first part. What we need is a way to identify physical objects or surfaces in the user's reality. ARCore does this using a technique called meshing.
We will cover more details about meshing in later chapters, but, for now, take a look at the following figure from Google that shows this meshing operation in action:
What we see happening in the preceding image is an AR application that has identified a real-world surface through meshing. The plane is identified by the white dots. In the background, we can see how the user has already placed various virtual objects on the surface. Environmental understanding and meshing are essential for creating the illusion of blended realities. Where motion tracking uses identified features to track the user's position, environmental understanding uses meshing to track the virtual objects in the user's reality. In Chapter 8, Recognizing the Environment, we will look at how to train our own machine learning object identifier, which will allow us to extend our meshing to include automatically recognizable objects or areas of an environment.
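As a taste of what placing an object on a detected plane involves under the hood, here is a simplified, framework-free JavaScript sketch of a hit test: intersecting a ray cast from a screen tap with a detected plane. In a real app, ARCore's hit-test API performs this against its tracked planes for you; the names and numbers here are illustrative only.

```javascript
// When the user taps the screen, an AR hit test casts a ray from the
// camera through the tap point and intersects it with a detected plane.
// This is a simplified sketch of that intersection test.

// Intersect a ray (origin o, direction d) with a plane defined by a
// point p on the plane and its normal n. Returns the hit point, or
// null if the ray is parallel to or pointing away from the plane.
function rayPlaneIntersect(o, d, p, n) {
  const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
  const denom = dot(d, n);
  if (Math.abs(denom) < 1e-8) return null;           // ray parallel to plane
  const t = dot([p[0] - o[0], p[1] - o[1], p[2] - o[2]], n) / denom;
  if (t < 0) return null;                             // plane is behind the ray
  return [o[0] + t * d[0], o[1] + t * d[1], o[2] + t * d[2]];
}

// Example: camera held 1.5 m above a detected floor plane (y = 0),
// looking forward and down at 45 degrees.
const hit = rayPlaneIntersect(
  [0, 1.5, 0],                 // ray origin (camera position)
  [0, -0.7071, -0.7071],       // ray direction (forward and down)
  [0, 0, 0],                   // a point on the floor plane
  [0, 1, 0]                    // floor normal (straight up)
);
console.log(hit);              // roughly [0, 0, -1.5]: anchor the object here
```

The hit point is where an app would create an anchor and place a virtual object, which is exactly the pattern the ARCore samples use when you tap on a meshed surface.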
Magicians work to be masters of trickery and visual illusion. They understand that perspective and good lighting are everything in a great illusion, and, with developing great AR apps, this is no exception. Take a second and flip back to the scene with the virtual lion. Note the lighting and detail in the shadows on the lion and ground. Did you note that the lion is casting a shadow on the ground, even though it's not really there? That extra level of lighting detail is only made possible by combining the tracking of the user's position with the environmental understanding of the virtual object's position and a way to read light levels. Fortunately, ARCore provides us with a way to read or estimate the light in a scene. We can then use this lighting information in order to light and shadow virtual AR objects. Here's an image of an ARCore demo app showing subdued lighting on an AR object:
The effects of lighting, or lack thereof, will become more obvious as we start developing our startup applications. Later, in Chapter 9, Blending Light for Architectural Design, we will go into far more detail about 3D lighting and even build some simple shader effects.
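As a rough illustration of how a light estimate gets applied, the following standalone JavaScript sketch darkens a virtual object's material color by a scalar intensity estimate. The estimate value is made up for the example; in a real app it would come from ARCore's per-frame light estimate, and the function name is our own, not part of any ARCore API.

```javascript
// ARCore reports a per-frame ambient light estimate (a scalar pixel
// intensity, plus color correction on some platforms). This sketch
// shows the core idea: scale a virtual object's material color to
// match the estimated light so the object doesn't look pasted on.

// Apply a scalar light estimate (0 = dark, 1 = fully lit) to an RGB
// color given as channel values in the 0-255 range.
function applyLightEstimate(rgb, intensity) {
  const clamped = Math.min(Math.max(intensity, 0), 1);
  return rgb.map((channel) => Math.round(channel * clamped));
}

// Example: a white virtual object in a dimly lit room (estimate 0.4)
// is rendered at 40% brightness.
console.log(applyLightEstimate([255, 255, 255], 0.4)); // -> [102, 102, 102]
```

In practice this scaling happens inside a shader rather than in script, which is exactly what we will build toward in the Unity lighting chapters.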
In this chapter, we didn't go into any extensive details; we will get to that later, but you should now have a good grasp of the core elements ARCore was developed to address. In the next section, we will take a closer look at how best to use the material in this book.
We will take a very hands-on approach for the rest of this book. After all, there is no better way to learn than by doing. While the book is meant to be read in its entirety, not all readers have the time or a need to do this. Therefore, provided in the following table is a quick summary of the platforms, tools, techniques, and difficulty level of each chapter left in the book:
| Chapter | Focus | Difficulty | Platform | Tools and techniques |
| --- | --- | --- | --- | --- |
| Chapter 2, ARCore on Android | Basics of Android | Basic | Android (Java) | Installation of tools and environment for Android. |
| Chapter 3, ARCore on Unity | Basics of Unity | Basic | Android/Unity (C#) | Installation, setup, and deployment of the Unity sample. |
| Chapter 4, ARCore on the Web | Building ARCore web apps | Medium | Web (JavaScript) | Installation and setup of tools to support web development and hosting. |
| Chapter 5, Real-World Motion Tracking | 3D spatial audio and Firebase | Medium | Web (JavaScript) | Introduce motion tracking with a mobile device with audio, integrate with Google Firebase, and track multiple objects and/or users in AR. |
| Chapter 6, Understanding the Environment | Introduction to environmental understanding and meshing | Medium | Android (Java) | Learning the ARCore API for Java as well as creating a new ARCore Android project, meshing an environment, and interacting with objects using OpenGL ES. |
| Chapter 7, Light Estimation | Introduction to light estimation and lighting in Unity | Advanced | Unity (C#, Cg/HLSL) | Understand the importance of lighting and how it can be used to make AR objects appear more realistic. |
| Chapter 8, Recognizing the Environment | Introduction to Machine Learning (ML) for AR and how it can be used | Advanced | Android (Java), Unity (C#) | Look at various ML platforms in order to better understand how they can be used in AR applications. |
| Chapter 9, Blending Light for Architectural Design | 3D lighting and shaders | Advanced | Unity (C#) | An advanced introduction to lighting and shaders in Unity, including writing HLSL/Cg shader code. |
| Chapter 10, Mixing in Mixed Reality | Combining all the elements together | Advanced+ | Unity (C#), Android (Java) | We will extend the ARCore platform by introducing mixed reality and allowing the app to transition from AR to MR. |
| Chapter 11, Performance Tips and Troubleshooting | Performance and troubleshooting tips | Basic | All | Provides helpful tips on performance, with a section dedicated to addressing the possible issues you may have while working through the samples. |
Also, Chapter 10, Mixing in Mixed Reality, is intended to be used after the reader has reviewed all the previous chapters.
While some readers may prefer to explore only a single ARCore platform by sticking to those specific chapters, you are strongly encouraged to work through all the samples in this book. Given that the ARCore API is so similar across platforms, the techniques you learn on one platform should translate well to another. Also, don't be intimidated by a different platform or programming language. If you have a good base of knowledge in one C-family language, learning any of the others takes only minimal effort. Developer, programmer, software engineer, or whatever you want to call yourself, you can always benefit from learning another programming language.
In this chapter, we took a very quick look at what immersive computing and AR are all about. We learned that augmented reality covers the middle ground of the immersive computing spectrum, and that AR is just a careful blend of illusions used to trick the user into believing that their reality has been combined with a virtual one. Google developed ARCore as a way to provide a better set of tools for constructing those illusions and to keep Android competitive in the AR market. After that, we learned the core concepts ARCore was designed to address and looked at each of them (motion tracking, environmental understanding, and light estimation) in a little more detail. Finally, we finished with a helpful roadmap for readers looking to get the most out of this book in the shortest amount of time.
In the next chapter, we begin to dive in and get our hands dirty by getting the sample Android project set up and tweaked for our needs.
Google developed ARCore to be accessible from multiple development platforms (Android [Java], web [JavaScript], Unreal [C++], and Unity [C#]), giving developers plenty of flexibility and options to build applications on various platforms. While each platform has its strengths and weaknesses, which we will get to later, all the platforms essentially extend from the native Android SDK that originated with Tango. This means that, regardless of your choice of platform, you will need to install and be somewhat comfortable working with the Android development tools.
In this chapter, we will focus on setting up the Android development tools and building an ARCore application for Android. The following is a summary of the major topics we will cover in this chapter:
Installing Android Studio
Installing ARCore
Build and deploy
Exploring the code
If you have experience working with the Android tools and already have the SDK installed, you may want to just skim over the first three sections. Otherwise, be sure to follow along with the exercises in this chapter, as these steps will be required to undertake exercises in many other areas of this book.
Android Studio is a development environment for coding and deploying Android applications. As such, it contains the core set of tools we will need for building and deploying our applications to an Android device. After all, an ARCore app needs to be installed on a physical device in order to test it. Follow the given instructions to install Android Studio for your development environment:
1. Open a browser on your development computer to https://developer.android.com/studio.
2. Click on the green DOWNLOAD ANDROID STUDIO button.
3. Agree to the Terms and Conditions and follow the instructions to download.
4. After the file has finished downloading, run the installer for your system.
5. Follow the instructions on the installation dialog to proceed. If you are installing on Windows, ensure that you set a memorable installation path that you can easily find later, as shown in the following example:
6. Click through the remaining dialogs to complete the installation.
7. When the installation is complete, you will have the option to launch the program. Ensure that the option to launch Android Studio is selected and click on Finish.
Android Studio comes with an embedded OpenJDK, which means we can omit the steps for installing Java, on Windows at least. If you are doing any serious Android development on Windows, you should go through the steps to install the full Java JDK 1.7 and/or 1.8 on your own, especially if you plan to work with older versions of Android.