Connecting a physical robot to a robot simulation using the Robot Operating System (ROS) infrastructure is one of the most common challenges faced by ROS engineers. With this book, you'll learn how to simulate a robot in a virtual environment and achieve desired behavior in equivalent real-world scenarios.
This book starts with an introduction to GoPiGo3 and the sensors and actuators with which it is equipped. You'll then work with GoPiGo3's digital twin by creating a 3D model from scratch and running a simulation in ROS using Gazebo. Next, the book will show you how to use GoPiGo3 to build and run an autonomous mobile robot that is aware of its surroundings. Finally, you'll find out how a robot can learn tasks that are not explicitly programmed in its code but are instead acquired by observing its environment. You'll also cover topics such as deep learning and reinforcement learning.
By the end of this robot programming book, you'll be well-versed with the basics of building specific-purpose applications in robotics and developing highly intelligent autonomous robots from scratch.
Page count: 446
Year of publication: 2020
Copyright © 2020 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
Acquisition Editor: Rohit Rajkumar
Content Development Editor: Ronn Kurien
Senior Editor: Richard Brookes-Bland
Technical Editor: Dinesh Pawar
Copy Editor: Safis Editing
Project Coordinator: Neil Dmello
Proofreader: Safis Editing
Indexer: Rekha Nair
Production Designer: Joshua Misquitta
First published: February 2020
Production reference: 1250220
Published by Packt Publishing Ltd. Livery Place 35 Livery Street Birmingham B3 2PB, UK.
ISBN 978-1-83855-130-8
www.packt.com
Packt.com
Subscribe to our online digital library for full access to over 7,000 books and videos, as well as industry-leading tools to help you plan your personal development and advance your career. For more information, please visit our website.
Spend less time learning and more time coding with practical eBooks and Videos from over 4,000 industry professionals
Improve your learning with Skill Plans built especially for you
Get a free eBook or video every month
Fully searchable for easy access to vital information
Copy and paste, print, and bookmark content
Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.packt.com and as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at [email protected] for more details.
At www.packt.com, you can also read a collection of free technical articles, sign up for a range of free newsletters, and receive exclusive discounts and offers on Packt books and eBooks.
Bernardo Ronquillo Japón is an Internet of Things (IoT) and robotics expert who has worked for top technology companies since 1995, including Instituto de Astrofísica de Canarias, Gran Telescopio Canarias, Altran, and Alestis Aerospace.
Using his skills and experience, he founded The Robot Academy, where he develops open source hardware and software solutions for engineers and makers: Social Robot IO (2015), for the stimulation of children with autistic spectrum disorder; Robot JUS (2016), which helps engineers get deeper technical insights with the Robot Operating System (ROS) when using low-complexity hardware; and IIoT All-in-One (2018) as an industrial IoT training package for assisting companies in their digital transformation process.
Lentin Joseph is an author, roboticist, and robotics entrepreneur from India. He runs a robotics software company called Qbotics Labs in Kochi, Kerala. He has 8 years of experience in the robotics domain, primarily in ROS, OpenCV, and PCL.
He has authored several books on ROS, including Learning Robotics Using Python – First Edition and Second Edition, Mastering ROS for Robotics Programming – First Edition and Second Edition, ROS Robotics Projects, and Robot Operating System for Absolute Beginners, all published by Packt.
He gained his master's in robotics and automation in India and has worked at the Robotics Institute, CMU, USA. He is also a TEDx speaker.
Ramkumar Gandhinathan is a roboticist and researcher by profession. He started building robots in sixth grade and has been in the robotics field for over 15 years. He has personally built over 80 robots of different types. With 7 years of professional experience (4 years full time and 3 years part time/internship) in the robotics industry, he has 5 years of experience with ROS in particular. In his career, he has built over 15 industrial robot solutions using ROS. He is also fascinated by building drones and pilots them himself. His research interests include simultaneous localization and mapping (SLAM), motion planning, sensor fusion, multi-robot communication, and systems integration.
If you're interested in becoming an author for Packt, please visit authors.packtpub.com and apply today. We have worked with thousands of developers and tech professionals, just like you, to help them share their insight with the global tech community. You can make a general application, apply for a specific hot topic that we are recruiting an author for, or submit your own idea.
Title Page
Copyright and Credits
Hands-On ROS for Robotics Programming
About Packt
Why subscribe?
Contributors
About the author
About the reviewers
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Code in Action
Conventions used
Get in touch
Reviews
Section 1: Physical Robot Assembly and Testing
Assembling the Robot
Understanding the GoPiGo3 robot
The robotics perspective
The programming perspective
Robot kit and resources
Getting familiar with the embedded hardware
The GoPiGo3 board
Raspberry Pi 3B+
Why does a robot need a CPU?
Deep diving into the electromechanics
The most useful sensors
Distance sensor
Line follower
IMU sensor
Pi Camera
Putting it all together
Quick hardware test
Resources
Getting started with DexterOS
Coding with Bloxter
Calibrating the robot
Driving the robot
Checking the sensors
Shutting down the robot
Summary
Questions
Further reading
Unit Testing of GoPiGo3
Technical requirements
Getting started with Python and JupyterLab
Launching JupyterLab for GoPiGo3
Hardware testing
Testing battery, LEDs, and motors/encoders
Battery level
Hardware information and current voltage levels
LEDs and blinkers
Motors and encoders test
Unit testing of sensors and drives
Quick start with sensors and motors
Driving around
Distance sensor
Check port connections
Distance sensor unit test
GoPiGo3 API library
DI sensors API library
Servo package
Servo package unit test
Line follower
Line follower unit test
Inertial Measurement Unit (IMU)
IMU unit test
Raspberry Pi
Pi unit test
GoPiGo3 projects
Summary
Questions
Further reading
Getting Started with ROS
Technical requirements
ROS basic concepts
The ROS graph
roscore
Workspaces and catkin
Configuring your ROS development environment
Installing ROS
Ubuntu and ROS in the Raspberry Pi
Integrated Development Environment (IDE)
Installing RoboWare Studio
Communication between ROS nodes – messages and topics
Creating a workspace
Creating a workspace and building it using RoboWare
Setting up the ROS package
Accessing package files and building the workspace using RoboWare
A node publishing a topic
A node that listens to the topic
Combining the publisher and subscriber in the same node
Using publicly available packages for ROS
Summary
Questions
Further reading
Section 2: Robot Simulation with Gazebo
Creating the Virtual Two-Wheeled ROS Robot
Technical requirements
Getting started with RViz for robot visualization
Building a differential drive robot with URDF
Overview of URDF for GoPiGo3
URDF robot body
Caster
The URDF model's left and right wheels
Inspecting the GoPiGo3 model in ROS with RViz
Understanding the roslaunch command
Using Roboware to execute a launch file
Controlling the GoPiGo3 robot's wheels from RViz
Using the joint_state_publisher package
Robot frames of reference in the URDF model
Using RViz to check the model while building
Changing the aspect of the model in the RViz window
Helpful ROS tools for checking purposes
Summary
Questions
Further reading
Simulating Robot Behavior with Gazebo
Technical requirements
Getting started with the Gazebo simulator
Making modifications to the robot URDF
Extending URDF to produce an SDF robot definition
Collisions and physical properties
Gazebo tags
Verifying a Gazebo model and viewing the URDF
Launching the GoPiGo model in Gazebo
Explaining configurable launch files using the <arg> tag
Moving your model around
Guidelines for tuning the Gazebo model
Summary
Questions
Further reading
Section 3: Autonomous Navigation Using SLAM
Programming in ROS - Commands and Tools
Technical requirements
Setting up a physical robot
Downloading and setting up Ubuntu Mate 18.04
Access customization
Updating your system and installing basic utilities
Enabling SSH access
Setting up a VNC server (x11vnc)
Setting up autostart on boot
Forcing the HDMI output and screen layout
The Geany IDE
Installing drivers for the GoPiGo3 and DI Sensors
Setting up the Pi Camera
Installing ROS Melodic
Installing a Pi Camera ROS package
A quick introduction to ROS programming
Setting up the workspace
Cloning a ROS package
Our first execution of a ROS node
Case study 1 – writing a ROS distance-sensor package
Creating a new package
Producing your source code
Including the required libraries – rospy and msgs.msg
Assigning a node name to the script
Defining the publisher
Setting up the msg_range object
Changing units to the International System of Units
Adding a measured distance and timestamp to the msg_range object
Setting the reading frequency
Running an infinite loop
Publishing each new event
Waiting until the next reading
Launching the ROS execution environment
Working with ROS commands
Shell commands
Changing the current location
Listing files and folders inside a package
Editing any file inside a package
Execution commands
The central process of the ROS environment
Executing a single node
Information commands
Exploring topics
Exploring nodes
The rosmsg command
The rosbag command
Packages and the catkin workspace
Creating and running publisher and subscriber nodes
Automating the execution of nodes using roslaunch
Case study 2 – ROS GUI development tools – the Pi Camera
Analyzing the ROS graph using rqt_graph
Displaying image data using rqt_image_view
Graphing time series of sensor data with rqt_plot
Playing a recorded ROS session with rqt_bag
Distance sensor
The Pi Camera
Customizing robot features using ROS parameters
Summary
Questions
Further reading
Robot Control and Simulation
Technical requirements
Setting up the GoPiGo3 development environment
ROS networking between the robot and the remote computer
Communication between ROS environments
Robot network configuration
Laptop network configuration
Launching the master node and connecting
Case study 3 – remote control using the keyboard
Running the gopigo3 node in the robot
Inspecting published topics and messages
Teleoperation package
Running teleoperation on a laptop
Teleoperation with the mouse
Remote control using ROS topics
The motion control topic – /cmd_vel
Using /cmd_vel to directly drive GoPiGo3
Checking the X, Y, and Z axes of GoPiGo3
Composing motions
Remotely controlling both physical and virtual robots
Reverting the ROS master to the local computer
Simulating GoPiGo3 with Gazebo
Adding the controller to the Gazebo model of the robot
Real-world and simulation at once
Summary
Questions
Further reading
Virtual SLAM and Navigation Using Gazebo
Technical requirements
ROS navigation packages
ROS master running on the local computer
Dynamic simulation using Gazebo
Adding sensors to the GoPiGo3 model
Camera model
Simulating the camera
Distance sensor
Simulating the distance sensor
Components in navigation
Costmaps for safe navigation
Robot perception and SLAM
Adding a Laser Distance Sensor (LDS)
Simulating the LDS
SLAM concepts
Occupancy Grid Map (OGM)
The SLAM process
The navigation process
Practising SLAM and navigation with the GoPiGo3
Exploring the environment to build a map using SLAM
Driving along a planned trajectory using navigation
Summary
Questions
Further reading
SLAM for Robot Navigation
Technical requirements
Setting the ROS master to be in the robot
Preparing an LDS for your robot
Setting up YDLIDAR
Integrating with the remote PC
Running the YDLIDAR ROS package
Integrating with Raspberry Pi
Checking that YDLIDAR works with GoPiGo3
Visualizing scan data in the Raspberry Pi desktop
Grouping launch files
Visualizing scan data from the remote laptop
Processing YDLIDAR data from a remote laptop
Creating a navigation application in ROS
Practicing navigation with GoPiGo3
Building a map of the environment
Navigating GoPiGo3 in the real world
Summary
Questions
Further reading
Section 4: Adaptive Robot Behavior Using Machine Learning
Applying Machine Learning in Robotics
Technical requirements
Setting up the system for TensorFlow
Installing pip
Installing the latest version
Installing TensorFlow and other dependencies
Achieving better performance using the GPU
ML comes to robotics
Core concepts in ML
Selecting features in ML
The ML pipeline
From ML to deep learning
ML algorithms
Regression
Logistic regression
Product recommendation
Clustering
Deep learning
Deep learning and neural networks
The input layer
The hidden layer(s)
The output layer
A methodology to programmatically apply ML in robotics
A general approach to application programming
Integrating an ML task
Deep learning applied to robotics – computer vision
Object recognition in Gazebo
Object recognition in the real world
Summary
Questions
Further reading
Machine Learning with OpenAI Gym
Technical requirements
An introduction to OpenAI Gym
Installing OpenAI Gym
Without Anaconda (optional)
Installing gym in development mode (optional)
Installing OpenAI ROS
Agents, artificial intelligence, and machine learning
The cart pole example
Environments
Spaces
Observations
Running the full cart pole example
Q-learning explained – the self-driving cab example
How to run the code for the self-driving cab
Reward table
Action space
State space
Self-driving cab example using the RL algorithm
Evaluating the agent
Hyperparameters and optimization
Running an environment
Configuring the environment file
Running the simulation and plotting the results
Checking your progress with the logger
Summary
Questions
Further reading
Achieve a Goal through Reinforcement Learning
Technical requirements
Preparing the environment with TensorFlow, Keras, and Anaconda
TensorFlow backend
Deep learning with Keras
ROS dependency packages
Understanding the ROS Machine Learning packages
Training scenarios
ROS package structure for running a reinforcement learning task
Setting the training task parameters
Training GoPiGo3 to reach a target location while avoiding obstacles
How to run the simulations
Scenario 1 – travel to a target location
Scenario 2 – travel to a target location avoiding the obstacles
Testing the trained model
Summary
Questions
Further reading
Assessment
Chapter 1: Assembling the Robot
Chapter 2: Unit Testing of GoPiGo3
Chapter 3: Getting Started with ROS
Chapter 4: Creating the Virtual Two-Wheeled ROS Robot
Chapter 5: Simulating Robot Behavior with Gazebo
Chapter 6: Programming in ROS - Commands and Tools
Chapter 7: Robot Control and Simulation
Chapter 8: Virtual SLAM and Navigation Using Gazebo
Chapter 9: SLAM for Robot Navigation
Chapter 10: Applying Machine Learning in Robotics
Chapter 11: Machine Learning with OpenAI Gym
Chapter 12: Achieve a Goal through Reinforcement Learning
Other Books You May Enjoy
Leave a review - let other readers know what you think
Why a new book about learning robotics with ROS? Well, programming is but a small part of what it takes to work with robots. If you want to become really good at robotics, you'll need skills in other areas as well: electromechanics, robot simulation, autonomous navigation, and machine learning/reinforcement learning. Each of these four topics is a building block that you will need to master on your path to acquiring full robotics skills. This book is divided into four parts, each one being devoted to each of these building blocks.
Part 1, Physical Robot Assembly and Testing, focuses on electromechanics and describes each hardware part of the robot, providing practical demonstrations of how to test every sensor and actuator that it is equipped with. This part of the book should provide you with a good understanding of how a mobile robot works.
Part 2, Robot Simulation with Gazebo, deals with robot simulation. It is here where we introduce ROS and develop a two-wheeled robot simulation that emulates both the physical aspects and the behavior of an actual robot. We will explore the concept of the digital twin, a virtual robot that is the twin of a physical one. This is a fundamental part of developing robotic applications, as it cuts the costs associated with testing real hardware. The digital twin allows us to speed up the development process and save testing with the physical robot for the advanced stages of development.
Part 3, Autonomous Navigation Using SLAM, is devoted to robot navigation, the most common task for mobile robots. State-of-the-art algorithms and techniques are explained in a practical manner, first in simulation and then with a physical robot.
Part 4, Adaptive Robot Behavior Using Machine Learning, focuses on machine learning and reinforcement learning, the most active fields in robot research and real-world robotic applications. By using this technology, a robot is able to transition from pure automatism – where every possible behavior or answer is coded – to being a flexible behavior machine, where the robot is capable of reacting in a smart way to environmental demands by learning from data. This data can be obtained from the robot's previous experience or gathered from the experience of similar robots.
To build a state-of-the-art robot application, you will first need to master and then combine these four building blocks. The result will be what is commonly known as a smart robot. This is your task – this is your challenge.
If you are an engineer who wishes to build AI-powered robots using ROS, then this book is for you. Technicians and hobbyists who wish to develop their own ROS robotics projects will also find this book to be a useful resource.
Chapter 1, Assembling the Robot, provides the key concepts and practical assembly guidelines for the mobile robot on which all the content in this book is based. With a very practical approach in mind, we dive deep into the characteristics of GoPiGo3 that make it an ideal and cost-effective platform for learning robotics. By completing the GoPiGo3 assembly, you will have acquired the first manual skills necessary for manipulating typical components in robotics. To purchase the GoPiGo3 kit, you can visit https://www.dexterindustries.com/gopigo3/ and apply the coupon code BRJAPON@PACKT to get a 10% discount.
Chapter 2, Unit Testing of GoPiGo3, provides you with practical insight into how GoPiGo3 works. We do so by introducing the JupyterLab environment, a friendly interface that takes the structure of a notebook composed of human-readable paragraphs followed by Python code snippets. You will produce two versions of each test program: the JupyterLab notebook and the pure Python script. Using these programming tools, you will test each sensor/actuator individually, check that it's working properly, and gain an understanding of the technology behind it.
Chapter 3, Getting Started with ROS, explains the basic concepts of ROS. It introduces you to the framework using easy-to-understand language and avoids overly technical descriptions, because our primary goal is to show you what ROS is in a conceptual sense. Deeper technical descriptions are provided in the following chapters so that you can ultimately integrate ROS into your projects.
Chapter 4, Creating the Virtual Two-Wheeled ROS Robot, describes how to build a simple two-wheeled robot, a digital twin of GoPiGo3. The model is written in the Unified Robot Description Format (URDF) and the result is checked with RViz, an ROS tool that provides a configurable Graphical User Interface (GUI) to allow the user to display the specific information they are after. RViz may be used both for global robot visualization and for debugging specific features while building a model.
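To give a first taste of what a URDF model looks like, here is a minimal sketch of a two-wheeled body. The element names (robot, link, joint) come from the URDF specification itself, but the link names, dimensions, and joint placement are purely illustrative and are not taken from the book's actual GoPiGo3 model:

```xml
<?xml version="1.0"?>
<robot name="two_wheeled_sketch">
  <!-- Chassis: a simple box approximating the robot body -->
  <link name="base_link">
    <visual>
      <geometry><box size="0.2 0.1 0.05"/></geometry>
    </visual>
  </link>
  <!-- Left wheel: a cylinder attached with a continuous (freely rotating) joint -->
  <link name="left_wheel">
    <visual>
      <geometry><cylinder radius="0.033" length="0.025"/></geometry>
    </visual>
  </link>
  <joint name="left_wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="left_wheel"/>
    <origin xyz="0 0.06 0" rpy="-1.5708 0 0"/>
    <axis xyz="0 0 1"/>
  </joint>
  <!-- The right wheel would mirror the left one with a negated y offset -->
</robot>
```

A real model, such as the one built in Chapter 4, adds the second wheel, a caster, and inertial and collision data on top of this skeleton.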
Chapter 5, Simulating Robot Behavior with Gazebo, teaches you how to plug the digital definition of your robot (the URDF file) into the Gazebo simulation environment, which features a physics engine able to emulate realistic behavior. You will also develop your understanding of how to check and test a digital robot to ensure that its behavior faithfully represents what would happen in reality.
Chapter 6, Programming in ROS - Commands and Tools, introduces you to command-line interaction with ROS and explains the types of ROS commands. We will explore the most frequently used communication patterns in ROS, including the publish-subscribe model. To deal with all of your ROS data, you will be introduced to rqt, which eases the process of developing and debugging applications. Also, ROS parameters are introduced to give you an overview of their power to manage robot configuration at a high level.
Chapter 7, Robot Control and Simulation, teaches you how to set up an ROS environment for a real robot, using GoPiGo3. We will start by looking at remote control using your laptop keyboard, then progress to the more technical method of using ROS topics. This chapter will start you on your path from manual keyboard- and topic-based control to internal programming logic, so that your robots can execute tasks autonomously.
Chapter 8, Virtual SLAM and Navigation Using Gazebo, explores the technique of Simultaneous Localization and Mapping (SLAM) using a practical approach and the digital twin of GoPiGo3. You will be taught why SLAM is required prior to proper navigation. The simulation will be run in Gazebo, the ROS-native simulation tool with a physics engine that offers realistic results.
Chapter 9, SLAM for Robot Navigation, shifts the focus to the real world with the physical GoPiGo3 robot. The chapter highlights the many details and practical questions that arise when you face a robotic task in a real environment. Simulation is good to start with, but the real proof that your robot performs as expected is gained by executing tasks in an actual scenario. This chapter is the starting point for you to get deeper into robot navigation and will be vital to your knowledge base if this is a field that you want to pursue.
Chapter 10, Applying Machine Learning in Robotics, is intended as a gentle introduction to machine learning in robotics, favoring intuition over complex mathematical formulations and focusing on the common concepts used in the field. The practical example used in this chapter involves the Pi camera of GoPiGo3 recognizing objects.
Chapter 11, Machine Learning with OpenAI Gym, gives you the theoretical background on reinforcement learning based on simple scenarios. This chapter allows you to better understand what happens under the hood in classical reinforcement learning training tasks. We will continue using practical examples to explore the concepts presented and will use the open source OpenAI Gym environment, which lets us easily test different algorithms for training agents, including robots driven in ROS.
Chapter 12, Achieve a Goal through Reinforcement Learning, goes a step further than computer vision for object recognition and shows that GoPiGo3 not only perceives things but can also take steps to achieve a goal. Our robot will have to decide what action to execute at every step of the simulation to achieve the goal. After executing each action, the robot will receive feedback, in the form of a reward, on how good its decision was. After some training, the incentive of the reward will reinforce good decision-making.
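The reward-driven loop just described can be sketched with a tiny tabular Q-learning example. This is a minimal, self-contained illustration of the idea, not code from the book: the one-dimensional "corridor" environment, the function name, and the hyperparameter values are all invented for demonstration.

```python
import random

def train_q_table(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.2):
    """Tabular Q-learning on a toy corridor: states 0..n_states-1, goal at the right end.

    Actions: 0 = step left, 1 = step right. The agent gets a reward of 1.0
    on reaching the goal and 0.0 otherwise -- the feedback on how good
    each decision was.
    """
    q = [[0.0, 0.0] for _ in range(n_states)]
    goal = n_states - 1
    for _ in range(episodes):
        s = 0
        while s != goal:
            # Epsilon-greedy selection: mostly exploit the best known action,
            # sometimes explore a random one.
            if random.random() < eps:
                a = random.randrange(2)
            else:
                a = 0 if q[s][0] > q[s][1] else 1
            s2 = max(0, s - 1) if a == 0 else min(goal, s + 1)
            r = 1.0 if s2 == goal else 0.0
            # Q-learning update: nudge the estimate toward
            # reward + discounted best future value.
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

if __name__ == "__main__":
    random.seed(0)  # make the toy run reproducible
    q = train_q_table()
    # After training, the greedy action in every non-goal state should be
    # 1 (step right): the reward has reinforced good decision-making.
    print([0 if q[s][0] > q[s][1] else 1 for s in range(4)])
```

Chapter 12 applies this same principle at a much larger scale, with GoPiGo3's sensor readings as the state and simulated motion commands as the actions.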
The book takes a practical approach and will encourage you to practice what you are learning with a physical robot. We chose GoPiGo3 (https://www.dexterindustries.com/gopigo3/) because of its modularity, moderate cost, and the fact that it's based on the Raspberry Pi. You can acquire a Raspberry Pi board from online stores worldwide. Before purchasing any component of the kit, we recommend that you first read Chapter 1, Assembling the Robot, to get basic information on all the components that you will need to purchase. To purchase the GoPiGo3 kit, you can visit https://www.dexterindustries.com/gopigo3/ and apply the coupon code BRJAPON@PACKT to get a 10% discount.
Some knowledge of Python and/or C++ programming and familiarity with single-board computers such as the Raspberry Pi are required to get the most out of this book.
Finally, you will need a laptop with Ubuntu 16.04 Xenial Xerus or Ubuntu 18.04 Bionic Beaver. The code of the book has been tested using both operating systems. If you have to start from scratch, we recommend that you use Ubuntu 18.04 because it is the latest Long-term Support (LTS) version provided by Canonical and will be supported until April 2023.
All the installation instructions you'll need are given in the Technical requirements section at the beginning of each chapter.
You can download the example code files for this book from your account at www.packt.com. If you purchased this book elsewhere, you can visit www.packtpub.com/support and register to have the files emailed directly to you.
You can download the code files by following these steps:
1. Log in or register at www.packt.com.
2. Select the Support tab.
3. Click on Code Downloads.
4. Enter the name of the book in the Search box and follow the onscreen instructions.
Once the file is downloaded, please make sure that you unzip or extract the folder using the latest version of:
WinRAR/7-Zip for Windows
Zipeg/iZip/UnRarX for Mac
7-Zip/PeaZip for Linux
The code bundle for the book is also hosted on GitHub at https://github.com/PacktPublishing/Hands-On-ROS-for-Robotics-Programming. In case there's an update to the code, it will be updated on the existing GitHub repository.
We also have other code bundles from our rich catalog of books and videos available at https://github.com/PacktPublishing/. Check them out!
We also provide a PDF file that has color images of the screenshots/diagrams used in this book. You can download it here: http://www.packtpub.com/sites/default/files/downloads/9781838551308_ColorImages.pdf.
Visit the following link to check out videos of the code being run: http://bit.ly/2PrRpXF
Feedback from our readers is always welcome.
General feedback: If you have questions about any aspect of this book, mention the book title in the subject of your message and email us at [email protected].
Errata: Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you have found a mistake in this book, we would be grateful if you would report it to us. Please visit www.packtpub.com/support/errata, select your book, click on the Errata Submission Form link, and enter the details.
Piracy: If you come across any illegal copies of our works in any form on the Internet, we would be grateful if you would provide us with the location address or website name. Please contact us at [email protected] with a link to the material.
If you are interested in becoming an author: If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, please visit authors.packtpub.com.
Please leave a review. Once you have read and used this book, why not leave a review on the site that you purchased it from? Potential readers can then see and use your unbiased opinion to make purchase decisions, we at Packt can understand what you think about our products, and our authors can see your feedback on their book. Thank you!
For more information about Packt, please visit packt.com.
This section focuses on describing and setting up the hardware that will be used alongside this book. Mechanical parts, including sensors and actuators, microcontrollers, and embedded computers, are the core hardware features of any mobile robot. Installation instructions for the software required to run GoPiGo3 with ROS are also included.
This section comprises the following chapters:
Chapter 1, Assembling the Robot
Chapter 2, Unit Testing of GoPiGo3
Chapter 3, Getting Started with ROS
This chapter will provide you with a variety of practical assembly guidelines for the mobile robot that the content of this book is based on. With a very practical approach in mind, we'll dive deep into the characteristics of GoPiGo3 and what makes it an ideal platform for learning robotics.
First, we will focus on the hardware and talk about the components that every robot is composed of, including the mechanical parts, the embedded system, sensors, and motors.
After completing the GoPiGo3 assembly section, you will have acquired manual skills that let you start manipulating typical components in robotics. You will also be encouraged to adopt a systematic approach of applying partial verification tests, also known as unit tests, while assembling your robot.
After introducing the GoPiGo3 robot in the first section of this chapter, we will explain these concepts in depth, including the embedded controller, the GoPiGo3 board, and the embedded computer, the Raspberry Pi.
Next, we will describe the sensors and actuators that the robot will use, grouped into what we will call the electromechanics.
Finally, we will provide you with some useful guidelines so that assembling the robot is straightforward. Then, we will test the GoPiGo3 robot using its easy-to-start software, DexterOS. Even though we will adopt Ubuntu as an operating system for running ROS later in this book, it is recommended that you start with DexterOS so that you familiarize yourself with the hardware while avoiding specific software programming tasks, which is something that will be left for later chapters.
In this chapter, we will cover the following topics:
Understanding the GoPiGo3 robot
Getting familiar with the embedded hardware – GoPiGo3 board and Raspberry Pi
Deep diving into the electromechanics – motors, sensors, and 2D camera
Putting it all together
Hardware testing using Bloxter (visual programming) under DexterOS
GoPiGo3 is a Raspberry Pi-based robot car manufactured by Dexter Industries. It is intended to be used as an educational kit for learning about both robotics and programming, two complementary perspectives that clearly show the transversal knowledge you should acquire to become a robotics engineer. We'll let Nicole Parrot, Director of Engineering at Modular Robotics, explain what this means in her own words:
Ready to dive into robotics? Let's go!
From the robotics perspective, you will learn how to work with the basic parts:
Motors, which allow the robot to move from one point to another. In GoPiGo3, we have DC motors with built-in encoders that provide precise motion. This is one of the main upgrades from GoPiGo2, where the encoders were external to the motors and not very accurate.
Sensors, which acquire information from the environment, such as the distance to nearby objects, luminosity, acceleration, and so on.
The controller—that is, the GoPiGo3 red board—which handles the physical interface with sensors and actuators. This is the real-time component that allows GoPiGo3 to interact with the physical world.
A single-board computer (SBC), the Raspberry Pi 3B+, which provides processing capacity. As such, it runs an operating system, typically a Linux-based distribution, providing wide flexibility from a software point of view.
Most educational kits stop at a level-3 controller; they do not include a level-4 single-board computer. The software in the controller is a small program (only one) that is embedded in the board. Every time you want to modify the code for the robot, you have to fully replace the existing program and flash the new version from an external computer while using the serial connection over a USB port.
A classic example of this is an Arduino-controlled robot. Here, the Arduino board plays the role of our GoPiGo3 board, and if you have worked with it, you will surely remember how you needed to transfer the new program from the Arduino IDE on your laptop to the robot through a USB cable.
From the programming perspective, GoPiGo3 allows you to start easily by learning a visual programming language, Bloxter, a fork of the open source Google Blockly that was specifically developed for GoPiGo3. This is a very comfortable starting point when it comes to learning the basic concepts of writing software programs.
But if you are reading this book, we are sure you already know how to program in one of the many available languages, such as C, C++, Java, JavaScript, or Python. Dexter Industries provides various open source libraries (https://github.com/DexterInd/GoPiGo3/tree/master/Software) that you can use to program GoPiGo3. Some of them are as follows:
C
C#
Go
Java
Node.js (JavaScript)
Python
Scratch
In any case, in this first chapter, we encourage you to only use Bloxter to emphasize the robotics perspective and become familiar with the hardware you have in your hands. After that, you may use your choice of language, given the many GoPiGo3 application programming interfaces (APIs) that are available.
In this book, we will focus on Python as the primary language to program in ROS. The Python language is easier to learn while still being very powerful and predominant in robotics and computer science. After going through some Python examples in Chapter 2, Unit Testing GoPiGo3, we will get started with the Robot Operating System (ROS), which isn't an actual programming language but a development application framework for robots. As such, we will show you how to adapt your Python programs with wrappers so that they can also run within ROS as pieces for building high-level functionalities.
You will appreciate the added value of such a jump to ROS when you discover how many more things GoPiGo3 can do when its Python code base is wrapped with ROS. This software upgrade provides GoPiGo3 with a toolkit that allows students, makers, and engineers to understand how robots work. Furthermore, you should be aware that ROS is commonly used in professional environments.
At a high level, we can group the hardware of the robot into two sets:
Electromechanics: This refers to the sensors and actuators that allow the robot to interact with the physical world.
Embedded hardware: The electronic boards that acquire signals from the sensors, convert them into digital form, provide the processing logic, and send commands to the actuators. Here, we typically have two types of electronic boards:
The controller, which serves as the physical interface with the sensors and actuators—that is, the GoPiGo3 board. The controller deals with both analog and digital signals from the electromechanical devices, transforming them into digital signals that can be processed by a CPU.
The computer, which provides us with the means to implement intelligent logic. In most robots, this is an SBC. In the case of GoPiGo3, this is the Raspberry Pi running a Linux OS distribution, such as Raspbian or Ubuntu.
Although you could directly connect digital devices to the Raspberry Pi through its general purpose input/output (GPIO) pins, from a functional point of view, it is better to interface all the sensors and actuators through the controller—that is, the GoPiGo3 board: keep the interface with the physical world at the controller level and do the processing and computation at the computer level.
If you are a regular Raspberry Pi user and own the board, you only need to purchase the GoPiGo3 Robot Base Kit (https://www.dexterindustries.com/product/gopigo3-robot-base-kit/). This kit includes the following:
GoPiGo3 board (red board)
Chassis (frame, wheels, hardware)
Motors
Encoders
Power battery pack and cable
Screwdriver for assembly
The following image shows all the parts that are included:
The following image shows the assembled kit (without the Raspberry Pi):
The batteries (8 AA, 1.2 V) are not included. Although you can use cheaper disposable batteries, it is strongly advised that you use rechargeable ones. In the long term, this will be cost-effective and environmentally friendly.
Apart from the kit, you will need to add a Raspberry Pi 3 and its MicroSD card. Otherwise, especially if you are new to the Raspberry Pi world, you may be better off buying the GoPiGo3 Beginner Starter Kit (https://www.dexterindustries.com/product/gopigo-beginner-starter-kit/), which includes the Raspberry Pi 3 and its accessories, as well as an orientable distance sensor equipped with a servomotor, allowing it to cover a 180° field of view. This sensor set is composed of the following:
The distance sensor (https://www.dexterindustries.com/product/distance-sensor/)
The servo package (https://www.dexterindustries.com/product/servo-package/)
The following image shows the final appearance of the Beginner Starter Kit, once it's been assembled. The same result can be obtained with the Robot Base Kit by adding the Raspberry Pi and the orientable distance sensor:
Now that we've looked at the GoPiGo3 robot, it's time to cover the technical details regarding the embedded hardware and the electromechanics.
Do you remember which hardware is for what? The GoPiGo3 board is for interfacing with sensors and actuators, while Raspberry Pi is used for computing tasks. We will cover these topics in detail here.
This customized board (https://www.dexterindustries.com/GoPiGo/learning/hardware-port-description/) provides the general features that are expected from a controller:
Real-time communication with sensors and actuators.
Interface input/output (I/O) through a Serial Peripheral Interface (SPI) that feeds the data from the sensors to the Raspberry Pi and may also receive commands for the actuators (also from the Raspberry Pi, after running the logic in its CPU for every step of the control loop).
One single program loaded on board, known as the firmware. Since the goal of this software is to implement a communication protocol while the computer implements the logic, it doesn't need to be changed unless you decide to upgrade it when a new version is available.
Let's briefly explain the input/output interface protocol that we mentioned in the preceding list. SPI is a bus that's used to send data between microcontrollers and external devices—in our case, sensors. It uses separate clock and data lines, along with a select line to choose the device to talk to. The side of the connection that generates the clock is called the master (the Raspberry Pi in our case), while the other side is called the slave (the GoPiGo3 board). This way, both boards are synchronized, resulting in faster communication than asynchronous serial, which is the typical communication protocol in general-purpose boards such as Arduino.
You can find out more about the SPI protocol in an easy-to-follow tutorial at https://learn.sparkfun.com/tutorials/serial-peripheral-interface-spi. Communication over SPI with the Raspberry Pi occurs through the headers interface, which can be seen in the top part of GoPiGo3 board in the following image. Only five out of the 40 GPIO pins are needed for such an interface:
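To make the clocked master/slave exchange more concrete, here is an illustrative pure-Python simulation of a single SPI byte transfer. This is not the GoPiGo3 firmware protocol (which defines its own message format on top of SPI); it only sketches how the synchronous, full-duplex bit exchange works.

```python
# Illustrative bit-bang simulation of a full-duplex SPI exchange.
# Not the actual GoPiGo3 message format; just the clocked byte swap.

def spi_exchange_byte(master_out: int, slave_out: int):
    """Simulate one 8-bit SPI transfer (MSB first).

    On each simulated clock pulse, the master shifts one bit out on
    MOSI while the slave shifts one bit out on MISO, so a full
    transfer sends and receives a byte at the same time.
    """
    master_in = 0
    slave_in = 0
    for bit in range(7, -1, -1):          # 8 clock pulses, MSB first
        mosi = (master_out >> bit) & 1    # master drives MOSI
        miso = (slave_out >> bit) & 1     # slave drives MISO
        slave_in = (slave_in << 1) | mosi
        master_in = (master_in << 1) | miso
    return master_in, slave_in

# The master (Raspberry Pi) sends 0xA5 while the slave (GoPiGo3 board)
# simultaneously returns 0x3C.
received_by_master, received_by_slave = spi_exchange_byte(0xA5, 0x3C)
print(hex(received_by_master), hex(received_by_slave))  # 0x3c 0xa5
```

Note how neither side waits for the other: because the clock drives both shift registers, sending and receiving happen in the same eight pulses, which is why SPI is faster than asynchronous serial.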
For interfacing with devices, the board provides the following (a top view of the board can be seen in the next diagram):
Two I2C ports—two Grove ports connected to the Raspberry Pi I2C bus through a level-conversion chip
One serial port—one Grove port connected to serial pins on the Raspberry Pi through a level-conversion chip
Two analog-digital ports—two Grove ports connected to the GoPiGo3 microcontroller
Two servo ports for PWM-type servomotors:
Let's explain these new concepts:
Serial port: This is the complementary communication protocol we mentioned previously when we talked about SPI. While the latter is synchronous (it needs five interface pins), the serial port is asynchronous—that is, there is no clock signal to follow and only two pins are needed: Tx for data transmission and Rx for data reception. In GoPiGo3, this port is directly connected to the Raspberry Pi serial pins through a level-conversion chip.
I2C ports: As the name suggests, these use the I2C communication protocol. Just like SPI, it is a synchronous protocol, faster than asynchronous serial. I2C uses two lines: SDA for data and SCL for the clock signal. The third and fourth wires are for the power supply: VIN at 5V and the GND ground—that is, a 0V reference. SDA is bidirectional, so any of the connected devices can send or receive data. In these two ports, you will connect the distance sensor and the line-follower sensor.
Analog-digital: These ports can connect to analog, digital, or I2C Grove devices. We will connect the IMU sensor to one of the analog-digital ports. We will talk about this in more detail later.
Servo ports, which connect PWM servomotors: These are cheaper and easier to control than encoder-equipped motors, while still offering enough accuracy to control the orientation of the support they hold. In GoPiGo3, we can attach the distance sensor or the Pi camera to a servomotor.
Pulse Width Modulation (PWM) technology refers to having control over a continuous range by changing the duty cycle of the voltage supply, resulting in an equivalent output ranging from 0V to 5V: 0V is the 0% duty cycle, while 100% corresponds to 5V being applied during the entirety of the cycle. By applying 5V for a percentage of the period lower than 100%, you get continuous control of the position, ranging from 0 to 180° rotation of the motor shaft. For an explanation of this, along with some useful figures, go to https://www.jameco.com/jameco/workshop/howitworks/how-servo-motors-work.html.
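As a concrete illustration of the duty-cycle idea, the following sketch maps a target servo angle to a PWM duty cycle. Note that hobby servos are actually commanded by pulse width rather than average voltage: a pulse of roughly 0.5 ms (0°) to 2.5 ms (180°) repeated every 20 ms is assumed here, and the exact pulse range varies between servo models, so treat the constants as illustrative.

```python
# Hedged sketch: servo angle -> PWM duty cycle, assuming a typical
# 50 Hz hobby-servo signal with 0.5-2.5 ms pulses. The exact pulse
# range is an assumption and differs between servo models.

PERIOD_MS = 20.0        # 50 Hz servo refresh period
MIN_PULSE_MS = 0.5      # assumed pulse width at 0 degrees
MAX_PULSE_MS = 2.5      # assumed pulse width at 180 degrees

def angle_to_duty_cycle(angle_deg: float) -> float:
    """Return the PWM duty cycle (in %) for a target shaft angle."""
    if not 0.0 <= angle_deg <= 180.0:
        raise ValueError("angle must be between 0 and 180 degrees")
    pulse = MIN_PULSE_MS + (MAX_PULSE_MS - MIN_PULSE_MS) * angle_deg / 180.0
    return 100.0 * pulse / PERIOD_MS

print(angle_to_duty_cycle(0))    # 2.5  (0.5 ms pulse)
print(angle_to_duty_cycle(90))   # 7.5  (1.5 ms pulse)
print(angle_to_duty_cycle(180))  # 12.5 (2.5 ms pulse)
```

The GoPiGo3 libraries hide this mapping behind a simple "set angle" call; the sketch just shows what happens underneath.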
Raspberry Pi has the largest community, both in education and in industry, which makes it the single-board computer of choice when developing embedded software for robots or for Internet of Things (IoT) devices. The following image shows the Raspberry Pi 3B+, the most common model that powers GoPiGo3:
The main characteristics of Raspberry Pi 3B+ are as follows:
A Central Processing Unit (CPU) composed of four Cortex-A53 cores running at 1.4 GHz.
A Graphics Processing Unit (GPU), a Broadcom VideoCore IV at 250 MHz.
Synchronous Dynamic Random-Access Memory (SDRAM) of 1 GB, which is shared with the GPU.
On-board storage is provided through a MicroSDHC slot. You can choose whatever microSD card size fits your needs. In any case, the general recommendation is to use a class-10 microSD card of 16 GB capacity—class 10 means that it can be written at 10 MB/second.
Let's go over the functionality of each of these components:
The CPU provides the computation capacity to run all kinds of algorithms. This is where the intelligence of our robot resides.
The GPU's mission is to handle computer graphics and image processing. In our case, it will mostly be devoted to processing the images from the Pi camera and providing computer vision capabilities.
SDRAM provides 1 GB of volatile storage that's shared with the GPU, so there is a balance between how much memory you assign to the GPU (by default, it takes up to 64 MB) and how much is left for the CPU. RAM is where a program is loaded so that it can be executed.
The on-board microSD card is the persistent storage that contains the operating system, as well as all the installed software.
Raspberry Pi runs an operating system, typically a Linux-based distribution such as Debian or Ubuntu.
Although Raspbian—the Debian-based distro by the Raspberry Pi Foundation—is the official distribution, we will be using Ubuntu—supported by Canonical—because it's the platform that Open Robotics (https://www.openrobotics.org) uses to deliver a version of ROS every year, synchronized with the yearly versions of Ubuntu.
Apart from the fact that this book's goal is to give you some hands-on experience with ROS—and for that, you need a Linux OS to install the software on—if you really want to create a smart robot, you need the processing capacity to run compute-intensive algorithms, and that is what a computer such as the Raspberry Pi provides.
Why is this computation needed? Because a smart robot has to integrate information from the environment with the logic of the task at hand to be able to complete it successfully. Let's use the example of carrying an object from its current position to a target destination. To do so, devices such as a laser distance sensor, a 3D camera, and/or a GPS provide the robot with information from the environment. These sources of data have to be combined so that the robot is able to locate itself in the environment. Given a target destination, it also has to compute the optimum path along which to carry the object, something that is called path planning. When executing such a path plan, it has to detect obstacles that may appear along the way and avoid them without losing sight of the goal. Hence, every step of the task involves executing an algorithm in the CPU of the robot.
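To give a flavor of what a path-planning algorithm actually computes, here is a minimal sketch: breadth-first search on a small occupancy grid (0 = free cell, 1 = obstacle). Real planners, including those available in ROS, are far more sophisticated, but the core idea of searching for a collision-free route to the goal is the same.

```python
# Minimal illustrative path planner: breadth-first search on an
# occupancy grid. 0 = free cell, 1 = obstacle. Purely a sketch of
# the path-planning idea, not a production planner.
from collections import deque

def plan_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}
    frontier = deque([start])
    while frontier:
        current = frontier.popleft()
        if current == goal:
            # Reconstruct the path by walking back to the start.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = current
                frontier.append((nr, nc))
    return None  # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],   # a wall of obstacles forces a detour
        [0, 0, 0]]
print(plan_path(grid, (0, 0), (2, 0)))
# [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

Because BFS explores cells in order of distance from the start, the first time it reaches the goal it has found a shortest obstacle-free route.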
This is one of the many practical scenarios that you will learn to solve using ROS, which is currently the de facto industry standard for the development of robotics applications.
As explained in GoPiGo's official documentation (https://www.dexterindustries.com/GoPiGo/learning/technical-specifications-for-the-gopigo-raspberry-pi-robotics-kit/), the specifications of the GoPiGo3 robot are as follows:
Operating voltage: 7V-12V
External interfaces:
I2C ports: Two Grove ports connected to the Raspberry Pi I2C bus through a level-conversion chip
Serial port: One Grove port connected to the serial pins on the Raspberry Pi through a level-conversion chip
Analog-digital ports: Two Grove ports connected to the GoPiGo3 microcontroller
Encoders: Two magnetic encoders with six pulse counts per rotation (with 120:1 gear reduction for a total of 720 pulses per wheel rotation)
Wheel diameter: 66.5 mm
Distance between wheels: 117 mm
More: Design information is available at the official GitHub repository (https://github.com/DexterInd/GoPiGo3)
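With the encoder and wheel specifications above, we can already do some useful odometry arithmetic. The following sketch converts encoder ticks into the distance travelled by a wheel, using the 720 pulses per wheel rotation and the 66.5 mm wheel diameter from the specs:

```python
# Sketch based on the GoPiGo3 specs: 720 encoder pulses per wheel
# rotation (6 pulses x 120:1 gear reduction) and a 66.5 mm wheel.
import math

TICKS_PER_REV = 720      # 6 pulses x 120:1 gear reduction
WHEEL_DIAMETER_MM = 66.5

def ticks_to_distance_mm(ticks: int) -> float:
    """Distance covered by the wheel (in mm) for a given tick count."""
    wheel_circumference = math.pi * WHEEL_DIAMETER_MM  # ~208.9 mm
    return ticks * wheel_circumference / TICKS_PER_REV

# One full wheel rotation (720 ticks) moves the wheel one circumference:
print(round(ticks_to_distance_mm(720), 1))  # 208.9
```

This kind of conversion is exactly what lets the built-in encoders provide precise motion: each tick corresponds to under 0.3 mm of travel.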
This is just a summary of what we explained in the section titled The GoPiGo3 board. In this section, we will concentrate on describing the devices that are connected to the GoPiGo3 board.
The sensors we are going to mount onto the GoPiGo3 are the ones that we need in order to accomplish the top-level task of the robot—that is, navigation with motion planning, while keeping costs low. These sensors are as follows:
Distance sensor
Line follower
Inertial Measurement Unit (IMU) sensor
2D camera
In the case of using the line-follower sensor, since the robot follows a marked path on the floor (usually painted black), the motion-planning part can be skipped and navigation becomes much easier. If there is an obstacle on the path, you will have to apply an algorithm to move around the obstacle and return to the path—that is, place the line-follower sensor above the black line again.
Now, we should take the time to understand what information each sensor provides. Later in this book, you will encounter such a navigation problem and the algorithms that can be used to implement it.
The distance sensor allows us to measure the distance to the object in front of it. It has a small laser and applies the time-of-flight method for very fast and accurate distance readings. The product page can be viewed at https://www.dexterindustries.com/product/distance-sensor/:
You can connect the distance sensor to either of the two I2C ports. Be aware that the GoPiGo3 software libraries will not ask you to specify which of the two ports you are using; this is detected automatically.
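The time-of-flight method mentioned previously can be sketched with simple arithmetic: the sensor times how long a laser pulse takes to reach the object and bounce back, so the distance is half the round-trip time multiplied by the speed of light. The sensor performs this computation internally; the function and timing value below are purely illustrative.

```python
# Illustrative sketch of the time-of-flight principle:
# distance = speed_of_light * round_trip_time / 2.
# The real sensor does this internally and simply reports distance.

SPEED_OF_LIGHT_M_S = 299_792_458

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the object given the laser pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A round trip of ~6.67 nanoseconds corresponds to about 1 metre:
print(round(tof_distance_m(6.67e-9), 3))  # 1.0
```

The nanosecond timescales involved explain why the timing is done in dedicated sensor hardware rather than on the Raspberry Pi.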
You can mount the sensor onto a servo package to scan a wide angle of about 180°. The servomotor can be connected to either servo port 1 or servo port 2. The product page can be viewed at https://www.dexterindustries.com/product/servo-package/:
In Chapter 2, Unit Testing of GoPiGo3, there is a specific test you can run with your robot to check that this unit works properly.
