Introduction to Robotics: Analysis, Control, Applications, 3rd Edition
The revised guide to the analysis, control, and applications of robotics
The revised and updated third edition of Introduction to Robotics: Analysis, Control, Applications offers a guide to the fundamentals of robotics, robot components and subsystems, and applications. The author, a noted expert on the topic, covers the mechanics and kinematics of serial and parallel robots, using both the Denavit-Hartenberg approach and screw-based mechanics. In addition, the text contains information on microprocessor applications, control systems, vision systems, sensors, and actuators.
Introduction to Robotics gives engineering students and practicing engineers the information needed to design a robot, to integrate a robot in appropriate applications, or to analyze a robot. The updated third edition contains many new subjects, and the content has been streamlined throughout. It includes two completely new chapters on screw-based mechanics and parallel robots, and it is filled with new illustrative examples and homework problems designed to enhance learning.
Written for students of engineering as well as practicing engineers, Introduction to Robotics, Third Edition reviews the basics of robotics, robot components and subsystems, and applications, and has been revised to include the most recent developments in the field.
Page count: 837
Publication year: 2019
Cover
Preface
About the Companion Website
1 Fundamentals
1.1 Introduction
1.2 What Is a Robot?
1.3 Classification of Robots
1.4 What Is Robotics?
1.5 History of Robotics
1.6 Advantages and Disadvantages of Robots
1.7 Robot Components
1.8 Robot Degrees of Freedom
1.9 Robot Joints
1.10 Robot Coordinates
1.11 Robot Reference Frames
1.12 Programming Modes
1.13 Robot Characteristics
1.14 Robot Workspace
1.15 Robot Languages
1.16 Robot Applications
1.17 Other Robots and Applications
1.18 Collaborative Robots
1.19 Social Issues
1.20 Summary
References
Problems
2 Kinematics of Serial Robots: Position Analysis
2.1 Introduction
2.2 Robots as Mechanisms
2.3 Conventions
2.4 Matrix Representation
2.5 Homogeneous Transformation Matrices
2.6 Representation of Transformations
2.7 Inverse of Transformation Matrices
2.8 Forward and Inverse Kinematics of Robots
2.9 Forward and Inverse Kinematic Equations: Position
2.10 Forward and Inverse Kinematic Equations: Orientation
2.11 Forward and Inverse Kinematic Equations: Position and Orientation
2.12 Denavit‐Hartenberg Representation of Forward Kinematic Equations of Robots
2.13 The Inverse Kinematic Solution of Robots
2.14 Inverse Kinematic Programming of Robots
2.15 Dual‐Arm Cooperating Robots
2.16 Degeneracy and Dexterity
2.17 The Fundamental Problem with the Denavit‐Hartenberg Representation
2.18 Design Projects
2.19 Summary
References
Problems
3 Robot Kinematics with Screw‐Based Mechanics
3.1 Introduction
3.2 What Is a Screw?
3.3 Rotation about a Screw Axis
3.4 Homogeneous Transformations about a General Screw Axis
3.5 Successive Screw‐Based Transformations
3.6 Forward and Inverse Position Analysis of an Articulated Robot
3.7 Design Projects
3.8 Summary
Additional Reading
Problems
4 Kinematics Analysis of Parallel Robots
4.1 Introduction
4.2 Physical Characteristics of Parallel Robots
4.3 The Denavit‐Hartenberg Approach vs. the Direct Kinematic Approach
4.4 Forward and Inverse Kinematics of Planar Parallel Robots
4.5 Forward and Inverse Kinematics of Spatial Parallel Robots
4.6 Other Parallel Robot Configurations
4.7 Design Projects
4.8 Summary
References
Problems
5 Differential Motions and Velocities
5.1 Introduction
5.2 Differential Relationships
5.3 The Jacobian
5.4 Differential versus Large‐Scale Motions
5.5 Differential Motions of a Frame versus a Robot
5.6 Differential Motions of a Frame
5.7 Interpretation of the Differential Change
5.8 Differential Changes between Frames
5.9 Differential Motions of a Robot and Its Hand Frame
5.10 Calculation of the Jacobian
5.11 How to Relate the Jacobian and the Differential Operator
5.12 The Inverse Jacobian
5.13 Calculation of the Jacobian with Screw‐Based Mechanics
5.14 The Inverse Jacobian for the Screw‐Based Method
5.15 Calculation of the Jacobians of Parallel Robots
5.16 Design Projects
5.17 Summary
References
Problems
6 Dynamic and Force Analysis
6.1 Introduction
6.2 Lagrangian Mechanics: A Short Overview
6.3 Effective Moments of Inertia
6.4 Dynamic Equations for Multiple‐DOF Robots
6.5 Static Force Analysis of Robots
6.6 Transformation of Forces and Moments between Coordinate Frames
6.7 Design Project
6.8 Summary
References
Problems
7 Trajectory Planning
7.1 Introduction
7.2 Path vs. Trajectory
7.3 Joint‐Space vs. Cartesian‐Space Descriptions
7.4 Basics of Trajectory Planning
7.5 Joint‐Space Trajectory Planning
7.6 Cartesian‐Space Trajectories
7.7 Continuous Trajectory Recording
7.8 Design Project
7.9 Summary
References
Problems
8 Motion Control Systems
8.1 Introduction
8.2 Basic Components and Terminology
8.3 Block Diagrams
8.4 System Dynamics
8.5 Laplace Transform
8.6 Inverse Laplace Transform
8.7 Transfer Functions
8.8 Block Diagram Algebra
8.9 Characteristics of First‐Order Transfer Functions
8.10 Characteristics of Second‐Order Transfer Functions
8.11 Characteristic Equation: Pole/Zero Mapping
8.12 Steady‐State Error
8.13 Root Locus Method
8.14 Proportional Controllers
8.15 Proportional‐Plus‐Integral Controllers
8.16 Proportional‐Plus‐Derivative Controllers
8.17 Proportional‐Integral‐Derivative Controller (PID)
8.18 Lead and Lag Compensators
8.19 Bode Diagram and Frequency‐Domain Analysis
8.20 Open‐Loop vs. Closed‐Loop Applications
8.21 Multiple‐Input and Multiple‐Output Systems
8.22 State‐Space Control Methodology
8.23 Digital Control
8.24 Nonlinear Control Systems
8.25 Electromechanical Systems Dynamics: Robot Actuation and Control
8.26 Design Projects
8.27 Summary
References
Problems
9 Actuators and Drive Systems
9.1 Introduction
9.2 Characteristics of Actuating Systems
9.3 Comparison of Actuating Systems
9.4 Hydraulic Actuators
9.5 Pneumatic Devices
9.6 Electric Motors
9.7 Microprocessor Control of Electric Motors
9.8 Magnetostrictive Actuators
9.9 Shape‐Memory Type Metals
9.10 Electroactive Polymer Actuators (EAPs)
9.11 Speed Reduction
9.12 Other Systems
9.13 Design Projects
9.14 Summary
References
Problems
10 Sensors
10.1 Introduction
10.2 Sensor Characteristics
10.3 Sensor Utilization
10.4 Position Sensors
10.5 Velocity Sensors
10.6 Acceleration Sensors
10.7 Force and Pressure Sensors
10.8 Torque Sensors
10.9 Microswitches
10.10 Visible Light and Infrared Sensors
10.11 Touch and Tactile Sensors
10.12 Proximity Sensors
10.13 Range Finders
10.14 Sniff Sensors
10.15 Vision Systems
10.16 Voice‐Recognition Devices
10.17 Voice Synthesizers
10.18 Remote Center Compliance (RCC) Device
10.19 Design Project
10.20 Summary
References
11 Image Processing and Analysis with Vision Systems
11.1 Introduction
11.2 Basic Concepts
11.3 Fourier Transform and Frequency Content of a Signal
11.4 Frequency Content of an Image: Noise and Edges
11.5 Resolution and Quantization
11.6 Sampling Theorem
11.7 Image‐Processing Techniques
11.8 Histograms of Images
11.9 Thresholding
11.10 Spatial Domain Operations: Convolution Mask
11.11 Connectivity
11.12 Noise Reduction
11.13 Edge Detection
11.14 Sharpening an Image
11.15 Hough Transform
11.16 Segmentation
11.17 Segmentation by Region Growing and Region Splitting
11.18 Binary Morphology Operations
11.19 Gray Morphology Operations
11.20 Image Analysis
11.21 Object Recognition by Features
11.22 Depth Measurement with Vision Systems
11.23 Specialized Lighting
11.24 Image Data Compression
11.25 Color Images
11.26 Heuristics
11.27 Applications of Vision Systems
11.28 Design Project
11.29 Summary
References
Problems
12 Fuzzy Logic Control
12.1 Introduction
12.2 Fuzzy Control: What Is Needed
12.3 Crisp Values vs. Fuzzy Values
12.4 Fuzzy Sets: Degrees of Truth and Membership
12.5 Fuzzification
12.6 Fuzzy Inference Rules
12.7 Defuzzification
12.8 Simulation of a Fuzzy Logic Controller
12.9 Applications of Fuzzy Logic in Robotics
12.10 Design Project
12.11 Summary
References
Problems
Appendix A
A.1 Matrix Algebra and Notation: A Review
A.2 Calculation of an Angle from its Sine, Cosine, or Tangent
A.3 Solving Equations with Sine and Cosine
Problems
Appendix B
Image‐Acquisition Systems
Index
End User License Agreement
Chapter 2
Table 2.1 D‐H parameters table.
Table 2.2 D‐H parameters table for Example 2.24.
Table 2.3 D‐H parameters table for Example 2.25.
Table 2.4 Parameters for the robot from Example 2.25.
Table 2.5 Parameters for the robot from Example 2.28.
Table 2.6 Parameters for the robot from Example 2.29.
Table 2.7 The parameters table for the Stanford arm.
Chapter 3
Table 3.1 Parameters table for the articulated arm in Figure 3.9.
Table 3.2 Parameters table for the Stanford arm from Example 3.4.
Table 3.3 Parameters table for the robot from Example 3.5.
Chapter 4
Table 4.1 Parameters and DOF values of the mechanisms in Figure 4.4 based on ...
Chapter 5
Table 5.1 Parameters for the robot from Example 2.26.
Table 2.6 The parameters table for the Stanford arm (repeated).
Chapter 7
Table 7.1 The coordinates and joint angles for Example 7.6.
Table 7.2 The hand frame coordinates and joint angles for the robot in Exampl...
Chapter 8
Table 8.1 Force‐voltage analogy between mechanical and electrical systems.
Table 8.2 Force‐current analogy between mechanical and electrical systems.
Table 8.3 Laplace transform pairs.
Table 8.4 Equivalent block diagrams.
Table 8.5 The angles of asymptotes based on their number.
Chapter 9
Table 9.1 Summary of actuator characteristics.
Table 9.2 (a) Full‐step and (b) half‐step sequence for stepper motors.
Chapter 10
Table 10.1 Binary and gray codes.
Chapter 11
Table 11.1 Possible L‐R schemes based on the direction of search.
Table 11.2 Lines formed in the m,c‐plane corresponding to the points in the x, ...
Chapter 12
Table 12.1 Symbolic representation of the rules from Example 12.1.
Chapter 1
Figure 1.1 (a) A Dalmec manipulator; (b) a KUKA robot. Although they are bot...
Figure 1.2 The 6‐axis Yaskawa GP7 robot body.
Figure 1.3 End effectors: (a) A FANUC robot.
Figure 1.4 The robot can move along a track, adding a degree of freedom to t...
Figure 1.5 Common robot coordinate frames for serial robots.
Figure 1.6 (a) A DENSO SCARA robot.
Figure 1.7 A robot's World, Joint, and Tool reference frames. Most robots ma...
Figure 1.8 Typical approximate workspaces for common robot configurations.
Figure 1.9 Robots performing loading and unloading of parts.
Figure 1.10 (a) A parallel robot stacking cookies. (b) A serial robot handli...
Figure 1.11 A robot welding parts together.
Figure 1.12 A robot engaged in sorting and inspection of manufactured parts....
Figure 1.13 (a) A Sawyer robot inserting electronic parts into a circuit wit...
Figure 1.14 A robot engaged in a manufacturing task.
Figure 1.15 The da Vinci surgical system.
Figure 1.16 The EksoGT rehabilitation suit.
Figure 1.17 Finger‐spelling hand for communication with deaf‐blind individua...
Figure 1.18 NASA Sojourner.
Figure 1.19 Atlas humanoid robot.
Figure 1.20 Dual‐arm robots.
Figure 1.21 The EksoVest suit.
Figure 1.22 The Prosthesis exoskeletal suit.
Figure 1.23 A mobile transport robot.
Figure 1.24 Boston Dynamics' Spot robot.
Figure 1.25 Collaborative robots are designed to detect the presence of huma...
Chapter 2
Figure 2.1 A 1‐DOF, closed‐loop, 4‐bar mechanism.
Figure 2.2 Closed‐loop (a) versus open‐loop (b) mechanisms.
Figure 2.3 A typical parallel manipulator.
Figure 2.4 Representation of a point in space.
Figure 2.5 Representation of a vector in space.
Figure 2.6 The normal‐, orientation‐, and approach‐axes of a moving frame.
Figure 2.7 Representation of a frame at the origin of the reference frame.
Figure 2.8 Representation of a frame in a frame.
Figure 2.9 An example of representation of a frame.
Figure 2.10 Representation of an object in space.
Figure 2.11 Representation of a pure translation in space.
Figure 2.12 Coordinates of a point in a rotating frame before and after rota...
Figure 2.13 Coordinates of a point relative to the reference frame and rotat...
Figure 2.14 Rotation of a frame relative to the x‐axis of the reference fram...
Figure 2.15 Effects of three successive transformations.
Figure 2.16 Changing the order of transformations will change the final resu...
Figure 2.17 Transformations relative to the current frames.
Figure 2.18 The Universe, robot, hand, part, and end effector frames.
Figure 2.19 The hand frame of the robot relative to the reference frame.
Figure 2.20 Cartesian coordinates.
Figure 2.21 Cylindrical coordinates.
Figure 2.22 Spherical coordinates.
Figure 2.23 Articulated coordinates.
Figure 2.24 RPY rotations about the current axes.
Figure 2.25 Cylindrical and RPY coordinates from Example 2.21.
Figure 2.26 Euler rotations about the current axes.
Figure 2.27 The Denavit‐Hartenberg representation of a general purpose joint...
Figure 2.28 A simple 2‐axis articulated robot arm.
Figure 2.29 The 3‐DOF robot from Example 2.25.
Figure 2.30 Robot from Example 2.24 in reset position.
Figure 2.31 A simple 6‐DOF articulate robot.
Figure 2.32 Reference frames for the simple 6‐DOF articulate robot.
Figure 2.33 Line drawing of the reference frames for the 6‐DOF articulate ro...
Figure 2.34 Schematic drawing of the Stanford arm.
Figure 2.35 The 4‐axis robot from Example 2.28.
Figure 2.36 The 4‐axis robot from Example 2.29.
Figure 2.37 The path between points A and B is divided into many small secti...
Figure 2.38 Cooperative robots may be two robots working together or a two‐a...
Figure 2.39 Dual‐arm robots handling a part.
Figure 2.40 An example of a robot in a degenerate position.
Figure 2.41 The frames of the Stanford arm.
Figure 2.42 Cal Poly finger‐spelling hand. Supported by the Smith‐Kettlewell...
Figure 2.43 A stair‐climbing robot. Designed by Jeremy DePangher, supported ...
Figure 2.44 Two simple designs for a joint.
Figure 2.45 A simple 3‐DOF robot design that may be used for the design proj...
Figure 2.46 Schematic representation of a 3‐DOF mobile robot.
Figure 2.47 Isometric grid.
Chapter 3
Figure 3.1 A screw axis representing rotations and translations.
Figure 3.2 Rotation about a screw axis through the origin of the reference f...
Figure 3.3 Rotation about a screw axis through the origin of the reference f...
Figure 3.4 Homogeneous transformation about a general screw axis.
Figure 3.5 The screw axis from Example 3.2.
Figure 3.6 The screw axis from Example 3.3.
Figure 3.7 Successive screw transformations.
Figure 3.8 Successive screw axes can be used to represent a robot.
Figure 3.9 An articulated robot arm in its reset position represented by scr...
Figure 3.10 The Stanford arm from Example 3.4.
Figure 3.11 The robot from Example 3.5.
Figure P.3.1
Figure P.3.2
Figure P.3.4
Figure P.3.5
Figure P.3.7
Figure P.3.8
Figure P.3.9
Figure P.3.10
Figure P.3.11
Figure P.3.12
Chapter 4
Figure 4.1 An Omron‐Adept Hornet‐565 parallel robot.
Figure 4.2 Kinematic chains and loops.
Figure 4.3 A mechanism may possess a passive degree of freedom, which should...
Figure 4.4 The DOF of different types of mechanisms.
Figure 4.5 Schematics of two possible planar parallel robots.
Figure 4.6 Spherical Agile Eye and Agile Wrist parallel robotic mechanisms....
Figure 4.7 A 4‐DOF Omron‐Adept Quattro 650‐800 robot.
Figure 4.8 An alternative design for a 6‐DOF parallel robot with all revolut...
Figure 4.9 Stewart‐Gough type 3‐3, 6‐3, and 6‐6 parallel robots.
Figure 4.10 D‐H frame representation for parallel robots.
Figure 4.11 A 3‐RPR planar parallel robot.
Figure 4.12 The robot from Example 4.1.
Figure 4.13 A 3‐RRR planar parallel robot.
Figure 4.14 Planar 3‐RRR parallel robot from Example 4.2.
Figure 4.15 Complementary configurations of linkages for Example 4.2.
Figure 4.16 A 6‐6 type Stewart platform.
Figure 4.17 A general Stewart‐Gough platform rotating and moving down withou...
Figure 4.18 A generic 6‐3 Stewart‐Gough platform.
Figure 4.19 Typical 3‐DOF industrial‐type parallel robot.
Figure 4.20 A typical 3‐DOF RSS‐type parallel robot.
Figure 4.21 The resulting actuating angles for θ1 for Example 4.5.
Figure 4.22 The resulting actuating angles for θ3 for Example 4.6.
Figure 4.23 A 4‐DOF RSS‐type parallel robot and its schematic.
Figure 4.24 The resulting actuating angles for θ1 for Example 4.7.
Figure 4.25 A 3‐DOF PSS‐type parallel robot.
Figure 4.26 Alternative designs for parallel robots. (a) Yaskawa specialty d...
Figure P.4.1
Chapter 5
Figure 5.1 (a) 2‐DOF planar mechanism; (b) velocity diagram.
Figure 5.2 Resulting motions of the robot are dependent on the geometry of t...
Figure 5.3 Differential motions versus non‐differential motions.
Figure 5.4 (a) Differential motions of a frame; (b) differential motions of ...
Figure 5.5 Differential rotations about a general axis q.
Figure 5.6 Reference frames for the simple 6‐DOF articulate robot.
Figure 3.8 (Repeated here.)
Figure 5.7 A generic 6‐DOF articulated arm.
Figure 3.10 (Repeated here with modified frames.)
Figure 5.8 A generic 3‐RRR planar parallel robot.
Figure 5.9 A generic 6‐6 Stewart‐Gough parallel robot.
Chapter 6
Figure 6.1 Force‐mass‐acceleration and torque‐inertia‐angular‐acceleration r...
Figure 6.2 Schematic of a simple cart‐spring system.
Figure 6.3 Free‐body diagram for the cart‐spring system.
Figure 6.4 Schematic of a cart‐pendulum system.
Figure 6.5 A two‐link mechanism with concentrated masses.
Figure 6.6 A 2‐DOF robot arm.
Figure 6.7 A 2‐DOF polar robot arm.
Figure 6.8 A rigid body in 3D motion and in plane motion.
Figure 6.9 The 2‐DOF robot arm from Example 6.4.
Figure 6.10 Equivalent force‐moment systems in two different frames.
Chapter 7
Figure 7.1 Sequential robot movements in a path versus trajectory.
Figure 7.2 Sequential motions of a robot to follow a straight line.
Figure 7.3 Cartesian‐space trajectory problems. (a) The trajectory specified...
Figure 7.4 Joint‐space, non‐normalized movements of a 2‐DOF robot.
Figure 7.5 Joint‐space, normalized movements of a robot with 2 DOF.
Figure 7.6 Cartesian‐space movements of a 2‐DOF robot.
Figure 7.7 Trajectory planning with an acceleration/deceleration regimen.
Figure 7.8 Blending of different motion segments in a path.
Figure 7.9 An alternative scheme for ensuring that the robot goes through a ...
Figure 7.10 Joint positions, velocities, and accelerations for Example 7.1....
Figure 7.11 Joint positions, velocities, and accelerations for Example 7.2....
Figure 7.12 Joint positions, velocities, and accelerations for Example 7.3....
Figure 7.13 Scheme for linear segments with parabolic blends.
Figure 7.14 Position, velocity, and acceleration graphs for joint 1 from Exa...
Figure 7.15 The position, velocity, and acceleration curves for the motion o...
Figure 7.16 Transformation between initial and final locations in Cartesian‐...
Figure 7.17 With the continuous path on or off, the resulting motion differs...
Figure 7.18 The joint positions for the robot in Example 7.6.
Figure 7.19 Robot from Example 7.7 and its coordinate frames.
Figure 7.20 Joint angles for Example 7.7.
Figure P.7.1
Chapter 8
Figure 8.1 Basic components of a control system.
Figure 8.2 A simple block diagram.
Figure 6.2 (Repeated.)
Figure 6.3 (Repeated.)
Figure 8.3 Representation of the dynamic behavior of a motor.
Figure 8.4 A mechanical system and an electrical system.
Figure 8.5 Free‐body diagram for Example 8.1.
Figure 8.6 Representation of a hydraulic lift.
Figure 8.7 The system for Example 8.9.
Figure 8.8 The block diagram for a simple control system.
Figure 8.9 The system from Example 8.10.
Figure 8.10 A mass‐spring‐damper system with position feedback.
Figure 8.11 A generic leg design for jumping robots.
Figure 8.12 The block diagram for Example 8.12.
Figure 8.13 The time response of a first‐order system to a step function.
Figure 8.14 A closed‐loop first‐order system.
Figure 8.15 The response of a second‐order transfer function to a step funct...
Figure 8.16 A typical response of a second‐order transfer function and its c...
Figure 8.17 Pole‐zero mapping of the roots in Example 8.13.
Figure 8.18 An underdamped system with its pair of complex conjugate poles a...
Figure 8.19 The response of the system changes as the poles move in differen...
Figure 8.20 A typical control loop.
Figure 8.21 The typical control loop.
Figure 8.22 A typical control system.
Figure 8.23 The root locus for the system in Figure 8.22b.
Figure 8.24 The vectors between a point in the Real‐Imaginary plane and the ...
Figure 8.25 The root locus for Example 8.17.
Figure 8.26 The root locus for Example 8.18.
Figure 8.27 The root locus for Example 8.19.
Figure 8.28 The step response of the system from Example 8.19.
Figure 8.29 The response of the system from Example 8.19 with critical dampi...
Figure 8.30 A proportional hydraulic servo valve.
Figure 8.31 A proportional‐plus‐integral controller.
Figure 8.32 The root locus for Example 8.21.
Figure 8.33 The step responses for the system from Example 8.21.
Figure 8.34 A proportional‐plus‐derivative controller.
Figure 8.35 No roots are available for settling time less than 0.9.
Figure 8.36 The root locus for the system from Example 8.22.
Figure 8.37 The response of the system from Example 8.22 to a step function
Figure 8.38 A proportional‐integral‐derivative (PID) controller.
Figure 8.39 The root locus of the system from Example 8.23 with a proportion...
Figure 8.40 The response of the system from Example 8.23 to a step function....
Figure 8.41 A typical Bode diagram for a second‐order system.
Figure 8.42 A typical feedback control system.
Figure 8.43 Multiple‐input, single‐output system from Example 8.24.
Figure 8.44 MIMO system from Example 8.25.
Figure 8.4 (Repeated.)
Figure 8.45 A system with three states.
Figure 8.46 The representation of a state‐space system.
Figure 8.47 The application of estimators in control systems.
Figure 8.48 The electromechanical system for Example 8.27.
Figure 8.49 A typical digital system.
Figure 8.50 Examples of nonlinear behavior of system elements.
Figure 8.51 The pendulum from Example 8.30.
Figure 8.52 The feedback loop of a robot controller.
Figure 8.53 An electromechanical actuating system and its model.
Figure 8.54 The approximate response of the motor from Example 8.31.
Figure 8.55 An electromechanical system with a tachometer feedback sensor.
Figure 8.56 Completed block diagram for the robot's actuating motor.
Figure P.8.5
Figure P.8.6
Figure P.8.7
Figure P.8.8
Figure P.8.9
Chapter 9
Figure 9.1 Inertia and torque relationship between a motor and a load.
Figure 9.2 Free‐body diagrams of the motor and the load.
Figure 9.3 Schematic drawing of the system from Example 9.1.
Figure 9.4 A rotary hydraulic actuator. This actuator can be directly attach...
Figure 9.5 Schematic of a hydraulic system and its components.
Figure 9.6 A wire carrying a current, placed within a magnetic field, will e...
Figure 9.7 Heat‐dissipation path of motors.
Figure 9.8 The stator, rotor, commutators and the brushes of a DC motor.
Figure 9.9 Schematic diagram showing a DC motor armature circuit.
Figure 9.10 The output torque and power of a DC motor versus its angular vel...
Figure 9.11 Eliminating the soft iron from a rotor makes it lightweight with...
Figure 9.12 A disk (pancake) motor. The rotor has no iron core, and, consequ...
Figure 9.13 Frameless motors can be directly integrated into the joint struc...
Figure 9.14 An AC motor and center‐tapped AC motor winding.
Figure 9.15 Brushless DC motors.
Figure 9.16 Disk drive voice coil actuator.
Figure 9.17 (a) A servomotor and (b) the schematic of a servomotor controlle...
Figure 9.18 Basic principle of operation of a stepper motor. As the coils in...
Figure 9.19 A typical rotor of a can‐stack stepper motor.
Figure 9.20 (a) magnetic flux pattern on a refrigerator magnet and the rotor...
Figure 9.21 The stator of a can‐stack stepper motor.
Figure 9.22 Can‐stack stator windings and plates.
Figure 9.23 Center tapping of a coil allows changing the polarity of the mag...
Figure 9.24 The cross section of a can‐stack stepper motor. Each S‐N arc sho...
Figure 9.25 The stator (left) and rotor (right) of a hybrid stepper motor.
Figure 9.26 Application of unequal divisions for measuring lengths as in a c...
Figure 9.27 Basic operation of a hybrid stepper motor.
Figure 9.28 Schematic drawing for unipolar and bipolar drive circuits.
Figure 9.29 Stepper motor lead configurations.
Figure 9.30 A typical speed‐torque curve for a stepper motor.
Figure 9.31 Application of (a) a resistor; (b) a diode; and (c) a Zener diod...
Figure 9.32 Schematic drawing of the application of a microcontroller in dri...
Figure 9.33 Schematic of a typical stepper motor driver.
Figure 9.34 Schematic drawing of a timer circuit that creates a pulse train,...
Figure 9.35 (a) Application of a power transistor to control the supplied vo...
Figure 9.36 Pulse width modulation timing.
Figure 9.37 Sine wave generation with pulse width modulation.
Figure 9.38 Directional control of a motor using switches in an H‐bridge.
Figure 9.39 Application of H‐bridge in controlling the direction of rotation...
Figure 9.40 Schematic drawing of a planetary gear train.
Figure 9.41 Schematic of the strain wave gearing train.
Figure 9.42 A Harmonic Drive strain wave gear.
Figure 9.43 Nutating gears train.
Figure 9.44 Commercial actuators are available for more sophisticated and ad...
Figure 9.45 A schematic depiction of a possible arrangement for the cylinder...
Figure 9.46 A sphere robot.
Figure 9.47 A possible design for a crawling robot.
Figure P.9.1
Figure P.9.4
Chapter 10
Figure 10.1 Accuracy versus repeatability.
Figure 10.2 (a) Basic sensor circuit; (b) application of a capacitor, added ...
Figure 10.3 A potentiometer as a position sensor.
Figure 10.4 (a) A simple rotary incremental encoder disk mounted on a motor ...
Figure 10.5 Output signals of an incremental encoder.
Figure 10.6 Each portion of the absolute encoder disk has a unique signature...
Figure 10.7 Linear variable differential transformer.
Figure 10.8 Schematic of a resolver.
Figure 10.9 Schematic drawing of a magnetostrictive displacement sensor.
Figure 10.10 IBM 7565 magnetostrictive displacement transducers.
Figure 10.11 Shaft‐angle measuring device based on a tunnel‐diode oscillator...
Figure 10.12 Schematics of differentiating and integrating R‐C circuit...
Figure 10.13 A typical force‐sensing resistor (FSR). The resistance of this ...
Figure 10.14 (a) A strain gauge and (b) a Wheatstone bridge.
Figure 10.15 Arrangement of three pairs of strain gauges along the three maj...
Figure 10.16 Typical industrial force/torque sensors.
Figure 10.17 The torque can be found by measuring the changes in the frequen...
Figure 10.18 Tactile sensors are generally a collection of simple touch sens...
Figure 10.19 A tactile sensor can provide information about the object.
Figure 10.20 Optical proximity sensor.
Figure 10.21 An alternative optical proximity sensor.
Figure 10.22 Ultrasonic proximity sensors.
Figure 10.23 Triangulation method for range measurement. The receiver will o...
Figure 10.24 A commercial LiDAR device, and a scene captured by it.
Figure 10.25 Misalignment of assembling elements.
Figure 10.26 Instantaneous centers of zero velocity for a 4‐bar mechanism.
Figure 10.27 Special 4‐bar mechanisms, the basis for a remote‐center complia...
Figure 10.28 Schematic depiction of how an RCC device operates.
Figure 10.29 A commercial RCC device.
Chapter 11
Figure 11.1 Examples of how gray intensities are created in printed images. ...
Figure 11.2 An image and the binary representation of its first row using 4 ...
Figure 11.3 Time‐domain and frequency‐domain plots of a simple sine function...
Figure 11.4 Sine functions in the time and frequency domains for a successiv...
Figure 11.5 Two signals and their frequency spectrums.
Figure 11.6 Noise and edge information in an intensity diagram of an image. ...
Figure 11.7 Effect of different sampling rates on an image at (a) 480 × 320,...
Figure 11.8 An image at different quantization levels of 2, 4, 8, and 256 gr...
Figure 11.9 A low‐resolution (16 × 16) image.
Figure 11.10 (a) Sinusoidal signal with a frequency of f; (b) sampled amplit...
Figure 11.11 Reconstruction of signals from the sampled data, too. More than...
Figure 11.12 An inappropriate sampling rate may completely miss important da...
Figure 11.13 The original signal in (a) is sampled at a sampling rate that i...
Figure 11.14 The image in Figure 11.9, presented at higher resolutions of (a...
Figure 11.15 Effect of histogram equalization in improving an image.
Figure 11.16 Increasing the contrast in an image expands the histogram to in...
Figure 11.17 Thresholding an image (a) with 256 gray levels at two different...
Figure 11.18 An image and its histogram, binarized at different threshold le...
Figure 11.19 Images and histograms for Example 11.4.
Figure 11.20 When a convolution mask (kernel) is superimposed on an image, i...
Figure 11.21 The representation of an image and a mask.
Figure 11.22 An example of a convolution mask.
Figure 11.23 (a) Convolving the mask onto the cells of the image; (b) the re...
Figure 11.24 The mask and image for Example 11.6.
Figure 11.25 The result of convolving the mask on the image from Example 11....
Figure 11.26 Neighborhood connectivity of pixels.
Figure 11.27 The image for Example 11.7.
Figure 11.28 The results of the connectivity searches for Example 11.7.
Figure 11.29 Neighborhood averaging mask.
Figure 11.30 Neighborhood averaging of an image.
Figure 11.31 5 × 5 and 3 × 3 Gaussian averaging filters.
Figure 11.32 (a) The original image; (b) is the same image corrupted with a ...
Figure 11.33 Application of a median filter.
Figure 11.34 Edge detection with first and second derivatives.
Figure 11.35 Gradient of image intensity.
Figure 11.36 Intensity gradient between successive pixels.
Figure 11.37 The Laplacian‐1 high‐pass edge detector mask.
Figure 11.38 Other high‐pass filters.
Figure 11.39 The Sobel, Roberts, and Prewitt edge detectors.
Figure 11.40 (a) An image and its edges from (b) Laplacian‐1; (c) Laplacian‐...
Figure 11.41 Left‐right search technique for edge detection [7].
Figure 11.42 These masks emphasize the vertical, horizontal, and diagonal li...
Figure 11.43 (a) An original image with effects of (b) a vertical emphasis m...
Figure 11.38 (Repeated.)
Figure 11.44 (a) The original image; (b) after an averaging mask was applied...
Figure 11.45 The Hough transformation from the x,y‐plane into the r,θ‐...
Figure 11.46 Hough transform.
Figure 11.47 Transformation of a point in the x,y‐plane into a line in the H...
Figure 11.48 Hough transform for Example 11.8.
Figure 11.49 Representation of objects such as a tree or a house with models...
Figure 11.50 An image captured by a LiDAR.
Figure 11.51 Region growing based on a search technique. With a +4‐connectiv...
Figure 11.52 The result of a search for 4‐connectivity for Example 11.9.
Figure 11.53 The binary image of a bolt and its stick (skeleton) representat...
Figure 11.54 The union between two geometries creates dilation.
Figure 11.55 The subtraction of two geometries creates erosion.
Figure 11.56 The result of the union of the two shapes reduces the appearanc...
Figure 11.57 The application of union and subtraction operations for locatin...
Figure 11.58 The threads of the bolts are removed by a triple application of...
Figure 11.59 Effect of dilation operations. Here, the objects in (a) were su...
Figure 11.60 Effect of erosion operation on objects in (a) with (b) 3 and (c...
Figure 11.61 The effect of skeletonization on an image without thickening. T...
Figure 11.62 The skeleton of the objects in (a) after (b) the application of...
Figure 11.63 As a result of a fill operation, the hole in the nut is filled ...
Figure 11.64 (a) Aspect ratio of an object; (b) minimum aspect ratio.
Figure 11.65 Calculation of the moment of an image. For each pixel that belo...
Figure 11.66 Small differences between objects or small asymmetry in an obje...
Figure 11.67 Image used for Example 11.12.
Figure 11.68 Image used for Example 11.13.
Figure 11.69 Image used for Example 11.14.
Figure 11.70 Correspondence problem in stereo imaging.
Figure 11.71 Application of specialized lighting in depth measurement. A pla...
Figure 11.72 A blank image grid.
Figure P.11.2
Figure P.11.3
Figure P.11.5
Figure P.11.6
Figure P.11.7
Figure P.11.8
Figure P.11.9
Figure P.11.15
Figure P.11.16
Figure P.11.21
Figure P.11.22
Figure P.11.23
Figure P.11.24
Figure P.11.25
Figure P.11.27
Figure P.11.29
Figure P.11.32
Figure P.11.33
Figure P.11.34
Figure P.11.35
Chapter 12
Figure 12.1 A Gaussian membership function.
Figure 12.2 A trapezoidal membership function.
Figure 12.3 A triangular membership function.
Figure 12.4 Z‐ and S‐shaped membership functions.
Figure 12.5 Fuzzy sets for autonomous vehicle speed variable.
Figure 12.6 Graphical representation of rules.
Figure 12.7 Defuzzification based on Mamdani's method.
Figure 12.8 Input and output fuzzy sets for Example 12.1.
Figure 12.9 Graphical representation of some of the rules in Example 12.1. R...
Figure 12.10 Application of the Mamdani inference method.
Figure 12.11 The 3D output result of the wheelchair in Example 12.1 generate...
Figure 12.12 The gaps between different sets of input membership functions a...
Figure 12.13 The input and output fuzzy set membership functions were modifi...
Figure 12.14 Rethink Robotics Sawyer robot.
Figure 12.15 Fuzzy sets showing the input and the output from Example 12.2....
Figure 12.16 The result of the simulation from Example 12.2.
Appendix A
Figure A.1 Trigonometric functions.
Appendix B
Figure B.1 Image acquisition with a digital camera involves the development ...
Figure B.2 Image data collection model.
Third Edition
Saeed B. Niku, Ph.D., P.E.
California Polytechnic State University, California, USA
This edition first published 2020
© 2020 John Wiley & Sons Ltd
Edition History: Prentice Hall (1e, 2001) and John Wiley & Sons (2e, 2011)
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.
The right of Saeed B. Niku to be identified as the author of this work has been asserted in accordance with law.
Registered Offices
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA
John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK
Editorial Office
The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK
For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.
Wiley also publishes its books in a variety of electronic formats and by print‐on‐demand. Some content that appears in standard print versions of this book may not be available in other formats.
Limit of Liability/Disclaimer of Warranty
MATLAB® is a trademark of The MathWorks, Inc. and is used with permission. The MathWorks does not warrant the accuracy of the text or exercises in this book. This book's use or discussion of MATLAB® software or related products does not constitute endorsement or sponsorship by The MathWorks of a particular pedagogical approach or particular use of the MATLAB® software.
While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
Library of Congress Cataloging‐in‐Publication Data
Names: Niku, Saeed B. (Saeed Benjamin), author.
Title: Introduction to robotics: analysis, control, applications / Saeed B. Niku.
Description: Third edition. | Hoboken: Wiley, 2020. | Includes bibliographical references and index.
Identifiers: LCCN 2019024969 (print) | LCCN 2019024970 (ebook) | ISBN 9781119527626 (cloth) | ISBN 9781119527596 (adobe pdf) | ISBN 9781119527602 (epub)
Subjects: LCSH: Robotics.
Classification: LCC TJ211 .N547 2019 (print) | LCC TJ211 (ebook) | DDC 629.8/52–dc23
LC record available at https://lccn.loc.gov/2019024969
LC ebook record available at https://lccn.loc.gov/2019024970
Cover Design: Wiley
Cover Images: 3D outline Robotic arm © cherezoff/Getty Images; Blue abstract modern background © Pobytov/Getty Images
Dedicated to
Shohreh, Adam, and Alan Niku
and to
Sara Niku and the memory of Saleh Niku
This new third edition of the Introduction to Robotics textbook is the culmination of over a year of intense work. If the number of instructors who adopted the previous edition, the number of countries in which it sold well, and the number of languages into which it was translated indicate that it was a good book, I hope that this new edition is even better. It has two completely new chapters on screw‐based mechanics and parallel robots, many new examples and homework problems, and many new subjects in most chapters, and the writing has been edited and streamlined throughout.
And still the old adage from one of my former students, whose name I have long forgotten, applies: in the life of any product there comes a time when you have to shoot the designer and go into production. For a book, there comes a time when you have to shoot the author and go into publication.
The intention behind writing this book was, and still is, to cover most subjects that an engineering student or a practicing engineer who intends to learn about robotics may need to know, whether to design a robot, to integrate a robot in appropriate applications, or to analyze a robot. As such, it covers all necessary fundamentals of robotics, robot components and subsystems, and applications.
The book is intended for senior or introductory graduate courses in robotics as well as for practicing engineers who would like to learn about robotics. Although the book covers a fair amount of mechanics and kinematics of both serial and parallel robots, using both the Denavit‐Hartenberg approach and screw‐based mechanics, it also covers microprocessor applications, control systems, vision systems, sensors, and actuators. Therefore, it can easily be used by mechanical engineers, electronic and electrical engineers, computer engineers, and engineering technologists. With the chapter on control theory, even a student who has not had a controls course can learn enough to understand robotic control and design.
The book consists of 12 chapters. Chapter 1 covers introductory subjects that familiarize the reader with the necessary background information. This includes some historical information, robot components, robot characteristics, robot languages, and robotic applications. Chapter 2 explores the forward and inverse kinematics of serial robots, including frame representations, transformations, position and orientation analysis, as well as the Denavit‐Hartenberg representation of robot kinematics. Chapter 3 covers the kinematics of serial robots with screw‐based mechanics. Chapter 4 discusses parallel robots of many different types. Chapter 5 continues with differential motions and velocity analysis of robots and frames. Chapter 6 presents an analysis of robot dynamics and forces. Lagrangian mechanics is used as the primary method of analysis and development for this chapter. Chapter 7 discusses methods of path and trajectory planning, both in joint space and in Cartesian space. Chapter 8 covers fundamentals of control engineering, including analysis and design tools. Among other things, it discusses the root locus; proportional, derivative, and integral control; as well as electromechanical system modeling. It also includes an introduction to multiple input, multiple output (MIMO) systems, digital systems, and nonlinear systems. However, the assumption is that students will need additional instruction to be proficient in actually designing systems. One chapter on this subject cannot be adequate, but can nicely serve as an introduction for majors in which a separate course in control engineering is not offered. Chapter 9 covers actuators, including hydraulic devices, electric motors such as DC servo motors and stepper motors, pneumatic devices, as well as many other novel actuators. It also covers microprocessor control of these actuators. Although this is not a complete mechatronics book, it does cover a fair amount of mechatronics. Except for the design of a microprocessor, many aspects of mechatronic applications are covered in this chapter. Chapter 10 is a discussion of sensors used in robotics and robotic applications. Chapter 11 covers vision systems, including many different techniques for image processing and image analysis. Chapter 12 discusses the basic principles of fuzzy logic and its applications in microprocessor control and robotics. This coverage is not intended to be a complete and thorough analysis of fuzzy logic, but an introduction. It is believed that students and engineers who find it interesting will continue on their own. Appendix A is a quick review of matrix algebra and some other mathematical facts that are needed throughout this book. Appendix B discusses digital image acquisition.
With the additional new chapters on screw‐based mechanics and parallel robots, it is almost impossible to cover everything in the book in a quarter‐based class with 30 lectures in 10 weeks. Therefore, for quarter‐based classes, the instructor must make some choices as to which subjects should be included. Depending on other classes the student takes, certain material may be skipped. For example, students at Cal Poly, San Luis Obispo all take a required controls class, and most have a mechatronics class. Therefore, we can skip chapters on these subjects. However, for a semester‐based 14‐week class with 3 lectures per week, there is ample material and time to cover the entirety of the book.
The following breakdown can be used as a model for setting up a course in robotics in a quarter system. In this case, certain subjects must be eliminated or shortened, as shown:
Introductory material and review: 1 lecture
Kinematics of position: 6 lectures
Screw‐based mechanics: 2 lectures
Parallel robots: 3 lectures
Differential motions: 4 lectures
Robot dynamics and force control: 2 lectures
Path and trajectory planning: 1 lecture
Actuators: 2 lectures
Sensors: 2 lectures
Vision systems: 5 lectures
Fuzzy logic: 1 lecture
Exam: 1 lecture
Alternately, for a 14‐week long semester course with 3 lectures per week, the course may be set up as follows:
Introductory material and review: 2 lectures
Kinematics of position: 7 lectures
Screw‐based mechanics: 2 lectures
Parallel robots: 3 lectures
Differential motions: 5 lectures
Robot dynamics and force control: 4 lectures
Path and trajectory planning: 3 lectures
Robot control and modeling: 4 lectures
Actuators: 2 lectures
Sensors: 2 lectures
Vision systems: 5 lectures
Fuzzy logic: 1 lecture
Exam: 1 lecture
The book also features design projects that start in Chapter 2 and continue throughout the book. At the end of each chapter, students are directed to continue their design projects using the subjects covered in that chapter. Therefore, by the end of the book, they can complete an entire project.
I would like to thank all the people who, in one way or another, have helped me. This includes my colleagues, including Drs. Bill Murray, Charles Birdsong, Lynne Slivovsky, and John Ridgely; all the countless individuals who did the research, development, and hard work that came before my time and that enabled me to learn the subject myself; all the users and students and anonymous reviewers who made countless suggestions to improve each edition; Dr. Thomas Cavicchi, Dr. Norali Pernalete, Dr. Fernando Gonzalez, as well as my students Tomy Tran, Jonathon Sather, Jonathon Stearns, and Trent Peterson; and the students who helped with the design and development of projects at Cal Poly, including the Robotics Club. I also thank Sandra Grayson, the acquisition editor at Wiley; Louis Manoharan, my project editor; Sathishwaran Pathbanabhan, my production editor; Tiffany Taylor, my copyeditor; and the editors and the artists who made the book look as it does. Finally, I thank my family, Shohreh, Adam, and Alan, who always inspire me in everything I do. Their patience is much appreciated. To all of you, my sincere thanks.
I hope that you will enjoy reading the book, and more importantly, that you will learn the subject. The joy of robotics comes from learning it.
Saeed Benjamin Niku, Ph.D., P.E.
San Luis Obispo, California
2019
This book is accompanied by a companion website:
www.wiley.com/go/niku3ed
The website includes:
Robotics related articles
Robotics related clips
Information about websites, companies, equipment manufacturers and service providers
New project ideas
Additional homework problems
Other robotics related material of interest
Additional comments and corrections/errata
Robotics, the fascinating world of creating devices that mimic living creatures and are capable of performing tasks and behaving as if they are almost alive and able to understand the world around them, has been on humans' minds since the time we could build things. You may have seen machines made by artisans that try to mimic humans' motions and behavior. Examples include the statues on the clock tower in Venice's Piazza San Marco that strike the bell on the hour, the figurines that tell a story on the fifteenth‐century astronomical clock of the Old Town Hall tower in Prague, and the mechanisms that Leonardo da Vinci sketched in his notebooks. Toys, from very simple types to very sophisticated machines with repeating movements, are other examples. In Hollywood, movies have even portrayed robots and humanoids as superior to humans.
Although humanoids, autonomous cars, and mobile robots are fundamentally robots and are designed and governed by the same basics, in this book we primarily study industrial manipulator‐type robots. This book covers some basic introductory material that familiarizes you with the subject; presents an analysis of the mechanics of robots including kinematics, dynamics, and trajectory planning; and discusses the elements that are used in robots and in robotics, such as actuators, sensors, vision systems, and so on. Robot rovers are no different, although they usually have fewer degrees of freedom (DOF) and generally move in a plane. Exoskeletal and humanoid robots, walking machines, and robots that mimic animals and insects have many DOF and may possess unique capabilities. However, the same principles we learn about manipulators apply to robot rovers too, whether kinematics, differential motions, dynamics, or control.
Robots are very powerful elements of today's industry. They are capable of performing many different tasks and operations, are accurate, and do not require the safety and comfort provisions that humans need, even in hazardous environments such as underwater, disaster areas, and space. However, it takes much effort and many resources to make a robot function properly. Most of the hundreds of companies that made robots in the mid‐1980s are gone; and, with few exceptions, only companies that make real industrial robots have remained in the market (such as OMRON Adept, Stäubli, ABB, FANUC, KUKA, Epson, Motoman, DENSO, Fuji, Yaskawa, Kawasaki, and Universal Robots, as well as specialty robotic companies such as MAKO Surgical Corp. and Intuitive). Although there are several million robots working in factories, and the numbers are growing, early industrialists' predictions about the possible number of robots in industry never materialized because high expectations could not be met by the robots of the time. Innovations such as artificial intelligence embedded in robots, and new types of robots (such as parallel robots), have improved the situation and will continue to do so. However, robots are used where they are useful. Like humans, robots can do certain things but not others. As long as they are designed properly for the intended purposes, they are very useful and continue to be used. Current predictions indicate sustained growth in the number of robots used in industry in many different forms, from manufacturing and assembly to self‐driving delivery robots, and from autonomous vehicles to domestic workers [1–4].
The subject of robotics covers many different areas. Robots alone are hardly ever useful: they are used together with peripheral devices and other manufacturing machines. They are generally integrated into a system, which as a whole is designed to perform a task or do an operation. In this book, we will refer to some of these other devices and systems that are used with robots.
If you compare a conventional robot manipulator with a crane attached to, let's say, a utility or towing vehicle, you will notice that the robot manipulator is very similar to the crane. Both possess a number of links attached serially to each other with joints, where each joint can be moved by some type of actuator. In both systems, the “hand” of the manipulator can be moved in space and placed in any desired location within the workspace of the system. Each one can carry a certain load, and in each, a central controller controls the actuators. However, one is called a robot, and the other is called a manipulator (or, in this case, a crane). Similarly, material‐handling manipulators that move heavy objects in manufacturing plants look just like robots, but they are not robots. The fundamental difference between the two is that the crane and the manipulator are controlled by a human who operates and controls the actuators, whereas the robot manipulator is controlled by a computer or microprocessor that runs a program (Figure 1.1). This difference determines whether a device is a simple manipulator or a robot. Because the robot's motions are governed by a program running on its controller, changing the program changes the robot's actions; the same hardware can therefore perform many different tasks without being redesigned. The simple manipulator (or the crane) cannot do this without an operator running it all the time.
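To make this programmability concrete, here is a minimal, hypothetical Python sketch (not from the book; the Waypoint type, run_program function, and joint values are illustrative assumptions). The same simulated arm interface executes whichever task program it is given, so re-tasking the "robot" means swapping the list of waypoints rather than redesigning the hardware:

```python
# Hypothetical illustration (not from the book): a robot's task is just data
# interpreted by its controller, so changing the program changes the task.

from dataclasses import dataclass
from typing import List, Tuple

JointAngles = Tuple[float, float, float]  # a simple 3-joint arm, angles in degrees


@dataclass
class Waypoint:
    joints: JointAngles      # target joint angles for this step
    gripper_closed: bool     # end-effector state at this step


def run_program(program: List[Waypoint]) -> None:
    """Send each waypoint to a (simulated) controller in sequence."""
    for step, wp in enumerate(program, start=1):
        state = "closed" if wp.gripper_closed else "open"
        print(f"step {step}: move joints to {wp.joints}, gripper {state}")


# Task 1: a simple pick-and-place sequence.
pick_and_place = [
    Waypoint((0.0, 45.0, -30.0), gripper_closed=False),   # approach the part
    Waypoint((0.0, 60.0, -45.0), gripper_closed=True),    # grasp the part
    Waypoint((90.0, 45.0, -30.0), gripper_closed=True),   # carry it to the bin
    Waypoint((90.0, 60.0, -45.0), gripper_closed=False),  # release it
]

# Task 2: the same arm, re-tasked by changing only the program.
inspection_sweep = [
    Waypoint((float(a), 30.0, -20.0), gripper_closed=False) for a in range(0, 91, 30)
]

if __name__ == "__main__":
    run_program(pick_and_place)    # run the first task
    run_program(inspection_sweep)  # then the second, with no hardware change
```

A manually operated manipulator, by contrast, has no such program: its sequence of motions exists only in the operator's actions.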
Different countries have different standards for what they consider a robot. In American standards, a device must be easily reprogrammable to be considered a robot. Therefore, neither manual handling devices (devices that have multiple degrees of freedom and are actuated by an operator) nor fixed‐sequence robots (devices whose actuator motions follow a fixed sequence set by hard stops that are difficult to change) are considered robots.
Figure 1.1 (a) A Dalmec manipulator; (b) a KUKA robot. Although they are both handling large loads, one is controlled by a human operator and the other is controlled by a controller.
Source: Reproduced with permission from Dalmec USA and Kuka Robotics.
The following is a general list of classifications of devices that are considered robots. Different countries have different classifications, and, consequently, the number of robots in use in a country may be influenced by the definition:
Fixed‐sequence robot:
A device that performs the successive stages of a task according to a predetermined, unchanging method that is hard to modify.
Playback robot:
A human operator performs a task manually by leading the robot, which records the motions for later playback. The robot repeats the same motions according to the recorded information.
Numerical‐control robot:
The operator supplies the robot with a movement program rather than teaching it the task manually.
Intelligent robot:
A robot with the means to understand its environment and the ability to successfully complete a task despite changes in the surrounding conditions under which it is to be performed.
Robotics is the art, knowledge base, and know‐how of designing, applying, and using robots in human endeavors. Robotic systems consist of not just robots, but also other devices and systems that are used together with the robots. Robots may be used in manufacturing environments, in underwater and space exploration, in researching human and animal behavior, for aiding the disabled, for transportation and delivery, for military purposes, or even for fun. In any capacity, robots can be useful but need to be programmed and controlled. Robotics is an interdisciplinary subject that benefits from mechanical engineering, electrical and electronic engineering, computer science, cognitive sciences, biology, and many other disciplines.
Disregarding the early machines that were made to mimic humans and their actions, and concentrating on recent history, we can see a close relationship between the state of industry, the revolution in numeric and computer control of machinery, nuclear material handling, space exploration, and the vivid imagination of creative people. Starting with Karel Capek and his play R.U.R. (Rossum's Universal Robots) [5], and later, movies like Flash Gordon, Metropolis, Lost in Space, The Day the Earth Stood Still, and Forbidden Planet [6
