Contemporary Planetary Robotics

Description

For readers from both academia and industry wishing to pursue their studies and/or careers in planetary robotics, this book represents a one-stop tour of the history, evolution, key systems, and technologies of this emerging field. The book provides a comprehensive introduction to the key techniques and technologies that help achieve autonomous space systems for cost-effective, high-performing planetary robotic missions. The main topics covered include robotic vision, surface navigation, manipulation, mission operations, and autonomy, each explained through both theoretical principles and practical use cases. The book recognizes the importance of system design and hence discusses practices and tools that help take mission concepts to baseline design solutions, making it a practical scientific reference suited to a variety of practitioners in planetary robotics.




Table of Contents

Cover

Title Page

Copyright

List of Contributors

Chapter 1: Introduction

1.1 Evolution of Extraterrestrial Exploration and Robotics

1.2 Planetary Robotics Overview

1.3 Scope and Organization of the Book

1.4 Acknowledgments

Chapter 2: Planetary Robotic System Design

2.1 Introduction

2.2 A System Design Approach: From Mission Concept to Baseline Design

2.3 Mission Scenarios: Past, Current, and Future

2.4 Environment-Driven Design Considerations

2.5 Systems Design Drivers and Trade-Offs

2.6 System Operation Options

2.7 Subsystem Design Options

References

Chapter 3: Vision and Image Processing

3.1 Introduction

3.2 Scope of Vision Processing

3.3 Vision Sensors and Sensing

3.4 Vision Sensors Calibration

3.5 Ground-Based Vision Processing

3.6 Onboard Vision Processing

3.7 Past and Existing Mission Approaches

3.8 Advanced Concepts

References

Chapter 4: Surface Navigation

4.1 Introduction

4.2 Context

4.3 Designing a Navigation System

4.4 Localization Technologies and Systems

4.5 Autonomous Navigation

4.6 Future of Planetary Surface Navigation

References

Chapter 5: Manipulation and Control

5.1 Introduction

5.2 Robotic Arm System Design

5.3 Robotic Arm Control

5.4 Testing and Validation

5.5 Future Trends

References

Chapter 6: Mission Operations and Autonomy

6.1 Introduction

6.2 Context

6.3 Mission Operation Software

6.4 Planning and Scheduling (P&S)

6.5 Reconfigurable Autonomy

6.6 Validation and Verification

6.7 Case Study: Mars Rovers' Goal-Oriented Autonomous Operation

6.8 Future Trends

References

Index

End User License Agreement


List of Illustrations

Chapter 1: Introduction

Figure 1.1 First successfully flown planetary robotic systems. (a) Surveyor 3 scoop, (b) Luna 16 arm-mounted drill, (c) Luna 17 rover (Lunokhod 1). (Credits NASA, Lavochkin Association).

Figure 1.2 Robotics: a multidisciplinary subject.

Figure 1.3 ECSS-defined levels of autonomy for existing and planned planetary robotic systems.

Chapter 2: Planetary Robotic System Design

Figure 2.1 A system design approach.

Figure 2.2 Luna 20 lander (sample-return configuration [9]). 1 - instrument module of descent stage; 2 - attitude control thrusters; 3 - propellant tanks; 4 - antenna; 5 - instrument section of ascent vehicle; 6 - return capsule; 7 - drilling mechanism; 8 - rod of drill mechanism; 9 - telephotometer; 10 - propellant tank; 11 - propulsion system of descent stage; 12 - descent stage; 13 - ascent vehicle for Moon–Earth transfer.

Figure 2.3 Viking lander. (a) Photo [14]. (b) Configuration [12]

Figure 2.4 Mars Polar Lander configuration [16].

Figure 2.5 Mars Surveyor '01 lander configuration [17].

Figure 2.6 Phoenix lander configuration.

Figure 2.7 InSight lander and payload configuration.

Figure 2.8 Huygens lander on Titan surface (artist impression).

Figure 2.9 Beagle 2 lander: artist impression on Mars with the instrumented PAW upright.

Figure 2.10 Philae lander: artist impression on touchdown.

Figure 2.11 Lunokhod 1 rover with stowed solar panel.

Figure 2.12 Prop-M rover.

Figure 2.13 Prop-M rover deployment concept (top) and obstacle avoidance scheme (bottom).

Figure 2.14 Sojourner rover. (a) Photo and (b) Configuration.

Figure 2.15 Mars Exploration Rover [34]. (a) Artist impression and (b) Configuration.

Figure 2.16 Curiosity rover. (a) Full-scale model and (b) configuration.

Figure 2.17 Chang'E 3 rover on the lunar surface.

Figure 2.18 ExoMars rover: artist impression.

Figure 2.19 EDL sequence. (a) MPF [4] and (b) MSL [115].

Figure 2.20 MPF landing [4]. (a) Deceleration profile and (b) Loads profile.

Figure 2.21 SFR Ripple graph: design drivers, dependencies, and ultimate impact on rover mass [117].

Figure 2.22 Mobile platform subsystem design [117]. (a) Driven by factors relating to path length, (b) Driven by factors relating to speed along the path, and (c) Driven by factors relating to the time available.

Figure 2.23 Mobile platform design: representative rover mass distribution [117].

Figure 2.24 Power subsystem design options.

Figure 2.25 Energy storage options [126].

Figure 2.26 Thermal subsystem design options.

Chapter 3: Vision and Image Processing

Figure 3.1 Image processing/computer vision gives planetary rovers “eyes” to interpret their environment.

Figure 3.2 CAD drawing of PanCam and its optical bench on the pan-tilt unit.

Figure 3.3 Models of ExoMars Navigation and localization cameras, showing calibration cube.

Figure 3.4 Example of the impact of dust on the MER-A (Spirit) rover PanCam over approximately 1 year.

Figure 3.5 Expected stereo range accuracy for different stereo configurations on MER PanCam (highest curve), MSL MastCam (middle), and NASA Mars 2020 MastCam-Z (lowest curve).

Figure 3.6 3D rendering of a similar portion of Victoria crater (Cabo Corrientes [42]) reconstructed from the rover stereo images (a) and from two different rover poses stereo (embedded in the geometric context of the HiRISE-derived DTM) (b).

Figure 3.7 (a) MSL Navcam image from Sol 581. (b) Slope image (white = 20° and more); note the lack of data in the foreground where texture is insufficient for correlation, indicating unknown result.

Figure 3.8 Operations sequence cycle for planetary rover mission control (MER paradigm).

Figure 3.9 General scheme for mapping based on stereoscopy from a rover.

Figure 3.10 Imaging geometry for panorama (MER-B, 146 images taken on Sols 652, 653, 655, 657–661, 663, 665, 667–669, 672, 673, 686, 697, and 703). Note that some images of the sky region were used for aerosol optical depth determination and/or for checking/determining camera vignetting.

Figure 3.11 Combination of mapping from a single viewpoint (circular region) and rover motion to the right. (a) DEM with low regions coded in dark and high regions in bright; (b) Ortho image. Note the black center in the circular region which stems from occlusion caused by the vehicle itself when image capture of panorama took place.

Figure 3.12 Localized rover positions (marked as dots) on HiRISE ORI for MER-B (a, b) and MSL (c, d) showing compliance between the automated localized rover traverse and the actual tracks recorded by a HiRISE image of 25 cm from orbit.

Figure 3.13 Screenshot of PRo3D showing a geologic interpretation session in the Shaler area (Gale Crater, MSL mission). This detailed interpretation of the stratigraphy shows the main stratigraphic boundaries as gray lines, bedset boundaries as thick white lines, and laminations within those bedsets as the thin white lines (note that the original image is in color). The dip and strike values are available directly in PRo3D color coded by dip value, and generally dip 15–20° to the southeast; however, this requires validation. The findings are consistent with those in Refs [69, 70] in that the outcrop represents a changing fluvial environment, with recessive, fine-grained units interlayered with coarse, pebbly units.

Figure 3.14 Screenshots taken from the StereoWS tool showing images from MER-A in stereo anaglyph on a standard flatscreen display (originally appearing in red/blue to be viewed with anaglyph spectacles). The left panel, the control panel, is shown along with the pull-down menu, which shows inputs and outputs including the importing of left points only for subsequent 3D stereo measurements.

Figure 3.15 (a) Reference and algorithmic disparity maps from a ground-truth simulation (PANGU [75]). (b) Scheme of Perception architecture (ExoMars perception system examples).

Figure 3.16 MER example of vision-based odometry with feature detection and matching shown [76] (contrast modified).

Figure 3.17 Raw (a) and linearized (b) image (MSL front HazCam from sol 151). Note fisheye distortion on the raw image.

Figure 3.18 Overlays on the image from Figure 3.17. (a) Slope; dark = low slope, bright = high slope. (b) XYZ image where white = lines of constant X; gray = Y.

Figure 3.19 360° drive mosaic from MSL Navcam data, sol 169, cylindrical projection.

Figure 3.20 Parallax effects on mosaics. (a) Correcting on the ground, (b) correcting on the foreground solar panel.

Figure 3.21 Mosaic of Mastcam images of Mt Sharp taken on MSL Sol 45 (September 12, 2012; originally in color) using white balance to show an Earth-like sky.

Figure 3.22 ExoMars Rover Functional Architecture depicting the use of visual information via NavCam and LocCam.

Figure 3.23 (a) Comparison of the Mars images with the ExoMars simulator generated images. (b) Development and testing rover in the Mars yard.

Figure 3.24 ExoMars PanCam 3D vision processing workflow scheme.

Figure 3.25 Result of MSL Mastcam processing of Garden City outcrop area taken at MSL Sol 926 and 929 (DEM, rendered by PRo3D [66]) making use of ExoMars PanCam 3D vision processing workflow PRoViP.

Figure 3.26 The MSDP framework.

Figure 3.27 Detecting interesting objects on planetary surfaces (such as rocks) using visual saliency method. (a) Image capture, (b) detect salient objects, (c) segmentation, and (d) track detected objects.

Figure 3.28 Legged wheel sinkage detection in a deformable terrain by measuring the level of occlusion of the locomotion contour [127].

Figure 3.29 DTM generated for image section of the CE-3 Yutu rover tracks.

Figure 3.30 Soil stiffness of lunar landing sites based on small wheel model in Equation 3 and wheel sinkage data.

Figure 3.31 (a) Result of 3D fusion between MER Navcam and Pancam processing, overlaid with Microscopic Imager (MI) Data. (b) 3D rendering of fusion result. (c) Detail showing parts of mesh. (d) Textured result–the MI image shows about 10 times higher surface resolution compared with the Pancam texture.

Figure 3.32 Result of fusion between MSL Mastcam and Navcam (taken at MSL Sol 290) processing (textured spherical distance maps, rendered by PRo3D [66]) making use of ExoMars PanCam 3D vision processing workflow PRoViP.

Figure 3.33 (Top) Search for corresponding points in MSL Mastcam (a) and MAHLI (b) images. (Bottom) Overlay of co-registered MAHLI image onto Mastcam 3D reconstruction ((c) without overlay, (d) with overlay).

Figure 3.34 Example for the fusion between 3D surface reconstruction using AU ExoMars PanCam Emulator AUPE [140] and a depth profile from the ExoMars WISDOM mock-up, obtained during the ESA SAFER field trials campaign 2013 in Chile [141] (rendered by PRo3D [66]).

Chapter 4: Surface Navigation

Figure 4.1 Image from Sojourner showing a laser stripe used for obstacle detection.

Figure 4.2 Construction of network of rover images for iterative bundle adjustment algorithm on the MERs [15].

Figure 4.7 Orbiter (HiRISE) imagery (a) and rover imagery (b) used for matching algorithm for localization on the MERs [15].

Figure 4.3 Visualization of MER terrain evaluation. Darker squares indicate decreasing traversability.

Figure 4.4 Cameras on the Curiosity rover. The NavCams and HazCams are primarily engineering (navigation) sensors.

Figure 4.5 ExoMars rover navigation and mobility functional architecture [36].

Figure 4.6 Orientation estimation via measuring local gravity and Sun vectors and aligning these to corresponding vectors in a world frame of reference.

Figure 4.8 Sample mesh-based map generated by RASM autonomy software, also showing candidate motion arcs for rover.

Figure 4.9 A* paths on a grid (a) and on a connected graph (b) [130].

Figure 4.10 Navigation processing overview for the ExoMars rover [36]. (Courtesy Airbus DS).

Figure 4.11 Path planning visualization for the ExoMars rover [36].

Figure 4.12 CSA Analog Terrain.

Figure 4.13 The CSA Artemis Jr rover at the RESOLVE mission simulation on Mauna Kea, Hawaii.

Figure 4.14 Example of a 25 cm HiRISE orthorectified image created from a HiRISE stereo-pair (a), with an example of a 5 cm super-resolution restoration image derived from a stack of 8 HiRISE 25 cm images (b) taken over 7 years.

Chapter 5: Manipulation and Control

Figure 5.1 Specifications of existing planetary robotic arm systems.

Figure 5.2 MARS'01 arm [2, 3]. (a) Photo, (b) kinematics, (c) in operation for rover deployment.

Figure 5.3 Phoenix arm [4]. (a) Calibration and testing, (b) kinematics.

Figure 5.4 MER IDD [5]. (a) Breadboard, (b) kinematics.

Figure 5.5 Beagle 2 arm.

Figure 5.6 MSL robotic arm [8]. (a) Drawing, (b) primary workspace.

Figure 5.7 Layered structure of the motion control for a robot manipulator.

Figure 5.8 Feedback controller with position reference.

Figure 5.9 Feedback controller with position and velocity reference.

Figure 5.10 Force/motion control. (a) Hybrid, (b) parallel.

Figure 5.11 Control strategies. (a) Impedance control, (b) stiffness control.

Figure 5.12 Control strategies. (a) Admittance control, (b) compliance control.

Figure 5.13 Overview of a torque control scheme of a brushless DC motor-based joint.

Figure 5.14 Feedforward (inverse model) plus feedback controller.

Figure 5.15 Inverse model for decoupling robot model.

Figure 5.16 Computed torque control.

Figure 5.17 Image-based visual servoing control scheme.

Figure 5.18 Position-based visual servoing control scheme.

Figure 5.19 Example of a trajectory interpolation between two points using a cubic polynomial.

Figure 5.20 Bilateral teleoperation. (a) Position–position control, (b) force–position control.

Figure 5.21 The neglect curve.

Figure 5.22 InSight testing with scaled-mass payload.

Figure 5.23 Two types of ABT configurations. (a) Flat ABT testbed simulating the chaser spacecraft during capture maneuver with a robotic arm. (Courtesy CBK PAN.) (b) Round ABT testbed simulating frictionless motion around the roll, yaw, and pitch axis.

Figure 5.24 A lander mock-up installed through a spherical air bearing (SAB) on a cart moving on an inclined flat granite table with air bearings (FRAB), and the lander legs are touching a vertically oriented planetary surface mock-up.

Figure 5.25 A possible configuration of a lander and a manipulator arm on planetary surface.

Figure 5.26 An ABT testbed allowing tests of sampling tool interactions where all major components of motion induced by forces () are flat.

Figure 5.27 Justin robot.

Figure 5.28 Whole-body operational space control (WBOSC) diagram.

Figure 5.29 Robot AILA using whole-body control to perform some tasks within a mock-up International Space Station.

Figure 5.30 DRC robots in 2015 finals.

Figure 5.31 NASA's R5 or Valkyrie robot.

Chapter 6: Mission Operations and Autonomy

Figure 6.1 Two segments and communication links in mission operations.

Figure 6.2 An example of ECSS space system model.

Figure 6.3 Three operation processes and respective time frame.

Figure 6.4 Strategic and tactical operation flow.

Figure 6.5 LoA of existing rover missions.

Figure 6.6 Block diagram of basic ground operation software architecture.

Figure 6.7 Basic architecture of onboard operation software with low LoA.

Figure 6.8 Basic architecture of onboard operation software with high LoA.

Figure 6.9 Three-layer control architecture in robotics.

Figure 6.10 ASE's software architecture [27].

Figure 6.11 GOAC's software architecture.

Figure 6.12 An illustration of a temporal database.

Figure 6.13 An illustration of two timelines for planning with a chronicle.

Figure 6.14 Traditional relations between planning and scheduling.

Figure 6.15 EUROPA framework architecture [42].

Figure 6.16 APSI modeling components. (a) State variable, (b) resources.

Figure 6.17 APSI framework architecture.

Figure 6.18 A simple taxonomy of reconfiguration systems.

Figure 6.19 Block diagram of a self-reconfiguring rover GNC system.

Figure 6.20 The MAPE-K loop.

Figure 6.21 Modules of the Ontology.

Figure 6.22 Work flow of the Rational Agent for system reconfiguration.

Figure 6.23 CPU percentage usage during the initial uplink reconfiguration of the self-reconfiguring GNC system: (A) the Ontology Manager is receiving, validating, and merging with the uplink instructions; (B) the Rational Agent plans, validates, and executes reconfiguration; (C) the Rational Agent uses the generic PDDL planner; (D) the application layer is initialized.

Figure 6.25 Configuration for the application layer of the self-reconfiguring GNC system. (a) First configuration, (b) second configuration.

Figure 6.24 CPU percentage usage during the mission goal reconfiguration of the self-reconfiguring GNC system: (A) the Rational Agent plans, validates, and executes reconfiguration; (B) the Rational Agent uses the generic PDDL planner; (C) the application layer initiates scheduler.

Figure 6.26 Plan validation using model checking within MrSPOCK.

Figure 6.27 GOAC architecture.

Figure 6.28 The 3DROV environment (left column, Courtesy TRASYS) and the DALA rover (right column, Courtesy LAAS-CNRS).

Figure 6.29 GOAC test results using 3DROV.

List of Tables

Chapter 1: Introduction

Table 1.1 Statistics on planetary unmanned landing missions as of 2015

Table 1.2 Successfully flown robots on Mars, the Moon, and small bodies as of 2015

Chapter 2: Planetary Robotic System Design

Table 2.1 Defining suitable requirements through S.M.A.R.T. criteria

Table 2.2 Trade-offs example: assessment of the various system characteristics

Table 2.3 Trade-offs example: setting up defined weighing factors

Table 2.4 Trade-offs example: weighting assessment of the various system characteristics

Table 2.5 Characteristics of the Surveyor landers and successors

Table 2.6 Autonomy characteristics of the Huygens lander [20]

Table 2.7 Autonomy characteristics of the Philae lander [20]

Table 2.8 Planned lunar missions

Table 2.9 Future missions: robotic platforms

Table 2.10 Future missions: mission destinations and concepts

Table 2.11 Future missions: operational mission concepts

Table 2.12 Rocky planets' characteristics [111]

Table 2.13 Characteristics of popular moon targets [111]

Table 2.14 H.E.A.R.D. scale: assessment of concepts for a mobile robotic platform to explore hazardous areas

Table 2.15 Solar irradiation across the solar system.

Table 2.16 Solar power technology: pros and cons.

Table 2.17 Radioisotope fuel options.

Table 2.18 Radioisotopic power: pros and cons

Table 2.19 NASA's RTG implementations [135, 138]

Table 2.20 Typical primary batteries characteristics [144].

Table 2.21 Primary batteries: pros and cons [143, 145].

Table 2.22 Space secondary batteries' characteristics [119, 144].

Table 2.23 Development roadmap of terrestrial rechargeable batteries.

Table 2.24 Fuel cell technologies and typical characteristics [148]

Table 2.25 Typical temperature ranges for robotic components

Table 2.26 Typical thermal trade-off design options for rovers

Chapter 3: Vision and Image Processing

Table 3.1 Important terms and acronyms in the field of planetary robotics vision

Table 3.2 Mission phases where robotics vision is required

Table 3.3 Vision sensors as realized on planetary robotic missions by 2015

Table 3.4 Normalized performance results for various saliency algorithms on chosen datasets

Table 3.5 Average saliency computation time for all models using different test datasets

Table 3.6 Dataset characteristics and types of terrains [127]

Table 3.7 Average and maximum error for all datasets [125]

Chapter 4: Surface Navigation

Table 4.1 Comparison of visual odometry pipelines of MERs and ExoMars rover

Table 4.2 Comparison of types of field testing sites

Chapter 5: Manipulation and Control

Table 5.1 Overview of existing planetary arms

Table 5.2 Beagle 2 arm performance

Table 5.3 Curiosity arm parameters

Chapter 6: Mission Operations and Autonomy

Table 6.1 ECSS LoA for space segment

Table 6.2 MER onboard operation modes

Table 6.3 LoA for practical features within operation software

Table 6.4 Three representations of a classical planning problem

Table 6.5 Comparison of various reconfiguration systems against 23 design attributes

Table 6.6 GOAC test results using DALA: CPU time taken by different modules within the functional layer

Edited by Yang Gao

 

Contemporary Planetary Robotics

An Approach Toward Autonomous Systems

 

 

 

Editor

Professor Yang Gao

University of Surrey

Surrey Space Centre

STAR Lab, Stag Hill

GU2 7XH, Guildford

United Kingdom

Cover

Courtesy NASA/JPL-Caltech

All books published by Wiley-VCH are carefully produced. Nevertheless, authors, editors, and publisher do not warrant the information contained in these books, including this book, to be free of errors. Readers are advised to keep in mind that statements, data, illustrations, procedural details or other items may inadvertently be inaccurate.

 

Library of Congress Card No.: applied for

 

British Library Cataloguing-in-Publication Data

A catalogue record for this book is available from the British Library.

 

Bibliographic information published by the Deutsche Nationalbibliothek

The Deutsche Nationalbibliothek lists this publication in the Deutsche Nationalbibliografie; detailed bibliographic data are available on the Internet at <http://dnb.d-nb.de>.

 

© 2016 Wiley-VCH Verlag GmbH & Co. KGaA,

Boschstr. 12, 69469 Weinheim, Germany

 

All rights reserved (including those of translation into other languages). No part of this book may be reproduced in any form – by photoprinting, microfilm, or any other means – nor transmitted or translated into a machine language without written permission from the publishers. Registered names, trademarks, etc. used in this book, even when not specifically marked as such, are not to be considered unprotected by law.

Print ISBN: 978-3-527-41325-6

ePDF ISBN: 978-3-527-68494-6

ePub ISBN: 978-3-527-68495-3

Mobi ISBN: 978-3-527-68496-0

oBook ISBN: 978-3-527-68497-7

Cover Design Grafik-Design Schulz, Fußgönheim, Germany

List of Contributors

Elie Allouis

Airbus Defence and Space Ltd.

Future Programmes

Gunnels Wood Road

SG1 2AS Stevenage

United Kingdom

 

Abhinav Bajpai

University of Surrey

Surrey Space Centre

STAR Lab, Stag Hill

GU2 7XH Guildford

Surrey

United Kingdom

 

Guy Burroughes

University of Surrey

Surrey Space Centre

STAR Lab, Stag Hill

GU2 7XH Guildford

Surrey

United Kingdom

 

Robert G. Deen

California Institute of Technology

Instrument Software and Science Data Systems Section

Jet Propulsion Laboratory

4800 Oak Grove Drive

Pasadena

CA 91109

USA

 

Alessandro Donati

European Space Agency

ESOC, OPS-OSA

Robert-Bosch-Strasse 5

64293 Darmstadt

Germany

 

José de Gea Fernández

DFKI GmbH

Robotics Innovation Center

Robert-Hooke-Str. 1

D-28359 Bremen

Germany

 

Simone Fratini

European Space Agency

ESOC, OPS-OSA

Robert-Bosch-Strasse 5

64293 Darmstadt

Germany

 

Yang Gao

University of Surrey

Surrey Space Centre

STAR Lab, Stag Hill

GU2 7XH Guildford

Surrey

United Kingdom

 

Peter Iles

Neptec Design Group

Space Exploration

302 Legget Drive

Ottawa

ON K2K 1Y5

Canada

 

Frank Kirchner

DFKI GmbH

Robotics Innovation Center

Robert-Hooke-Str. 1

D-28359 Bremen

Germany

 

and

 

University of Bremen

Faculty of Mathematics and Computer Science

Robert-Hooke-Str. 1

D-28359 Bremen

Germany

 

Jan-Peter Muller

Imaging Group

Mullard Space Science Laboratory

UCL Department of Space & Climate Physics

Holmbury St Mary

RH5 6NT Surrey

United Kingdom

 

Jorge Ocón

GMV Aerospace and Defense

Avionics On-board Software Division

Space Segment Business Unit

Isaac Newton, 11 (PTM)

Tres Cantos

28760 Madrid

Spain

 

Gerhard Paar

JOANNEUM RESEARCH Institute for Information and Communication Technologies

Machine Vision Applications Group

Steyrergasse 17

8010 Graz

Austria

 

Nicola Policella

European Space Agency

ESOC, OPS-OSA

Robert-Bosch-Strasse 5

64293 Darmstadt

Germany

 

Karol Seweryn

Space Research Centre of the Polish Academy of Sciences (CBK PAN)

18a Bartycka str.

00-716 Warsaw

Poland

 

Affan Shaukat

University of Surrey

Surrey Space Centre

Department of Electrical & Electronic Engineering

GU2 7XH Guildford

United Kingdom

 

Nuno Silva

Airbus Defence and Space Ltd.

Department of AOCS/GNC and Flight Dynamics

Gunnels Wood Road

Stevenage

SG1 2AS Hertfordshire

United Kingdom

 

Matthias Winter

Airbus Defence and Space Ltd.

Department of AOCS/GNC and Flight Dynamics

Gunnels Wood Road

Stevenage

SG1 2AS Hertfordshire

United Kingdom

Chapter 1 Introduction

Yang Gao, Elie Allouis, Peter Iles, Gerhard Paar and José de Gea Fernández

Planetary robotics is an emerging multidisciplinary field that builds on knowledge of astronautics, terrestrial robotics, computer science, and engineering. This book offers a comprehensive introduction to major research and development efforts in planetary robotics, with a particular focus on autonomous space systems, which will enable cost-effective, high-performing planetary missions. Topics covered in this book include techniques and technologies enabling planetary robotic vision processing, surface navigation, manipulation, mission operations, and autonomy. Each topic or technological area is explained in a dedicated chapter using a typical space system design approach, whereby design considerations and requirements are first discussed, followed by descriptions of relevant techniques and principles. Most chapters contain design examples or use cases that help demonstrate how techniques or theoretical principles can be implemented in real missions. Since any space engineering design or development is a system engineering process, this book also dedicates one chapter to planetary robotic system design – from mission concepts to baseline designs. As a result, this book can be used as a text or reference book for relevant engineering or science courses at the undergraduate and postgraduate levels, or as a handbook for industrial professionals in the space sector.

This chapter introduces the book by offering a chronicle on how planetary exploration and robotics have evolved to date, a systematic overview of planetary robotics, as well as an explanation on the organization and scope of the book.

1.1 Evolution of Extraterrestrial Exploration and Robotics

The need for humans to explore beyond the realm of Earth is driven by our inherent curiosity. Throughout our history, new worlds have been discovered by daring explorers who set out to discover new lands, find riches, or better understand these little-known territories. These journeys were fueled by the technological advances of their times, such as the compass, maritime maps, or the airplane, and in return contributed tremendously to the scientific knowledge of humankind. For all the good provided by these exploratory endeavors, history also reveals that exploration is difficult, perilous, and can be fraught with unforeseeable consequences. For example, in early maritime exploration, only a fraction of all the ships that set out for the new worlds eventually achieved their goals. There have also been countless instances where the discovery of new lands was detrimental to the indigenous populations. The past and its lessons learned serve as a stark reminder to all new exploration endeavors.

Outer space has provided real, new exploration frontiers for mankind since the 1950s. With the capability and the irresistible attraction to go beyond our planet Earth, minimizing the impact of mankind on other extraterrestrial bodies (be it a planet, a moon, a comet, or an asteroid) is paramount. Armed with the hindsight and knowledge provided by our own history, we are continuously learning about these new space frontiers and taking precautions to avoid repeating the mistakes of past exploration activities.

The onset of space exploration in the late 1950s to early 1960s focused on sending humans into space and to the Moon, a key priority for the two main adversaries of the Cold War. However, then as now, in parallel with the expensive development of manned space programs, the use of cheaper robotic proxies was deemed important for understanding the space environment in which the astronauts would be operating. The USSR flew the first set of robotic missions, successfully launching a series of Luna probes starting in 1959. Within a year, Luna 1 managed a flyby of the Moon, Luna 2 crash-landed on the Moon, and Luna 3 took pictures of the Moon's far side. It took another 7 years before both the USSR and the United States, within a few months of each other, performed soft landings on the Moon with their respective probes, Luna 9 and Surveyor 1. These missions paved the way for the first human landing on the Moon in 1969 by the United States. Building on these earlier successes, robotic exploration missions have extended their reach to Mercury, Venus, and Mars (the inner solar system), and subsequently to the outer solar system, where tantalizing glimpses of the volcanic Io, the frozen Europa, and the methane rains of Titan have been obtained.

Planetary missions can use various ways to explore an extraterrestrial body, often starting with reconnaissance or remote sensing using orbiting satellites. More advanced approaches (such as landing, surface operation, and sample return), enabled by sophisticated robotic systems, represent a giant leap in terms of mission complexity and risk, but more importantly in scientific return. Not surprisingly, advanced extraterrestrial exploration is littered with unsuccessful missions bearing witness to the serious technical challenges of such endeavors. Table 1.1 presents statistics on unmanned landing missions across the solar system (excluding manned missions). The relatively low success rate is a clear reflection of the technical difficulties involved in designing, building, and operating the required robotic spacecraft. It is worth noting that space engineers and scientists have created the landscape of what we know today. With sheer determination, they continue to address countless challenges, failing often, but regrouping until they succeed.

Table 1.1 Statistics on planetary unmanned landing missions as of 2015

                                   Venus   Moon   Mars   Titan   Asteroids/comets
Total landing missions launched      19     35     16      1            6
Successful surface operation          9     13      9      1            1
Successful sample return              0      3      0      0            1

Among the successful unmanned missions, various types of robotic systems have played significant roles, including robotic platforms (such as surface rovers) and robotic payloads (such as manipulators or robotic arms, subsurface samplers, and drills). Table 1.2 summarizes the successfully flown robots on the Moon, Mars, and small bodies. The first genuine robotic payload successfully operated on an extraterrestrial body was a scoop (i.e., a combined manipulation and sampling device) onboard the Surveyor 3 lander, launched to the Moon in 1967 (as shown in Figure 1.1a). Following that, Luna 16 succeeded with the first planetary robotic arm-mounted drill in 1970 (as shown in Figure 1.1b), and Luna 17 succeeded with the first planetary rover, Lunokhod 1, also in 1970 (as shown in Figure 1.1c).

Table 1.2 Successfully flown robots on Mars, the Moon, and small bodies as of 2015

Mission                     Country         Target
Surveyor 3                  United States   Moon
Luna 16/20/24               USSR            Moon
Luna 17/21                  USSR            Moon
Viking                      United States   Mars
Mars Pathfinder             United States   Mars
Hayabusa (or Muses-C)       Japan           Asteroid
Mars Exploration Rovers     United States   Mars
Phoenix                     United States   Mars
Mars Science Laboratory     United States   Mars
Chang'E 3                   China           Moon
Rosetta                     Europe          Comet

Figure 1.1 First successfully flown planetary robotic systems. (a) Surveyor 3 scoop, (b) Luna 16 arm-mounted drill, (c) Luna 17 rover (Lunokhod 1). (Credits NASA, Lavochkin Association).

There is no denying that these "firsts" led to incredible mission successes and science discoveries as a result of unabated and relentless launch attempts during the space race between the superpowers. Building on these foundations, the new generation of planetary exploration has, since the 1990s, not only traveled further into the solar system but also tackled deeper fundamental scientific questions. The desire to go and explore is as strong as ever. The early space powers have gradually been joined by a flurry of new nations eager to test and demonstrate their technologies and to contribute to an increasing body of knowledge. Commercial endeavors also have their eyes on space and actively promote the Moon and Mars as possible destinations for long-term human presence or habitation. Whether future exploration missions are manned or unmanned, planetary robots will always be desired as robotic "avatars" that perform in situ tasks, acting as proxies for or assistants to humans through their "eyes," "ears," and "hands."

1.2 Planetary Robotics Overview

A typical robot on Earth is an unmanned electromechanical machine controlled by a set of automatic or semi-autonomous functions. Standard industrial robots are typically used to address the "3D" activities: Dirty, Dull, and Dangerous. This notion was created in reference to the Japanese concept of "3K" (kitanai, kiken, and kitsui), describing the major areas where robots should be used to relieve human workers from demanding working environments, such as in the construction industry. Robotic systems are therefore envisaged to carry out repetitive, long-duration, or high-precision operations in environments where humans are expected to perform poorly or where human presence is impractical.

Robotics, as an engineering or scientific subject, emerges from a number of traditional disciplines such as electronics, mechanics, control, and software, as illustrated in Figure 1.2. Designing a robotic system therefore involves the design of hardware subsystems (e.g., sensors, electronics, mechanisms, and materials) and software subsystems (such as perception, control, and autonomy). A planetary robotic system is functionally similar to a terrestrial robotic system, but with different performance characteristics to cope with stringent space mission requirements (often in aspects such as radiation hardness to survive the space environment), scarce power and computational resources, and the high degree of autonomy demanded by communication latency.

Figure 1.2 Robotics: a multidisciplinary subject.

A robotic system is not required to possess a fixed level of automation or autonomy. In fact, it can employ a wide range of control modes, from remote or teleoperation and semi-autonomous to fully autonomous operation, as appropriate to its mission goal, location, and operational constraints. The fundamental difference between the automatic and the autonomous control modes relates to the level of judgment or self-direction in the action performed. An automatic response or control is associated with a reflex, an involuntary behavior that is "hard-wired" into the robot with no decision making involved. An autonomous behavior, on the other hand, represents a complex independent response not controlled by others (meaning there is decision making involved). An analogy can be drawn from nature, which evolved from monocellular organisms that behave like automatic systems reacting to external stimuli, to complex organisms such as mammals and birds that exhibit significantly more advanced, independent logical behaviors.

According to the European Cooperation for Space Standardization (ECSS), a spacecraft (or a planetary robotic system in this case) is standardized to work on four different levels of autonomy or control modes as follows:

Level E1: execution mainly under real-time ground control, that is, remote or teleoperation;

Level E2: execution of preplanned mission operations onboard, that is, automatic operation;

Level E3: execution of adaptive mission operations onboard, that is, semi-autonomous operation;

Level E4: execution of goal-oriented mission operations onboard, that is, fully autonomous operation.

Planetary robots can also be classified into three groups, depending on their capability of achieving the different ECSS levels of autonomy (a simple encoding of both classifications is sketched after this list):

Robotic agents

that act as human proxies in space to perform exploration, assembly, maintenance, and production tasks in the level E1–E3 operations.

Robotic assistants

that help human astronauts to perform tasks quickly and safely, with higher quality and cost efficiency using the level E3 or potentially E4 operation.

Robotic explorers

that explore the extraterrestrial targets using the level E4 operation.
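To make the two classifications above concrete, the short Python sketch below (a minimal illustration only; the enumeration names and helper function are invented for this example and are not part of the ECSS standard or of any flight software) encodes the four execution-autonomy levels and the three robot categories, with the level-to-category mapping taken directly from the list above.

from enum import Enum


class EcssAutonomyLevel(Enum):
    """ECSS execution-autonomy levels E1-E4, summarized from the list above."""
    E1 = "Execution mainly under real-time ground control (teleoperation)"
    E2 = "Execution of preplanned mission operations onboard (automatic)"
    E3 = "Execution of adaptive mission operations onboard (semi-autonomous)"
    E4 = "Execution of goal-oriented mission operations onboard (fully autonomous)"


class RobotCategory(Enum):
    """Robot categories and the autonomy levels at which they typically operate."""
    AGENT = (EcssAutonomyLevel.E1, EcssAutonomyLevel.E2, EcssAutonomyLevel.E3)
    ASSISTANT = (EcssAutonomyLevel.E3, EcssAutonomyLevel.E4)
    EXPLORER = (EcssAutonomyLevel.E4,)


def categories_supporting(level: EcssAutonomyLevel) -> list[RobotCategory]:
    """Return the robot categories whose operations include the given level."""
    return [cat for cat in RobotCategory if level in cat.value]


if __name__ == "__main__":
    # Goal-oriented (E4) operation maps to robotic assistants and explorers.
    print([cat.name for cat in categories_supporting(EcssAutonomyLevel.E4)])

Running the example prints the categories capable of goal-oriented (E4) operation, that is, robotic assistants and robotic explorers.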

Figure 1.3 presents the timeline of existing and foreseen planetary robotic systems with respect to the ECSS levels E1–E4. Existing, successfully flown planetary robots all fall within the robotic agent category. It is evident that, as time proceeds, modern planetary missions with increasingly challenging goals require an increased level of autonomy within their robotic systems, hence the shift from robotic agents toward robotic explorers.

Figure 1.3 ECSS-defined levels of autonomy for existing and planned planetary robotic systems.

1.3 Scope and Organization of the Book

The book focuses on R&D topics that directly influence the onboard software and control capabilities of a planetary robot in achieving a greater level of autonomy. It does not aim to cover the design of hardware subsystems such as sensors, mechanisms, electronics, or materials. However, discussions of hardware-related issues are provided where they influence the design of the required functions or provide a wider context for the robotic system design. The rest of the book is organized such that each chapter focuses on a specific technical topic and can be read or used without much dependency on the other chapters. At the same time, the technical chapters are cross-referenced wherever their subjects overlap, to help readers establish an understanding of the system engineering philosophy that is fundamental to any space system design and development.

The design of a planetary robotic mission is complex. Past and current missions have shown how the endeavor can be treacherous and fraught with design, implementation, and operational challenges. From the impact of the environment and the management of resources to the operational concept, the system design and development must be approached as a whole rather than as the sum of discrete elements. Chapter 2 conveys that a top-level view of mission-driven considerations should be established at the early stage of the robotic system design assessment, and hence introduces the space system design methodology and tools. Following the introduction in Section 2.1, Section 2.2 presents the system engineering approach required to design planetary robotic systems. Section 2.3 introduces a range of planetary robotic systems used as part of past and present exploration missions, as well as how they address specific mission challenges and contribute to the return of valuable science data. This section also looks ahead at future robotic systems that are currently being investigated to implement more adventurous mission concepts and operational scenarios. Section 2.4 reviews a range of planetary environmental factors that are determined by the mission targets and that, in turn, drive the designs of various robotic systems and subsystems. Section 2.5 demonstrates, using a case study, how to define system-level design drivers and perform subsystem design trade-offs. Finally, Sections 2.6 and 2.7 provide insightful design options for key system operations and subsystems that have a major influence on the overall system design.

A planetary robot is expected to interact with the environment and other assets of a mission, and to perceive information. As for humans, visual sensing is the robot's most effective and powerful means of collecting information in an unknown environment and situation. Chapter 3 addresses the vision aspects as a prerequisite for navigation, autonomy, manipulation, and scientific decision making. The introductory Sections 3.1 and 3.2 present the scope, aims, terms, and most important requirements as well as constraints for robotic vision in the planetary context. Vision sensors and sensing are addressed in Section 3.3, including representative examples. Section 3.4 describes the radiometric and geometric sensor calibration that is key to objective and meaningful sensor data interpretation and exploitation, with the important error influences listed as well. The following sections cover the complementary approaches of ground-based vision processing (Section 3.5) and onboard vision processing (Section 3.6), offering complementary material to the surface navigation and localization topics in Chapter 4. Section 3.7 presents past and present mission approaches exploiting robotic vision techniques and highlights the vision processing mechanisms used in the Mars missions MER, MSL, and ExoMars. The chapter closes with a set of advanced concepts in Section 3.8.

Planetary surface navigation is among the key technologies in any robotic exploration mission, particularly those involving mobile robotic platforms such as rovers. Navigation technologies allow the rover (and hence the ground operators) to know where the robot is and where it should go next, and to guide the robot along a selected path. In the presence of obstacles, the navigation system enables safe and efficient exploration of the environment. Chapter 4 investigates all aspects of the rover navigation system. Following the introduction in Section 4.1, Section 4.2 presents the challenges of navigating on different extraterrestrial bodies and describes relevant flight rover systems, including the Apollo LRV, the Russian Lunokhods, the Mars Exploration Rovers, and Curiosity. Section 4.3 presents the navigation system design process through a discussion of requirements and major design concepts. A thorough description of localization technologies is given in Section 4.4, including orientation estimation, relative localization, absolute localization, and the fusion of localization sources. This is followed in Section 4.5 by a discussion of all the steps necessary to achieve autonomous navigation, from sensing to control. Finally, Section 4.6 presents the prospects of planetary robotic navigation, with a review of planned flight rovers and missions as well as enabling future technologies.

As evident from existing planetary robotic missions, robotic manipulators have played an important role, for example, serving scientific experiments by grabbing samples or delivering drills to access rocks or soil. The first part of Chapter 5 reviews past manipulators and their technical characteristics (see Section 5.1). Section 5.2 provides an overview of the design criteria, specifications, and requirements for constructing a planetary manipulator, many of which have synergies with constructing a rover. Section 5.3 discusses control algorithms, from the low-level control of an actuator to high-level motion planning for the arm, including trajectory generation, teleoperation, and possible autonomous modes. Section 5.4 further discusses testing and validation procedures for a planetary robotic arm system. Future planetary robots are envisaged to possess not only sophisticated manipulation skills and the ability to reuse these skills for different tasks but also a high level of autonomy to cope with complex mission scenarios (such as building a lunar outpost). Hence, the last section of the chapter (Section 5.5) investigates various novel capabilities of planetary manipulators in the long term, for example, the use of two arms, the use of whole-body control algorithms that consider the mobile platform as part of the manipulation system, or the ability to act in dynamically changing environments.

There is no doubt that future planetary robotic missions aim for high operational autonomy and improved onboard software capabilities. Chapter 6 offers a systematic, thorough discussion of mission operations and autonomy. Section 6.1 introduces the background, and Section 6.2 sets the context of the topic by introducing the basic concepts of mission operations, processes and procedures, and typical operation modes of planetary robotic systems. Section 6.3 discusses the first step in developing the mission operation software, that is, how to establish the software architecture (both onboard and on the ground) for a given mission operation. The following three sections investigate the main design aspects or core technologies in mission operations: Section 6.4 discusses planning and scheduling (P&S) techniques and representative design solutions that can enable a high level of autonomy; Section 6.5 presents the technology that allows reconfiguration of autonomous software during mission operations; and Section 6.6 covers various tools and techniques for validation and verification of autonomous software. To demonstrate the practicality of the theoretical principles, Section 6.7 presents a design example of mission operation software for Mars rovers. The last section of the chapter, Section 6.8, outlines some over-the-horizon R&D ideas for achieving autonomous operations and systems for future planetary robotic missions.

1.4 Acknowledgments

The editor and coauthors of the book thank Wiley-VCH for publishing this work and the editorial team for their support.

Some parts of the book are based on results, experience, and knowledge gained from past funded R&D activities. The following funded projects should be acknowledged:

FASTER funded by European Community's Seventh Framework Programme (EU FP7/2007–2013) under grant agreement no. 284419.

PRoViDE funded by European Community's Seventh Framework Programme (EU FP7/2007–2013) under grant agreement no. 312377.

ExoMars mission development funded by European Space Agency (ESA) and European national space agencies.

GOAC funded by European Space Agency Technology Research Programme (ESA-TRP) under ESTEC contract no. 22361/09/NL/RA.

RoboSat funded by UK Royal Academy of Engineering (RAEng) Newton Research Collaboration Programme under grant agreement NRCP/1415/89.

Reconfigurable Autonomy funded by UK Engineering & Physical Sciences Research Council (EPSRC) under grant agreement EP/J011916/1.

Consolidated grant to MSSL funded by UK Science and Technology Facilities Council (STFC) under grant agreement ST/K000977/1.

BesMan funded by German Federal Ministry of Economics and Technology under grant agreement FKZ 50 RA 1216 and FKZ 50 RA 1217.

Projects and research carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

The editor and coauthors also thank the following organizations or colleagues for their support to this book project: Neptec Design Group, Airbus Defence and Space Future Programme and ExoMars GNC Team, Surrey Space Centre STAR Lab, JR 3D Vision Team led by Arnold Bauer, GOAC team, including Antonio Ceballos, Michel Van Winnendael, Kanna Rajan, Amedeo Cesta, Saddek Bensalem, and Konstantinos Kapellos. The book is also in memory of Dave Barnes, a long-time contributor to the planetary robotics R&D community in Europe.

Last but surely not least, many loved ones of the coauthors are greatly appreciated for their unreserved support while this book project was carried out, mostly during everyone's spare time, to name a few: Yang's mom and dad, Peter's wife Tracey and mom, Elie's wife Anneso, and others.

Chapter 2 Planetary Robotic System Design

Elie Allouis and Yang Gao

2.1 Introduction

At the inception of a new mission concept, the system design of a spacecraft (planetary robots included) is a critical phase that should not be overlooked. Galvanized by the prospects of an exciting concept, it is often tempting to delve too quickly into subsystem design, at the peril of the mission and its development team. Identification of the key links, interactions, and ramifications of design decisions is crucial to the feasibility of the concept and the success of the mission. Only then can the optimized system design be found, which may not be the sum of the optimized subsystems.

Building on a comprehensive review of existing and future planetary robotic missions, this chapter uses the system engineering design philosophy and discusses the mission-driven design considerations, the system-level design drivers, as well as the subsystem design trade-offs for a robotic system, whether the mission is set to explore the lunar craters, the Martian landscape, or beyond. It demonstrates the design thought process starting from the mission concept up to the baseline design at the system and subsystem levels. As a result, the chapter offers a number of system design tools as the foundation or integrator for the technologies discussed in subsequent chapters.

The chapter is structured systematically as follows:

Section 2.2

: This section describes in detail the system design approach and implementation steps applicable to planetary robotic missions and the required robotic systems. The process starts from defining the mission scenario, which provides inputs for the system-level functional analysis and the determination of functional objectives for the robot(s). This then allows progression to the next phase of the system definition by specifying and reviewing design requirements (e.g., using the S.M.A.R.T. method). Design drivers are subsequently identified and used to evaluate and trade off different design choices, which results in the baseline design.

Section 2.3

: Having introduced the system design philosophy and general tools, this section presents the state-of-the-art design and development examples of primary robotic platforms demonstrated through existing and future exploration missions. This provides a comprehensive overview of key robotic systems, subsystems, and their performance. The section provides extensive real-world examples of relevant robotic concepts, designs, and technologies.

Section 2.4

: This section identifies the space environmental factors that should be considered in any planetary robotic system design for a given mission, namely gravity, temperature, atmosphere/vacuum, orbital characteristics, surface conditions, and so on. Each factor is investigated thoroughly in terms of its impact on different system- or subsystem-level designs for the planetary robot(s). The section also presents the properties of various popular extraterrestrial targets, showing the rationales behind the resulting differences in robotic system design for different targets.

Section 2.5

: This section describes the design drivers based on the mission targets and objectives. A case study on Sample-Fetching Rover (SFR) for a Mars sample-return (MSR) mission concept is used to demonstrate how to implement system and subsystem design drivers, and how to perform trade-off analysis and design evaluation given the design drivers and options. It also presents the design tools that systematically capture the requirements (i.e., ripple graph) and evaluate design options (i.e., H.E.A.R.D.).

Section 2.6

: This section reviews the robotic system operation sequence and the design options that allow the robot to vary its level of autonomy between teleoperation and full autonomy. It identifies that the autonomy functions and levels of the robot are interrelated and are to be defined based on the design requirements and trade-off constraints.

Section 2.7

: This section presents design options for a couple of selected subsystems for planetary robots (i.e., power and thermal). These two are chosen because they are crucial subsystems, which also drive the entire system design of planetary robots. In addition, they cover complementary design aspects of planetary robotic systems to those addressed by the subsequent chapters in the book. The section also presents a wide range of design options when each subsystem is discussed, covering both the state-of-the-art and future technologies.

2.2 A System Design Approach: From Mission Concept to Baseline Design

Similar to any space system, the design of a planetary robot is an iterative process starting from a mission concept and leading to a consolidated design. This chapter discusses the system engineering approach, guiding the reader through the typical system definition activities and providing background information to understand the state of the art and to critically review the design decisions that have shaped past and current planetary missions. The process is composed of a number of important steps that should be investigated sequentially in order to avoid being drawn too quickly from a concept to a technological solution; such a pitfall can significantly constrain the implementation of the target system and lead to costly changes at a later stage in the mission development. The system design approach, as shown in Figure 2.1, therefore concentrates on the preliminary specification and definition of the target system and on the identification of a wide range of possible implementation options. These options can then be critically compared on their merits (e.g., performance, cost, and complexity) before being down-selected to a baseline design. Hence, the definition of the initial problem is critical to ensure that the final system fulfills its original purpose. In the remainder of this section, an example mission concept is used to illustrate the key steps shown in Figure 2.1 and how each stage of the design process provides the critical data or inputs required by the following step.

Figure 2.1 A system design approach.
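As a minimal, hedged sketch of the comparison and down-selection step described above (the criteria, weights, scores, and option names below are illustrative assumptions, not values from this book), a weighted-sum trade-off over a handful of locomotion options could be scripted as follows; Section 2.5 and Tables 2.2–2.4 discuss weighted trade-off assessments in more detail.

# Minimal weighted trade-off sketch: each design option is scored against a set
# of criteria, the scores are weighted, and the highest total becomes the
# candidate baseline. All criteria, weights, and scores here are illustrative.

weights = {"performance": 0.4, "cost": 0.3, "complexity": 0.3}

# Scores on a 1 (poor) to 5 (good) scale; for cost and complexity, a higher
# score means cheaper / simpler.
options = {
    "four wheels": {"performance": 3, "cost": 4, "complexity": 4},
    "six wheels":  {"performance": 4, "cost": 3, "complexity": 3},
    "legged":      {"performance": 5, "cost": 2, "complexity": 1},
}


def weighted_score(scores: dict[str, float]) -> float:
    """Weighted sum of one option's scores across all criteria."""
    return sum(weights[criterion] * value for criterion, value in scores.items())


if __name__ == "__main__":
    ranked = sorted(options.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
    for name, scores in ranked:
        print(f"{name:12s} {weighted_score(scores):.2f}")
    print("Candidate baseline:", ranked[0][0])

The point of the sketch is the process, not the numbers: changing the weighting factors (as discussed later for the trade-off tables) can change which option emerges as the baseline, which is why the weights themselves need to be agreed and justified.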

2.2.1 Mission Scenario Definition

A new mission concept starts with a number of key criteria that the mission is expected to achieve. Mission concepts are more often than not either led by a science team pitching the concept to a space agency, or reciprocally initiated by the space agency itself, involving a mission definition team to flesh out the concept. To date, most planetary exploration missions have been primarily science driven, aiming to answer fundamental questions about the target bodies that the robotic systems are used to explore (e.g., find traces of life on Mars, or understand and characterize the presence of water in the shadowed craters of the Moon). In the future, robotic systems may be used to support human settlements on the Moon and beyond, in which case the mission scenario will concentrate on addressing needs beyond pure science, such as building the necessary infrastructure prior to the arrival of humans (e.g., habitats, or local production of oxygen or water through in situ resource utilization).

Outputs

– Mission objective, science objectives.

Example

– A challenging mission to Mars is proposed to explore caves on the red planet. It is anticipated that a number of candidate locations have been identified and that a mobile platform is required to investigate whether traces of water ice can be found as well as possible signs of past or present life. The platform will carry a suite of instruments to characterize the environment through optical systems, contact and noncontact sensing (e.g., drill and spectrometers).

2.2.2 Functional Analysis

A functional analysis is required at a number of design stages. At the system level, it captures and reformulates the mission objectives in terms of functions that the overall system must perform, to help with the requirements definition. It is important to capture the functions, not their implementation, at this stage. These can be expressed diagrammatically or through short statements.

Outputs

– Identification of the system-level functionalities and operations.

Example

– A mobile platform will be deployed on the surface of Mars. The mobile platform will carry a suite of instruments. The mobile platform will access the Martian caves. The platform will deploy a suite of instruments in the cave to characterize soils and rocks. The platform will communicate directly with an orbiter. The platform will not rely on radioisotopic power systems. The platform will be compliant with Planetary Protection Level X.

In a typical industrial setup, the Mission Definition and the System-Level Functional Analysis tend to both be performed by the customer (e.g., a space agency) at the inception of the project. They can also originate from a working group fleshing out mission concepts to address a specific science need.

2.2.3 Requirements Definition and Review

Building upon the functional analysis, this stage formalizes the functions into more formal requirements that define the system or represent the expected outcome of the project. These requirements capture functional and nonfunctional aspects to frame the definition of the system. Functional requirements include the features of the system, its behaviors, its capabilities, and the conditions that must exist for its operation. Nonfunctional requirements capture the environmental conditions under which the system must remain effective or describe the performance or quality of service of the solution. This process is critical, as the set of requirements bounds the design of the system and flows down to every subsystem and operation.

Given that a great number of people are often involved in a typical space mission project from inception to operation, it is important to define these requirements in such a way that they do not initially overconstrain the design or prescribe a specific technological solution too early. In addition, they must provide an unambiguous way of framing the design of the system despite the different perspectives of the stakeholders. As such, a requirement for a mobile platform could be written as "The platform shall have six wheels." This particular wording is not advised, as it already constrains the type of locomotion method (i.e., wheels and their number). It is possible that a specific design heritage is anticipated as part of the project, leading to the need to reuse a specific solution. Nevertheless, the requirement can be made more general, such as "The platform shall provide the necessary mobility to access and return from the cave environment." This wording keeps options open and allows a subsequent trade-off study to investigate how the mobility could be achieved in the target environment, for example, with four or six wheels, or maybe legs?

To help define good and useful requirements, Doran [1] proposed the S.M.A.R.T. method, a mnemonic acronym that consists of five criteria for setting objectives and requirements, as detailed in Table 2.1. Depending on the context and the application, a number of alternative descriptions for each of the S.M.A.R.T. criteria can be found. For the purpose of defining technical requirements, definitions for the criteria, drawing upon further work in Refs [2, 3], are also summarized in Table 2.1.
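As a minimal sketch of how such a requirement might be recorded for review (the class, field names, and review notes below are illustrative assumptions, not material from the book), a requirement record with S.M.A.R.T.-style annotations could look like this:

from dataclasses import dataclass, field


@dataclass
class Requirement:
    """A single system requirement with S.M.A.R.T.-style review notes (illustrative)."""
    ident: str
    text: str
    kind: str  # "functional" or "non-functional"
    smart_review: dict[str, str] = field(default_factory=dict)


# Example reusing the implementation-neutral wording discussed above, so the
# locomotion trade-off (wheels or legs) stays open for a later study.
mobility_req = Requirement(
    ident="SYS-MOB-010",
    text="The platform shall provide the necessary mobility to access and "
         "return from the cave environment.",
    kind="functional",
    smart_review={
        "Specific": "Names the function (mobility) and the environment (cave access and return).",
        "Measurable": "Needs a verifiable traverse distance or slope figure added later.",
    },
)

The example deliberately leaves implementation choices out of the requirement text, in line with the guidance above, and uses the review notes to flag where the requirement still falls short of the criteria in Table 2.1.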

Table 2.1 Defining suitable requirements through S.M.A.R.T. criteria

ID   Meaning     Definition
S    Specific    Concise and Complete