Matrix Algebra Useful for Statistics

Shayle R. Searle
Description

A thoroughly updated guide to matrix algebra and its uses in statistical analysis, featuring SAS®, MATLAB®, and R throughout

This Second Edition addresses matrix algebra that is useful in the statistical analysis of data as well as within statistics as a whole. The material is presented in an explanatory style rather than a formal theorem-proof format and is self-contained. Featuring numerous applied illustrations, numerical examples, and exercises, the book has been updated to include the use of SAS, MATLAB, and R for the execution of matrix computations. In addition, André I. Khuri, who has extensive research and teaching experience in the field, joins this new edition as co-author. The Second Edition also:

  • Contains new coverage on vector spaces and linear transformations and discusses computational aspects of matrices
  • Covers the analysis of balanced linear models using direct products of matrices
  • Analyzes multiresponse linear models where several responses can be of interest
  • Includes extensive use of SAS, MATLAB, and R throughout
  • Contains over 400 examples and exercises to reinforce understanding along with select solutions
  • Includes plentiful new illustrations depicting the importance of geometry as well as historical interludes

Matrix Algebra Useful for Statistics, Second Edition is an ideal textbook for advanced undergraduate and first-year graduate level courses in statistics and other related disciplines. The book is also appropriate as a reference for independent readers who use statistics and wish to improve their knowledge of matrix algebra.

THE LATE SHAYLE R. SEARLE, PHD, was professor emeritus of biometry at Cornell University. He was the author of Linear Models for Unbalanced Data and Linear Models and co-author of Generalized, Linear, and Mixed Models, Second Edition, Matrix Algebra for Applied Economics, and Variance Components, all published by Wiley. Dr. Searle received the Alexander von Humboldt Senior Scientist Award, and he was an honorary fellow of the Royal Society of New Zealand.

ANDRÉ I. KHURI, PHD, is Professor Emeritus of Statistics at the University of Florida. He is the author of Advanced Calculus with Applications in Statistics, Second Edition and co-author of Statistical Tests for Mixed Linear Models, all published by Wiley. Dr. Khuri is a member of numerous academic associations, among them the American Statistical Association and the Institute of Mathematical Statistics.


Page count: 687

Year of publication: 2017




WILEY SERIES IN PROBABILITY AND STATISTICS

Established by Walter A. Shewhart and Samuel S. Wilks

Editors: David J. Balding, Noel A. C. Cressie, Garrett M. Fitzmaurice, Geof H. Givens, Harvey Goldstein, Geert Molenberghs, David W. Scott, Adrian F. M. Smith, Ruey S. Tsay

Editors Emeriti: J. Stuart Hunter, Iain M. Johnstone, Joseph B. Kadane, Jozef L. Teugels

The Wiley Series in Probability and Statistics is well established and authoritative. It covers many topics of current research interest in both pure and applied statistics and probability theory. Written by leading statisticians and institutions, the titles span both state-of-the-art developments in the field and classical methods.

Reflecting the wide range of current research in statistics, the series encompasses applied, methodological and theoretical statistics, ranging from applications and new techniques made possible by advances in computerized practice to rigorous treatment of theoretical approaches. This series provides essential and invaluable reading for all statisticians, whether in academia, industry, government, or research.

A complete list of titles in this series can be found at http://www.wiley.com/go/wsps

Matrix Algebra Useful for Statistics

Second Edition

Shayle R. Searle

André I. Khuri

Copyright © 2017 by John Wiley & Sons, Inc. All rights reserved.

Published by John Wiley & Sons, Inc., Hoboken, New Jersey. Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permission.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.

Library of Congress Cataloging-in-Publication Data is available.

ISBN: 978-1-118-93514-9

In Memory of Shayle R. Searle, a Good Friend and Colleague

To My Faithful Wife, Ronnie, and Dedicated Children, Marcus and Roxanne, and Their Families

CONTENTS

PREFACE

PREFACE TO THE FIRST EDITION

INTRODUCTION

ABOUT THE COMPANION WEBSITE

PART I DEFINITIONS, BASIC CONCEPTS, AND MATRIX OPERATIONS

1 Vector Spaces, Subspaces, and Linear Transformations

1.1 Vector Spaces

1.2 Base of a Vector Space

1.3 Linear Transformations

Reference

Exercises

2 Matrix Notation and Terminology

2.1 Plotting of a Matrix

2.2 Vectors and Scalars

2.3 General Notation

Exercises

3 Determinants

3.1 Expansion by Minors

3.2 Formal Definition

3.3 Basic Properties

3.4 Elementary Row Operations

3.5 Examples

3.6 Diagonal Expansion

3.7 The Laplace Expansion

3.8 Sums and Differences of Determinants

3.9 A Graphical Representation of a 3 × 3 Determinant

References

Exercises

Notes

4 Matrix Operations

4.1 The Transpose of a Matrix

4.2 Partitioned Matrices

4.3 The Trace of a Matrix

4.4 Addition

4.5 Scalar Multiplication

4.6 Equality and the Null Matrix

4.7 Multiplication

4.8 The Laws of Algebra

4.9 Contrasts With Scalar Algebra

4.10 Direct Sum of Matrices

4.11 Direct Product of Matrices

4.12 The Inverse of a Matrix

4.13 Rank of a Matrix—Some Preliminary Results

4.14 The Number of LIN Rows and Columns in a Matrix

4.15 Determination of The Rank of a Matrix

4.16 Rank and Inverse Matrices

4.17 Permutation Matrices

4.18 Full-Rank Factorization

References

Exercises

5 Special Matrices

5.1 Symmetric Matrices

5.2 Matrices Having all Elements Equal

5.3 Idempotent Matrices

5.4 Orthogonal Matrices

5.5 Parameterization of Orthogonal Matrices

5.6 Quadratic Forms

5.7 Positive Definite Matrices

References

Exercises

6 Eigenvalues and Eigenvectors

6.1 Derivation of Eigenvalues

6.2 Elementary Properties of Eigenvalues

6.3 Calculating Eigenvectors

6.4 The Similar Canonical Form

6.5 Symmetric Matrices

6.6 Eigenvalues of Orthogonal and Idempotent Matrices

6.7 Eigenvalues of Direct Products and Direct Sums of Matrices

6.8 Nonzero Eigenvalues of AB and BA

References

Exercises

Notes

7 Diagonalization of Matrices

7.1 Proving the Diagonability Theorem

7.2 Other Results for Symmetric Matrices

7.3 The Cayley–Hamilton Theorem

7.4 The Singular-Value Decomposition

References

Exercises

8 Generalized Inverses

8.1 The Moore–Penrose Inverse

8.2 Generalized Inverses

8.3 Other Names and Symbols

8.4 Symmetric Matrices

References

Exercises

9 Matrix Calculus

9.1 Matrix Functions

9.2 Iterative Solution of Nonlinear Equations

9.3 Vectors of Differential Operators

9.4 Vec and Vech Operators

9.5 Other Calculus Results

9.6 Matrices With Elements That Are Complex Numbers

9.7 Matrix Inequalities

References

Exercises

Notes

PART II APPLICATIONS OF MATRICES IN STATISTICS

10 Multivariate Distributions and Quadratic Forms

10.1 Variance-Covariance Matrices

10.2 Correlation Matrices

10.3 Matrices of Sums of Squares and Cross-Products

10.4 The Multivariate Normal Distribution

10.5 Quadratic Forms and χ²-Distributions

10.6 Computing the Cumulative Distribution Function of a Quadratic Form

References

Exercises

11 Matrix Algebra of Full-Rank Linear Models

11.1 Estimation of β by the Method of Least Squares

11.2 Statistical Properties of the Least-Squares Estimator

11.3 Multiple Correlation Coefficient

11.4 Statistical Properties Under the Normality Assumption

11.5 Analysis of Variance

11.6 The Gauss–Markov Theorem

11.7 Testing Linear Hypotheses

11.8 Fitting Subsets of the x-Variables

11.9 The Use of the R( · | · ) Notation in Hypothesis Testing

References

Exercises

12 Less-Than-Full-Rank Linear Models

12.1 General Description

12.2 The Normal Equations

12.3 Solving the Normal Equations

12.4 Expected Values and Variances

12.5 Predicted y-Values

12.6 Estimating the Error Variance

12.7 Partitioning the Total Sum of Squares

12.8 Analysis of Variance

12.9 The R( · | · ) Notation

12.10 Estimable Linear Functions

12.11 Confidence Intervals

12.12 Some Particular Models

12.13 The R( · | · ) Notation (Continued)

12.14 Reparameterization to a Full-Rank Model

References

Exercises

13 Analysis of Balanced Linear Models Using Direct Products of Matrices

13.1 General Notation for Balanced Linear Models

13.2 Properties Associated with Balanced Linear Models

13.3 Analysis of Balanced Linear Models

References

Exercises

14 Multiresponse Models

14.1 Multiresponse Estimation of Parameters

14.2 Linear Multiresponse Models

14.3 Lack of Fit of a Linear Multiresponse Model

References

Exercises

PART III MATRIX COMPUTATIONS AND RELATED SOFTWARE

15 SAS/IML

15.1 Getting Started

15.2 Defining a Matrix

15.3 Creating a Matrix

15.4 Matrix Operations

15.5 Explanations of SAS Statements Used Earlier in the Text

References

Exercises

16 Use of MATLAB in Matrix Computations

16.1 Arithmetic Operators

16.2 Mathematical Functions

16.3 Construction of Matrices

16.4 Two- and Three-Dimensional Plots

References

Exercises

17 Use of R in Matrix Computations

17.1 Two- and Three-Dimensional Plots

References

Exercises

APPENDIX SOLUTIONS TO EXERCISES

Chapter 1

Chapter 2

Chapter 3

Chapter 4

Chapter 5

Chapter 6

Chapter 7

Chapter 8

Chapter 9

Chapter 10

Chapter 11

Chapter 12

Chapter 13

Chapter 14

Chapter 15

Chapter 16

Chapter 17

INDEX

EULA

List of Tables

Chapter 4

Table 4.1

Table 4.2

Chapter 9

Table 9.1

Chapter 10

Table 10.1

Table 10.2

Chapter 11

Table 11.1

Table 11.2

Table 11.3

Table 11.4

Table 11.5

Table 11.6

Table 11.7

Table 11.8

Table 11.9

Table 11.10

Table 11.11

Chapter 12

Table 12.1

Table 12.2

Table 12.3

Table 12.4

Table 12.5

Table 12.6

Table 12.8

Table 12.9

Table 12.10

Table 12.11

Table 12.12

Table 12.13

Chapter 13

Table 13.1

Table 13.2

Table 13.3

Table 13.4

Table 13.5

Table 13.6

Table 13.7

Table 13.8

Table 13.9

Chapter 14

Table 14.1

Chapter 15

Table 15.1

Chapter 16

Table 16.1

Table 16.2

Chapter 17

Table 17.1


Preface

The primary objective of the second edition is to update the material in the first edition. This is a significant undertaking, given that the first edition appeared in 1982. It should be pointed out at the outset that this is more than just an update: it is in fact a major revision of the material, affecting not only its presentation but also its applicability and use by the reader.

The second edition consists of three parts. Part I comprises Chapters 1–9; with the exception of the new Chapter 1, these chapters update the material of Chapters 1–12 in the first edition. They are preceded by an introductory chapter giving historical perspectives on matrix algebra. Chapter 1 discusses vector spaces and linear transformations, which serve as an introduction to matrices. Part II addresses applications of matrices in statistics and consists of Chapters 10–14. Chapters 10 and 11 update Chapters 13 and 14 of the first edition. Chapter 12 is similar to Chapter 15 in the first edition; it covers models that are less than full rank. Chapter 13 is entirely new and discusses the analysis of balanced linear models using direct products of matrices. Chapter 14 is also a new addition, covering multiresponse linear models in which several responses can be of interest. Part III is new. It covers computational aspects of matrices and consists of three chapters: Chapter 15 is on the use of SAS/IML, Chapter 16 covers the use of MATLAB, and Chapter 17 discusses the implementation of R in matrix computations. These three chapters are self-contained and provide the reader with the tools needed to carry out all the computations described in the book. Readers can choose whichever software they feel most comfortable with, and learning the other tools as well can be beneficial.

The second edition displays a large number of figures to illustrate certain computational details. This provides a visual depiction of matrix entities such as the plotting of a matrix and the graphical representation of a determinant. In addition, many examples have been included to provide a better understanding of the material.

A new feature in the second edition is the addition of detailed solutions to all the odd-numbered exercises. The even-numbered solutions will be placed online by the publisher. This can be helpful to the reader who desires to use the book as a source for learning matrix algebra.

As with the first edition, the second edition emphasizes “bringing to a broad spectrum of readers a knowledge of matrix algebra that is useful in the statistical analysis of data and in statistics in general.” The second edition should therefore appeal to all those who desire to gain a better understanding of matrix algebra and its applications in linear models and multivariate statistics. The computing capability available to the reader is particularly enhanced by the inclusion of Part III on matrix computations.

I am grateful to my wife Ronnie, my daughter Roxanne, and son Marcus for their support and keeping up with my progress in writing the book over the past 3 years. I am also grateful to Steve Quigley, a former editor with John Wiley & Sons, for having given me the opportunity to revise the first edition. Furthermore, my gratitude goes to Julie Platt, an Editor-in-Chief with the SAS Institute, for allowing me to use the SAS software in the second edition for two consecutive years.

ANDRÉ I. KHURI

Jacksonville, Florida
January 2017

Preface to the First Edition

Algebra is a mathematical shorthand for language, and matrices are a shorthand for algebra. Consequently, a special value of matrices is that they enable many mathematical operations, especially those arising in statistics and the quantitative sciences, to be expressed concisely and with clarity. The algebra of matrices is, of course, in no way new, but its presentation is often so surrounded by the trappings of mathematical generality that assimilation can be difficult for readers who have only limited ability or training in mathematics. Yet many such people nowadays find a knowledge of matrix algebra necessary for their work, especially where statistics and/or computers are involved. It is to these people that I address this book, and for them, I have attempted to keep the mathematical presentation as informal as possible.

The pursuit of knowledge frequently involves collecting data, and those responsible for the collecting must appreciate the need for analyzing their data to recover and interpret the information contained therein. Such people must therefore understand some of the mathematical tools necessary for this analysis, to an extent either that they can carry out their own analysis, or that they can converse with statisticians and mathematicians whose help will otherwise be needed. One of the necessary tools is matrix algebra. It is becoming as necessary to science today as elementary calculus has been for generations. Matrices originated in mathematics more than a century ago, but their broad adaptation to science is relatively recent, prompted by the widespread acceptance of statistical analysis of data, and of computers to do that analysis; both statistics and computing rely heavily on matrix algebra. The purpose of this book is therefore that of bringing to a broad spectrum of readers a knowledge of matrix algebra that is useful in the statistical analysis of data and in statistics generally.

The basic prerequisite for using the book is high school algebra. Differential calculus is used on only a few pages, which can easily be omitted; nothing will be lost insofar as a general understanding of matrix algebra is concerned. Proofs and demonstrations of most of the theory are given, for without them the presentation would be lifeless. But in every chapter the theoretical development is profusely illustrated with elementary numerical examples and with illustrations taken from a variety of applied sciences. And the last three chapters are devoted solely to uses of matrix algebra in statistics, with Chapters 14 and 15 outlining two of the most widely used statistical techniques: regression and linear models.

The mainstream of the book is its first 11 chapters, beginning with one on introductory concepts that includes a discussion of subscript and summation notation. This is followed by four chapters dealing with basic arithmetic, special matrices, determinants and inverses. Chapters 6 and 7 are on rank and canonical forms, 8 and 9 deal with generalized inverses and solving linear equations, 10 is a collection of results on partitioned matrices, and 11 describes eigenvalues and eigenvectors. Background theory for Chapter 11 is collected in an appendix, Chapter 11A, some summaries and miscellaneous topics make up Chapter 12, statistical illustrations constitute Chapter 13, and Chapters 14 and 15 describe regression and linear models. All chapters except the last two end with exercises.

Occasional sections and paragraphs can be omitted at a first reading, especially by those whose experience in mathematics is somewhat limited. These portions of the book are printed in small type and, generally speaking, contain material subsidiary to the main flow of the text—material that may be a little more advanced in mathematical presentation than the general level otherwise maintained.

Chapters, and sections within chapters, are numbered with Arabic numerals 1, 2, 3,… Within-chapter references to sections are by section number, but references across chapters use the decimal system, for example, Section 1.3 is Section 3 of Chapter 1. These numbers are also shown in the running head of each page, for example, [1.3] is found on page 4. Numbered equations are (1), (2),…, within each chapter. Those of one chapter are seldom referred to in another, but when they are, the chapter reference is explicit; otherwise “equation (3)” or more simply “(3)” means the equation numbered (3) in the chapter concerned. Exercises are in unnumbered sections and are referenced by their chapter number; for example, Exercise 6.2 is Exercise 2 at the end of Chapter 6.

I am greatly indebted to George P. H. Styan for his exquisitely thorough readings of two drafts of the manuscript and his extensive and very helpful array of comments. Harold V. Henderson’s numerous suggestions for the final manuscript were equally helpful. Readers of Matrix Algebra for the Biological Sciences (Wiley, 1966), and students in 15 years of my matrix algebra course at Cornell, have also contributed many useful ideas. Particular thanks go to Mrs. Helen Seamon for her superb accuracy on the typewriter, patience, and fantastic attention to detail; such attributes are greatly appreciated.

SHAYLE R. SEARLE

Ithaca, New York
May 1982

About the Companion Website

This book is accompanied by a companion website:

    www.wiley.com/go/searle/matrixalgebra2e

The website includes:

Solutions to even-numbered exercises (for instructors only)

Part I Definitions, Basic Concepts, and Matrix Operations

This is the first of three parts that make up this book. The purpose of Part I is to familiarize the reader with the basic concepts and results of matrix algebra. It is designed to provide the tools needed for the understanding of a wide variety of topics in statistics where matrices are used, such as linear models and multivariate analysis, among others. Some of these topics will be addressed in Part II. Proofs of several theorems are given as we believe that understanding the development of a proof can in itself contribute to acquiring a greater ability in dealing with certain matrix intricacies that may be encountered in statistics. However, we shall not attempt to turn this part into a matrix theory treatise overladen with theorems and proofs, which can be quite insipid. Instead, emphasis will be placed on providing an appreciation of the theory, but without losing track of the objective of learning matrix algebra, namely acquiring the ability to apply matrix results in statistics. The theoretical development in every chapter is illustrated with numerous examples to motivate the learning of the theory. The material in Part II will demonstrate the effectiveness of using such theory in statistics.

Part I consists of the following nine chapters:

Chapter 1: Vector Spaces, Subspaces, and Linear Transformations.

Matrix algebra had its foundation in simultaneous linear equations which represented a linear transformation from one n-dimensional Euclidean space to another of the same dimension. This idea was later extended to include linear transformations between more general spaces, not necessarily of the same dimension. Such linear transformations gave rise to matrices. An n-dimensional Euclidean space is a special case of a wider concept called a vector space.
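Part III demonstrates such computations in SAS, MATLAB, and R; as a minimal sketch outside the book's own toolset (Python with NumPy, purely for illustration), a system of simultaneous linear equations is a single matrix acting on a vector:

```python
import numpy as np

# The simultaneous equations  2x + y = 5  and  x - y = 1
# are the single matrix statement  A @ v = b:
A = np.array([[2.0,  1.0],
              [1.0, -1.0]])   # coefficient matrix
b = np.array([5.0, 1.0])

v = np.linalg.solve(A, b)     # the point that A maps onto b
print(v)                      # -> [2. 1.]

# A itself represents the linear transformation v -> A @ v on R^2
assert np.allclose(A @ v, b)
```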

Chapter 2: Matrix Notation and Terminology.

In order to understand and work with matrices, it is necessary to be quite familiar with the notation and system of terms used in matrix algebra. This chapter defines matrices as rectangular or square arrays of numbers arranged in rows and columns.

Chapter 3: Determinants.

This chapter introduces determinants and provides a description of their basic properties. Various methods of determinantal expansions are included.

Chapter 4: Matrix Operations.

This chapter covers various aspects of matrix operations such as partitioning of matrices, multiplication, direct sum, and direct products of matrices, the inverse and rank of matrices, and full-rank factorization.
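As a hedged illustration of two of these operations (in NumPy, standing in for the book's SAS/MATLAB/R examples): the direct sum places matrices along a block diagonal, while the direct (Kronecker) product replaces each element of one matrix by that element times the whole of the other:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
Z = np.zeros((2, 2))

direct_sum = np.block([[A, Z],
                       [Z, B]])   # A ⊕ B: A and B on the diagonal, zeros elsewhere
direct_prod = np.kron(A, B)       # A ⊗ B: each a_ij replaced by a_ij * B

print(direct_sum.shape, direct_prod.shape)           # both 4 x 4
print(np.linalg.matrix_rank(A))                      # -> 2, so A is invertible
print(np.allclose(np.linalg.inv(A) @ A, np.eye(2)))  # -> True
```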

Chapter 5: Special Matrices.

Certain types of matrices are frequently used in statistics, such as symmetric, orthogonal, idempotent, positive definite matrices. This chapter also includes different methods to parameterize orthogonal matrices.

Chapter 6: Eigenvalues and Eigenvectors.

A detailed study is given of the eigenvalues and eigenvectors of square matrices, their properties and actual computation. Eigenvalues of certain special matrices, such as symmetric, orthogonal, and idempotent matrices, are discussed, in addition to those that pertain to direct products and direct sums of matrices.
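For a concrete taste (again a NumPy sketch in place of the book's SAS/MATLAB/R computations), the eigenvalues λ and eigenvectors u of a symmetric matrix S satisfy S u = λ u:

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])      # symmetric, so the eigenvalues are real

vals, vecs = np.linalg.eigh(S)  # eigh is the routine for symmetric matrices
print(vals)                     # -> [1. 3.]

# Each column of vecs is an eigenvector paired with the matching eigenvalue
for lam, u in zip(vals, vecs.T):
    assert np.allclose(S @ u, lam * u)
```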

Chapter 7: Diagonalization of Matrices.

Different methods are given to diagonalize matrices that satisfy certain properties. The Cayley–Hamilton theorem, and the singular-value decomposition of matrices are also covered.

Chapter 8: Generalized Inverses.

The Moore–Penrose inverse and the more general generalized inverses of matrices are discussed. Properties of generalized inverses of symmetric matrices are studied, including the special case of the matrix.

Chapter 9: Matrix Calculus.

Coverage is given of calculus results associated with matrices, such as functions of matrices, infinite series of matrices, vectors of differential operators, quadratic forms, differentiation of matrices, traces, and determinants, in addition to matrices of second-order partial derivatives, and matrix inequalities.

1 Vector Spaces, Subspaces, and Linear Transformations

The study of matrices is based on the concept of linear transformations between two vector spaces. It is therefore necessary to define what this concept means in order to understand the setup of a matrix. In this chapter, as well as in the remainder of the book, the set of all real numbers is denoted by R, and its elements are referred to as scalars. The set of all n-tuples of real numbers will be denoted by Rn (n ≥ 1).

1.1 Vector Spaces

This section introduces the reader to ideas that are used extensively in many books on linear and matrix algebra. They involve extensions of Euclidean geometry that are important in the current mathematical literature and are described here as a convenient introductory reference for the reader. We confine ourselves to real numbers and to vectors whose elements are real numbers.

1.1.1 Euclidean Space

A vector (x0, y0)′ of two elements can be thought of as representing a point in a two-dimensional Euclidean space using the familiar Cartesian x, y coordinates, as in Figure 1.1. Similarly, a vector (x0, y0, z0)′ of three elements can represent a point in a three-dimensional Euclidean space, also shown in Figure 1.1. In general, a vector of n elements can be said to represent a point (an n-tuple) in what is called an n-dimensional Euclidean space. This is a special case of a wider concept called a vector space, which we now define.
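As a small sketch (NumPy, for illustration only; the book's own computations use SAS, MATLAB, and R), such points are simply arrays of coordinates, and Euclidean length carries over unchanged from n = 2 or 3 to any n:

```python
import numpy as np

p2 = np.array([3.0, 4.0])       # a point (x0, y0)' in 2-dimensional space
p3 = np.array([1.0, 2.0, 2.0])  # a point (x0, y0, z0)' in 3-dimensional space

# Distance from the origin: the square root of the sum of squared coordinates
print(np.linalg.norm(p2))       # -> 5.0  (the 3-4-5 right triangle)
print(np.linalg.norm(p3))       # -> 3.0  (sqrt(1 + 4 + 4))
```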

Definition 1.1 (Vector Spaces)