Introduction to Probability

Narayanaswamy Balakrishnan

Description

An essential guide to the concepts of probability theory that puts the focus on models and applications.

Introduction to Probability offers an authoritative text that presents the main ideas and concepts, as well as the theoretical background, models, and applications of probability. The authors, noted experts in the field, include a review of problems where probabilistic models naturally arise and discuss the methodology used to tackle them. A wide range of topics is covered, including the concepts of probability and conditional probability, univariate discrete distributions, and univariate continuous distributions, along with a detailed presentation of the most important probability distributions used in practice, together with their main properties and applications. Designed as a useful guide, the text contains probability theory, definitions, charts, examples with solutions, illustrations, self-assessment exercises, computational exercises, problems, and a glossary. This important text:

* Includes classroom-tested problems and solutions to probability exercises
* Highlights real-world exercises designed to make clear the concepts presented
* Uses Mathematica software to illustrate the text's computer exercises
* Features applications representing worldwide situations and processes
* Offers two types of self-assessment exercises at the end of each chapter, so that students may review the material in that chapter and monitor their progress

Written for students majoring in statistics, engineering, operations research, computer science, physics, and mathematics, Introduction to Probability: Models and Applications is an accessible text that explores the basic concepts of probability and includes detailed information on models and applications.


Page count: 862

Publication year: 2019




Table of Contents

Cover

Dedication

Preface

1 The Concept of Probability

1.1 Chance Experiments – Sample Spaces

Group A

1.2 Operations Between Events

Group A

Group B

1.3 Probability as Relative Frequency

1.4 Axiomatic Definition of Probability

Group A

Group B

1.5 Properties of Probability

Group A

Group B

1.6 The Continuity Property of Probability

Group A

Group B

1.7 Basic Concepts and Formulas

1.8 Computational Exercises

1.9 Self‐Assessment Exercises

1.10 Review Problems

1.11 Applications

Key Terms

2 Finite Sample Spaces – Combinatorial Methods

2.1 Finite Sample Spaces with Events of Equal Probability

Group A

Group B

2.2 Main Principles of Counting

Group A

Group B

2.3 Permutations

Group A

Group B

2.4 Combinations

Group A

Group B

2.5 The Binomial Theorem

Group A

Group B

2.6 Basic Concepts and Formulas

2.7 Computational Exercises

2.8 Self‐Assessment Exercises

2.9 Review Problems

2.10 Applications

Key Terms

3 Conditional Probability – Independent Events

3.1 Conditional Probability

Group A

Group B

3.2 The Multiplicative Law of Probability

Group A

Group B

3.3 The Law of Total Probability

Group A

Group B

3.4 Bayes' Formula

Group A

Group B

3.5 Independent Events

Group A

Group B

3.6 Basic Concepts and Formulas

3.7 Computational Exercises

3.8 Self‐Assessment Exercises

3.9 Review Problems

3.10 Applications

Key Terms

4 Discrete Random Variables and Distributions

4.1 Random Variables

4.2 Distribution Functions

Group A

Group B

4.3 Discrete Random Variables

Group A

Group B

4.4 Expectation of a Discrete Random Variable

Group A

Group B

4.5 Variance of a Discrete Random Variable

Group A

Group B

4.6 Some Results for Expectation and Variance

Group A

Group B

4.7 Basic Concepts and Formulas

4.8 Computational Exercises

4.9 Self‐Assessment Exercises

4.10 Review Problems

4.11 Applications

Key Terms

5 Some Important Discrete Distributions

5.1 Bernoulli Trials and Binomial Distribution

Group A

Group B

5.2 Geometric and Negative Binomial Distributions

Group A

Group B

5.3 The Hypergeometric Distribution

Group A

Group B

5.4 The Poisson Distribution

Group A

Group B

5.5 The Poisson Process

Group A

Group B

5.6 Basic Concepts and Formulas

5.7 Computational Exercises

5.8 Self‐Assessment Exercises

5.9 Review Problems

5.10 Applications

Key Terms

6 Continuous Random Variables

6.1 Density Functions

Group A

Group B

6.2 Distribution for a Function of a Random Variable

Group A

Group B

6.3 Expectation and Variance

Group A

Group B

6.4 Additional Useful Results for the Expectation

Group A

Group B

6.5 Mixed Distributions

Group A

Group B

6.6 Basic Concepts and Formulas

6.7 Computational Exercises

6.8 Self‐Assessment Exercises

6.9 Review Problems

6.10 Applications

Key Terms

7 Some Important Continuous Distributions

7.1 The Uniform Distribution

Group A

Group B

7.2 The Normal Distribution

Group A

Group B

7.3 The Exponential Distribution

Group A

Group B

7.4 Other Continuous Distributions

Group A

Group B

7.5 Basic Concepts and Formulas

7.6 Computational Exercises

7.7 Self‐Assessment Exercises

7.8 Review Problems

7.9 Applications

Key Terms

Appendix A: Sums and Products

Useful Formulas

Appendix B: Distribution Function of the Standard Normal Distribution

Appendix C: Simulation

Appendix D: Discrete and Continuous Distributions

Bibliography

Other non-technical books

Web sources

Index

End User License Agreement

List of Tables

Chapter 1

Table 1.1 Dice outcomes of two throws (… repetitions).

Table 1.2 The system reliability for several values of functioning probability ...

Chapter 2

Table 2.1 Values of … for … and ….

Table 2.2 Choosing three elements with repetitions from a set of four elements.

Table 2.3 Different selections for the five cars ordered.

Table 2.4 A different way of presenting combinations with repetitions.

Table 2.5 Values of the probabilities … for …, ….

Chapter 4

Table 4.1 Probabilities of events for a variable … given in terms of its distri...

Table 4.3 Calculations for ….

Table 4.4 Notation and terminology for the various types of moments.

Chapter 5

Table 5.1 Mathematica functions used to obtain various quantities associated wit...

Table 5.2 Mathematica functions for discrete distributions.

Table 5.3 Probability that all passengers can be accommodated in an airplane of ...

Chapter 7

Table 7.1 Extract from the table of values of the standard normal distribution f...

Table 7.2 Mathematica functions for continuous distributions.

List of Illustrations

Chapter 1

Figure 1.1 Tree diagram for the experiment of tossing three coins.

Figure 1.2 Tree diagram for the emission of a signal consisting of four...

Figure 1.3 A Venn diagram.

Figure 1.4 ….

Figure 1.5 Complement of an event.

Figure 1.6 Union of events.

Figure 1.7 Intersection of events.

Figure 1.8 Disjoint events.

Figure 1.9 Difference between two events.

Figure 1.10 ….

Figure 1.11 Venn diagram for the event ….

Figure 1.12 Venn diagram for the event ….

Figure 1.13 Venn diagram for the event ….

Figure 1.14 Long‐run behavior of relative frequencies.

Figure 1.15 An increasing sequence of events.

Figure 1.16 A decreasing sequence of events.

Figure 1.17 The sequences …, ….

Figure 1.18 Water network with three connections.

Figure 1.19 Plot of the system reliability, …, as a function of …. ...

Chapter 2

Figure 2.1 Tree diagram for the words that can be formed.

Figure 2.2 Diagram with the routes from city A to city C.

Figure 2.3 Tree diagram showing all possible routes from City A to City...

Figure 2.4 Pascal's triangle.

Figure 2.5 Combinations with repetitions.

Chapter 3

Figure 3.1 A tree diagram for the law of total probability for Example ...

Figure 3.2 A partition of the sample space Ω.

Figure 3.3 An electrical system.

Figure 3.4 Serial connection.

Figure 3.5 Parallel connection.

Figure 3.6 Plot of the probability … as a function of … for ….

Figure 3.7 Plot of the probability … as a function of … for several...

Chapter 4

Figure 4.1 Plot of the distribution function for Example 4.5.

Figure 4.2 Plot of the distribution function for Example 4.6.

Figure 4.3 Plot of the distribution function for Example 4.7.

Figure 4.4 Probability function of a discrete random variable.

Figure 4.5 From the probability function to the distribution function. ...

Figure 4.6 From the distribution function to the probability function. ...

Figure 4.7 Physical interpretation of expectation.

Figure 4.8 Illustration of the result in Proposition 4.13.

Chapter 5

Figure 5.1 The probability function of a Bernoulli random variable.

Figure 5.2 The distribution function of a Bernoulli random variable.

Figure 5.3 The probability function (left) and the cumulative distribut...

Figure 5.4 The probability function (left) and the cumulative distribut...

Figure 5.5 The probability function (left) and the cumulative distribut...

Figure 5.6 The probability function (left) and the cumulative distribut...

Figure 5.7 Approximation of the hypergeometric distribution by the bino...

Figure 5.8 The probability function (left) and the cumulative distribut...

Figure 5.9 The probability function of the Poisson distribution with ...

Chapter 6

Figure 6.1 The probability function of a discrete random variable.

Figure 6.2 The density function of a continuous random variable.

Figure 6.3 The density function and the distribution function of the va...

Figure 6.4 Approximation of ….

Figure 6.5 The density functions of … and ….

Figure 6.6 Rotation of a sphere around point O.

Figure 6.7 The distribution function of … in Example 6.14.

Figure 6.8 Plot of the mean monthly profit … for (a) …, …; (b) ...

Chapter 7

Figure 7.1 Plot of probability density and cumulative distribution fun...

Figure 7.2 The density function (left) and the distribution function (r...

Figure 7.3 Area of the standard normal density function inside each of ...

Figure 7.4 Graphical representation of the …‐quantile of the standard...

Figure 7.5 A comparison of the density functions for the … distributi...

Figure 7.6 The density function (left) and the distribution function (r...

Figure 7.7 The probability function of the binomial distribution functi...

Figure 7.8 Binomial probability function and approximating normal form....

Figure 7.9 Shaded region is … for a discrete variable.

Figure 7.10 A corrected normal approximation to discrete probabilities....

Figure 7.11 The density function (left) and the distribution function (...

Figure 7.12 Early period, useful period and wear‐out period of a device...

Figure 7.13 The density function of the Gamma distribution for differen...

Figure 7.14 The density function (left) and cumulative distribution fun...

Figure 7.15 The density function of the Beta distribution for different...

Figure 7.16 The density function (left) and the distribution function (...

Figure 7.17 The density function of the lognormal distribution for diff...

Figure 7.18 The expected profit of the company as a function of ….


WILEY SERIES IN PROBABILITY AND STATISTICS

Established by Walter A. Shewhart and Samuel S. Wilks

Editors: David J. Balding, Noel A. C. Cressie, Garrett M. Fitzmaurice,

Geof H. Givens, Harvey Goldstein, Geert Molenberghs, David W. Scott,

Adrian F. M. Smith, Ruey S. Tsay

Editors Emeriti: J. Stuart Hunter, Iain M. Johnstone, Joseph B. Kadane,

Jozef L. Teugels

The Wiley Series in Probability and Statistics is well established and authoritative. It covers many topics of current research interest in both pure and applied statistics and probability theory. Written by leading statisticians and institutions, the titles span both state‐of‐the‐art developments in the field and classical methods.

Reflecting the wide range of current research in statistics, the series encompasses applied, methodological and theoretical statistics, ranging from applications and new techniques made possible by advances in computerized practice to rigorous treatment of theoretical approaches. This series provides essential and invaluable reading for all statisticians, whether in academia, industry, government, or research.

A complete list of titles in this series can be found at http://www.wiley.com/go/wsps

INTRODUCTION TO PROBABILITY

Models and Applications

N. Balakrishnan

McMaster University, Canada

 

Markos V. Koutras

University of Piraeus, Greece

 

Konstadinos G. Politis

University of Piraeus, Greece

Copyright

This edition first published 2020

© 2020, John Wiley & Sons, Inc.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.

The right of N. Balakrishnan, Markos V. Koutras, Konstadinos G. Politis to be identified as the authors of the editorial material in this work has been asserted in accordance with law.

Registered Office

John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA

Editorial Office

111 River Street, Hoboken, NJ 07030, USA

For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.

Wiley also publishes its books in a variety of electronic formats and by print‐on‐demand. Some content that appears in standard print versions of this book may not be available in other formats.

Limit of Liability/Disclaimer of Warranty

While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

Library of Congress Cataloging‐in‐Publication Data

Names: Balakrishnan, N., 1956- author. | Koutras, Markos V., author. |

     Politis, Konstadinos G., 1966- author.

Title: Introduction to probability : models and applications / N.

     Balakrishnan (McMaster University, Canada), Markos V. Koutras (University

     of Piraeus, Greece), Konstadinos G. Politis.

Description: Hoboken, NJ : Wiley, 2019. | Series: Wiley series in probability

     and statistics | Includes bibliographical references and index. |

     Identifiers: LCCN 2018060140 (print) | LCCN 2019000211 (ebook) | ISBN

     9781118548714 (Adobe PDF) | ISBN 9781118548493 (ePub) | ISBN 9781118123348

     (hardcover)

Subjects: LCSH: Probabilities–Textbooks.

Classification: LCC QA273 (ebook) | LCC QA273 .B254727 2019 (print) | DDC

     519.2–dc23

LC record available at https://lccn.loc.gov/2018060140

Cover Design: Wiley

Cover Image: Photograph by Mike Jack/Moment/Getty Images

Dedication

To my mother, due to whom I am who I am today

N. Balakrishnan

To my family

Markos V. Koutras

To my daughters

Konstadinos G. Politis

Preface

Probability theory deals with phenomena whose outcomes are affected by random events and therefore cannot be predicted with certainty. For example, the result of tossing a coin or throwing a die, or the time of occurrence of a natural phenomenon or disaster (e.g. snowfall, earthquake, tsunami), are cases where “randomness” plays an important role and the use of probability theory is almost inevitable.

More than five centuries ago, the Italians Luca Pacioli, Niccolo Tartaglia, and Galileo Galilei, and the Frenchmen Pierre de Fermat and Blaise Pascal, began laying the foundations of probability theory. Nowadays, this area has fully developed into an independent research field and offers valuable tools for almost all applied sciences. As a consequence, introductory concepts of probability theory are taught in the first years of most university and college programs.

This book is an introductory textbook in probability and can be used by majors in Mathematics, Statistics, Physics, Computer Science, Actuarial Science, Operations Research, Engineering etc. No prior knowledge of probability theory is required.

In most universities and colleges where an introductory probability course, such as one based on this textbook, is offered, it normally follows a rigorous calculus course. Consequently, the probability course can make use of differential and integral calculus, and formal proofs of theorems and propositions may be presented to students, thereby offering them a mathematically sound understanding of the field.

For this reason, we have taken a calculus‐based approach in this textbook for teaching an introductory course on Probability. In doing so, we have also introduced some novelties hoping that these will be of benefit to both students and instructors.

In each chapter, we have included a section with a series of examples and problems for which the use of a computer is required. We demonstrate, through ample examples, how one can make effective use of computers for understanding probability concepts and for carrying out various probability calculations. For these examples, we suggest using a computer algebra system such as Mathematica, Maple, Derive, etc. Such programs provide excellent tools for creating graphs easily and for performing mathematical operations such as differentiation, summation, and integration; most importantly, one can handle symbols and variables without having to replace them with specific numerical values. To assist the reader, an example set of Mathematica commands is given each time (analogous commands can be assembled for the other programs mentioned above). These commands may be used to perform a specific task, and various similar tasks are then requested in the form of exercises. No effort is made to present the most efficient Mathematica program for tackling the suggested problem, and no detailed description of Mathematica syntax is provided; the interested reader is referred to the Mathematica documentation (Wolfram Research) for the virtually unlimited commands available in this software (or in alternative computer algebra systems), which can be used to create alternative instruction sets for the suggested exercises.
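As an illustration of the kind of computer exercise described above, here is a purely hypothetical sketch in Python (the book itself uses Mathematica; this stand-in is not the authors' code): it estimates a probability by the relative frequency of an outcome over many simulated trials, a notion developed in Section 1.3.

```python
import random

# Illustrative sketch (not from the book): estimate the probability of
# "Heads" for a fair coin by simulating many tosses and computing the
# relative frequency of Heads among them.
random.seed(42)  # fix the seed so the run is reproducible

n_tosses = 10_000
heads = sum(random.choice("HT") == "H" for _ in range(n_tosses))
relative_frequency = heads / n_tosses

print(relative_frequency)  # close to 0.5 for a fair coin
```

With 10,000 tosses the relative frequency settles near 1/2, mirroring the long-run behavior of relative frequencies shown in Figure 1.14.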

Moreover, a novel feature of the book is that, at the end of each chapter, we have included a section detailing a case study through which we demonstrate the usefulness of the results and concepts discussed in that chapter for a real‐life problem; we also carry out the required computations through the use of Mathematica.

At the beginning of each chapter we provide a brief historical account of some pioneers in Probability who made exemplary contributions to the topic of discussion within that chapter. This is done so as to provide students with a sense of history and appreciation of the vital contributions made by some renowned probabilists. Apart from the books on the history of probability and statistics that can be found in the bibliography, we have used Wikipedia as a source for biographical details.

In most sections, the exercises have been classified into two groups, A and B. Group A exercises are usually routine extensions of the theory or involve simple calculations based on the theoretical tools developed in the section, and they serve as a means of self-assessment of the knowledge the reader has gained so far. Group B exercises are more advanced, require substantial critical thinking, and quite often include fascinating applications of the corresponding theory.

In addition to regular exercises within each chapter, we have also provided a long list of True/False questions and another list of multiple choice questions. In our opinion, these will not only be useful for students to practice with (and assess their progress), but can also be helpful for instructors to give regular in‐class quizzes.

Particular effort has been made to give the theoretical results in their simplest form, so that they can be understood easily by the reader. In an effort to offer the reader an additional means of understanding the concepts presented, intuitive approaches and illustrative graphical representations/figures are provided in several places.

The material of this book emerged from a similar book (Introduction to Probability: Theory and Applications, Stamoulis Publications) written in Greek by one of us (MVK), which has been used as a textbook for many years in several Greek universities. Of course, we have expanded and transformed this material to reach an international audience.

This is the first volume in a set of two for teaching probability theory. In this volume, we have detailed the basic rules and concepts of probability, combinatorial methods for probabilistic computations, discrete random variables, continuous random variables, and well‐known discrete and continuous distributions. These form the core topics for an introduction to probability. More advanced topics, such as joint distributions, measures of dependence, multivariate random variables, well‐known multivariate discrete and continuous distributions, generating functions, Laws of Large Numbers, and the Central Limit Theorem, are the natural core topics for a second course on probability. The second volume of our set will expand on all these advanced topics, and hence it can be used effectively as a textbook for a second course on probability; the form and structure of each chapter will be similar to those in the present volume.

We wish to thank our colleagues G. Psarrakos and V. Dermitzakis, who read parts of the book, and our students, who attended our classes and made several insightful remarks and suggestions through the years.

In a book of this size and content, it is inevitable that some typographical errors and mistakes remain (having clearly escaped several pairs of eyes). If you do notice any of them, please inform us, so that we can make suitable corrections in future editions of this book.

It is our sincere hope that instructors find this textbook easy to use for teaching an introductory course on probability, and that students find the book user-friendly, with easy and logical explanations, a plethora of examples, and numerous exercises (including computational ones) to practice with!

Finally, we would like to thank the Wiley production team for their help and patience during the preparation of this book!

March, 2019

N. Balakrishnan

Markos V. Koutras

Konstadinos G. Politis

1 The Concept of Probability

Andrey Nikolaevich Kolmogorov (Tambov, Russia 1903–Moscow 1987)

Source: Keystone‐France / Getty Images.

Regarded as the founder of modern probability theory, Kolmogorov was a Soviet mathematician whose work was also influential in several other scientific areas, notably in topology, constructive logic, classical mechanics, mathematical ecology, and algorithmic information theory.

He earned his Doctor of Philosophy (PhD) degree from Moscow State University in 1929, and two years later, he was appointed a professor in that university. In his book, Foundations of the Theory of Probability, which was published in 1933 and which remains a classic text to this day, he built up probability theory from fundamental axioms in a rigorous manner, comparable to Euclid's axiomatic development of geometry.

1.1 Chance Experiments – Sample Spaces

In this chapter, we present the main ideas and the theoretical background to understand what probability is and provide some illustrations of the way it is used to tackle problems in everyday life. It is rather difficult to try to answer the question “what is probability?” in a single sentence. However, from our experience and the use of this word in common language, we understand that it is a way to deal with uncertainty in our lives. In fact, probability theory has been referred to as “the science of uncertainty”; although intuitively most people associate probability with the degree of belief that something may happen, probability theory goes far beyond that as it attempts to formalize uncertainty in a way that is universally accepted and is also subject to rigorous mathematical treatment.

Since the idea of uncertainty is paramount when we discuss probability, we shall first introduce a concept that is broad enough to deal with uncertainties in a wide‐ranging context when we consider practical applications. A chance experiment or a random experiment is any process which leads to an outcome that is not known beforehand. So tossing a coin, selecting a person at random and asking their age, or testing the lifetime of a new machine are all examples of random experiments.

Definition 1.1

A sample space Ω of a chance experiment is the set of all possible outcomes that may appear in a realization of this experiment. The elements of Ω are called sample points for this experiment. A subset of Ω is called an event.

An event {ω}, consisting of a single sample point, i.e. a single outcome ω, is called an elementary event. We use capital letters A, B, C, and so on to denote events.1 If an event consists of more than one outcome, then it is called a compound event.

The following simple examples illustrate the above concepts.

Example 1.1

Perhaps the simplest example of a chance experiment is tossing a coin. There are two possible outcomes – Heads (denoted by H) and Tails (denoted by T). In this notation, the sample space of the experiment is Ω = {H, T}.

If we toss two coins instead, there are four possible outcomes, represented by the pairs HH, HT, TH, TT. The sample space for this experiment is thus Ω = {HH, HT, TH, TT}.

Here, the symbol HH means that both coins land Heads, while HT means that the first coin lands Heads and the second lands Tails. Note in particular that we treat the two outcomes HT and TH as distinguishable, rather than combining them into a single event. The main reason for this is that the events {HT} and {TH} are elementary events, while the event “one coin lands Heads and the other lands Tails,” which contains both HT and TH, is no longer an elementary event. As we will see later on, when we assign probabilities to the events of a sample space, it is much easier to work with elementary events, since in many cases such events are equally likely, and so it is reasonable to assign the same probability to each of them.

Consider now the experiment of tossing three coins. The sample space consists of triplets of the form HHH, HHT, and so on. Since for each coin toss there are two outcomes, for three coins the number of possible outcomes is 2 × 2 × 2 = 8. More explicitly, the sample space for this experiment is

Ω = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}.   (1.1)

Each of the eight elements of this set forms an elementary event. Note that for events which are not elementary, it is sometimes easier to express them in words, by describing a certain property shared by all elements of the event we consider, rather than by listing all its elements (which may become inconvenient if these elements are too many). For instance, let A be the event “exactly two Heads appear when we toss three coins.” Then, A = {HHT, HTH, THH}.

On the other hand, the event {HHH, TTT} could be described in words as “all three coin outcomes are the same.”
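The sample spaces in this example are small enough to enumerate by brute force. The following Python fragment is an illustrative stand-in (the book's own computer exercises use Mathematica, so none of this code is the authors'):

```python
from itertools import product

# Sample space for tossing three coins: all triplets over {H, T}.
omega = ["".join(t) for t in product("HT", repeat=3)]
assert len(omega) == 8  # 2 * 2 * 2 possible outcomes

# The event A: "exactly two Heads appear when we toss three coins".
A = [outcome for outcome in omega if outcome.count("H") == 2]
print(sorted(A))  # ['HHT', 'HTH', 'THH']

# The event B: "all three coin outcomes are the same".
B = [outcome for outcome in omega if len(set(outcome)) == 1]
print(sorted(B))  # ['HHH', 'TTT']
```

Enumerating the sample space and then filtering it by a property is exactly the "describe an event in words" idea expressed as code.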

Example 1.2

Another very simple experiment consists of throwing a single die. The die may land on any face with a number i on it, where i takes the values 1, 2, …, 6. Therefore, this experiment has sample space Ω = {1, 2, 3, 4, 5, 6}.

The elementary events are the sets {1}, {2}, {3}, {4}, {5}, {6}.

Any other event may again be described either by listing the sample points in it, such as

or, in words, by expressing a certain property of its elements. For instance, the event

may be expressed as “the outcome of the die is an even integer.”

For the experiments we considered in the above two examples, the number of sample points was finite in each case. For instance, in Example 1.2, the sample space has six elements, while in the experiment of tossing three coins there are eight sample points, as given in (1.1). Such sample spaces, which contain a finite number of elements (possible outcomes), are called finite sample spaces. It is obvious that any event, i.e. a subset of the sample space, in this case also has finitely many elements.

When dealing with finite sample spaces, the process of enumerating their elements, or the elements of events in such spaces, is often facilitated by the use of tree diagrams. Figure 1.1 depicts such a diagram which corresponds to the experiment of tossing three coins, as considered in Example 1.1.

Figure 1.1 Tree diagram for the experiment of tossing three coins.

From the “root” of the tree, two segments start, each representing an outcome (H and T, respectively) of the first coin toss. Thus, at the first stage, i.e. after the first toss, there are two nodes. From each of these, in turn, two further segments start, corresponding to the two outcomes of the second toss. At the end of the second stage (after the second toss of the coin), there are four nodes. Finally, each of these is associated with two further nodes, which are shown at the next level (end of the three coin tosses). The final column in Figure 1.1 shows the eight possible outcomes for this experiment, i.e. it contains all the elements of the sample space Ω. Each outcome can be traced by connecting the endpoint to the root and writing down the corresponding three‐step tree route.
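The root-to-leaf traversal just described can be mimicked by a short recursive sketch (the function name routes is ours):

```python
def routes(depth, prefix=""):
    """Walk the coin-toss tree: at each node branch on H then T;
    each leaf yields one complete route, i.e. one sample point."""
    if depth == 0:
        return [prefix]
    result = []
    for outcome in "HT":
        result.extend(routes(depth - 1, prefix + outcome))
    return result

print(routes(3))
# ['HHH', 'HHT', 'HTH', 'HTT', 'THH', 'THT', 'TTH', 'TTT']
```

The listing order matches reading the leaves of Figure 1.1 from top to bottom when the H branch is drawn above the T branch at every node.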

Example 1.3 (An unlimited sequence of coin tosses)

Let us consider the experiment of tossing a coin until “Tails” appear for the first time. In this case, our sample space consists of sequences like T, HT, HHT, HHHT, and so on; that is,

Ω = {T, HT, HHT, HHHT, …}.

The event “Tails appear for the first time at the fifth trial” is then the elementary event

{HHHHT},

while the set

{T, HT, HHT}

has as its elements all outcomes where Tails appear within the first three tosses. So this event can be described by saying “the experiment is terminated within the first three coin tosses.” Finally, the event “there are at least four tosses until the experiment is terminated” corresponds to the set (event)

{HHHT, HHHHT, HHHHHT, …}.

In the previous example, the sample space has infinitely many points. In particular, since these points can be enumerated, we speak of a countably infinite sample space. Examples of sets with countably many points are the set of integers, the set of positive integers, the set of rationals, etc. When a sample space is countably infinite, the events of that sample space may have either finitely many elements (e.g. the event “the experiment is terminated within the first three coin tosses” in Example 1.3) or infinitely many elements (e.g. the event “there are at least four tosses until the experiment is terminated” in Example 1.3).
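That the points of this sample space can be enumerated is exactly what a generator expresses in code; the following Python sketch (the names outcomes and B are ours) lists its first few elements:

```python
from itertools import islice

def outcomes():
    """Enumerate the countably infinite sample space T, HT, HHT, ...
    The k-th outcome is the sequence where Tails first appears on toss k."""
    k = 1
    while True:
        yield "H" * (k - 1) + "T"
        k += 1

# The first five sample points
print(list(islice(outcomes(), 5)))  # ['T', 'HT', 'HHT', 'HHHT', 'HHHHT']

# Event "the experiment terminates within the first three tosses": finite
B = list(islice(outcomes(), 3))
print(B)  # ['T', 'HT', 'HHT']
```

The whole space can never be materialized, but any finite prefix of the enumeration can, which is precisely the sense in which the set is countable.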

In contrast, a set whose points cannot be enumerated is called an uncountable set; typical examples of such sets are intervals and unions of intervals on the real line. To illustrate this, we consider the following example.

Example 1.4

In order to monitor the quality of light bulbs that are produced by a manufacturing line, we select a bulb at random, plug it in, and record the length of time (in hours) until it fails. In principle, this time length may take any nonnegative real value (although this presupposes that we can take an infinitely accurate measurement of time!). Therefore, the sample space for the experiment whose outcome is the life duration of the bulb is

Ω = [0, ∞).

The subset

[0, 500]

of Ω describes the event “the lifetime of the light bulb does not exceed 500 hours,” while the event “the light bulb works for at least 300 hours” corresponds to the set

[300, ∞).

The sample space Ω in the last example is the half‐line of nonnegative real numbers, which is an uncountable set. If Ω is uncountable, then it is usually referred to as a continuous sample space. Typically, the study of such sample spaces requires a different treatment compared with sample spaces which are either finite or countably infinite. The latter two cases, however, present several similarities, and the techniques we use for them in probability theory are very similar. As a consequence, there is a collective term for sample spaces which have either finitely many or countably infinitely many elements: they are called discrete sample spaces.

At this point, it is worth noting the difference between an “ideal” continuous sample space and the one we use in practice. With reference to Example 1.4 regarding the lifetime of electric bulbs, such a lifetime does not, in practice, assume arbitrary real values. Since time is measured in hours, it is customary to record a value rounded to the closest integer or, if more precision is required, to keep two decimal places, say. In either case, and in contrast to the one used in the example above, the sample space is countable. Moreover, if we know that the lifetime of a bulb cannot exceed some (large) value M, the sample space becomes

Ω = {0, 1, 2, …, M},

so that it is then in fact finite. However, the number of elements in that space is M + 1, so that when M is a large integer, this can be very large. It is often the case that it is much simpler mathematically to assume that Ω is continuous, although in practice we can only observe a finite (or countably infinite) number of outcomes. This convention will be used frequently in the sequel, when we study, for example, weight, age, length, etc.

We conclude this section with an example which demonstrates that for the same experiment, we can define more than one sample space, depending on different aspects we might be interested in studying.

Example 1.5 (Different sample spaces for the same experiment)

Suppose that a store which sells cars has two salespersons. The store has in stock only two cars of a particular make. We are interested in the number of cars which will be sold by each of the two salespersons during next week. Then, a suitable sample space for this experiment is the set of pairs (i, j) for i, j = 0, 1, 2, where i stands for the number of cars sold by the first salesperson and j for the number of cars sold by the second one. However, since there are only two cars available for sale, it must also hold that i + j ≤ 2, and we thus arrive at the following sample space:

Ω₁ = {(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (2, 0)}.

Notice that we again treat the pairs (0, 1) and (1, 0) as distinguishable; if, however, the store owner is only interested in the total number of cars sold during next week, then we could use as a sample space the set

Ω₂ = {0, 1, 2}.

In this case, an element of this second sample space denotes the total number of cars sold. It is worth noting that a specific event of interest is expressed in a completely different manner under these two different sample spaces. Consider, for example, the event

“both cars in stock are sold during next week.”

Viewed as a subset of the first sample space, this is a compound event, which can be described as

{(0, 2), (1, 1), (2, 0)}.

However, when we consider the second sample space, the same event is an elementary event,

{2}.
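A quick Python sketch (the names omega1, omega2, and the event variables are ours) enumerates both sample spaces and expresses the same event in each:

```python
from itertools import product

# Sample space Omega1: pairs (i, j) of cars sold by the two salespersons;
# only two cars are in stock, so i + j <= 2.
omega1 = {(i, j) for i, j in product(range(3), repeat=2) if i + j <= 2}

# Coarser sample space Omega2: only the total number of cars sold.
omega2 = {i + j for (i, j) in omega1}

# The event "both cars are sold": compound in Omega1, elementary in Omega2.
event_in_omega1 = {(i, j) for (i, j) in omega1 if i + j == 2}
event_in_omega2 = {2}

print(sorted(omega1))           # six pairs
print(sorted(omega2))           # [0, 1, 2]
print(sorted(event_in_omega1))  # [(0, 2), (1, 1), (2, 0)]
```

The comprehension building omega2 is the map that collapses each pair to its total, which is why a compound event in the finer space can become elementary in the coarser one.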

Exercises

Group A

Provide suitable sample spaces for each of the following experiments. For each sample space, specify whether it is finite, countably infinite, or uncountable.

Two throws of a die

Gender of the children in a family that has three children

Random selection of a natural number less than 100

Random selection of a real number from the interval

Number of telephone calls that someone receives on a mobile phone during a day

Number of animals living in a certain forest area

Life duration of an electronic appliance

Change in the price of a stock during a day

Percentage change in the price of a stock during a day.

John throws a die and subsequently he tosses a coin.

Suggest a suitable sample space that describes the outcomes of this experiment.

Let A be the event that “the outcome of the coin toss is Heads.” Which elements of the sample space are included in the event A?

We toss a coin until either Heads appear for the first time or we have five tosses which all result in Tails. Give a suitable sample space for this experiment, and then write explicitly (i.e. by listing their elements) each of the following events:

A: the experiment terminates at the third toss of the coin;

B: the experiment terminates after the third toss of the coin;

C: the experiment terminates before the fourth toss of the coin;

D: the experiment terminates with the appearance of H (Heads).

Which, if any, of the events are elementary events in the sample space that you have considered?

A digital scale has an accuracy of two decimal places shown on its screen. Each time a person steps on the scale, we record his/her weight by rounding it to the closest integer (in kilograms). Thus, if the decimal part is 0.50 or greater, we round up to the next integer.

Give an appropriate sample space for the experiment whose outcome is the rounding error during the above procedure.

Write explicitly each of the events:

A: the rounding error is at most 0.10;

B: the absolute value of the rounding error is at least 0.20.

We throw a die twice. Give a suitable sample space for this experiment and then identify the elements each of the following events contains:

A₁: the outcome of the first throw is 6;

A₂: the outcome of the second throw is a multiple of 3;

A₃: the outcome of the first throw is 6 and the outcome of the second throw is a multiple of 3;

A₄: the sum of the two outcomes is 7;

A₅: the sum of the two outcomes is at least 9;

A₆: the two outcomes are identical.

A company salesman wants to visit the four cities a, b, c, and d, wherein his company has stores. If he plans to visit each city once, give a suitable sample space to describe the order in which he visits the cities. Then identify, by an appropriate listing of its elements, each of the following events:

A₁: the salesman visits city a first;

A₂: the salesman visits city a first and after that visits city b;

A₃: the salesman visits city a before he visits city b;

A₄: the salesman visits the cities a and b successively.

A box contains 3 red balls and 2 yellow balls. Give a suitable sample space to describe all possible outcomes for the experiment of selecting 4 balls at random, under each of the following schemes:

For each ball selected, we note its color and return it to the box so that it is available for the next selection (such a scheme is called selection with replacement).

Every ball selected is subsequently removed from the box (which is called selection without replacement).

Irène has four books that she wants to put on a library shelf. Three of these books form a 3‐volume set of a dictionary, so that they are marked as Volumes 1, 2, and 3, respectively.

Find an appropriate sample space for all possible ways she can put the books on the shelf.

Identify the elements of this sample space that each of the following three events contains:

B₁: the three volumes of the dictionary are put next to one another;

B₂: the three volumes of the dictionary are put in the right order (but not necessarily in adjacent places), so that Volume 1 is placed to the left of Volume 2, which in turn is placed to the left of Volume 3;

B₃: the three volumes of the dictionary are placed next to one another and in the right order.

Mary has in her wallet three $1 coins, one $2 coin, and four coins of 25 ¢. She selects four coins at random from her wallet.

Write down a sample space for the possible selections she can make.

Express the following events as subsets of the above sample space:

C₁: exactly three 25 ¢ coins are selected;

C₂: the total value of the coins selected is $2.50;

C₃: the total value of the coins selected is $3.50.

Group B

Bill just visited a friend who lives at Place A of the graph below, and he wants to return home, which is at Place B on the graph. In order to minimize the distance he has to walk, he moves either downwards or to the right on the graph. Each time, he makes a choice for his next movement by tossing a coin.

Give a sample space for the different routes Bill can follow to return home.

Write down explicitly the following events, representing them as subsets of the sample space given in (i):

A₁: he passes through Place C on his way back home;

A₂: he does not pass through Place C;

A₃: on his way back, he has to toss the coin only twice to decide on his next move.

A bus, which has a capacity of carrying 50 passengers, passes through a certain bus stop every day at some time point between 10:00 a.m. and 10:30 a.m. In order to study the time the bus arrives at this stop, as well as the number of passengers in the bus at the time of its arrival, we use the following sample space:

Ω = {(x, t) : x = 0, 1, …, 50, 10 ≤ t ≤ 10.5},

where x above denotes the number of passengers in the bus and t denotes the arrival time at the bus stop (in hours, expressed as a decimal number).

Is the sample space Ω for this experiment countable or uncountable (continuous)?

Write down explicitly each of the events below:

A₁: the bus arrives at the stop at 10:10 a.m. carrying 25 passengers;

A₂: the bus arrives at the stop at 10:10 a.m. with less than 25 passengers;

A₃: the bus arrives at the stop between 10:10 a.m. and 10:15 a.m.;

A₄: the bus arrives at the stop between 10:10 a.m. and 10:15 a.m., carrying at most 40 passengers.

Which, if any, of the events in part (ii) is a countable set?

Suppose now that, in order to describe the experiment of the bus arrival at the bus stop, we use pairs (y, s), where y is the number of passengers that may get on the bus when it arrives, while s represents the time after 10:00 a.m. that the bus arrives at the stop. Write down a sample space for this experiment. Express each of the events in part (ii) as subsets of this new sample space.

At a car production line in a factory, each engine produced is tested to examine whether it is ready for use or has some fault. If two consecutive engines that are examined are found faulty, the production process stops and is revised (in such a case, the experiment is terminated). Otherwise, the process continues.

Provide a suitable sample space to describe the inspection process of the engines.

Find an expression to describe each of the following events:

Aₙ: the production line will be revised after n engines have been examined,

for n = 2, 3, ….

In a water supply network, depicted below, the water is transferred from point A to point B through water tubes. At the positions marked with the numbers 1, 2, 3, and 4 on the graph, there are four switches which, if turned off, stop the water supply passing through the corresponding tube.

Find a sample space for the experiment which describes the positions of the four switches (ON or OFF).

Identify each of the following events as a suitable subset of the sample space given in part (i):

A₁: there is water flow from point A to point C;

A₂: there is water flow from point C to point B;

A₃: there is water flow from point A to point B.

1.2 Operations Between Events

In the preceding section, we made a distinction between discrete (i.e. finite or countably infinite) and continuous sample spaces. When a sample space Ω is discrete, any subset of Ω is an event. For continuous sample spaces, however, some theoretical difficulties appear if we assume that all subsets of Ω are events. There are cases where certain sets have to be excluded from the “family of events” related to a sample space Ω. The treatment of such technical difficulties is beyond the scope of the present book, and in all applications we will consider, we assume that any subset of the sample space is an event.

Suppose that Ω is a sample space for a chance experiment and A is an event. If, in a realization of the experiment we are studying, we observe an outcome which belongs to A, then we say that A has occurred or that A has appeared. For example, if we toss a coin three times and we observe the outcome HHT, then (with reference to Example 1.1) we may say that

the event “exactly two Heads appear” = {HHT, HTH, THH} has occurred, but

the event “all three coin outcomes are the same” = {HHH, TTT} has not occurred.
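Occurrence of an event is simply set membership of the observed outcome; a minimal Python sketch of the three-coin case (variable names ours):

```python
# An observed outcome of three coin tosses, and two events (sets of outcomes)
outcome = "HHT"
A = {"HHT", "HTH", "THH"}   # "exactly two Heads appear"
B = {"HHH", "TTT"}          # "all three coin outcomes are the same"

print(outcome in A)  # True  -> A has occurred
print(outcome in B)  # False -> B has not occurred
```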

For the study of events associated with a certain experiment, and the assignment of probabilities to these events later on, it is essential to consider various relations among the events of a sample space, as well as operations among them. Recall that each event is mathematically represented by a set (a subset of Ω); thus, it is no surprise that the relations and operations we consider are borrowed from mathematical set theory.

To begin with, assume that A and B are events on the same sample space Ω. If every element (sample point) of A is also a member of B, then we use the standard notation for subsets and write A ⊆ B (A is a subset of B). In words, this means that whenever A occurs, B occurs too. For instance, in a single throw of a die (see Example 1.2), consider the events:

A: “the outcome of the die is 4,”  B: “the outcome of the die is an even number.”

Then, expressing A and B as sets, we have A = {4} and B = {2, 4, 6}, and it is clear that A ⊆ B. On the other hand, if we know that the outcome is 4 (A has occurred), then necessarily the outcome is even, so that B occurs, too.
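In Python's set terms, the relation A ⊆ B is exactly `issubset`; a minimal sketch of the die example (names ours):

```python
# Single throw of a die: sample space and two events with A contained in B
omega = {1, 2, 3, 4, 5, 6}
A = {4}          # "the outcome of the die is 4"
B = {2, 4, 6}    # "the outcome of the die is an even number"

# A is a subset of B: whenever A occurs, B occurs too
print(A.issubset(B))  # True
print(A <= B)         # equivalent operator form
```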

If A ⊆ B and B ⊆ A, then obviously A occurs iff (if and only if) B occurs, in which case we have the following definition.

Definition 1.2

Two events A and B, defined on a sample space Ω, are called equivalent if, when A appears, then B appears, and vice versa. In this case, we shall write A = B.

The entire sample space Ω itself is an event (we have trivially Ω ⊆ Ω) and, since Ω contains all possible experiment outcomes, it is called a certain event. On the other hand, if we are certain that an event cannot happen, then we call this an impossible event, for which we use the empty set symbol, ∅.

Coming back to the experiment of throwing a die, consider again the event A = {4} and, instead of B, the event

C = {5}, that is, “the outcome of the die is 5.”

Suppose now that Nick, who likes gambling, throws a die and wins a bet if the outcome of the die is either 4 or 5. Then, the event

D: “Nick wins his bet”

occurs if and only if at least one of the events A = {4} and C = {5} occurs. The event D is the same as the event “at least one of A and C occur” (more precisely, according to Definition 1.2, these two events are equivalent), and so, using set notation, we can write D = A ∪ C = {4, 5}. We thus see that, expressed as a set, D coincides with the union of the two sets A and C.

Definition 1.3

The union of two events A and B, denoted by A ∪ B, is the event which occurs if and only if at least one of A and B occurs.

The union operation can easily be extended when more than two sets are involved. More specifically, if A₁, A₂, …, Aₙ are events on a sample space, then the event “at least one of the Aᵢ's occurs” is called the union of the events A₁, A₂, …, Aₙ and is expressed in symbols as

A₁ ∪ A₂ ∪ ⋯ ∪ Aₙ.
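The union of events corresponds directly to the set union operator; the following sketch (variable names ours) mirrors the die example and the n-event extension:

```python
# Events from the die example: Nick wins if the outcome is 4 or 5
A = {4}
C = {5}
D = A | C  # union: "at least one of A and C occurs"
print(sorted(D))  # [4, 5]

# Union of several events A1, ..., An: "at least one of them occurs"
events = [{1, 2}, {2, 3}, {5}]
print(sorted(set().union(*events)))  # [1, 2, 3, 5]
```

`set().union(*events)` folds the union over any number of events, just as the displayed symbol A₁ ∪ A₂ ∪ ⋯ ∪ Aₙ does.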

To illustrate the next concept, viz., the intersection between two events, we return to the example of throwing a die. Suppose we have two gamblers who play against an opponent. In particular, each of them throws a die: the first player wins if the outcome of the die is greater than or equal to 4, while the other one wins if the outcome is an even integer. How do we describe the event that “both players win their bets?”

Let A be the event that the first gambler wins his bet, B the event that the second gambler wins, and C the event that they both win their bets. Then, clearly, C occurs if and only if both A and B occur. In order to find C explicitly, we resort to set notation again; A and B can be written as follows:

A = {4, 5, 6},  B = {2, 4, 6}.

It is now apparent that the event “both A and B occur” contains exactly those elements of the sample space