Nonlinear Time Series Analysis (E-Book)

Ruey S. Tsay

Description

A comprehensive resource that strikes a balance between the theory and applications of nonlinear time series analysis.

Nonlinear Time Series Analysis offers an important guide to both parametric and nonparametric methods, nonlinear state-space models, and Bayesian as well as classical approaches to nonlinear time series analysis. The authors, noted experts in the field, explore the advantages and limitations of nonlinear models and methods and review the improvements they offer over linear time series models. The need for this book arises from recent developments in nonlinear time series analysis, statistical learning, dynamic systems, and advanced computational methods. Parametric and nonparametric methods and nonlinear and non-Gaussian state space models provide a much wider range of tools for time series analysis. In addition, advances in computing and data collection have made large data sets and high-frequency data widely available. These new data make it not only feasible but also necessary to account for the nonlinearity embedded in most real-world time series. This vital guide:

* Offers research developed by leading scholars of time series analysis
* Presents R commands making it possible to reproduce all the analyses included in the text
* Contains real-world examples throughout the book
* Recommends exercises to test understanding of the material presented
* Includes an instructor solutions manual and companion website

Written for students, researchers, and practitioners who are interested in exploring nonlinearity in time series, Nonlinear Time Series Analysis offers a comprehensive text that explores the advantages and limitations of nonlinear models and methods and demonstrates the improvements they provide over linear time series models.


Page count: 673

Publication year: 2018




NONLINEAR TIME SERIES ANALYSIS

Ruey S. Tsay
University of Chicago, Chicago, Illinois, United States

Rong Chen
Rutgers, The State University of New Jersey, New Jersey, United States

WILEY SERIES IN PROBABILITY AND STATISTICS

Established by Walter A. Shewhart and Samuel S. Wilks

Editors: David J. Balding, Noel A. C. Cressie, Garrett M. Fitzmaurice, Geof H. Givens, Harvey Goldstein, Geert Molenberghs, David W. Scott, Adrian F. M. Smith, Ruey S. Tsay

Editors Emeriti: J. Stuart Hunter, Iain M. Johnstone, Joseph B. Kadane, Jozef L. Teugels

The Wiley Series in Probability and Statistics is well established and authoritative. It covers many topics of current research interest in both pure and applied statistics and probability theory. Written by leading statisticians and institutions, the titles span both state-of-the-art developments in the field and classical methods.

Reflecting the wide range of current research in statistics, the series encompasses applied, methodological and theoretical statistics, ranging from applications and new techniques made possible by advances in computerized practice to rigorous treatment of theoretical approaches. This series provides essential and invaluable reading for all statisticians, whether in academia, industry, government, or research.

This edition first published 2019
© 2019 John Wiley & Sons, Inc.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.

The right of Ruey S. Tsay and Rong Chen to be identified as the authors of this work has been asserted in accordance with law.

Registered Office
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA

Editorial Office
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA

For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.

Wiley also publishes its books in a variety of electronic formats and by print-on-demand. Some content that appears in standard print versions of this book may not be available in other formats.

Limit of Liability/Disclaimer of Warranty
While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

Library of Congress Cataloging-in-Publication Data
Names: Tsay, Ruey S., 1951- author. | Chen, Rong, 1963- author.
Title: Nonlinear time series analysis / by Ruey S. Tsay and Rong Chen.
Description: Hoboken, NJ : John Wiley & Sons, 2019. | Series: Wiley series in probability and statistics | Includes index.
Identifiers: LCCN 2018009385 (print) | LCCN 2018031564 (ebook) | ISBN 9781119264064 (pdf) | ISBN 9781119264071 (epub) | ISBN 9781119264057 (cloth)
Subjects: LCSH: Time-series analysis. | Nonlinear theories.
Classification: LCC QA280 (ebook) | LCC QA280 .T733 2019 (print) | DDC 519.5/5–dc23
LC record available at https://lccn.loc.gov/2018009385

Cover Design: Wiley
Cover Image: Background: © gremlin/iStockphoto; Graphs: Courtesy of the authors Ruey S. Tsay and Rong Chen

To Teresa, Julie, Richard, and Victoria (RST)
To Danping, Anthony, and Angelina (RC)

CONTENTS

Preface

Chapter 1: Why Should We Care About Nonlinearity?

1.1 Some Basic Concepts

1.2 Linear Time Series

1.3 Examples of Nonlinear Time Series

1.4 Nonlinearity Tests

1.4.1 Nonparametric Tests

1.4.2 Parametric Tests

1.5 Exercises

References

Chapter 2: Univariate Parametric Nonlinear Models

2.1 A General Formulation

2.1.1 Probability Structure

2.2 Threshold Autoregressive Models

2.2.1 A Two-regime TAR Model

2.2.2 Properties of Two-regime TAR(1) Models

2.2.3 Multiple-regime TAR Models

2.2.4 Estimation of TAR Models

2.2.5 TAR Modeling

2.2.6 Examples

2.2.7 Predictions of TAR Models

2.3 Markov Switching Models

2.3.1 Properties of Markov Switching Models

2.3.2 Statistical Inference of the State Variable

2.3.2.1 Filtering State Probabilities

2.3.2.2 Smoothing State Probabilities

2.3.3 Estimation of Markov Switching Models

2.3.3.1 The States are Known

2.3.3.2 The States are Unknown

2.3.3.3 Sampling the Unknown Transition Matrix

2.3.4 Selecting the Number of States

2.3.5 Prediction of Markov Switching Models

2.3.6 Examples

2.4 Smooth Transition Autoregressive Models

2.5 Time-varying Coefficient Models

2.5.1 Functional Coefficient AR Models

2.5.2 Time-varying Coefficient AR Models

2.6 Appendix: Markov Chains

2.7 Exercises

References

Chapter 3: Univariate Nonparametric Models

3.1 Kernel Smoothing

3.2 Local Conditional Mean

3.3 Local Polynomial Fitting

3.4 Splines

3.4.1 Cubic and B-Splines

3.4.2 Smoothing Splines

3.5 Wavelet Smoothing

3.5.1 Wavelets

3.5.2 The Wavelet Transform

3.5.3 Thresholding and Smoothing

3.6 Nonlinear Additive Models

3.7 Index Model and Sliced Inverse Regression

3.8 Exercises

References

Chapter 4: Neural Networks, Deep Learning, and Tree-based Methods

4.1 Neural Networks

4.1.1 Estimation or Training of Neural Networks

4.1.2 An Example

4.2 Deep Learning

4.2.1 Deep Belief Nets

4.2.2 Demonstration

4.3 Tree-based Methods

4.3.1 Decision Trees

4.3.1.1 Regression Tree

4.3.1.2 Tree Pruning

4.3.1.3 Classification Tree

4.3.1.4 Bagging

4.3.2 Random Forests

4.4 Exercises

References

Chapter 5: Analysis of Non-Gaussian Time Series

5.1 Generalized Linear Time Series Models

5.1.1 Count Data and GLARMA Models

5.2 Autoregressive Conditional Mean Models

5.3 Martingalized GARMA Models

5.4 Volatility Models

5.5 Functional Time Series

5.5.1 Convolution FAR Models

5.5.2 Estimation of CFAR Models

5.5.3 Fitted Values and Approximate Residuals

5.5.4 Prediction

5.5.5 Asymptotic Properties

5.5.6 Application

5.6 Appendix: Discrete Distributions for Count Data

5.7 Exercises

References

Chapter 6: State Space Models

6.1 A General Model and Statistical Inference

6.2 Selected Examples

6.2.1 Linear Time Series Models

6.2.2 Time Series With Observational Noises

6.2.3 Time-varying Coefficient Models

6.2.4 Target Tracking

6.2.5 Signal Processing in Communications

6.2.6 Dynamic Factor Models

6.2.7 Functional and Distributional Time Series

6.2.8 Markov Regime Switching Models

6.2.9 Stochastic Volatility Models

6.2.10 Non-Gaussian Time Series

6.2.11 Mixed Frequency Models

6.2.12 Other Applications

6.3 Linear Gaussian State Space Models

6.3.1 Filtering and the Kalman Filter

6.3.2 Evaluating the Likelihood Function

6.3.3 Smoothing

6.3.4 Prediction and Missing Data

6.3.5 Sequential Processing

6.3.6 Examples and R Demonstrations

6.4 Exercises

References

Chapter 7: Nonlinear State Space Models

7.1 Linear and Gaussian Approximations

7.1.1 Kalman Filter for Linear Non-Gaussian Systems

7.1.2 Extended Kalman Filters for Nonlinear Systems

7.1.3 Gaussian Sum Filters

7.1.4 The Unscented Kalman Filter

7.1.5 Ensemble Kalman Filters

7.1.6 Examples and R Implementations

7.2 Hidden Markov Models

7.2.1 Filtering

7.2.2 Smoothing

7.2.3 The Most Likely State Path: the Viterbi Algorithm

7.2.4 Parameter Estimation: the Baum–Welch Algorithm

7.2.5 HMM Examples and R Implementation

7.3 Exercises

References

Chapter 8: Sequential Monte Carlo

8.1 A Brief Overview of Monte Carlo Methods

8.1.1 General Methods of Generating Random Samples

8.1.2 Variance Reduction Methods

8.1.3 Importance Sampling

8.1.4 Markov Chain Monte Carlo

8.2 The SMC Framework

8.3 Design Issue I: Propagation

8.3.1 Proposal Distributions

8.3.2 Delay Strategy (Lookahead)

8.4 Design Issue II: Resampling

8.4.1 The Priority Score

8.4.2 Choice of Sampling Methods in Resampling

8.4.3 Resampling Schedule

8.4.4 Benefits of Resampling

8.5 Design Issue III: Inference

8.6 Design Issue IV: Marginalization and the Mixture Kalman Filter

8.6.1 Conditional Dynamic Linear Models

8.6.2 Mixture Kalman Filters

8.7 Smoothing with SMC

8.7.1 Simple Weighting Approach

8.7.2 Weight Marginalization Approach

8.7.3 Two-filter Sampling

8.8 Parameter Estimation with SMC

8.8.1 Maximum Likelihood Estimation

8.8.2 Bayesian Parameter Estimation

8.8.3 Varying Parameter Approach

8.9 Implementation Considerations

8.10 Examples and R Implementation

8.10.1 R Implementation of SMC: Generic SMC and Resampling Methods

8.10.1.1 Generic R Code for SMC Implementation

8.10.1.2 R Code for Resampling

8.10.2 Tracking in a Clutter Environment

8.10.3 Bearing-only Tracking with Passive Sonar

8.10.4 Stochastic Volatility Models

8.10.5 Fading Channels as Conditional Dynamic Linear Models

8.11 Exercises

References

Index

EULA

List of Tables

Chapter 1: Tables 1.1–1.4
Chapter 2: Table 2.1
Chapter 4: Tables 4.1–4.2
Chapter 5: Tables 5.1–5.5
Chapter 6: Tables 6.1–6.5
Chapter 7: Tables 7.1–7.8
Chapter 8: Tables 8.1–8.2

List of Illustrations

Chapter 1: Figures 1.1–1.16
Chapter 2: Figures 2.1–2.30
Chapter 3: Figures 3.1–3.28
Chapter 4: Figures 4.1–4.17
Chapter 5: Figures 5.1–5.16
Chapter 6: Figures 6.1–6.25
Chapter 7: Figures 7.1–7.17
Chapter 8: Figures 8.1–8.30



Preface

Time series analysis is concerned with understanding the dynamic dependence of real-world phenomena and has a long history. Much of the work in time series analysis focuses on linear models, even though the real world is not linear. One may argue that linear models can provide good approximations in many applications, but there are cases in which a nonlinear model can shed light far beyond where linear models can. The goal of this book is to introduce some simple yet useful nonlinear models, to consider situations in which nonlinear models can make significant contributions, to study basic properties of nonlinear models, and to demonstrate the use of nonlinear models in practice. Real examples from various scientific fields are used throughout the book for demonstration.

The literature on nonlinear time series analysis is enormous, and it is too much to expect that a single book can cover all the topics and all recent developments. The topics and models discussed in this book reflect our preferences and personal experience. For the topics discussed, we try to provide a comprehensive treatment. Our emphasis is on application, but important theoretical justifications are also provided. All the demonstrations are carried out using R packages, and a companion NTS package has been developed for the book to facilitate data analysis. In some cases, a command in the NTS package simply provides an interface between the user and a function in another R package. In other cases, we developed commands that make the analyses discussed in the book more user-friendly. All data sets used in this book are either in the public domain or available from the book’s web page.
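For readers who want to follow along in R, the short sketch below shows the typical first steps. The package name NTS comes from the text above; the example file name and the final modeling call are hypothetical placeholders meant only to indicate the usual calling pattern, not actual commands documented here.

install.packages("NTS")        # install the companion package once from CRAN
library(NTS)                   # load it for the current session
ls("package:NTS")              # list the commands the package exports (base R)
y <- scan("series.txt")        # read a univariate series (hypothetical file name)
# fit <- nts_command(y, p = 2) # placeholder call; see the book's examples for the real commands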

The book starts with some examples demonstrating the use of nonlinear time series models and the contributions a nonlinear model can provide. Chapter 1 also discusses various statistics for detecting nonlinearity in an observed time series. We hope that the chapter convinces readers that it is worthwhile to pursue nonlinear modeling when nonlinearity is detected in time series data. In Chapter 2 we introduce some well-known nonlinear time series models available in the literature. The models discussed include threshold autoregressive models, Markov switching models, smooth transition autoregressive models, and time-varying coefficient models. The process of building these nonlinear models is also addressed. Real examples are used to show the features and applicability of the models introduced. In Chapter 3 we introduce some nonparametric methods and discuss their applications in modeling nonlinear time series. The methods discussed include kernel smoothing, local polynomials, splines, and wavelets. We then consider nonlinear additive models, index models, and sliced inverse regression. Chapter 4 describes neural networks, deep learning, tree-based methods, and random forests. These topics are highly relevant in the current big-data environment, and we illustrate applications of these methods with real examples. In Chapter 5 we discuss methods and models for non-Gaussian time series, including time series of count data, volatility models, and functional time series. Poisson, negative binomial, and double Poisson distributions are used for count data. The chapter extends the idea of generalized linear models to generalized linear autoregressive and moving-average models. For functional time series, we focus on the class of convolution functional autoregressive models and employ sieve estimation with B-spline basis functions to approximate the true underlying convolution functions.
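To give a concrete feel for the simplest of these models, the following R sketch simulates a two-regime TAR(1) process of the kind introduced in Chapter 2. The autoregressive coefficients, the threshold of zero, and the sample size are arbitrary illustrative choices, not values taken from the book.

set.seed(42)
n <- 500
e <- rnorm(n)                # Gaussian innovations
y <- numeric(n)
for (t in 2:n) {
  if (y[t - 1] <= 0) {
    y[t] <- 0.8 * y[t - 1] + e[t]    # regime 1: lagged value at or below the threshold
  } else {
    y[t] <- -0.6 * y[t - 1] + e[t]   # regime 2: lagged value above the threshold
  }
}
plot.ts(y, main = "Simulated two-regime TAR(1) series")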

The book then turns to general (nonlinear) state space models (SSMs) in Chapter 6. Several models discussed in the previous chapters become special cases of this general SSM. In addition, some new nonlinear models are introduced under the SSM framework, including target tracking, among others. We then discuss methods for filtering, smoothing, prediction, and maximum likelihood estimation of the linear and Gaussian SSM via the Kalman filter. Special attention is paid to the linear Gaussian SSM because it is the foundation for further developments and can provide good approximations in many applications. Again, real examples are used to demonstrate various applications of SSMs. Chapter 7 is a continuation of Chapter 6. It introduces various extensions of the Kalman filter, including extended, unscented, and ensemble Kalman filters. The chapter then focuses on hidden Markov models (HMMs), to which the Markov switching model belongs. Filtering and estimation of HMMs are discussed in detail, and real examples are used to demonstrate the applications. In Chapter 8 we introduce a general framework of sequential Monte Carlo methods designed to analyze nonlinear and non-Gaussian SSMs. Some of the methods discussed are also referred to as particle filters in the literature. Implementation issues are discussed in detail, and several applications are used for demonstration. We do not discuss multivariate nonlinear time series, even though many of the models and methods discussed can be generalized.
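As a small preview of the linear Gaussian SSM machinery, here is a bare-bones R implementation of the Kalman filter for a univariate local-level model. The noise variances, initial conditions, and simulated data are arbitrary illustrative assumptions, not the book's notation or defaults.

# Local-level model: x_t = x_{t-1} + w_t,  y_t = x_t + v_t
kalman_local_level <- function(y, sigma2_w = 0.1, sigma2_v = 1) {
  n <- length(y)
  xf <- numeric(n)           # filtered state means
  Pf <- numeric(n)           # filtered state variances
  x_pred <- 0                # crude initial state prediction
  P_pred <- 10               # large initial variance (vague prior)
  for (t in 1:n) {
    K <- P_pred / (P_pred + sigma2_v)        # Kalman gain
    xf[t] <- x_pred + K * (y[t] - x_pred)    # measurement update
    Pf[t] <- (1 - K) * P_pred
    x_pred <- xf[t]                          # time update for t + 1
    P_pred <- Pf[t] + sigma2_w
  }
  list(filtered_mean = xf, filtered_var = Pf)
}

set.seed(1)
x <- cumsum(rnorm(200, sd = sqrt(0.1)))      # latent random-walk state
y <- x + rnorm(200)                          # noisy observations
out <- kalman_local_level(y)
plot.ts(y); lines(out$filtered_mean, col = "red")   # filtered estimate over the data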

Some exercises are given in each chapter so that readers can practice empirical analysis and learn applications of the models and methods discussed in the book. Most of the exercises use real data, for which no true model exists, but good approximate models can always be found using the methods discussed in the chapter.

Finally, we would like to express our sincere thanks to our friends, colleagues, and students who helped us in various ways during our research in nonlinear models and in preparing this book. In particular, Xialu Liu provided R code and valuable help in the analysis of convolution functional time series and Chencheng Cai provided R code of optimized parallel implementation of likelihood function evaluation. Daniel Peña provided valuable comments on the original draft. William Gonzalo Rojas and Yimeng Shi read over multiple draft chapters and pointed out various typos. Howell Tong encouraged us in pursuing research in nonlinear time series and K.S. Chan engaged in various discussions over the years. Last but not least, we would like to thank our families for their unconditional support throughout our careers. Their love and encouragement are the main source of our energy and motivation. The book would not have been written without all the support we have received.

The web page of the book is http://faculty.chicagobooth.edu/ruey.tsay/teaching/nts (for data sets) and www.wiley.com/go/tsay/nonlineartimeseries (for instructors).

R.S.T. Chicago, IL

R.C. Princeton, NJ

November 2017