Introduction to Linear Regression Analysis, 6e Solutions Manual

Douglas C. Montgomery

Description

A comprehensive and current introduction to the fundamentals of regression analysis. Introduction to Linear Regression Analysis, 6th Edition is the most comprehensive and current examination of the foundations of linear regression analysis. Fully updated for this sixth edition, the distinguished authors have included new material on generalized regression techniques and new examples to help the reader understand and retain the concepts taught in the book. The new edition focuses on four key areas of improvement over the fifth edition:

* New exercises and data sets
* New material on generalized regression techniques
* The inclusion of JMP software in key areas
* Carefully condensing the text where possible

Introduction to Linear Regression Analysis skillfully blends theory and application in both the conventional and less common uses of regression analysis in today's cutting-edge scientific research. The text equips readers to understand the basic principles needed to apply regression model-building techniques in various fields of study, including engineering, management, and the health sciences.




Table of Contents

Cover

Title Page

Copyright

Preface

Chapter 2: Simple Linear Regression

Chapter 3: Multiple Linear Regression

Chapter 4: Model Adequacy Checking

Chapter 5: Transformations and Weighting to Correct Model Inadequacies

Chapter 6: Diagnostics for Leverage and Influence

Chapter 7: Polynomial Regression Models

Chapter 8: Indicator Variables

Chapter 9: Multicollinearity

Chapter 10: Variable Selection and Model Building

Chapter 11: Validation of Regression Models

Chapter 12: Introduction to Nonlinear Regression

Chapter 13: Generalized Linear Models

Chapter 14: Regression Analysis of Time Series Data

Chapter 15: Other Topics in the Use of Regression Analysis

End User License Agreement



Solutions Manual to Accompany Introduction to Linear Regression Analysis

Sixth Edition

 

Douglas C. Montgomery
Arizona State University
School of Computing, Informatics, and Decision Systems Engineering
Tempe, AZ

Elizabeth A. Peck
The Coca-Cola Company (retired)
Atlanta, GA

G. Geoffrey Vining
Virginia Tech
Department of Statistics
Blacksburg, VA

 

Prepared by

Anne G. Ryan
Virginia Tech
Department of Statistics
Blacksburg, VA

 

 

 

This sixth edition first published 2022
© 2022 John Wiley and Sons, Inc.

Edition History
John Wiley and Sons, Inc. (5e, 2012)

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.

The right of Anne G. Ryan to be identified as the author of this work has been asserted in accordance with law.

Registered Offices
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA

Editorial Office
111 River Street, Hoboken, NJ 07030, USA

For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.

Wiley also publishes its books in a variety of electronic formats and by print-on-demand. Some content that appears in standard print versions of this book may not be available in other formats.

Limit of Liability/Disclaimer of Warranty
While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

Library of Congress Cataloging-in-Publication Data Applied for
ISBN 978-1-119-57869-7 (paper)

Cover Design: Wiley

PREFACE

This book contains the complete solutions to the first eight chapters and the odd-numbered problems for chapters nine through fifteen in Introduction to Linear Regression Analysis, Sixth Edition. The solutions were obtained using Minitab®, JMP®, and SAS®.

The purpose of the solutions manual is to provide students with a reference for checking their answers and seeing the complete solutions. Students are advised to work the problems on their own before consulting the solutions manual.

Anne R. Driscoll

Virginia Tech

Chapter 2: Simple Linear Regression

2.1

Source        d.f.    SS        MS
Regression      1     178.09    178.09
Error          26     148.87      5.73
Total          27     326.96

A 95% confidence interval for the slope parameter is −0.007025 ± 2.056(0.00126) = (−0.0096, −0.0044).

R² = 54.5%

A 95% confidence interval on the mean number of games won if opponents' yards rushing is limited to 2000 yards is 7.738 ± 2.056(.473) = (6.766, 8.711).
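Readers checking their own computations can reproduce intervals like these directly from the summary quantities reported in the solution. The following is a minimal sketch in Python (scipy supplies the t critical value); the slope estimate, standard errors, and degrees of freedom are taken from the Problem 2.1 solution above, and the variable names are chosen purely for illustration.

```python
from scipy import stats

# Summary quantities from the solution to Problem 2.1
b1, se_b1 = -0.007025, 0.00126    # slope estimate and its standard error
yhat, se_mean = 7.738, 0.473      # estimated mean games won at 2000 yards and its standard error
df = 26                           # error degrees of freedom from the ANOVA table

t_crit = stats.t.ppf(0.975, df)   # two-sided 95% critical value, about 2.056

print("95% CI for slope:", (b1 - t_crit * se_b1, b1 + t_crit * se_b1))
print("95% CI for mean response:", (yhat - t_crit * se_mean, yhat + t_crit * se_mean))
# Rounds to (-0.0096, -0.0044) and (6.766, 8.711), matching the solution.
```

The same recipe, estimate ± (t critical value)(standard error), applies to the other interval answers in this chapter.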

2.2 The fitted value is 9.14 and a 90% prediction interval on the number of games won if opponents' yards rushing is limited to 1800 yards is (4.935, 13.351).

2.3

Source        d.f.    SS       MS
Regression      1     10579    10579
Error          27      4103      152
Total          28     14682

A 99% confidence interval for the slope parameter is −21.402 ± 2.771(2.565) = (−28.51, −14.29).

R² = 72.1%

A 95% confidence interval on the mean heat flux when the radial deflection is 16.5 milliradians is 253.96 ± 2.052(2.35) = (249.15, 258.78).

2.4

Source        d.f.    SS        MS
Regression      1     955.34    955.34
Error          30     282.20      9.41
Total          31    1237.54

R² = 77.2%

A 95% confidence interval on the mean gasoline mileage if the engine displacement is 275 in³ is 20.685 ± 2.042(.544) = (19.573, 21.796).

A 95% prediction interval on the gasoline mileage of an individual car if the engine displacement is 275 in³ is 20.685 ± 2.042(3.116) = (14.322, 27.048).

Part d. is an interval estimate of the mean response at 275 in³, while part e. is an interval estimate for a future observation at 275 in³. The prediction interval is wider than the confidence interval on the mean because it accounts for both the error in the fitted model and the variability of the future observation.
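The relationship between the two interval widths can be made concrete. Assuming the usual simple linear regression formulas, the standard error for predicting a new observation at x₀ adds the residual mean square to the squared standard error of the estimated mean, so the sketch below (using the MSE and standard error reported in the Problem 2.4 solution) recovers both intervals.

```python
import math
from scipy import stats

# Quantities from the Problem 2.4 solution at x0 = 275 cubic inches
yhat = 20.685     # estimated mean gasoline mileage at x0
se_mean = 0.544   # standard error of the estimated mean response
mse = 9.41        # residual mean square from the ANOVA table
df = 30           # error degrees of freedom

# Predicting one new observation adds the residual variance to the variance of the mean
se_pred = math.sqrt(mse + se_mean**2)   # about 3.116

t_crit = stats.t.ppf(0.975, df)         # about 2.042
print("95% CI on the mean:", (yhat - t_crit * se_mean, yhat + t_crit * se_mean))
print("95% prediction interval:", (yhat - t_crit * se_pred, yhat + t_crit * se_pred))
# The prediction interval is wider only because se_pred carries the extra MSE term.
```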

2.5

Source        d.f.    SS        MS
Regression      1     921.53    921.53
Error          30     316.02     10.53
Total          31    1237.54

R² = 74.5%

The two variables seem to fit about the same. It does not appear that x1 is a better regressor than x10.

2.6

Source        d.f.    SS        MS
Regression      1     636.16    636.16
Error          22     192.89      8.77
Total          23     829.05

R² = 76.7%

A 95% confidence interval on the slope parameter is 3.3244 ± 2.074(.3903) = (2.51, 4.13).

A 95% confidence interval on the mean selling price of a house for which the current taxes are $750 is 15.813 ± 2.074(2.288) = (11.07, 20.56).

2.7

The test for significance of regression gives p = 0.003. The null hypothesis is rejected and we conclude there is a linear relationship between percent purity and percent of hydrocarbons.

R² = 38.9%

A 95% confidence interval on the slope parameter is 11.801 ± 2.101(3.485) = (4.48, 19.12).

A 95% confidence interval on the mean purity when the hydrocarbon percentage is 1.00 is 89.664 ± 2.101(1.025) = (87.51, 91.82).

2.8

This is the same as the test statistic for testing β₁ = 0, t = 3.39 with p = 0.003.

A 95% confidence interval for ρ can be obtained from the Fisher z transformation of r.
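The equivalence claimed in Problem 2.8 rests on the identity t = r√(n − 2)/√(1 − r²), which is algebraically the same as the t statistic for H₀: β₁ = 0. The sketch below checks this numerically; r is inferred from R² = 38.9% in Problem 2.7, and n = 20 is an assumption consistent with the 18 error degrees of freedom implied by the critical value 2.101 used there.

```python
import math
from scipy import stats

n = 20                 # assumed sample size (consistent with the 18 error df in Problem 2.7)
r = math.sqrt(0.389)   # correlation implied by R^2 = 38.9%, taking the slope's sign as positive

# Test of H0: rho = 0; identical to the t test of H0: beta1 = 0
t0 = r * math.sqrt(n - 2) / math.sqrt(1 - r**2)
p_value = 2 * stats.t.sf(abs(t0), n - 2)
print(f"t0 = {t0:.2f}, p = {p_value:.3f}")   # about 3.39 and 0.003, as reported above
```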

2.9 The no-intercept model has MSE = 21.029. The MSE for the model containing the intercept is 17.484. Also, the test of β₀ = 0 is significant. Therefore, the model should not be forced through the origin.
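The comparison made in Problems 2.9 and 2.11 (intercept versus no-intercept fits) can be reproduced for any data set by fitting both forms and comparing the residual mean squares along with the t test on the intercept. The sketch below uses numpy least squares on placeholder data, since the original exercise data are not reproduced in this excerpt; it illustrates the recipe, not the reported numbers.

```python
import numpy as np
from scipy import stats

# Placeholder data for illustration; substitute the actual x and y from the exercise.
x = np.array([2.0, 4.0, 5.0, 7.0, 8.0, 10.0, 12.0, 15.0])
y = np.array([3.1, 7.8, 9.5, 13.9, 16.4, 20.1, 24.3, 30.2])
n = len(x)

# Model with an intercept: y = b0 + b1*x
X1 = np.column_stack([np.ones(n), x])
beta, sse1, *_ = np.linalg.lstsq(X1, y, rcond=None)
mse_intercept = sse1[0] / (n - 2)

# No-intercept model: y = b1*x
X0 = x.reshape(-1, 1)
sse0 = np.linalg.lstsq(X0, y, rcond=None)[1][0]
mse_origin = sse0 / (n - 1)

# t test of H0: b0 = 0 in the intercept model
resid = y - X1 @ beta
s2 = resid @ resid / (n - 2)
se_b0 = np.sqrt(s2 * np.linalg.inv(X1.T @ X1)[0, 0])
t_b0 = beta[0] / se_b0
p_b0 = 2 * stats.t.sf(abs(t_b0), n - 2)

print("MSE with intercept:", mse_intercept, " MSE through origin:", mse_origin)
print("t for b0 = 0:", t_b0, " p-value:", p_b0)
# As in Problem 2.9 (17.484 vs 21.029), a smaller MSE with the intercept and a
# significant t test on b0 argue against forcing the fit through the origin.
```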

2.10

r = .773

t = 5.979 with p = 0.000; reject H₀ and claim there is evidence that the correlation is different from zero.

The computed Z₀ statistic does not fall in the rejection region |Z₀| > Zα/2 = 1.96, so we fail to reject H₀.

A 95% confidence interval for ρ is based on the Fisher z transformation of r, as sketched below.
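The Z₀ test and the confidence interval for ρ in Problem 2.10 are based on the Fisher z transformation, z = arctanh(r), which is approximately normal with standard deviation 1/√(n − 3). The sketch below is illustrative only: r = .773 comes from the solution, but the sample size n = 26 and the hypothesized value ρ₀ = 0.6 are assumptions, since neither is shown in this excerpt.

```python
import math
from scipy import stats

r = 0.773      # sample correlation from the solution
n = 26         # assumed sample size (not shown in this excerpt)
rho0 = 0.6     # assumed hypothesized value for H0: rho = rho0

# Fisher z transformation: arctanh(r) is approximately N(arctanh(rho), 1/(n - 3))
z_r = math.atanh(r)
z0 = (z_r - math.atanh(rho0)) * math.sqrt(n - 3)
print("Z0 =", round(z0, 2), "-> reject H0" if abs(z0) > 1.96 else "-> fail to reject H0")

# 95% confidence interval for rho: build the interval on the z scale, then back-transform
half_width = stats.norm.ppf(0.975) / math.sqrt(n - 3)
ci = (math.tanh(z_r - half_width), math.tanh(z_r + half_width))
print("95% CI for rho:", tuple(round(v, 3) for v in ci))
```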

2.11

The no-intercept model has MSE = 158.707. The model with the intercept has MSE = 75.357, and the test on β₀ is significant. The model with the intercept is superior.

2.12

F = 74,122.73, which is significant.

H₀: β₁ = 10000 vs H₁: β₁ ≠ 10000 gives t = (9.208 − 10)/.03382 = −23.4 with p = 0.000. Reject H₀ and claim that the usage increase is less than 10,000.

A 99% prediction interval on steam usage in a month with average ambient temperature of 58° is 527.759 ± 3.169(2.063) = (521.22, 534.29).
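The test in Problem 2.12 is an ordinary t test of the slope against a non-zero null value, t = (estimate − hypothesized value)/standard error. In the solution the slope is evidently expressed in units of 1,000 lb (9.208 is compared with 10), and 10 error degrees of freedom are inferred from the critical value 3.169 used in the prediction interval; both are assumptions in the sketch below.

```python
from scipy import stats

b1_hat = 9.208    # estimated slope, in units of 1,000 lb (so 10,000 lb enters as 10)
se_b1 = 0.03382   # standard error of the slope
b1_null = 10.0    # H0: beta1 = 10 in the same units
df = 10           # assumed error degrees of freedom (implied by t_crit = 3.169)

t0 = (b1_hat - b1_null) / se_b1
p_two_sided = 2 * stats.t.sf(abs(t0), df)
print(f"t0 = {t0:.1f}, two-sided p = {p_two_sided:.2e}")   # about -23.4, p essentially 0
```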

2.13

F = 349.688/973.196 = .359 with p = 0.558. The data suggest no linear association.

2.14

F = .0369/.0225 = 1.64 with p = 0.248. R² = 21.5%. A linear association is not present.

2.15

F = .32529/.00225 = 144.58 with p = 0.000. R² = 96%. There is a linear association between viscosity and temperature.
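Problems 2.13 through 2.15 each report the significance-of-regression statistic F = MSR/MSE on 1 and n − 2 degrees of freedom. The sketch below redoes the arithmetic for Problem 2.15; the error degrees of freedom are an assumed placeholder, since the sample size is not repeated in this excerpt.

```python
from scipy import stats

# Significance-of-regression test: F = MSR / MSE with (1, n - 2) degrees of freedom
msr, mse = 0.32529, 0.00225   # mean squares reported for Problem 2.15
df_error = 8                  # assumed error degrees of freedom; n is not shown in this excerpt

f0 = msr / mse
p_value = stats.f.sf(f0, 1, df_error)
print(f"F0 = {f0:.2f}, p = {p_value:.2e}")   # about 144.58 with p near zero
```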

2.16

F = 34286009 with p = 0.000, R² = 100%. There is an almost perfect linear fit to the data.

2.17

F = 226.4 with p = 0.000, R² = 93.8%. The model is a good fit of the data.

2.18

F = 13.98 with p = 0.001, so the relationship is statistically significant. However, the R² = 42.4%, so there is still a lot of unexplained variation in this model.

A 95% confidence interval on returned impressions for MCI (x=26.9) is

A 95% prediction interval is

2.19

F = 72.09 with p = 0.000, R² = 75.8%. The model is a good fit of the data.

The SLR model relating satisfaction to age fits much better than the SLR model relating satisfaction to severity in terms of R²: R² = 75.8% for the satisfaction and age model compared to R² = 42.7% for the satisfaction and severity model.

2.20

F = 7.51 with p = 0.016, R² = 34.9%. The engineer is correct that there is a relationship between initial boiling point of the fuel and fuel consumption. However, the R²