QUALITY IN THE ERA OF INDUSTRY 4.0
Enables readers to use real-world data from connected devices to improve product performance, detect design vulnerabilities, and design better solutions.
Quality in the Era of Industry 4.0 provides an insightful guide to harnessing user performance and behavior data through AI and other Industry 4.0 technologies. This transformative approach enables companies not only to optimize products and services in real time but also to anticipate and mitigate likely failures proactively. In a succinct and lucid style, the book presents a pioneering framework for a new paradigm of quality management in the Industry 4.0 landscape. It introduces groundbreaking techniques such as utilizing real-world data to tailor products for superior fit and performance, leveraging connectivity to adapt products to evolving needs and use cases, and employing cutting-edge manufacturing methods to create bespoke, cost-effective solutions with greater efficiency. Case examples featuring applications from the automotive, mobile device, home appliance, and healthcare industries illustrate how these new quality approaches can be used to benchmark product performance and durability, support smart manufacturing, and detect design vulnerabilities.
Written by a seasoned expert with experience teaching quality management in both corporate and academic settings, Quality in the Era of Industry 4.0 covers topics such as:
* Evolution of quality through industrial revolutions, from ancient times to the first and second industrial revolutions
* Quality by customer value creation, explaining differences in producers, stakeholders, and customers in the new digital age, along with new realities brought by Industry 4.0
* Data quality dimensions and strategy, data governance, and new talents and skill sets for quality professionals in Industry 4.0
* Automated product lifecycle management, predictive quality control, and defect prevention using technologies like smart factories, IoT, and sensors
Quality in the Era of Industry 4.0 is a highly valuable resource for product engineers, quality managers, quality engineers, quality consultants, industrial engineers, and systems engineers who wish to take a participatory approach toward data-driven design, economical mass customization, and late differentiation.
Page count: 761
Year of publication: 2023
Cover
Table of Contents
Title Page
Copyright Page
Preface
Brief Description of the Book
Explanation of the Significance of Discussing Quality in the Context of Industry 4.0 and AI
Target Audience
Motivations to Write This Book
Book Structure and Chapter Summaries
Acknowledgments
1 Evolution of Quality Through Industrial Revolutions
1.1 Quality Before Industrial Revolutions
1.2 Quality in the First Industrial Revolution
1.3 The Second Industrial Revolution and the Birth of Modern Quality Management
1.4 The Third Industrial Revolution and the Maturity of Modern Quality Management System
1.5 Current Challenges and Difficulties for Quality Management
1.6 Summary
References
2 Evolving Paradigm for Quality in the Era of Industry 4.0
2.1 Current Quality Definitions and Paradigms
2.2 Changes Brought by Industry 4.0
2.3 Quality 4.0
2.4 Hidden Gems: Lesser Known but Potent Ideas on Quality
2.5 Evolving Paradigm for Quality in the Era of Industry 4.0
References
3 Quality by Design and Innovation
3.1 The Trend of Quality: Going Upstream
3.2 The Journey into Quality by Design
3.3 Design for Six Sigma, A Serious Attempt for Quality by Design
3.4 Quality by Design in the Era of Industry 4.0
3.5 Customer Value Creation by Innovation
3.6 Quality Management and Assurance in Early Product Life Cycle
References
4 Quality Management in the Era of Industry 4.0
4.1 Introduction
4.2 Smart Factory
4.3 Quality Management for Smart Supply Chain
4.4 Quality Management in After‐Sale Customer Service
4.5 Quality Management for Service Industry
4.6 Digital Quality Management System Under Industry 4.0
References
5 Predictive Quality
5.1 Introduction
5.2 Elements of Predictive Quality
5.3 Exploration of Predictive Quality Models
5.4 Performance Metrics in Predictive Modeling
5.5 Application of Predictive Quality in Various Industries
5.6 The Challenges and Limitations of Predictive Quality
5.7 The Future of Predictive Quality
References
6 Data Quality
6.1 Introduction
6.2 Data and Data Quality
6.3 Data Quality Dimensions and Measurement
6.4 Data Quality Management
6.5 Data Governance
6.6 The Role of Quality Professionals
6.7 Future Trends in Data Quality
References
7 Risk Management in the 21st Century
7.1 Introduction
7.2 Deciphering the Nature of Risk
7.3 Risk Management Frameworks
7.4 Risk Management Techniques
7.5 Technology and Risk Management
7.6 Resilience and Business Continuity
References
8 Emerging Organizational Changes in the 21st Century
8.1 The Continuously Shifting Landscape of Organizational Structures
8.2 Impact of Technological Advances on Organizational Structures
8.3 Emerging Organizational Models in the 21st Century
8.4 Future of Organizational Structures
8.5 The Impact on Quality Professionals
8.6 Required Skills and Knowledge for Quality Professionals in the Future
References
Index
End User License Agreement
Chapter 1
Table 1.1 Evolution of the Quality Concept in Step with the Different Industrial Revolutions
Chapter 2
Table 2.1 Christopher Alexander's Design Quality Practices
Table 2.2 Industrial Revolutions and Quality (ASQ)
Chapter 3
Table 3.1 Traditional Product Life Cycle and Quality Assurance
Table 3.2 Survey Form for Customer Values
Table 3.3 Customer Value Studies of Gallbladder Operations Endo‐Surgery Ver...
Table 3.4 QoE Assessment Scale
Chapter 5
Table 5.1 Confusion Matrix
Table 5.2 Feature Values and Quality Status of Each Sample
Chapter 6
Table 6.1 Metadata Example
Table 6.2 Example of Transactional Data
Table 6.3a Customer Master Data
Table 6.3b Product Master Data
Table 6.4 Reference Data Example
Table 6.5 Operational Data Example
Table 6.6 Example of Measuring Completeness
Table 6.7a Customer Database—System A
Table 6.7b Customer Database—System B
Table 6.8 Example of Timeliness in Data
Table 6.9 Example of Validity in Data
Table 6.10 Example of Uniqueness in Data
Table 6.11 Example of Integrity in Data
Table 6.12 Example of Relevance in Data
Table 6.13 Example of Reliability in Data
Table 6.14 Data Quality in Multiple Dimensions
Table 6.15 A Sample Data Quality Assessment Report
Chapter 7
Table 7.1a Risk Likelihood Scale
Table 7.1b Risk Impact Scale
Table 7.2 Sample Risk Assessment Table
Table 7.3 A Sample Risk Matrix
Chapter 1
Figure 1.1 Google Search Trends for Quality Management and Six Sigma
Figure 1.2 Comparison Between Cars Produced from Craftsman and Mass Producti...
Figure 1.3 Key Industry 4.0 Technologies
Figure 1.4 Evolutions of Customers and Manufacturers
Figure 1.5 Initial Quality Rating and Market Capitalization of J.D. Power fo...
Figure 1.6 Target Group Index for Gen Z Consumers
Figure 1.7 Increasing Shares of Smart Products in Appliance Market
Figure 1.8 Increasing Software Contents for Automotives
Chapter 2
Figure 2.1 Village in Santorini.
Figure 2.2 The NIST Smart Manufacturing Ecosystem
Figure 2.3 The Industry 4.0 Technology Pyramid.
Figure 2.4 How CosmoPlat Interacts with the Manufacturing Ecosystem
Figure 2.5 Fully Connected and Interactive Product Development Process
Figure 2.6 Sports Utility Vehicle's Customer Value.
Figure 2.7 The Value of Brands
Figure 2.8 Quality Pyramid: Individualized Customer Value
Chapter 3
Figure 3.1 Effect of Design Phases on Life Cycle
Figure 3.2 Innovation Road Map
Figure 3.3 Typical Fitness Trackers.
Figure 3.4 Customer Value Curves of Cirque du Soleil
Figure 3.5 East Gate Center.
Figure 3.6 Apple's First Mouse.
Figure 3.7 Procter & Gamble's Swiffer.
Figure 3.8 Development of a Self‐Cleaning Air Conditioning
Figure 3.9 Nest Learning Thermostat.
Figure 3.10 User Behavior According to the Telematics Data from Vehicle
Figure 3.11 Typical Car Usage Experience Scenarios and Scenario Chain
Figure 3.12 Urban Traveling Refining to QoE Merit Scores
Figure 3.13 Refined Scenario Tree to Obtain QoE Merits Rating Scores
Figure 3.14 Agile Development Model
Figure 3.15 Agile Development Process for Embedded System Products
Chapter 4
Figure 4.1 Certified Global Lighthouse Smart Factories in 2023
Figure 4.2 A 3D Scanning Image of an Automotive Part with Deviation Spotted...
Figure 4.3 Traditional Automotive Body Assembly Process
Figure 4.4 Tesla's Giga Press.
Figure 4.5 Siemens MindSphere for Supply Chain
Figure 4.6 Workflow of 3‐D Digital Dentistry
Chapter 5
Figure 5.1 Two Categories Classification Problem
Figure 5.2 Separating Hyperplane and Supporting Vectors
Figure 5.3 A Basic Neural Network Structure
Figure 5.4 Basic Structure of a Feedforward Neural Network
Figure 5.5 An Example of AUC‐ROC Curve in Healthcare Applications.
Figure 5.6 LQC Monitoring System for Defect Prediction
Figure 5.7 LQC Monitoring System for Defect Detection
Figure 5.8 3D Patterns for Features and Quality Classification
Figure 5.9 1‐Feature Quality Classification Analysis
Figure 5.10 I‐MR Univariate Control Charts for Features 1–3
Figure 5.11 2D Quality Classification Analysis (Features 1 and 2)
Figure 5.12 Linear Separation Scheme Based on the SVM, Separating Hyperplane...
Figure 5.13 Car Body with No Significant Deviations, Good Quality Samples
Figure 5.14 Car Bodies with Significant Deviations, Defective Samples
Figure 5.15 Confusion Matrixes of Training, Validation, and Test Sets
Chapter 6
Figure 6.1 Proactive Data Quality Management
Kai Yang
Wayne State University, Michigan, USA
Copyright © 2024 by John Wiley & Sons, Inc. All rights reserved.
Published by John Wiley & Sons, Inc., Hoboken, New Jersey. Published simultaneously in Canada.
No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per‐copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750‐8400, fax (978) 750‐4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748‐6011, fax (201) 748‐6008, or online at http://www.wiley.com/go/permission.
Trademarks: Wiley and the Wiley logo are trademarks or registered trademarks of John Wiley & Sons, Inc. and/or its affiliates in the United States and other countries and may not be used without written permission. All other trademarks are the property of their respective owners. John Wiley & Sons, Inc. is not associated with any product or vendor mentioned in this book.
Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762‐2974, outside the United States at (317) 572‐3993 or fax (317) 572‐4002.
Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.
Library of Congress Cataloging‐in‐Publication Data Applied for
Hardback ISBN: 9781119932444
Cover Design: Wiley
Cover Images: © chombosan/Alamy Stock Photo; metamorworks/Getty Images
Quality in the Era of Industry 4.0: Integrating Tradition and Innovation in the Age of Data and AI explores the complex interplay between the emerging technologies behind Industry 4.0, such as artificial intelligence (AI), big data analytics, and smart manufacturing, and the multifaceted domain of quality management. The primary focus of this book is to examine how these disruptive technologies, particularly the rapid advancements in AI, are altering the paradigms of quality, from design to delivery and beyond.
Key themes covered in this book include:
Historical Evolution of Quality:
Tracing the evolution of quality management over time, culminating in the contemporary transformation driven in part by Industry 4.0 and AI.
Changing Paradigms:
Exploring how conventional quality management and quality engineering must be complemented, and in some instances replaced, by new realities and technology‐driven approaches.
Design and Innovation:
Charting the evolution of quality by redirecting its focus toward the early stages of the product or service lifecycle. This involves harnessing the capabilities of cutting‐edge technologies like connectivity and AI to streamline product design and service delivery, resulting in superior customer value through dynamic, real‐time iterations among stakeholders, including customers.
Predictive Quality and Data:
Investigating how AI-driven analytics and machine learning models are becoming foundational to predictive quality management, allowing businesses to anticipate and address issues before they escalate into critical failures. Data quality emerges as a critical component in today's digital age; practical guidance on ensuring data integrity is provided.
Risk Management:
Examining the integration of AI into risk assessment, utilizing data analytics to predict potential future risks, and devising proactive strategies for mitigation. In the rapidly evolving landscape of the 21st century, effective risk management has become an indispensable cornerstone of quality assurance, helping organizations navigate the complexities of modern operations.
Future of Organizational Structures:
Analyzing the impact of technological advancements such as Industry 4.0 and AI on organizational structures, subsequently influencing mechanisms for quality management.
In an era marked by rapid technological changes, with AI standing at the forefront, the concept of quality is expanding beyond traditional realms like defect control and customer satisfaction. Industry 4.0 and AI together introduce disruptions and opportunities that necessitate a reassessment of quality management practices. Smart manufacturing systems, real‐time data analytics, interconnected business ecosystems, and AI technologies are not only enhancing but also revolutionizing how we define, implement, and evaluate “quality.”
The recent impressive developments in AI, such as advanced neural networks, natural language processing, and robotics, are adding layers of complexity and capability to quality management systems. These advancements make it not just possible but imperative to incorporate AI into discussions of quality in Industry 4.0. Issues like data quality, cybersecurity, and workforce reskilling are becoming increasingly intricate due to AI's growing role in organizational processes.
By focusing on these themes and exploring their intersection, this book aims to provide quality professionals, academicians, and industry leaders with the tools and insights needed to navigate the evolving landscape of quality management.
The content and insights provided in Quality in the Era of Industry 4.0: Integrating Tradition and Innovation in the Age of Data and AI are specifically designed to cater to a diverse audience with varying levels of expertise and interest in quality management and AI. Following are the key target audience groups:
Industry Professionals:
This book aims to serve as a comprehensive guide for quality managers, engineers, data analysts, and other professionals working in fields disrupted by Industry 4.0 and AI. It offers actionable insights and methodologies for adapting to new paradigms of quality management.
Academicians and Researchers:
Professors, scholars, and students in disciplines such as industrial engineering, data science, and business management will find a wealth of research material, case studies, and future research avenues. The book also serves as an up-to-date academic resource that reflects the latest trends and technologies, including advancements in AI.
Corporate Leaders and Decision-Makers:
CEOs, CTOs, and other executives can benefit from this book by gaining an understanding of how Industry 4.0 and AI are transforming quality management. The book provides strategic insights into integrating technology-driven quality measures and risk assessments into organizational processes.
Consultants and Advisors:
Those involved in advising firms on quality management and technological adoption will find this book to be a valuable resource. It covers the breadth and depth of quality management changes brought on by AI and Industry 4.0, providing a robust framework for consultation services.
Policy Makers:
With Industry 4.0 and AI ushering in a new regulatory environment, policy makers can gain insights into the complexities and challenges posed by these technologies, helping them craft more informed and effective policies for quality control and data management.
Technology Enthusiasts and General Readers:
Anyone interested in the intersection of technology and quality will find this book to be an accessible and enlightening read. It aims to demystify the complexities surrounding Industry 4.0 and AI, making the content relatable to those without a specialized background.
Data Scientists and AI Practitioners:
Given the significant focus on predictive quality and data analytics, professionals in these fields will find relevant methodologies and algorithms that can be directly applied to AI-driven quality management systems.
Non-Profit Organizations and Activists:
Those interested in ethical manufacturing, consumer protection, and the societal implications of AI will find the book's sections on risk management and ethical considerations particularly relevant.
By addressing the needs and curiosities of such a varied audience, this book aims to be a comprehensive, go‐to resource for anyone looking to understand the transformative impact of Industry 4.0 and AI on the world of quality management.
The decision to write Quality in the Era of Industry 4.0: Integrating Tradition and Innovation in the Age of Data and AI was spurred by multiple motivating factors, each contributing to the overall purpose and drive behind this endeavor. Here are the key motivations:
Stagnation in Quality Theories and Methods:
One of the primary motivators was the observation that the field of quality management has experienced a significant stagnation in terms of theory and methods. The core principles and methodologies that have served as the backbone of quality management have seen little innovative evolution in the past 60 years. This static nature presented an urgent call to rejuvenate the field and explore new avenues and frameworks.
The Advent of Industry 4.0, AI, and Big Data:
The emergence of technologies under the Industry 4.0 umbrella, particularly AI and big data, has radically altered the operational landscape across industries. These advancements have not only opened up opportunities for redefining quality but also made it imperative to reevaluate traditional quality management frameworks. The book aims to bridge this gap by providing cutting-edge insights into how Industry 4.0 and AI are transforming the field.
Rich Experience in the Field of Quality:
With several published books in the quality domain, including works on Design for Six Sigma (DFSS), Voices of Customers, and Applied Statistical Methods, along with my research and consulting experience, I bring a deep-rooted understanding and expertise in quality management. This extensive experience allows for a nuanced understanding of the field's intricacies and forms the backbone of the actionable insights offered throughout the book.
Community Consensus on the Need for Change:
There is a growing consensus among professionals and governing bodies about the urgent need for change in quality management paradigms. Prominent organizations like the American Society for Quality (ASQ) have introduced concepts such as “Quality 4.0,” acknowledging the transformation brought about by Industry 4.0. This book aims to contribute to that discourse, offering a comprehensive look at evolving paradigms and practical implementations.
The synthesis of these motivating factors lends this book its focus and urgency. It is an endeavor to not only highlight the challenges and opportunities introduced by Industry 4.0 and AI but also act as a catalyst in propelling the field of quality management into a future teeming with possibilities and innovations.
To ensure the reader gains a holistic and nuanced understanding of quality management in the age of Industry 4.0 and AI, the book is organized into eight comprehensive chapters, each serving a specific purpose and building upon previous chapters. Here is an overview:
Chapter 1
: Evolution of Quality Through Industrial Revolutions
This chapter serves as the foundational block, tracing the journey of quality management through various industrial revolutions. It not only sets the historical context but also introduces the reader to the concept of Industry 4.0 and its potential impact on quality management.
Chapter 2
: Evolving Paradigm for Quality in the Era of Industry 4.0
After establishing the historical backdrop, the book delves into the current shifts in quality paradigms, made more complex and challenging by technologies like AI and super connectivity. The chapter scrutinizes the emergence of “Quality 4.0,” as outlined by the ASQ.
Chapter 3
: Quality by Design and Innovation
This chapter explores the role of design in quality, especially in a landscape transformed by AI and Industry 4.0 technologies. Key industry case studies like Apple and Samsung provide actionable insights.
Chapter 4
: Quality Management in the Era of Industry 4.0
Focusing on practical applications, this chapter goes deep into how quality management is changing in various sectors from manufacturing to service, thanks to smart technologies and AI‐driven data analytics.
Chapter 5
: Predictive Quality
This chapter dives into one of the most cutting‐edge applications of AI in quality—predictive analytics. With a focus on AI‐powered predictive models, the chapter serves as a how‐to guide for implementing predictive quality measures across various sectors.
Chapter 6
: Data Quality
Data serves as the backbone of AI and Industry 4.0, making this chapter crucial. It covers everything from the basics of data quality to advanced data governance strategies, all the while emphasizing its importance in modern quality management.
Chapter 7
: Risk Management in the 21st Century
Taking a holistic view, this chapter explores how risk management is being redefined by AI and data analytics. It provides a comprehensive discourse on both traditional and emerging frameworks in risk management, making it highly relevant in today's volatile environment.
Chapter 8
: Emerging Organizational Changes in the 21st Century
The final chapter delves into the human and organizational aspects of quality management in the age of AI and Industry 4.0. It explores how roles, skills, and organizational structures are evolving and what that means for quality professionals.
By systematically addressing these various facets, the book offers a multi‐dimensional view of quality in the age of Industry 4.0 and AI. It is designed to be not just a resource but a guide that challenges, educates, and inspires its readers to think differently about quality.
I am profoundly grateful for the invaluable advice and encouragement I have received from a host of individuals throughout the development of this book. My heartfelt thanks go to Gregory Watson, Larry Smith, Nicole Radziwill and Ron Atkinson of the American Society for Quality (ASQ), as well as Dr. Jack Feng from the Institute of Industrial and Systems Engineers. I am also deeply appreciative of my colleagues on ASQ’s Quality 4.0 Technical Program Committee: Christiana Hayes, Wendy Diezler, Kristine Bradley, Nicole Johnson, Lisa Custer and Manny Veloso.
Special recognition is extended to Mr. Lin Ben, Mr. Xueping Lin, Mr. Lingyun Dong, and Mr. Tony (Wei) Tong for their invaluable contributions, offering in‐depth insights into real‐world Industry 4.0 practices.
I extend my sincere gratitude to Dr. Ben Mejabi for his significant input in Chapter 2, which explores the paradigms on quality set forth by Robert Pirsig and Christopher Alexander.
My thanks also go to Dr. Zuoping Yao and Dr. Jie Hu, who enriched Chapter 3 with their expertise on Quality of Experience practices at SAIC‐GM‐Wuling Automobile Company.
Contributions from Yuping Liu and Hongjuan Yao have enriched Chapters 2 and 3, offering vital perspectives on super‐connectivity and co‐creation practices of Haier Group.
I am very grateful to Dr. Carlos Alberto Escobar Diaz, Dr. Ruben Morales‐Menendez, and Dr. José Antonio Cantoral Ceballos of Tecnológico de Monterrey for their scholarly contributions to Chapter 5.
Feedback from readers is invaluable; your comments and suggestions are most welcome. Rest assured, I will give thoughtful consideration to your input for future editions of this book.
Industry 4.0, also known as the Fourth Industrial Revolution, is fueled by a remarkable suite of emerging technologies. These encompass everything from artificial intelligence and interconnected smart devices to groundbreaking advancements in nanotechnology. Such innovations are spawning unprecedented capabilities, fostering dynamic shifts in interpersonal communications, societal relationships, business operations, and service delivery. They are also stimulating product usage diversity and heightening production flexibility.
The momentum and breadth of this revolutionary transformation are intrinsically tied to the maturation rates of these underlying technologies, which are evolving at a swift pace. Historical precedents show us that each industrial revolution has deeply influenced management sciences and practices, including those in quality management. There is no reason to assume that the Fourth Industrial Revolution will be any different—it is poised to reshape our world in ways we are only beginning to comprehend.
Established quality management systems, such as total quality management (TQM), originated in the early period of the previous century. These systems were designed in response to the needs of the Second Industrial Revolution, which was characterized by mass production. Their primary goal was to mitigate defects and nonconformities within manufacturing processes. It is undeniable that these quality systems and methodologies have proven robust over time, consistently enhancing quality. They have undergone regular upgrades to stay relevant, and they continue to be fundamental components of our contemporary operational landscape.
However, the field has seen no innovative changes for nearly 60 years; many of its methods and tools are outdated and are used less and less in practice. Figure 1.1 illustrates Google search trends for quality management and Six Sigma, both clearly trending downward since 2004. Pursuing good quality is human nature, and it will never go out of style; it is the current quality management theories and practices that have become stagnant [1–3]. It is time for a breakthrough in quality management.
Quality is an ancient, complex, and eternal concept. Throughout history, advances in science and technology have periodically triggered industrial revolutions; new production systems emerge as a result, and the quality paradigm typically evolves to keep pace with these changes. Since the central theme of this book is “Quality in the Era of Industry 4.0,” it is beneficial to trace how quality has evolved through all previous industrial revolutions; doing so helps us anticipate what is next for quality in Industry 4.0. That is the key topic of this chapter.
Figure 1.1 Google Search Trends for Quality Management and Six Sigma
Throughout the course of history, quality has consistently manifested as an elegant, profound, yet elusive concept. The term “quality” has its roots in ancient Rome, derived from “qualis,” signifying a “degree of excellence” in things or human values [4]. This concept echoes the philosophical musings of Greek thinkers such as Plato and Socrates on the subject of epistemology. Further exploration by philosophers, including John Locke, introduced the duality of quality: the objective aspect, which is inherent in the items themselves, and the subjective aspect, reflecting individual perceptions of these items [4].
With the burgeoning consumer markets that began in the 1600s, products started reaching a mass audience. This meant a diverse array of consumer expectations and needs, compounded by a wide spectrum of economic capabilities. Consequently, the interpretation of quality began to transition from an “absolute excellence” toward the concepts of “value for money” and “cost‐effectiveness.”
In the business realm, “value” has emerged as a critical measure of quality. In this context, value is defined as the ratio of a product’s benefits to its cost. Prior to the era of industrialization, goods were predominantly crafted through craftsman production systems. In such setups, a craftsman or a team would oversee the entire design and production process, from inception to completion.
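This benefit-to-cost definition of value can be sketched in a few lines of code. The figures below are hypothetical, chosen only to illustrate how the ratio compares two offerings; they are not drawn from the book.

```python
# Value defined as the ratio of a product's benefits to its cost.
# The numbers are hypothetical, for illustration only.

def value_ratio(benefits: float, cost: float) -> float:
    """Return the benefit-to-cost ratio used here as a simple measure of value."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return benefits / cost

# Product B delivers fewer total benefits, but at a much lower cost.
product_a = value_ratio(benefits=900.0, cost=300.0)  # 3.0
product_b = value_ratio(benefits=600.0, cost=150.0)  # 4.0

# Under this definition, B offers the higher value despite its smaller benefits.
print(product_a, product_b)
```

The point of the sketch is that value is relative: a cheaper product with modest benefits can outrank a richer but costlier one.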
Craftsmen typically possessed high levels of skill, dedication, and self‐discipline. They approached their work with an artisan’s mindset, finely attuned to the client's subjective evaluation of quality or their esthetic sensibilities. These highly esteemed craftsmen were, in essence, the arbiters of quality for their products. As highlighted by Juran [5], they meticulously monitored every detail of design and production.
During this era, quality assurance relied heavily on the integrity of the craftsmen who valued their reputation highly. Consumers, too, played a role by carefully inspecting the products. In many parts of Europe, guilds implemented self‐inspection and certification protocols, significantly enhancing the branding and quality assurance of artisans and their workshops. An official standard verification model was also prevalent, with government‐led verification and oversight mechanisms being particularly advanced in China since the Qin and Han dynasties [5].
Throughout this period, the craftsman held primary responsibility for the product. This role was sometimes played by an individual entrepreneur, at other times by a team, albeit typically a small one. The craftsman managed all aspects of the product's design, production, and delivery. When required, craftsmen liaised closely with clients to ensure a high standard of work. Given their comprehensive oversight of the entire process, they could ensure full production control, preclude coordination issues, and maintain a tight rein on product defects.
This approach remained in vogue until the onset of the Second Industrial Revolution.
The First Industrial Revolution unfolded in Great Britain between the 18th and 19th centuries. The driving forces behind this revolutionary period included the use of fossil fuels for power generation and the advent of steam engines for trains and ships. These developments were bolstered by significant breakthroughs in the iron, steel, and chemical industries and the rise of the machine‐driven textile industry.
The First Industrial Revolution incited transformative changes across economies, societies, and lifestyles. While it did not fundamentally alter the craftsman model, it did raise the complexity of many products well beyond that of traditional handicraft items. Consequently, the teams of craftsmen expanded, giving rise to a workshop‐based supplier system—commonly known as the “cottage industry.” This new system provided parts, materials, and subsystems for complex products. To coordinate this growing supplier system, the inception of early versions of industrial standards was necessary [6].
The craftsman model is characterized by its practitioners being “jacks‐of‐all‐trades”—highly skilled individuals who manage all aspects of production. However, a notable drawback of this model is the extensive training period required for craftsmen.
In the craftsman production model, the manufacturing of complex products lacks streamlining. Lengthy process transitions and preparation times lead to lower productivity, extended production duration, and consequently, higher costs. It is also important to note that the First Industrial Revolution did not significantly impact quality control methods.
The Second Industrial Revolution predominantly originated in the United States, spanning from the late 19th century to the early 20th century. This revolution was marked by the extensive adoption of electrical energy and the implementation of a mass production system. This system was grounded in moving assembly lines and the use of standard, interchangeable parts.
The influence of the Second Industrial Revolution on production methods and working practices was profound. It shaped modern quality management significantly and even influenced the definition of quality itself, an impact that persists to this day.
Since the First Industrial Revolution, scientific and technological advancements have accelerated rapidly, significantly boosting the production capacity of industrial materials. This led to the emergence of various complex new products, such as automobiles. Crafting such products via traditional craftsmen and workshops was prohibitively time‐consuming and expensive, making them accessible only to a select few.
The breakthroughs in electrification infrastructure presented opportunities to expedite the production process and reduce costs. Among those seizing this opportunity was Frederick Taylor, who proposed the “Scientific Management Method,” also known as Taylorism. This method proposed several core principles [7]:
Break down large, complex tasks, such as automobile assembly, into numerous smaller steps—potentially in the thousands.
Divide labor among workers, assigning each to a specific process step, with each worker performing the same task repetitively.
Utilize stopwatches and motion analysis to determine the most efficient work practices and use these as a standard for training workers.
Abandon the craftsman production system completely by segregating management, design, and production sectors, and by instituting an extensive division of labor and professions, marking a clear distinction between “blue collars” and “white collars.”
Another pivotal figure in the Second Industrial Revolution was Henry Ford, founder of the Ford Motor Company and the originator of the moving assembly line operation and the mass production system [8]. His core contributions include:
Full utilization of standardized, interchangeable parts across the entire industry—preceding Ford, the American firearms industry had already implemented the practice of standardized parts in the late 19th century.
The first electrically powered moving assembly line: the assembly line comprised numerous workstations, each executing a single operation performed by one worker.
Simplified, standardized, and interchangeable worker skills.
At its inception, the moving assembly line exhibited the following characteristics:
The product (automobile) had a single design.
The assembly line was a rigid yet precise process, capable of producing only one specified product through a sequence of pre‐determined process steps.
The work of product design was also highly specialized and separated from production.
Compared to the craftsman production model, this mass production model drastically reduced, if not entirely eliminated, setup and change‐over times between consecutive process steps. Consequently, it markedly improved productivity and production capacity, substantially reduced the cost and sales price of products, and significantly broadened the consumer base. However, the product variety in the mass production system became quite limited, with monotonous styles and typically unimpressive esthetics.
Figure 1.2 Comparison Between Cars Produced from Craftsman and Mass Production Systems. (a) Mass Production: Ford Model T, 1922 price: US$319, 1.3 million produced per year.
Source: Shawshots / Alamy Stock Photo.
(b) Craftsmen Production: Rolls‐Royce Twenty £1600, 2940 cars produced from 1922 to 1929.
Source: Barker / Wikimedia Commons / CC BY‐SA 2.5
Figure 1.2 contrasts two types of cars—one produced by a mass production system and the other by a craftsman production system. The vast differences in cost, productivity, and style between these two cars are clearly discernible.
The mass production model, or the Taylor model, had a disruptive and profound impact on the economic activities, business models, lifestyles, and corporate organizations of that era, effects that largely persist today, particularly in Western countries.
Some key effects of the mass production system include the following features:
The decline of the craftsman/artisan class
: Craftsmen, highly skilled and versatile individuals, once constituted the backbone of key economic activities in ancient and pre‐modern times. With the proliferation of the mass production model, this class has become nearly extinct, now appearing predominantly in the luxury goods and handicraft industries.
Realignment of Society
: A highly specific division of labor and professions has become a common practice, extending beyond manufacturing to government, service, medical, and scientific research sectors.
The widespread adoption of well‐designed processes and standardization in all disciplines.
The mass production system also exerted a paradigm‐shifting, disruptive influence on quality control. In ancient times, quality represented “the degree of excellence,” and with the maturity of the mass consumer market, the concept of quality evolved to encompass “benefits versus cost” and “customer value.” Compared to handcrafted luxury products, when the mass production system substantially reduced product prices, consumers, driven by cost considerations, were willing to forgo luxurious features and decorations in favor of basic functions. For instance, the Ford Model T, available only in black, was inexpensive, functional, and reasonably reliable, making it the most popular car model of its time.
However, the early Taylorism mass production system was highly fragmented. Workers operated separately at different stations along the assembly line without communicating with each other, contrasting starkly with the craftsman model, where one craftsman would complete the entire order from start to finish.
Amid the surge in complexity attributed to the large number of components and steps involved in the mass production system, the susceptibility of products to errors and defects intensified. Consequently, the critical aspect of quality assurance shifted toward managing defects by guaranteeing conformance from every supplier and each stage of the production process. This need served as a catalyst for the evolution of modern quality management systems pioneered by distinguished figures such as Shewhart, Deming, and Juran [9].
Several fundamental quality systems and methodologies, such as statistical quality control (statistical process control [SPC]) [10] and total quality management (TQM) [11], took root during the pinnacle of the Second Industrial Revolution. Notably, following the widespread adoption of Taylorism’s mass production model, the working definition of quality underwent a transformation. It evolved from notions of “excellence” and “customer value” to the more tangible metrics of “defect free,” “low scrap rate,” and “low failure rate.” This interpretation of quality has persisted to this day, with scrap and failure rates remaining the sole quantifiable quality indicators assessed.
In conclusion, the emergence of modern quality management and quality engineering can be attributed to the pressing issue of handling, controlling, and eradicating defects and failures within mass production systems.
The transition in defining quality, from a “degree of excellence” to “free from defects and adherence to standards,” is a drastic shift. To make sense of this “definition of quality,” it is essential to explore the numerous studies that have debated this issue. This brief summary provides an overview. In their article, Reeves and Bednar [12] argue that perceptions of quality vary based on individual perspectives. For instance, from a consumer’s perspective, quality may signify “excellence,” “customer value,” or “surpassing expectations.” In contrast, producers perceive quality as “conformance to specifications” and low rates of scrap, failure, and defects. In the initial phases of the mass production model, a seller’s market prevailed due to product scarcity, which likely influenced this differentiation.
Garvin’s 1987 article [13] outlines eight dimensions of quality: performance, features, reliability, conformance, durability, serviceability, esthetics, and perceived quality. This comprehensive overview contemplates multiple perspectives, encompassing both “objective quality” and “subjective quality.” However, from then until the present, the quality community has prioritized aspects such as reliability, conformance, and durability.
Since the advent of the mass production system, the initial method of quality control involved inspecting products prior to shipping, repairing or discarding defective products, and retaining the good ones. Large repair shops were commonplace in early automobile factories. Sampling inspection [14], involving random sampling (or full inspection) of incoming lots to prevent defective supplier parts from reaching factories, was another common practice. If a batch failed this inspection, it would be returned to the supplier. Inspection was also used to weed out inferior products within factories before shipping them to customers.
However, this inspection approach had clear disadvantages. First, the inspection was labor intensive and time consuming, with the manual inspection proving less than foolproof. Should an issue be detected during inspection, the root cause could potentially lie anywhere within the thousands of process steps, due to worker errors, defective incoming parts, or machine failure. Identifying and eliminating these problems via end‐of‐production inspection proved challenging, hence their recurring nature. The production process, as shaped by the Taylor system, was fragmented and compartmentalized, complicating the swift identification and resolution of quality issues. For instance, in early 20th‐century automobile plants, the repair workshop, managed by the quality department, aimed to eliminate defects. This objective clearly conflicted with the assembly plant management’s goal of swiftly dispatching manufactured cars.
Since quality issues typically arise during the production process, and inspection is the last step, some argued that “quality control should move upstream.” Consequently, it was deemed necessary to control quality at critical upstream process steps. In 1931, American statistician Walter Shewhart’s book “Economic Control of Quality of Manufactured Product” formally defined quality control as a statistical issue and introduced the concept of statistical process control (SPC) [15]. This method of managing key processes marked a significant shift, with numerous subsequent quality methods and indicators (such as process capability indices) deriving from it and remaining in use today.
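Shewhart’s control limits lend themselves to a short illustration. The Python sketch below (a minimal illustration; the function names and sample figures are invented for this example, not taken from the SPC literature) computes the center line and 3‑sigma limits for an X‑bar chart from a stable baseline and then flags out‑of‑control subgroups:

```python
from statistics import mean

def xbar_limits(baseline_means, sigma_within, n):
    """Center line and 3-sigma limits for a Shewhart X-bar chart.

    baseline_means: subgroup means from a period of stable production
    sigma_within:   estimated within-subgroup standard deviation
    n:              subgroup size
    """
    center = mean(baseline_means)
    margin = 3 * sigma_within / n ** 0.5  # std dev of a subgroup mean is sigma/sqrt(n)
    return center - margin, center, center + margin

def out_of_control(subgroup_means, lcl, ucl):
    """Indices of subgroups whose mean falls outside the control limits."""
    return [i for i, m in enumerate(subgroup_means) if not lcl <= m <= ucl]

# Establish limits from stable production, then monitor new subgroups:
lcl, cl, ucl = xbar_limits([10.1, 9.9, 10.0, 10.2, 9.8], sigma_within=0.3, n=4)
flagged = out_of_control([10.1, 10.6, 9.9, 9.4], lcl, ucl)  # subgroups 1 and 3
```

An “out of control” signal does not identify the root cause by itself; as the text notes, it triggers a halt and an upstream investigation.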
Before World War II, the usage of control charts was not widespread among manufacturers; they were primarily utilized in Bell System factories. However, W. Edwards Deming and Joseph Juran [9], who were involved with these early initiatives, later emerged as key pioneers of modern quality management.
It is worth observing the gradual evolution of the mass production system between 1920 and 1950. In the 1910s, the Ford production system epitomized the first version of the mass production system. The assembly line was virtually inflexible, producing solely the Ford Model T. Initially, due to its low price and reasonable reliability, the Model T dominated the market. However, post‐1920s, General Motors (GM) employed several strategic measures to outmaneuver Ford:
They deviated from a single product line setup, introducing several brands catering to different price points to meet varying consumer needs. Each brand operated akin to a small company, all under the GM umbrella.
They initiated the practice of introducing new models annually. While there may not have been significant changes each year, the presence of new features was ensured. This marked the beginning of the mass production system’s gradual transition toward a flexible production system [16].
In the 1950s, Toyota’s lean production system emerged, enabling greater flexibility in production and an increased variety of products without sacrificing efficiency [17]. The advent of the Third Industrial Revolution in the 1960s, powered by computer and information technology, introduced further flexibility and control over the production process [18]. It also significantly improved product design capabilities. These advancements underscored the emphasis on another critical aspect of quality: a robust focus on customer satisfaction. This series of events precipitated the comprehensive evolution of quality management systems and quality engineering.
The initial wave of modern quality management unfolded in post‐World War II Japan. Although Japan had industrialized prior to the war, its reputation for consumer goods’ quality was unimpressive. The label “Made in Japan” was equated with cheap and low‐quality products during this period. The Korean War transformed Japan into a critical strategic hub for the US military, necessitating the rectification of their problematic, defect‐ridden, and subpar telephone communications within a stipulated timeframe. As a measure to “urge Japan to improve quality,” American quality experts W. Edwards Deming and Joseph Juran were invited to deliver open classes on quality management to Japanese corporate executives. The US military mandated the attendance of top Japanese company executives at these sessions [19].
Intriguingly, Deming and Juran had previously conducted similar lectures in the United States, but their insights garnered far less attention from American executives. However, their presentations in Japan led to significant impacts and spurred a series of follow‐up activities:
As per historical documents, the Japanese executives attending these sessions expressed significant interest in the Plan‐Do‐Check‐Act (PDCA), or Plan‐Do‐Study‐Act (PDSA), work process. This process entailed “identifying and defining a critical problem and persisting with it until it was resolved” [20].
At that time, Japan was grappling with severe economic hardships in the post‐war period. The nation needed to export consumer goods to alleviate these economic difficulties and sought to enhance the quality of its products to alter the reputation of Japanese goods. Consequently, many companies demonstrated considerable interest in this approach.
Following the absorption of American methodologies, several Japanese experts began propagating these principles via broadcast lectures tailored to the needs of Japanese business individuals. This marked the beginning of many Japanese experts and business professionals enhancing these quality methodologies originating from America and developing their own unique approaches.
Local expert Ishikawa Kaoru, a professor at the University of Tokyo, played a significant role. He translated and expanded on Deming and Juran’s teachings. In 1950, in collaboration with front‐line industry workers, he developed the cause‐and‐effect diagram (also known as the fishbone diagram). Post‐1960, he posited that not only middle and upper‐level enterprise managers should learn quality management, but first‐line supervisors and on‐site workers should as well. This belief led him to pioneer the “total quality control (TQC)” activity model and training method, which subsequently became the first practical version of the TQM system, underpinned by Ishikawa's 11 points [21].
Another local expert, Genichi Taguchi, joined Nippon Telegraph and Telephone Company (NTT) in 1950 with the aim of enhancing the quality and reliability of Japan's telecommunications industry. He later developed the Taguchi method to improve durability, stability, and reliability during product design—a method which was adopted seriously by Toyota and promoted worldwide post‐1980 [22].
Quality function deployment (QFD) is a unique design planning methodology originating from Japan, which is focused on incorporating the customer’s product expectations into the design process. The inventor of QFD, Yoji Akao, saw his concept first applied in Japan in 1966. QFD gained initial popularity in Japan before being adopted globally [23].
Professor Noriaki Kano of the Tokyo University of Science introduced the renowned Kano Customer Satisfaction Model (Kano Model) in the 1970s and 1980s. This model categorizes customer needs into three types (or five, according to some studies): must‐have, one‐dimensional, and attractive quality. Different categories of customer needs are addressed in different ways. Kano also developed a method to identify these three types of customer needs through customer surveys [24].
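Kano’s survey method pairs a “functional” question (how the customer feels if a feature is present) with a “dysfunctional” one (how they feel if it is absent) and maps each answer pair through an evaluation table. The sketch below follows the table as it is commonly reproduced in the Kano literature; the answer labels and helper names are illustrative, not a fixed standard:

```python
from collections import Counter

# Five-point answer scale used for both the functional and dysfunctional question.
ANSWERS = ("like", "must_be", "neutral", "live_with", "dislike")

# Kano evaluation table: functional answer -> categories indexed by
# the dysfunctional answer (same order as ANSWERS).
TABLE = {
    "like":      ("questionable", "attractive", "attractive",
                  "attractive", "one-dimensional"),
    "must_be":   ("reverse", "indifferent", "indifferent",
                  "indifferent", "must-have"),
    "neutral":   ("reverse", "indifferent", "indifferent",
                  "indifferent", "must-have"),
    "live_with": ("reverse", "indifferent", "indifferent",
                  "indifferent", "must-have"),
    "dislike":   ("reverse", "reverse", "reverse",
                  "reverse", "questionable"),
}

def kano_category(functional, dysfunctional):
    """Classify one survey answer pair into a Kano category."""
    return TABLE[functional][ANSWERS.index(dysfunctional)]

def classify_feature(responses):
    """Majority category across all respondents for one feature."""
    votes = Counter(kano_category(f, d) for f, d in responses)
    return votes.most_common(1)[0][0]
```

For example, a respondent who likes having the feature and dislikes its absence marks it as one‑dimensional quality, while liking its presence but being neutral about its absence marks it as attractive quality.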
Japanese anthropologist Jiro Kawakita proposed the affinity diagram methodology in the 1960s [25]. This method helps analyze data from large volumes of scattered, rambling words collected from customer surveys or brainstorming sessions. The technique can discern patterns and underlying structures within these words. It is now widely used to analyze and synthesize customer needs, serving as a fundamental technique for brainstorming and one of the seven tools of TQM.
Originating in Japan, Kansei Engineering is a product design methodology that stems from the observation that consumers possess conscious or subconscious preferences or dislikes for specific product features based on their personal feelings. These feelings can be evoked by a comprehensive psychological state induced by the customers’ five senses—sight, hearing, taste, smell, and touch. As such, designers must comprehend the correlation between these consumer perceptions and the features of the design, aiming to enhance the design’s positive impression. Kansei Engineering has evolved several robust methods to capture, quantify, and analyze sensations. For instance, it uses techniques such as analyzing the gaze, attention, facial expressions, and smiles captured in video recordings of visitors during a product exhibition. The outcomes of this human perceptual analysis are translated into product design elements, and products are manufactured to align with people’s preferences. Pioneering the introduction of perceptual analysis into the realm of engineering research were researchers from the Faculty of Engineering at Hiroshima University, Japan. Starting in 1970, with a focus on residential design that takes into account occupants’ emotions and desires, they investigated how to embody occupants’ sensibilities into engineering techniques used in residential design. Renowned expert Nagamachi Mitsuo has contributed significantly to this field, authoring works such as “Kansei Engineering of Automobiles” and “Kansei Engineering and New Product Development” [26].
Shigeo Shingo, a technical pioneer of the Toyota production system, devised key processes and techniques for quick die change and rapid process changeover to achieve zero inventory and flexible production [27]. As the Toyota production system underscores “one piece flow,” conditions such as “zero defects” and “zero failures” are indispensable. This led Shingo to extensively investigate quality control, arriving at several critical observations:
Shingo recognized the human tendency to make mistakes, acknowledging that perfection is unattainable. However, he believed that an effective system to automatically correct errors could prevent human mistakes from turning into significant defects or flawed products.
He categorized the prevalent post‐production inspection to separate good from bad products, often done manually, as “judgmental inspection.” Shingo pointed out several drawbacks to this method, such as the inevitability of human error and its general inability to identify the root cause of a quality issue.
Shingo found that SPC, which originated in America and was popular in Japan, had limitations. Statistical process control relies on sampling data from production for quality‐related judgment, requiring production to halt and root cause identification when the SPC charts indicate an “out of control” situation. Shingo argued that due to its reliance on sampling inspection and time lag in problem identification, SPC might miss defects and prove ineffective for retrospective investigation of root causes.
He advocated for three inspection methods to effectively guarantee zero defects: (i) step‐by‐step inspection, wherein the downstream process checks the upstream’s semi‐finished products, investigating any problem’s source immediately; (ii) self‐inspection, which involves checking the semi‐finished product before it leaves the process, and tracing the source of any issue promptly; (iii) source inspection, which finds and blocks the source of a problem to prevent its occurrence.
After an exhaustive study of existing quality inspection and control methods from the United States, Shigeo Shingo proposed his Poka‐Yoke (fool‐proof) quality assurance method. He outlined the following guiding principles:
To achieve zero defects, 100% inspection is necessary.
Judgmental inspection, if needed, should be objective, not manual.
Inspection must be low cost and automated.
On detecting faults and defects, the root cause should be identified immediately.
All hidden root causes should be identified and eliminated one by one.
Upon finding the source of the problem, it needs to be blocked at the source to prevent its occurrence using an automatic detection device, termed a Poka‐Yoke device.
Shingo emphasized the characteristics of the Poka‐Yoke device, which should be inexpensive, capable of 100% inspection, produce real‐time results, and be implemented either through technical means or process design.
Interestingly, Shingo also shared his thoughts on statistical methods and SPC. Initially, he viewed SPC and statistics as the ultimate solution to quality issues. However, he later saw SPC as a means of estimating and maintaining the current process, not necessarily improving it. He regretted that his early reverence for statistics and SPC might have delayed the perfection of his Poka‐Yoke system. Despite Shingo’s somewhat controversial views, both non‐statistical and statistical methods have their places in quality management, varying case by case and over time.
In practical terms, Shigeo Shingo has undeniably triumphed. The modern digital Poka‐Yoke system, governed by automatic sensors, facilitates automatic real‐time 100% inspection and online real‐time control, becoming the mainstay of online quality control [28].
Starting in the 1970s, Japan’s remarkable success in multiple domains posed significant competitive pressure and challenges to industries in Europe and America, prompting introspection across various sectors in the United States. In 1980, an NBC TV program titled “If Japan Can... Why Can’t We?” sparked heated discussions about Japan’s success and the appropriate American response.
W. Edwards Deming and Joseph Juran, who were acknowledged for introducing modern quality management to Japan, were then solicited by many large Western corporations, such as Ford and General Motors, to guide their operations. In the process, Deming and Juran learned about many Japanese practices, such as the total quality team, which they had greatly influenced during their involvement in Japan’s quality initiatives. From the 1970s onward, they published numerous works on quality management, including Deming’s “Out of the Crisis” and “The New Economics for Industry, Government, Education” [29], and Juran’s “Quality Planning and Analysis” and “Upper Management and Quality” [30].
They asserted that in a mass production system, the assurance of quality can only be achieved through comprehensive control and continuous improvement in all key process elements throughout the entire product lifecycle. Since all actions are performed by people, everyone needs to receive proper quality training and be empowered to take responsibility. They saw quality as a meticulously planned and executed systematic endeavor by the whole organization, a concept they referred to as TQM.
This concept of TQM received varying degrees of response and support from the governments and industries of Europe and America:
Beginning in 1984, the United States embraced the concept of TQM, initially within the military and the Department of Defense, and later across the US federal government.
From 1987 onward, the United States has annually evaluated and bestowed the Malcolm Baldrige National Quality Award [31] upon select outstanding companies and entities.
Since the 1990s, many European countries have standardized TQM, leading to the creation of ISO 9000 certification, which is now issued to companies worldwide.
The Third Industrial Revolution [18] marked a transition from mechanical and analog electronic technology to digital electronics. This shift, which began in the latter half of the 20th century, was characterized by the adoption and proliferation of digital computers and related information technology. These advancements had profound impacts on the manufacturing industry as sensors, computers, and information technology were continuously integrated into the production process. Consequently, important parameters in industrial processes, such as workpiece dimensional measurements, temperatures in chemical reactors, and real‐time pressure measurements in containers, could be collected in real time.
The technological developments of the Third Industrial Revolution had several significant impacts on quality management and quality engineering, particularly within the manufacturing industry:
Widespread Application of Automatic Sensors, Detectors, and Digital Poka‐Yoke Devices/Systems in the Production Process
: Starting from the 1980s, numerous manufacturing companies invested billions of dollars in installing these devices and systems in their factories. Their implementation significantly reduced production failures and assembly quality issues, leading to a marked improvement in production quality.
Computer‐Aided Design (CAD) and Computer‐Aided Analysis (CAE)
: A multitude of quality issues are attributed to poor design, affecting not only performance, esthetics, and features but also resulting in malfunctions, hidden safety concerns, and low reliability. Addressing these problems typically requires a significant investment of manpower, materials, and time. However, from the 1980s onward, the extensive use of CAD and CAE for simulation testing, product modeling, and virtual reality exercises on computer platforms allowed engineering teams to detect and rectify design issues quickly and inexpensively. Consequently, CAD and CAE significantly improved both the quality and speed of design.
Advances in computer capabilities facilitated the widespread application of numerous rigorous scientific methods. These included scientific modeling, statistical analysis, ultrafast computing, and the handling of vast digital data storage. These advances simplified the use of powerful quality tools such as SPC, design of experiments (DOE), and statistical modeling and analysis. Some advanced industrial companies were even capable of integrating sophisticated statistical methods with engineering disciplines such as mechanics and materials science, along with other specific sciences, to implement in‐depth and accurate process modeling. This process facilitated the monitoring, control, and optimization of manufacturing processes. These advancements in computer capabilities also served as a key enabling factor for the Lean Six Sigma initiatives in the quality community.
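Among the quality tools just mentioned, the process capability indices introduced earlier alongside SPC have a simple closed form: Cp compares the specification width to the 6‑sigma process spread, while Cpk additionally penalizes an off‑center process mean. The following is a minimal sketch; the function name and the sample figures are invented for illustration:

```python
def process_capability(mu, sigma, lsl, usl):
    """Process capability indices for spec limits [lsl, usl].

    Cp  = spec width / 6-sigma process spread (potential capability)
    Cpk = distance from mean to nearest limit / 3 sigma (actual capability)
    """
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# A slightly off-center process inside 9.0-11.0 spec limits:
cp, cpk = process_capability(mu=10.1, sigma=0.25, lsl=9.0, usl=11.0)
# cp is about 1.33; cpk drops to 1.2 because the mean sits above center
```

Cpk is always at most Cp, and the two coincide only when the process is perfectly centered between the limits.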
Six Sigma [32], first pioneered by Motorola, is a business operating system initially aimed at eliminating manufacturing defects. The term “Six Sigma” originates from the statistical field of process control, signifying the capacity of a manufacturing process to generate a very high proportion of output within specification. A process operating at “six sigma quality” in the short term is projected, after allowing for the conventional 1.5‑sigma long‑term drift in the process mean, to yield a long‑term defect rate below 3.4 defects per million opportunities (DPMO).
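The 3.4 DPMO figure can be reproduced directly from the normal distribution. The sketch below (an illustration; the function name is invented) applies the conventional 1.5‑sigma mean drift and evaluates the dominant upper‑tail probability:

```python
from statistics import NormalDist

def long_term_dpmo(sigma_level, shift=1.5):
    """Long-term defects per million opportunities for a given short-term
    sigma level, using the conventional 1.5-sigma mean drift and counting
    only the dominant (upper) tail of the normal distribution."""
    z = sigma_level - shift          # effective long-term z-score
    return (1 - NormalDist().cdf(z)) * 1_000_000

# A "six sigma" process corresponds to roughly 3.4 DPMO long term:
dpmo_six = long_term_dpmo(6.0)
```

By the same arithmetic, a 4.5‑sigma process yields roughly 1,350 DPMO, which shows how steeply the defect rate falls as capability improves.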
The concept and movement of Six Sigma were initiated as early as 1986 at Motorola and later spread to various manufacturing companies in the 1990s. In 1995, General Electric (GE) officially launched its Six Sigma movement. Due to the immediate benefits it brought about, such as quality improvement, cost reduction, and profit enhancement, Six Sigma rapidly expanded across the entire Western manufacturing industry. The primary components of Six Sigma include:
The concept of continuous improvement through reducing scrap rate, costs, and waste.
Establishing an Organization and Team
: The company’s top management and administrative team must shoulder the responsibility of leadership and implementation, while also selecting and recruiting technical teams. An improvement project team, the administrative team, and the technical team collaboratively select projects for the project team.
Each project must establish specific goals and timelines, such as “reducing the paint shop scrap rate by 50% within three months.” Each project must also follow the Six Sigma DMAIC (Define-Measure-Analyze-Improve-Control) process, a data-driven elaboration of PDCA. Upon completion, the project’s cost and return are reviewed by the finance department.
Both the administrative and technical teams should receive bespoke training. The executive team learns the concepts of TQM and Lean, while the technical team should grasp more comprehensive quality methods and lean techniques.
It is noteworthy that Six Sigma’s initial goal in the early 1990s was to minimize the “cost of poor quality” through an extremely low scrap rate, signifying 3.4 defects per million (3.4 ppm).

From the 1990s to the 2000s, several influential books on the Toyota production system and lean manufacturing were published in the West, including “The Machine that Changed the World,” “Lean Thinking,” and “The Toyota Way” [33–35]. The Toyota system gained popularity due to its clear concepts, limited mathematical content, wide applicability, and immediate impact on efficiency improvement and waste reduction. Given that Lean and Six Sigma address different problems (efficiency and quality, respectively), they complement each other, and incorporating Lean into an already established Six Sigma team enhances its effectiveness. Thus, Lean Six Sigma has become a standard business improvement operating system. After manufacturing, Lean Six Sigma quickly spread to other industries, including the service, medical, financial, and government sectors.