Basic Statistical Tools for Improving Quality

Chang Wok Kang

Description

This book is an introductory book on improving the quality of a process or a system, primarily through the technique of statistical process control (SPC). There are numerous technical manuals available for SPC, but this book differs in two ways: (1) the basic tools of SPC are introduced in a no-nonsense, simple, non-math manner, and (2) the methods can be learned and practiced in an uncomplicated fashion using free software (eZ SPC 2.0), which is available to all readers online as a downloadable product. The book explains QC7 Tools, control charts, and statistical analysis including basic design of experiments. Theoretical explanations of the analytical methods are avoided; instead, results are interpreted through the use of the software.




Contents

Cover

Half Title page

Title page

Copyright page

Dedication

Preface

Chapter 1: The Importance of Quality Improvement

1.1 Introduction

1.2 What Is Statistical Process Control?

1.3 The Birth of Quality Control

1.4 What Is a Process?

1.5 Examples of Processes from Daily Life

1.6 Implementing the Tools and Techniques

1.7 Continuous Process Improvement

1.8 The Goal of Statistical Process Control

1.9 The Eight Dimensions of Quality for Manufacturing & Service

1.10 The Cost of (Poor) Quality

1.11 What Did We Learn?

1.12 Test Your Knowledge

Chapter 2: Graphical Display of Data

2.1 Introduction to eZ SPC

2.2 Qualitative and Quantitative Data

2.3 Bar Chart

2.4 Pie Chart

2.5 Pareto Chart

2.6 Radar Chart

2.7 Histogram

2.8 Box Plot

2.9 Scatter Plot

2.10 Cause and Effect Diagram

2.11 What Did We Learn?

2.12 Test Your Knowledge

Exercises

Chapter 3: Summarizing Data

3.1 Central Tendency

3.2 Variability

3.3 Statistical Distributions

3.4 Distributions in eZ SPC

3.5 What Did We Learn?

3.6 Test Your Knowledge

Exercises

Chapter 4: Analyzing Data

4.1 Confidence Intervals

4.2 Test of Hypothesis

4.3 The p-value

4.4 Probability Plots

4.5 What Did We Learn?

4.6 Test Your Knowledge

Exercises

Chapter 5: Shewhart Control Charts

5.1 The Concept of a Control Chart

5.2 Managing the Process with Control Charts

5.3 Variable Control Charts

5.4 Attribute Control Charts

5.5 Deciding Which Chart to Use

5.6 What Did We Learn?

5.7 Test Your Knowledge

Exercises

Chapter 6: Advanced Control Charts

6.1 CUSUM Control Chart

6.2 EWMA Control Chart

6.3 CV Control Chart

6.4 Nonparametric Control Charts

6.5 Process Capability

6.6 Gage R & R

6.7 What Did We Learn?

6.8 Test Your Knowledge

Exercises

Chapter 7: Process Improvement

7.1 Correlation Analysis

7.2 Regression Analysis

7.3 Experimental Design

7.4 Overview of Experimental Design

7.5 Principles of Experimentation

7.6 One-Way Analysis of Variance

7.7 Two-Way Analysis of Variance

7.8 Two-Level Factorial Design Analysis

7.9 What Did We Learn?

7.10 Test Your Knowledge

Exercises

Chapter 8: End Material

8.1 Final Exam

8.2 Final Exam Solutions

8.3 Test Your Knowledge: Answers

References

Glossary

Index

Basic Statistical Tools for Improving Quality

Copyright © 2011 by John Wiley & Sons, Inc. All rights reserved.

Published by John Wiley & Sons, Inc., Hoboken, New Jersey.

Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permission.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.

Library of Congress Cataloging-in-Publication Data:

Kang, Chang W. (Chang Wok), 1957–, author.
Basic Tools and Techniques for Improving Quality / Chang W. Kang, Paul H. Kvam.
p. cm.
Includes index.
ISBN 978-0-470-88949-7 (pbk.)
1. Process control—Statistical methods. 2. Quality control—Statistical methods. 3. Acceptance sampling. I. Kvam, Paul H., 1962–, author. II. Title.
TS156.8.K353 2011
658.5’62—dc22
2010042346

To Jinok, Philip, John
C.W.K.

To Lori
P.H.K.

PREFACE

Basic Statistical Tools for Improving Quality is intended for business people and non-business people alike who do not possess a great amount of mathematical training or technical knowledge about quality control, statistics, six sigma, or process control theory. Without relying on mathematical theorems, you will learn helpful quality management techniques through common-sense explanations, graphical illustrations and various examples of processes from different industries. Unlike most other books written about quality management, you will be learning by doing. Every time we introduce a new tool for improving quality, we will practice it on a real example. Programming skills are not necessary because we will use a free, menu-based, user-friendly software program that was designed around the basic tools and techniques for improving quality.

If you bought this book, you are probably familiar with some kind of production process. By process, we can mean a traditional manufacturing process such as the one that goes on in a Hyundai Motors automobile assembly plant in Korea, but it can also refer to a supply chain that links a maple tree seedling from an Oregon plant nursery to a Baltimore garden store. In a service industry, a process can be the stages necessary to provide the customer at McDonald’s with an Egg McMuffin.

All organizations have a critical process and strive hard to improve the quality of products and/or services through that process. At the end of each process, there always exists some bit of variation in output. For example, in the chocolate chip cookie manufacturing process, the number of chocolate chips in each cookie varies. This is not an exciting prospect for the customer who opens the cookie box with a clear idea of what they want and expect in a chocolate chip cookie: not too many chips, and certainly not too few. But some variation between cookies is inevitable, and these variations are difficult to see during the baking and packaging process.

In order to improve quality, it makes sense to reduce this item-to-item variability when we can. But to know about the variability in the process, we have to measure things, and then figure out how to assess this variability by analyzing the resulting data. This can get complicated, even if we are baking chocolate chip cookies, where it is impractical to count the chips in every baked cookie. This is where statistical process control takes over.

Statistical process control refers to the application of statistical tools to measure things in the process in order to detect whether a change has taken place. In the examples we have discussed, the process aims for consistency, so any change in the process output is usually a bad thing. This book describes and illustrates statistical tools to detect changes in a process, and it also shows how to implement the idea of continuous improvement into process control.

The tools you can implement are both graphical and analytical. Graphical procedures (charts, diagrams) serve as effective tools to communicate intricate details about a complex or dynamic process without having to go into statistical or mathematical detail. Graphical tools are helpful, but not always convincing. Analytical tools, on the other hand, can be more powerful but are more challenging to learn. While we spare the reader any unnecessary detail about the mathematical machinery of the analytical tools, we certainly do not promote an overly simple “push that button” mentality in this book. That won’t work for the unique problems encountered when applying these tools and techniques to your workplace. The analytical tools demand a deeper level of understanding.

To be an effective process manager, you don’t have to be an expert in statistics. You do need to be knowledgeable about the process you are working on, and you need to be determined enough to learn how these simple tools can be used to understand and control this process. This is the kind of expert that we hope you will become after you have finished reading this book. To avoid unnecessary statistical formulas, we will focus on concepts and show you how to use the free eZ SPC software to understand how data is analyzed. We will learn through examples.

Here are some key skills you should pick up after reading Basic Statistical Tools for Improving Quality:

1. You will perceive and acknowledge that there is always some kind of problem in the process.
2. You will know what kind of problem solving tools or techniques are required to overcome your obstacles.
3. You will know how to collect appropriate data that will make the problem solving as easy as possible.
4. You will know which tools to use and how to use menu commands from the eZ SPC software to complete your data analysis.
5. You will know how to interpret the eZ SPC results to explain how the problem is solved.

Outline of Chapters

Chapter 1: In the first chapter, you will learn how to frame a typical business task or industrial procedure as the kind of process that is appropriate for quality improvement. Examples will be used to illustrate processes we encounter in daily life and we outline how tools and techniques for quality improvement can be learned and exploited.

Chapter 2: Here, we introduce you to eZ SPC, the free and easy-to-use software that will help you analyze and solve quality management problems in this book and at your job. If you are already comfortable programming with other software packages that are capable of running all the necessary statistical process control procedures listed in this book, you can take advantage of your programming expertise without using the eZ SPC program. However, we devote parts of this book to helping the reader complete quality management tasks using the simple menu-based commands of eZ SPC. Chapter 2 shows you how charts and graphs can be created to help summarize process performance and describe how efficiently a process is working. We include bar graphs, box plots, cause-and-effect diagrams, histograms, Pareto charts, pie graphs, radar charts and scatter plots.

Chapter 3: In this chapter, we will go over the key features in a set of data. Knowing which values appear in the data set and how frequently they occur tells us about the data’s distribution. You will learn about statistics that describe the spread of the data as well as where the middle or end-points of the data set are. Using helpful tools in eZ SPC, we will learn how to summarize data and find out how it is distributed without having to use graphs.

Chapter 4: As a companion chapter to Chapter 3, here we will present you with an overview of the analytical tools you will rely on in the analysis of process data. Two key concepts in data analysis are confidence intervals and statistical tests of hypotheses. These procedures not only provide a summary and prediction of the process from the features of the data, but they also provide a measure of our uncertainty regarding those predictions. The ability to communicate the uncertainty in statistical results will make our results more objective, informative and scientific.

Chapter 5: Here we introduce you to a primary method for evaluating and monitoring a process: the control chart. Control charts are simple and varied graphical tools that can be easily generated using the eZ SPC software, and the generated chart helps us to determine if the process is running correctly. The various kinds of charts represent the various goals of the process manager as well as the different kinds of measurements from that process that we can scrape together for data analysis.

Chapter 6: The control charts in Chapter 5 are constructed to detect the first moment when a process breaks down and goes out of control. In this chapter, we move from those basic control charts to more advanced charts that are constructed to detect the incremental changes made by a process that is slowly degrading or getting out of control. Also in this chapter, you will learn how charts and statistics can be applied to assess process capability once the control charts have shown that the process is running as planned.

Chapter 7: Here we will show how you can create process improvement by investigating potential factors (in the environment or in the process itself) that affect the process and/or the process output. Some of these factors are readily identifiable, but there might be other causes and factors that are not obvious. Moreover, these new factors we consider not only affect the process output, but they may affect each other. That can really challenge our ability to understand the process the way we want. To overcome this challenge, statistical methods such as correlation analysis, regression, analysis of variance and factorial design can be implemented using eZ SPC.

At the end of the book, we include a glossary of terms that are used in the book, and we also include some terms not found elsewhere in the book. These terms generally fall outside the scope of statistical process control, but you might come across them in your work on process improvement.

Examples, Exercises and Test Questions

Examples are featured throughout the book and easy to find. Just look for a gray box followed by the example text. Here is an example from the first chapter:

Example 1–8: General Hospital Emergency Center

Hospitals must always be at the forefront of process improvement. Patient processing at an emergency reception has evolved continuously over…. The example always ends with three dots:

• • •

Exercises are made to challenge the reader to think about a problem, and they might require the reader to use eZ SPC for solving it. Because this is a self-teaching textbook, we try to guide you through the steps of the exercises, and discuss the outcomes you should be able to achieve.

At the end of each chapter, we include a section titled Test Your Knowledge that contains 10 quiz questions. These will allow you to quickly evaluate your learning as you go along, and generally don’t take as long to finish as the exercises. Solutions to the quiz questions are found in the last section, titled End Matter. In that section, we have also constructed a Final Exam consisting of 100 multiple choice questions that cover all the chapters of the book.

By the time you finish the book and the exercises, you can be confident that you have mastered the basics of process control and improvement. Then, you will be ready to take what you have learned and practice it on all of the processes that inspired you to get this book in the first place.

Why We Wrote This Book

Before this book was imagined, one of the authors (Dr. Kang) developed and improved software tools for use in quality management training workshops. Participants who attended the workshops included mid-level managers, assembly line foremen, and a few people in upper management. The audience was attending the seminar for one reason: to obtain the minimum hours of training they needed to help them achieve ISO 9000 accreditation and certification (ISO stands for International Organization for Standardization). As a result, participants were usually satisfied to passively sit through workshop seminars, which (let’s face it) can tend to be dull. The initial results were not positive.

To improve learning in later workshops, class participation was mandated through the use of new software tools made available to the class. Along with lessons in quality management, participants made graphs and analyzed process data with the click of a button. As a result, the audience became more involved in learning how statistical process control tools could be used in quality management without having to learn about complicated statistical formulas or detailed computer algorithms. It was all available to them with the push of a button, and participants were excited to use these newly learned techniques on quality management problems.

The software was further refined and improved. We listened to what process managers told us about which tools and techniques were most helpful to them in their day-to-day work. It was easy to gather the most important ones to put in this book, but it was not so easy to decide what to leave out. For example, multivariate control charts are an important foundation in advanced process control monitoring, but they seemed unnecessary for learning the basics of process control. So we grudgingly left them out, along with some other highly advanced topics in statistical process control, so we could focus on the important stuff.

Free Software

With the software in mind, we will show the reader how to make statistical graphs and analytical tables for quality management problems. With every new lesson, there will be a computer tool to help you solve problems. Exercises and quiz questions will also include a dose of pain-free data analysis using the provided software. The software program eZ SPC is all you need, and you can download it for free at

http://www.hanyang.ac.kr/english/ezspc.html

Along with the program, you should download the Applications Folder, which contains all of the data sets introduced throughout the book. These data sets are used both in examples in the textbook and quiz questions at the end of every chapter. Of course, you can also use eZ SPC for your own statistical analysis and process control once you are finished reading this book. The spreadsheet structure of eZ SPC allows an easy transfer of data from text files and Microsoft Excel© files.

Acknowledgments

Along with our families, we are grateful to our students, colleagues and the editors. From Wiley, we thank Jacqueline Palmieri, Melissa Yanuzzi, Amy Hendrickson and Steve Quigley.

We thank Dr. Bae-Jin Lee and Dr. Sung-Bo Sim for their contribution to the development of the eZ SPC software, and Dr. Jae-Won Baik, Jong-Min Oh, and Eui-Pyo Hong for their contributions to the development of eZ SPC 2.0. We would also like to thank Dr. Hae-Woon Kang for his special contribution to developing and improving the eZ SPC 2.0 English version for this book. We also thank current and former graduate students of the Statistical Engineering for Advanced Quality Laboratory, Hanyang University, who played supporting roles as we improved the eZ SPC software and prepared this book. We would also like to express our appreciation to Dr. Kyung Jong Lee and the many students who have used the eZ SPC 2.0 Korean version and made useful suggestions for improving the software.

We have benefited from Hanyang University colleagues who encouraged us as we worked together at Georgia Tech. Thanks go to colleagues at Georgia Tech who made this cooperation possible, including Chip White, Lorraine Shaw and Michael Thelen.

CHANG W. KANG
Department of Industrial & Management Engineering
Hanyang University
South Korea

 

PAUL H. KVAM
H. Milton Stewart School of Industrial and Systems Engineering
Georgia Institute of Technology
United States

CHAPTER 1

THE IMPORTANCE OF QUALITY IMPROVEMENT

1.1 INTRODUCTION

Think about how rapidly things change in today’s world. As the years have gone by, many of our daily chores and activities have increased dramatically in speed. With on-line bill paying, internet commerce and social networking available on mobile devices like smartphones, in one hour we can finish a set of tasks that once took a full day to accomplish.

The business world changes rapidly, too. Once a company has brought a new product to market, it probably has only a short time to celebrate this accomplishment before competitors introduce an improved version or a new and unexpected alternative that customers may soon prefer. As technology changes, customers change, too. It’s easier for consumers to collect information about new products so they can make educated choices. Today’s consumers have higher expectations and less tolerance for products that don’t keep up.

In this way, the business world emulates our daily life. Instead of using all of our saved time to enjoy the additional hours of leisure that new technologies have afforded us, we immediately reinvest our time in new tasks that help us produce even more so we can keep up with our colleagues, adversaries and neighbors. And it’s even tougher for businesses these days, because globalization of markets has brought together competitors from every part of the world, not just from across town.

As the world melds into one enormous trading village, increased competition demands faster increases in product output and quality. More than ever, weaker companies will be driven out of the market without protection of domestic loyalties or the fleeting fortunes derived from the economic benefits of assumed scarcity. Consider the impact of Japanese car manufacturing on Ford and General Motors in the 1970s and 1980s. When automobile purchasers in the United States were more limited in their choice of new automobiles, the car producers took advantage of the scarcity of customer resources and did not aggressively pursue quality improvements until they lost the scarcity advantage to Japanese companies such as Toyota, Honda and Mitsubishi.

Starting in the 1970s, the confidence and reputation enjoyed by General Motors and other domestic auto makers slowly eroded up through the 1990s, when customers in the United States tacitly assumed a new automobile from Toyota, Nissan or Honda would deliver higher quality at a better price than the domestic alternatives. In the coming years, auto makers with a reputation for high quality products will have to work harder to retain their customers’ confidence, because competitors will catch up far more quickly than American automobile manufacturers reacted to the flourishing Japanese automobile industry.

In 1990, Japan became the world’s top producer of automobiles. In 2008, Toyota surpassed General Motors as the largest auto maker in the world. Late in 2009, however, Toyota started to face millions of potential car recalls due to problems with acceleration. All of a sudden, Toyota’s reputation for long-term quality was being severely tested. Customers are more knowledgeable, so manufacturers and service providers that fail to adapt to meet the needs, expectations, and requirements of customers will lose market share quickly. By early 2010, for example, Toyota had lost a third of its U.S. market share in less than a month after the car recalls were announced. Other Japanese auto makers have experienced reduced market share as well, in part due to recent competition from Korea.

With their increased access to knowledge, consumers have demanded higher quality products and more responsive services. With the ruthlessness of global competition, companies are frantically adopting various strategies for improving quality and reducing cost. These strategies include statistical quality control, total quality management, Six Sigma and Lean Six Sigma. Statistical quality control is the chief way many manufacturers realize quality improvement, and this includes sampling plans, experimental design, variation reduction, process capability analysis, process improvement plans, reliability analysis and statistical process control.

Of all these various components, statistical process control (SPC) is the focal point of quality improvement. The techniques we learn in this book will make up the toolbox you need to analyze and explain problems with process management. This requires some statistical thinking, in which we frame our problems in terms of processes, consider how these processes are interconnected, and recognize how reducing the variation in these processes will be a key to our success as quality managers. Once these problems are addressed, we can focus on process improvement. In the next section, we explain just what process control means in business and manufacturing, and how it differs from other quality control management tools.

1.2 WHAT IS STATISTICAL PROCESS CONTROL?

In general terms, statistical process control can be defined as a method which monitors, controls, and ideally improves a process through statistical analysis. SPC includes measuring the process, reducing variability in the process to make it produce a consistent output, monitoring the process, and improving the process in terms of a target value of the output. As the process is monitored, diagnostic techniques are applied to identify different causes of variation in the process, eliminate the assignable causes of variation, and continuously improve the process in order to reduce variability or production cost. An assignable cause might be some unnatural or unwanted pattern observed in the process. This is in contrast to a common cause, which is more likely treated as random, unavoidable noise or natural variability in the process. This chance-cause variation can be as innocuous as a minor measurement error.

Statistical process control uses statistics to analyze the variation in processes. This represents the more technical side of quality improvement, and requires the most effort to understand. In this book, we guide the reader around (and away from) most of the technical details required in the statistical analysis of process data, but we emphasize the importance of the statistical issues and how statistical procedures are used to solve quality management problems. Using statistics, SPC tries to keep the process average at a target value and to reduce the unpredictability of the process. These techniques can be applied to a wide variety of industries, to maintain the consistency of a manufacturing process, a service process, or any of the unique processes in today’s business world.
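The book itself carries out this kind of analysis with the menu-driven eZ SPC program rather than with code, but for readers who like to see the idea in a script, the following minimal Python sketch (with purely illustrative numbers, not data from the book) estimates the common-cause variation of a process from an in-control baseline and then flags new measurements that fall outside 3-sigma limits as possible assignable causes.

```python
# Minimal illustration of the SPC idea described above: characterize the
# natural (common-cause) variation of a process, then flag observations
# that fall outside that band as possible assignable causes.
# All numbers are illustrative; the book uses the eZ SPC software instead of code.
import random
import statistics

random.seed(1)

# Phase 1: an in-control history used to estimate common-cause variation
# (imagine 100 fill-volume measurements, in ml, from a stable process).
baseline = [random.gauss(500.0, 1.5) for _ in range(100)]
center = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
lcl, ucl = center - 3 * sigma, center + 3 * sigma  # 3-sigma control limits

# Phase 2: new production with a deliberate upward shift after item 15,
# standing in for an assignable cause such as a miscalibrated filler.
new_items = [random.gauss(500.0, 1.5) for _ in range(15)] + \
            [random.gauss(504.0, 1.5) for _ in range(10)]

for i, x in enumerate(new_items, start=1):
    status = "out of control" if (x < lcl or x > ucl) else "in control"
    print(f"item {i:2d}: {x:6.1f} ml  ->  {status}")
```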

1.3 THE BIRTH OF QUALITY CONTROL

Quality control was born around Chicago back in 1924, at the same time that prohibition helped make the city notorious for bootlegging and gangsters. The most notorious gangster of this time was Al Capone. By 1920, Capone was just making a name for himself in the Chicago crime syndicate, but in a few years, he was well on his way to becoming America’s most famous gangster of all time. At the height of his career, Capone’s crime group ran an impressive bootlegging operation and controlled speakeasies, gambling houses, brothels, race tracks, and nightclubs that brought in income of nearly 100 million dollars per year. When the operation moved to the Chicago suburb of Cicero in early 1924, the city of Cicero became synonymous with mobster life and crooked politics.

Figure 1.1 Industry innovators of Chicago in the 1920s: (a) Al Capone (1899–1947) and (b) Walter Shewhart (1891–1967).

But Capone didn’t have the biggest operation in town. The Western Electric Company, also based in Cicero, was producing 90% of the nation’s telephone equipment, and brought in over 300 million dollars per year. The company’s Hawthorne Works plant had become a world famous research center, developing the high-vacuum tube, the condenser microphone, and radio systems for airplanes. One 1923 study that originally set out to show how better lighting increased worker productivity later became famous for its discovery of a short-term improvement caused by observing worker performance, which psychologists refer to as the “Hawthorne effect”.

Less famously at the time, Western Electric pioneered the development of inspection policies to assure specification and quality standards in manufactured products. When an engineer and statistician named Walter Shewhart joined the Inspection Engineering Department at Hawthorne in 1918, industrial quality was limited to merely inspecting products and removing defective ones. Shewhart changed all that on May 16, 1924, when he presented his boss, George Edwards, with a memo that would soon change the way we perceived quality control in manufacturing. Edwards remarked:

“Dr. Shewhart prepared a little memorandum only about a page in length. About a third of that page was given over to a simple diagram which we would all recognize today as a schematic control chart. That diagram, and the short text which preceded and followed it, set forth all of the essential principles and considerations which are involved in what we know today as process quality control.”

Shewhart’s ensuing research built on this genesis of statistical process control, with many of his early results published in his 1931 book, Economic Control of Quality of Manufactured Product.

It would be exciting to think that Walter Shewhart and Al Capone were somehow intertwined in 1924 Cicero, but this is highly unlikely. It seems more probable that Chicago’s bootlegging operation lacked statistical process control of any kind. The closest tie we can make is from what we know of Shewhart’s protégé, Joseph Juran, who occasionally visited a couple of Capone’s casinos across the street from the Hawthorne Works. After spending some time in Capone’s establishment after hours, Juran noticed that one roulette wheel operator worked “like a robot”, making the operation of his wheel amenable to statistical analysis and prediction. His expertise enabled him to win one hundred dollars, which at the time was several weeks’ pay.

Shewhart deduced that while variation exists in every type of process, some variation could be blamed on an assignable cause (so it is not completely random) but other sources of variation could not. The key was to focus on the factors that are associated with assignable cause variation in order to remove as much of that variation as possible. Inherent or natural variation, on the other hand, has only chance cause (or common cause) and can only be tolerated.

Along with Juran and his other famous protégé, W. Edwards Deming, Shewhart revolutionized the science of manufacturing processes. Their research was eventually disseminated throughout the world, but ironically did not catch on quickly at Western Electric. In 1925, Western Electric joined with AT&T, the co-owner of Bell Labs, and much of its research was consolidated and moved to Bell Labs research centers in New Jersey. The Hawthorne Plant was greatly downsized by the Great Depression, and Juran noted that by the time he left in 1941, “you could walk through this plant, the seed bed of the quality revolution, without seeing any control charts”.

Deming is famous for teaching American manufacturers about quality and management, but his work was first adopted by the Japanese Union of Scientists and Engineers, whose members had already studied Shewhart’s techniques. Starting in the 1950s, Deming trained a generation of Japanese scientists, engineers and managers in quality improvement. In the 1980s, many believed this training was one factor that helped the Japanese automobile industry leap-frog over its overseas competitors and allowed Japanese electronics manufacturers to gain a large market niche in world trade. Later, American manufacturers such as Ford Motor Company also recognized the value of Deming’s techniques, although Ford waited until 1981 to hire him for quality consultation.

Figure 1.2 A timeline of the history of statistical process control in the 20th Century

1.4 WHAT IS A PROCESS?

A process is defined as a series of operations performed in the making of a product. More simply, a process is the transformation of input variables to a final result that might be a product or service. We will often refer to the process result as a product or an output (see Figure 1.3). In a manufacturing process, the input variables can include raw materials, machines or equipment, operators and operator actions, environmental conditions, solicited information, and working methods. Some factors are clearly controllable, such as an operator’s action in the process. Uncontrollable factors include inputs that clearly affect the output but are not under our control, such as the effect of weather on a house construction process.

Figure 1.3 Simple diagram of a process affected by controllable factors and uncontrollable factors

The variation in the inputs will propagate through the process and result in variability in the output. In general, process managers will want a predictable process that produces identical items consistently, so any variability in the output is bad news. We know that perfect consistency in the output is an unobtainable ideal, so we have to deal with the output variability by understanding it, characterizing it, and ultimately reducing it. The output variability is created by assignable causes or chance causes or both. The variability created by assignable causes is usually unpredictable, but it is explainable after it has been observed. Some examples of assignable causes are a machine malfunction, a low-quality batch of raw materials, a new operator, and so on. Chance causes produce random variation in the behavior of output measurements. This random variation created by chance causes is consistent and predictable, but it is unexplainable. Some examples of chance causes are vibration in the processes, poor design, noise, computer response time, and so on. Assignable causes contribute significantly to the total output variability.

Due to assignable and chance causes, output measurements are often distributed like the bell-shaped curve in Figure 1.4, which is characterized statistically using a Normal distribution (see Statistical Distributions in Chapter 3). In Figure 1.4, the target value is a well known ideal value, and the LSL (lower specification limit) and USL (upper specification limit) represent what outputs can be tolerated in the market. The target value, LSL, and USL are set by product standards or by the engineers. The LSL and USL are used to determine whether the products are acceptable. For example, the producer of bottled water can set the target value for the content of a bottle to be 500 ml, and he can also specify that any quantity between 495 ml and 505 ml is acceptable. Here, the target value is 500 ml, the LSL is 495 ml, and the USL is 505 ml. Even though the output variability is understood to exist, process engineers must try to produce each product to be between the LSL and USL.

Figure 1.4 Distribution of output measurements for a process
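To make the bottled-water example concrete, here is a short Python sketch using the target and specification limits quoted above (500, 495, and 505 ml) together with an assumed process standard deviation of 2 ml, a figure chosen only for illustration. It computes the fraction of bottles expected to fall within the specification limits when the fill volume follows a Normal distribution, and shows how that fraction drops when an assignable cause shifts the mean off target.

```python
# A sketch of the LSL/USL idea, assuming the bottled-water numbers above
# (target 500 ml, LSL 495 ml, USL 505 ml) and an illustrative process
# standard deviation of 2 ml.
from statistics import NormalDist

target, lsl, usl = 500.0, 495.0, 505.0  # ml, from the example in the text
sigma = 2.0                              # ml, assumed only for illustration

fill = NormalDist(mu=target, sigma=sigma)
within_spec = fill.cdf(usl) - fill.cdf(lsl)
print(f"Fraction within spec, process on target: {within_spec:.4f}")  # about 0.988

# An assignable cause that shifts the mean to 502 ml lowers the conforming
# fraction even though the item-to-item variability is unchanged.
shifted = NormalDist(mu=502.0, sigma=sigma)
print(f"Fraction within spec after a 2 ml shift: {shifted.cdf(usl) - shifted.cdf(lsl):.4f}")
```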

1.5 EXAMPLES OF PROCESSES FROM DAILY LIFE

In this book, we consider a large variety of process examples that are subject to statistical process control. This will provide readers with a wide array of applications and allow each reader to identify with some real-life applications when learning how to improve process quality. For example, one reader might closely relate their own work experiences with examples in semiconductor manufacturing, while another reader associates more with the process of shipping manufactured goods from China to a series of distribution centers in the United States.

Example 1–1: Frying an Egg

Suppose that we are at home, frying an egg for breakfast. In this process, the input variables are the cook, a pan, a gas or electric range, an egg, cooking oil, salt, an egg turner, and the recipe that provides you with instructions on how to fry an egg. There are many factors involved in the process, such as cooking temperature, cooking time, amount of cooking oil in the pan, amount of salt, and so on. Among them, we can control the cooking time, the amount of cooking oil, and the amount of salt. These are controllable factors. If we want to make a consistent output in our fried egg process, we need to be consistent with the levels of the controllable input factors. However, the cooking temperature and uniformity of the input egg can’t be controlled exactly. In this process, they are uncontrollable factors. Due to the uncontrollable factors, we experience slightly different results every time we fry an egg. This represents the variation in the egg-frying process.

• • •

Example 1–2: Personnel Hiring

Suppose a computer software firm wants to hire an electronics engineer for software and hardware design. There are numerous steps in the hiring process, and uncontrollable factors that affect the process goal include the quality and quantity of the candidate pool as well as the demand from competing companies. Like many service processes, in this example we keep track of a person throughout the process, and give less thought to changeable inputs of machine parameters and raw materials. The process might be condensed into eight steps:

1. Advertise for the position, possibly including proactive calls to potential candidates who are working for other companies. The resources used in this step represent a controllable factor, given that some firms spend much more money on advertising than others.
2. Form hiring group to judge resumes and select appropriate candidates.
3. After discussion and debate, a candidate is selected and then called by the department head and the human resources (HR) group. If turned down right away, the process goes back to step 2.
4. Negotiate with candidate to select date for interview, and set up travel and interview sessions.
5. Interview candidate. The hiring company has several controllable factors in the interview process, from the scheduling to the quality of the transportation, lodging and restaurants used during the interview.
6. Hiring group discusses candidate and decides whether or not to tender an offer (if not, it’s back to step 2). The offer will be formalized by the department head and the HR group.
7. The department head and HR group will negotiate with the candidate, if necessary, to entice the candidate to accept the offer (or it’s back to 2 again). If the company is flexible, controllable factors can be used to optimize the contract offer. The prejudices, needs and preferences of the candidate are uncontrollable factors, and they play a crucial role in whether the offer is accepted.
8. The candidate signs an employment contract with the company. This is a less mechanized process, and it depends on subjectivity, group consensus and possibly unknowable hiring resources. Although there are numerous controllable factors involved in such a hiring process, factor levels may be ambiguous, especially to someone not on the hiring committee.

• • •

Example 1–3: Renting a Car

Suppose that we are picking up a rental car at the airport. The process includes the following nine steps:

1. Fill out the contract document at the agency desk.
2. Select a car from available car list.
3. Decide whether to purchase the insurance plan.
4. Select option items such as all-wheel drive, a GPS navigator or a child safety seat.
5. Register a credit card with the rental agent.
6. Receive the car keys and the signed rental agreement.
7. Go to the car pick-up area (by walking or using shuttle bus).
8. Inspect the car for scratches and dents.
9. Start engine and drive the car out of the airport vicinity.

In this case the inputs associated with service labor are somewhat clear. What might be hidden, however, are the numerous sub-processes that must work in order to guarantee the right car is available to the customer at the right time. Along with the quality of the rental car, the process output might be characterized by service quality, such as the time from when the customer enters the queue at the rental agency desk to when the customer starts the rental car. The controllable factors include rental car selection, the number of rental agents working at the service counter, and the training, competence and courteousness exuded by the agents. Uncontrollable factors can include unforeseen customer demand affected by airline delays, delays caused by difficult customers, and the effects of nearby car rental agency competitors on customer demand.

• • •

Example 1–4: Filling a Drug Prescription

Suppose that a pharmacist takes a drug prescription from a customer and fills the order for the customer. We consider the following nine steps in the process:

1. Pharmacist receives doctor’s prescription for medicine at front counter.
2. Pharmacist verifies customer prescription.
3. Pharmacist verifies insurance information.
4. Pharmacist logs prescription into queue for retrieval.
5. Pills are obtained from shelves and counted to match customer order.
6. Pills are inserted into pill bottle, which is labelled according to the prescription.
7. Filled prescription is taken to check out to be picked up by customer.
8. Upon arrival at check out, customer is offered advisement on instructions or side effects with regard to the prescription.
9. Customer purchases drugs.

If the time needed to fill the prescription is the main process output, then the customer’s actions might represent the most critical uncontrollable factor, since they might ask for the prescription right away. Controllable factors that hinder the pharmacist’s speed can be an outdated cash register, an inefficient referral system, or inadequate inventory for popular medicines.

• • •

Example 1–5: Juice Manufacturing

Some farms, groves and vineyards have resources to harvest, process and package food products without the help of middlemen or large agri-businesses. For a berry grower, there are harvesting, pressing and bottling machines needed to make the process efficient, but given such machines are available, we might consider machine efficiency and reliability as uncontrollable factors. The process of turning berries into packaged juice products can be summarized into six basic steps:

1. The harvest and sorting of berries. There are numerous uncontrollable factors relating to weather, climate and soil quality, and controllable factors might include allotted growing time and number of field workers hired to help harvest the fruit.
2. Wash berries at juicing facility. Uncontrollable factors: amount of dirt clods, weeds and stones to remove when sorting berries. Controllable factor: amount of water and human labor used in washing berries.
3. Juice berries to puree via cold pressing through screening that removes seeds and stems. The screen filtering is a controllable factor.
4. Samples are examined to measure nutrients and detect contaminants such as pesticides. Factors are controlled through setting measurement tolerances, but potential measurement error might serve as an uncontrollable factor.
5. Blend juices and combine needed additives (factors same as step 4).
6. Bottling juices can include capping, labeling and boxing.

• • •

Example 1–6: Abused or Neglected Children

Not all industry processes are about making things or selling things to customers. The Division of Family and Children Services (DFCS) in the state of Georgia is the part of the Department of Human Resources (DHR) that investigates child abuse. DFCS finds foster homes for abused and neglected children and, at the same time, helps their impoverished parents get back on their feet with counselling and job training. Like many other service providers, the inputs (children, families) provide a lot of uncontrollable variation to the process.

1. DFCS is alerted (perhaps by a nurse or school teacher) about a case of child abuse or neglect and writes up a report. The reporting network is an important controllable factor that can vary greatly between different communities.
2. The report is screened for potential investigation. If the case meets the criteria for abuse or neglect, DFCS sends investigators to the child’s home to check on the health of the child. They inspect living conditions, talk with the parents and interview people involved with this case.
3. A decision is made about the case of abuse or neglect, and a wide range of outcomes are possible, from a formal warning for the parents to obtaining a court order to remove the child from the home. In some cases, the police will be involved, representing another potential source of uncontrollable process variation.
4. In cases where abuse charges were substantiated but the child is not in imminent danger and remains in the home, case managers visit the family regularly and provide services that might include counselling, drug abuse treatment, referrals for employment or child care.
5. If the child has to be removed from the home and not placed into the care of a family relative, a court order might be used to terminate parental rights, at which time the child is sent to foster care and officially becomes a ward of the state.

This is an intricate process with numerous uncontrollable factors related to the abuse, the physical and mental state of the parents and the family’s resources. In both steps 3 and 4, the case manager decisions are based on a finite but complex set of controllable factors requiring the expert decision making of a trained social worker.

• • •

Example 1–7: Amazon.com

Amazon.com is a premier on-line sales site for merchandise such as books, DVDs and music CDs. One crucial part of Amazon’s success is maintaining a fast, simple and easy process for the customer to select and purchase merchandise. To ensure the process runs smoothly, Amazon uses six basic steps to get the customer to complete a purchase on line:

1. After the customer clicks on a product to learn of product details, a bright yellow button that reads [Add to Cart] is displayed prominently next to a picture of the product.
2. Once the [Add to Cart] button is pressed, Amazon leaves several opportunities to shop some more (and includes suggested items based on what is already in the shopping cart) but the bright yellow [Proceed to Checkout] button is the most conspicuous one on the page.
3. If the customer has not yet signed in to their personal account (or signed up for one), they are required to do so before proceeding to the next stage of the process. For a computer user who allows Amazon’s cookies, this can be as simple as typing in a password and clicking [Continue].
4. The customer selects a mailing address or types in an address if this is the first order. By storing addresses along with other customer information for use in expediting the order, Amazon can take advantage of the customer’s potential impulse buying habits.
5. One click will determine the shipping method from a short list of options, and the buttons that encourage the customer to continue with the order are always bright yellow.
6.