
UNIT II

PRINCIPLES AND PHILOSOPHIES OF QUALITY MANAGEMENT

INTRODUCTION

Different schools of thought on management dominate the minds of industrialists and practitioners. The Western school of thought accelerates the decision-making process, but the implementation stages are comparatively slow. The Japanese school of management, on the other hand, spends far more time arriving at a particular decision, and because of this the implementation is faster.
Many quality management principles and philosophies have emerged
from Japanese soil. The notable contributions and the profile of the
contributors are presented in this part. This unit deals with overviews of the
contributions of Walter Shewhart, Deming, Juran, Crosby, Masaaki Imai,
Feigenbaum, Ishikawa, Taguchi and Shingeo, along with the concept of quality
circles, the Japanese 5S principles, and the 8D methodology.

LEARNING OBJECTIVES

Upon completion of this unit, you will be able to:

Have an understanding of Western and Japanese thinking on quality
Appreciate the evolution of the various dominant techniques in quality
Understand the process by which these techniques evolved
Know the contributors' profiles and their particular contributions

2.1 OVERVIEW OF THE CONTRIBUTIONS OF WALTER SHEWHART

Walter A. Shewhart

Walter Andrew Shewhart (pronounced like "Shoe-heart", March 18, 1891 -


March 11, 1967) was an American physicist, engineer and statistician, sometimes
known as the father of statistical quality control.

W. Edwards Deming said of him:

As a statistician, he was, like so many of the rest of us, self-taught, on a good


background of physics and mathematics.

Born in New Canton, Illinois, to Anton and Esta Barney Shewhart, he attended the University of Illinois before being awarded his doctorate in physics from the University of California, Berkeley in 1917.

Work on industrial quality

Bell Telephone's engineers had been working to improve the reliability of


their transmission systems. Because amplifiers and other equipment had to be
buried underground, there was a business need to reduce the frequency of
failures and repairs. When Dr. Shewhart joined the Western Electric Company
Inspection Engineering Department at the Hawthorne Works in 1918, industrial
quality was limited to inspecting finished products and removing defective
items. That all changed on May 16, 1924. Dr. Shewhart's boss, George D.
Edwards, recalled: "Dr. Shewhart prepared a little memorandum only about a
page in length. About a third of that page was given over to a simple diagram
which we would all recognize today as a schematic control chart. That diagram,
and the short text which preceded and followed it, set forth all of the essential
principles and considerations which are involved in what we know today as
process quality control." Shewhart's work pointed out the importance of
reducing variation in a manufacturing process and the understanding that
continual process-adjustment in reaction to non-conformance actually increased
variation and degraded quality.

Shewhart framed the problem in terms of assignable-cause and chance-


cause variation and introduced the control chart as a tool for distinguishing
between the two. Shewhart stressed that bringing a production process into a
state of statistical control, where there is only chance-cause variation, and
keeping it in control, is necessary to predict future output and to manage a
process economically. Dr. Shewhart created the basis for the control chart and the
concept of a state of statistical control by carefully designed experiments. While
Dr. Shewhart drew from pure mathematical statistical theories, he understood
data from physical processes never produce a "normal distribution curve" (a
Gaussian distribution, also commonly referred to as a "bell curve"). He
discovered that observed variation in manufacturing data did not always behave
the same way as data in nature (Brownian motion of particles). Dr. Shewhart
concluded that while every process displays variation, some processes display
controlled variation that is natural to the process, while others display
uncontrolled variation that is not present in the process causal system at all
times.
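
Shewhart's distinction between chance-cause and assignable-cause variation is usually put into practice through control limits. The short sketch below is a minimal illustration, not Shewhart's original derivation: the subgroup data are invented, and A2, D3 and D4 are the standard tabulated control chart constants for subgroups of five measurements.

# Minimal X-bar / R control chart sketch for subgroups of size 5.
# The measurement data are invented; A2, D3 and D4 are the standard
# tabulated control chart constants for a subgroup size of n = 5.

subgroups = [
    [10.2, 9.9, 10.1, 10.0, 10.3],
    [10.0, 10.4, 9.8, 10.1, 10.2],
    [9.7, 10.1, 10.0, 10.2, 9.9],
    [10.3, 10.0, 10.1, 9.8, 10.4],
]
A2, D3, D4 = 0.577, 0.0, 2.114

xbars = [sum(s) / len(s) for s in subgroups]      # subgroup means
ranges = [max(s) - min(s) for s in subgroups]     # subgroup ranges
xbarbar = sum(xbars) / len(xbars)                 # grand mean (centre line)
rbar = sum(ranges) / len(ranges)                  # mean range (centre line)

# Points falling outside these limits suggest assignable-cause variation;
# points inside them are treated as chance-cause (controlled) variation.
print(f"X-bar chart: LCL={xbarbar - A2 * rbar:.3f}, UCL={xbarbar + A2 * rbar:.3f}")
print(f"R chart:     LCL={D3 * rbar:.3f},  UCL={D4 * rbar:.3f}")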

Shewhart worked to advance the thinking at Bell Telephone Laboratories


from their foundation in 1925 until his retirement in 1956, publishing a series of
papers in the Bell System Technical Journal.

His work was summarised in his book Economic Control of Quality of


Manufactured Product (1931). Shewhart's charts were adopted by the American
Society for Testing and Materials (ASTM) in 1933 and advocated to improve
production during World War II in American War Standards Z1.1-1941, Z1.2-
1941 and Z1.3-1942.

Later work

From the late 1930s onwards, Shewhart's interests expanded out from
industrial quality to wider concerns in science and statistical inference. The title
of his second book Statistical Method from the Viewpoint of Quality Control (1939)
asks the audacious question: What can statistical practice, and science in general,
learn from the experience of industrial quality control?

Shewhart's approach to statistics was radically different from that of many


of his contemporaries. He possessed a strong operationalist outlook, largely
absorbed from the writings of pragmatist philosopher C. I. Lewis, and this
influenced his statistical practice. In particular, he had read Lewis's Mind and the
World Order many times. Though he lectured in England in 1932 under the
sponsorship of Karl Pearson (another committed operationalist) his ideas
attracted little enthusiasm within the English statistical tradition. The British
Standards nominally based on his work, in fact, diverge on serious philosophical
and methodological issues from his practice.

His more conventional work led him to formulate the statistical idea of
tolerance intervals and to propose his data presentation rules, which are listed
below:

1. Data has no meaning apart from its context.


2. Data contains both signal and noise. To be able to extract information, one
must separate the signal from the noise within the data.

Walter Shewhart visited India in 1947-48 under the sponsorship of P. C.


Mahalanobis of the Indian Statistical Institute. Shewhart toured the country, held
conferences and stimulated interest in statistical quality control among Indian
industrialists. He died at Troy Hills, New Jersey, in 1967.

Walter Shewhart is remembered for his invention of the statistical control chart and his pioneering of industrial quality control methods.

Pioneer of Modern Quality Control

Recognized the need to separate variation into assignable and unassignable (chance) causes, thereby defining the state of statistical control.
Founder of the control chart (e.g. the X-bar and R chart).
Originator of the plan-do-check-act cycle.
Perhaps the first to successfully integrate statistics, engineering and economics.
Defined quality in terms of objective and subjective quality:
objective quality: the quality of a thing independent of people;
subjective quality: quality relative to how people perceive it (value).

Influence

In 1938 his work came to the attention of physicists W. Edwards Deming


and Raymond T. Birge. The two had been deeply intrigued by the issue of
measurement error in science and had published a landmark paper in Reviews of
Modern Physics in 1934. On reading of Shewhart's insights, they wrote to the
journal to wholly recast their approach in the terms that Shewhart advocated.

The encounter began a long collaboration between Shewhart and Deming


that involved work on productivity during World War II and Deming's
championing of Shewhart's ideas in Japan from 1950 onwards. Deming
developed some of Shewhart's methodological proposals around scientific
inference and named his synthesis the Shewhart cycle.

Achievements and honours

In his obituary for the American Statistical Association, Deming wrote of


Shewhart:

As a man, he was gentle, genteel, never ruffled, never off his dignity. He knew
disappointment and frustration, through failure of many writers in mathematical
statistics to understand his point of view.

He was founding editor of the Wiley Series in Mathematical Statistics, a role


that he maintained for twenty years, always championing freedom of speech and remaining willing to
confident to publish views at variance with his own.

His honours included:

Founding member, fellow and president of the Institute of Mathematical


Statistics;
Founding member, first honorary member and first Shewhart Medalist of
the American Society for Quality Control;
Fellow and president of the American Statistical Association;
Fellow of the International Statistical Institute;
Honorary fellow of the Royal Statistical Society;
Holley medal of the American Society of Mechanical Engineers;
Honorary Doctor of Science, Indian Statistical Institute, Calcutta.
Shewhart based his approach on three postulates:

1. All chance systems of causes are not alike in the sense that they enable us to predict the future in terms of the past.
2. Constant systems of chance causes do exist in nature.
3. Assignable causes of variation may be found and eliminated.

Based upon evidence such as already presented, it appears feasible to set


up criteria by which to determine when assignable causes of variation in quality
have been eliminated so that the product may then be considered to be
controlled within limits. This state of control appears to be, in general, a kind of
limit to which we may expect to go economically in finding and removing causes
of variability without changing a major portion of the manufacturing process as,
for example, would be involved in the substitution of new materials or designs.

The definition of random in terms of a physical operation is notoriously


without effect on the mathematical operations of statistical theory because so far
as these mathematical operations are concerned random is purely and simply an
undefined term. The formal and abstract mathematical theory has an
independent and sometimes lonely existence of its own. But when an undefined
mathematical term such as random is given a definite operational meaning in
physical terms, it takes on empirical and practical significance. Every
mathematical theorem involving this mathematically undefined concept can then
be given the following predictive form: If you do so and so, then such and such
will happen.

Every sentence in order to have definite scientific meaning must be


practically or at least theoretically verifiable as either true or false upon the basis
of experimental measurements either practically or theoretically obtainable by
carrying out a definite and previously specified operation in the future. The
meaning of such a sentence is the method of its verification.

In other words, the fact that the criterion we happen to use has a fine
ancestry of highbrow statistical theorems does not justify its use. Such
justification must come from empirical evidence that it works.

Presentation of Data depends on the intended actions

Rule 1. Original data should be presented in a way that will preserve the
evidence in the original data for all the predictions assumed to be useful.
Rule 2. Any summary of a distribution of numbers in terms of symmetric
functions should not give an objective degree of belief in any one of the
inferences or predictions to be made there from that would cause human action
significantly different from what this action would be if the original distributions
had been taken as evidence.

The original notions of Total Quality Management and continuous


improvement trace back to a former Bell Telephone employee named Walter
Shewhart. One of W. Edwards Deming's teachers, he preached the importance of
adapting management processes to create profitable situations for both
businesses and consumers, promoting the utilization of his own creation -- the
SPC control chart.

Dr. Shewhart believed that lack of information greatly hampered the


efforts of control and management processes in a production environment. In
order to aid a manager in making scientific, efficient, economical decisions, he
developed Statistical Process Control methods. Many of the modern ideas
regarding quality owe their inspiration to Dr. Shewhart.

He also developed the Shewhart Learning and Improvement Cycle, combining creative management thinking with statistical analysis. This cycle contains four continuous steps: Plan, Do, Study and Act. These steps (commonly referred to as the PDSA cycle), Shewhart believed, ultimately lead to total quality improvement. The cycle draws its structure from the notion that constant evaluation of management practices -- as well as the willingness of management to adopt and disregard unsupported ideas -- are keys to the evolution of a successful enterprise.

2.2 OVERVIEW OF THE CONTRIBUTIONS OF DEMING

Understanding the Deming Management Philosophy

FIGURE 2.1

W. Edwards Deming called it the Shewhart cycle, giving credit to its


inventor, Walter A. Shewhart. The Japanese have always called it the Deming
cycle in honor of the contributions Deming made to Japan's quality improvement
efforts over many years. Some people simply call it the PDCA--plan, do, check,
and act--cycle. Regardless of its name, the idea is well-known to process
improvement engineers, quality professionals, quality improvement teams and
others involved in continuous improvement efforts.

The model can be used for the ongoing improvement of almost anything
and it contains the following four continuous steps: Plan, Do, Study and
Act. Students, facilitated by their teacher, should be able to complete the
following steps using information from a classroom data center. Building staffs,
facilitated by their principal or district quality facilitators, should be able to
complete the steps of PDSA using information from a building data center.
Similarly, district level strategic planning committees can use the same process.

In the first step (PLAN), based on data, identify a problem worthy of


study to effect improvement. Define the specific changes you want. Look at the
data [numerical information] related to the current status. List a numerical
measure for the future target.

In the second step (DO), the plan is carried out, preferably on a small
scale. Identify the process owners who are on the team. Within the action plan
and on the storyboard, explain why new steps are necessary. Collect and chart
the baseline data. Form a hypothesis of possible causes that are related to the
current performance results. A quality tool such as a fish-bone diagram or an
affinity diagram would be useful at this stage. Implement the strategy to bring
about the change.

In the third step (STUDY), the effects of the plan are observed. Monitor
the data. On the baseline data chart, continue with the next data points. Explain
when and how data analysis with appropriate people takes place. Explain what
is being learned through the improvement process. Identify trends, if any can be
discerned. Conduct a gap analysis. Use comparisons, and benchmark [best
practice] data. If negative data result, undergo another cycle of PDSA.

In the last step (ACT), the results are studied to determine what was
learned and what can be predicted. If positive data result, standardize the
process/strategy. Standardize and keep the new process going. Repeat the cycle
starting with PLAN to define a new change you want. See Figure 2.1 for a diagram of the process.
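
As a purely illustrative sketch (the numbers and the simulated pilot below are invented; a real application would use the organisation's own data), the loop below expresses the four PDSA steps in code: each pass plans a target, pilots a change, studies the result against the target, and either standardises the gain or keeps the old baseline.

# Illustrative PDSA (Plan-Do-Study-Act) loop with simulated defect rates.
import random

def plan(baseline):
    # PLAN: set a numerical target for the change (here, a 10% reduction).
    return {"baseline": baseline, "target": baseline * 0.9}

def do(change_plan):
    # DO: carry out the change on a small scale and collect data (simulated).
    return change_plan["baseline"] * random.uniform(0.80, 1.05)

def study(result, change_plan):
    # STUDY: compare the pilot result with the target.
    return result <= change_plan["target"]

def act(successful, result, baseline):
    # ACT: standardise the gain if it worked, otherwise keep the old baseline.
    return result if successful else baseline

defect_rate = 5.0  # simulated starting point: defects per 100 units
for cycle in range(4):
    change_plan = plan(defect_rate)
    pilot_result = do(change_plan)
    defect_rate = act(study(pilot_result, change_plan), pilot_result, defect_rate)
    print(f"Cycle {cycle + 1}: defect rate is now {defect_rate:.2f} per 100 units")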

William Edwards Deming was an American statistician, college


professor, author, lecturer, and consultant. Deming is widely credited with
improving production in the United States during World War II, although he is
perhaps best known for his work in Japan. There, from 1950 onward he taught
top management how to improve design (and thus service), product quality,
testing and sales (the latter through global markets). Deming made a significant
contribution to Japan becoming renowned for producing innovative high-quality
products. Deming is regarded as having had more impact upon Japanese
manufacturing and business than any other individual not of Japanese heritage.

After World War II (1947), Deming was involved in early planning for the
1951 Japanese Census. He was asked by the Department of the Army to assist in
this census. While he was there, his expertise in quality control techniques,
combined with his involvement in Japanese society, led to his receiving an
invitation by the Japanese Union of Scientists and Engineers (JUSE).

JUSE members had studied Shewhart's techniques, and as part of Japan's


reconstruction efforts they sought an expert to teach statistical control. During
June-August 1950, Deming trained hundreds of engineers, managers, and
scholars in statistical process control (SPC) and concepts of quality. He also
conducted at least one session for top management. Deming's message to Japan's
chief executives: improving quality will reduce expenses while increasing
productivity and market share. Perhaps the best known of these management
lectures was delivered at the Mt. Hakone Conference Center in August of 1950.

A number of Japanese manufacturers applied his techniques widely, and


experienced theretofore unheard of levels of quality and productivity. The
improved quality combined with the lowered cost created new international
demand for Japanese products.

Deming declined to receive royalties from the transcripts of his 1950


lectures, so JUSE's board of directors established the Deming Prize (December
1950) to repay him for his friendship and kindness. The Deming Prize, especially
the Deming Application Prize that is given to companies, has exerted an
immeasurable influence directly or indirectly on the development of quality
control and quality management in Japan.

In 1960, the Prime Minister of Japan (Nobusuke Kishi), acting on behalf of


Emperor Hirohito, awarded Dr. Deming Japan's Order of the Sacred Treasure,
Second Class. The citation on the medal recognizes Deming's contributions to
Japan's industrial rebirth and its worldwide success.

The first section of the meritorious service record describes his work in
Japan:

1947 Rice Statistics Mission member


1950 assistant to the Supreme Commander of the Allied Powers
Instructor in sample survey methods in government statistics

The second half of the record lists his service to private enterprise through
the introduction of epochal ideas, such as quality control and market survey
techniques.

Contributions of Deming

The recovery of Japan after World War II has many explanations: Japan
was forbidden to be involved in military industries, so the Japanese concentrated on
consumer products; powerful conglomerates of industry and banks (zaibatsu)
poured money into selected companies; and the Japanese people consented to great
sacrifices in order to support the recovery.

The Japanese themselves point to American W. Edwards Deming as one


factor in their success. During the War, Deming was one of many who helped
apply statistical quality control methods developed by Walter Shewhart at Bell
Labs to help with the industrial mobilization. After the war, Deming was
disappointed by American industry's rejection of these methods. Deming visited
Japan after the war as a representative of the US government, to help the Japanese
set up a census. He met with Japanese engineers interested in applying
Shewhart's methods. In 1950, the Japanese Union of Scientists and Engineers invited Deming to
give a series of lectures on quality control, which were attended by top Japanese
industrialists. Within months, they found an amazing increase in productivity, and
statistical quality control took off in Japan. The top people came to Deming with
a desire to learn that bordered on obsession. The Japanese integrated the
statistical methods into their companies, involving all the workers in the
movement to improve quality.

American industry flourished in the postwar boom in the US, but found
itself getting hints and finally clear indications of Japanese competition in the
1970s. Hal Sperlich, a Ford executive, visited a Japanese auto factory in the early
seventies and was amazed to find that the factories had no area dedicated to
repairing shoddy work; in fact the plant had no inspectors. Sperlich left that
factory somewhat shaken: In America, he thought, we have repair bins the size of
football fields. William Ouchi wrote that when he began to study Japanese
practices in 1973, there was little interest in the US in his findings. When his
book, Theory Z, was published in 1981, interest had grown tremendously and
the book was a best seller. However, even in 1981, a top officer in Motorola
warned American manufacturers of computer chips that they were complacent
and not paying enough attention to Japanese quality. In 1981, Ford engineers
compared automatic transmissions, some built by Mazda for the Ford Escorts,
and some built by Ford. The ones made in Japan were well liked by our
customers; many of those from Ohio were not. Ours were more erratic; many
shifted poorly through the gears, and customers said they didn't like the way
they performed. The difference was due to the tighter tolerances in the
Japanese-made transmissions.

NBC aired a documentary, If Japan Can... Why Can't We? The documentary
explained what Japan was doing and especially stressed the contributions of
Deming. Donald Peterson, then president of Ford, was one of many CEOs
motivated to call Deming. Deming said his phone rang off the hook.

Deming began with statistical quality control, but he recognized that


success depended on involving everyone. His 14 points are a manifesto for
worker involvement and worker pride. Peterson sent teams from Ford to visit
Japanese companies: Before those visits, many of the people at Ford believed
that the Japanese were succeeding because they used highly sophisticated
machinery. Others thought their industry was orchestrated by Japan's
government. The value of our visits, however, lay in Ford people's discovery that
the real secret was how the people worked together -- how the Japanese
companies organized their people into teams, trained their workers with the
skills they needed, and gave them the power to do their jobs properly. Somehow
or other, they had managed to hold on to a fundamental simplicity of human
enterprise, while we built layers of bureaucracy.

In a return to using the brain, not just the brawn, of the worker, the
Japanese methods, building on Deming, actually deTaylorize work. The
classical Taylor model of scientific management, which favored the separation of
mental from physical labor and the retention of all decision making in the hands
of management, is abandoned in favor of a cooperative team approach
designed to harness the full mental capabilities and work experience of everyone
involved in the process.

While Deming's principles, as filtered through the Japanese methods, argue for reskilling work and reject Taylor's belief that workers should just do what they are told, Taylorism lives on, only now it is called McDonaldization. Taiichi Ohno developed the Toyota Production System between 1945 and 1970.

While the US had much to learn from Japanese methods, careful observers
realized that the differences between Japanese and American societies were so
great that not all ideas could be imported (some of the cooperation among
Japanese companies would violate US antitrust laws), that the Japanese methods
were not always what they seemed (for example, lifetime employment was
limited to a minority), and that American companies, unheralded, were already using
many of the new Japanese methods. Ouchi, in Theory Z, examined Japanese
practices in their treatment of workers, distilled them to the central ideas which
he called Theory Z, and discovered that the best examples of Theory Z
management were American companies.


2.3 OVERVIEW OF THE CONTRIBUTIONS OF JURAN

Joseph Juran

Juran expressed his approach to quality in the form of the Quality trilogy.
Managing for quality involved three basic processes:

Quality Planning: This involves identifying the customer (both internal and
external), determining their needs, and designing goods and services to meet these
needs at the established quality and cost goals. Then design the process and
transfer it to the operators.

Quality Control: Establish standards or critical elements of performance,


identify measures and methods of measurement, compare actual to standard
and take action if necessary.

Quality Improvement: Identify appropriate improvement projects, organize


the team, discover the causes and provide remedies and finally develop
mechanisms to control the new process and hold the gains.

The relationship among the three processes is shown in the Quality Trilogy
figure below:

FIGURE 2.3
The errors made during the initial planning result in a higher cost, which
Juran labeled chronic waste. At the beginning, the process stays within control
limits. A quality improvement project is then initiated and succeeds in reducing the
chronic waste.

Juran also created the concept of Cost of Quality. There are four elements
comprising the cost of quality.

Prevention Costs: Initial design quality, actions during product creation (e.g.
marketing research, establishing product specifications, determining
consumer needs, training workers, vendor evaluation, quality audit,
preventive maintenance, etc.)

Appraisal Costs: Inspection and testing of raw materials, work-in-progress,


finished goods, procedures for testing, training manuals, laboratories.

External Costs: Returned merchandise, making warranty repairs or refunds,


credibility loss, lawsuits.

Internal Costs: Scrap, rework, redesign, downtime, broken equipment,


reduced yield, selling products at a discount etc.

The graph in the following exhibit shows that costs of conformance (appraisal
and prevention) increase as defect rate declines. However, the costs of
nonconformance (internal and external failures) decrease. The trade-off leads
to an optimal conformance level.
FIGURE 2.4
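
As a rough numerical illustration of the trade-off shown in Figure 2.4 (the two cost curves below are invented for the example), the sketch sweeps conformance levels and reports the level at which the total cost of quality is lowest.

# Illustrative cost-of-quality trade-off with invented cost curves:
# conformance costs (prevention + appraisal) rise steeply as conformance
# approaches 100%, while nonconformance costs (failures) fall.

def conformance_cost(q):        # q = fraction of output conforming (0 < q < 1)
    return 500 / (1 - q)

def nonconformance_cost(q):
    return 20_000 * (1 - q)

levels = [i / 100 for i in range(50, 100)]   # 50% .. 99% conformance
costs = [(q, conformance_cost(q) + nonconformance_cost(q)) for q in levels]
best_q, best_cost = min(costs, key=lambda pair: pair[1])

print(f"Lowest total cost of quality at about {best_q:.0%} conformance "
      f"(total cost {best_cost:.0f})")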

The Quality Trilogy

Quality Planning: Determine quality goals; plan implementation; plan resources; express goals in quality terms; create the quality plan.
Quality Control: Monitor performance; compare objectives with achievements; act to reduce the gap.
Quality Improvement: Reduce waste; enhance logistics; improve employee morale; improve profitability; satisfy customers.

Philosophy

Management is largely responsible for quality


Quality can only be improved through planning
Plans and objectives must be specific and measurable
Training is essential and starts at the top
Three step process of planning, control and action

The Quality Planning Roadmap

Step 1 : Identify who the customers are

Step 2 : Determine the needs of those customers


Step 3 : Translate those needs into our language (the language of the
organization)

Step 4 : Develop a product that can respond to those needs

Step 5 : Optimize the product features so as to meet our (the company's) needs as well as customers' needs

Step 6 : Develop a process, which is able to produce the product


Step 7 : Optimize the process

Step 8 : Prove that the process can produce the product under
operating conditions

Step 9 : Transfer the process to operations

Ten Steps to Continuous Quality Improvement

Step 1 : Create awareness of the need and opportunity for quality


improvement

Step 2 : Set goals for continuous improvement

Step 3 : Build an organization to achieve goals by establishing a


quality council, identifying problems, selecting a project,
appointing teams and choosing facilitators

Step 4 : Give everyone training

Step 5 : Carry out projects to solve problems

Step 6 : Report progress

Step 7 : Show recognition

Step 8 : Communicate results

Step 9 : Keep a record of successes.

Step 10 : Incorporate annual improvements into the company's


regular systems and processes and thereby maintain
momentum.
2.4 OVERVIEW OF THE CONTRIBUTIONS OF CROSBY

Philip Crosby

Crosby described quality as free and argued that zero defects was a
desirable and achievable goal. He articulated his view of quality as the four
absolutes of quality management:

1. Quality means conformance to requirements. Requirements needed to be


clearly specified so that everyone knew what was expected of them.
2. Quality comes from prevention. Prevention is a result of training,
discipline, example, leadership, and more.
3. The quality performance standard is zero defects. Errors should not be
tolerated.
4. The measurement of quality is the price of non-conformance.

In addition to the above, Crosby developed a quality management maturity


grid in which he listed five stages of management's maturity with quality issues.
These five are Uncertainty, Awakening, Enlightenment, Wisdom and Certainty.
In the first stage, management fails to see quality as a tool; problems are handled
by firefighting and are rarely resolved; there are no organized quality
improvement activities. By the last stage, the company is convinced that quality
is essential to the company's success; problems are generally prevented; and
quality improvement activities are regular and continuing.

Five absolutes of quality management: Philip B. Crosby

Quality is defined as conformance to requirements, not as goodness or elegance.
There is no such thing as a quality problem.
It is always cheaper to do it right the first time.
The only performance measurement is the cost of quality.
The only performance standard is zero defects.

Crosby's Quality Vaccine

Integrity
Dedication to communication and customer satisfaction
Company-wide policies and operations which support the quality thrust

FIGURE 2.5
Fourteen Step Quality Programme: Philip B. Crosby

Step 1 Establish management commitment - it is seen as vital that the whole management team participates in the programme; a half-hearted effort will fail.

Step 2 Form quality improvement teams - the emphasis here is on multidisciplinary team effort. An initiative from the quality department alone will not be successful. It is considered essential to build team working across arbitrary, and often artificial, organizational boundaries.

Step 3 Establish quality measurements - these must apply to every activity throughout the company. A way must be found to capture every aspect: design, manufacturing, delivery and so on. These measurements provide a platform for the next step.

Step 4 Evaluate the cost of quality - this evaluation must highlight, using the measures established in the previous step, where quality improvement will be profitable.

Step 5 Raise quality awareness - this is normally undertaken through the training of managers and supervisors, through communications such as videos and books, and by displays of posters, etc.

Step 6 Take action to correct problems - this involves encouraging staff to identify and rectify defects, or pass them on to higher supervisory levels where they can be addressed.

Step 7 Zero defects planning - establish a committee or working group to develop ways to initiate and implement a zero defects programme.

Step 8 Train supervisors and managers - this step is focused on achieving understanding by all managers and supervisors of the steps in the quality improvement programme in order that they can explain it in turn.

Step 9 Hold a Zero Defects day to establish the attitude and expectation
within the company. Crosby sees this as being achieved in a
celebratory atmosphere accompanied by badges, buttons and
balloons.

Step 10 Encourage the setting of goals for improvement - goals are of


course of no value unless they are related to appropriate time-
scales for their achievement.

Step 11 Obstacle reporting - this encourages employees to advise


management of the factors which prevent them from achieving
error free work. This might cover defective or inadequate
equipment, poor quality components, etc.

Step 12 Recognition for contributors - Crosby considers that those who


contribute to the programme should be rewarded through a formal,
although non-monetary, reward scheme. Readers may be aware of
the Gold Banana award given by Foxboro for scientific
achievement (Peters and Waterman, 1982).

Step 13 Establish quality councils - these are essentially forums composed

of quality professionals and team leaders, allowing them to
communicate and determine action plans for further quality
improvement.

Step 14 Do it all over again - the message here is very simple:


achievement of quality is an ongoing process. However far you
have got, there is always further to go!

2.5 OVERVIEW OF THE CONTRIBUTIONS OF MASAAKI IMAI

Masaaki Imai, a quality management consultant, was born in Tokyo in 1930. In


1955, he received his bachelor's degree from the University of Tokyo, where he
also did graduate work in international relations. In the 1950's he worked for five
years in Washington, D.C. at the Japanese Productivity Center, where his
principal duty was escorting groups of Japanese business people through major
U.S. plants. In 1962, he founded Cambridge Corp., an international management
and executive recruiting firm based in Tokyo. As a consultant, he assisted more
than 200 foreign and joint-venture companies in Japan in fields including
recruiting, executive development, personnel management and organizational
studies. From 1976 to 1986, Imai served as president of the Japan Federation of
Recruiting and Employment Agency Associations.

In 1986, Imai established the Kaizen Institute to help Western companies


introduce kaizen concepts, systems and tools. That same year, he published his
book on Japanese management, Kaizen: The Key to Japan's Competitive Success.
This best-selling book has since been translated into 14 languages. Other books by
Imai include 16 Ways To Avoid Saying No, Never Take Yes for an Answer, and
Gemba Kaizen, published in 1997. To date the Kaizen Institute operates in over
22 countries and continues to act as an enabler to companies to accomplish their
manufacturing, process, and service goals.

True total quality, according to Masaaki Imai, lies in recognising the importance of the commonsense approach of gemba (shop floor) kaizen to quality improvement, as against the technology-only approach to quality practised in the West.

The production system (batch production) employed by over 90% of all the companies in the world is one of the biggest obstacles to quality improvement. A conversion from a batch to a JIT (just-in-time)/lean production system should be the most urgent task for any manufacturing company today in order to survive in the next millennium.

2.6 OVERVIEW OF THE CONTRIBUTIONS OF FEIGENBAUM

Mitchell Jay Feigenbaum (born December 19, 1944; Philadelphia, USA) is


a mathematical physicist whose pioneering studies in chaos theory led to the
discovery of the Feigenbaum constant.

The son of Polish and Ukrainian Jewish immigrants, Feigenbaum's


education was not a happy one. Despite excelling in examinations, his early
schooling at Tilden High School, Brooklyn, New York, and the City College of
New York seemed unable to stimulate his appetite to learn. However, in 1964 he
began his graduate studies at the Massachusetts Institute of Technology (MIT).
Enrolling for graduate study in electrical engineering, he changed his area to
physics. He completed his doctorate in 1970 for a thesis on dispersion relations,
under the supervision of Professor Francis Low.

After short positions at Cornell University and the Virginia Polytechnic


Institute, he was offered a longer-term post at the Los Alamos National
Laboratory in New Mexico to study turbulence in fluids. Although that group of
researchers was ultimately unable to unravel the currently intractable theory of
turbulent fluids, his research led him to study chaotic mappings.

Some mathematical mappings involving a single linear parameter exhibit


apparently random behavior, known as chaos, when the parameter lies
within certain ranges. As the parameter is increased towards this region, the
mapping undergoes bifurcations at precise values of the parameter. At first there
is one stable point; the system then bifurcates to an oscillation between two values,
then bifurcates again to oscillate between four values, and so on. In 1975, Dr.
Feigenbaum, using the small HP-65 calculator he had been issued, discovered
that the ratio of the differences between the values at which successive
period-doubling bifurcations occur tends to a constant of around 4.6692... He was
then able to provide a mathematical proof of that fact, and he then showed that
the same behavior, with the same mathematical constant, would occur within a
wide class of mathematical functions, prior to the onset of chaos. For the first
time, this universal result enabled mathematicians to take their first steps to
unravelling the apparently intractable "random" behavior of chaotic systems.
This "ratio of convergence" is now known as the Feigenbaum constant.

The Logistic map is a prominent example of the mappings that


Feigenbaum studied in his noted 1978 article: Quantitative Universality for a Class
of Nonlinear Transformations.

During his time at the Los Alamos Lab, Dr. Feigenbaum acquired a unique

position, which led many scientists to be sorry to see him leave. When anyone
in any of the many fields of work going on at the Los Alamos Lab was stuck on a
problem, it eventually became a common practice to seek out Feigenbaum, and
then go for a walk to discuss the problem. Dr. Feigenbaum frequently helped
others to understand the problem they were dealing with better, and he often
turned out to have read a paper that would help them; he was usually able to tell
them the title, authors, and publication date to make things easier, and he did so
straight off the top of his head most of the time. The amount of reading he was
doing must have been formidable, and that would have left many without time
to do any of their assigned research. Yet, his appetite for work was such that he
continued to make a significant contribution to the work that he was assigned to
do. It should be noted that the people who found him helpful in this manner
were working in a very wide range of different kinds of scientific work. Few men
would have stood a chance of being able to understand these all in enough depth
to help out. "Not my field" would have been the response of most of them if they
had discussed these matters with one another instead.

Feigenbaum's other contributions include important new fractal methods


in cartography, starting when he was hired by Hammond to develop techniques
to allow computers to assist in drawing maps. The introduction to the Hammond
Atlas (1992) states:

"Using fractal geometry to describe natural forms such as coastlines,


mathematical physicist Mitchell Feigenbaum developed software capable of
reconfiguring coastlines, borders, and mountain ranges to fit a multitude of map
scales and projections. Dr. Feigenbaum also created a new computerized type
placement program which places thousands of map labels in minutes, a task
which previously required days of tedious labor."

In 1983 he was awarded a MacArthur Fellowship, and in 1986, he was


awarded the Wolf Prize in Physics. He has been Toyota Professor at Rockefeller
University since 1986.

2.7 OVERVIEW OF THE CONTRIBUTIONS OF ISHIKAWA

ISHIKAWA DIAGRAM

An Ishikawa diagram, also known as a fishbone diagram or cause-and-effect diagram, is a diagram that shows the causes of a certain event. It was first used by Kaoru Ishikawa in the 1960s and is considered one of the seven basic tools of quality management, alongside the histogram, Pareto chart, check sheet, control chart, flowchart, and scatter diagram. Because of its shape, an Ishikawa diagram is often called a fishbone diagram.

A common use of the Ishikawa diagram is in product design, to identify


desirable factors leading to an overall effect. Mazda Motors famously used an
Ishikawa diagram in the development of the Miata sports car, where the required
result was "Jinba Ittai" or "Horse and Rider as One". The main causes included
such aspects as "touch" and "braking" with the lesser causes including highly
granular factors such as "50/50 weight distribution" and "able to rest elbow on
top of driver's door". Every factor identified in the diagram was included in the
final design.

FIGURE 2.6
A generic Ishikawa diagram showing general and more refined causes for
an event.

People sometimes call Ishikawa diagrams "fishbone diagrams" because of


their fish-like appearance. Most Ishikawa diagrams have a box at the right hand
side in which is written the effect that is to be examined. The main body of the
diagram is a horizontal line from which stem the general causes, represented as
"bones". These are drawn towards the left hand corners of the paper, and they
are each labeled with the causes to be investigated. Off each of the large bones
there may be smaller bones highlighting more specific aspects of a certain cause.
When the most probable causes have been identified, they are written in the box
along with the original effect.

Definition: A graphic tool used to explore and display opinion about sources of
variation in a process. (Also called a Cause-and-Effect or Fishbone Diagram.)

Purpose: To arrive at a few key sources that contribute most significantly to the
problem being examined. These sources are then targeted for improvement. The
diagram also illustrates the relationships among the wide variety of possible
contributors to the effect.

The figure below shows a simple Ishikawa diagram. Note that this tool is
referred to by several different names: Ishikawa diagram, Cause-and-Effect
diagram, Fishbone diagram, and Root Cause Analysis. The first name is after the
inventor of the tool, Kaoru Ishikawa (1969), who first used the technique in the
1960s.

The basic concept in the Cause-and-Effect diagram is that the name of a


basic problem of interest is entered at the right of the diagram at the end of the
"main bone". The main possible causes of the problem (the effect) are drawn as
bones off of the main backbone. The "Four-M" categories are typically used as a
starting point: "Materials", "Machines", "Manpower", and "Methods". Different
names can be chosen to suit the problem at hand, or these general categories can
be revised. The key is to have three to six main categories that encompass all
possible influences. Brainstorming is typically done to add possible causes to the
main "bones" and more specific causes to the "bones" on the main "bones". This
subdivision into ever increasing specificity continues as long as the problem
areas can be further subdivided. The practical maximum depth of this tree is
usually about four or five levels. When the fishbone is complete, one has a rather
complete picture of all the possibilities about what could be the root cause for the
designated problem.

FIGURE 2.7

The Cause-and-Effect diagram can be used by individuals or teams;


probably most effectively by a group. A typical utilization is the drawing of a
diagram on a blackboard by a team leader who first presents the main problem
and asks for assistance from the group to determine the main causes which are
subsequently drawn on the board as the main bones of the diagram. The team
assists by making suggestions and, eventually, the entire cause and effect
diagram is filled out. Once the entire fishbone is complete, team discussion takes
place to decide what are all the most likely root causes of the problem. These
causes are circled to indicate items that should be acted upon, and the use of the
tool is complete.
The Ishikawa diagram, like most quality tools, is a visualization and
knowledge organization tool. Simply collecting the ideas of a group in a
systematic way facilitates the understanding and ultimate diagnosis of the
problem. Several computer tools have been created for assisting in creating
Ishikawa diagrams. A tool created by the Japanese Union of Scientists and
Engineers (JUSE) provides a rather rigid tool with a limited number of bones.
Other similar tools can be created using various commercial tools.

Only one tool has been created that adds computer analysis to the
fishbone. Bourne et al. (1991) reported using Dempster-Shafer theory (Shafer and
Logan, 1987) to systematically organize the beliefs about the various causes that
contribute to the main problem. Based on the idea that the main problem has a
total belief of one, each remaining bone has a belief assigned to it based on
several factors; these include the history of problems of a given bone, events and
their causal relationship to the bone, and the belief of the user of the tool about
the likelihood that any particular bone is the cause of the problem.

How to Construct:

1. Place the main problem under investigation in a box on the right.


2. Have the team generate and clarify all the potential sources of
variation.
3. Use an affinity diagram to sort the process variables into naturally
related groups. The labels of these groups are the names for the
major bones on the Ishikawa diagram.
4. Place the process variables on the appropriate bones of the
Ishikawa diagram.
5. Work through each bone in turn, ensuring that the process variables are
specific, measurable, and controllable. If they are not, branch or
explode the process variables until the ends of the branches are
specific, measurable, and controllable (a minimal data-structure sketch
of the finished diagram follows this list).
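
A minimal way to hold the result of these steps is a nested structure: the effect, the major bones, and the more specific causes under each bone. The effect, categories and causes below are invented purely for illustration.

# Minimal representation of an Ishikawa (fishbone) diagram as a nested dict.
# The effect, categories and causes are invented purely for illustration.
fishbone = {
    "effect": "Late deliveries",
    "bones": {
        "Methods":   ["Unclear order-picking procedure", "No rule for urgent orders"],
        "Machines":  ["Frequent conveyor breakdowns"],
        "Manpower":  ["New staff not yet trained"],
        "Materials": ["Stock-outs of packaging material"],
    },
}

# Print the diagram as an indented outline: effect, bones, then sub-causes.
print(fishbone["effect"])
for bone, causes in fishbone["bones"].items():
    print(f"  {bone}")
    for cause in causes:
        print(f"    - {cause}")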

Tip:

Take care to identify causes rather than symptoms.


Post diagrams to stimulate thinking and get input from other staff.
Self-adhesive notes can be used to construct Ishikawa diagrams. Sources
of variation can be rearranged to reflect appropriate categories with
minimal rework.
Ensure that the ideas placed on the Ishikawa diagram are process
variables, not special causes, other problems, tampering, etc.
Review the quick fixes and rephrase them, if possible, so that they are
process variables.
2.8 OVERVIEW OF THE CONTRIBUTIONS OF TAGUCHI

Taguchi Methods: Introduction

Dr. Genichi Taguchi has played an important role in popularising Design Of


Experiments (DOE). However, it would be wrong to think that the Taguchi
Methods are just another way of performing DOE. He has developed a complete
philosophy and the associated methods for Quality Engineering. His most
important ideas are:

A quality product is a product that causes a minimal loss (expressed in money!) to society during its entire life. The relation between this loss and the technical characteristics is expressed by the loss function.

Quality must be built into products and processes. There has to be much more attention to off-line quality control in order to prevent problems from occurring in production.

Different types of noise (variation within tolerance, external conditions, dissipation from neighbouring systems, etc.) have an influence on our system and lead to deviations from the optimal condition. To avoid the influence of these noises we need to develop robust products and processes. The robustness of a system is its ability to function optimally even under changing noise conditions.

FIGURE 2.8

FIGURE 2.9
Taguchi methods

Taguchi methods are statistical methods developed by Genichi Taguchi to improve the quality of manufactured goods; more recently they have also been applied to biotechnology, marketing and advertising. Taguchi methods are controversial among many conventional Western statisticians unfamiliar with the Taguchi methodology.

Taguchi's principal contributions to statistics are:

1. Taguchi loss-function;
2. The philosophy of off-line quality control; and
3. Innovations in the design of experiments.

Loss functions

Taguchi's reaction to the classical design of experiments methodology of


R. A. Fisher was that it was perfectly adapted in seeking to improve the mean
outcome of a process. As Fisher's work had been largely motivated by
programmes to increase agricultural production, this was hardly surprising.
However, Taguchi realised that in much industrial production, there is a need to
produce an outcome on target, for example, to machine a hole to a specified
diameter or to manufacture a cell to produce a given voltage. He also realised, as
had Walter A. Shewhart and others before him, that excessive variation lay at the
root of poor manufactured quality and that reacting to individual items inside
and outside specification was counter-productive.
He therefore, argued that quality engineering should start with an
understanding of the cost of poor quality in various situations. In much
conventional industrial engineering the cost of poor quality is simply
represented by the number of items outside specification multiplied by the cost
of rework or scrap. However, Taguchi insisted that manufacturers broaden their
horizons to consider cost to society. Though the short-term costs may simply be
those of non-conformance, any item manufactured away from nominal would
result in some loss to the customer or the wider community through early wear-
out; difficulties in interfacing with other parts, themselves probably wide of
nominal; or the need to build-in safety margins. These losses are externalities and
are usually ignored by manufacturers. In the wider economy the Coase Theorem
predicts that they prevent markets from operating efficiently. Taguchi argued
that such losses would inevitably find their way back to the originating
corporation (in an effect similar to the tragedy of the commons) and that by
working to minimise them, manufacturers would enhance brand reputation, win
markets and generate profits.

Such losses are, of course, very small when an item is near to nominal.
Donald J. Wheeler characterised the region within specification limits as where
we deny that losses exist. As we diverge from nominal, losses grow until the point
where losses are too great to deny and the specification limit is drawn. All these
losses are, as W. Edwards Deming would describe them, ...unknown and
unknowable but Taguchi wanted to find a useful way of representing them within
statistics.

Taguchi specified three situations:

1. Larger the better (for example, agricultural yield);


2. Smaller the better (for example, carbon dioxide emissions); and
3. On-target, minimum-variation (for example, a mating part in an
assembly).

The first two cases are represented by simple monotonic loss functions. In the
third case, Taguchi adopted a squared-error loss function (illustrated numerically
in the sketch after this list) on the grounds that:

It is the first symmetric term in the Taylor series expansion of any


reasonable, real-life loss function, and so is a "first-order" approximation;
Total loss is measured by the variance. As variance is additive it is an
attractive model of cost; and
There was an established body of statistical theory around the use of the
least squares principle.
The squared-error loss function had been used by John von Neumann and
Oskar Morgenstern in the 1930s.
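
For the on-target case the loss is conventionally written L(y) = k(y - m)^2, where m is the nominal value and k is fixed by the loss incurred at a specification limit. The sketch below is a minimal illustration with invented numbers: a 10.00 mm nominal diameter and an assumed cost of 2.00 currency units for a part right at the 10.05 mm limit.

# Minimal illustration of the quadratic (on-target) loss L(y) = k*(y - m)^2.
# The nominal value, specification limit and cost at the limit are invented.
nominal = 10.00          # target value m (mm)
spec_limit = 10.05       # upper specification limit (mm)
cost_at_limit = 2.00     # assumed loss for a part exactly at the limit

k = cost_at_limit / (spec_limit - nominal) ** 2   # k = A / d^2

def loss(y):
    # Loss to society attributed to a single item with measured value y.
    return k * (y - nominal) ** 2

for y in (10.00, 10.02, 10.05, 10.08):
    print(f"y = {y:.2f} mm -> loss = {loss(y):.2f}")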

Though much of this thinking is endorsed by statisticians and economists in


general, Taguchi extended the argument to insist that industrial experiments
seek to maximise an appropriate signal to noise ratio representing the magnitude
of the mean of a process, compared to its variation. Most statisticians believe
Taguchi's signal to noise ratios to be effective over too narrow a range of
applications and they are generally deprecated.

Off-line quality control

Taguchi realised that the best opportunity to eliminate variation is during


design of a product and its manufacturing process (Taguchi's rule for
manufacturing). Consequently, he developed a strategy for quality engineering
that can be used in both contexts. The process has three stages:

1. System design;
2. Parameter design; and
3. Tolerance design.

System design

This is design at the conceptual level involving creativity and innovation.

Parameter design

Once the concept is established, the nominal values of the various


dimensions and design parameters need to be set, the detailed design phase of
conventional engineering. In 1802, philosopher William Paley had observed that
the inverse-square law of gravitation was the only law that resulted in stable
orbits if the planets were perturbed in their motion. Paley's understanding that
engineering should aim at designs robust against variation led him to use the
phenomenon of gravitation as an argument for the existence of God. William
Sealey Gosset in his work at the Guinness brewery suggested as early as the
beginning of the 20th century that the company might breed strains of barley
that not only yielded and malted well but whose characteristics were robust
against variation in the different soils and climates in which they were grown.
Taguchi's radical insight was that the exact choice of values required is under-
specified by the performance requirements of the system. In many
circumstances, this allows the parameters to be chosen so as to minimise the
effects on performance arising from variation in manufacture, environment and
cumulative damage. This approach is often known as robust design or
Robustification.

Tolerance design

With a successfully completed parameter design, and an understanding of


the effect that the various parameters have on performance, resources can be
focused on reducing and controlling variation in the critical few dimensions.

Design of experiments

Taguchi developed much of his thinking in isolation from the school of R. A.


Fisher, only coming into direct contact in 1954. His framework for design of
experiments is idiosyncratic and often flawed but contains much that is of
enormous value. He made a number of innovations.

Outer arrays

In his later work, R. A. Fisher started to consider the prospect of using design of
experiments to understand variation in a wider inductive basis. Taguchi sought to
understand the influence that parameters had on variation, not just on the mean.
He contended, as had W. Edwards Deming in his discussion of analytic studies,
that conventional sampling is inadequate here as there is no way of obtaining a
random sample of future conditions. In conventional design of experiments,
variation between experimental replications is a nuisance that the experimenter
would like to eliminate whereas, in Taguchi's thinking, it is a central object of
investigation.

Taguchi's innovation was to replicate each experiment by means of an outer
array, itself an orthogonal array that seeks deliberately to emulate the sources of
variation that a product would encounter in reality. This is an example of
judgement sampling. Though statisticians following in the Shewhart-Deming
tradition have embraced outer arrays, many academics are still skeptical. An
alternative approach proposed by Ellis R. Ott is to use a chunk variable.
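
A small numerical sketch may make this crossed-array structure concrete. Everything in it (the factor names, the levels, the response model and its coefficients) is invented purely for illustration and is not taken from the text; the inner array is a standard L4 layout, and each control-factor run is evaluated under every outer-array noise condition.

import math

# Hypothetical example: an L4 inner (control) array for three two-level
# control factors A, B, C, crossed with a small outer (noise) array.
inner_array = [
    (-1, -1, -1),
    (-1, +1, +1),
    (+1, -1, +1),
    (+1, +1, -1),
]
outer_array = [(-1,), (+1,)]  # one noise factor N at two levels

def response(a, b, c, n):
    # Invented process model: a "true" response plus a noise effect.
    return 10 + 2 * a + 1.5 * b - c + 0.8 * n + 0.5 * a * n

# Each inner-array run is replicated over every outer-array condition,
# so the variation caused by noise is observed rather than averaged away.
for run, (a, b, c) in enumerate(inner_array, start=1):
    ys = [response(a, b, c, n) for (n,) in outer_array]
    mean = sum(ys) / len(ys)
    var = sum((y - mean) ** 2 for y in ys) / (len(ys) - 1)
    sn = 10 * math.log10(mean ** 2 / var)  # nominal-the-best S/N ratio
    print(f"run {run}: responses={ys}, mean={mean:.2f}, S/N={sn:.1f} dB")

The run with the highest S/N ratio is the control-factor setting least sensitive to the simulated noise, which is the point of crossing the inner array with an outer array.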

Management of interactions

Many of the orthogonal arrays that Taguchi has advocated are saturated
allowing no scope for estimation of interactions between control factors, or inner
array factors. This is a continuing topic of controversy. However, by combining
orthogonal arrays with an outer array consisting of noise factors, Taguchi's
method provides complete information on interactions between control factors
and noise factors. The strategy is that these are the interactions of most interest in
creating a system that is least sensitive to noise factor variation.

Followers of Taguchi argue that the designs offer rapid results and that
control factor interactions can be eliminated by proper choice of quality
characteristic (ideal function) and by transforming the data. That
notwithstanding, a confirmation experiment offers protection against any
residual interactions. In his later teachings, Taguchi emphasizes the need
to use an ideal function that is related to the energy transformation in the
system. This is an effective way to minimize control factor interactions.

Western statisticians argue that interactions are part of the real world and
that Taguchi's arrays have complicated alias structures that leave
interactions difficult to disentangle. George Box, and others, have argued
that a more effective and efficient approach is to use sequential assembly.

Analysis of experiments

Taguchi introduced many methods for analysing experimental results, including
novel applications of the analysis of variance and minute analysis. Little of this
work has been validated by Western statisticians.

Assessment

Genichi Taguchi has made seminal and valuable methodological innovations in
statistics and engineering, within the Shewhart-Deming tradition. His emphasis
on loss to society, techniques for investigating variation in experiments, and his
overall strategy of system, parameter and tolerance design have been massively
influential in improving manufactured quality worldwide.

Cost of Quality

I Assessing the cost of Quality

The quality of a product is one of the most important factors that
determine a company's sales and profit. Quality is measured in relation to the
characteristics of the product that customers expect to find in it, so the quality
level of the products is ultimately determined by the customers. The customers'
expectations about a product's performance, reliability and attributes are
translated into Critical-To-Quality (CTQ) characteristics and integrated into the
product's design by the design engineers. While designing the products, they
must also take into account the capabilities of the resources (machines, people,
materials), i.e. their ability to produce products that meet the customers'
expectations. They specify with exactitude the quality targets for every aspect of
the products.

But quality comes with a cost. The definition of the Cost of Quality is
contentious. Some authors define it as the cost of non-conformance, i.e. how
much producing nonconforming products would cost a company. This is a
one-sided approach, since it does not consider the cost incurred to prevent
non-conformance and, above all in a competitive market, the cost of improving
the quality targets.

For instance, in the case of an LCD (Liquid Crystal Display) manufacturer,
if the market standard for a 15" LCD with a resolution of 1024x768 is 786,432
pixels and a higher resolution requires more pixels, then improving the quality
of the 15" LCDs by pushing the company's specifications beyond the market
standard would require the engineering of LCDs with more pixels, which would
incur extra cost.

The cost of quality is traditionally measured in terms of the cost of
conformance and the cost of nonconformance, to which we will add the cost of
innovation. The cost of conformance includes the appraisal and preventive costs,
while the cost of non-conformance includes the costs of internal and external
defects.

Cost of conformance

Preventive Costs

These are the costs incurred by the company to prevent non-conformance. They
include the costs of:
o Process capability assessment and improvement
o The planning of new quality initiatives (process changes, quality
improvement projects)
o Employee training

Appraisal Cost.

This is the cost incurred while assessing, auditing and inspecting products and
procedures to verify that products and services conform to specifications. It is
intended to detect quality-related failures. It includes:
o Cost of process audits
o Inspection of products received from suppliers
o Final inspection audit
o Design review
o Pre-release testing

Cost of non-conformance

The cost of non-conformance is in fact the cost of having to rework
products and the loss of customers that results from selling poor quality
products.

Internal Failure
o Cost of reworking products that failed audit
o Cost of bad marketing
o Scrap

External Failure
o Cost of customer support
o Cost of shipping returned products
o Cost of reworking products returned from customers
o Cost of refunds
o Loss of customer Goodwill
o Cost of discounts to recapture customers

In the short term, there is a positive correlation between quality
improvement and the cost of conformance and a negative correlation between
quality improvement and the cost of nonconformance. In other words, an
improvement in the quality of the products will go along with an increase in the
cost of conformance that generated it. This is because an improvement in the
quality level of a product might require extra investment in R&D, more spending
on appraisal, more investment in failure prevention and so on.

But a quality improvement will lead to a decrease in the cost of
nonconformance, because fewer products will be returned by customers, so the
operating cost of customer support falls and there is less internal rework.

For instance, one of the CTQs (Critical-To-Quality characteristics) for an LCD
(Liquid Crystal Display) is the number of pixels it contains. The brightness of
each pixel is controlled by individual transistors that switch the backlights on
and off. The manufacturing of LCDs is very complex and very expensive, and it
is very hard to determine the number of dead pixels on an LCD before the end
of the manufacturing process. So in order to reduce the number of scrapped
units, if the number of dead pixels is negligible or the dead pixels are almost
invisible, the manufacturer would consider the LCDs good enough to be sold.
Otherwise, the cost of scrap or internal rework would be so prohibitive that it
would make production uneconomical. Improving the quality level of the LCDs
to zero dead pixels would therefore increase the cost of conformance.

On the other hand, not improving the quality level of the LCDs will lead
to an increase in the probability of having returned products from customers and
internal rework, therefore increasing the cost of nonconformance.

The following graph plots the relationship between quality improvement
and the cost of conformance on the one hand and the cost of non-conformance
on the other hand.

FIGURE 2.10

If the manufacturer sets the quality level at Q2, the cost of
conformance would be low (C1), but the cost of nonconformance would be high
(C2), because the probability of customer dissatisfaction will be high and more
products will be returned for rework, thereby increasing the cost of rework, the
cost of customer service, and shipping and handling.

The Total Cost of Quality is the sum of the cost of conformance and the
cost of nonconformance; that cost would be C3 for a quality level of Q2:
C3 = C1 + C2.

FIGURE 2.11
Should the manufacturer decide that the quality level would be at Q1, the
cost of conformance (C2) would be higher than the cost of nonconformance (C1)
and the Total cost of Quality would be at C3.

The Total Cost of Quality is minimized only when the cost of
conformance and the cost of nonconformance are equal.
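
As a rough numerical illustration of this statement, one can assume a simple cost model and search for the quality level with the lowest total cost. The linear and inverse cost forms and the coefficients below are assumptions made only for this sketch; they are not figures from the text.

# Illustrative-only cost model: conformance cost grows with the quality level q,
# nonconformance cost shrinks with it; coefficients are invented for the sketch.
def cost_of_conformance(q, a=4.0):
    return a * q

def cost_of_nonconformance(q, b=100.0):
    return b / q

quality_levels = [q / 10 for q in range(1, 101)]  # q from 0.1 to 10.0
total = [(q, cost_of_conformance(q) + cost_of_nonconformance(q))
         for q in quality_levels]
q_best, c_best = min(total, key=lambda t: t[1])

print(f"minimum total cost {c_best:.1f} at quality level {q_best}")
print(f"conformance = {cost_of_conformance(q_best):.1f}, "
      f"nonconformance = {cost_of_nonconformance(q_best):.1f}")
# For this assumed model the minimum is at q = sqrt(b/a) = 5.0, exactly where
# the two cost curves are equal (20 each, total 40), as described above.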

It is worth noting that, currently, the graph most frequently used to represent
the throughput yield in manufacturing is the Normal curve. For a given target
and specified limits, the Normal curve helps estimate the volume of defects that
should be expected. So while the Normal curve estimates the volume of defects,
the U curve estimates the cost incurred as a result of producing parts that do not
match the target.

The following graph represents both the volume of expected conforming
and nonconforming parts and the costs associated with them at every level.

FIGURE 2.12

II Taguchi's Loss Function


In the traditional quality management approach, the engineers
integrate all the CTQs in the design of their new products and clearly
specify the target for their production processes as they define the characteristics
of the products to be sent to the customers. But because of unavoidable common
causes of variation (variations that are inherent to the production process and
that are hard to eliminate) and the high costs of conformance, they are obliged to
allow some variation or tolerance around the target. Any product that falls
within the specified tolerance is considered as meeting the customers'
expectations, and any product outside the specified limits is considered
non-conforming.

But according to Taguchi, products that do not match the target, even
if they are within the specified limits, do not operate as intended; any
deviation from the target, be it within the specified limits or not, will generate
financial loss to the customers, the company and society, and the loss is
proportional to the deviation from the target.

Suppose that a design engineer specifies the length and diameter of a
certain bolt that needs to fit a given part of a machine. Even if the customers do
not notice it, any deviation from the specified target will cause the machine to
wear out faster, causing the company financial loss in the form of repair of
products under warranty or a loss of customers if the warranty has expired.

Taguchi constructed a Loss Function equation to determine how much
society loses every time the parts produced do not match the specified target.
The Loss Function determines the financial loss that occurs every time a CTQ of
a product deviates from its target. The loss is the square of the deviation
multiplied by a constant k, with k being the ratio of the cost of a defective
product to the square of the tolerance.

The Loss Function quantifies the deviation from the target and assigns a
financial value to the deviation:

L(y) = k(y - T)^2, with k = A / m^2

where T is the target, A is the cost of a defective product, and m is the distance
from the target to the specification limit: m = T - LSL or m = USL - T.

According to Taguchi, the cost of quality in relation to the deviation
from the target is not linear, because the customer's frustration increases (at a
faster rate) as more defects are found on a product. That is why the Loss
Function is quadratic.

FIGURE 2.13

The graph that depicts the financial loss to society that results from a
deviation from the target resembles the Total Cost of Quality U graph that we
built earlier, but the premises on which they are built are not the same. While
the Total Cost curve was built from the costs of conformance and
nonconformance, Taguchi's Loss Function is based primarily on the deviation
from the target and measures the loss from the perspective of the customer's
expectations.

Example:

Suppose a machine manufacturer specifies the target for the diameter of a
given bolt to be 6 inches, with upper and lower limits of 6.02 and 5.98 inches
respectively. A bolt measuring 5.99 inches is inserted in its intended hole of a
machine. Five months after the machine was sold, it breaks down as a result of
loose parts. The cost of repair is estimated at $95. Find the loss to society
incurred as a result of the part not matching its target.

Solution:

We must first determine the value of the constant k:

T = 6
USL = 6.02
m = (USL - T) = 6.02 - 6 = 0.02
A = 95 (the cost of the defective product)
k = A / m^2 = 95 / 0.0004 = 237,500

Therefore, for a bolt measuring y = 5.99:

L(y) = k(y - T)^2 = 237,500 x (5.99 - 6)^2 = 237,500 x 0.0001 = 23.75

Not producing a bolt that matches the target resulted in a financial loss to
society of $23.75.
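
The same calculation can be written as a short script. The function and variable names below are mine, invented for the sketch, but the numbers are the ones used in the example above.

def taguchi_loss(y, target, tolerance, cost_at_limit):
    # Quadratic loss L(y) = k * (y - target)^2 with k = cost_at_limit / tolerance^2
    k = cost_at_limit / tolerance ** 2
    return k * (y - target) ** 2

# Bolt example from the text: target 6", USL 6.02" (tolerance m = 0.02),
# cost of a defective product $95, actual diameter 5.99".
loss = taguchi_loss(y=5.99, target=6.0, tolerance=0.02, cost_at_limit=95)
print(f"k = {95 / 0.02 ** 2:,.0f}, loss = ${loss:.2f}")  # k = 237,500, loss = $23.75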

Taguchi Method: Variability Reduction

Since the deviation from the target is the source of financial loss to society,
what needs to be done in order to prevent any deviation from the set target?

The first thought might be to reduce the specification range and improve
the online quality control, to bring the specified limits closer to the target and
inspect more samples during the production process in order to find the
defective products before they reach the customers. But this would not be a good
option since it would only address the symptoms and not the root causes of the
problem. It would be an expensive alternative because it would require more
inspection which would at best help detect nonconforming parts early enough to
prevent them from reaching the customers.

The root of the problem is in fact the variation within the production
process, i.e. the value of sigma, the standard deviation from the mean.

Let's illustrate this assertion with an example. Suppose that the length of a
screw is a Critical-To-Quality (CTQ) characteristic and the target is determined
to be 15 with an LCL of 14.96 and a UCL of 15.04. The following sample was
taken for testing:

15.02
14.99
14.96
15.03
14.98
14.99
15.03
15.01
14.99

All the observed items in this sample fall within the control limits, even
though not all of them match the target. The mean is 15 and the standard
deviation is 0.023979. Should the manufacturer decide to improve the quality of
the output by narrowing the control limits to 14.98 and 15.02, three of the
items in the sample would have failed audit and would have to be reworked
or discarded.

Let's suppose that the manufacturer decides instead to reduce the
variability (the standard deviation) around the target and leave the control
limits untouched. After process improvement, the following sample is taken:

15.01
15
14.99
15.01
14.99
14.99
15
15.01
15

The mean is still 15 but the standard deviation has been reduced to
0.00866 and all the observed items are closer to the target. Reducing the
variability around the target has resulted in improving quality in the production
process at a lower cost.
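
The sample statistics quoted above can be reproduced with a few lines of code (a sketch; the variable names are mine, and the results match the figures in the text within rounding):

import statistics

before = [15.02, 14.99, 14.96, 15.03, 14.98, 14.99, 15.03, 15.01, 14.99]
after  = [15.01, 15.00, 14.99, 15.01, 14.99, 14.99, 15.00, 15.01, 15.00]

for label, sample in (("before improvement", before), ("after improvement", after)):
    mean = statistics.mean(sample)
    sd = statistics.stdev(sample)  # sample standard deviation
    print(f"{label}: mean = {mean:.3f}, standard deviation = {sd:.6f}")
# before improvement: mean = 15.000, standard deviation = 0.023979
# after improvement:  mean = 15.000, standard deviation = 0.008660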

This is not to suggest that the tolerance around the target should never be
reduced; addressing the tolerance limits should be done under specific
conditions and only after the variability around the target has been reduced.
Since variability is a source of financial loss to producers, customers and society
at large, it is necessary to determine what the sources of variation are so that
actions can be taken to reduce them. According to Taguchi, these sources of
variation, which he calls noise factors, can be reduced to three:

The Inner Noise

Inner noises are deteriorations due to time. Product wear, metal rust or
fading colors, material shrinkage and product waning are among the
inner noise factors.

The Outer Noises

Outer noises are environmental effects on the products. They are factors
such as heat, humidity, operating conditions or pressure. These factors
have negative effects on products or processes. In the case of my
notebook, at first the LCD would not display until it heated up, so
humidity was the noise factor that was preventing it from operating
properly. The manufacturer has no control over these factors.

The Product Noise, or manufacturing imperfections

Product noises are due to production malfunctions; they can come from
bad materials, an inexperienced operator or bad machine settings.

But if the online quality control is not the appropriate way to reduce
production variations, what needs to be done to prevent deviations from the
target?

According to Taguchi, a pre-emptive approach must be taken to thwart
the variations in the production processes. That pre-emptive approach, which
he calls off-line quality control, consists in creating a robust design, in other
words designing products that are insensitive to the noise factors.

Concept Design

The production of a product starts with the concept design, which
consists in choosing the product or service to be produced and defining its
structural design and the production process that will be used to generate it.
These choices are contingent upon, among other factors, the cost of production,
the company's strategy, the current technology and the market demand. So the
concept design will consist in:
Determining the intended use of the product and its basic functions
Determining the materials needed to produce the selected product
Determining the production process needed to produce it

Parameter Design

The next step in the production process is the parameter design. After the
design architecture has been selected, the producer will need to set the
parameter design. The parameter design consists in selecting the best
combination of control factors that would optimize the quality level of the
product by reducing the product's sensitivity to noise factors. Control factors
are parameters over which the designer has control. When an engineer designs
a computer, he has control over factors such as the CPU, system board, LCD,
memory and LCD cables. He determines what CPU best fits a motherboard,
what memory stick and what wireless network card to use, and how to design
the system board so that the parts fit together easily. The way he combines
those factors will impact the quality level of the computer.

The producer wants to design products at the lowest possible cost and
at the same time have the best quality result under current technology. To do
so, the combination of the control factors must be optimal, while the effect of
the noise factors must be so minimal that they will not have any negative impact
on the functionality of the products. So the experiment that leads to the optimal
results will require the identification of the noise factors, because they are part
of the process and their effects need to be controlled.

Signal to Noise Ratio

One of the first steps the designer will take is to determine what the
optimal quality level is. He will need to determine what the functional
requirements are, assess the Critical-To-Quality characteristics of the product
and specify their targets. The determination of the CTQs and their targets
depends, among other criteria, on the customer requirements, the cost of
production and current technology. The engineer is seeking to produce the
optimal design: a product that is insensitive to noise factors. The quality level of
the CTQ characteristics of the product under optimal conditions depends on
whether the response of the experiment is static or dynamic.

The response of the experiment (or output of the experiment) is said to be
dynamic when the product has a signal factor that steers the output. For
instance, when I press the power button on my computer, I am sending a signal
to the computer to load my operating system. It should power up and display
within 5 seconds, and it should do so exactly the same way every time I switch
it on. If, as in the case of my computer, it fails to display because of the
humidity, I conclude that the computer is sensitive to humidity and that
humidity is a noise factor that negatively impacts the performance of my
computer.

FIGURE 2.14
The response of the experiment is said to be static when the quality level of the
CTQ characteristic is fixed. In that case, the optimization process will seek to
determine the optimal combination of factors that makes it possible to reach the
targeted value. This happens in the absence of a signal factor; the only input
factors are the control factors and the noise factors. When we build a table, we
determine all the CTQ targets and we want to produce a balanced table with all
the parts matching their targets.

The optimal quality level of a product depends on the nature of the
product itself. In some cases, the more of a CTQ characteristic is found on a
product, the happier the customers are; in other cases, the less the CTQ is
present, the better it is. Some products require the CTQs to match their specified
targets. According to Taguchi, to optimize the quality level of his products, the
producer must seek to minimize the noise factors and maximize the
Signal-To-Noise (S/N) ratio. Taguchi uses log functions to determine the
Signal-To-Noise ratios that optimize the desired output.

The Bigger-The-Better

If the number of minutes per dollar customers get from their cellular phone
service provider is critical to quality, the customers will want to get the
maximum number of minutes they can for every dollar they spend on their
phone bills.

If the lifetime of a battery is critical to quality, the customers will want
their batteries to last forever. The longer the battery lasts, the better it is.
The Signal-To-Noise ratio for the bigger-the-better is:

S/N = -10*log (mean square of the inverse of the response)


The Smaller-The-Better

Impurity in drinking water is critical to quality. The fewer impurities
customers find in their drinking water, the better it is.

Vibration is critical to quality for a car; the less vibration the customers
feel while driving their cars, the better and the more attractive the cars are.
The Signal-To-Noise ratio for the Smaller-The-Better is:

S/N = -10 *log (mean square of the response)

The Nominal-The-Best.

When a manufacturer is building mating parts, he wants every part
to match the predetermined target. For instance, when he is creating pistons
that need to be anchored to a given part of a machine, failure to have the length
of the piston match the predetermined size will result in it being either too short
or too long, lowering the quality of the machine. In that case, the manufacturer
wants all the parts to match their target.

When a customer buys ceramic tiles to decorate his bathroom, the size of
the tiles is critical to quality, having tiles that do not match the predetermined
target will result in them not being correctly lined up against the bathroom walls.

The S/N equation for the Nominal-The-Best is:

S/N = 10 * log (the square of the mean divided by the variance)
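
For readers who prefer explicit formulas, the three ratios can be sketched as follows. The sample data are invented purely to show the arithmetic, and the base-10 logarithm and decibel convention follow the usual Taguchi definitions.

import math

def sn_bigger_the_better(ys):
    # S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )
    return -10 * math.log10(sum(1 / y ** 2 for y in ys) / len(ys))

def sn_smaller_the_better(ys):
    # S/N = -10 * log10( (1/n) * sum(y_i^2) )
    return -10 * math.log10(sum(y ** 2 for y in ys) / len(ys))

def sn_nominal_the_best(ys):
    # S/N = 10 * log10( mean^2 / variance )
    mean = sum(ys) / len(ys)
    var = sum((y - mean) ** 2 for y in ys) / (len(ys) - 1)
    return 10 * math.log10(mean ** 2 / var)

battery_life_hours = [48.0, 52.0, 50.0]          # bigger is better (invented data)
impurity_ppm = [0.8, 1.1, 0.9]                   # smaller is better (invented data)
piston_length_mm = [99.8, 100.1, 100.0, 100.2]   # nominal (100 mm) is best (invented data)

print(f"bigger-the-better S/N  = {sn_bigger_the_better(battery_life_hours):.2f} dB")
print(f"smaller-the-better S/N = {sn_smaller_the_better(impurity_ppm):.2f} dB")
print(f"nominal-the-best S/N   = {sn_nominal_the_best(piston_length_mm):.2f} dB")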

Tolerance Design.

Parameter design may not completely eliminate variation from the target.
That is why tolerance design must be used for all parts of a product to limit the
possibility of producing defective products. The tolerance around the target is
usually set by the design engineers; it is defined as the range within which
variation may take place. The tolerance limits are set after testing and
experimentation. The setting of the tolerance must be determined by criteria
such as the set target, the safety factors, the functional limits, the expected
quality level and the financial cost of any deviation from the target.

The safety factor measures how much greater the loss is when the functional
limits are exceeded than when the tolerance limits are exceeded:

safety factor = sqrt(A0 / A)

with A0 being the loss incurred when the functional limits are exceeded and A
being the loss when the tolerance limits are exceeded. The tolerance
specification for the response factor will then be:

tolerance = (functional limit) / (safety factor)

Example:

The functional limits of a conveyor motor are +/- 0.05 (i.e. 5%) of the nominal
RPM. The adjustments made at the audit station before a motor leaves the
company cost $2.50, and the cost associated with a defective motor once it has
been sold is on average $180.

Find the tolerance specification for a 2500 RPM motor.

Solution:

We need first of all to find the safety factor, which is determined by the losses
incurred when the functional limits and the tolerance limits are exceeded:

safety factor = sqrt(A0 / A) = sqrt(180 / 2.5) = sqrt(72) = 8.49 (approximately)

Now we can determine the tolerance specification. The tolerance specification
will be the value of the response factor plus or minus the allowed variation from
the target.

Tolerance specification for the response factor:

tolerance = 0.05 / 8.49 = 0.00589 (approximately)

The variation from the target:
2500 * 0.00589 = 14.73

The tolerance specification will be 2500 +/- 14.73 RPM.
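
A short sketch of the same computation follows; the function and parameter names are mine, and the formulas are the safety-factor and tolerance relations used above.

import math

def tolerance_specification(nominal, functional_limit, loss_at_functional_limit,
                            loss_at_tolerance_limit):
    # tolerance = functional_limit / safety factor, safety factor = sqrt(A0 / A)
    safety_factor = math.sqrt(loss_at_functional_limit / loss_at_tolerance_limit)
    tolerance_fraction = functional_limit / safety_factor
    return safety_factor, tolerance_fraction, nominal * tolerance_fraction

phi, frac, spread = tolerance_specification(
    nominal=2500,                  # 2500 RPM motor
    functional_limit=0.05,         # functional limits are +/- 0.05 of the response
    loss_at_functional_limit=180,  # A0: average cost once a defective motor is sold
    loss_at_tolerance_limit=2.5,   # A: adjustment cost at the audit station
)
print(f"safety factor = {phi:.2f}, tolerance fraction = {frac:.5f}")
print(f"tolerance specification: 2500 +/- {spread:.2f} RPM")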

2.9 OVERVIEW OF THE CONTRIBUTIONS OF SHINGEO

Shingeo

Mistake proofing, or Poka-Yoke, was pioneered by Shigeo Shingo and
detailed in his Zero Defects Model. Poka-Yoke is defined as a simple,
inexpensive device that is not operator dependent, built into the production
process at the source of the operation, for the purpose of preventing safety
hazards and quality defects 100% of the time.

It has many applications, from a production environment to a lean office or
paper-trail process, and is used as a method for introducing a mistake-proofing
idea into a process to eliminate defects in that process.

Examples in everyday include:

Bathroom sinks have a mistake-proofing device: the little hole near
the top of the sink that helps prevent overflows
An iron turns off automatically when it is left unattended or when it is
returned to its holder
The window in the envelope is not only a labour-saving device; it prevents
the contents of an envelope intended for one person being inserted in an
envelope addressed to another

Shigeo Shingo's life-long work has contributed to the well-being of everyone in
the world. Shigeo Shingo, along with Taiichi Ohno, Kaoru Ishikawa and others,
has helped to revolutionise the way we manufacture goods. His improvement
principles vastly reduce the cost of manufacturing, which means more products
for more people. They make the manufacturing process more responsive while
opening the way to new and innovative products with fewer defects and better
quality.

He was the first to create some of the strategies for continuous and total
involvement of all employees. Shingo's never-ending spirit of inquiry
challenged the status quo at every level; he proposed that everything could be
improved. Shingo believed that inventory is not just a necessary evil, but that all
inventory is an absolute evil. He is one of the pioneers of change management.
He brought about many new concepts such as ZD (Zero Defects), the shift from
using statistics for acceptance or rejection in SQC (Statistical Quality Control) to
SPC (Statistical Process Control), SMED (Single Minute Exchange of Dies),
POKA-YOKE (mistake proofing), and defining processes and operations in the
two dimensions of VA (Value Addition) and non-VA.

2.10 CONCEPTS OF QUALITY CIRCLE

Quality circle

Quality is conformance to the claims made. A quality circle is a volunteer group
composed of workers who meet together to discuss workplace improvement
and make presentations to management with their ideas. Typical topics are
improving safety, improving product design, and improvement of the
manufacturing process. Quality circles have the advantage of continuity: the
circle remains intact from project to project.

Quality Circles were started in Japan in 1962 (Kaoru Ishikawa has been credited
with creating Quality Circles) as another method of improving quality. The
movement in Japan was coordinated by the Japanese Union of Scientists and
Engineers (JUSE). Prof. Ishikawa, who believed in tapping the creative potential
of workers, innovated the Quality Circle movement to give Japanese industry
that extra creative edge. A Quality Circle is a small group of employees from the
same work area who voluntarily meet at regular intervals to identify, analyse
and resolve work-related problems. This can not only improve the performance
of any organisation, but also motivate and enrich the work life of employees.

Quality Circles have been used successfully in many highly innovative
companies in the Scandinavian countries, and the practice is recommended by
many economics and business scholars.
Dictionary meaning of Quality circle is: A group of employees who
perform similar duties and meet at periodic intervals, often with management, to
discuss work-related issues and to offer suggestions and ideas for improvements,
as in production methods or quality control.

Business Dictionary defines Quality Circles as: small groups of employees
meeting on a regular basis within an organization for the purpose of discussing
and developing management issues and procedures. Quality circles are
established with management approval and can be important in implementing
new procedures. While results can be mixed, on the whole, management has
accepted quality circles as an important organizational methodology.

As per the Small Business Encyclopedia, a quality circle is identified as a
participatory management technique that enlists the help of employees in
solving problems related to their own jobs. In their volume Japanese Quality
Circles and Productivity, Joel E. Ross and William C. Ross define a quality circle
as "a small group of employees doing similar or related work who meet
regularly to identify, analyze, and solve product-quality and production
problems and to improve general operations. The circle is a relatively
autonomous unit (ideally about ten workers), usually led by a supervisor or a
senior worker and organized as a work unit." Employees who participate in
quality circles usually receive training in formal problem-solving methods, such
as brainstorming, Pareto analysis, and cause-and-effect diagrams, and then are
encouraged to apply these methods to either specific or general company
problems. After completing an analysis, they often present their findings to
management and then handle implementation of approved solutions.

Although most commonly found in manufacturing environments, quality
circles are applicable to a wide variety of business situations and problems. They
are based on two ideas: that employees can often make better suggestions for
improving work processes than management; and that employees are motivated
by their participation in making such improvements. Thus, implemented
correctly, quality circles can help a small business reduce costs, increase
productivity, and improve employee morale. Other potential benefits that may
be realized by a small business include greater operational efficiency, reduced
absenteeism, improved employee health and safety, and an overall better
working climate. In their book Production and Operations Management, Howard J.
Weiss and Mark E. Gershon called quality circles "the best means today for
meeting the goal of designing quality into a product."
The interest of U.S. manufacturers in quality circles was sparked by
dramatic improvements in the quality and economic competitiveness of Japanese
goods in the post-World War II years. The emphasis of Japanese quality circles
was on preventing defects from occurring rather than inspecting products for
defects following a manufacturing process. Japanese quality circles also
attempted to minimize the scrap and downtime that resulted from part and
product defects. In the United States, the quality circle movement evolved to
encompass the broader goals of cost reduction, productivity improvement,
employee involvement, and problem-solving activities.

Background

Quality circles were originally associated with Japanese management and
manufacturing techniques. The introduction of quality circles in Japan in the
postwar years was inspired by the lectures of W. Edwards Deming (1900-1993),
a statistician for the U.S. government. Deming based his proposals on the
experience of U.S. firms operating under wartime industrial standards. Noting
that American management had typically given line managers and engineers
about 85 percent of the responsibility for quality control and line workers only
about 15 percent, Deming argued that these shares should be reversed. He
suggested redesigning production processes to more fully account for quality
control, and continuously educating all employees in a firm, from the top down,
in quality control techniques and statistical control technologies. Quality
circles were the means by which this continuous education was to take place for
production workers.

Deming predicted that if Japanese firms adopted the system of quality
controls he advocated, nations around the world would be imposing import
quotas on Japanese products within five years. His prediction was vindicated.
Deming's ideas became very influential in Japan, and he received several
prestigious awards for his contributions to the Japanese economy.

The principles of Deming's quality circles simply moved quality control to
an earlier position in the production process. Rather than relying upon post-
production inspections to catch errors and defects, quality circles attempted to
prevent defects from occurring in the first place. As an added bonus, machine
downtime and scrap materials that formerly occurred due to product defects
were minimized. Deming's idea that improving quality could increase
productivity led to the development in Japan of the Total Quality Control (TQC)
concept, in which quality and productivity are viewed as two sides of a coin.
TQC also required that a manufacturer's suppliers make use of quality circles.
Quality circles in Japan were part of a system of relatively cooperative
labor-management relations, involving company unions and lifetime
employment guarantees for many full-time permanent employees. Consistent
with this decentralized, enterprise-oriented system, quality circles provided a
means by which production workers were encouraged to participate in company
matters and by which management could benefit from production workers'
intimate knowledge of the production process. In 1980 alone, changes resulting
from employee suggestions resulted in savings of $10 billion for Japanese firms
and bonuses of $4 billion for Japanese employees.

Active American interest in Japanese quality control began in the early
1970s, when the U.S. aerospace manufacturer Lockheed organized a tour of
Japanese industrial plants. This trip marked a turning point in the previously
established pattern, in which Japanese managers had made educational tours of
industrial plants in the United States. Lockheed's visit resulted in the gradual
establishment of quality circles in its factories beginning in 1974. Within two
years, Lockheed estimated that its fifteen quality circles had saved nearly $3
million, with a ratio of savings to cost of six to one. As Lockheed's successes
became known, other firms in the aerospace industry began adopting quality
circles. Thereafter quality circles spread rapidly throughout the U.S. economy; by
1980, over one-half of firms in the Fortune 500 had implemented or were
planning on implementing quality circles.

In the early 1990s, the U.S. National Labor Relations Board (NLRB) made
several important rulings regarding the legality of certain forms of quality
circles. These rulings were based on the 1935 Wagner Act, which prohibited
company unions and management-dominated labor organizations. One NLRB
ruling found unlawful those quality programs that were established by the firm,
that featured agendas dominated by the firm, and that addressed the conditions
of employment within the firm. Another ruling held that a company's labor-
management committees were in effect labor organizations used to bypass
negotiations with a labor union. As a result of these rulings, a number of
employer representatives expressed their concern that quality circles, as well as
other kinds of labor-management cooperation programs, would be hindered.
However, the NLRB stated that these rulings were not general indictments
against quality circles and labor-management cooperation programs, but were
aimed specifically at the practices of the companies in question.

Requirements for Successful Quality Circles

In his book Productivity Improvement: A Guide for Small Business, Ira B.
Gregerman outlined a number of requirements for a small business
contemplating the use of quality circles. First, the small business owner should
be comfortable with a participative management approach. It is also important
that the small business have good, cooperative labor-management relations, as
well as the support of middle managers for the quality circle program. The small
business owner must be willing and able to commit the time and resources
needed to train the employees who will participate in the program, particularly
the quality circle leaders and facilitators. It may even be necessary to hire outside
facilitators if the time and expertise does not exist in-house. Some small
businesses may find it helpful to establish a steering committee to provide
direction and guidance for quality circle activities. Even if all these requirements
are met, the small business will only benefit from quality circles if employee
participation is voluntary, and if employees are allowed some input into the
selection of problems to be addressed. Finally, the small business owner must
allow time for the quality circles to begin achieving desired results; in some
cases, it can take more than a year for expectations to be met.

But successful quality circles offer a wide variety of benefits for small
businesses. For example, they serve to increase management's awareness of
employee ideas, as well as employee awareness of the need for innovation within
the company. Quality circles also serve to facilitate communication and increase
commitment among both labor and management. In enhancing employee
satisfaction through participation in decision-making, such initiatives may also
improve a small business's ability to recruit and retain qualified employees. In
addition, many companies find that quality circles further teamwork and reduce
employee resistance to change. Finally, quality circles can improve a small
business's overall competitiveness by reducing costs, improving quality, and
promoting innovation.

2.11 JAPANESE 5S PRINCIPLES

5S
The 5Ses referred to in LEAN are:

Sort
Straighten
Shine
Standardize
Sustain

In fact, these 5S principles are actually loose translations of five Japanese
words:

Seiri - Put things in order (remove what is not needed and keep
what is needed)
Seiton - Proper arrangement (place things in such a way that they
can be easily reached whenever they are needed)
Seiso - Cleanliness (keep things clean and polished; no trash or dirt in
the workplace)
Seiketsu - Purity (maintain cleanliness after cleaning; perpetual
cleaning)
Shitsuke - Commitment (a teaching and attitude towards any
undertaking that inspires pride in, and adherence to, the standards
established for the first four components)

Another way to summarize 5S is:

A place for everything (the first three Ss) and everything in its place (the last two Ss).

2.12 8D METHODOLOGY

8 Disciplines

The "8D (8 Disciplines)" process is another problem solving method that is


often required specifically in the automotive industry. One of the distinguishing
characteristics of the 8D methodology is its emphasis on "teams."

The steps to 8D analysis are:

1. Use a Team Approach
2. Describe the Problem
3. Implement and Verify Interim Actions (Containment)
4. Identify Potential Causes
5. Choose/Verify Corrective Actions
6. Implement Permanent Corrective Actions
7. Prevent Recurrence
8. Congratulate Your Team

SUMMARY

The principles and philosophies behind the evolution of various quality
management techniques are elaborated in this unit. Walter A. Shewhart focused
his work on ensuring control of industrial quality processes and laid the
foundation for evolutionary thinking on quality and its management. William
Edwards Deming, a contemporary of Shewhart, developed the PDSA cycle and
named it the Shewhart cycle; later it came to be known as the PDCA cycle.
Deming attempted to balance standardized change and continuous
improvement in the organization. Joseph Juran's contribution is the Quality
Trilogy. While emphasizing quality planning during the design of a product, he
laid more stress on quality control during operations. The famous cost of quality
curve used to identify the optimum conformance level was developed by Juran.
The road map for quality planning and the steps to continuous quality
improvement are among the contributions of Juran. Philip B. Crosby identified
five absolutes of quality management and prescribed a quality vaccine. To
achieve zero defects in the organization he spelt out a fourteen-step quality
programme and firmly believed that zero defects is an achievable goal. Masaaki
Imai introduced Kaizen to the world and established the Kaizen Institute, which
propagates his ideas throughout the world. Mitchell Jay Feigenbaum pioneered
studies in chaos theory; through his publications, he was able to disseminate the
logistic maps he developed. Kaoru Ishikawa suggested a diagram to identify the
root cause of a problem: the fishbone diagram, also known as the cause and
effect diagram, is predominantly used in fixing quality-related problems, and its
construction methodology is also presented. Dr. Genichi Taguchi popularized
the concept of design of experiments; his complete philosophy of off-line quality
control and his innovations in the design of experiments are deliberated in
detail. Shigeo Shingo introduced the Zero Defects Model to the production
community and advocated Poka-Yoke. The concept of the quality circle,
credited to Ishikawa, is introduced, and the requirements for successful quality
circles and their evolution are presented. The Japanese 5S principles, namely
Seiri, Seiton, Seiso, Seiketsu and Shitsuke, and the 8 disciplines to be focused on
are deliberated.

REVIEW QUESTIONS

1. Explain the influence of Walter A. Shewhart on ensuring quality in an
organization.
2. Explain the Shewhart Cycle and elaborate on Deming's contributions to it.
3. Illustrate the Juran Trilogy and demonstrate how quality is ensured
through it.
4. Enumerate the 14-step quality programme advocated by Crosby.
5. Explain how the logistic map was developed by Feigenbaum.
6. Illustrate the Ishikawa diagram and demonstrate its usefulness in problem
solving.
7. What is the Taguchi loss function? Explain its principles of operation.
8. Highlight the concept of the quality circle and explain the requirements for
carrying it out successfully.
9. Explain the Japanese 5S principles.
10. Discuss the 8D methodology with examples.
