
Lean Six Sigma Project

Presentation Template
[PROJECT TITLE]
[Project Period from mmm-yy to mmm-yy]

[Company Name
Division / Dept. Team]
Introduction
• About the company
• Products, Location
• Wh
DEFINE
A. Project Charter should contain
a. Project Title:
1. It should clearly state the name of the process to be improved
b. Business Case
1. What is the “Pain” issue? (Problem symptoms)
2. How long has it been there? (Historical trends)
3. Current financial and non-financial impacts of the issue
4. Future impacts if improvement is not made now
5. How is the issue related to strategy?
DEFINE
A. Project Charter should contain (contd.)
c. Goals and objectives
1. SMART Objective for improvement
2. Stage targets (if long-term, stage-by-stage improvement is required)
d. Expected Benefits from the project
1. Involve Finance in quantifying the expected
benefits (quantify reduction in Cost of Poor
Quality)
2. For stage-by-stage improvement, what are
short, medium, long term objectives
e. Team Members / Leader, Project Sponsor
f. Project Timeframe (initial)
1. Project plan in Gantt chart /network
diagram form
DEFINE
B. “SIPOC” or “COPIS” Diagram
a. Correctly identify SIPOC/COPIS (supplier, input, process, output,
customer)
b. Make a high level block diagram of the process relating to the issue

C. “VOC”
a. Understand “Voice of Customer” based on available data (customer
SLA, specification, feedback) or
b. Generate a VOC data collection plan including MSA.
D. CTQs and related Big Ys
a. Identify the “Critical to Quality” parameters of deliverables to the customer. Define the Big Y(s) of the process
b. Classify “Y”s as per Kano Model – Hygiene, Higher the better, Delight
factors
c. Make CTQ Tree to cascade Big Ys into smaller deliverables (Y1, Y2…)
d. Prepare CTQ Specification Table to clearly define all Ys.
e. Defect Definition – Define what is not acceptable
DEFINE
E. Stratification & Prioritization of Ys
a. Collect historical / current data regarding the Ys
b. Stratify data for smaller Ys
c. Analyse historical / collected data, e.g. by applying Regression Analysis, to identify significant smaller Ys (e.g. Y = f(Y1, Y2, …, Yn))
F. Project Scope / Boundary
a. Make Pareto Chart and identify major Ys to focus upon
b. Define Project Scope/Boundary using SIPOC
c. Define the boundary of the project in terms of which Ys will be included
in the project and what will be done with the remaining Ys. (e.g. plan
another project, consider it in the next “6σ projects wave”)
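The Pareto step in F.a can be sketched as follows — the defect counts are hypothetical, and the standard 80% rule is used to pick the “vital few” Ys to keep in scope:

```python
# Hypothetical defect counts per smaller Y from historical data
defects = {"Y1": 120, "Y2": 45, "Y3": 30, "Y4": 15, "Y5": 10}

total = sum(defects.values())
ranked = sorted(defects.items(), key=lambda kv: -kv[1])

# Keep the "vital few" Ys that together account for ~80% of defects
in_scope, cum = [], 0
for name, count in ranked:
    in_scope.append(name)
    cum += count
    if cum / total >= 0.80:
        break
print(in_scope)
```

The remaining Ys would be deferred, e.g. to another project or the next wave.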

G. Set Baseline
a. Define baseline performance for all Ys in the project scope based on
historical / collected data
b. Work out process capability (Sigma Level) for selected Ys based on
historical /collected data.
c. Summarize the “as is” process performance with comments
DEFINE
H. Set Targets
a. Obtain benchmark data for selected Y based on VOC / SLA and/or
competitive / best-in-class performance benchmarks
b. Define SMART targets for selected Ys and target process capabilities
(Sigma Levels) for them
c. Test the targets statistically to verify if they are significantly different from
the current performance level (Hypothesis test)
d. Review and re-define “defects” and their project targets
e. Review and re-define the project timeframe and milestones with due
dates

I. Approval of the Charter


a. Document the Project Charter
b. Champion and Sponsor review and approve the Charter
c. Charter is registered with Program Management Office for follow
up.
DEFINE – Key Points
• The Define phase is all about Ys; don’t bring in Xs (causal factors) at this stage.
• Don’t propose solutions in the Define phase.
• The purpose of data analysis is only to
clearly define the project scope and set
SMART targets for Ys based on VOC (and
other considerations).
MEASURE
A. Data Collection Plan
a. It is preferable to collect current data from the process; however, in some cases historical data may be used (if its reliability is well established)
b. Use a Prioritisation Matrix or FMEA (if existing) to identify the critical data to be collected.

c. Prepare Data Collection Plan specifying:


i. Parameter to be measured
ii. Whether discrete or continuous
iii. Which location
iv. Measurement method (procedure/code/standard)
v. Sample size and sampling method: how were they determined?
vi. How will the data be summarized?
vii. Confirmation of calibration of measuring devices and
calibration plan if necessary.
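For item v, one common way to justify the sample size is the standard formula for estimating a proportion (e.g. a defect rate). A minimal sketch, assuming 95% confidence and a ±5% margin of error:

```python
from math import ceil
from statistics import NormalDist

# Sample size to estimate a proportion to within margin E at a given
# confidence level, assuming the worst case p = 0.5
confidence = 0.95
margin = 0.05
z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)   # two-sided z value
p = 0.5                                              # most conservative
n = ceil(z**2 * p * (1 - p) / margin**2)
print(n)
```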
MEASURE
B. Measurement System Analysis (MSA)
a. Verify calibration status and accuracy of measuring
devices used in the process. Calibration error should be
zero.
b. Demonstrate MSA (if applicable):
1. Gauge R&R for continuous variables measurement
2. Attribute Agreement Analysis (statistical test) for attribute Ys
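The attribute MSA can be sketched as percent-agreement calculations between appraisers and against a known reference. The judgements below are hypothetical; a full study would also report kappa statistics:

```python
# Hypothetical attribute MSA: two appraisers judging the same 10 items
# (P = pass, F = fail), plus the known reference judgement
reference  = ["P", "P", "F", "P", "F", "P", "P", "F", "P", "P"]
appraiser1 = ["P", "P", "F", "P", "F", "P", "F", "F", "P", "P"]
appraiser2 = ["P", "P", "F", "P", "P", "P", "P", "F", "P", "P"]

def pct_agreement(a, b):
    """Percentage of items on which two series of judgements agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a) * 100

between = pct_agreement(appraiser1, appraiser2)
vs_ref1 = pct_agreement(appraiser1, reference)
vs_ref2 = pct_agreement(appraiser2, reference)
print(between, vs_ref1, vs_ref2)
```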
MEASURE
C. Actual Measurement
a. Show sample of raw data in tabulated form
b. Summarise data
c. Give comments on the process performance
based on data
MEASURE – Assess Process Capability

D. Process Capability Assessment
a. Carry out Process Capability Analysis and determine:
1. Cp, Cpk
2. Pp, Ppk
3. Current defect ppm and process Sigma Level
E. Estimate Process Sigma Level
Note: Assessment of Process Capability and measurement of Process Sigma Level may be carried out in the Define phase if reliable historical data is available.
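The capability indices above can be sketched as follows, using hypothetical measurements and spec limits. Note the overall standard deviation is used here, which strictly corresponds to Pp/Ppk; Cp/Cpk would use the within-subgroup sigma:

```python
from statistics import NormalDist, mean, stdev

# Hypothetical measurements against spec limits LSL = 9.0, USL = 11.0
data = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.4, 9.7, 10.1, 10.0]
lsl, usl = 9.0, 11.0

mu, sigma = mean(data), stdev(data)   # overall sigma (Pp/Ppk style)
cp  = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)

# Expected out-of-spec fraction, assuming a normally distributed process
nd = NormalDist(mu, sigma)
ppm = (nd.cdf(lsl) + (1 - nd.cdf(usl))) * 1_000_000
print(f"Cp={cp:.2f} Cpk={cpk:.2f} ppm={ppm:.0f}")
```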
ANALYZE
ANALYZE – Process
1. Detailed Process Map in appropriate ways
a. Flowchart or
b. Spaghetti Diagram
c. Function diagram of the process system (TRIZ)
d. Any other
2. Identify (for lean)
a. Value Adding activities,
b. Non-value Adding but necessary activities
c. Wasteful activities
3. Identify constraints / bottlenecks in the flow
4. Make Value Stream Maps
a. As is VSM
b. “Should be” or target VSM
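One useful summary figure for the VSM in steps 2–4 is process cycle efficiency (PCE): value-adding time as a share of total lead time. A sketch with hypothetical step timings:

```python
# Hypothetical VSM times (minutes) classified during the lean analysis
value_adding     = [30, 45, 20]       # VA steps
necessary_non_va = [10, 15]           # NNVA (e.g. mandated inspection)
waste            = [120, 60, 240]     # waiting, rework, transport, ...

total_lead_time = sum(value_adding) + sum(necessary_non_va) + sum(waste)
pce = sum(value_adding) / total_lead_time * 100   # process cycle efficiency
print(f"Lead time = {total_lead_time} min, PCE = {pce:.1f}%")
```

The “should be” VSM would then target a higher PCE by shrinking the waste column.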
ANALYZE - Data
1. Use PFMEA or a Cause-and-Effect diagram to identify all “potential” causes
2. Prepare a Cause Validation Table and apply the appropriate validation methods
3. Classify each potential cause into “Strong”, “Weak” and “Insignificant/Irrelevant” categories
4. List the “possible” causes separately for further validation
5. Prepare a validation plan for the possible causes
ANALYZE - Data
6. Validate “possible” causes by using appropriate tests/tools as needed
7. Apply deeper analysis to drill down from validated causes to their “root causes”
8. Conclude the Analysis by clearly identifying the validated possible causes and their root causes
9. If DOE / Regression is carried out, predict optimum settings and validate them
IMPROVE
A. Based on the identified and validated root causes, generate possible solutions.
B. Use creative / innovative thinking to
explore solutions
C. Try to incorporate mistake proofing
measures.
D. Prioritize solutions using appropriate
methods/criteria
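Step D’s prioritization can be sketched as a weighted-criteria matrix. The solutions, criteria and weights below are hypothetical placeholders:

```python
# Hypothetical weighted-criteria matrix for candidate solutions
criteria = {"impact": 0.5, "cost": 0.3, "ease": 0.2}   # weights sum to 1
scores = {   # 1 (worst) .. 5 (best) on each criterion
    "Poka-yoke fixture":  {"impact": 5, "cost": 3, "ease": 4},
    "Extra inspection":   {"impact": 2, "cost": 4, "ease": 5},
    "Process automation": {"impact": 5, "cost": 1, "ease": 2},
}

# Weighted score per solution; the highest score is prioritized first
weighted = {sol: sum(criteria[c] * v for c, v in s.items())
            for sol, s in scores.items()}
best = max(weighted, key=weighted.get)
print(best, weighted[best])
```

Note the mistake-proofing option wins here precisely because “impact” carries the largest weight, echoing step C above.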
IMPROVE
E. Plan for implementation
a. Trial runs / pilot implementation
i. Trial / pilot implementation plan
ii. Risk analysis of the pilot / trial plan
iii. Plan for MSA and data collection
iv. Method to evidence improvement
b. Finalize full scale implementation plan, with phases as
appropriate
c. Carry out thorough risk analysis of the agreed solution for full
scale implementation
d. Define stage gates including handing over the process to the
process owner.
CONTROL
A. Document / amend Procedures,
standards, work instructions incorporating
the improvements in the process
B. Define control plan for Xs
C. Install appropriate control charts (SPC)
for Ys and critical Xs.
D. Document “Out of Control Guidelines”
for the operators
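The SPC step (C) can be sketched for an Individuals & Moving Range (I-MR) chart, a common choice when Ys are measured one at a time. The data are hypothetical; the constants 2.66 (E2) and 3.267 (D4) are the standard values for a moving range of size 2:

```python
# I-MR chart control limits from hypothetical individual measurements
data = [10.2, 10.0, 10.4, 9.9, 10.1, 10.3, 10.0, 10.2, 9.8, 10.1]

mr = [abs(a - b) for a, b in zip(data[1:], data)]   # moving ranges (n=2)
xbar = sum(data) / len(data)
mr_bar = sum(mr) / len(mr)

# Standard I-chart constants for subgroup size 2: E2 = 2.66, D4 = 3.267
ucl_x, lcl_x = xbar + 2.66 * mr_bar, xbar - 2.66 * mr_bar
ucl_mr = 3.267 * mr_bar
print(f"X: {lcl_x:.2f}..{ucl_x:.2f}, MR UCL: {ucl_mr:.2f}")
```

Points outside these limits would trigger the “Out of Control Guidelines” in step D.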
CONTROL
E. Evaluate Process Sigma Level and compare with pre-
improvement Sigma level
F. Update Process FMEA and recalculate RPNs, which should
go down.
G. Carry out cost-benefit analysis, include non-financial
benefits
H. Identify opportunities for further improvement
a. In the same process and
b. Horizontal deployment opportunities (other similar processes)
I. Give details of Reward and Recognition within the company.
J. Share knowledge by publishing / presenting the case study
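The RPN recalculation in item F above is the product Severity × Occurrence × Detection; after the improvement, lower occurrence and better detection should shrink it. A sketch with hypothetical ratings:

```python
# Hypothetical FMEA line item: Severity x Occurrence x Detection = RPN
before = {"severity": 8, "occurrence": 6, "detection": 5}
after  = {"severity": 8, "occurrence": 2, "detection": 3}  # post-improvement

def rpn(rating):
    return rating["severity"] * rating["occurrence"] * rating["detection"]

rpn_before, rpn_after = rpn(before), rpn(after)
print(rpn_before, rpn_after)   # the improvement should reduce the RPN
```

Severity typically stays fixed (the failure’s consequence is unchanged); the improvement acts on occurrence and detection.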
Lessons Learnt
• Identify what lessons were learnt regarding the problem-solving method, project management, teamwork, etc.
