
Software Metrics

Donna Dulo
Topics
Definitions
Data Collection
Principles
Objectives
Impacts
Issues
Measures
Development
Acquisition
Good Measures
Important Considerations
Measures & Metrics (Indicators)
Measures:
Quantifiable dimension, attribute, or amount of any
aspect of a program, product or process
The raw data which are associated with various
elements of the process and product
Metrics or Indicators:
Computed from measures
Quantifiable indices used to compare products,
processes, or projects or to predict their outcomes
Used to monitor requirements, predict resources,
track progress, and understand costs
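
To make the distinction concrete, here is a minimal Python sketch (not from the original slides) of how indicators such as defect density and productivity are computed from raw measures; the measure names and values are hypothetical.

```python
# Minimal sketch: computing metrics (indicators) from raw measures.
# All measure names and values here are hypothetical examples.

# Raw measures associated with the product and process
measures = {
    "size_ksloc": 42.0,        # thousands of source lines of code
    "defects_found": 105,      # defects logged during test
    "effort_staff_months": 18, # total development effort
}

# Metrics (indicators) computed from those measures
defect_density = measures["defects_found"] / measures["size_ksloc"]        # defects per KSLOC
productivity   = measures["size_ksloc"] / measures["effort_staff_months"]  # KSLOC per staff-month

print(f"Defect density: {defect_density:.1f} defects/KSLOC")
print(f"Productivity:   {productivity:.2f} KSLOC/staff-month")
```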
Principles of Data Collection
Data is collected in accordance with specific
objectives and a plan
Choice of data to be collected is based on a
model or hypothesis about the process being
examined
Data collection process must consider the impact
of data gathering on the entire organization
Data collection plan must have management
support
Objectives of Data Collection
To gain an understanding about some item or
process as part of a research or development
study
To evaluate some product or activity to see if
it meets acceptance criteria
To control an activity
To develop rate or trend indicators for use in
making a prediction
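
For the last objective, here is a minimal sketch, using made-up weekly defect counts, of how a rate or trend indicator could be derived for prediction; the least-squares slope used here is just one possible choice.

```python
# Sketch: turning weekly defect-arrival counts into a trend indicator.
# The counts are made up; a least-squares slope is one simple choice
# of rate indicator that could feed a prediction.

weekly_defects = [30, 28, 25, 21, 18, 16]  # defects found per week (example data)

def slope(ys):
    """Least-squares slope of ys against week index 0, 1, 2, ..."""
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

trend = slope(weekly_defects)
print(f"Defect arrivals are changing by about {trend:+.1f} per week")
# A negative trend suggests the find rate is tapering off and could be
# projected forward to estimate when arrivals fall below a threshold.
```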
Impact of Data Collection
Two fundamental issues that data gathering plans must
consider:
Effects of measurement on the people
Effects of the people on the measurements
Unless it is continually emphasized that process measures
will not be used to evaluate individual performance, people
will often go to great lengths to make the numbers look good
Collecting data is tedious and must generally be done by
people who are busy with other tasks
Essential to motivate people to collect the needed data in a
timely manner
Automate as much data collection as is practical
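
As an example of automating collection where practical, the following sketch pulls a commit-count measure directly from a Git repository's history instead of asking busy staff to report it; the repository path and date range are placeholder assumptions.

```python
# Sketch: automated collection of a simple staffing/activity measure
# (commits per author) straight from a Git repository, so that busy
# people do not have to report it by hand. The repository path and
# date range are placeholder assumptions.

import subprocess
from collections import Counter

def commits_per_author(repo_path=".", since="3 months ago"):
    """Count commits per author using `git log` (requires git installed)."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log", f"--since={since}", "--format=%an"],
        capture_output=True, text=True, check=True,
    ).stdout
    return Counter(name for name in out.splitlines() if name)

if __name__ == "__main__":
    for author, count in commits_per_author().most_common():
        print(f"{count:5d}  {author}")
```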
Issues with Data Collection
Can be expensive and time consuming
Affects the busiest people on the project
May be viewed as personally threatening
Can be unclear about what data needs to be
collected and how that data will be used
Data Collection Plan
Contents answer the following questions:
What data is needed, by whom, and for what purpose?
What are the data specifications?
Who will collect the data?
Who will support data collection?
How will the data be collected?
How will the data be validated?
How will the data be managed?
Development data should be part of an offeror's
proposal and negotiated
What about acquisition data?
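
One way to make the plan concrete is to record each data item as a structured entry that answers the questions above; the fields and values in this sketch are hypothetical.

```python
# Sketch: one entry in a data collection plan, recorded as structured
# data so the plan's questions have explicit answers. Every field value
# below is a hypothetical placeholder.

defect_data_item = {
    "name": "defects_found_in_test",
    "needed_by": "software quality lead",
    "purpose": "track defect density and test effectiveness",
    "specification": "unique, confirmed defects against the current baseline",
    "collector": "test team (via the defect tracker)",
    "support": "tool administrator exports the data weekly",
    "collection_method": "automated export from the defect tracker",
    "validation": "spot-check 10% of records against test logs",
    "management": "stored in the metrics repository, retained 5 years",
}

for field, value in defect_data_item.items():
    print(f"{field:18s}: {value}")
```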
Development Measures
Size
SLOC, or function points (MIS) / feature points (real-time, embedded)
EXACTLY what are you measuring? Definition? (see the SLOC-counting sketch after this slide)
What incentives are there for high/low counts?
Quality
Modules with higher defect density so far (found bugs) tend
to be higher risk for remaining (latent) bugs
Staffing
Very insightful because
Even rats jump a sinking ship
People will eventually leave if you abuse them
Defects go up with inexperience
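
To illustrate the definition question raised under Size, here is a minimal sketch of a SLOC counter whose totals change depending on whether blank and comment-only lines are counted; the '#' comment convention (Python-style source) is an assumption.

```python
# Sketch: a SLOC counter that makes its counting rules explicit.
# Whether blank lines and comment-only lines are counted is a
# definition choice, and the totals differ accordingly. The '#'
# comment convention is an assumption (Python-style source).

def count_sloc(path, include_blank=False, include_comments=False):
    """Count source lines in one file under explicit counting rules."""
    total = 0
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            stripped = line.strip()
            if not stripped:
                total += 1 if include_blank else 0
            elif stripped.startswith("#"):
                total += 1 if include_comments else 0
            else:
                total += 1
    return total

if __name__ == "__main__":
    # Count this script's own source to show how the totals diverge.
    print("Code lines only:     ", count_sloc(__file__))
    print("All physical lines:  ", count_sloc(__file__, True, True))
```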
Acquisition Measures
Completion of documents (plans, specs, ICDs,
etc.)
Contracting milestones (ECP/CCP milestones)
Post-baseline requirements volatility (see the sketch after this slide)
Efficacy of corrective actions
Action Item closure
Deviations from procedure/process/plan
Risks, risk assessment
Experience
Training
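
As an illustration of the volatility measure, here is a minimal sketch with made-up counts; the formulation shown (changes divided by the baselined total) is one common convention, not necessarily the one intended by the slide.

```python
# Sketch: post-baseline requirements volatility as a simple ratio.
# The counts are hypothetical, and the formulation shown (changes
# divided by the baselined total) is one common convention.

baseline_requirements = 250
added, modified, deleted = 12, 30, 5

volatility = (added + modified + deleted) / baseline_requirements
print(f"Requirements volatility since baseline: {volatility:.1%}")
```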
Goal-Driven Measurement Process
Measurement goals are derived from business
goals
Evolving mental models provide context and
focus
Goal-Question-Metric (GQM) translates goals
into executable measurement structures

"The driving forces in goal-driven measurement are the
business goals of your organization and the information
that you would like to have about your products, processes,
and resources to help you meet these goals."

CMU/SEI-96-HB-002, Goal-Driven Software Measurement: A Guidebook (1996)
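
To illustrate how GQM translates goals into executable measurement structures, here is a small sketch of a goal-question-metric tree; the particular goal, questions, and metrics are invented for the example.

```python
# Sketch: a Goal-Question-Metric (GQM) structure as plain data, tracing
# a business goal through questions down to concrete metrics. The goal,
# questions, and metrics below are invented for illustration.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Question:
    text: str
    metrics: List[str] = field(default_factory=list)

@dataclass
class Goal:
    statement: str
    questions: List[Question] = field(default_factory=list)

goal = Goal(
    "Reduce the number of defects delivered to the customer",
    [
        Question("Where are defects being introduced?",
                 ["defects per KSLOC by module", "defects by development phase"]),
        Question("Are inspections finding defects early?",
                 ["inspection yield", "defect escape rate to test"]),
    ],
)

print(f"Goal: {goal.statement}")
for q in goal.questions:
    print(f"  Q: {q.text}")
    for m in q.metrics:
        print(f"     metric: {m}")
```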


Characteristics of Good Measures
Robust
Repeatable, precise, insensitive to technique
Suggests a norm
Obvious whether more is better
Relate to specific process or product property
Suggest an improvement strategy
Are a natural result of the process
Should never create work
Simple
Predictable and trackable
Important Considerations
Avoid lies, damn lies, and statistics
Listen to what the measures are telling you
Measure once, analyze twice
Validate, then interpret
Never evaluate people with process data
Start small
Few metrics; much analysis
Automate where practical
Feed analysis results from bottom up
Let workers know what the data shows FIRST
It's never too late to start! But
Don't expect < 6 months of data to be very meaningful
End of Lesson
