
Software Process and Project Metrics

❚ Outline:
❙ In the Software Metrics Domain:
❘ product metrics
❘ project metrics
❘ process metrics
❙ Software Measurement
❘ size-oriented metrics
❘ function-oriented metrics
❙ Metrics for Software Quality

March 2004 Chapter 4 – R. S. Pressman SRIMCA 1


Measure, Metric, and Indicator
❚ Measure -- provides a quantitative indication of the extent,
amount, dimensions, capacity, or size of some product or
process attribute.
❚ Metric -- a quantitative measure of the degree to which a
system, component, or process possesses a given attribute.
❚ Software metrics -- a broad range of measurements for
computer software.
❚ Indicator -- a metric or combination of metrics that provides
insight into the software process, a software project, or the
product itself.



In the Process and Project Domains
❚ Process Indicators
❙ enable insight into the efficacy of an existing process
❙ allow assessment of the current work status
❙ Goal -- to lead to long-term software process improvement
❚ Project Indicators
❙ assess the status of an ongoing project
❙ track potential risks
❙ uncover problem areas before they go “critical”
❙ evaluate the project team’s ability to control product quality



Process Metrics and Software
Process Improvement

[Figure: the process sits at the center, shaped by people, technology,
the development environment, product and project characteristics,
customer characteristics, and business conditions.]
Measurement
❚ What to measure?
❙ errors uncovered before release
❙ defects delivered to and reported by end users
❙ work products delivered
❙ human effort expended
❙ calendar time expended
❙ schedule conformance
❚ At what level of aggregation?
❙ Individual?
❙ Team?
❙ Project?



Privacy Issues
❚ Should metrics be used for personnel evaluation?
❚ Some concerns:
❙ Privacy?
❙ Is the individual’s total assignment being measured?
❙ Are the items being measured the same as for other
individuals being measured?
❙ Are the conditions of measurement the same across
individuals?
❚ However, metrics can be useful for individual improvement.



Use of Software Metrics
❚ Use common sense and organizational sensitivity.
❚ Provide regular feedback to individuals and teams.
❚ Don’t use metrics to appraise individuals.
❚ Set clear goals and metrics.
❚ Never use metrics to threaten individuals or teams.
❚ Problem areas are not “negative” -- these data are merely indicators
for process improvement.
❚ Don’t obsess over a single metric to the exclusion of other important
metrics.
❚ Don’t rely on metrics to solve your problems.
❚ Beware of people performing to the metrics rather than for product quality or safety.



Statistical Software Process Improvement (SSPI)
❚ All errors and defects are categorized by origin.
❚ The cost to correct each error and defect is recorded.
❚ The number of errors and defects in each category is counted and
ranked in descending order.
❚ The overall cost of errors and defects in each category is computed.
❚ Resultant data are analyzed to uncover the categories that result in
the highest cost to the organization.
❚ Plans are developed to modify the process with the intent of
eliminating (or reducing) the class of errors and defects that is most costly.
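The counting and ranking steps above can be sketched in a few lines; the defect categories and correction costs below are purely illustrative, not data from these slides:

```python
from collections import defaultdict

# Hypothetical (category, cost-to-correct) records, one per defect.
defect_log = [
    ("specification", 300), ("logic", 120), ("specification", 250),
    ("logic", 95), ("interface", 60), ("logic", 80),
]

count = defaultdict(int)
cost = defaultdict(float)
for category, fix_cost in defect_log:
    count[category] += 1          # defects per category
    cost[category] += fix_cost    # overall correction cost per category

# Rank categories by overall cost, descending: the costliest class of
# defects is the first candidate for process change.
ranked = sorted(cost, key=cost.get, reverse=True)
for category in ranked:
    print(f"{category}: {count[category]} defects, cost {cost[category]:.0f}")
```

The point of the sketch is the ranking: effort goes first to the category with the largest total cost, not the largest count.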



Typical Causes of Product Defects



Example of Defect Analysis

[Fishbone diagram: causes of specification defects -- missing or
ambiguous specification, wrong customer queried, customer gave wrong
information, inadequate inquiries, used outdated information,
incorrect changes.]


Project Metrics
❚ Software Project Measures Are Tactical
❙ used by a project manager and a software team
❙ to adapt project work flow and technical activities
❚ The Intent of Project Metrics Is Twofold
❙ to minimize the development schedule by avoiding delays and
mitigating potential problems and risks
❙ to assess product quality on an ongoing basis and modify
the technical approach to improve quality
❚ Production Rates
❙ pages of documentation
❙ review hours
❙ function points
❙ delivered source lines
❙ errors uncovered during SW engineering tasks
Software Metrics
❚ Direct measures
❙ Cost and effort applied (in the software engineering process)
❙ Lines of code (LOC) produced
❙ Execution speed
❙ CPU utilization
❙ Memory size
❙ Defects reported over a certain period of time

❚ Indirect Measures
❙ Functionality, quality, complexity, efficiency, reliability,
maintainability.
Software Measurement
❚ Size-Oriented Metrics
❙ are derived by normalizing quality and/or productivity
measures by considering the “size” of the software that has
been produced.
❙ lines of code are often used as the normalization value.

project   LOC      effort  $(000)  pp. doc  errors  defects  people
alpha     12,100     24      168       365     134       29       3
beta      27,200     62      440     1,224     321       86       5
gamma     20,200     43      314     1,050     256       64       6
...       ...        ...     ...       ...     ...      ...     ...
Typical Size-Oriented Metrics
❚ Errors per KLOC
❚ Defects per KLOC
❚ Dollars per KLOC
❚ Pages of documentation per KLOC
❚ Errors per person month
❚ LOC per person month
❚ Dollars per page of documentation
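Using project alpha from the size-oriented table two slides back, these metrics work out as follows (a minimal sketch; effort is taken to be in person-months and cost in $000, as the column headings suggest):

```python
# Project "alpha": 12,100 LOC, 24 person-months of effort, $168,000 cost,
# 365 pages of documentation, 134 errors, 29 defects.
loc, effort_pm, cost_k, pages, errors, defects = 12_100, 24, 168, 365, 134, 29

kloc = loc / 1000
errors_per_kloc  = errors / kloc           # ~11.1 errors per KLOC
defects_per_kloc = defects / kloc          # ~2.4 defects per KLOC
dollars_per_kloc = cost_k * 1000 / kloc    # ~$13,884 per KLOC
pages_per_kloc   = pages / kloc            # ~30.2 pages per KLOC
loc_per_pm       = loc / effort_pm         # ~504 LOC per person-month
```

The same arithmetic, applied across all projects in the baseline, is what makes the cross-project comparisons on this slide possible.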



Software Measurement
❚ Function-Oriented Metrics
❙ use “functionality” as the measure
❙ derived from the “function point”
❙ computed using an empirical relationship
❙ based on countable (direct) measures of the software’s
information domain and assessments of software complexity
❚ Uses of Function-Oriented Metrics
❙ measuring the scale of a project
❙ normalizing other metrics, e.g., $/FP, errors/FP



Function Point Calculation

                                             Weighting Factor
measurement parameter           count    simple  average  complex
number of user inputs            ___  *     3       4        6     = ___
number of user outputs           ___  *     4       5        7     = ___
number of user inquiries         ___  *     3       4        6     = ___
number of files                  ___  *     7      10       15     = ___
number of external interfaces    ___  *     5       7       10     = ___
                                                     count_total  = ___



Function Point Calculation

Computing function points


Rate each factor on a scale of 0 to 5:
0 -- no influence, 1 -- incidental, 2 -- moderate,
3 -- average, 4 -- significant, 5 -- essential

1. does the system require reliable backup and recovery?


2. are data communications required?
3. are there distributed processing functions?
4. is performance critical?
........
14. is the application designed to facilitate change and ease of use by the user?
Function-Oriented Metrics

FP = count_total * [0.65 + 0.01 * sum of Fi]


Outcome:
errors per FP
defects per FP
$ per FP
pages of documentation per FP
FP per person_month
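Putting the weighting table and the formula together: the sketch below uses average weights throughout; the information-domain counts and the F1..F14 ratings are hypothetical.

```python
# (simple, average, complex) weights from the function-point table.
WEIGHTS = {
    "inputs":     (3, 4, 6),
    "outputs":    (4, 5, 7),
    "inquiries":  (3, 4, 6),
    "files":      (7, 10, 15),
    "interfaces": (5, 7, 10),
}
AVERAGE = 1  # index into each weight triple

# Hypothetical information-domain counts for an example system.
counts = {"inputs": 20, "outputs": 12, "inquiries": 8,
          "files": 4, "interfaces": 2}

count_total = sum(counts[p] * WEIGHTS[p][AVERAGE] for p in counts)

# F1..F14 complexity-adjustment answers, each 0 (no influence) to 5 (essential).
f = [4, 3, 2, 5, 3, 3, 2, 4, 3, 2, 3, 4, 2, 5]  # hypothetical ratings

fp = count_total * (0.65 + 0.01 * sum(f))
```

With these numbers, count_total is 226 and the adjustment factor is 1.10, giving roughly 249 function points; the resulting FP value then serves as the normalizing denominator for the metrics listed above.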



Function Point Extensions
❚ Function points emphasize the “data dimension”
❚ Transformations were added to capture the “functional dimension”
❚ Transitions were added to capture the “control dimension”



3-D Function Point Calculation



Reconciling Different Metrics

[Table: representative LOC required to build one function point, by language]
C++             64
Visual Basic    32
Metrics for Software Productivity
❚ LOC and FP Measures Are Often Used to Derive Productivity
Metrics
❚ 5 Important Factors That Influence SW Productivity
❙ people factors
❙ problem factors
❙ process factors
❙ product factors
❙ resource factors



Measures of Software Quality
❚ Correctness
❙ the degree to which the software performs its required function
❙ the most common measure for correctness is defects per KLOC
(per year)
❚ Maintainability
❙ the ease with which a program can be corrected if an error is found,
❙ adapted if the environment changes,
❙ or enhanced if the customer desires changes in requirements
❙ based on the time-oriented measure mean time to change (MTTC)
❙ spoilage -- a cost-oriented metric for maintainability



Measures of Software Quality (Cont’d)

❚ Integrity
❙ measures a system’s ability to withstand attacks (both accidental
and intentional) on its security; threat and security are defined as
probabilities

❙ integrity = sum [ 1 - threat * (1 - security) ]

❚ Usability -- an attempt to quantify “user friendliness”
❙ the physical/intellectual skill required to learn the system
❙ the time required to become moderately efficient in its use
❙ the net increase in productivity
❙ user attitudes toward the system
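The integrity formula can be sketched as follows, treating threat as the probability that an attack of a given type occurs and security as the probability that it is repelled; the probabilities below are illustrative:

```python
def integrity(attacks):
    """attacks: iterable of (threat, security) pairs, one per attack type."""
    return sum(1 - threat * (1 - security) for threat, security in attacks)

# A fairly likely attack (threat 0.25) against strong defenses
# (security 0.95) leaves integrity high; weak defenses (security 0.25)
# pull it down.
strong = integrity([(0.25, 0.95)])   # 1 - 0.25 * 0.05 = 0.9875
weak   = integrity([(0.25, 0.25)])   # 1 - 0.25 * 0.75 = 0.8125
```

Each term is close to 1 when an attack type is either unlikely or well defended, so integrity degrades only when a likely attack meets a weak defense.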



Defect Removal Efficiency
❚ A Quality Metric That Provides Benefit at Both the Project and
Process Level
❚ DRE = E / ( E + D )
E = # of errors found before delivery of the software to the end
user
D = # of defects found after delivery
❚ More generally,
DREi = Ei / ( Ei + Ei+1 )
Ei = # of errors found during software engineering activity i
Ei+1 = # of errors found during activity i+1 that are traceable to
errors not discovered during activity i
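A quick sketch of both forms of the metric, with made-up error counts:

```python
def dre(found_now, found_later):
    """Defect removal efficiency: errors caught in this activity versus
    those that leak into the next activity (or to the end user)."""
    return found_now / (found_now + found_later)

# Project level: 95 errors found before delivery, 5 defects reported after.
overall = dre(95, 5)     # 0.95

# Activity level: design review finds 40 errors; 10 design errors
# surface later, during coding.
design = dre(40, 10)     # 0.8
```

A DRE approaching 1 means the filtering activity catches almost everything before it becomes more expensive to fix downstream.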



Integrating Metrics within the Processes
❚ Arguments for Software Metrics
❙ Measurement is used to establish a process baseline from which
improvements can be assessed.
❙ Developers want to know, after design:
❘ Which user requirements are most likely to change?
❘ Which components in this system are most error prone?
❘ How much testing should be planned for each component?
❘ How many errors can I expect when testing commences?
❙ Answers to these questions can be found if metrics are collected and
used as a technical guide.



Integrating Metrics within the Processes
❚ Establishing a baseline
❙ benefits can be obtained at the process, project, and product levels
❙ consists of data collected from past software development projects
❙ baseline data should have the following attributes:
❘ data must be reasonably accurate
❘ data should be collected for as many projects as possible
❘ measures must be consistent
❘ applications should be similar to the work that is to be estimated
❚ Metrics collection, computation, and evaluation



Summary View



Summary
❚ Metrics are a tool that can be used to improve the productivity and
quality of a software system
❚ Process metrics take a strategic view of the effectiveness of a
software process
❚ Project metrics are tactical; they focus on project work flow
and technical approach
❚ Size-oriented metrics use lines of code as a normalizing
factor
❚ Function-oriented metrics use function points
❚ Four quality metrics -- correctness, integrity, maintainability, and
usability -- were discussed



METRICS
❚ CLCS Metrics Philosophy
❙ Phase 1: Provide a mandatory, nearly automated metrics foundation to
track lines of code and errors.
❙ Phase 2: Provide additional high-return metrics with recognized value.
❘ schedule metrics (milestones)
❘ additional S/W problem metrics (actuals, trends, prediction)
❘ defect correction metrics
❘ run-time analysis metrics (McCabe tools, automated, COTS)
❙ Phase 3: Add further metrics only when there is an absolute need.



METRICS -- System Software

Milestones: Redstone CIT Complete, Thor CIT Complete, Atlas CIT Complete

Month/Year                        Sep-97 Oct-97 Nov-97 Dec-97 Jan-98 Feb-98 Mar-98 Apr-98 May-98 Jun-98 Jul-98 Aug-98
Software Size (KSLOC)              377.2  377.2  383.3  388.1  450.2  554.3
  Actual Size of Executable Code   214.1  214.1  218.2  221.3  250.6  319.3    0.0    0.0    0.0    0.0    0.0    0.0
  Code Delivered Comments          163.1  163.1  165.1  166.8  199.6  242.1    0.0    0.0    0.0    0.0    0.0    0.0

Razor Issue Closure                                                                                                   TOTAL
Issues Opened        Urgent            9      3      2      0      4      5      0      0      0      0      0      0    23
(this month)         Critical         54     19      8      1     16     53      0      0      0      0      0      0   151
                     Major            60     16     20      5     28     26      0      0      0      0      0      0   155
                     Minor            57     17      6      0     13     37      0      0      0      0      0      0   130
                     Total:          180     55     36      6     61    121      0      0      0      0      0      0   459
Issues Closed        Urgent            6      6      1      1      3      2      0      0      0      0      0      0    19
(this month)         Critical         36     26     11      0     13     12      0      0      0      0      0      0    98
                     Major            39     24     12      4     14     10      0      0      0      0      0      0   103
                     Minor            27     17     19      5      2     16      0      0      0      0      0      0    86
                     Total:          108     73     43     10     32     40      0      0      0      0      0      0   306
Current Issues Open  Urgent            3      0      1      0      1      4      0      0      0      0      0      0
                     Critical         18     11      8      9     12     53      0      0      0      0      0      0
                     Major            21     13     21     22     36     52      0      0      0      0      0      0
                     Minor            30     30     17     12     23     44      0      0      0      0      0      0
                     Total:           72     54     47     43     72    153      0      0      0      0      0      0
Error Density: Issues / KSLOC       0.84   1.10   1.24   1.25   1.35   1.44

