Software Metrics
A one-hour presentation introducing new techniques and practices in software development.
Express in Numbers
Measurement provides a mechanism for objective evaluation
Software Crisis
According to American Programmer, 31.1% of computer software projects are canceled before they are completed, and 52.7% overrun their initial cost estimates by 189%. 94% of project start-ups are restarts of previously failed projects.
Solution? A systematic approach to software development and measurement.
Software Metrics
Software metrics refers to a broad range of quantitative measurements for computer software that enable us to:
improve the software process continuously
assist in quality control and productivity
assess the quality of technical products
assist in tactical decision-making
Metric:
relates the individual measures in some way.
Indicator:
a combination of metrics that provides insight into the software process, the project, or the product itself.
The number of errors and defects in each category is counted and ranked in descending order.
[Pareto chart of defect categories; legible entries: Logic 20%, Data Handling 11%]
Unadjusted Function Points: the count-total, computed by giving all inputs the same weight, all outputs the same weight, and so on for each function type. Complete formula for the Unadjusted Function Points:
UFP = Σ (count × weight), summed over the function types (inputs, outputs, inquiries, files, interfaces)
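As a sketch, the count-and-weight step can be written out directly; the counts below are invented, and the weights are the commonly cited average-complexity weights for the five function types:

```python
# Average-complexity weights per function type (the standard
# Albrecht/IFPUG averages); the counts are hypothetical.
WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_files": 10,
    "external_interfaces": 7,
}

counts = {
    "external_inputs": 24,
    "external_outputs": 16,
    "external_inquiries": 22,
    "internal_files": 4,
    "external_interfaces": 2,
}

# UFP = sum over function types of (count * weight)
ufp = sum(counts[k] * WEIGHTS[k] for k in WEIGHTS)
print(ufp)  # 318
```

In practice each individual input, output, etc. would first be classified as simple, average, or complex and weighted accordingly; using a single average weight per type is the simplification the slide describes.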
Taking Complexity into Account
Factors are rated on a scale from 0 (not important) to 5 (very important):
data communications, distributed functions, heavily used configuration, transaction rate, on-line data entry, user efficiency, on-line update, complex processing, installation ease, operational ease, multiple sites, facilitate change
Formula:
FP = UFP × CM, where the complexity multiplier CM = 0.65 + 0.01 × Σ Fi (the sum of the complexity-factor ratings)
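A short sketch of the adjustment step, assuming the standard Albrecht formula FP = UFP × (0.65 + 0.01 × Σ Fi); the UFP value and factor ratings here are hypothetical:

```python
ufp = 318                                          # unadjusted function points (hypothetical)
factor_ratings = [3, 2, 0, 4, 3, 5, 1, 2, 2, 3, 4, 2]  # hypothetical Fi ratings, each 0..5

# Complexity multiplier ranges from 0.65 (all factors 0)
# to 1.35 (all factors 5 on a 14-factor rating).
cm = 0.65 + 0.01 * sum(factor_ratings)
fp = ufp * cm
print(cm, fp)
```

So a set of ratings summing to 31 scales the UFP by 0.96, slightly reducing the adjusted count.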
LOC vs. FP
The relationship between lines of code and function points depends on the programming language used to implement the software and on the quality of the design. Empirical studies show an approximate relationship between LOC and FP.
LOC/FP (average)
Assembly language            320
C                            128
COBOL, FORTRAN               106
C++                           64
Visual Basic                  32
Smalltalk                     22
SQL                           12
Graphical languages (icons)    4
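A small sketch of using these average ratios to estimate function points from a LOC count (or vice versa); the table values are from the slide above, and the example LOC figure is invented:

```python
# Average LOC needed to implement one function point, per language.
LOC_PER_FP = {
    "Assembly language": 320,
    "C": 128,
    "COBOL, FORTRAN": 106,
    "C++": 64,
    "Visual Basic": 32,
    "Smalltalk": 22,
    "SQL": 12,
    "Graphical languages (icons)": 4,
}

def estimate_fp(loc: int, language: str) -> float:
    """Rough FP estimate: divide the LOC count by the language's ratio."""
    return loc / LOC_PER_FP[language]

print(estimate_fp(12800, "C"))  # 100.0
```

The same 100 function points would take roughly 32,000 lines of assembly but only about 2,200 lines of Smalltalk, which is why LOC alone is a poor cross-language productivity measure.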
Implementation Metrics
Size, Complexity, Efficiency, etc.
Used to Derive
maintenance effort of software
testing time required for software
Flow Graph
if (a) { X(); } else { Y(); }
Predicate Nodes
McCabe's Metric
The smaller the V(G), the simpler the module. Modules with a large V(G) are a little unmanageable. A high cyclomatic complexity indicates that the code may be of low quality and difficult to test and maintain.
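A minimal sketch of computing V(G) for the flow graph of the if/else fragment above, using V(G) = E − N + 2 (edges minus nodes plus 2), which equals the number of predicate nodes plus 1; the node labels are invented:

```python
# Flow graph of: if (a) { X(); } else { Y(); }
# The predicate node branches to X or Y, and both rejoin at the end.
edges = [
    ("if_a", "X"), ("if_a", "Y"),  # the two branches of the predicate
    ("X", "end"), ("Y", "end"),    # both paths rejoin
]
nodes = {"if_a", "X", "Y", "end"}

v_g = len(edges) - len(nodes) + 2  # V(G) = E - N + 2
print(v_g)  # 2: one predicate node, plus 1
```

V(G) = 2 here matches the intuition that the fragment has exactly two independent paths to test.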
WMC = Σ ci, for i = 1 … n
(Weighted Methods per Class: the sum of the complexities ci of the class's n methods)
The complexity ci may be the McCabe complexity of the method. Smaller values are better. Perhaps the average complexity per method is a better metric?
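The sum and the per-method average are both one line each; the complexity values below are hypothetical McCabe numbers for a five-method class:

```python
# Hypothetical McCabe complexities ci for a class's 5 methods.
method_complexities = [1, 3, 2, 5, 1]

wmc = sum(method_complexities)        # WMC = sum of ci
avg = wmc / len(method_complexities)  # average complexity per method
print(wmc, avg)  # 12 2.4
```

A class with many simple accessors can share a WMC with a class of few tangled methods; the average helps tell those two shapes apart.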
The number of methods and the complexity of the methods involved is a direct predictor of how much time and effort is required to develop and maintain the class.
How would you interpret this number? A moderate value indicates scope for reuse, and high values may indicate an inappropriate abstraction in the design.
Excessive coupling indicates weakness of class encapsulation and may inhibit reuse. High coupling also indicates that more faults may be introduced due to inter-class activities.
RFC = |RS|, where the response set RS is the set of the class's own methods plus the methods invoked by them
If a large number of methods can be invoked in response to a message, the testing and debugging of the class becomes more complicated.
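The idea of counting every method reachable in response to a message can be sketched as a Chidamber–Kemerer-style response set; all class and method names below are hypothetical:

```python
# Hypothetical class: its own methods plus what each one calls directly.
class_methods = {"open", "read", "close"}
calls = {
    "open": {"log", "lock"},
    "read": {"log", "seek"},
    "close": {"log", "unlock"},
}

# Response set RS = own methods union the methods they invoke.
response_set = set(class_methods)
for m in class_methods:
    response_set |= calls.get(m, set())

rfc = len(response_set)  # RFC = |RS|
print(rfc)  # 7
```

Three methods of their own, but seven methods that a tester must understand: the gap between the two numbers is what RFC measures.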
Testing Metrics
Metrics that predict the likely number of tests required during various testing phases
Metrics that focus on test coverage for a given component
Views on SE Measurement
Selecting Metrics
Goal: Ensure all known defects are corrected before shipment
Developer
User
Other Measurements:
Preparation Rate = LOC / prep_hrs
Inspection Rate = LOC / in_hrs
Defect Detection Rate = defects / (prep_hrs + in_hrs)
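Computing these inspection measurements is straightforward; the session data below is hypothetical:

```python
# Hypothetical data from one code-inspection session.
loc = 300        # lines of code inspected
prep_hrs = 2.0   # preparation hours
in_hrs = 1.5     # inspection-meeting hours
defects = 7      # defects found

preparation_rate = loc / prep_hrs                       # LOC per prep hour
inspection_rate = loc / in_hrs                          # LOC per meeting hour
defect_detection_rate = defects / (prep_hrs + in_hrs)   # defects per total hour
print(preparation_rate, inspection_rate, defect_detection_rate)  # 150.0 200.0 2.0
```

Tracking these rates across inspections shows whether reviewers are rushing (very high LOC/hr) or whether detection yield is falling.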
[Weekly tracking table: Month 2, weeks W3–W6; Month 3, weeks W7–W8]
[Defect-tracking charts. Fixed per period: 23, 27, 18, 12. Resolved per period: 13, 3, 24, 11, 26, 15, 18, 27. Monthly counts plotted Jan–July.]
Owner Management
Don't underestimate the intelligence of your engineers. For any one metric you can come up with, they will find at least two ways to beat it. [unknown]
Don't
Measure individuals Use metrics as a stick
Cost
Schedule
Do
Select metrics based on goals
[GQM diagram after Basili-88: Goals 1–2 decompose into Questions 1–4, which map onto Metrics 1–5]
Provide feedback
Data
Metrics
Obtain buy-in
References
Chidamber, S. R. & Kemerer, C. F., "A Metrics Suite for Object Oriented Design," IEEE Transactions on Software Engineering, Vol. 20, No. 6, June 1994.
Hitz, M. & Montazeri, B., "Chidamber and Kemerer's Metrics Suite: A Measurement Theory Perspective," IEEE Transactions on Software Engineering, Vol. 22, No. 4, April 1996.
Lacovara, R. C. & Stark, G. E., "A Short Guide to Complexity Analysis, Interpretation and Application," May 17, 1994. http://members.aol.com/GEShome/complexity/Comp.html
Tang, M., Kao, M., & Chen, M., "An Empirical Study on Object-Oriented Metrics," IEEE Transactions on Software Engineering, 0-76950403-5, 1999.
Tegarden, D., Sheetz, S., & Monarchi, D., "Effectiveness of Traditional Software Metrics for Object-Oriented Systems," Proceedings of the 25th Hawaii International Conference on System Sciences, January 1992, pp. 359-368.
"Principal Components of Orthogonal Object-Oriented Metrics," http://satc.gsfc.nasa.gov/support/OSMASAS_SEP01/Principal_Components_of_Orthogonal_Object_Oriented_Metrics.htm
Behforhooz & Hudson, Software Engineering Fundamentals, Oxford Press, 1996, Chapter 18: Software Quality and Quality Assurance.
Pressman, R., Software Engineering: A Practitioner's Approach, McGraw-Hill, 1997.
IEEE Standard on Software Quality Metrics Validation Methodology (1061).
Henderson-Sellers, B., Object-Oriented Metrics, Prentice-Hall, 1996.