Information Systems Project Management—David Olson
© McGraw-Hill/Irwin 2004

Chapter 7: Estimation

• project planning – what to do
• project control – make sure it’s done right
• estimation of detailed system design

Planning Process
• determine requirements from objectives
• specify work activities
• plan project organization
• develop schedule
• develop resource plan and budget
• establish control mechanisms
each project unique

Determine Requirements
Specify work activities
STATEMENT OF WORK:
– product descriptions
– constraints
– schedule requirements
– budget limits
– roles & responsibilities


WORK DEFINITION
• once objectives set,
TRANSLATE INTO WORK ELEMENTS
• what needs to be done
– easy to overlook some, or duplicate
• WORK BREAKDOWN STRUCTURE
– divide project into major work categories
• subdivide


Work Breakdown Structure

write paper (PROJECT)
  – research (CATEGORY)
      – library (TASK): what’s been done (PACKAGE), needs doing (PACKAGE)
      – internet (TASK): search (PACKAGE)
  – writing (CATEGORY): write (PACKAGE)
  – printing (CATEGORY): runoff (PACKAGE)


Work Breakdown Structure


• level 1 - overall project
• level 2 - category
– major project subelement
• level 3 - task
– subelement of category
• level x - subtask
• level bottom - WORK PACKAGE
– specific activity
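
To make the level structure concrete, here is a minimal Python sketch of a WBS as a nested data structure, using the “write paper” example from the earlier slide; the dictionary layout and the work_packages helper are our own illustration, not from the text.

```python
# Minimal WBS sketch: project -> categories -> tasks -> work packages.
# The "write paper" example comes from the earlier WBS slide.
wbs = {
    "write paper": {                       # level 1 - overall project
        "research": {                      # level 2 - category
            "library": ["what's been done", "needs doing"],  # level 3 task -> packages
            "internet": ["search"],
        },
        "writing": {"write": []},          # "write" is itself a work package
        "printing": {"runoff": []},
    }
}

def work_packages(node):
    """Walk the WBS and return the bottom-level work packages."""
    if isinstance(node, list):
        return node
    packages = []
    for name, child in node.items():
        found = work_packages(child)
        packages.extend(found if found else [name])  # a leaf key is itself a package
    return packages

print(work_packages(wbs))
```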


Detailed Task List


• WBS can be focused on
– PRODUCT
– FUNCTION
– etc.
• when work packages identified,
– estimate requirements by resource
– WHAT IS NEEDED
– WHEN
– WHAT MUST PRECEDE


Work Breakdown Structure


• needs to be checked, approved
• provides
– good definition of work
– how long it will take
– resources required
– estimated costs
• planning & control
– assignments, budget, basis for control


Work Packages
• chunk of required work
• relatively small cost and short duration
• includes
– summary of work
– inputs required (predecessors)
– manager responsible
– product specifications
– resources required (including budget, dates)
– deliverables


list of work packages (activities) – system design

activity                     predecessor   time
A2    identify req’d info    A1            10 days
A31   basic software         A2            3 days
A32   data access req’d      A2            1 day
A33   vendor software        A2            1 day
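
A list like this (activity, predecessor, duration) is directly usable for scheduling. Below is a small Python sketch of a forward pass over these work packages; note that A1’s duration is not given on the slide, so the 5 days used here is only a placeholder assumption.

```python
# Forward pass over the work-package list (activity, predecessors, duration in days).
# A1's 5-day duration is a made-up placeholder; the slide only lists A2..A33.
activities = {
    "A1":  {"pred": [],     "days": 5},      # hypothetical predecessor
    "A2":  {"pred": ["A1"], "days": 10},     # identify req'd info
    "A31": {"pred": ["A2"], "days": 3},      # basic software
    "A32": {"pred": ["A2"], "days": 1},      # data access req'd
    "A33": {"pred": ["A2"], "days": 1},      # vendor software
}

def forward_pass(acts):
    """Earliest start/finish for each activity, assuming predecessors finish first."""
    finish = {}
    for name in acts:                        # dict is already in dependency order here
        a = acts[name]
        start = max((finish[p] for p in a["pred"]), default=0)
        finish[name] = start + a["days"]
        print(f"{name}: start day {start}, finish day {finish[name]}")
    return finish

forward_pass(activities)
```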

Work Packages
• need to identify start and finish events
for each work package
• related tasks without definable end
results (overhead & management;
inspection; maintenance) should be
included as task-oriented work
packages for COST purposes


Project Organization
• identify resources required by work
package
• RESPONSIBILITY MATRIX
– which functions do what work packages
– cost account structure
• start & finish date
• budget
• responsibilities


Project Management System

• lists activities on one axis


• lists people on other axis
• shows who is
– primarily responsible
– also involved
– has approval authority
– must be notified
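
A responsibility matrix of this kind can be sketched as a small table keyed by work package and person. The names, packages, and single-letter role codes below are illustrative stand-ins for “primarily responsible / also involved / approval authority / must be notified”, not part of the text.

```python
# Responsibility matrix sketch: rows = work packages, columns = people.
# Codes: R = primarily responsible, I = also involved,
#        A = approval authority, N = must be notified.
# Names and assignments below are illustrative only.
matrix = {
    "system design":   {"Ann": "R", "Bob": "I", "Carla": "A"},
    "basic software":  {"Bob": "R", "Ann": "I", "Dave": "N"},
    "vendor software": {"Dave": "R", "Carla": "A"},
}

people = sorted({p for row in matrix.values() for p in row})
print("work package".ljust(16) + "".join(p.ljust(8) for p in people))
for package, roles in matrix.items():
    print(package.ljust(16) + "".join(roles.get(p, "-").ljust(8) for p in people))
```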


Scheduling
• BASIS for
– RESOURCE ALLOCATION
– ESTIMATED COST
– plan for monitoring & control
• EVENTS or MILESTONES
– when activity completed (or started)
– INTERFACE EVENT
• when responsibility passes


Kinds of Schedules
• project schedules
– project master schedule - top management
– overview rather than detail
• task schedules
– specific activities required
– more detail


Resource Plans & Budgets

• Activities often compete for the same resources
  – hire more
  – reschedule
• Resource plans show critical resource schedules
  – bottlenecks around which schedule is built


Charts
visual aids
• Gantt Charts
– plan - activities by time (work in outline)
– implement - fill in as work done
– doesn’t show relationships well
– very good at seeing where things are
(IF ACCURATE)
• Expense Charts - cumulatively graph $ spent


Recap
• Planning - key to accurate bidding
• need to know what it will cost in order
to know how to price
• need to know resources required; complex projects take a long time
• MIS projects
– activities, predecessor relations, resource use


Software Estimation

The Mythical Man-Month: Essays on Software Engineering
Frederick P. Brooks, Jr. (University of North Carolina)
Addison-Wesley, 1975


Programming Products
• Program
– usable by author
• Programming System
– usable by anyone
• Programming Product
– tested, documented, maintained
– 3 times the effort of a program
• Programming System Product
  – both of the above; about 9 times the effort of a program


causes of project failure


• LACK OF CALENDAR TIME the most
common
– estimating techniques are poor
– assume that effort = progress
• you can’t just throw people at a problem
– poor monitoring of progress
• SCHEDULING
– tendency to assume all will go well


impact of adding people


• partitionable project
– marginal contribution declines
• non-partitionable project
– no benefit at all from adding people
• complex interactions
  – must separately coordinate each task with all others
– first few have declining marginal contribution
– after some number, adding people slows down project
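
Brooks’s coordination argument can be made concrete: if each person must coordinate with every other, the number of communication channels is n(n-1)/2, which grows far faster than head count. A tiny sketch of that count; the pair formula is standard, and the team sizes are just examples.

```python
# Number of pairwise coordination channels for a team of n people:
# each person must coordinate with every other -> n * (n - 1) / 2 pairs.
def channels(n: int) -> int:
    return n * (n - 1) // 2

for n in (2, 5, 10, 25, 50):
    print(f"{n:3d} people -> {channels(n):5d} coordination channels")
# 10 people already means 45 channels; 50 people means 1225.
```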

software project activities


• testing is the activity most difficult to predict
– planning 1/3 of project
– coding 1/6 of project
– component testing 1/4 of project
– system testing 1/4 of project
• most projects are on schedule
UNTIL TESTING
• Brooks’s Law: Adding manpower to a late
project makes it later
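
Brooks’s rule of thumb above (1/3 planning, 1/6 coding, 1/4 component testing, 1/4 system testing) can be applied mechanically to any total estimate. A minimal sketch; the 12-month total is only an example input.

```python
# Split a total schedule using Brooks's rule of thumb.
# The 12-month total below is only an example input.
from fractions import Fraction

phases = {
    "planning":          Fraction(1, 3),
    "coding":            Fraction(1, 6),
    "component testing": Fraction(1, 4),
    "system testing":    Fraction(1, 4),
}
assert sum(phases.values()) == 1   # the fractions cover the whole schedule

total_months = 12
for phase, share in phases.items():
    print(f"{phase:18s} {float(share * total_months):4.1f} months")
```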

programmer productivity
• there is wide variation in
productivity between good and fair
programmers
• Brute force failures
  – costly: OS/360, TSS
  – slow: Exec 8, SAGE
  – inefficient: Scope 6600
  – non-integrated systems: Multics

impact of adding programmers

• if a 200-person project has its best 25 people as managers
  – fire the 175
  – make the 25 managers programmers
• shouldn’t have more than 10
people on a team
• OS/360 had 1000 working on it,
5000 man-years
• small teams infeasible for very large systems; use a surgical team

surgical team
• surgeon – chief programmer
• copilot – share thinking, evaluation
• administrator – boring details
• editor – references, documentation
• secretaries (2)
• program clerk – technical records
• toolsmith – editing, debugging
• tester – develop test cases
• language lawyer – expert on language


conceptual integrity
• better to reflect one set of design
ideas than to add independent and
uncoordinated features &
improvements
• purpose of programming system is
to make computer easy to use
• simplicity & straightforwardness
come from conceptual integrity

conceptual integrity example


• OS/360
  – architect manager: said his 10-person team could write the
    specifications in 10 months (3 months late)
  – control program manager: his 150 people could get it done in
    7 months, and if his people didn’t do this, they would have
    nothing to do
  – architect manager: the control program people would take 10
    months and do a poor job
  – Brooks gave the job to the control program group
  – it took 10 months, plus added a year to debugging

estimating programming time
• duration is exponential (bigger jobs take
more than proportionally longer)
– analogous to sprint 100 yards, running 1 mile
effort = K × (number of instructions)^1.5
• one manager noted programming
taking twice as long as estimate
– only getting 20 hours of work/week
– machine down, divert to emergencies, meetings,
paperwork, sick
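
The exponential relationship means that doubling the number of instructions roughly multiplies effort by 2^1.5 ≈ 2.8, not by 2. A quick sketch of the model; the constant K is purely illustrative and would in practice be calibrated from historical data.

```python
# Brooks-style exponential effort model: effort = K * (instructions ** 1.5).
# K below is a made-up constant; calibrate it from your own project history.
K = 0.002

def effort(instructions: int) -> float:
    return K * instructions ** 1.5

for n in (1_000, 10_000, 20_000, 40_000):
    print(f"{n:7d} instructions -> {effort(n):8.0f} effort units")
# Note: going from 20,000 to 40,000 instructions (2x size) is ~2.8x the effort.
```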


programming estimation
• the more interactions, the less
productivity
– interaction = coordination with others
• high level languages increase
productivity
– now tools should almost eliminate the
programming component, but there are other
activities (the more unpredictable ones)


(prototyping)
• in most projects, the first system
built is barely usable
• PLAN THE SYSTEM FOR CHANGE
– modularization
– subroutining
– interfaces
– documentation of interfaces
– high-level languages


software estimation

Charles R. Symons
Software Sizing and Estimating: Mk II FPA
Wiley, 1991


software production cycle


• DESIGN
• DEVELOPMENT
• production – not hard
• MAINTENANCE – recognized as a difficult task


software production cycle


• SYSTEM SIZE a variable in
– DESIGN
– DEVELOPMENT
– MAINTENANCE
• components of system size
– amount of information processed
– technical requirements
– performance drivers (objectives)


objectives
• COST – minimization
• TIME – minimization
• QUALITY – assure that product performs to specifications


size measures
• source lines of code
+ concrete measure
- what lines?
- logical or physical?
- housekeeping?
- different across languages
• most commonly used
• some economy of scale


Albrecht’s Function Point


Analysis Method
AIMS
– consistent measure
– meaningful to end user
• function points should be easier to understand than
lines of code
– rules easy to apply
– Can estimate from requirements specification
– independent of technology used


Albrecht’s system
• count
– external user inputs
– enquiries
– outputs
– master files delivered (internal & external)
• get points for every useful activity
function points=
information processing size
x technical complexity adjustment


Albrecht’s System
complexity tables
– data elements referenced
• 1-4
• 5-15
• 16 or more
– file types referenced
• 0 or 1
• 2
• 3 or more

table of SIMPLE, AVERAGE, COMPLEX


Albrecht complexity
                     1–4 data    5–15 data    16+ data
0 or 1 file types    SIMPLE      SIMPLE       AVERAGE
2 file types         SIMPLE      AVERAGE      COMPLEX
3+ file types        AVERAGE     COMPLEX      COMPLEX
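
The table translates directly into a lookup function. A small Python rendering of it; the function name and the example calls are ours.

```python
# Albrecht complexity classification from the table above:
# rows = file types referenced, columns = data elements referenced.
def albrecht_complexity(data_elements: int, file_types: int) -> str:
    table = [
        # 1-4 data    5-15 data   16+ data
        ["SIMPLE",   "SIMPLE",   "AVERAGE"],   # 0 or 1 file types
        ["SIMPLE",   "AVERAGE",  "COMPLEX"],   # 2 file types
        ["AVERAGE",  "COMPLEX",  "COMPLEX"],   # 3+ file types
    ]
    row = 0 if file_types <= 1 else 1 if file_types == 2 else 2
    col = 0 if data_elements <= 4 else 1 if data_elements <= 15 else 2
    return table[row][col]

print(albrecht_complexity(3, 1))    # SIMPLE
print(albrecht_complexity(10, 2))   # AVERAGE
print(albrecht_complexity(20, 3))   # COMPLEX
```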


Albrecht functional multipliers

                          SIMPLE   AVERAGE   COMPLEX
external input            ×3       ×4        ×6
external output           ×4       ×5        ×7
logical internal file     ×7       ×10       ×15
external interface file   ×5       ×7        ×10
external inquiry          ×3       ×4        ×6

Albrecht Technical Complexity Adjustment

14 general application characteristics:
  data communications, distributed functions, performance,
  heavily used configuration, transaction rate, on-line data entry,
  end-user efficiency, on-line update, complex processing,
  re-usability, installation ease, operational ease, multiple sites,
  facilitate change

degree of influence: not present = 0, insignificant influence = 1,
  moderate influence = 2, average influence = 3,
  significant influence = 4, strong influence = 5

TCA = 0.65 + (0.01 × total degree of influence)

Symons’ complaints
• Albrecht method
– alternative counting practices
– weights used are questionable
– large systems under-weighted
• (XEROX 1985: rapid drop in productivity with
increasing system size)
– range of points too narrow
• but MUCH BETTER THAN SLOC


Mk II Function Point
Analysis
• modification of Albrecht
• use same Technical Complexity
Adjustment
• extend general application
characteristics to 19 or more
• weights adjusted
• Information Processing Size changed
the most


logical transactions
logical transaction =
– unique input/process/output combination
triggered by unique event of interest to the user
– or need to retrieve information
• create a customer
• update an account
• enquiry
• produce monthly summary report


unadjusted function points


UFPs = WI x # input data element types
+ WE x # entity types referenced
+ WO x # output data element types
weights determined by calibration
determine UFPs for system by adding
UFPs for all system logical transactions
assumes work directly proportional to # of data
elements;
size of process proportional to # data entries;
weights meaningful, obtainable
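
Each logical transaction contributes a weighted sum of its input, entity, and output counts, and the system total is the sum over all transactions. A brief sketch using the industry-average weights Symons quotes a few slides later (WI = 0.58, WE = 1.66, WO = 0.26); the transaction names echo the earlier slide, but their counts are invented for illustration.

```python
# Mk II unadjusted function points per logical transaction:
#   UFP = WI * inputs + WE * entities referenced + WO * outputs
# Weights are Symons' quoted industry averages; ideally calibrate your own.
WI, WE, WO = 0.58, 1.66, 0.26

# (input data element types, entity types referenced, output data element types)
transactions = {
    "create a customer":      (12, 2, 3),    # illustrative counts only
    "update an account":      (6, 3, 2),
    "monthly summary report": (2, 4, 25),
}

total_ufp = 0.0
for name, (inputs, entities, outputs) in transactions.items():
    ufp = WI * inputs + WE * entities + WO * outputs
    total_ufp += ufp
    print(f"{name:25s} {ufp:6.2f} UFP")
print(f"{'system total':25s} {total_ufp:6.2f} UFP")
```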

complexity adjustment
Albrecht’s method with 2 modifications:
– extend general application list to 19
• interfaces to other applications
• special security features
• direct access requirement for third parties
• special user training facilities
• documentation requirements
TCA = 0.65 + C x (total degree of influence)
where C is obtained by calibration


calibration
by CALIBRATION Symons means fitting the weights
to the company’s own data (regression)
industry averages:
WI 0.58
WE 1.66
WO 0.26
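
In practice this regression can be a plain least-squares fit of historical size against the three counts. A hedged numpy sketch; the project data below are fabricated solely to show the mechanics.

```python
# Calibrate Mk II weights by least squares against historical projects.
# Each row: (# input data element types, # entities referenced, # output types);
# y: the measured relative size/effort of that project. Data are made up.
import numpy as np

X = np.array([
    [120, 30, 200],
    [ 60, 15,  90],
    [200, 55, 310],
    [ 35, 10,  50],
], dtype=float)
y = np.array([150.0, 70.0, 260.0, 40.0])

weights, *_ = np.linalg.lstsq(X, y, rcond=None)
wi, we, wo = weights
print(f"calibrated WI={wi:.2f}  WE={we:.2f}  WO={wo:.2f}")
```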

Mk II FPA summary
• obtain general understanding of
the system
• construct a model of primary
entities
• identify logical transactions
• score degree of influence of all 19
general application characteristics
(plus client specific)
• obtain total project work-hours

comparison

                       SLOC                     Albrecht   Mk II FPA
accepted standard      no                       yes        yes
clarity                potentially subjective   some       objective
structured?            no                       no         yes
easy to use?           yes                      no         no
automatable?           yes                      no         yes
use for estimating?    sometimes                yes        yes

Estimation Example

SLOC
Function Point


Source Lines of Code

• NEED DATABASE of past experience

AVERAGES:
  effort          33 months
  cost            $361 (thousand)
  documentation   1194 pages
  errors          201
  defects         52
  people          4
  KLOC            20.543


Implementing LOC

• Estimate structured lines of code: 10,000
• scale averages proportional to LOC: 10 / 20.543 = 0.487

  effort          33 months × 0.487 = 16
  cost            $361 thousand × 0.487 = $177,000
  documentation   1194 pages × 0.487 = 581 pp.
  errors          201 × 0.487 = 98
  defects         52 × 0.487 = 25
  people          4 × 0.487 = 2 people
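
The whole LOC method is a single proportional scaling: estimated KLOC divided by the database average, applied to each historical figure. A short sketch reproducing the arithmetic; results match the slide up to rounding.

```python
# Scale historical averages by estimated size (KLOC ratio), as in the slide.
historical = {                      # averages from the past-project database
    "effort (months)": 33,
    "cost ($000)": 361,
    "documentation (pages)": 1194,
    "errors": 201,
    "defects": 52,
    "people": 4,
}
avg_kloc = 20.543
estimated_kloc = 10.0

ratio = estimated_kloc / avg_kloc   # ~0.487
for item, value in historical.items():
    print(f"{item:22s} {value * ratio:8.1f}")   # small rounding differences vs. slide
```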


Function Point Calculation

1 - get count-total
number of features times complexity
2 - get Fi
rate 14 factors (0-5), total
3 - FP = count-total × [0.65 + 0.01 × Σ Fi ]
4 - scale historical averages by this FP ÷ historical average FP (623)

1 - get count total

Complexity Weighting
                        simple      average      complex      product
# user inputs           __ × 3  +   __ × 4   +   __ × 6   =   ___
# user outputs          __ × 4  +   __ × 5   +   __ × 7   =   ___
# user inquiries        __ × 3  +   __ × 4   +   __ × 6   =   ___
# files                 __ × 7  +   __ × 10  +   __ × 15  =   ___
# external interfaces   __ × 5  +   __ × 7   +   __ × 10  =   ___


1 - get count-total

Bank accounts record system involving

36 user inputs          simple complexity
5 user outputs          average complexity
20 user inquiries       simple complexity
40 files accessed       simple complexity
3 external interfaces   average complexity


1 - get count-total

Complexity Weighting
                        simple      average      complex      product
36 user inputs          36 × 3  +   __ × 4   +   __ × 6   =   108
5 user outputs          __ × 4  +   5 × 5    +   __ × 7   =   25
20 user inquiries       20 × 3  +   __ × 4   +   __ × 6   =   60
40 files                40 × 7  +   __ × 10  +   __ × 15  =   280
3 external interfaces   __ × 5  +   3 × 7    +   __ × 10  =   21
                                                     TOTAL    494


2 - get Fi
F1 require reliable backup & recovery? Significant 4
F2 data communications required? Moderate 2
F3 distributed processing functions? Significant 4
F4 performance critical? Average 3
F5 run on existing, heavily utilized environment? Essential 5
F6 require on-line data entry? Essential 5
F7 on-line data entry from multiple operations? Incidental 1
F8 master files updated on-line? No influence 0
F9 inputs, outputs, files, or inquiries complex? Incidental 1
F10 internal processing complex? Incidental 1
F11 code designed to be reusable? Average 3
F12 conversion and installation included in the design? Average 3
F13 system designed for multiple installations in different orgs? No influence 0
F14 application designed to facilitate change and ease of use? No influence 0
Σ Fi = 32


3- Calculate FP

FP = count-total × [0.65 + 0.01 × Σ Fi ]


= 494 × [0.65 + 0.01 × 32 ] = 479.18


4 - Multiply by Historical Averages

• Estimated FP: 479.18
• scale averages proportional to historical average FP: 479.18 / 623 = 0.77

  effort          33 months × 0.77 = 25.4
  cost            $361 thousand × 0.77 = $278,000
  documentation   1194 pages × 0.77 = 918 pp.
  errors          201 × 0.77 = 155
  defects         52 × 0.77 = 40
  people          4 × 0.77 = 3 people
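
The four steps can be wired together end to end. A compact sketch reproducing the bank-accounts example: count-total 494, ΣFi = 32, FP = 494 × 0.97 = 479.18, then scaling the historical averages by 479.18 / 623 ≈ 0.77.

```python
# Function point estimate for the bank-accounts example, steps 1-4.
weights = {  # (simple, average, complex) multipliers per feature type
    "inputs": (3, 4, 6), "outputs": (4, 5, 7), "inquiries": (3, 4, 6),
    "files": (7, 10, 15), "interfaces": (5, 7, 10),
}
counts = {   # (count, complexity) from the example
    "inputs": (36, "simple"), "outputs": (5, "average"),
    "inquiries": (20, "simple"), "files": (40, "simple"),
    "interfaces": (3, "average"),
}
col = {"simple": 0, "average": 1, "complex": 2}

count_total = sum(n * weights[k][col[c]] for k, (n, c) in counts.items())  # 494
sum_fi = 32                                   # step 2: total of the 14 Fi ratings
fp = count_total * (0.65 + 0.01 * sum_fi)     # step 3: 494 * 0.97 = 479.18

historical_fp = 623                           # average FP of past projects
ratio = fp / historical_fp                    # ~0.77
print(f"count-total={count_total}, FP={fp:.2f}, ratio={ratio:.2f}")
print(f"effort ~{33 * ratio:.1f} months, cost ~${361 * ratio:.0f}K, "
      f"people ~{4 * ratio:.1f}")
```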


Scheduling
• “Coding is 90% finished” for half of the total coding time
• “Debugging is 99% complete” most of the time
• MILESTONES: concrete events
• studies of government projects:
  – estimates carefully updated every 2 weeks rarely change
    before the activity starts
  – during the activity, overestimates drop

Control
• When delay is first noticed,
the tendency is to not report it
• STATUS INFORMATION
what is going on
• ACTION INFORMATION
learning this causes something to be
done
• KEY: know when which case applies

Summary
• Estimation of duration & cost key to sound project
decision making
• Estimating software development very difficult
– Can improve by
• Keeping records
• Using productivity-enhancing methods
• Using more off-the-shelf software
– Estimation methods can become accurate if
systematically applied
