RATING INDEX
By Chung-Suk Cho and G. Edward Gibson Jr., Members, ASCE
ABSTRACT: Poor scope definition is recognized by industry practitioners as one of the leading causes of project
failure, adversely affecting projects in the areas of cost, schedule, and operational characteristics. Unfortunately,
many owner and contractor organizations do a poor job of adequately defining a project’s scope, leading to a
poor design basis. A research team constituted by the Construction Industry Institute (CII) has developed the
Project Definition Rating Index (PDRI) to address scope definition in the building sector. The PDRI for buildings
is a comprehensive, weighted checklist of 64 scope definition elements presented in a score sheet format. It
provides a tool for an individual or project team to objectively evaluate the status of a building project during
preproject planning. This paper will discuss the PDRI development process, including input from over 100
industry professionals. Key project scope definition elements will be identified. The PDRI validation procedure,
involving over 50 projects, will be discussed. A description of the potential uses of the PDRI and a summary
of its benefits to the building construction industry will be outlined.
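Conceptually, the scoring described in this paper is a weighted checklist rollup. The sketch below illustrates only the mechanics; the element names, weights, and ratings are hypothetical stand-ins, not values from the actual score sheet (which appears in Appendix I).

```python
# Hypothetical sketch of PDRI-style scoring. Element names and weights
# are invented for illustration, not taken from the published score sheet.

# For each element: its weight at definition level 1 (complete definition)
# and at level 5 (incomplete or poor definition). Levels 2-4 are
# linearly interpolated between these two extremes.
ELEMENTS = {
    "A1 Building Use":          (3, 44),  # (level-1 weight, level-5 weight)
    "C6 Project Cost Estimate": (2, 27),
    "G1 Equipment List":        (1, 15),
}

def element_score(level: int, w1: int, w5: int) -> float:
    """Score one element. Level 0 means 'not applicable' (no contribution);
    levels 2-4 are linearly interpolated between the two extremes."""
    if level == 0:
        return 0.0
    return w1 + (w5 - w1) * (level - 1) / 4.0

def pdri_score(ratings: dict[str, int]) -> float:
    """Sum the weighted element evaluations into a single project score.
    With the real 64-element sheet this ranges from 70 (everything
    completely defined) to 1,000 (everything undefined); lower is better."""
    return sum(
        element_score(ratings[name], w1, w5)
        for name, (w1, w5) in ELEMENTS.items()
    )

# A team that has fully defined two elements but left one undefined:
print(pdri_score({"A1 Building Use": 1,
                  "C6 Project Cost Estimate": 1,
                  "G1 Equipment List": 5}))  # → 20.0
```

The sketch shows why a lower total signals better scope definition: well-defined elements contribute only their small level-1 weights, while undefined elements contribute their full level-5 weights.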
eight other professionals directly involved in planning building projects (Cho et al. 1999; Cho 2000).

Each participant completed a series of documents at the workshops. In addition to personal history, they were initially asked to list and consider a typical project that they had recently worked on for the organization they represented. Each was then asked to assume that he/she was estimating this particular project and evaluating its probability of success based on the level of definition of the 64 elements. The workshop proceeded in order through the 64 elements, with each element reviewed and its description read.

Assuming that scope development for the project had been completed, the workshop participants were instructed to apply what they felt to be an appropriate cost contingency to each element, given two circumstances: the element was undefined (level of definition 5), or it was completely defined (level of definition 1). The weighting was based on their opinions as to the relative impact that each element has on the overall accuracy of the project’s total installed cost (TIC) estimate. All 64 elements were reviewed in this manner.

The workshop concluded with critiques of the scoring methodology and the tool itself. These comments were subsequently evaluated, and several minor corrections were made to the score sheet, instructions for use, and element descriptions.
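The elicitation above yields, for each element, one contingency value per participant at each of the two extreme definition levels. As an illustration only, the sketch below aggregates such responses by simple mean and rescales the two columns to fixed totals of 70 and 1,000 points; the numbers and element labels are invented, and the actual procedure used 69 responses across 64 elements with additional statistical screening of respondents.

```python
# Hypothetical sketch: average each participant's contingency responses
# per element, then rescale so the "undefined" (level 5) column totals
# 1,000 points and the "completely defined" (level 1) column totals 70.
# Data are invented for illustration.

# raw[element] = list of (level-1 contingency, level-5 contingency)
# pairs, one pair per workshop participant.
raw = {
    "A1": [(2.0, 30.0), (1.0, 40.0), (3.0, 35.0)],
    "C6": [(1.0, 20.0), (2.0, 25.0), (1.5, 24.0)],
}

def mean(xs):
    return sum(xs) / len(xs)

def normalized_weights(raw, total_l1=70.0, total_l5=1000.0):
    """Mean response per element, then scale each column so the
    level-5 weights sum to 1,000 and the level-1 weights to 70."""
    m1 = {e: mean([p[0] for p in pairs]) for e, pairs in raw.items()}
    m5 = {e: mean([p[1] for p in pairs]) for e, pairs in raw.items()}
    s1, s5 = sum(m1.values()), sum(m5.values())
    return {e: (m1[e] * total_l1 / s1, m5[e] * total_l5 / s5)
            for e in raw}

w = normalized_weights(raw)
# The level-5 column now totals 1,000 and the level-1 column 70:
print(round(sum(v[1] for v in w.values())))  # → 1000
print(round(sum(v[0] for v in w.values())))  # → 70
```

Fixing the column totals makes weights from different respondent pools comparable: each element's weight expresses its share of total estimate risk rather than an absolute contingency percentage.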
JOURNAL OF ARCHITECTURAL ENGINEERING / DECEMBER 2001 / 117
FIG. 3. Example Element Description, G1. Equipment List
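Fig. 3 shows an example element description (G1, Equipment List). For readers who wish to mirror the score sheet in software, one possible representation of an element and its definition levels is sketched below; the structure and the placeholder weights are hypothetical, not taken from the published sheet.

```python
# Hypothetical representation of a single PDRI element; the weights
# below are placeholders, not the actual G1 values.
from dataclasses import dataclass

@dataclass
class PdriElement:
    designator: str  # e.g., "G1"
    name: str        # e.g., "Equipment List"
    # weights[k] = points contributed when rated at definition level k;
    # index 0 is "not applicable" and contributes nothing.
    weights: tuple[float, float, float, float, float, float]

    def score(self, level: int) -> float:
        if not 0 <= level <= 5:
            raise ValueError("definition level must be 0-5")
        return self.weights[level]

g1 = PdriElement("G1", "Equipment List",
                 (0.0, 1.0, 5.0, 8.0, 12.0, 15.0))
print(g1.score(5))  # → 15.0
```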
The raw weights obtained from these workshops were used to develop the final version of the PDRI score sheet. Each participant’s responses at the workshop were individually evaluated and normalized to a maximum of 1,000 points. This was accomplished by adding all values in the “incomplete or poorly defined” column and converting those scores to values relative to one another that added to 1,000 points. A similar method was used to evaluate the elements when they had “complete definition” by normalizing to 70 points (which was chosen to be consistent with the PDRI for industrial projects). Definition levels 2, 3, and 4 were interpolated between the extremes.

A single, collective weight was developed for each of the five levels of definition of each element using the mean of the 69 responses. Several statistical tests were then performed to evaluate the responses, including simple descriptive statistics, skewness, kurtosis, and variance analyses. In some cases, respondents were removed from the sample because their responses were far different from the overall sample. In the end, 59 of the 69 respondents were used to develop the final weights, and the weighted PDRI score sheet is provided in Appendix I (Cho 2000). An unweighted PDRI score sheet and 38 pages of element descriptions can be found in a separate document entitled Project Definition Rating Index (PDRI), Building Projects (CII 1999).

The PDRI score sheet is used to evaluate the level of completeness of the project scope definition at a point in time. Each of the 64 elements is subjectively evaluated by key project stakeholders during preproject planning based on its level of definition versus its corresponding description. Six levels of definition are listed across the top of the PDRI score sheet, creating a matrix with the 64 elements. These six definition levels, including level 0 for not applicable, range from complete definition for level 1 to incomplete or poor definition for level 5. Depending on how well the element is defined in the scope definition package, the PDRI score sheet user can check the appropriate definition level for a particular element, ranging from completely defined to incomplete or poor definition.

Adding up the individual element evaluations and their corresponding weights yields a single PDRI score for the project, which can range from 70 to 1,000. The lower the total PDRI score, the better the project scope definition. Higher weights signify that certain elements within the scope package lack adequate definition and should be reexamined prior to construction documents development.

ANALYSES

Analyzing Weighted PDRI

The three sections and 11 categories of the PDRI were sorted in order of importance as shown in Table 1. The weight column corresponds to a summation of all definition level 5 values for that category or section. In other words, if all elements in that section or category were incomplete or undefined, these would be the scores.

Section II, Basis of Design, and Section I, Basis of Project Decision, in combination comprise 841 points, or approximately 84% of a potential 1,000 points. This indicates the significance of having a sound basis of design and project decision prepared in the project scope definition package during the preproject planning phase, as identified by the workshop panelists. It also signifies the importance of owner input and active participation of critical owner stakeholders during the planning stage of a project. The category weights, sorted in hierarchical order of importance, indicate that Categories A and E were deemed the most important of the 11 categories, receiving 376 of the 1,000 total points. A list of the 10
TABLE 1. PDRI Section and Category Weights

  Section Weights
  II   Basis of Design                 428
  I    Basis of Project Decision       413
  III  Execution Approach              159
       Total                         1,000

  Category Weights
  A  Business Strategy                           214
  E  Building Programming                        162
  C  Project Requirements                        131
  F  Building/Project Design Parameters          122
  D  Site Information                            108
  B  Owner Philosophies                           68
  K  Project Control                              63
  L  Project Execution Plan                       60
  G  Equipment                                    36
  H  Procurement Strategy                         25
  J  Deliverables                                 11
     Total                                     1,000

TABLE 2. Ten Highest Weighted PDRI Elements

  Element
  designator  Element                                      Weight
  A1          Building Use                                     44
  A5          Facility Requirements                            31
  A7          Site Selection Considerations                    28
  A2          Business Justification                           27
  C6          Project Cost Estimate                            27
  A3          Business Plan                                    26
  C2          Project Design Criteria                          24
  C3          Evaluation of Existing Facilities                24
  A6          Future Expansion/Alteration Considerations       22
  F2          Architectural Design                             22

highest weighted elements in descending order is shown in Table 2. These 10 elements total 275 points, or approximately 28% of the 1,000 total points. (Each element has a corresponding detailed description, which is not given here.) The 10 highest weighted elements can be regarded as the most important elements in the project scope definition package and, if poorly or incompletely defined during early project planning, will have the greatest negative impact on project performance. If a project team lacks the time for preproject planning prior to the development of construction documents and construction, these elements are the critical few that should be considered.

Oftentimes, there is a tendency in the construction industry to skip several steps in the scope definition process in an attempt to reduce overall project cycle time. This may be due to several reasons, such as a lack of necessary expertise within the organization, demand for the end product, or an unwillingness to commit the funds required for complete scope definition. If this happens, at least those critical few elements identified in Table 2 should be considered during preproject planning in order to meet the project objectives and reduce risk.

PDRI VALIDATION

Although the weights obtained for PDRI elements were based upon the expertise of experienced project managers, architects, and engineers, the tool needed to be tested on actual projects to verify its capabilities and value. In order to establish an unbiased, reliable validation data sample from an analytical and statistical standpoint, a number of both successful and unsuccessful projects were used for the validation. The primary goal of the validation process was to correlate PDRI scores with project success measured in terms of cost performance, schedule performance, change orders, and customer satisfaction. A mail survey was used to collect quantitative and historical project data as well as “level of definition” PDRI element status at the beginning of construction document (CD) development. These data were used to build profiles of the sample and to assess the PDRI with regard to project success.

The PDRI for Building Projects was tested on a total of 33 completed projects varying in size from a final cost of $0.9 million to $200 million, as shown in Table 3. The sample was a nonrandom sample from 10 organizations, with the PDRI scored “after the fact.” These projects represented approximately $899.5 million in total constructed cost, with a $26.8 million average.

TABLE 3. PDRI Validation Projects

  Project                                      Estimated cost   PDRI
  number   Type of project                     (million $)      score
  1        Office                              $10.0            256
  2        Recreational/athletic facility      $32.6             96
  3        Office                              $34.8            164
  4        Warehouse                           $45.9            203
  5        Recreational/athletic facility      $122.5           285
  6        Stores/shopping center              $200.0           460
  7        Office                              $10.2            141
  8        Office                              $8.7             130
  9        Research/laboratory facility        $0.9             208
  10       Research/laboratory facility        $0.9             202
  11       Research/laboratory facility        $43.4            204
  12       Industrial control building         $25.0            126
  13       Office                              $8.7             240
  14       Office                              $14.1            223
  15       Government border station           $4.2             172
  16       Government border station           $1.7              95
  17       Courthouse                          $132.9           238
  18       Store/shopping center               $1.8             233
  19       Fire station                        $1.6             218
  20       Retail/car dealership               $1.6             158
  21       School                              $23.1            102
  22       School                              $23.0            139
  23       Research/laboratory facility        $3.3             149
  24       Office                              $13.4            648
  25       Research/laboratory facility        $9.7             202
  26       Seismic protection                  $16.1            188
  27       Warehouse                           $25.7            151
  28       Office                              $6.4              74
  29       School                              $13.2            160
  30       Institutional building              $18.1            205
  31       Recreational/athletic facility      $24.2            238
  32       Public assembly/performance         $18.2            165
  33       Office                              $3.6             216
           Totals                              $899.5

Using an unweighted PDRI score sheet, the validation questionnaire respondents were asked to rate how well developed each of the 64 elements was at the time the project was ready to begin development of construction documents. This use of an unweighted score sheet minimized the tendency of element weights to influence the evaluation process. Respondents indicated their choice for each element by placing a check mark in the box corresponding to the appropriate level of definition on a scale ranging from 0 to 5. When the questionnaire was returned, the writers converted this series of checks into a final project score.

The PDRI scores for the 33 sample projects ranged from 74 to 648 (from a possible range of 70 to 1,000), with a mean value of 203 and a median of 202. Among the 33 sample projects, 16 projects scored below 200 and the remaining 17 scored above 200. The survey questionnaire captured detailed project information such as schedule, cost, changes, financial and investment information, operating information, and customer satisfaction (Cho 2000).

The writers realize that the project planning data used in the sample were collected by relying on the respondents’ subjective recollections and, therefore, could be subject to biases. However, given the level of industry input in the tool development phase and the sample size, the results are adequate to
provide an initial tool validation, pending further study in the future.

Project Performance Analyses Using Target PDRI Score

In order to determine a PDRI score that distinguishes successful and unsuccessful projects, several different PDRI sample segregation points (e.g., 150, 200, and 210) were used to test the mean performance differences. Using these segregation points, mean values of project performance variables were compared at a 95% confidence level. The writers found statistically significant mean differences on several performance variables when the segregation point of 200 was used.

The writers consistently observed a statistically significant difference in performance between the projects scoring above 200 and the projects scoring below 200 prior to development of construction documents, as shown in Table 4.

Performance is the mean percentage change in actual cost (contingency not included) and schedule performance as compared with that estimated prior to development of construction documents (CDs). The reported change order value represents the cost increase/decrease during design and construction due to change orders as an absolute value.

The validation projects scoring below 200 outperformed those scoring above 200 in three important design/construction outcome areas: cost performance, schedule performance, and the relative value of change orders as compared with the authorized cost. In addition to the cost and schedule differences, the projects scoring less than 200 performed better financially, had fewer change orders, had less turbulence related to design size changes during CD development and construction, and were generally rated more successful on average than projects scoring higher than 200. Additional performance data are summarized in Table 5.

TABLE 4. Summary of Cost, Schedule, and Change Order Performance for PDRI Validation Projects Using 200-Point Cutoff

                                 PDRI Score
  Performance      <200                 >200                  Difference
  Cost             1% above budget      6% above budget       5%
  Schedule         2% behind schedule   12% behind schedule   10%
  Change orders    7% of budget         10% of budget         3%
                   (N = 16)             (N = 17)

TABLE 5. Summary of Other Performance Data for PDRI Validation Projects Using 200-Point Cutoff

                                                        PDRI Score
  Performance                                           <200       >200
  Average PDRI score                                    138        264
  Average number of change orders                       58         95
  Financial performance (scale of 1–5)                  3.4        3.2
  Average percent design size and design size changes   100.1      99.1
  During CD development or construction(a)              3          7
  Project success (scale of 1–5)                        4.9        4.2
                                                        (N = 16)   (N = 17)
  (a) Denotes number of projects with design size changes out of subsample.

PDRI Validation Using In-Progress Projects

While the validation process as discussed was performed on completed projects, the PDRI was also used by the writers and research team members on current, ongoing projects in a group setting to observe its effectiveness in helping teams complete preproject planning activities. It was used on a total of 20 projects at different stages of planning, as outlined in Table 6. In each case, the PDRI gave project team members a viable platform to discuss project-specific issues and helped identify critical planning problems on every project. Examples of problems identified included site-specific issues such as flood plain encroachment, fire water pressure shortfalls, traffic flow problems, permitting surprises, and setback problems. Examples of building problems identified included poor equipment lists, inadequate space planning, undersized utilities, code violations, and so on. These problems were identified at a point in the project when they could be addressed with minimal disruption and cost.

Specific observations include the following:

• The PDRI can be used effectively more than once during project planning.