
Decision-Support System for Assessment of Alternative Signal Controllers using Expert
Knowledge Acquisition

Milos N. Mladenovic
Graduate Research Assistant
Charles Via Department of Civil and
Environmental Engineering
301-D, Patton Hall
Virginia Polytechnic Institute and State
University
Blacksburg, VA, 24061
Phone: (540) 553-5949
E-mail: milosm@vt.edu

Sudarshan Gopinath
Graduate Research Assistant
Grado Department of Industrial and Systems
Engineering
250 Durham Hall
Virginia Polytechnic Institute and State
University
Blacksburg, VA 24061
Phone: (540) 808-9909
E-mail: sudgopi@vt.edu
Montasir M. Abbas, Ph.D., P.E. (Corresponding
author)
Associate Professor
Charles Via Department of Civil and
Environmental Engineering
301-A, Patton Hall
Virginia Polytechnic Institute and State
University
Blacksburg, VA, 24061
Phone: (540) 231-9002
E-mail: abbas@vt.edu

Guillaume E. Faddoul
Student
Master of Business Administration
1044 Pamplin Hall
Virginia Polytechnic Institute and State
University
Blacksburg, VA 24061
Phone: (540) 557-7353
E-mail: faddoul@vt.edu

Submitted for presentation and publication in the proceedings of the
92nd Transportation Research Board Meeting, January 2013

Word Count:
Text: 4250
Tables and Figures: 250 × 13 = 3250
Total: 7500

Key words: Advanced Transportation Controller, Multi-Attribute Decision Making, Analytic
Hierarchy Process, Performance Index

TRB 2013 Annual Meeting Paper revised from original submittal.

ABSTRACT
The standardization of controller hardware under the Advanced Transportation Controller
standard provided the flexibility for customized development of controller software. However, the
multitude of controller software features on the market presents a challenge when assessing
controllers for future system procurement. This paper presents a decision-support application
based on the Analytic Hierarchy Process technique for the assessment of traffic signal controllers.
The method used to evaluate the controllers is based on a set of criteria derived from market
controller features. The application consists of knowledge, model, data, and dialog management
components. It was developed in MS Excel, along with a supporting MS Access database. The
application depends heavily on expert knowledge acquisition. It provides a weighted Performance
Index along with decision-support visualization aids. The application was developed using
previous research originally conducted for a signal system under the purview of the Virginia
Department of Transportation (VDOT). However, the framework and application are transferable
among agencies and can potentially be used for the evaluation of other signal infrastructure.


INTRODUCTION
Traffic signals are among the vital elements of traffic control systems under the purview of any
Department of Transportation (DOT). In current practice, Traffic Signal Engineers and Traffic
Signal Field Technicians in DOTs often face pressure to extract additional benefits from existing
signal control equipment, driven by evident increases in demand and changing traffic patterns. In
addition, these engineers and technicians have to evaluate and upgrade traffic signal controllers
once they reach the limit of their operational capacity. Their evaluation concerns are heightened
by the fact that a DOT usually assumes the responsibility of operating the selected traffic signal
controller for the next 15 to 20 years.

Recent efforts under the Advanced Transportation Controller standard introduce additional
complexity for controller evaluation. The Advanced Transportation Controller (ATC) standard is
a combination and improvement of all the previous North American operational and hardware
standards (NEMA TS1, NEMA TS2, 170, 2070) [1, 2]. This standard introduced a high level of
hardware standardization, defining signal cabinet elements, controller elements, the operating
system, etc. However, the standard does not place significant focus on controller software. This
gave third-party vendors the opportunity to develop signal control software according to
customized agency needs, or simply driven by competitive-market development. Nowadays, the
North American market includes over ten companies developing signal controller software,
which frequently has over 200 control parameters [3, 4].

The multitude and complexity of software options introduce decision-making issues in selecting
an optimal solution for an agency's future signal controller. Considering previous research and
practice [5-8] developed for selecting a future signal controller, it is noticeable that the focus was
not primarily on controller software, but rather on cabinet-controller compatibility and on
establishing some base-level requirements. These requirements are primarily related to device
compatibility, equipment life, the ability to generate reports, and some operational functions.
However, there is no complete procedure or tool that would help DOTs evaluate signal controller
software. The literature [2] suggests that there has been previous research on the evaluation of
signal infrastructure. However, the alternatives were evaluated from a macro perspective, and the
evaluation procedure did not consider the actual features of the controllers.
DSS FOR EQUIPMENT SELECTION
The complex situation described above has made apparent the essential need for improved and
analytically-based decision-making in selecting the optimal controller for procurement. If this
were a single-criterion problem, decision making would be intuitive [9]. However, considering
that there are several alternative controllers, multiple criteria (multiple controller features),
preference dependence, etc., there is a need for more sophisticated evaluation methods.


In addition, DOTs across the US have many Signal Operation Engineers and Traffic Signal Field
Technicians with different training levels. These engineers and technicians have extensive expert
knowledge and trial-and-error experience in specific aspects of controller fine-tuning. This
dispersed knowledge base has the potential to be effectively integrated and organized into a
decision-support system (DSS) [2] for selecting the future signal controller. In addition,
considering that controller selection is a semi-structured decision problem, the problem cannot be
solved by existing classic mathematical models; the decision primarily relies on human intuition
[10].

By definition, a DSS is an interactive computer-based information system that uses data and
models to help solve semi-structured or unstructured problems and support managerial judgment
[11]. Important characteristics of a DSS are its flexibility, its ability to incorporate both data and
models, and its capability to provide a range of alternative solutions. Overall, a DSS is intended
to improve the quality of information on which a decision is based and to extend the range of
decision processes. As such, a DSS is ideal to support, rather than replace, the traffic signal
expert's role in choosing a future signal controller.

In transportation engineering, DSSs were primarily used for scheduling and managing trains,
fleets, or crews [12, 13], without any research on equipment selection. However, there has been
previous research in other fields using DSSs for equipment replacement decisions [14] and
equipment selection [15-17]. With this in mind, the research presented here is considered initial
research, focusing on developing a flexible decision-support application for selecting future
traffic signal controllers. This DSS, named Decision-Support System for Advanced
Transportation Controller Selection (DSS-ATCS), is partially developed from research intended
to help the Northern Region Operations (NRO) of the Virginia Department of Transportation
(VDOT) in the selection of their future controllers [18]. The complete evaluation procedure
developed in that research is presented in a separate paper. The focus of this paper is on the DSS
application and its generalized and flexible procedure for evaluating future traffic signal
controllers.
DSS-ATCS CONCEPTUAL MODEL
Considering that DSS-ATCS is intended to help decision-making in selecting future traffic signal
controllers, it needs to be able to rank and compare traffic signal controllers. This comparison is
based on controllers' programmable features and their correspondence to the agency's functional
requirements. Considering that the expert knowledge of DOT traffic engineers is very valuable in
this decision-making process, DSS-ATCS is designed to perform expert knowledge acquisition
[19]. That knowledge database is used further in the decision-making process. DSS-ATCS is
designed to provide flexibility in collecting and analyzing the expert knowledge, in order to
achieve the optimal solution for all the stakeholders. Using DSS-ATCS, DOT decision-makers
should have the opportunity to do a "what-if" alternative analysis, include in the decision process
any preferences not initially expressed, and have supporting graphical representation for
improved decision-making. DSS-ATCS consists of knowledge, model, data, and dialog
management components. In addition, DSS-ATCS has relations with external data and the
system user. A conceptual model for DSS-ATCS is presented in FIGURE 1. A description of
each of the components is presented in the following sections.

FIGURE 1: Conceptual model of DSS-ATCS

MODEL MANAGEMENT
Considering that this DSS is primarily concerned with alternative choice selection, a typical
model choice for this type of DSS issue is Multi-Criteria Decision Making (MCDM) techniques
[20]. In addition, MCDM was selected because:
- It is transparent and cross-referenced to other sources of information,
- It allows analytical input to be a responsibility of a group of experts, and
- Evaluation of the results can be delivered in a multifaceted form, helping decision-makers
  to increase their understanding of the choices made.

As defined, MCDM is a decision theory approach and a set of techniques that helps in coherent
ordering of conflicting options [9]. DSS-ATCS classifies as Multi-Attribute Decision Making
(MADM), since the decision space is discrete and each candidate alternative can be evaluated
using a combination of analytical tools. Each MADM evaluation model is defined by [21]:
- a set of alternatives,
- a set of criteria or attributes for evaluation, and
- a decision matrix.

MADM has been used in decision-making problems in transportation, for planning purposes [22]
or for planning in highway asset management [23, 24].

Analytic Hierarchy Process
There are many MADM techniques implemented in practice (e.g., Simple Additive Weighting,
ELECTRE, TOPSIS, PROMETHEE). Essentially, all MCDM techniques are similar, since they
are based on criteria weights and scores. However, the evaluation procedure and the emphasis of
each technique are different. For example, Simple Additive Weighting is the simplest technique,
using the addition of weighted criteria scores. This is the reason this technique is the most general
one, but it is not always the best one. In contrast, a technique such as ELECTRE is based on the
relative comparison and elimination of alternatives. That technique is applicable to choosing,
ranking, and sorting problems [21].

For the proposed DSS, the research team chose to use the Analytic Hierarchy Process (AHP)
technique [25]. AHP was developed to model subjective decision-making processes based on
multiple attributes in a hierarchical system; it structures the process as a hierarchical
decomposition, reducing complex problems into sub-problems [26]. The AHP procedure has five
steps, which are described here in the context of DSS-ATCS:
1) Establish the decision context, and decompose the problem into an interrelated hierarchy
   of goal, criteria, sub-criteria, and alternatives.
2) Collect data from the experts or decision-makers as a pairwise qualitative comparison of
   criteria to create a reciprocal matrix.
3) Organize the pairwise criteria comparisons into a square quantitative matrix.
4) Evaluate matrix consistency.
5) Multiply the weights of the criteria with the score for each attribute, then aggregate to
   obtain local ratings with respect to each criterion, which are finally aggregated to obtain
   global ratings.
Decision context
Establishing the decision context by decomposing the problem into an interrelated hierarchy of
goal, criteria, and alternatives, we have the following:
- Goal: selecting an optimal traffic signal controller
- Criteria:
  o Controller hardware and software
  o General traffic operation
  o Coordination and plan selection
  o Signal Preemption (PE) and Transit Signal Priority (TSP)
  o Pedestrian and bike operation
  o Reports, data archiving needs, and communications requirements
  o Advanced controller features
- Alternatives: different market ATCs


A well-defined set of criteria is important because it allows each of the criteria to be quantifiable
and easily evaluated [27]. The decision criteria for DSS-ATCS were developed in previous
research by the authors [18]. The decision criteria are related directly to the features of traffic
signal controller hardware and software. Decision variables are presented as sub-criteria in the
lowest hierarchical level in FIGURE 2. These criteria were developed by gathering interview and
survey input from signal control experts across North America. In addition, the authors used
highly detailed information about the actual controller features available on the market. During
this research, over 50 specific controller features were identified. These features were clustered
into groups that are mapped to distinguished groups [28], which became the decision criteria.
This clustering was intended to ease the calculation of weights in the MADM application and to
facilitate high-level views on issues with related trade-offs. The derivation of criteria and relative
weights is performed according to a hierarchical system [9]. The decomposition of the problem
of selecting an optimal signal controller into a hierarchy of interrelated elements is presented in
FIGURE 2, which shows the sub-criteria for the seven decision criteria developed in the previous
research by the authors [18].


FIGURE 2: Hierarchical structure of criteria and sub-criteria

Pairwise criteria comparison
As the second step in AHP, we need to collect data from the traffic signal experts as a pairwise
qualitative comparison of criteria to create a reciprocal matrix. The weighting process is usually
based on the explicit conversion of expert knowledge obtained through interviews of signal
control experts. Linguistic variables are used for the pairwise comparison of criteria in the second
AHP step [29]. For the transformation of qualitative expert estimates into quantitative ones, the
approach uses the standard 9-point Saaty scale [29] (FIGURE 3), based on linguistic variables:

FIGURE 3: Saaty's comparison scale

By changing the weights of criteria according to their priority for the future control system, the
analyst can affect the final scores of mutual assessment of controllers. For example, a certain
corridor might require both transit and pedestrian facilities, but the transit features may be more
important than the pedestrian requirements. This relative importance can be well represented in
the reciprocal matrix. As each linguistic variable has a corresponding numerical value, we first
calculate the Relative Value Vector. This is performed by multiplying the entries in each row of
the weight matrix and calculating the n-th root of each product; dividing each n-th root by the
sum of all n-th roots then yields the normalized eigenvector elements, which add to one. Having
organized the pairwise criteria comparisons into a square quantitative matrix, this completes the
third AHP step.
Evaluating matrix consistency
The weights assigned by the expert are consistent if they are transitive. However, since human
judgment is usually not consistent, the AHP process requires the comparison of expert judgments
to purely random judgments. There are several methods for assessing the consistency of weights,
including the Eigenvector method, Weighted Least Squares, the Entropy method, etc. In this
particular case, the weights are assessed using the Eigenvector method [17, 30]. The relative
weights from the previous step are given as the right eigenvector $w$, corresponding to the
largest eigenvalue $\lambda_{\max}$, as:

$$ A w = \lambda_{\max} w $$

If the pairwise comparisons are completely consistent, the matrix $A$ has rank one and
$\lambda_{\max}$ is equal to $n$. Consistency is defined by the transitiveness between the
entries of matrix $A$, as $a_{ij} a_{jk} = a_{ik}$. In order to determine matrix consistency, we
need to calculate the Consistency Ratio (CR). Before CR, however, we need to determine the
Consistency Index (CI) as

$$ CI = \frac{\lambda_{\max} - n}{n - 1} $$

Finally, CR is calculated as the ratio of CI to the Random Index (RI), where RI is the value of CI
for a purely random weight matrix. For a matrix with seven elements (the number of criteria in
DSS-ATCS), this value is 1.32. The value of 0.1 is the acceptable upper limit for CR. If the final
CR exceeds this value, the input of the weight comparisons needs to be repeated. This completes
the fourth AHP step, evaluating matrix consistency, with importance weights assigned to each
criterion. An example of the eigenvector and CR calculation for hypothetical weights of four
criteria is presented in Table 1. The RI obtained from Saaty's list of random values was 0.9, since
there are four criteria ($n = 4$).
Table 1: Example of CR calculation

             A     B     C     D  | nth root of | Eigenvector  | Sum of products of      | (Sum of products)
                                  | product     | (normalized) | weights and eigenvector | / eigenvector
A           1.00  0.33  0.11  0.20|    0.29     |     0.06     |          0.25           |      4.05
B           3.00  1.00  1.00  1.00|    1.32     |     0.27     |          1.12           |      4.12
C           7.00  1.00  1.00  3.00|    2.14     |     0.44     |          1.81           |      4.10
D           7.00  1.00  0.20  1.00|    1.09     |     0.22     |          1.01           |      4.49
SUM                               |    4.84     |     1.00     |                         |
MEAN (λ_max) = 4.19     CI = 0.06     CR = 0.07
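The consistency calculation in Table 1 can be reproduced with a short script. This is a minimal
sketch: the matrix is the hypothetical four-criterion example from the table, and the code follows
the n-th-root weight approximation and the CI/CR definitions above:

```python
import math

# Hypothetical 4x4 pairwise comparison (reciprocal) matrix from Table 1
A = [
    [1.00, 0.33, 0.11, 0.20],
    [3.00, 1.00, 1.00, 1.00],
    [7.00, 1.00, 1.00, 3.00],
    [7.00, 1.00, 0.20, 1.00],
]
n = len(A)

# Weights via the n-th-root (geometric mean) approximation of the eigenvector
roots = [math.prod(row) ** (1.0 / n) for row in A]
w = [r / sum(roots) for r in roots]              # normalized, sums to one

# lambda_max: average of (A w)_i / w_i over all rows
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / w[i] for i in range(n)) / n

CI = (lambda_max - n) / (n - 1)                  # Consistency Index
RI = 0.9                                         # Saaty's Random Index for n = 4
CR = CI / RI                                     # Consistency Ratio

print(round(CR, 2))  # prints 0.07, below the 0.1 acceptance threshold
```

The intermediate values match the table columns: the n-th roots sum to about 4.84, λ_max is
about 4.19, and CR comes out to about 0.07, so this expert input would be accepted.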
Calculation of Performance Index
As the last AHP step, the analyst needs to calculate the Performance Index (PI). This value is
used to compare decision alternatives. The weighted PI is obtained by multiplying each attribute
value by its importance weight and then summing over all attributes. Assuming a set of
importance weights $w = \{w_1, w_2, \ldots, w_n\}$, the most preferred alternative $A^*$ is
selected as

$$ A^* = \left\{ A_i \,\middle|\, \max_i \sum_{j=1}^{n} w_j x_{ij} \right\} $$

where $x_{ij}$ is the score of the $i$-th alternative on the $j$-th attribute. Usually the weights
are normalized so that $\sum_{j=1}^{n} w_j = 1$, in order to obtain mutually comparable scores.
Therefore, in the maximization case, the best alternative is the one that corresponds to the largest
PI. Using the above formula, the PI values can be calculated for each controller, thus creating a
global numerical comparison scale. The PI value is a measure of controller performance and
represents the score/benefit of the alternative systems.
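The PI ranking step can be sketched as follows. The controller names, criterion labels, weights,
and scores here are all invented for illustration; only the weighted-sum formula comes from the
text above:

```python
# Hypothetical criterion weights (normalized to sum to one) and normalized
# feature scores x_ij in [0, 1] for three candidate controllers
weights = {"coordination": 0.40, "preemption": 0.35, "pedestrian": 0.25}

scores = {
    "Controller X": {"coordination": 0.9, "preemption": 0.6, "pedestrian": 0.7},
    "Controller Y": {"coordination": 0.7, "preemption": 0.9, "pedestrian": 0.5},
    "Controller Z": {"coordination": 0.5, "preemption": 0.7, "pedestrian": 0.9},
}

# Weighted Performance Index: PI_i = sum_j w_j * x_ij
pi = {name: sum(weights[c] * s[c] for c in weights) for name, s in scores.items()}

# Rank alternatives; the largest PI identifies the preferred controller A*
ranking = sorted(pi.items(), key=lambda kv: kv[1], reverse=True)
best = ranking[0][0]
```

With these numbers, "Controller X" wins on the strength of the heavily weighted coordination
criterion, even though each controller leads on one criterion, which is exactly the trade-off the
global PI scale is meant to resolve.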
KNOWLEDGE MANAGEMENT
Knowledge management in DSS-ATCS is introduced to include the expert knowledge of Signal
Technicians and Traffic Signal Engineers. This component is intended primarily for tacit
knowledge acquisition [19] regarding criteria weights and evaluation scores. In addition, the
knowledge acquisition is enhanced by extensive use of graphical presentations. These
presentations are described in the dialog management component of DSS-ATCS. A conceptual
presentation of the expert knowledge acquisition process is given in FIGURE 4. The main
sub-process is expert weight input, which takes a top-down approach, starting with DOT
operational goals. This input can be gathered from senior Traffic Engineers. Usually these goals
are safety, mobility, and environmental protection, and they directly influence the importance of
specific controller features. Consequently, this relates to the relative importance weights of
specific criteria for the future signal controller (e.g., Coordination and plan selection, or
Pedestrian and bike controller features). The sub-process of expert weight input is performed by
Signal Technicians and Traffic Signal Engineers, and is directly related to its consistency check
sub-process. The analytical procedure for the weight-matrix consistency check was described in
the previous section; this is the first level of assessment of expert input. These two elements of
the expert knowledge acquisition process are further directly related to weight visualization,
which is the second level of expert input assessment. Finally, expert weight input is also related
to evaluation visualization, which is the third level of expert input assessment.

In addition to criteria weights, scores for all the alternative controller features are an external
input to the expert knowledge acquisition process. The scoring can be obtained using controller
manuals, user and vendor surveys, and modeling and testing of controller features (for further
details see [3, 4, 31-34]). Each controller feature receives a score according to the level at which
it satisfies the technical requirements. The scoring of controller capabilities is done in integer
values, which are then normalized to the range of zero to one. The scoring of alternative
controller features can be performed by the DOT or by an external contractor. In DSS-ATCS,
these values are stored in an MS Access database, which is the data management component.

FIGURE 4: Conceptual model of expert knowledge acquisition

DATA MANAGEMENT
The DSS-ATCS data management component is a collection of interrelated data structures
organized to share information. This component is represented by an MS Access database, which
enables storage of expert knowledge from different experts and at different points in time. The
database itself contains four tables: Criteria, Char, User, and Weight (FIGURE 5). Table Criteria
contains the field Criteria ID, which is the identification number for the decision criteria, and the
field Criteria, which contains all the decision criteria (seven in this DSS). This table allows the
decision criteria to be changed for further customizable development of DSS-ATCS. Table Char
is related to table Criteria. This table has the fields Characteristic ID, Criteria ID,
CharacteristicName, ControllerScore, and Scoring System. The fields Characteristic ID and
CharacteristicName relate to the list of all the controller features that belong to each of the seven
decision criteria in DSS-ATCS (e.g., 16-phase operation, phase reservice during Free and
Coordinated operation, number of offsets per plan, 4-second early Walk, etc.). The
ControllerScore and Scoring System fields contain the score each controller feature has, along
with a description of the scoring procedure. The Criteria and Char tables have a one-to-one
relation using CriteriaID, and are primarily populated through external data. The User table
contains several fields intended to store information on all the experts that were part of the
knowledge acquisition. The fields in this table store name, title, organization, and date-related
information. The Weight table is related to the User table in a one-to-many relationship, and
contains all the expert input for weight relationships. This enables storage of all the expert
inputs, which can later be retrieved and compared. This way, the system is able to compare
multiple users' comparison matrices at once. The MS Access database is linked to the MS Excel
application in which DSS-ATCS is developed.
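The four-table layout described above can be sketched as a schema. The actual tool uses MS
Access; here it is reproduced in SQLite purely for illustration, and the column types, the Weight
table's row/column fields, and the surrogate key names are assumptions beyond the field list
given in the text:

```python
import sqlite3

# In-memory sketch of the Criteria / Char / User / Weight layout
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE "Criteria" (
    CriteriaID   INTEGER PRIMARY KEY,
    Criteria     TEXT NOT NULL            -- one row per decision criterion
);
CREATE TABLE "Char" (
    CharacteristicID   INTEGER PRIMARY KEY,
    CriteriaID         INTEGER REFERENCES "Criteria"(CriteriaID),
    CharacteristicName TEXT,              -- e.g., '16-phase operation'
    ControllerScore    REAL,              -- normalized 0..1 feature score
    ScoringSystem      TEXT               -- description of scoring procedure
);
CREATE TABLE "User" (
    UserID       INTEGER PRIMARY KEY,
    Name         TEXT, Title TEXT, Organization TEXT, EntryDate TEXT
);
CREATE TABLE "Weight" (                   -- one-to-many: User -> Weight
    WeightID     INTEGER PRIMARY KEY,
    UserID       INTEGER REFERENCES "User"(UserID),
    RowCriteria  INTEGER, ColCriteria INTEGER,
    Intensity    REAL                     -- Saaty value, 1/9 .. 9
);
""")
con.execute("INSERT INTO \"Criteria\" VALUES (1, 'Coordination and plan selection')")
```

Storing one Weight row per user and criterion pair is what lets the system retrieve and compare
multiple users' comparison matrices at once.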

FIGURE 5: MS Access database relationships

DIALOG MANAGEMENT
The dialog management component of the DSS-ATCS application is designed to allow
step-by-step user input while also allowing backward progression through the application menus.
Some DSS experts feel that the user interface is the most important component, because many of
the distinguishing characteristics of a DSS (such as power, flexibility, and ease-of-use) are
derived from this component [13, 31]. According to the presented framework, the research team
developed an application based on MS Excel, using Visual Basic for Applications (VBA) [35].
Each of the DSS application menus is described below.
User Selection
User Selection (FIGURE 6) is the first menu displayed to the potential user of this application.
This menu is used for selecting the user that will perform the decision-making procedure. The
decision-maker can be selected from the drop-down combo box. By clicking the Proceed button,
the user is guided to the next menu for knowledge acquisition. In addition, the buttons below the
drop-down combo box provide the capabilities to add a user, change a user's info, or delete an
existing user. The Exit Application button is used for closing the DSS application.


FIGURE 6: User Selection menu

Pairwise Comparison of Criteria Weights
The second menu (FIGURE 7) in the decision-making process is primarily directed towards
collecting expert knowledge from the agency's traffic engineers. In this menu, the expert traffic
engineer decides on the relative importance between the decision-making criteria. These criteria
are provided in the central matrix, encircled in the red ellipse in FIGURE 7. The user can see
additional information on each decision criterion by holding the cursor above a specific criterion.
The expert is expected to perform the relative comparison as a relation of the criteria in list 1
(rows) to the criteria in list 2 (columns). The comparison of criteria is based on the Saaty
comparison scale, which is provided on the left side of the menu. The relative comparison is
performed using linguistic parameters on a scale from Extremely Unimportant up to Extremely
Important. Each linguistic parameter has a corresponding numerical intensity from 1/9 to 9. By
selecting a linguistic parameter from the combo box in the comparison matrix, the expert decides
on the relative importance or unimportance of the criteria in list 1 with respect to the criteria in
list 2. The Help button provides additional instructions and examples for the relative comparison
of criteria. In addition, the user has the option to retrieve a previous weight selection using the
combo box in the lower left corner of this menu interface.


FIGURE 7: Pairwise comparison of criteria weights

While the user is assigning relative importance between criteria, the pie chart below the matrix
shows the criteria weight distribution on a hundred-point scale. This dynamic graphical feature
enables the expert traffic engineer to perceive the absolute importance each criterion will have in
the subsequent decision-making process. In addition to this weight distribution pie chart
(FIGURE 8), this menu shows the consistency ratio in percent.

FIGURE 8: Example of criteria weights distribution

If the user input is inconsistent and an error message is displayed in the box below the
consistency ratio, the user can click the Detailed Info of Inconsistencies button. This will
generate a message box showing the specific relations that have potential logical inconsistencies.
The Refresh Matrix button resets the relative weights among criteria. Finally, the GENERATE
REPORT AND DECISION MATRIX button in the bottom right corner leads the decision-maker
to the final decision-making menu.

Generating Report
The last menu in the decision-making procedure is used for generating the report. FIGURE 9 and
FIGURE 10 present the default view, which shows two radar graphs. The radar graph at the top
of the report presents controller capabilities per criterion for the relative criteria weights selected
in the previous step. Below this radar graph are three color-coded cells (red, yellow, green) that
represent the absolute score for each controller. The green color is assigned to the cell with the
highest score, while the cell with the lowest score is assigned red. The radar graph at the bottom
of the default report menu gives the decision-maker an opportunity to select an alternative set of
absolute criteria weights. The selection of alternative absolute criteria weights can be performed
using the combo boxes on the lower right, encircled in green in FIGURE 10. The user can also
re-enter relative weights in the comparison matrix by clicking the Re-Enter Comparison Matrix
button, or exit the application by clicking the EXIT button.

The View Decision Matrix button (marked in the red ellipse in FIGURE 9) generates a decision
matrix for the defined controller weights. This enables the user to see the details of the
decision-making scores and procedures, along with respective details on controller capabilities.
The user has the option to choose between expanded and collapsed views using the option
buttons below the View Decision Matrix button. An example of the collapsed view is presented
in FIGURE 11, where criteria names and weights are in yellow cells, while light blue cells show
the respective scores per criterion.

FIGURE 9: Upper part of the Decision Matrix and Radar Graph menu

FIGURE 10: Lower part of the Decision Matrix and Radar Graph menu

FIGURE 11: Collapsed view of the decision matrix and PIs

Interpretation of the results
As the final output, DSS-ATCS calculates the weighted PI for each controller, resulting in a purely ordinal scale with relative rankings. This enables the decision-maker to conclude which controller best satisfies the established criteria for the future control system. In addition, the decision-maker can observe which controllers have advanced or missing capabilities in some of the groups of required features. The radar graphs in FIGURE 9 and FIGURE 10 enable a comparison of capabilities per criterion for the three best controllers.

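The ranking logic described above can be sketched as follows. This is an illustrative Python sketch, not the tool's actual VBA implementation; the criteria names, weights, and controller scores are hypothetical.

```python
# Illustrative sketch of the weighted Performance Index (PI) ranking:
# PI = sum over criteria of (criterion weight * controller score),
# and controllers are ordered on the resulting ordinal scale.

def weighted_pi(weights, scores):
    """Weighted PI of one controller for the given criteria weights."""
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical normalized criteria weights (sum to 1.0).
weights = {"coordination": 0.5, "preemption": 0.3, "ped_bike": 0.2}

# Hypothetical scores per controller and criterion.
controllers = {
    "controller_1": {"coordination": 0.6, "preemption": 0.8, "ped_bike": 0.5},
    "controller_2": {"coordination": 0.9, "preemption": 0.7, "ped_bike": 0.4},
    "controller_3": {"coordination": 0.7, "preemption": 0.6, "ped_bike": 0.9},
}

ranking = sorted(controllers,
                 key=lambda c: weighted_pi(weights, controllers[c]),
                 reverse=True)
print(ranking)  # relative ordinal ranking, best first
```

The output is only a relative ranking: the PI values have no absolute meaning and serve solely to order the alternatives under the chosen weights.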
FINDINGS FROM THE TOOL DEVELOPMENT AND TESTING
Considering that DSS-ATCS is the first application for evaluating signal controllers, the findings from the tool development and testing are also important. The tool testing was performed using data from the project Evaluation of Merits and Requirements of Next-Generation Traffic-Control Systems for VDOT's Northern Region Existing Infrastructure [18]. One focus of this project was the evaluation of market controllers for a future VDOT signal control system. The controller scores created during that project were used for the operational testing of DSS-ATCS. Although this application development has not focused on determining scores for controller features, the authors emphasize that special attention needs to be dedicated to this step. Analytical scores for valid decision-making need to be determined through meticulous testing and analysis, as described in previous research [3, 4, 31-34].

As initially envisioned in the design, DSS-ATCS has proven to be a flexible decision-support system. DSS-ATCS is able to accommodate any changes in the weights or criteria, based on specific DOT requirements. Originally, the application was envisioned to be used by signal control experts, who would provide input on the importance of the criteria. During development, the user interface of the application was simplified so that it can be used by any decision-making stakeholder, not necessarily one with signal control expertise. However, the evaluation process must include signal control experts, considering that their knowledge base is important in finalizing the decision. At the same time, input from other stakeholders is welcome in the evaluation process, since it can broaden the perspectives considered. Furthermore, the application's interaction with the user provides feedback and help during input. The application can be used even if the user does not have an engineering background or an in-depth understanding of the underlying analytical engine. In addition to Help buttons, pop-up windows, and comments, the application has graphical tools that perform input assessment and provide instant feedback. The matrix consistency check, the pie chart for criteria weight visualization, and the radar graphs for evaluation visualization are additional tools that support input assessment. The combination of these tools helps the decision-maker determine the weights and generate different reports, while not imposing additional load on the potential user.

The application's step-by-step procedure is intended to allow iterative comparison and generation of several reports in a brief amount of time. DSS-ATCS gives decision-makers the opportunity to perform a "what-if" alternative analysis, comparing different criteria weights and controller capabilities. This allows the inclusion in the decision process of any preferences not initially expressed. The application also allows selection of previous sets of weights. In addition, the graphical representation of the output should increase communication transparency for reaching a consensus among decision-makers. Ideally, each decision-maker should perform the input of desired criteria weights individually, without consultation with others. This should enable unconstrained perspectives and alternative solutions. After these initial evaluations, all the reports can be gathered and discussed during a joint meeting. This would ensure that all different perspectives are included and that a mutually agreeable solution is reached.

In addition, there are several more findings related to the implementation of MCDM in this DSS. As stated above, MCDM was chosen for this DSS because it is transparent, cross-referenced, allows expert input, and delivers output in a multifaceted form. MCDM does not just help make a decision, but also increases the understanding of the choice made. The selection of a specific MCDM technique is especially important. AHP was chosen in this case because it is based on hierarchical decomposition and allows calculation of a PI for relative comparison of alternatives. In addition, the assignment of criteria weights is especially important because it has the biggest influence on the final PI values. This is the reason the weight input is the central theme of the application and is supported by several tools that help in input assessment. Whether the analysis is performed individually or in a group meeting, decision-makers need to be especially attentive to weight assignment. For example, as shown in FIGURE 10, the controller with the highest PI value is controller 2. However, the radar graph shape in that figure is significantly different from the one in FIGURE 12. FIGURE 12 presents the evaluation output with only one criterion weight changed: the weight for the criterion Pedestrian and bike controller features has been changed from "equally important" to "slightly important". Although this can be considered a small change, the PI values for the controllers are different, with controller 3 now having the highest value. This example shows the importance of weight assignment, since even small changes can have a significant impact on the final decision.

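The sensitivity of the final ranking to the criteria weights can be illustrated with a short sketch. This is a hypothetical Python example (not the paper's actual scores or the tool's VBA code), mirroring the FIGURE 10 versus FIGURE 12 comparison: a modest shift of weight toward one criterion flips the top-ranked controller.

```python
# Hypothetical sensitivity check: increasing the weight of the "ped_bike"
# criterion changes which controller has the highest weighted PI.

def weighted_pi(weights, scores):
    """Weighted PI = sum of criterion weight times controller score."""
    return sum(weights[c] * scores[c] for c in weights)

scores = {
    "controller_2": {"coordination": 0.9, "preemption": 0.7, "ped_bike": 0.4},
    "controller_3": {"coordination": 0.7, "preemption": 0.6, "ped_bike": 0.9},
}

base     = {"coordination": 0.50, "preemption": 0.30, "ped_bike": 0.20}
adjusted = {"coordination": 0.40, "preemption": 0.25, "ped_bike": 0.35}

best_base = max(scores, key=lambda c: weighted_pi(base, scores[c]))
best_adj  = max(scores, key=lambda c: weighted_pi(adjusted, scores[c]))
print(best_base, best_adj)  # the small weight shift changes the winner
```

Under the base weights controller 2 wins; under the adjusted weights controller 3 does, which is why the paper stresses careful, deliberate weight assignment.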
FIGURE 12: Example of a changed evaluation due to the increased importance of the criterion Pedestrian and bike controller features
CONCLUSION AND RECOMMENDATIONS
This paper presents a DSS application for evaluating traffic signal controllers based on the AHP technique. DSS-ATCS is designed to provide flexibility in collecting and analyzing expert knowledge, in order to achieve the optimal solution for all the stakeholders. Using DSS-ATCS, DOT decision-makers have the opportunity to perform a "what-if" alternative analysis, include in the decision process any preferences not initially expressed, and rely on supporting graphical representations for improved decision-making. DSS-ATCS consists of knowledge, model, data, and dialog management components. In addition, DSS-ATCS interacts with external data and the system user.

The application is highly dependent on knowledge acquisition from signal control experts. The expert input is primarily used in determining the criteria weights. The weight assignment process uses the eigenvector method for calculation, along with a Consistency Ratio for determining the logical consistency of the assigned weights. The weighted Performance Index calculated for each controller then determines the global ranking among the alternative controllers.

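The eigenvector and Consistency Ratio calculations just mentioned can be sketched as follows. This is an illustrative Python sketch (assumed, not the tool's actual VBA implementation): the principal eigenvector of a pairwise comparison matrix, found here by power iteration, yields the criteria weights, and the Consistency Ratio (CR) checks the logical consistency of the judgments. The 3x3 Saaty-scale matrix below is hypothetical.

```python
# AHP weight derivation: principal eigenvector of the pairwise comparison
# matrix gives the weights; CR = CI / RI flags inconsistent judgments.

# Saaty's Random Index (RI) for matrix orders 1..10 (index 0 unused).
RI = [0.0, 0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45, 1.49]

def ahp_weights(A, iters=100):
    """Return (weights, consistency_ratio) for pairwise comparison matrix A."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):                 # power iteration toward the
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]             # renormalize so weights sum to 1
    # Estimate lambda_max from A*w = lambda_max * w (componentwise average).
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam_max = sum(Aw[i] / w[i] for i in range(n)) / n
    ci = (lam_max - n) / (n - 1)           # Consistency Index
    cr = ci / RI[n] if RI[n] > 0 else 0.0  # Consistency Ratio
    return w, cr

# Hypothetical Saaty-scale judgments for three criteria.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
w, cr = ahp_weights(A)
print([round(x, 3) for x in w], round(cr, 3))  # CR < 0.1 => acceptable
```

A CR below the conventional 0.1 threshold indicates that the pairwise judgments are logically consistent enough for the derived weights to be trusted.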
The developed method is implemented in an MS Excel application with a supporting MS Access database. This application allows the decision-making process to be automated in any DOT. The flexible framework design can accommodate iterative changes of the weights according to local conditions and expert input. Each DOT has the flexibility to emphasize or deemphasize certain controller features. In addition to selecting the optimal future controller, the application allows identification of the potential for improvement of controller features.

The presented evaluation method can be enhanced even further by developing a method that assigns scores based on the Measures of Effectiveness expected from each controller feature. In addition, the application has yet to incorporate the capability of automatically integrating multiple users' pairwise comparison matrices into one. The developed method can also be applied to the evaluation of other signal infrastructure components, or to further MADM evaluation of controllers including all of their advanced features. The framework is thus considered transferable to the evaluation of other signal infrastructure.
ACKNOWLEDGEMENT
The test case for this work used information from the Evaluation of Merits and Requirements of Next-Generation Traffic-Control Systems for VDOT's Northern Region Existing Infrastructure project, funded by the Virginia Center for Transportation Innovation and Research (VCTIR). The authors are also thankful to Dr. Christopher Zobel from the Business Information Technology Department at Virginia Polytechnic Institute and State University for his insightful advice.
REFERENCES

1. Miller, D., Advanced Transportation Controller Standards and Interoperability. ITE Journal, 2010: p. 46-49.
2. Gordon, R.L. and W. Tighe, Traffic Control Systems Handbook. 2005, Federal Highway Administration.
3. Mladenovic, M. and M. Abbas, Multi-Scale Integration of Petri Net Modeling and Software-in-the-loop Simulation for Novel Approach to Assessment of Operational Capabilities in the Advanced Transportation Controllers, in 14th International IEEE Conference on Intelligent Transportation Systems (ITSC). 2011.
4. Mladenovic, M., M. Abbas, and M. Vodogaz, Implementation of the Software-in-the-loop simulation for assessment of operational capabilities in the North American Advanced Transportation Controllers. Journal of Road and Traffic Engineering, 2011(2011/2).
5. Ehlinger, E.B. and P. Farradyne, Successful Traffic Signal System Procurement Techniques. 2002, Federal Highway Administration.
6. Traffic Signal Requirements. 2005, City of Lakeland.
7. Traffic Signal System Improvement Program - 2007 Update Summary Report. 2007, Denver Regional Council of Governments.
8. Rauch, R. and M. James Cheeks Jr., The New York City, NY, USA, Traffic Signal System - A Standards-Based Approach. ITE Journal, 2005. 75(3): p. 36-40.
9. Tzeng, G.H. and J.J. Huang, Multiple Attribute Decision Making: Methods and Applications. 2011: CRC Press.
10. Turban, E., J. Aronson, and T.P. Liang, Decision Support Systems and Intelligent Systems, 7th Edition. 2005: Pearson Prentice Hall.
11. Ragsdale, C., Spreadsheet Modeling and Decision Analysis (with CD-ROM and Microsoft Project 2003 120-day version). 2006: South-Western College Publishing.
12. Eom, S., S.M. Lee, E. Kim, and C. Somarajan, A survey of decision support system applications (1988-1994). Journal of the Operational Research Society, 1998: p. 109-120.
13. Power, D.J. and R. Sharda, Decision support systems. Springer Handbook of Automation, 2009: p. 1539-1548.
14. Oeltjenbruns, H., W.J. Kolarik, and R. Schnadt-Kirschner, Strategic planning in manufacturing systems: AHP application to an equipment replacement decision. International Journal of Production Economics, 1995. 38(2): p. 189-197.
15. Chan, F., R. Ip, and H. Lau, Integration of expert system with analytic hierarchy process for the design of material handling equipment selection system. Journal of Materials Processing Technology, 2001. 116(2): p. 137-145.
16. Kulak, O., A decision support system for fuzzy multi-attribute selection of material handling equipments. Expert Systems with Applications, 2005. 29(2): p. 310-319.
17. Dağdeviren, M., Decision making in equipment selection: an integrated approach with AHP and PROMETHEE. Journal of Intelligent Manufacturing, 2008. 19(4): p. 397-406.
18. Abbas, M., M. Mladenovic, S. Ganta, Y. Kasareneni, Y. Li, A. Gharat, L. Chong, and A. Medina, Evaluation of Merits and Requirements of Next-Generation Traffic-Control Systems for VDOT's Northern Region Existing Infrastructure - Draft Final Report. 2011, Virginia Center for Transportation Innovation & Research.
19. Milton, N.R., Knowledge Acquisition in Practice: A Step-by-Step Guide. 2007: Springer Verlag.
20. Niu, L., J. Lu, and G. Zhang, Cognition-Driven Decision Support for Business Intelligence: Models, Techniques, Systems and Applications. Vol. 238. 2009: Springer Verlag.
21. Bouyssou, D., T. Marchant, M. Pirlot, A. Tsoukias, and P. Vincke, Evaluation and Decision Models with Multiple Criteria: Stepping Stones for the Analyst. 2006: Springer Science.
22. Massam, B.H., Multi-Criteria Decision Making (MCDM) Techniques in Planning, ed. E. Diamond and J.B. McLoughlin. Vol. 30. 1988: Pergamon Press.
23. Li, Z. and K.C. Sinha, Methodology for Multicriteria Decision Making in Highway Asset Management. Transportation Research Record, 2004. No. 1885.
24. Mattingly, S.G., R. Jayakrishnan, and M.G. McNally, Determining the Overall Value of Implemented New Technology in Transportation: Integrated Multiple Objective-Attribute Methodology. Transportation Research Record, 2000. 1739(No. 00-3241).
25. Saaty, T.L., Analytic Hierarchy Process. 1980: Wiley.
26. Bhushan, N. and K. Rai, Strategic Decision Making: Applying the Analytic Hierarchy Process. 2004, London; New York: Springer.
27. Roy, B., Decision-aid and decision-making. European Journal of Operational Research, 1990. 45.
28. Dodgson, J., M. Spackman, A. Pearman, and L. Phillips, Multi-criteria Analysis: A Manual. 2009, Department for Communities and Local Government: London, UK.
29. Saaty, T.L., How to make a decision: the analytic hierarchy process. European Journal of Operational Research, 1990. 48(1): p. 9-26.
30. Saaty, T. and L. Vargas, Models, Methods, Concepts & Applications of the Analytic Hierarchy Process, Second Edition. 2012, New York: Springer Science and Business Media.
31. Stevanovic, A., A. Abdel-Rahim, M. Zlatkovic, and E. Amin, Microscopic Modeling of Traffic Signal Operations: Comparative Evaluation of Hardware-in-the-Loop and Software-in-the-Loop Simulations. Transportation Research Record: Journal of the Transportation Research Board, 2009(2128).
32. Nelson, E. and D. Bullock, Impact of Emergency Vehicle Preemption on Signalized Corridor Operation: An Evaluation. Transportation Research Record: Journal of the Transportation Research Board, 2000. 1727: p. 1-11.
33. Bullock, D., B. Johnson, R.B. Wells, M. Kyte, and Z. Li, Hardware-in-the-loop simulation. Transportation Research Part C: Emerging Technologies, 2004. 12(1): p. 73-89.
34. Yohe, J. and T. Urbanik, Advance Preempt with Gate-Down Confirmation: Solution for Preempt Trap. Transportation Research Record: Journal of the Transportation Research Board, 2007(2035): p. 40-49.
35. Walkenbach, J., Excel 2010 Power Programming with VBA. 2010: Wiley.