This paper examines the operational evaluation of venues and the systems required for the conduct of international-level swimming competitions such as the Olympic Games and the various world championships. Major international sporting events are complex undertakings, often using venues not purpose-designed for the activity and thus requiring significant temporary infrastructure and facilities to be brought in. Management of the event is complicated by the large number of people involved in its planning and execution, ranging from professional staff and contractors through to volunteer officials and the competitors themselves. An additional layer of complexity is added by the extensive media involvement, particularly live television broadcast. Operational evaluation of the venue, its facilities and, importantly, the management structure and procedures is essential because national image and reputation are at stake when running major international events. Through a rigorous review of the swimming test event activity prior to the 2000 Sydney Olympic Games, in the light of modern test and evaluation theory and practice, a framework has been developed to assist in the design of test plans for future OT&E activities in support of major competitive swimming events.
Keywords: operational, test and evaluation, swimming, competition, venue
1. INTRODUCTION
Major international sporting events, such as the Olympic Games and the assorted world championships, are extremely complex undertakings involving a wide range of people, from professional staff and contractors to volunteers. Conduct of these events involves complicated logistics problems. Setting up the venue to meet the requirements of the controlling bodies, such as the International Olympic Committee (IOC), FINA (the international controlling body for the aquatic sports of swimming, diving, water polo, synchronised swimming and open-water swimming), and national authorities like the Sydney Olympic Games Organising Committee (SOCOG), is complex and demanding. These events also require additional external equipment and supplies to serve a large number of people, ranging from competitors and officials to spectators. Significant, and increasing, levels of media involvement, particularly television, add a further layer of complexity, both in terms of the venue set-up and the need to align the competition program with rigid television-broadcast schedules.
In the case of competitive swimming, facilities used for major international competitions are usually designed and constructed to satisfy a wide variety of requirements, ranging from major competitive events through to public recreation and community fitness activities. Because of the multi-purpose nature of the facility, its design will, of necessity, involve a number of trade-offs. Thus, its ability to meet the requirements of any specific application will potentially be compromised. Further, most venues need some form of temporary supplementation to provide sufficient spectator seating, specialist facilities to support the media, facilities for VIPs, and office space for drug-and-doping control and general event management. Compromises in the venue design and layout, and the event-specific venue supplementation, are further sources of potential problems in the conduct of competitive events, particularly in regard to the flow of people, equipment and information during the event.

1 This paper forms part of the research work conducted as a post-graduate student of the University of South Australia.
Because of the complexity introduced by the need to involve a wide range of people and temporary equipment, the potential for unforeseen problems is high. The impact of these problems can be significant and damaging to national reputation; therefore, the need for operational test and evaluation of the venue in its competitive-event configuration, and of the management procedures, is self-evident. In recognition of the potential for operational problems to emerge, it has become common practice for test events to be conducted prior to the use of facilities for major competitive events. These test events are, in effect, operational test and evaluation (OT&E); however, they are not necessarily planned and conducted in accordance with accepted test and evaluation principles and practices.
While test events were conducted by the various bodies responsible for the 1996 Atlanta Olympics, the 2000 Sydney Olympics and the 2004 Athens Olympics, no formal record of the test plans, or the test reports, appears to have been maintained for ready access in planning future test events. It appears that most of the information is destroyed or discarded after the competition or, at best, is held in the personal records of those involved in the process.
A comprehensive literature search failed to yield any documentation such as test plans or test reports in the official records for the past three Olympic Games; however, there were a significant number of media and other reports that mentioned the conduct and scheduling of test events without providing any details. Discussion with people responsible for the swimming test events prior to the Sydney 2000 Olympic Games, and a review of the test plan developed, revealed that while the plans for the conduct of the test events were very comprehensive, they were largely procedural manuals devoted to defining how the test meet would be run rather than specifically focussing on test activities per se. Therefore it was recognised that the test plans were not developed in accordance with accepted test and evaluation (T&E) practices; rather, they were derived largely from the swimming competition management experience of the individuals involved.
Given the high potential for major operational problems to emerge, particularly when using a new or poorly-designed venue for an international swimming competition, the development of a well-designed, operational test plan and the conduct of an operational evaluation is central to the ultimate success of the major event. The purpose of this paper is to critique, not criticise, the practices employed in the conduct of test events prior to the 2000 Sydney Olympic Games, with a view to drawing on the experiences to develop a framework, based on currently accepted T&E practice, for the creation of test plans for the operational evaluation of swimming facilities. Such a framework will provide a common baseline, and a level of traceability and rigour, not currently evident in the conduct of such swimming test events.
2. OPERATIONAL TEST AND EVALUATION FRAMEWORK
Although OT&E practice and principles are largely derived from the defence sector (Reynolds 1996; Stevens 1986) they may be applied with equal validity and success to other areas. This paper examines how OT&E may be, and has been, applied in the evaluation of aquatic facilities for the conduct of major, international, competitive swimming events, such as the Olympic Games.
2.1 Operational T&E
Reynolds (1996) defined test and evaluation (T&E) simply as the measurement of the performance of a system, and the assessment of the results for one or more purposes, admitting that, in many respects, T&E is more an art than a science. The Australian Department of Defence (Department of Defence 2005) defined T&E more specifically, as a scientific, systematic process to obtain information in order to support the evaluation of the quality of a system (or product) with known confidence. Thus, T&E parallels the scientific method (viz. define the hypothesis, conduct research to disprove the hypothesis, analyse and report results) and provides a structured, objective and planned approach to address three key issues (Harris 2004):
- What is the system required to do? That is, what is the desired operational outcome or output of the system? Another slant may be to determine the user's needs.
- How will we know when the system fulfils these needs? Measures, particularly measures of effectiveness, will be needed in order to know when this occurs.
- Who carries responsibility for the T&E? Master planning 2 of the T&E program is needed to clearly identify the expected effort, scheduling, resources, and funding aspects of the T&E activities.
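As a minimal sketch, the three questions above can be captured in a simple data structure that a test planner might use to keep the operational need, its measures, and the responsible body traceable to one another. The Python below is purely illustrative; every class name and value in it is hypothetical and is not drawn from any of the sources cited.

```python
from dataclasses import dataclass, field

@dataclass
class MeasureOfEffectiveness:
    # A yardstick for judging whether the user's need has been met.
    name: str
    threshold: float            # value the system must achieve
    higher_is_better: bool = False

@dataclass
class TestPlanOutline:
    # Skeleton answering the three key T&E questions:
    # what must the system do, how will we know, and who carries the T&E.
    operational_need: str                      # desired operational outcome
    moes: list = field(default_factory=list)   # measures of effectiveness
    responsible_body: str = ""                 # owner of the T&E program

# Hypothetical example values, for illustration only.
plan = TestPlanOutline(
    operational_need="Host a high-standard swimming tournament",
    moes=[MeasureOfEffectiveness("Mean queuing time (min)", 10.0)],
    responsible_body="SOCOG test event program team",
)
print(plan.moes[0].name)  # → Mean queuing time (min)
```

Keeping the three answers in one record makes the master-planning linkage explicit: each measure exists because of a stated need, and each need has a named owner.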
The Defence Materiel Organisation (2004) defined operational test and evaluation (OT&E) as T&E conducted under realistic operational conditions. OT&E is conducted with representative users of the system, in the expected operational context, for the purpose of determining a system's operational effectiveness and suitability to carry out the role and fulfil the requirement that it was intended to satisfy.
This definition aligns with perspectives offered by other sources that describe OT&E variously as:
testing conducted by representative users in real world, operational scenarios that duplicate actual use of the system or product, including realistic field conditions, and a full range of climatic conditions ... (Technical Support Working Group 2003) or,
testing to demonstrate that a new system is operationally effective and operationally suitable for use (Federal Aviation Administration 1999), or
OT&E attempts to determine the performance of a system under the most current operational conditions. (Stevens 1986)
All these definitions fit well with the overall concept of test events employed for major swimming competitions, particularly since OT&E includes the evaluation of socio-technical systems: whether they suit the people and their in-service needs and, importantly, produce the desired operational effect. This is a crucial aspect for competitive swimming events, because they involve large numbers of people who interact in a complex social web.
Major competitive sporting events are relatively short-lived compared to the operational life of the facility, with transitory use of additional temporary infrastructure that naturally changes the operational environment. Therefore, the OT&E of aquatic facilities specifically for high-level competition does not need to address operational suitability aspects related to long-term issues such as maintainability, availability or through-life support, but should focus on the system's operational effectiveness and the short-term operational suitability aspects such as human factors, safety and reliability. The primary focus on operational effectiveness will be to determine whether the system, as configured, can be used for its intended purpose.
2 Master planning is the process to determine the overall structure, major elements, and objectives of a T&E program. Planning, however, is the lower-level process to determine the detailed preparations and actions needed for an individual T&E activity.

In developing the methodology for the conduct of test events, the classic OT&E approach presented by Harris (2004) provides a useful framework. This model, depicted in figure 1, will be used as a basis for analysing the plan for the test event for swimming that was conducted prior to the 2000 Sydney Olympics.
Figure 1: Operational Test & Evaluation Approach (after Harris, 2004)
Fundamental to successful OT&E is the identification of the critical operational issues (COIs) and the development of the associated measures of effectiveness (MOEs), where the COIs are potential show-stoppers. To identify the COIs and determine appropriate MOEs for a major competitive swimming event, the test event plan for the Sydney Olympics was critically analysed. Because the venue layout and infrastructure are key factors in the ability to deliver an effective competitive event, it is necessary to review the fundamental specification of swimming venues, as it was based upon the user's needs definition.
3. VENUE SPECIFICATION
3.1 As A User's Needs Statement
Most venues suitable for the conduct of major international swimming competitions are built either directly by government bodies, or increasingly by private bodies under some form of private-public partnership (PPP) arrangement, to satisfy a number of perceived community needs ranging from high-level competitive swimming events to fitness and recreational activities. Thus, compromises in the design of a venue must be made if it is to satisfy all potential uses.
Output-requirements currently used by governments do not generally provide detailed specifications of the facility, but focus on defining the nature of the services to be provided, allowing the PPP developer (or designer) maximum flexibility in the facility's design and construction. Thus, a series of broad statements, such as "to meet the swimming and aquatic needs and experiences of the regional community as they relate to four categories of activities: recreation/leisure; sports; fitness; and events" (Office for Recreation and Sport 2003), will be found rather than detailed requirements defining the physical layout or other technical aspects of the facility.
This approach offers considerable scope for the basic requirements to be satisfied while leading to a facility with significant functional and operational shortfalls for specific applications. For example, an early draft of the output specification for a new, state aquatic facility for South Australia (Office for Recreation and Sport 2003) defined functional requirements simply in terms of being compliant with the FINA Facilities Rules (FINA 2005). These rules provide clear guidance in regard to the dimensional aspects of a pool and its ancillary equipment, but are silent in regard to the functional layout of the facility. Table 1 illustrates the minimalist nature of the draft specifications for the proposed new South Australian state aquatic facility.
Table 1: Pool Specification (Office for Recreation and Sport 2003)
Permanent Facilities:
50-m Indoor Pool A 50-m indoor pool complying with the Olympic and world championship FINA requirements.
Second 50-m Warm-Up/Cool-Down Pool A 50-m indoor pool complying with the requirements for event warm up and cool-downs.
Diving Pool A diving pool, and associated dive tower and equipment, to Olympic and world championship standards.
Polo Pool A polo pool and associated equipment to Olympic and world championship standard.
Issues critical to the operational effectiveness of a swimming competition, such as the physical relationship between the various functional areas, and specialist facilities that affect the flow of people, materials and information around the venue during a competition, are largely ignored in the specification, with the overall layout left to the architect to determine, often in the absence of specialist user input. Further, the layout for competitive purposes is often compromised by the available funding, the perceived needs of other uses of the facility such as recreational use, or architectural aesthetics.
Therefore, given the multi-purpose nature of the facility, and the lack of specific layout and other key functional requirements in the specification, the need for OT&E of the facility for particular applications (such as international competitions) is essential if the event is not to be compromised by shortfalls in the venue layout or facilities.
4. FINA'S FACILITIES RULES
FINA, as the international controlling body of swimming, diving, water polo, synchronised swimming and open-water swimming, sets the rules for the conduct of competitions and plays a lead role in the management of international events for these aquatic sports, especially the Olympic Games and related world championships. All the rules pertaining to the conduct of these aquatic sports, at national and international level, are contained in the FINA Handbook 2002-2005. In addition, FINA also provides rules that define the requirements for the field of play, which are known as the facilities rules (FR). A review of the facilities rules reveals that they largely provide quantitative requirements for the defining parameters of the facilities. For example, FR2 and FR3 are quantitative in nature and provide all the physical dimensions, and their tolerances, of swimming pools and ancillary equipment used in international competition. FR4 provides the detailed functional requirements of the automatic timing equipment, the electronic scoreboard and starting equipment. FR13 provides functional requirements for the audio equipment, but only in qualitative terms; no quantitative values for sound levels or distortion are given. Finally, FR14 provides detailed quantitative requirements in respect to pool sanitation and general temperatures in the pool area.
Most of the requirements defined in the facilities rules are quantitative values of fixed physical parameters that can be readily measured and remain fixed once the pool is constructed. These are measured during the acceptance test and evaluation for handover from the developer to the owner of the facility, and are not considered as part of the OT&E.
For planning OT&E, it is important to note what the FINA facilities rules do not address. It is these factors, such as the physical layout and operational procedures, which are largely subject to local variation and thus impact upon the operation of the facility during the conduct of competitions. The layout of the aquatic facility, and the physical relationship between the various functional areas, will have a significant impact on the operational effectiveness of the event. The critical operational issues (COIs) will be primarily related to the effects caused by these operating procedures, flow of people, equipment and information and the interfaces between functional areas, all of which are dependent on the particular aquatic venue layout.
5. TEST EVENTS
As part of the Olympic bid commitments, SOCOG was required to conduct a test event, i.e. a meet or competition during which the venue's procedures would be assessed, in every discipline of each Olympic sport (SOCOG 2001). By late 1997, a dedicated test event program team had been established and an extensive test event schedule finalised. Test events allowed SOCOG to meet three major objectives:
- Test the field of play, and all elements involved with the competition and various aspects of the venue.
- Test all technology systems, including specific scoring, timing and results systems and communications.
- Train staff, contractors and volunteers in an event environment, and develop teams for Olympic Games-time.
Test event tasks, content and schedules varied from sport to sport, but overall, fitted into two broad categories:
- Existing events: events already on the sporting calendar, such as the Australian championships.
- Created events: events specifically created by SOCOG and the national controlling bodies to test Olympic operations.
A detailed literature search failed to yield any information on the test plans for the test events in any sport, so recourse was made to Swimming Australia Ltd. (SAL), the national controlling body for swimming. Enquiries revealed that nothing relating to the test events, either test plans or the test report, had been retained in the formal published records. Fortunately, the Operations Manager for the Sydney International Aquatic Centre at the time of the test event was able to provide a personal copy of the test plans. These test plans were, in fact, only the procedures for the conduct of nominated test events. The prime test event was the 2000 Telstra Australian Open Championships Selection Trials, an existing event, run by SAL with SOCOG involved as a key stakeholder. To ensure that the test event was representative of the Olympic event, the competition program and operating procedures employed were identical to the Olympic program, and included significant media and television participation. This approach enabled assessment of the interaction between the media, particularly the television crews and their equipment (including interfaces to the electronic timing and scoreboard systems), and the conduct of the competition.
The test-event plan is reviewed in the following section, in the light of accepted practices in the preparation of OT&E plans to identify COIs and MOEs.
5.1 Sydney Olympic Park's Test Event Plan
The Sydney International Aquatic Centre's test-event plan was a very detailed and well-structured document, comprising in excess of 100 pages, with some 90 or more pages of appendices. Before examining the detail of the test event plan, it is crucial to identify the overall user's objective, as it is the fundamental yardstick used in assessing the operational effectiveness and operational suitability of the systems and facilities. From the test event plan (SOCOG 2000), the overall mission was "to host a high standard [sic] swimming tournament at the Sydney International Aquatic Centre site for the swimming competition of the 2000 Olympic Games".
The test event plan stated that the objective of the test event was for SOCOG to test certain key elements critical to the success of the swimming discipline during the Olympic Games. Six key elements were identified:
- The venue (Sydney International Aquatic Centre, SIAC): management.
- Field of play (FOP): competition pool to meet FINA requirements.
- Technology: including data networks, electronic timing and scoring, meet management systems, results compilation and doping control.
- Venue expansion: which focussed on crowd control and flows in the temporary, expanded spectator-seating facilities.
- Officials.
- Paid staff and volunteers.
SOCOG's safety policy stated that, in the design and operation of venues, the priority for allocation of resources was:
1. Life safety
2. Sport competition
3. Television broadcast
Television broadcast. From this priority hierarchy, television broadcast was clearly considered important, due to the resources needed and the effect on the mass audience not present at the venue. However, the real issue was not the technology aspect, which was relatively straightforward and remains something that the TV network staff do all the time; the real concern was the interaction of the TV crew, and their equipment, with the officials and other pool-deck activities and, importantly, synchronising the running of the meet with the TV broadcast schedule. At a normal swim meet, the referee runs the meet and manages a schedule that is not linked to, or dictated by, international TV broadcasts and their tight timing. Thus, television broadcast was a management and coordination issue, not just a technology issue, that needed to be tested, and should therefore be explicitly identified as the seventh key element.
Assessment approach. The test event plan was primarily a detailed set of operating procedures that addressed every aspect of test event management which, while very comprehensive and necessary for such a complex undertaking, was in many respects not directly germane to an OT&E activity. Each functional area was provided with a standard reporting sheet titled "event objectives and deliverables". Each area was required to report its outcomes on these sheets, along with any recommendations for improvement, after each competition session (Abernethy 2004). These sheets formed the basis of test data collection and offer the best insight into the test structure and test objectives. A post-test-event debrief was conducted and a basic test event report was prepared; however, attempts to locate copies were fruitless. Daily briefings were conducted during the test event, ensuring that issues were resolved on the spot and adjustments made in time for the next session. From an analysis of the event objectives and deliverables sheets, the relationship between functional areas and activities was developed and is shown in table 2.
Table 2: Functional Areas and Associated Activities
Functional areas and associated activities: venue & venue management; site management; house management; operations management; competition management; signage; brand protection; accreditation / passes; IOC relations / protocol; Olympic Coordination Authority liaison; catering; cleaning / waste management; key agencies liaison; risk management; medical support; communications operations; press operations; merchandising; doping control; Sydney Olympics Broadcasting Authority (SOBO); spectator services; environment; sponsor liaison; ticketing; language services; venue operator liaison; transport services; logistics; security; medal ceremony; staffing; technology.
Weakness in the assessment approach. Since the test event plan consisted primarily of detailed operational procedures, the actual test activities were often obscured. The lack of any explicitly defined COIs or MOEs makes it difficult to determine exactly what was being tested and how it was to be evaluated. Indeed, the assessment process was to run the event in accordance with the draft procedures, and amend the procedures at the end of each session as agreed in a meeting of functional managers, notably without the use of pre-determined measures to evaluate the outcomes. Thus, the test event was a highly subjective and qualitative assessment, done in an evolutionary manner, with the outcome (the final version of the procedures) unique to the specific venue (the Sydney International Aquatic Centre).
6. OPERATIONAL TEST AND EVALUATION PLAN DEVELOPMENT
Drawing on the six key elements originally identified in the test event plan, with the authors' addition of television broadcast as the seventh element (as discussed above), and in combination with the functional area activities shown in table 2, the following COIs and related MOEs have been developed. These are proposed by the authors as the basis of an operational evaluation of a venue, and the associated management processes, for the conduct of an international swimming competition.
COI-1 Is this venue successfully managed? MOE-1: Number of tardy corrective actions to venue and event management procedures.
COI-2 Is the field of play suitable for a major international swimming competition? MOE-2-1: Number of corrective actions needed to meet FINAs facilities rules. MOE-2-2: Mean rating of the competition pool by elite competitors.
COI-3 Is the technology capable of supporting this competition? MOE-3: Percentage of results correctly promulgated.
COI-4 Is this venue's surge-expansion effective? MOE-4: Mean queuing time to enter the facility.
COI-5 Will the competition be officiated successfully? MOE-5: Percentage of competitors lodging a protest.
COI-6 Are the paid staff and volunteers able to perform effectively? MOE-6: Mean lag to formally release event results.
COI-7 Will the media be able to cover the competition successfully? MOE-7: Ordinal place of the television rating for televised events.
COI-8 Will the safety of lives be maintained? MOE-8-1: Number of deaths at the venue. MOE-8-2: Percentage of attendees suffering a reportable injury.
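Once thresholds are attached to each MOE, the COIs above can be resolved mechanically rather than by the kind of subjective session-by-session judgment used at the Sydney test event. The sketch below shows one way such a record might look; the threshold values and observed figures are hypothetical, invented for illustration, and are not derived from the paper or from any FINA or SOCOG material.

```python
# Each COI maps to a list of (MOE name, observed value, pass predicate).
# All numbers here are hypothetical examples, not real test-event data.
cois = {
    "COI-3 technology": [
        ("MOE-3 % results correctly promulgated", 99.7,
         lambda v: v >= 99.5),   # assumed threshold for illustration
    ],
    "COI-4 venue surge-expansion": [
        ("MOE-4 mean queuing time (min)", 12.0,
         lambda v: v <= 10.0),   # assumed threshold for illustration
    ],
}

def evaluate(cois):
    # A COI is resolved (True) only if every one of its MOEs passes.
    return {coi: all(pred(value) for _, value, pred in moes)
            for coi, moes in cois.items()}

print(evaluate(cois))
# COI-3 passes (99.7 >= 99.5); COI-4 fails (12.0 > 10.0)
```

The point of the sketch is traceability: each pass/fail verdict is tied to a pre-determined measure, so the evaluation outcome does not depend on who happens to be in the functional managers' meeting.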
6.1 Measures Of Performance
Measures of performance (MOPs) are generally measures developed for noting the internal efficiency of a particular solution system, and are thus highly dependent on the characteristics of that system. Which, and how many, MOPs are used is therefore likely to vary from one solution system (swimming competition) to another, so a set list of MOPs cannot be provided. As an example, however, consider the case of the table-2 activity, spectator services. Useful MOPs might include:
- Number of seats.
- Seating set-up time.
- Number of ushers available.
- Training time (ushers).
- Number of female toilets.
- Number of male toilets.
- Number of canteen customer service points.
- People flow rate entering the stand.
- People flow rate exiting the stand to the toilets.
- People flow rate entering female toilets.
- People flow rate entering male toilets.
- Canteen customer servicing rate.
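Several of these MOPs are simple rates and means that could be computed directly from timestamps logged during a test event. The following sketch illustrates two of them; the sample data are invented for demonstration and were not measured at any venue.

```python
# Hypothetical sample data: entry timestamps (seconds) for spectators
# passing one turnstile, and paired (arrival, admission) times for a
# sample of queued spectators. Neither dataset comes from the paper.
turnstile_entries = [2.0, 3.5, 5.0, 6.0, 8.5, 10.0, 11.0, 14.0]
queue_samples = [(0.0, 90.0), (30.0, 150.0), (60.0, 240.0)]

def flow_rate(entry_times):
    # People per second through one entry point over the observed span.
    span = entry_times[-1] - entry_times[0]
    return (len(entry_times) - 1) / span

def mean_queue_time(samples):
    # Mean wait (seconds) between joining the queue and being admitted.
    return sum(adm - arr for arr, adm in samples) / len(samples)

print(flow_rate(turnstile_entries))    # 7 entries over 12 s ≈ 0.583 per s
print(mean_queue_time(queue_samples))  # (90 + 120 + 180) / 3 = 130.0 s
```

Logging raw timestamps rather than summary figures lets the same data serve both the MOPs here and the higher-level MOE-4 (mean queuing time to enter the facility).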
7. CONCLUSION
It has been shown that the conduct of international swimming competitions is a highly complex undertaking that involves a number of separate groups, with significant logistical issues. Involvement of the media adds an additional layer of complexity. Further, the need to conduct the competition in venues that are not specifically, or at least not solely, designed for the activity increases the potential for operational difficulties.
Given the high public profile, and the potential damage to the national image and reputation if the event does not proceed according to plan, some form of operational evaluation of the venue, the facilities, and event management procedures and logistical support arrangements is essential. As in all operational test and evaluation, planning is the key to success. A prime factor in the development of test plans is the identification of COIs and their associated MOEs. While the test event plan used prior to the Sydney 2000 Olympics was a very comprehensive document, it was more of an operating procedure manual than a test plan, in that it did not specifically identify COIs or MOEs and did not define actual test procedures.
In this paper, the test event plans were reviewed in the light of current operational test and evaluation practice, to tease out the key elements for achieving the overall mission statement of an international swimming competition. These elements, within the context of the functional areas identified for the conduct of the competition, were used to formulate a set of COIs and MOEs. The COIs and MOEs presented, and the framework within which they are presented, provide a sound basis for the development of future operational test plans. In this respect, this paper offers a more rigorous approach to venue test event planning than has been evident in the past.
8. ACKNOWLEDGEMENTS
The authors would like to thank Mr Glen Tasker, Chief Executive Officer, Swimming Australia Limited, for his support in identifying sources of documentation related to the Sydney 2000 test events. Thanks are also due to Mr Robert Abernethy, for providing his personal copy of the 2000 Telstra Australian Open Championships Selection Trials, Sydney Olympic Park Test Event Plans 2000.
9. REFERENCES
Defence Materiel Organisation 2004, DMO Verification and Validation Manual, Final Draft edn, ed. M Polya, Department of Defence, Canberra, ACT.
Department of Defence 2005, Defence Test & Evaluation Procedures, ed. Director of Trials, Capability Development Group, Canberra, A.C.T.
Federal Aviation Administration, US 1999, T&E during the acquisition management phases, http://fast.faa.gov/archive/v0501/test_evaluation/pg5.html.
FINA 2005, FINA Facilities Rules, vol. Part IX, Federation Internationale De Natation (FINA) Handbook, Constitution and Rules 2002-2005, Lausanne, Switzerland.
Harris, MB 2004, Course notes for EEET 5046 Operational Test and Evaluation, v. 5, Course Code EEET 5046, University of South Australia, Adelaide, S.A., Australia.
Office for Recreation and Sport 2003, State Swimming Centre Output Specification, Reference F028.01, Revision 3, S.A. Government, Adelaide, S.A.
Reynolds, MT 1996, Test and Evaluation of Complex Systems, John Wiley & Sons Ltd., Chichester, England.
SOCOG 2000, Sydney Olympic Park Test Event Plans, unpublished plan, Sydney, NSW, Australia.
SOCOG 2001, Venues and Sport: Competing at the Games, Volume 1, State Library of NSW.
Stevens, RT 1986, Operational Test and Evaluation: A Systems Engineering Process, Robert E. Krieger Publishing Company Inc., Malabar, Florida.
Technical Support Working Group 2003, Test and evaluation planning guide for combating terrorism and public safety systems and products, US Government.

AUTHORS
Steve Pendry, Information Sciences Laboratory, Defence Science and Technology Organisation, Edinburgh, S.A., 5111. Phone: +61 8 8259 5168 e-mail: steve.pendry@dsto.defence.gov.au
Steve completed an engineering cadetship in private industry before moving to the SA Institute of Technology. After 18 years in academia, he joined the South Australian Centre for Manufacturing, before moving to the Department of Defence to manage the electro-optics area in the defence industry development program. He later took up a position in the South Australian government as senior investment manager for defence and advanced engineering. Steve moved to DSTO in 1998, and is currently the Executive Officer to the Director, Information Sciences Laboratory. His other interests include refereeing competitive swimming meets and pistol shooting.
Michael Harris, Systems Engineering & Evaluation Centre, University of South Australia, Mawson Lakes Campus, Mawson Lakes, S.A., 5095. Phone: +61 8 8302 5274 e-mail: michael.harris@unisa.edu.au
Michael is a senior research fellow with the Systems Engineering and Evaluation Centre at the University of South Australia. His qualifications include a BSc (Hons) in physics and an MSc in military vehicle technology. Prior to joining the University in 1998, Michael was a project manager with CEA Technologies Pty Ltd (an Australian electronics research and development firm), following 14 years' service in the Australian Army. Michael's other interests include umpiring Australian football, distance running and volunteering at the Animal Welfare League.