
Process Safety and Environmental Protection 88 (2010) 327–334


The integration of HAZOP expert system and piping and instrumentation diagrams
Lin Cui a, Jinsong Zhao a,∗, Ruiqi Zhang b

a Department of Chemical Engineering, Tsinghua University, Beijing 100084, China
b Sinopec Engineering Institute, Chaoyang District, Beijing 100101, China

∗ Corresponding author. Tel.: +86 10 62783109. E-mail addresses: mr.cuilin@mail.tsinghua.edu.cn (L. Cui), jinsongzhao@mail.tsinghua.edu.cn (J. Zhao), zhangrq@sei.com.cn (R. Zhang).
Received 1 January 2010; Received in revised form 19 April 2010; Accepted 27 April 2010. doi:10.1016/j.psep.2010.04.002

Abstract

The main purpose of hazard and operability (HAZOP) analysis is to identify the potential hazards in a process design, which nowadays is generally developed with a computer aided design (CAD) package. Because of the time- and effort-consuming nature of HAZOP, it is not done in every engineering firm for every design project. To make HAZOP an integral part of process design, an integration framework is proposed in this paper to seamlessly integrate the commercial process design package Smart Plant P&ID (SPPID, Intergraph) with a HAZOP expert system (named LDGHAZOP) developed by the authors. This integration makes it possible to perform HAZOP analysis easily at any time during the whole lifecycle of a chemical plant, as long as the process design is available, which might help improve design quality. One industrial case study is used to illustrate the ability of the integrated system.

© 2010 The Institution of Chemical Engineers. Published by Elsevier B.V. All rights reserved.

Keywords: HAZOP; P&ID; Process design; Expert system

1. Introduction

Hazard and operability (HAZOP) study is a widely used process hazard analysis (PHA) approach (Swann and Preston, 1995). It is usually applied when the detailed process piping and instrumentation diagrams (P&IDs) are ready but not yet frozen, so that the potential process hazards can be identified, evaluated and mitigated in the design stage (Venkatasubramanian et al., 2000). However, a HAZOP study is still an expensive exercise because of the large amount of time and effort it demands from the human HAZOP team (Khan and Abbasi, 1997a). Therefore, many engineering firms prefer to limit the number of HAZOP studies, which are usually performed at the end of the design project. This approach, however, carries the risk of identifying a safety problem late in the design stage, which might result in costly changes, for example, re-ordering of equipment. The sooner the HAZOP is done, the less time and resources will be spent on the changes resulting from HAZOP findings. On the other hand, some chemical plant managers do not yet accept HAZOP because HAZOP studies require key personnel to be away from their daily jobs to attend the HAZOP meetings, which might last for weeks or months. Sometimes this is almost impossible for chemical plants running lean.

Many HAZOP automatic reasoning approaches (McCoy et al., 1999; Kang et al., 2003; Bartolozzi et al., 2000) and expert systems (Zhao et al., 2005; Viswanathan et al., 1999; Khan and Abbasi, 1997b) have been developed in the past two decades to improve the HAZOP team's efficiency. Although the automatic analysis performed by the expert systems is quick, the process of entering data into the expert systems is long, because HAZOP analysis is also highly data intensive. The process topological information, process chemistry, equipment design parameters and chemical material properties are all needed to run the expert systems. Computer aided design (CAD) was introduced into chemical process design long ago all around the world, so most of the process specific information is already available in the CAD system used by the process designers. Unfortunately, direct utilization of the CAD database is rarely reported in the chemical process field.



Fig. 1 – Chemical process life cycle.
Fig. 2 – The framework of LDGHAZOP.

A great deal of extra effort on manual transformation and reorganization of the needed data is still required in order to use the automatic HAZOP systems. This dilemma makes it rather difficult to perform HAZOP studies agilely during the process design stage, where frequent changes might be made to the design. In addition, manual input of the process specific data into a HAZOP expert system is prone to human error. Therefore, there is considerable motivation to generate HAZOP reports automatically from the process design data. Most of the commercial process design software packages that are widely used nowadays provide external software interfaces in various forms, which make it feasible for other programs to communicate with them, extract the process design information, or manipulate the data within them. An integration framework centering on design rationale and integrated tools was proposed by Zhao et al. (2004) for safer batch chemical processes. A HAZOP expert system empowered with such an information extracting capability can perform automatic analysis repeatedly and frequently over the whole lifecycle of a chemical plant, shown in Fig. 1, especially in the design stage, the operation stage and the modification stage, which might provide a more reliable lifelong guarantee of process safety. Although combining HAZOP expert systems with process CAD packages is necessary, relevant research or development attempts are rarely reported. An et al. (2009) proposed methods for identifying plant item boundaries and instruments from P&IDs created with Intergraph's Smart Plant P&ID (SPPID) process design package. As a result, two computer tools were developed based on these methods: one helps with the task of identifying hazards related to maintenance work and the other carries out cause and effect analysis automatically. To the best of the authors' knowledge, HAZID Technologies Ltd. has partnered with Intergraph Corporation, a worldwide leader in process plant design software, to integrate HAZID's HAZOP expert system with SPPID (Joop, 2007). However, their detailed integration methodology has not yet been made readily available. The purpose of this paper is to present an approach to integrating a process design package with the layered directed graph (LDG) model based HAZOP expert system (LDGHAZOP) developed by the authors for continuous petroleum, petrochemical and chemical processes.

2. Brief introduction of SPPID and LDGHAZOP


2.1. SPPID

As a key part of the Intergraph SmartPlant Enterprise, the chemical process design package SmartPlant P&ID (SPPID) is an asset-centric, rule-driven CAD system that helps to create and improve plant configurations efficiently. Running in a client-server environment, the software focuses on the plant data instead of on drafting, with all P&ID information stored in the data model, a central database, which makes it convenient for other applications, such as safety analysis, hydraulic analysis and process optimization, to access its internal data. HAZID Technologies Ltd. has partnered with Intergraph Corporation for the integration of HAZID's HAZOP expert system and SPPID. However, their detailed integration methodology has not yet been made readily available. In addition, the prototype of HAZID's HAZOP expert system was initiated by researchers from Loughborough University's chemical engineering and computer science departments. In that system, the plant is described as a network of interconnected units, and each unit is described in terms of a model based on signed directed graphs (SDGs) (McCoy et al., 1999).

2.2. LDGHAZOP

To overcome the shortcomings of the signed directed graph (SDG) model in knowledge representation, the LDG model was proposed by the authors (Cui et al., 2008). Basically, the LDG model qualitatively captures cause-effect relationships between the process deviations generated by using all types of HAZOP guidewords. Recently a HAZOP expert system named LDGHAZOP has been developed based on LDG models. LDGHAZOP is a web based multi-client expert system for HAZOP study, developed mainly in the Java programming language. It consists of a document (DOC) module, a layered directed graph (LDG) module, a database abstraction (DBA) module and some accessory modules, as shown in Fig. 2. The DOC module is an intelligent word processing module, which can assist the experts in editing HAZOP results, producing printable HAZOP reports and so on. All the text generated during the HAZOP process is saved and reorganized by the DOC module.


Based on these data and utilizing a search algorithm, the DOC module also provides hints to the human experts during HAZOP analysis. Aided by the hints about different HAZOP components, such as causes, consequences and suggestions, the efficiency and completeness of the HAZOP analysis can be improved significantly.

The LDG module consists of several sub-modules. The LDG model library contains a series of LDG models of various types of equipment, organized in a tree-like structure. The process description (PD) sub-module, which describes the chemical processes to be studied, is used to generate process specific data packages, also called reasoning configuration (RC) packages. Based on the data defined by the PD module, a model matching algorithm is used to match each piece of process equipment to a proper LDG model in the LDG model library. The models are then linked based on the equipment interconnection information specified in the RC packages. After the LDG models are linked, the reasoning machine sub-module can be used to perform automatic HAZOP analysis. However, the analysis results generated by LDGHAZOP are still far from complete before validation and modification by the human experts. These reasoning results are stored in a separate reference base (REF base) and utilized by the DOC module. There are two uses of the REF base. Firstly, the DOC module can generate the HAZOP report by directly importing the REF base whenever the human experts require it. Secondly, the data in the REF base are used to provide intelligent online hints during the human experts' discussion.

An external database system is used to store all the data, including the HAZOP analysis results, the LDG library, the REF base, etc. The database operations are performed via the database abstraction (DBA) module based on the SQL language and Java Database Connectivity (JDBC). The DBA module encapsulates the detailed operations on the database system, supplying high-level interfaces for data manipulation. Besides the core modules described above, some accessory modules, such as the user interface (UI) module and the user management module, are shown in the top block of Fig. 2. Although those modules are necessary for the system, they are not detailed here because their functions have nothing to do with the topic of this paper.
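As an illustration of the kind of encapsulation the DBA module provides, the following minimal Java sketch wraps a parameterized JDBC query behind a single high-level call. The class name, connection URL, table and column names are invented for this example and are not LDGHAZOP's actual schema; a real database and driver would be needed to run it.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

/** Illustrative database-abstraction wrapper in the spirit of the DBA module. */
public class DbaSketch {
    private final String url;       // e.g. "jdbc:oracle:thin:@host:1521:sid" (placeholder)
    private final String user;
    private final String password;

    public DbaSketch(String url, String user, String password) {
        this.url = url; this.user = user; this.password = password;
    }

    /** High-level call: fetch stored HAZOP causes for one deviation, hiding the SQL/JDBC details. */
    public List<String> causesFor(String equipmentTag, String deviation) throws SQLException {
        String sql = "SELECT cause_text FROM hazop_result WHERE equipment_tag = ? AND deviation = ?";
        List<String> causes = new ArrayList<>();
        try (Connection con = DriverManager.getConnection(url, user, password);
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, equipmentTag);
            ps.setString(2, deviation);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    causes.add(rs.getString("cause_text"));
                }
            }
        }
        return causes;
    }
}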

3. The framework and implementations of the integration

3.1. The framework

To integrate LDGHAZOP and SPPID, a heterogeneous system integration framework is proposed, as shown in Fig. 3. There are five modules for data transfer and data processing between LDGHAZOP and SPPID. The SPPID Interface (SPI) module accesses the process specific information in SPPID's database. The Data Acquisition (DAQ) module then classifies and reorganizes the data into five categories: equipment, valve, chemical, instrument and interlink. The normalization (NORM) module performs data regularization operations, which translate the items by using ontologies (Zhao et al., 2008) and unify the units. The Interface module (IM) acts as a bridge that allows other applications to access the extracted data via an intranet or the Internet. These four modules should all be deployed on the host computer on which SPPID resides. Finally, the LDGHAZOP host obtains the SPPID data through the RC package generator (RCPG) module.

3.2. Data acquisition module

The main function of the data acquisition module is to extract the corresponding object instances inside the P&IDs in order to obtain the process specific information required by LDGHAZOP. In SPPID, all the drawings are stored in a generic database system provided by other database manufacturers. Currently two databases are supported by SPPID: Oracle 9i and Microsoft SQL Server. As the database adopted by SPPID is a generic database management system (DBMS), accessing the database directly, bypassing SPPID, might be the fastest data acquisition method. Unfortunately, this method is not recommended for at least two reasons. The first is that the detailed internal data structure is not readily available, which makes this method hardly practical. The other is that this method is not officially supported by Intergraph, which might lead to compatibility problems.
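As a rough sketch of how the five extracted data categories might be held in memory before normalization, the following Java container is illustrative only; the class and field names are assumptions made for this example and are not the DAQ module's real data structures.

import java.util.ArrayList;
import java.util.List;

/** Illustrative container for the five data categories produced by the DAQ module. */
public class ExtractedPlantData {
    /** One extracted P&ID item; the fields are placeholders, not SPPID property names. */
    public static class Item {
        public final String tag;    // e.g. "T1", "LV-101"
        public final String type;   // e.g. "Vessel", "ControlValve"
        public Item(String tag, String type) { this.tag = tag; this.type = type; }
    }

    /** A connection between two items, possibly via intermediate piping components. */
    public static class Interlink {
        public final String fromTag;
        public final String toTag;
        public Interlink(String fromTag, String toTag) { this.fromTag = fromTag; this.toTag = toTag; }
    }

    public final List<Item> equipment   = new ArrayList<>();  // indispensable: drives model matching
    public final List<Item> valves      = new ArrayList<>();  // optional: refines reasoning
    public final List<Item> instruments = new ArrayList<>();  // optional: sensors, alarms, safeguards
    public final List<String> chemicals = new ArrayList<>();  // optional: material property lookup
    public final List<Interlink> interlinks = new ArrayList<>(); // indispensable: links LDG models
}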

3.2.1. The SPPID's Llama interface


SPPID provides developers with a set of Application Programming Interfaces (APIs) named Llama that are officially supported by Intergraph Corp. for data communication between SPPID and other systems. The Llama APIs are presented under the Microsoft COM middleware framework. Their functionalities are defined implicitly through the methods associated with them. They provide standard mechanisms for interactions between SPPID and other systems or programs, regardless of the programming language, the type of machine in a computer network or the operating system used. Under these mechanisms, the inner data of SPPID is encapsulated into a set of classes according to object-oriented programming principles. The Llama classes have a hierarchy that organizes them in a class tree representing the objects used in the P&IDs. In the Llama object model, the top-level abstract class is named Model Item (MI). Class MI has a set of subclasses, organized as a class tree, corresponding to each kind of drawing object. One of the subclasses of class MI, named Plant Item (PI), represents all the concrete objects such as equipment, instruments, piping components and so on. PI has a set of subclasses representing these objects respectively. For example, a water tank is an instance of the class Vessel, which is a subclass of Equipment, which in turn is a subclass of PI and, ultimately, of MI.
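The Llama object model itself is exposed through Microsoft COM, so the plain-Java mirror below is purely illustrative: it reproduces the MI/PI/Equipment/Vessel hierarchy described above, but none of the actual Llama interfaces or their members.

/**
 * Plain-Java mirror of the class tree described for the Llama object model.
 * These classes are illustrative only; they are not the actual COM interfaces.
 */
class ModelItem { }                      // top-level abstract item in a P&ID ("MI")
class PlantItem extends ModelItem { }    // concrete objects: equipment, instruments, piping ("PI")
class Equipment extends PlantItem { }    // process equipment with generic properties
class Vessel extends Equipment { }       // e.g. a water tank
class Instrument extends PlantItem { }   // sensors, controllers, control valves

public class HierarchyDemo {
    public static void main(String[] args) {
        ModelItem tank = new Vessel();
        // A water tank is a Vessel, hence an Equipment, a PlantItem and a ModelItem.
        System.out.println(tank instanceof Vessel);     // true
        System.out.println(tank instanceof Equipment);  // true
        System.out.println(tank instanceof PlantItem);  // true
    }
}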

3.2.2. Data acquisition method

The DAQ module classifies the data sets extracted from SPPID's P&IDs into the five categories shown in Fig. 3, of which the equipment information and the interlinks among equipment are indispensable for automatic HAZOP reasoning. The equipment type information is used to match LDG models, while the interlinks are primarily utilized to link the corresponding LDG models. Although the other data categories, including the detailed equipment descriptions, instruments and chemicals, are optional, they help the reasoning machine improve the quality of the HAZOP analysis results.

Equipment information is primarily used in the LDG model matching (LDGMM) process performed by the RCPG module. The LDG library stores a set of organized LDG models, each corresponding to a certain type of equipment or operation.


The LDGMM's objective is to select the most suitable LDG model for each piece of equipment found in the P&IDs. To select a model accurately, the extracted equipment type information is first applied to roughly determine the category of the model, and then the name and the properties of the equipment are used to pick the exact LDG model from the library. The properties of the equipment, including the contained chemicals, the construction materials, design parameters, working conditions and so on, are needed by the reasoning machine and the DOC module, so they should also be extracted and stored.

Fig. 3 – The framework of the integrated system.

The equipment items are the most important for the HAZOP study, as mentioned above. The information about the equipment type, construction materials, operating condition parameters, etc. is all needed. Llama defines a class named Equipment, derived from the class PI, with a set of properties corresponding to the information listed above. Furthermore, the class Equipment has a set of subclasses corresponding to various types of equipment, such as Vessel, Heat Exchanger, etc., and each subclass contains more specific properties. For convenience of programming, every item is first regarded as an Equipment and the generic properties of Equipment are extracted; then, based on the extracted data, the equipment type is determined and the specific properties corresponding to that type are extracted. This step-by-step data extraction method is used throughout the whole extraction process.

Another important item group is the interconnections among equipment, instruments and valves. The interconnection data is first used to link the selected LDG models, for the sake of the cross-model reasoning performed by the LDG reasoning machine. Based on the linked models, the reasoning machine can search the HAZOP deviation propagation routes and deduce the consequences of each deviation in the equipment network. Secondly, the interconnection data helps define the service or action targets of the instruments and valves. Although it is easy for a human to judge visually whether two items are connected by looking at the P&ID, it is sometimes rather difficult for a computer to recognize the interconnections, because equipment items are often connected indirectly, via a chain of instruments or piping components such as control valves and flanges. We use a browse strategy simulating human cognitive behavior to deal with this. The extraction of interconnections is an equipment-centered approach. By browsing all of the P&IDs, all equipment can be listed by the DAQ using the method described above. To find all items and connections attached to each piece of equipment, the DAQ searches for items along all the pipelines or signal lines departing from that equipment. When an item is found along a certain route, the DAQ makes a decision depending on the item's type. If it is equipment, the search along this route is finished and the search along another route is started, until all routes from the equipment have been searched. Otherwise, the search continues until another piece of equipment is found or an end point is reached. Finally, the DAQ creates object instances for the interconnections, instruments and valves found along these routes. In many cases the piping and signal line interconnections constitute a complicated network, and the strategy described above will sometimes encounter infinite loops. However, as a HAZOP study does not need a high-resolution map of the interconnections, only the network's major profile needs to be extracted. Recording all searched routes and skipping them when they are encountered again resolves this problem.
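The browse strategy described above is essentially a graph search over the P&ID item graph that stops each route at the next piece of equipment and skips already-visited items to avoid infinite loops. A minimal Java sketch of such a search is given below; the graph representation and the tag names are assumptions made for the example, not the DAQ module's internal code.

import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

/** Illustrative equipment-centered interconnection search in the spirit of the DAQ browse strategy. */
public class InterlinkSearch {
    /** Adjacency list of the P&ID item graph: item tag -> directly connected item tags. */
    private final Map<String, List<String>> lines = new HashMap<>();
    /** Tags of items that count as equipment (search targets); everything else is pass-through. */
    private final Set<String> equipmentTags = new HashSet<>();

    public void addLine(String a, String b) {
        lines.computeIfAbsent(a, k -> new ArrayList<>()).add(b);
        lines.computeIfAbsent(b, k -> new ArrayList<>()).add(a);
    }

    public void markEquipment(String tag) { equipmentTags.add(tag); }

    /** Find all equipment reachable from 'start' via chains of non-equipment items (valves, flanges...). */
    public Set<String> connectedEquipment(String start) {
        Set<String> found = new HashSet<>();
        Set<String> visited = new HashSet<>();      // recording visited items avoids infinite loops
        Deque<String> frontier = new ArrayDeque<>();
        visited.add(start);
        frontier.push(start);
        while (!frontier.isEmpty()) {
            String item = frontier.pop();
            for (String next : lines.getOrDefault(item, List.of())) {
                if (!visited.add(next)) continue;   // already searched along this route: skip it
                if (equipmentTags.contains(next)) {
                    found.add(next);                // the route ends at another piece of equipment
                } else {
                    frontier.push(next);            // pass-through item: keep following the line
                }
            }
        }
        return found;
    }

    public static void main(String[] args) {
        InterlinkSearch s = new InterlinkSearch();
        s.markEquipment("T1"); s.markEquipment("T2");
        // T1 -- control valve LV-101 -- T2 (tags are hypothetical)
        s.addLine("T1", "LV-101");
        s.addLine("LV-101", "T2");
        System.out.println(s.connectedEquipment("T1"));   // [T2]
    }
}

Because every visited item is recorded once, the search only captures the major profile of the network, which matches the resolution the paper says a HAZOP study needs.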
Although represented by different classes in Llama, in the DAQ module the process accessories, such as sensors, actuators and controllers, and the process safeguards, such as alarms, check valves and pressure relief valves, are all included in the category of instrument, because they are all able to change the probability of an abnormal situation emerging. Although they are simple items, they play a key role in the hazard evaluation performed by the LDG reasoning machine. The relationships between the instruments and their effects are described by rules, similar to the reasoning rules introduced by An et al. (2009) for cause and effect (C&E) table generation. For example, a check valve in a pipeline can reduce the likelihood of the consequences caused by the deviation reverse flow in that pipeline. Another example: if three instrument elements, a level sensor, a control valve and a controller, are found linked by signal lines, then a level control loop certainly exists and the consequence likelihood of level deviations should be decreased. Therefore, identifying and utilizing the instrument information during automatic LDG model reasoning helps improve the quality of the results.
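A hedged sketch of such instrument rules is shown below. The rule conditions follow the two examples just given, but the likelihood reduction factors and the type names are placeholders chosen for the illustration, not values or identifiers used by LDGHAZOP.

import java.util.List;
import java.util.Set;

/** Illustrative safeguard rules of the kind used to adjust consequence likelihood during reasoning. */
public class SafeguardRules {
    /** Returns a likelihood reduction factor for a deviation, given the instruments found on the route. */
    public static double reductionFor(String deviation, Set<String> instrumentTypes) {
        double factor = 1.0;
        // Rule 1: a check valve mitigates the consequences of reverse flow in the same pipeline.
        if (deviation.equals("REVERSE FLOW") && instrumentTypes.contains("CheckValve")) {
            factor *= 0.1;   // numeric values are placeholders, not LDGHAZOP parameters
        }
        // Rule 2: level sensor + controller + control valve linked by signal lines form a level
        // control loop, so level deviations are less likely to propagate.
        if (deviation.startsWith("LEVEL")
                && instrumentTypes.containsAll(List.of("LevelSensor", "Controller", "ControlValve"))) {
            factor *= 0.2;
        }
        return factor;
    }

    public static void main(String[] args) {
        System.out.println(reductionFor("REVERSE FLOW", Set.of("CheckValve")));   // 0.1
    }
}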

3.3. Data normalization module

SPPID has its own taxonomy and standards of data representation, and LDGHAZOP likewise has its own taxonomy and standards. The problems of interoperability between interacting computer systems have been well researched.


A comprehensive classification of the different kinds of interoperability problems can be found in the literature (Sheth, 1998), where the system, syntactic, structural and semantic levels of heterogeneity are described. The semantic level refers to the meaning of terms in the interchange. Although some standards have been proposed for system integration and data unification, such as CAPE-OPEN for process simulation module interoperation (Pons, 2003) and the Chemical Industry Data Exchange (CIDX) eStandards for chemical data exchange in e-business (Lampathaki et al., 2009), there is no widely used standard for chemical process data exchange between heterogeneous computer systems. To resolve the semantic heterogeneity problem between LDGHAZOP and SPPID, a data map based on an ontology library is created to relate the different terms from the two systems and to define the translations between the terminologies. The ontology library used here is a chemical process safety ontology developed by our team (Zhao et al., 2008), which defines a comprehensive set of concepts in chemical engineering, including chemical process, unit operation, equipment, chemical materials and so on. Based on the data map, the NORM module unifies the textual data from the DAQ, and the units of the numerical data from SPPID are also standardized into international (SI) units, which are the units adopted by LDGHAZOP.
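The following small Java sketch illustrates the two normalization steps, term translation via a data map and unit conversion to SI. The mapping entries and the set of supported units are examples only, not the contents of the ontology library or the NORM module's actual tables.

import java.util.HashMap;
import java.util.Map;

/** Illustrative term and unit normalization in the spirit of the NORM module's data map. */
public class NormSketch {
    /** Source-system term -> canonical ontology term (entries are made-up examples). */
    private static final Map<String, String> TERM_MAP = new HashMap<>();
    static {
        TERM_MAP.put("Drum", "Vessel");
        TERM_MAP.put("Tower", "Column");
    }

    public static String canonicalTerm(String sourceTerm) {
        return TERM_MAP.getOrDefault(sourceTerm, sourceTerm);
    }

    /** Convert a pressure value to SI (Pa); the supported source units are examples only. */
    public static double pressureToSi(double value, String unit) {
        switch (unit) {
            case "bar": return value * 1.0e5;
            case "psi": return value * 6894.757;
            case "kPa": return value * 1.0e3;
            case "Pa":  return value;
            default: throw new IllegalArgumentException("unknown unit: " + unit);
        }
    }

    public static void main(String[] args) {
        System.out.println(canonicalTerm("Drum"));      // Vessel
        System.out.println(pressureToSi(2.0, "bar"));   // 200000.0
    }
}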

3.4. Interface

Because the SPPID system normally resides on a standalone server while LDGHAZOP is usually deployed on another host, the integration is a distributed heterogeneous system interlink. There are several data transfer frameworks for coping with this situation, of which the Service Oriented Architecture (SOA) is a widely adopted one (Bell, 2002). Under SOA, data operations are presented in the form of standardized services, and any application can request the services via the computer network to access the data, regardless of the application's platform and programming language. In our integration, Representational State Transfer (REST), one of the SOA implementations (Fielding and Taylor, 2002), is applied. REST exposes the services in the form of Uniform Resource Identifiers (URIs), allowing requests to be made using the Hypertext Transfer Protocol (HTTP), which is widely used on the World Wide Web (WWW). REST also uses XML as its data representation language, which is capable of carrying various kinds of data. Under such a mechanism, LDGHAZOP can access the data through the IM without any extra effort on the details of data transfer.
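As an illustration, a client on the LDGHAZOP host could fetch the extracted data from the IM with a plain HTTP GET, as in the Java sketch below. The resource URI and the XML layout it would return are assumptions made for the example, since the paper does not specify the IM's actual service interface.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

/** Illustrative REST call to the Interface module; the URI scheme is hypothetical. */
public class ImClientSketch {
    public static String fetchEquipmentXml(String host, String drawingId) throws IOException {
        // Hypothetical resource URI exposing the extracted equipment list of one drawing as XML.
        URL url = new URL("http://" + host + "/im/drawings/" + drawingId + "/equipment");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setRequestMethod("GET");
        con.setRequestProperty("Accept", "application/xml");
        StringBuilder body = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(con.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line).append('\n');
            }
        } finally {
            con.disconnect();
        }
        return body.toString();
    }
}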

3.5. The RC package generator

Data transferred through the IM finally reaches the RCPG module, where an RC package is generated, ready for automatic HAZOP reasoning. The RCPG's main job is to perform the LDGMM process, matching LDG models, as mentioned above. During this process, the RCPG compares the name, type and properties of the extracted equipment with the contents of the LDG model library. A textual comparison algorithm that calculates the Levenshtein distance (Levenshtein, 1965) is adopted here to determine the textual similarity. The chemicals and the equipment's function, if provided, are also considered in the matching process. However, the automatic model matching fails or partly fails in some situations, owing to a lack of data or of LDG models, and manual matching of the models may then be necessary.

After data extraction, the whole set of P&IDs is simplified into a data set with an LDG model-centered structure, shown in Fig. 4.

Fig. 4 – Extracted data structure.

Every LDG model has a set of properties describing the corresponding equipment. No matter how complex the interconnection between two pieces of equipment is, there is only a single connector linking the matched LDG models. The other accessory items, such as instruments and piping components, are treated as attachments of the equipment. They are classified into two categories: the first includes attachments with an explicit host, and the second covers attachments without an explicit host. The attachments with an explicit host are the items linked directly to the equipment, such as level sensors. The attachments without an explicit host are linked to the connector between the equipment, such as a flow meter attached to a pipeline through which two pieces of equipment are connected. For items connected to several pieces of equipment simultaneously, instead of being attached to connectors, the host is also indistinguishable, so they are assigned to the second category. In the RC package, all of the attachments are recorded for the risk evaluation, and they are all treated as preliminary safeguards of the equipment. Although much information about the process is eliminated by the above extraction and simplification method, the essential data is still kept to fulfill the requirements of a HAZOP study.
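A minimal Java sketch of the name-based part of this matching is given below: the standard dynamic-programming Levenshtein distance scores textual similarity, and the closest library model name wins. The library entries are invented for the example, and the real LDGMM process additionally weighs the equipment type, properties, chemicals and function.

/** Standard dynamic-programming Levenshtein distance, used here to score textual similarity. */
public class ModelMatchSketch {
    public static int levenshtein(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;
        for (int i = 1; i <= a.length(); i++) {
            for (int j = 1; j <= b.length(); j++) {
                int cost = a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1;
                d[i][j] = Math.min(Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1), d[i - 1][j - 1] + cost);
            }
        }
        return d[a.length()][b.length()];
    }

    /** Pick the library model whose name is textually closest to the extracted equipment name. */
    public static String bestMatch(String equipmentName, String[] libraryModelNames) {
        String best = null;
        int bestDist = Integer.MAX_VALUE;
        for (String model : libraryModelNames) {
            int dist = levenshtein(equipmentName.toLowerCase(), model.toLowerCase());
            if (dist < bestDist) { bestDist = dist; best = model; }
        }
        return best;
    }

    public static void main(String[] args) {
        String[] library = {"Storage Tank", "Heat Exchanger", "Centrifugal Pump"};
        System.out.println(bestMatch("Storage tank T1", library));   // Storage Tank
    }
}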

4. Case study

The P&ID shown in Fig. 5 is a process segment of a catalyst preparation process. It is a batch operation. Firstly, a slightly overdosed amount of powdered chemical B is manually dumped into tank T1 via a manhole on the top of T1. After the manhole is sealed, solvent C is pumped into T1 to dissolve B. Then a small amount of desalinated water and chemical A are pumped into T1 to react with chemical B to make catalyst D. Once the reaction is completed, the mixture is transferred to tank T2 for precipitation separation. The waste residue in T2 is transferred into tank T3, where it is hydrolyzed by high-pressure water and thereafter drained. The remaining solution from T2, containing catalyst D, is transferred to downstream equipment. During maintenance, tanks T1, T2 and T3 should be thoroughly cleaned. As the original digital P&ID is confidential, the P&ID shown in Fig. 5 was drafted by the authors. There are three main pieces of equipment, two motors, two agitators and several instruments, including indicators, valves and so on, in this process segment.

To retrieve the data within, LDGHAZOP provides users with the interface shown in Fig. 6. In Fig. 6, the left region is the SPPID drawing list, which is read from SPPID. Once a P&ID is selected, its equipment list is retrieved and shown in the right region, as illustrated in the figure. It can be seen that all equipment items, including the five unnamed ones whose names are shown as null, are listed.


Fig. 5 – P&ID of the catalyst preparation.

Fig. 6 – The extracted equipment list from the P&ID.

Then the interlinks among the equipment can be read and displayed using the user interface shown in Fig. 7. A cell marked with "L" means that the corresponding two equipment items on the vertical and horizontal coordinates are connected, directly or indirectly, via piping, instruments, signal lines and so on. During the interlink acquisition process the equipment attachments, including instruments, safeguards and so on, are also extracted by background processes. The whole data extraction process, including scanning the P&ID and extracting the meaningful elements, takes 6 min 25 s to complete. During the scanning process 331 graphic elements are scanned, most of them pipeline segments and signal line segments; among them, three major equipment items, five accessory equipment parts and 24 instruments are found and imported. The system test was executed on a common personal computer with a dual-core 2.4 GHz CPU and 2 GB of RAM, on which the SPPID system and the LDGHAZOP system were installed together. The average data extraction speed is about two important equipment items or instruments per minute. Compared with the manual data preparation done by the authors, about 50% of the data input time can be saved.

After the data retrieved by the processes described above is fed into LDGHAZOP via the NORM and IM modules shown in Fig. 3, the corresponding RC package is produced by the RCPG module. As the final data acquisition result, the RC package contains a set of LDG models corresponding, respectively, to the equipment items in the P&ID; the attachments of the equipment and the connections among the equipment are also recorded in the RC package.


Fig. 7 – The equipment interconnection.

Fig. 8 – The RC package graphic representation.

Table 1 – Comparison of the results: LDG model vs. human experts (number of results).

                   Causes    Adverse consequences
LDG results        14        18
Expert results     18        21
Common results     14        18

Table 2 – Results that were not deduced by the reasoning machine but were addressed by the human experts.

Equipment: T1; Parameter: Maintenance; Deviation: Maintenance.
Causes: The agitator in T1 needs maintenance.
Consequences: The manhole on the top of the drum, 3 m above the bottom, may result in a falling injury.

Equipment: T1; Parameter: Safety; Deviation: Safety.
Causes: The operation procedure: after the preparation process is complete, T1 should be manually cleaned up. The drum must be purged with N2 first; then the staff should use a copper scoop to clean the residue through the manhole.
Consequences: Asphyxia risk due to the N2 purge.

Equipment: T1; Parameter: Safety; Deviation: Safety.
Causes: Before the preparation process begins, T1 must be washed using solvent oil C. The solvent oil C used in the washing process must be drained to the gusset.
Consequences: Explosion or fire hazard due to the volatilization of C.

Equipment: T2; Parameter: Safety; Deviation: Safety.
Causes: Byproduct E may be generated.

The RC package can be represented by the graphic interface shown in Fig. 8. In the figure, the center box represents the currently focused equipment; its connected equipment items are represented by the smaller boxes placed on the left or right side. The equipment name and type are visible in every box. The focused equipment's properties, including the attachments, the design parameters, the materials within and the deviations that should be considered, are shown in the four corresponding tabs. Fig. 8 shows the attachments of equipment T2 retrieved from the P&ID.

This chemical process, the relevant LDG models and the automatic reasoning method were discussed in the authors' paper introducing the LDG methodology (Cui et al., 2008). The results based on the RC packages are compared with the human experts' HAZOP report.


In total, 100 causes and 86 adverse consequences were deduced by the LDG reasoning machine. Although the automatic reasoning produced far more results than the human experts did, most of them are redundant or meaningless. After data rearrangement and examination, the comparison of the results is shown in Table 1. The missed results, which were raised by the human experts but not deduced by the reasoning machine, are shown in Table 2. It can be concluded that LDGHAZOP identifies 77% of the results produced by the human experts. However, the quality of the automatic reasoning results depends on the knowledge representation ability of the LDG models and on the algorithm in the reasoning machine, which has little to do with the system integration itself.

5. Conclusions and future work

In this paper, the idea of integrating HAZOP expert systems with the commercial process design software package SPPID is introduced. The framework of the integration is presented, including the function modules of the integrated system, the essential data needed by the integration, the integration strategy and so on. It can be seen that the automatic feature extraction application can easily and rapidly translate the P&IDs into properly structured data that a HAZOP expert system can understand and utilize. The integration saves much effort and time in specifying the process, which significantly eases HAZOP study over the whole lifecycle of a chemical plant. Although SPPID and LDGHAZOP are used as the examples of system integration in this paper, the strategies and methods proposed, such as the ontology and the platform-independent SOA, are generic, so the system has the potential to be integrated with other CAD systems without major modifications. Future work may include extending the ontology library and developing new data acquisition modules for other CAD systems. In addition, since the current LDG model library is small, expanding it is also an important future task.

Acknowledgement
The work is supported by the National Natural Science Foundation of China, 20776010.

References
An, H., Chung, P., McDonald, J. and Madden, J., 2009, Computer-aided identification of isolation boundary for safe maintenance and cause and effect analysis for assessing safeguards. International Journal of Process Systems Engineering, 1(1): 29–45.
Bartolozzi, V., Castiglione, L., Picciotto, A. and Galluzzo, M., 2000, Qualitative models of equipment units and their use in automatic HAZOP analysis. Reliability Engineering & System Safety, 70(1): 49–57.
Bell, M., 2002, Introduction to service-oriented modeling, in Service-Oriented Modeling: Service Analysis, Design and Architecture. Wiley & Sons, p. 3, ISBN 978-0-470-14111-3.
Cui, L., Zhao, J., Qiu, T. and Chen, B., 2008, Layered digraph model for HAZOP analysis of chemical processes. Process Safety Progress, 27(4): 293–305.
Fielding, R.T. and Taylor, R.N., 2002, Principled design of the modern Web architecture. ACM Transactions on Internet Technology (TOIT), 2: 115–150.
Joop, F., 2007, Design safety into your plant, in Proceedings of Mary Kay O'Connor Process Safety Center: Beyond Regulatory Compliance: Making Safety Second Nature, TX, USA, October 23–24, pp. 231–243.
Kang, B., Shin, D. and Yoon, E.S., 2003, Automation of the safety analysis of batch processes based on multi-modeling approach. Control Engineering Practice, 11(8): 871–880.
Khan, F.I. and Abbasi, S.A., 1997a, Mathematical model for HAZOP study time estimation. Journal of Loss Prevention in the Process Industries, 10(4): 249–257.
Khan, F.I. and Abbasi, S.A., 1997b, TOPHAZOP: a knowledge-based software tool for conducting HAZOP in a rapid, efficient yet inexpensive manner. Journal of Loss Prevention in the Process Industries, 10(5/6): 333–343.
Lampathaki, F., Mouzakitis, S., Gionis, G., Charalabidis, Y. and Askounis, D., 2009, Business to business interoperability: a current review of XML data integration standards. Computer Standards & Interfaces, 31(6): 1045–1055.
Levenshtein, V.I., 1965, Binary codes capable of correcting deletions, insertions, and reversals. Soviet Physics Doklady, 10: 707–710.
McCoy, S.A., Wakeman, S.J., Larkin, F.D., Jefferson, M.L., Chung, P.W.H., Rushton, A.G., Lees, F.P. and Heino, P.M., 1999, HAZID, a computer aid for hazard identification: 2. Unit model system. Process Safety and Environmental Protection, 77(6): 328–334.
Pons, M., 2003, Industrial implementations of the CAPE-OPEN standard, in AIDIC Conference Series, vol. 6, pp. 253–262.
Sheth, A.P., 1998, Changing focus on interoperability in information systems: from system, syntax, structure to semantics, in Interoperating Geographic Information Systems, Goodchild, M.F., Egenhofer, M.J., Fegeas, R. and Kottman, C.A. (eds). Kluwer Academic Publishers, pp. 5–30.
Swann, C.D. and Preston, M.L., 1995, Twenty-five years of HAZOPs. Journal of Loss Prevention in the Process Industries, 8(6): 349–353.
Venkatasubramanian, V., Zhao, J. and Viswanathan, S., 2000, Intelligent systems for HAZOP analysis of complex process plants. Computers and Chemical Engineering, 24(9): 2291–2302.
Viswanathan, S., Zhao, J. and Venkatasubramanian, V., 1999, Integrating operating procedure synthesis and hazards analysis automation tools for batch processes. Computers and Chemical Engineering, 23 (Suppl., Proceedings of ESCAPE-9): 747–750.
Zhao, C., Bhushan, M. and Venkatasubramanian, V., 2005, PHASUITE: an automated HAZOP analysis tool for chemical processes. Part I: knowledge engineering framework. Process Safety and Environmental Protection, 83(B6): 509–532.
Zhao, C., Shimada, Y. and Venkatasubramanian, V., 2004, An integrated system to support design of safer batch processes, in Proceedings of ESCAPE-14.
Zhao, J., Cui, L., Zhao, L., Qiu, T. and Chen, B., 2008, Learning HAZOP expert system by case based reasoning and ontology. Computers & Chemical Engineering, 33(1): 371–378.
