
ISA‑62443-1-2 – 2018

Security for Industrial Automation and Control Systems:

Master Glossary of Terms and Abbreviations

Draft 2, Edit 1
February 2018

ISA

Security for Industrial Automation and Control Systems:

Master Glossary of Terms and Abbreviations

ISBN: -to-be-assigned-

Copyright © 2018 by ISA. All rights reserved. Not for resale. Printed in the United States of
America.

ISA
67 T.W. Alexander Drive
P. O. Box 12277
Research Triangle Park, NC 27709 USA

6 PREFACE
7 This preface, as well as all footnotes and annexes, is included for information purposes and is not
8 part of ISA‑62443-1-2.

9 This document has been prepared as part of the service of ISA, the International Society of
10 Automation, toward a goal of uniformity in the field of instrumentation. To be of real value, this
11 document should not be static but should be subject to periodic review. Toward this end, the
12 Society welcomes all comments and criticisms and asks that they be addressed to the Secretary,
13 Standards and Practices Board; ISA; 67 T.W. Alexander Drive; P. O. Box 12277; Research
14 Triangle Park, NC 27709; Telephone (919) 549-8411; Fax (919) 549-8288; E-mail:
15 standards@isa.org.

16 The ISA Standards and Practices Department is aware of the growing need for attention to the
17 metric system of units in general, and the International System of Units (SI) in particular, in the preparation of
18 instrumentation standards. The Department is further aware of the benefits to USA users of ISA
19 standards of incorporating suitable references to the SI (and the metric system) in their business
20 and professional dealings with other countries. Toward this end, this Department will endeavor to
21 introduce SI-acceptable metric units in all new and revised standards, recommended practices
22 and technical reports to the greatest extent possible. Standard for Use of the International
23 System of Units (SI): The Modern Metric System, published by the American Society for Testing
24 and Materials as IEEE/ASTM SI 10-97, and future revisions, will be the reference guide for
25 definitions, symbols, abbreviations, and conversion factors.

26 It is the policy of ISA to encourage and welcome the participation of all concerned individuals and
27 interests in the development of ISA standards, recommended practices and technical reports.
28 Participation in the ISA standards-making process by an individual in no way constitutes
29 endorsement by the employer of that individual, of ISA or of any of the standards, recommended
30 practices and technical reports that ISA develops.

31 CAUTION – ISA adheres to the policy of the American National Standards Institute with
32 regard to patents. If ISA is informed of an existing patent that is required for use of the
33 standard, it will require the owner of the patent to either grant a royalty-free license for use
34 of the patent by users complying with the standard or a license on reasonable terms and
35 conditions that are free from unfair discrimination.

36 Even if ISA is unaware of any patent covering this Standard, the user is cautioned that
37 implementation of the standard may require use of techniques, processes or materials
38 covered by patent rights. ISA takes no position on the existence or validity of any patent
39 rights that may be involved in implementing the standard. ISA is not responsible for
40 identifying all patents that may require a license before implementation of the standard or
41 for investigating the validity or scope of any patents brought to its attention. The user
42 should carefully investigate relevant patents before using the standard for the user’s
43 intended application.

44 However, ISA asks that anyone reviewing this standard who is aware of any patents that
45 may impact implementation of the standard notify the ISA Standards and Practices
46 Department of the patent and its owner.

47 Additionally, the use of this standard may involve hazardous materials, operations or
48 equipment. The standard cannot anticipate all possible applications or address all possible
49 safety issues associated with use in hazardous conditions. The user of this standard must
50 exercise sound professional judgment concerning its use and applicability under the
51 user’s particular circumstances. The user must also consider the applicability of any
52 governmental regulatory limitations and established safety and health practices before
53 implementing this standard.

54 ISA (www.isa.org) is a nonprofit professional association that sets the standard for those who
55 apply engineering and technology to improve the management, safety, and cybersecurity of
56 modern automation and control systems used across industry and critical infrastructure. Founded
57 in 1945, ISA develops widely used global standards; certifies industry professionals; provides
58 education and training; publishes books and technical articles; hosts conferences and exhibits;
59 and provides networking and career development programs for its 40,000 members and 400,000
60 customers around the world.

61 ISA owns Automation.com, a leading online publisher of automation-related content, and is the
62 founding sponsor of The Automation Federation ( www.automationfederation.org), an association
63 of non-profit organizations serving as "The Voice of Automation." Through a wholly owned
64 subsidiary, ISA bridges the gap between standards and their implementation with the ISA
65 Security Compliance Institute (www.isasecure.org) and the ISA Wireless Compliance Institute
66 (www.isa100wci.org).

67

68

69 The following people served as active members of ISA99, Working Group 3, Task Group 1 for the
70 preparation of this document:

Name Company Contributor Reviewer


Bruce Billdeaux Maverick Technologies X
Eric Cosman OIT Concepts LLC X
Jeff Potter Emerson Automation X

71

72

73 CONTENTS
74

75 PREFACE ............................................................................................................................... 3
76 FOREWORD ........................................................................................................................... 7
77 INTRODUCTION ..................................................................................................................... 8
78 1 Scope ............................................................................................................................... 9
79 2 Normative references ....................................................................................................... 9
80 3 Terms, definitions, abbreviated terms, acronyms, and conventions ................................. 11
81 3.1 Terms and definitions ............................................................................................ 11
82 3.2 Abbreviated terms and acronyms ........................................................................... 44
83 BIBLIOGRAPHY ................................................................................................................... 51
84
85

86

87 FOREWORD
88 This is part of a multipart standard that addresses the issue of security for industrial automation
89 and control systems. It has been developed by Working Group 03, Task Group 01 of the ISA99
90 committee.

91 This document provides definitions for terms used throughout the ISA‑62443 series of standards.
92
93

94 INTRODUCTION
95 Cyber security is an increasingly important topic in modern organizations. Many organizations
96 involved in information technology (IT) and business have been concerned with cyber security for
97 many years and have well established information security management systems (ISMS) in place
98 as defined by the International Organization for Standardization (ISO) and the International
99 Electrotechnical Commission (IEC) (see ISO/IEC 27001 and ISO/IEC 27002). These management
100 systems provide an organization with a well-established method for protecting its assets from
101 cyber attacks.

102 Today, industrial automation and control system (IACS) organizations frequently use commercial-
103 off-the-shelf (COTS) technology developed for business systems in their everyday processes,
104 which has provided an increased opportunity for cyber-attack against the IACS equipment. For
105 many reasons, these COTS-based systems are not usually as robust at dealing with cyber-attack
106 in the IACS environment as are systems designed specifically for that environment. This
107 weakness may lead to health, safety and environmental (HSE) consequences.

108 Organizations may try to use pre-existing IT and business cyber security solutions to address
109 security for IACS without understanding the consequences. While many of these solutions can be
110 applied to IACS, they need to be applied in the correct way and with cognizance of their side-
111 effects to eliminate or reduce inadvertent consequences.

112

113

114 1 Scope
115 This report defines the terms and abbreviations commonly used throughout the ISA‑62443 series.
116 Wherever possible, definitions have been taken from established available sources (see
117 Bibliography). In some cases, the definitions have been altered, usually slightly, to adapt them to
118 the context of industrial automation and control systems.

119 The following guidelines have been used in defining terms:

120 1) When a term has been identified as needing definition, the IEC and ISO glossaries are
121 checked to see if the term has already been defined.
122 NOTE ISO definitions include those developed by ISO/IEC JTC 1 SC27, which is responsible for IT cyber security
123 standards.
124 2) If there is a potentially relevant IEC or ISO definition, it is determined whether it is suitable
125 and appropriate for the intended usage; if so, then this definition is used.
126 3) If the existing definition(s) are not appropriate or consistent with intended usage, either
127 consider an alternate term or suggest an additional definition to be included in the master
128 glossary. References should be cited for any alternate definitions offered.
129 4) Terms and new or alternate definitions identified for the master glossary are shared with
130 IEC for possible inclusion in their glossary.
131 5) When necessary, the definition may be followed by a NOTE describing qualification or
132 clarification of the definition used.

133 2 Normative references


134 The following referenced documents are indispensable for the application of this document. For
135 dated references, only the edition cited applies. For undated references, the latest edition of the
136 referenced document (including any amendments) applies.

137 ISA‑62443-1-1, Security for industrial automation and control systems – Part 1-1: Terminology,
138 concepts and models

139 ISA‑62443-1-3, Security for industrial automation and control systems – Part 1-3: System
140 security compliance metrics

141 ISA‑TR62443-1-4, Security for industrial automation and control systems – Part 1-4: IACS
142 security lifecycle and use-case

143 ISA‑62443-2-1, Security for industrial automation and control systems – Part 2-1: IACS security
144 management system – Requirements

145 ISA‑TR62443-2-2, Security for industrial automation and control systems – Part 2-2: IACS
146 security management system – Implementation guidance

147 ISA‑TR62443-2-3, Security for industrial automation and control systems – Part 2-3: Patch
148 management in the IACS environment

149 IEC 62443-2-4, Security for industrial automation and control systems – Part 2-4: Installation and
150 maintenance requirements for IACS suppliers

151 ISA‑TR62443-3-1, Security for industrial automation and control systems – Part 3-1: Security
152 technologies for IACS

153 ISA‑62443-3-2, Security for industrial automation and control systems – Part 3-2: Security levels
154 for zones and conduits

155 ISA‑62443-3-3, Security for industrial automation and control systems – Part 3-3: System
156 security requirements and security levels

157 ISA‑62443-4-1, Security for industrial automation and control systems – Part 4-1: Product
158 development requirements

159 ISA‑62443-4-2, Security for industrial automation and control systems – Part 4-2: Technical
160 security requirements for IACS components

161

162 3 Terms, definitions, abbreviated terms, acronyms, and conventions


163 3.1 Terms and definitions
164 The following terms and corresponding definitions shall apply consistently across all standards in
165 the ISA‑62443 series.

166 3.1.1
167 abstract process
168 interaction between one process and another process, or between one process and a participant

169 3.1.2
170 abuse case
171 test case used to perform negative operations of a use case
172 NOTE 1: Abuse case tests are simulated attacks often based on the threat model. An abuse case is a type of complete
173 interaction between a system and one or more actors, where the results of the interaction are intended to be
174 harmful to the system, one of the actors, or one of the stakeholders in the system.
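EXAMPLE The following Python sketch is illustrative only and is not taken from the ISA‑62443 series; parse_setpoint() is a hypothetical validation routine. It expresses an abuse case as negative tests that deliberately supply harmful input and assert that the input is rejected.

    import unittest

    def parse_setpoint(text):
        # Hypothetical validator: accept only a bounded numeric setpoint.
        value = float(text)              # raises ValueError for non-numeric input
        if not 0.0 <= value <= 100.0:
            raise ValueError("setpoint out of range")
        return value

    class AbuseCaseTests(unittest.TestCase):
        def test_rejects_injection_string(self):
            # Negative operation of the "enter setpoint" use case.
            with self.assertRaises(ValueError):
                parse_setpoint("105; DROP TABLE alarms")

        def test_rejects_out_of_range_value(self):
            with self.assertRaises(ValueError):
                parse_setpoint("1e9")

    if __name__ == "__main__":
        unittest.main()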
175 3.1.3
176 access
177 ability and means to communicate with or otherwise interact with a system in order to use system
178 resources
179 NOTE 1: Access may involve physical access (authorization to be allowed physically in an area, possession of a
180 physical key, lock PIN code, access card or biometric attributes that allow access) or logical access (authorization to
181 log in to a system and application, through a combination of logical and physical means).
182 3.1.4
183 access authority
184 entity responsible for monitoring and granting access privileges to IACSs and their associated
185 industrial network for other authorized entities

186 3.1.5
187 access control
188 protection of system resources against unauthorized access; a process by which use of system
189 resources is regulated according to a security policy and is permitted by only authorized entities
190 (users, programs, processes, or other systems) according to that policy
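EXAMPLE A minimal Python sketch of regulating use of system resources according to a security policy; the roles, operations and POLICY table are hypothetical and for illustration only.

    # Security policy: which operations each role is permitted to perform.
    POLICY = {
        "operator": {"read_process_values"},
        "engineer": {"read_process_values", "modify_setpoints"},
    }

    def is_permitted(role, operation):
        # Use of the resource is allowed only if the policy grants it.
        return operation in POLICY.get(role, set())

    assert is_permitted("engineer", "modify_setpoints")
    assert not is_permitted("operator", "modify_setpoints")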

191 3.1.6
192 accountability
193 property of a system (including all of its system resources) that ensures that the actions of a
194 system entity may be traced uniquely to that entity, which can be held responsible for its actions

195 3.1.7
196 achieved security level
197 measure of the security level achieved in the deployed security architecture, sometimes referred
198 to elsewhere as the “as-built” security level
199 NOTE 1: The actual security level will vary over time based on natural degradations, induced events and maintenance of
200 security mechanisms.
201 3.1.8
202 activity
203 work performed using a well-defined process
204 NOTE 1: An activity can be atomic or non-atomic (compound).
205 NOTE 2: Process, sub-process and task are types of activity.
206 3.1.9
207 actuator
208 actuating element connected to process equipment and to the control system

209 3.1.10
210 administrator
211 user role whose responsibilities include implementing security policies for a system

212 3.1.11
213 aggregation
214 strong form of association
215 NOTE 1: Aggregation often implies, but does not require, that the aggregated objects have mutual dependencies.
216 3.1.12
217 application
218 software programs executing on the infrastructure that are used to interface with the process or
219 the control system itself
220 NOTE 1: Attributes include being executable; applications typically execute on personal computers (PCs) or embedded controllers.

221 3.1.13
222 application layer protocols
223 protocols specific to executing network applications such as email and file transfer.
224 NOTE 1: These protocols correspond to Layer 7 of the OSI reference model defined in ISO 7498, Information Technology - Open Systems
225 Interconnection - Basic Reference Model (www.iso.ch). Many modern industrial control
226 systems include fieldbus networks, which do not normally implement all seven layers but do have an application layer.
227 3.1.14
228 area
229 subset of a site’s physical, geographic, or logical group of assets
230 NOTE 1: An area may contain manufacturing lines, process cells, and production units. Areas may be connected to
231 each other by a site local area network and may contain systems related to the operations performed in that area.
232 3.1.15
233 artifact
234 graphical object that provides supporting information about the process or elements within the process
235 NOTE 1: An artifact does not directly affect the flow of the process.
236 3.1.16
237 asset
238 physical, logical and informational object that has value to the organization and is associated with
239 the IACS
240 NOTE 1: In the case of industrial automation and control systems the physical assets that have the largest directly
241 measurable value may be the equipment under control.
242 NOTE 2: IACS assets are generally part of the systems used to manufacture, inspect, manage and ship a product.
243 NOTE 3: In this specific case, an asset is any item that should be protected as part of the IACS security management
244 system.
245 3.1.17
246 asset operator
247 individual or organization responsible for the operation of the IACS

248 3.1.18
249 asset owner
250 individual or organization responsible for one or more IACSs
251 NOTE 1: Used in place of the generic word end user to provide differentiation.
252 NOTE 2: This definition includes the components that are part of the IACS.
253 NOTE 3: In the context of this standard, asset owner also includes the operator of the IACS.

254 3.1.19
255 asymmetric key algorithm
256 see "public key cryptographic algorithm"
257 NOTE 1: With an asymmetric algorithm, the key used to encrypt the digital data to be transmitted is entirely different from
258 the key used to decrypt the data at the receiving end. This is in contrast to symmetric key encryption, in which the same key
259 is used to encrypt and decrypt the data. Asymmetric encryption is logistically more secure because it avoids transfer of the
260 key between transmitter and receiver, where it could be intercepted. Cryptographic methods to protect confidential data are
261 generally more critical for IT networks than for control networks. For IACSs, confidentiality is most critical during the
262 authentication and authorization stages of access control into a given IACS. Cryptography usually adds latency to the IACS
263 network, which is very undesirable for open and closed loop systems that must receive, manipulate and send control data at
264 a rate commensurate with the asset’s process dynamics. Consequently, availability and integrity are usually higher IACS
265 cyber security objectives than confidentiality.
266 3.1.20
267 attack
268 assault on a system that derives from an intelligent threat
269 NOTE 1: For example, an intelligent act that is a deliberate attempt (especially in the sense of a method or technique)
270 to evade security services and violate the security policy of a system
271 NOTE 2: There are different commonly recognized classes of attack. An “active attack” attempts to alter system
272 resources or affect their operation. A “passive attack” attempts to learn or make use of information from the system but
273 does not affect system resources. An “inside attack” is an attack initiated by an entity inside the security perimeter (an
274 “insider”), i.e., an entity that is authorized to access system resources but uses them in a way not approved by those
275 who granted the authorization. An “outside attack” is initiated from outside the perimeter, by an unauthorized or
276 illegitimate user of the system (including an insider attacking from outside the security perimeter). Potential outside
277 attackers range from amateur pranksters to organized criminals, international terrorists, and hostile governments.
278 3.1.21
279 attack surface
280 physical and functional interfaces of a system that can be accessed and, therefore, potentially
281 exploited
282 NOTE 1: The size of the attack surface for a software interface is proportional to the number of methods and
283 parameters defined for the interface. Simple interfaces, therefore, have smaller attack surfaces than complex
284 interfaces.
285 NOTE 2: The size of the attack surface and the number of vulnerabilities are not necessarily related to each other.
286 3.1.22
287 attribute
288 observable, discrete, inherent characteristic, trait, quantity or quality of an entity

289 3.1.23
290 audit
291 independent review and examination of records and activities to assess the adequacy of system
292 controls, to ensure compliance with established policies and operational procedures, and to
293 recommend necessary changes in controls, policies, or procedures (see “security audit”)
294 NOTE 1: There are three forms of audit. (1) External audits are conducted by parties who are not employees or
295 contractors of the organization. (2) Internal audits are conducted by a separate organizational unit dedicated to internal
296 auditing. (3) Controls self-assessments are conducted by peer members of the process automation function.
297 3.1.24
298 audit log
299 event log that requires a higher level of integrity protection than provided by typical event logs
300 NOTE 1: Audit logs are used to protect against claims that repudiate responsibility for an action.
301 3.1.25
302 authentication
303 provision of assurance that a claimed characteristic of an identity is correct
304 NOTE 1: Authentication is usually a prerequisite to allowing access to resources in a control system.
305 NOTE 2: Not all credentials used to authenticate an identity are created equally. The trustworthiness of the credential is
306 determined by the configured authentication mechanism. Hardware or software-based mechanisms can force users to

307 prove their identity before accessing data on a device. A typical example is proving the identity of a user
308 through an identity provider.
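EXAMPLE A minimal Python sketch of password-based authentication using a salted, iterated hash from the standard library; the iteration count and stored-credential handling are illustrative only.

    import hashlib, hmac, os

    def enroll(password):
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
        return salt, digest                             # stored credential

    def authenticate(password, salt, stored_digest):
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
        return hmac.compare_digest(candidate, stored_digest)   # constant-time compare

    salt, digest = enroll("correct horse battery staple")
    assert authenticate("correct horse battery staple", salt, digest)
    assert not authenticate("wrong password", salt, digest)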
309 3.1.26
310 authenticator
311 means used to confirm the identity of a user (human, software process or device)
312 NOTE 1: For example, a password or token may be used as an authenticator.
313 3.1.27
314 authenticity
315 property that an entity is what it claims to be
316 NOTE 1: It may also be defined as confidence in the validity of a transmission, a message, or message originator.
317 NOTE 2: Authenticity is typically used in the context of confidence in the identity of an entity, or the validity of a
318 transmission, a message or message originator.
319 3.1.28
320 authorization
321 right or a permission that is granted to a system entity to access a system resource

322 3.1.29
323 automatic
324 process or equipment that, under specified conditions, functions without human intervention

325 3.1.30
326 Automation Solution
327 control system and any complementary hardware and software components that have been
328 installed and configured to operate in an IACS
329 NOTE 1: Automation Solution is used as a proper noun in this part of ISA‑62443.
330 NOTE 2: The difference between the control system and the Automation Solution is that the control system is
331 incorporated into the Automation Solution design (e.g., a specific number of workstations, controllers, and devices in a
332 specific configuration), which is then implemented. The resulting configuration is referred to as the Automation Solution.
333 NOTE 3: The Automation Solution may be comprised of components from multiple suppliers, including the product
334 supplier of the control system.
335 3.1.31
336 availability
337 property of ensuring timely and reliable access to and use of control system information and
338 functionality

339 3.1.32
340 bandwidth
341 capacity of a communication channel to pass data through the channel in a given amount of time,
342 usually expressed in bits per second
343 NOTE 1: Control and SCADA messages are usually smaller, but more consistent, in size than the traffic traditionally
344 carried by IT networks. Nonetheless, the move to fieldbus systems is requiring higher bandwidths because such systems
345 use less wiring and perform control algorithms without the use of a master station or
346 PLC.
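EXAMPLE For illustration only: transferring 1 megabyte (8 000 000 bits, using decimal megabytes) over a channel with a bandwidth of 1 Mbit/s takes approximately 8 s, ignoring protocol overhead.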
347 3.1.33
348 banned function
349 software method that should not be allowed in software because more secure versions exist with
350 less propensity for misuse
351 NOTE 1: Banned functions are sometimes called banned methods or banned application programming interfaces
352 (APIs).
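EXAMPLE An illustrative Python analogue (not taken from any published banned-function list): eval() evaluates arbitrary expressions and is therefore easy to misuse, while ast.literal_eval() accepts only literal data structures.

    import ast

    untrusted = "[1, 2, 3]"

    # value = eval(untrusted)             # avoid: executes arbitrary expressions
    value = ast.literal_eval(untrusted)   # safer alternative with less propensity for misuse
    assert value == [1, 2, 3]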
353 3.1.34
354 basic process control system
355 system that responds to input signals from the process, its associated equipment, other
356 programmable systems and/or an operator and generates output signals causing the process and

357 its associated equipment to operate in the desired manner but does not perform any safety
358 instrumented functions (SIF)
359 NOTE 1: Safety instrumented functions are specified in the IEC 61508 series.
360 NOTE 2: The term “process” in this definition may apply to a variety of industrial processes, including continuous
361 processes and manufacturing processes.
362 3.1.35
363 best practices (security)
364 guidelines for securely designing, developing, testing, maintaining or retiring products that are
365 commonly recommended by both the security and industrial automation communities
366 NOTE 1: Examples of secure design best practices include least privilege, economy of mechanism, and least common
367 mechanism.
368 3.1.36
369 border
370 edge or boundary of a physical or logical security zone

371 3.1.37
372 botnet
373 collection of software robots, or bots, which run autonomously
374 NOTE 1: A botnet's originator can control the group remotely, possibly for nefarious purposes.

375 3.1.38
376 boundary
377 software, hardware, or other physical barrier that limits access to a system or part of a system

378 3.1.39
379 boundary device
380 communication security asset, within a zone or conduit, that provides an interface between a
381 zone and a conduit

382 3.1.40
383 bug
384 flaw in the original development of software (such as a security vulnerability), which causes it to
385 perform or behave in an unintended manner (such as cause reliability or operability issues)

386 3.1.41
387 business process modeling notation
388 graphical representation for specifying business processes in a business process model
389 NOTE 1: BPMN is maintained by the Object Management Group (OMG). The current version of BPMN is 2.0. With
390 BPMN 2.0, the language contains not only notational information but also execution semantics.
391 3.1.42
392 business requirement
393 specification of characteristics of the business process that must be satisfied by the Solution

394 3.1.43
395 business system
396 collection of information technology elements (i.e., hardware, software and services) installed
397 with the intent to facilitate an organization’s business process or processes (administrative or
398 project)

399 3.1.44
400 cell
401 lower-level element of a manufacturing process that performs manufacturing, field device control,
402 or vehicle functions

403 NOTE 1: Entities at this level may be connected together by an area control network and may contain information
404 systems related to the operations performed in that entity.
405 3.1.45
406 certificate
407 see Public Key Certificate

408 3.1.46
409 certification authority
410 entity in a Public Key Infrastructure (PKI) that is responsible for issuing certificates and exacting
411 compliance to a PKI policy

412 3.1.47
413 channel
414 specific logical or physical communication link between assets located in two separate zones
415 NOTE 1: A channel facilitates the establishment of a connection.
416 3.1.48
417 ciphertext
418 data that has been transformed by encryption so that its semantic information content (i.e., its
419 meaning) is no longer intelligible or directly available

420 3.1.49
421 clear text
422 data in which the semantic information content (i.e., the meaning) is intelligible or is directly
423 available

424 3.1.50
425 client
426 device or application receiving or requesting services or information from a server application

427 3.1.51
428 communication channel
429 specific logical or physical communication link between assets
430 NOTE 1: A channel facilitates the establishment of a connection.
431 3.1.52
432 communication path
433 logical connection between a source and one or more destinations, which could be devices,
434 physical processes, data items, commands, or programmatic interfaces
435 NOTE 1: The communication path is not limited to wired or wireless networks, but includes other means of
436 communication such as memory, procedure calls, state of physical plant, portable media, and human interactions.
437 3.1.53
438 communication system
439 arrangement of hardware, software, and propagation media to allow the transfer of messages
440 (ISO/IEC 7498 application layer service data units) from one application to another

441 3.1.54
442 compensating countermeasure
443 countermeasure employed in lieu of or in addition to inherent security capabilities to satisfy one
444 or more security requirements
445 NOTE 1: Examples include:
446 a) (component-level): locked cabinet around a controller that does not have sufficient cyber access control
447 countermeasures.
448 b) (control system/zone-level): physical access control (guards, gates and guns) to protect a control room to
449 restrict access to a group of known personnel to compensate for the technical requirement for personnel to be
450 uniquely identified by the IACS.

451 c) (component-level): a vendor’s programmable logic controller (PLC) cannot meet the access control capabilities
452 required by an end-user, so the vendor puts a firewall in front of the PLC and sells it as a system.
453 3.1.55
454 compliance authority
455 entity with jurisdiction to determine the adequacy of a security assessment, implementation or
456 effectiveness as specified in a governing document
457 NOTE 1: Examples of compliance authorities include government agencies, regulators, and internal auditors.
458 3.1.56
459 compound activity
460 flow that proceeds from one flow object to another, via a sequence flow link, but is subject to
461 either conditions or dependencies from another flow
462 NOTE 1: Typically, this is a sequence flow between two activities with a conditional indicator or sequence flow.

463 3.1.57
464 compromise
465 violation of the security of a system such that an unauthorized disclosure or modification of
466 sensitive information may have occurred, or unauthorized behavior of the controlled physical
467 process may have occurred

468 3.1.58
469 conduit
470 logical grouping of communication channels connecting two or more zones that share
471 common security requirements
472 NOTE 1: This is analogous to the way that a physical conduit protects cables from physical damage.
473 NOTE 2: A conduit can traverse a zone if the security of the channels contained within the conduit is not impacted by
474 the zone.
475 3.1.59
476 confidentiality
477 assurance that information is not disclosed to unauthorized individuals, processes, or devices
478 NOTE 1: When used in the context of an IACS, confidentiality refers to protecting IACS data and information from
479 unauthorized access.
480 3.1.60
481 configuration management
482 discipline of identifying the components of an evolving system for the purposes of controlling
483 changes to those components and maintaining continuity and traceability throughout the lifecycle

484 3.1.61
485 conformance
486 fulfillment of a specified requirement; adherence of an entity to the requirements of one or more
487 specific specifications or standards

488 3.1.62
489 conformity
490 fulfillment of specified requirements by a product, process or service to determine if a product
491 remains as intended
492 NOTE 1: The term “conformance” is synonymous but deprecated.
493 3.1.63
494 conformity evaluation
495 systematic evaluation of the extent to which a product, process, or service fulfills specified
496 requirements
497 NOTE 1: Conformity evaluation is also known as conformity assessment.

498 NOTE 2: Conformity assessment is necessary to assure that a quality system is in place that ensures that the product or
499 system configuration does not change from that which was tested.
500 3.1.64
501 conformity surveillance
502 evaluation for conformity to determine the continuing conformity with specified requirements

503 3.1.65
504 connection
505 association established between two or more endpoints which supports the establishment of a
506 session

507 3.1.66
508 consequence
509 condition or state that logically or naturally follows from an event

510 3.1.67
511 consultant
512 subcontractor that provides expert advice or guidance to the integration or maintenance service
513 provider

514 3.1.68
515 control center
516 central location used to operate a set of assets
517 NOTE 1: Infrastructure industries typically use one or more control centers to supervise or coordinate their operations.
518 If there are multiple control centers (for example, a backup center at a separate site), they are typically connected
519 together via a wide area network. The control center contains the SCADA host computers and associated operator
520 display devices plus ancillary information systems such as a historian.
521 NOTE 2: In some industries the term “control room” may be more commonly used.
522 3.1.69
523 control equipment
524 class that includes distributed control systems, programmable logic controllers, SCADA systems,
525 associated operator interface consoles, and field sensing and control devices used to manage
526 and control the process
527 NOTE 1: The term also includes field bus networks where control logic and algorithms are executed on intelligent
528 electronic devices that coordinate actions with each other, as well as systems used to monitor the process and the
529 systems used to maintain the process.
530 3.1.70
531 control network
532 time-critical network that is typically connected to equipment that controls physical processes
533 NOTE 1: The control network can be subdivided into zones, and there can be multiple separate control networks within
534 one company or site.
535 3.1.71
536 control system
537 hardware and software components of an IACS
538 NOTE 1: Control systems are composed of field devices, embedded control devices, network devices, and host devices
539 (including workstations and servers).
540 NOTE 2: Control systems are represented in the Automation Solution by a BPCS and an optional SIS.
541 3.1.72
542 cost
543 value of impact to an organization or person that can be measured

544 3.1.73
545 countermeasure
546 action, device, procedure, or technique that reduces a threat, a vulnerability, or an attack by
547 eliminating or preventing it, by minimizing the harm it can cause, or by discovering and reporting
548 it so that corrective action can be taken
549 NOTE 1: The term “control” is also used to describe this concept in some contexts. The term countermeasure has been
550 chosen for this standard to avoid confusion with the term “control” in the context of “process control” and “control
551 system”.
552 3.1.74
553 cryptographic algorithm
554 algorithm based upon the science of cryptography, including encryption algorithms, cryptographic
555 hash algorithms, digital signature algorithms, and key agreement algorithms
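EXAMPLE A short Python illustration of one class of algorithm listed above, a cryptographic hash (SHA-256 from the standard library); the input string is arbitrary.

    import hashlib

    digest = hashlib.sha256(b"IACS event record").hexdigest()
    assert len(digest) == 64   # any change to the input produces a completely different digest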

556 3.1.75
557 cryptographic key
558 input parameter that varies the transformation performed by a cryptographic algorithm
559 NOTE 1: Usually shortened to just “key.”
560 3.1.76
561 cyber physical component
562 part of a system with integrated computational and physical capabilities
564 3.1.77
565 cyber risk reduction factor
566 measure of the degree of cyber security risk reduction required to achieve tolerable risk
NOTE 1: The cyber risk reduction factor (CRRF) can be expressed as the ratio of unmitigated risk to tolerable risk.
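EXAMPLE A purely illustrative reading of NOTE 1, with hypothetical values: if the unmitigated risk is 100 (in arbitrary risk units) and the tolerable risk is 10, then CRRF = 100 / 10 = 10.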

567 3.1.78
568 cyber robustness
569 ability of an item to continue operating properly in the event of interference, up to a certain level
570 of interference
571 NOTE 1: The robustness of an item may be increased with measures that modify one or more of its operational
572 parameters.
573 NOTE 2: Cyber robustness includes the ability to withstand changes in procedure or circumstance, and the ability to
574 cope with variations in the operating environment with minimal or no damage, alteration or loss of functionality.
575 3.1.79
576 cyber security
577 measures taken to protect a computer or computer system against unauthorized access or attack
578 NOTE 1: In the context of an IACS, all references in ISO/IEC 27001 to an information security system refer to cyber
579 security.
580 3.1.80
581 cyberattack
582 attempt by digital means to destroy, alter, disable, steal, or gain unauthorized access to or make
583 unauthorized use of an asset
584 NOTE 1: Cyberattacks include targeted and non-targeted (e.g., malware) attacks by digital means. Cyberattack is
585 synonymous with digital attack.

586 3.1.81
587 cybersecurity
588 set of activities and measures the objective of which is to detect, react to, and prevent:
589 a) malicious disclosures of information (confidentiality) that could be used to perform malicious acts
590 which could lead to an accident, an unsafe situation, or IACS performance degradation;
591 b) malicious modifications (integrity) of functions that may compromise the delivery or integrity of
592 the required service (including loss of control), which could lead to an accident, an unsafe
593 situation, or IACS performance degradation;
594 c) malicious withholding or prevention of access to or communication of information, data or
595 resources (including loss of view) that could compromise the delivery of the required service by the
596 IACS (availability), which could lead to an accident, an unsafe situation, or IACS performance degradation.
597 NOTE 1: It is recognized that the term “cybersecurity” has a broader meaning in other standards and guidance, often
598 including non-malevolent threats, human errors, and protection against natural disasters. Those aspects, except human
599 errors degrading security controls, are not included in this document.
600 NOTE 2: Computer security, security and cybersecurity are considered synonymous in this document.
601 3.1.82
602 cybersecurity feature
603 measure, control, or function specifically designed for cybersecurity purposes
604 NOTE 1: The implementation of non-cybersecurity features can have a negative, neutral, or even positive impact on
605 cybersecurity. This is particularly the case for some safety features, as discussed in this standard.
606 3.1.83
607 data confidentiality
608 property that information is not made available or disclosed to any unauthorized system entity,
609 including unauthorized individuals, entities, or processes

610 3.1.84
611 data integrity
612 property that data has not been changed, destroyed, or lost in an unauthorized or accidental
613 manner
614 NOTE 1: This term deals with constancy of and confidence in data values, not with the information that the values
615 represent or the trustworthiness of the source of the values.
616 3.1.85
617 data link layer protocols
618 protocols within the data link layer of the OSI model for interpreting electrical signals as data, conducting error
619 checking, performing physical addressing and conducting media access control; these protocols
620 exist in most IT enterprise systems connected to control LANs and, in some cases, in the
621 protocols of industrial networks

622 3.1.86
623 data type (UML)
624 representation, interpretation, and structure of values used in computer systems and other
625 automatic equipment
626 NOTE 1: Deprecated in favor of value type.
627 3.1.87
628 decision
629 conditional indicator where a sequence flow can take one of several alternative paths

630 3.1.88
631 decryption
632 process of changing cipher text into plaintext using a cryptographic algorithm and key

633 3.1.89
634 defense in depth
635 provision of multiple security protections, especially in layers, with the intent to delay if not
636 prevent an attack
637 NOTE 1: Defense in depth implies layers of security and detection, even on single systems, and provides the following
638 features: a) attackers are faced with breaking through or bypassing each layer without being detected; b) a flaw in one
639 layer can be protected by capabilities in other layers; c) system security becomes a set of layers within the overall
640 network security.

641 3.1.90
642 degraded mode
643 mode of operation in the presence of faults that have been anticipated in the design of the control
644 system
645 NOTE 1: Degraded modes allow the control system to continue to provide essential functions despite the deficiency of
646 one or several system elements, for example malfunction or outage of control equipment, disruption of communication
647 due to failure or intentional system isolation in response to identified or suspected compromise of subsystems.
648 3.1.91
649 demilitarized zone
650 common, limited network of servers joining two or more zones for the purpose of controlling data
651 flow between zones
652 NOTE 1: The purpose of a demilitarized zone is to enforce the internal network’s policy for external information
653 exchange and to provide external, untrusted sources with restricted access to releasable information while shielding the
654 internal network from outside attacks.
655 NOTE 2: In the context of industrial automation and control systems, the term “internal network” is typically applied to
656 the network or segment that is the primary focus of protection. For example, a control network could be considered
657 “internal” when connected to an “external” business network.
660 NOTE 3: Demilitarized zones (DMZs) are typically used to avoid direct connections between different zones.
661 3.1.92
662 denial of service (DoS)
663 prevention or interruption of authorized access to a system resource or the delaying of system
664 operations and functions
NOTE 1: In the context of industrial automation and control systems, denial of service can refer to loss of process function, not just loss of data communications.

665 3.1.93
666 deprecated function
667 software method that is supported but whose use is no longer recommended
668 NOTE 1: Methods are generally deprecated before becoming obsolete (deleted from the set of functions provided by the
669 supplier of the function). Deprecated functions are sometimes called deprecated methods or deprecated APIs.
670 3.1.94
671 design constraint
672 specification that limits implementation of a Solution or part of a Solution

673 3.1.95
674 device
675 asset incorporating one or more processors with the capability of sending or receiving
676 data/control to or from another asset
677 NOTE 1: Examples include controllers, human-machine interfaces (HMIs), PLCs, remote terminal units (RTUs),
678 transmitters, actuators, valves, network switches, etc.
679 3.1.96
680 digital signature
681 result of a cryptographic transformation of data which, when properly implemented, provides the
682 services of origin authentication, data integrity, and signer non-repudiation
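EXAMPLE A minimal Python sketch of signing and verification; it assumes the third-party "cryptography" package and an Ed25519 key pair, and the signed message is arbitrary.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    private_key = Ed25519PrivateKey.generate()
    signature = private_key.sign(b"firmware image v1.2")       # signer uses the private key

    public_key = private_key.public_key()
    try:
        public_key.verify(signature, b"firmware image v1.2")   # checks origin and data integrity
        print("signature valid")
    except InvalidSignature:
        print("signature invalid")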

683 3.1.97
684 distributed control system
685 type of control system in which the system elements are dispersed but operated in a coupled
686 manner
687 NOTE 1: Distributed control systems may have shorter coupling time constants than those typically found in SCADA
688 systems.
689 NOTE 2: Distributed control systems are commonly associated with continuous processes such as electric power
690 generation; oil and gas refining; chemical, pharmaceutical and paper manufacture, as well as discrete processes such
691 as automobile and other goods manufacture, packaging, and warehousing.

692 3.1.98
693 distribution
694 see Key Distribution

695 3.1.99
696 domain
697 environment or context that is defined by a security policy, security model, or security
698 architecture to include a set of system resources and the set of system entities that have the right
699 to access the resources

700 3.1.100
701 eavesdropping
702 monitoring or recording of communicated information by unauthorized parties

703 3.1.101
704 electronic security
705 actions required to preclude unauthorized use of, denial of service to, modifications to, disclosure
706 of, loss of revenue from, or destruction of critical systems or informational assets
707 NOTE 1: The objective is to reduce the risk of causing personal injury or endangering public health, losing public or
708 consumer confidence, disclosing sensitive assets, failing to protect business assets or failing to comply with
709 regulations. These concepts are applied to any system in the production process and include both stand-alone and
710 networked components. Communications between systems may be either through internal messaging or by any human
711 or machine interfaces that authenticate, operate, control, or exchange data with any of these control systems.
712 Electronic security includes the concepts of identification, authentication, accountability, authorization, availability, and
713 privacy.
714 3.1.102
715 embedded device
716 special purpose device running embedded software designed to directly monitor, control or
717 actuate an industrial process
718 NOTE 1: Typical attributes include no rotating media, limited number of exposed services, programmed through an
719 external interface, embedded operating systems (OSs) or firmware equivalent, real-time scheduler, may have an
720 attached control panel, and may have a communications interface.
721 NOTE 2: Examples include PLCs, field sensor devices, safety instrumented system (SIS) controllers, distributed control
722 system (DCS) controllers.
723 3.1.103
724 encryption
725 cryptographic transformation of plaintext into ciphertext that conceals the data’s original meaning
726 to prevent it from being known or used (see “decryption”)
727 NOTE 1: If the transformation is reversible, the corresponding reversal process is called “decryption,” which is a
728 transformation that restores encrypted data to its original state.
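EXAMPLE A minimal Python round trip showing that decryption reverses encryption; it assumes the third-party "cryptography" package, and the key handling is illustrative only.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()
    cipher = Fernet(key)

    plaintext = b"valve 17 position: 42%"
    ciphertext = cipher.encrypt(plaintext)      # meaning is no longer directly available
    assert ciphertext != plaintext
    assert cipher.decrypt(ciphertext) == plaintext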
729 3.1.104
730 end event
731 indicator defining where in a path the process will end
732 NOTE 1: In terms of sequence flows, the end event ends the flow of the process, and thus, will not have any outgoing
733 sequence flows.
734 NOTE 2: An end event can have a specific result specified as an annotation attached to the indicator; e.g., message,
735 error, compensation, signal, link and multiple.
736 3.1.105
737 ensure
738 determine with certainty
739 NOTE 1: “with certainty” requires real evidence that, when enabled, the Solution will perform the required action.

740 3.1.106
741 enterprise
742 business entity that produces or transports products or operates and maintains infrastructure
743 services

744 3.1.107
745 entity
746 person, device, or process that exhibits a defined behavior

747 3.1.108
748 environment
749 surrounding objects, region or circumstances which may influence the behavior of the IACS
750 and/or may be influenced by the IACS

751 3.1.109
752 equipment under control
753 equipment, machinery, apparatus or plant used for manufacturing, process, transportation,
754 medical or other activities

755 3.1.110
756 essential function
757 function or capability that is required to maintain health, safety, the environment and availability
758 for the equipment under control
759 NOTE 1: Essential functions include, but are not limited to, the safety instrumented function (SIF), the control function
760 and the ability of the operator to view and manipulate the equipment under control. The loss of essential functions is
761 commonly termed loss of protection, loss of control and loss of view respectively. In some industries additional
762 functions such as history may be considered essential.
763 3.1.111
764 event
765 occurrence of or change to a particular set of circumstances
766 NOTE 1: In an IACS this may be an action taken by an individual (authorized or unauthorized), a change detected
767 within the control system (normal or abnormal) or an automated response from the control system itself (normal or
768 abnormal).
769 3.1.112
770 event context
771 set of activities that can be interrupted by an exception (intermediate event)
772 NOTE 1: An exception can be one activity or a group of activities in an expanded sub-process.
773 3.1.113
774 exception
775 event that occurs during the performance of a process that causes a diversion from the normal
776 flow of the process
777 NOTE 1: Exceptions can be generated by intermediate events, such as time, error, or message.
778 3.1.114
779 exception flow
780 sequence flow path that originates from an intermediate event attached to the boundary of an
781 activity
782 NOTE 1: The process does not traverse this path unless the activity is interrupted by the triggering of a boundary
783 intermediate event (an exception).

784 3.1.115
785 existence proof
786 real evidence showing that an object (proof) exists
787 NOTE 1: In some cases, this means displaying the object or giving the method for finding it.

788 3.1.116
789 expanded sub-process
790 flow detail within the context of its parent process

791 3.1.117
792 extended requirement
793 requirement subtype, which adds some properties to a requirement element
794 NOTE 1: These properties such as source, risk, and verify method are important for requirement management.
795 NOTE 2: Specific profiles should add their own properties.
796 3.1.118
797 external information systems
798 hardware, software components and repositories that are connected by some means or
799 embedded within the component

800 3.1.119
801 firecall
802 method established to provide emergency access to a secure control system
803 NOTE 1: In an emergency situation, unprivileged users can gain access to key systems to correct the problem. When a
804 firecall is used, there is usually a review process to ensure that the access was used properly to correct a problem.
805 These methods generally either provide a one-time use user identifier (ID) or one-time password.
806 3.1.120
807 firewall
808 hardware device or software package that provides filtering and/or provision of rules to allow or
809 deny specific types of network traffic to flow between zones
810 NOTE 1: A firewall may be either an application installed on a general-purpose computer or a dedicated platform
811 (appliance) that forwards or rejects/drops packets on a network. Typically, firewalls are used to define zone borders.
812 Firewalls generally have rules restricting which ports are open.
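EXAMPLE A simplified Python sketch of firewall-style rule evaluation between zones (first matching rule wins, default deny); the zone names, port numbers and rules are hypothetical.

    RULES = [
        {"src": "control",  "dst": "dmz",     "port": 502,  "action": "allow"},
        {"src": "business", "dst": "control", "port": None, "action": "deny"},
    ]

    def evaluate(src, dst, port):
        for rule in RULES:
            if rule["src"] == src and rule["dst"] == dst and rule["port"] in (None, port):
                return rule["action"]
        return "deny"                     # default-deny posture

    assert evaluate("control", "dmz", 502) == "allow"
    assert evaluate("business", "control", 502) == "deny"
    assert evaluate("business", "dmz", 443) == "deny"    # no matching rule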
813 3.1.121
814 flow
815 directional connector between elements in a process, collaboration, or choreography
816 NOTE 1: A sequence flow represents the sequence of flow objects in a process or choreography.
817 NOTE 2: A message flow represents the transmission of a message between collaboration participants.
818 NOTE 3: The term flow is often used to represent the overall progression of how a process or process segment would
819 be performed.
820 3.1.122
821 function
822 task, action, or activity that must be accomplished to achieve a desired outcome
823 NOTE 1: This technical specification includes an adjective to qualify the context of a function; e.g., an authentication
824 function is an action performed to authenticate an entity’s request to access the IACS network.
825 3.1.123
826 functional requirement
827 specification of a behavior that a Solution or part of a solution must perform

828 3.1.124
829 fuzz testing
830 process of creating malformed or unexpected data or call sequences to be consumed by the
831 entity under test to verify that they are handled appropriately without fault
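EXAMPLE A minimal Python fuzzing loop; parse_setpoint() is a hypothetical entity under test, and the input generator and accepted exception type are illustrative only.

    import random, string

    def parse_setpoint(text):
        value = float(text)                       # hypothetical parser under test
        if not 0.0 <= value <= 100.0:
            raise ValueError("setpoint out of range")
        return value

    random.seed(1)
    for _ in range(1000):
        fuzz_input = "".join(random.choices(string.printable, k=random.randint(0, 20)))
        try:
            parse_setpoint(fuzz_input)
        except ValueError:
            pass       # rejected cleanly: expected, acceptable handling
        # any other exception type would indicate a fault worth investigating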

832 3.1.125
833 gateway
834 relay mechanism that attaches to two (or more) computer networks that have similar functions but
835 dissimilar implementations and that enables host computers on one network to communicate with
836 hosts on the other
837 NOTE 1: Also described as an intermediate system that is the translation interface between two computer networks.

838 3.1.126
839 geographic site
840 subset of an enterprise’s physical, geographic, or logical group of assets
841 NOTE 1: A geographic site may contain areas, manufacturing lines, process cells, process units, control centers, and
842 vehicles and may be connected to other sites by a wide area network.
843 3.1.127
844 handover
845 act of turning an Automation Solution over to the asset owner
846 NOTE 1: Handover effectively transfers responsibility for operations and maintenance of an Automation Solution from
847 the integration service provider to the asset owner and generally occurs after successful completion of system test,
848 often referred to as Site Acceptance Test (SAT).
849 3.1.128
850 host
851 computer that is attached to a communication subnetwork or inter-network and can use services
852 provided by the network to exchange data with other attached systems

853 3.1.129
854 host device
855 general purpose device running a general-purpose operating system (for example Microsoft
856 Windows OS or Linux) capable of hosting one or more applications, data stores or functions
857 NOTE 1: Typical attributes include rotating media, no real-time scheduler and full HMI (keyboard, mouse, etc.).
858 3.1.130
859 IACS user
860 entity (including human users, processes and devices) that performs a function in the IACS or a
861 component used by the IACS

862 3.1.131
863 identifier
864 symbol, unique within its security domain, that identifies, indicates or names an entity which
865 makes an assertion or claim of identity

866 3.1.132
867 identify
868 assert an identity of

869 3.1.133
870 impact
871 evaluated consequence of a particular event
872 NOTE 1: Impact may be expressed in terms of numbers of injuries and/or fatalities, extent of environmental damage
873 and/or magnitude of losses such as property damage, material loss, loss of intellectual property, lost production, market
874 share loss, and recovery costs.
875 3.1.134
876 incident
877 event that is not part of the expected operation of a system or service that causes, or may cause,
878 an interruption to, or a reduction in, the quality of the service provided by the control system

879 3.1.135
880 industrial automation and control system
881 collection of personnel, hardware, software, procedures and policies involved in the operation of
882 the industrial process and that can affect or influence its safe, secure and reliable operation
883 NOTE 1: These systems include but are not limited to: a) industrial control systems, including distributed control
884 systems (DCSs), programmable logic controllers (PLCs), remote terminal units (RTUs), intelligent electronic devices,
885 supervisory control and data acquisition (SCADA), networked electronic sensing and control, and monitoring and
886 diagnostic systems. (In this context, process control systems include basic process control system and safety-
887 instrumented system (SIS) functions, whether they are physically separate or integrated.); b) associated information
888 systems such as advanced or multivariable control, online optimizers, dedicated equipment monitors, graphical
889 interfaces, process historians, manufacturing execution systems, and plant information management systems; c)
890 associated internal, human, network, or machine interfaces used to provide control, safety, and manufacturing
891 operations functionality to continuous, batch, discrete, and other processes.
892 NOTE 2: The IACS may include components that are not installed at the asset owner’s site.
893 NOTE 3: The definition of IACS was taken from ISA-62443-3-3 and is illustrated in Figure 2. Examples of IACSs include
894 Distributed Control Systems (DCS) and Supervisory Control and Data Acquisition (SCADA) systems. ISA-62443-2-4
895 also defines the proper noun “Solution” to mean the specific instance of the control system product and possibly
896 additional components that are designed into the IACS. The Automation Solution, therefore, differs from the control
897 system since it represents a specific implementation (design and configuration) of the control system hardware and
898 software components for a specific asset owner.
899 3.1.136
900 Industrial Automation and Control System – Security Management System
901 management system to achieve and maintain the security of an industrial automation and control
902 system
903 NOTE 1: In the context of an IACS, all references in ISO/IEC 27001 to information security management system refer to
904 the IACS-SMS.

905 3.1.137
906 information need
907 insight necessary to manage objectives, goals, risks and problems

908 3.1.138
909 infrastructure
910 further categorized into embedded device, host device and network device

911 3.1.139
912 insider
913 “trusted” person, employee, contractor, or supplier who has information that is not generally
914 known to the public (see “outsider”)

915 3.1.140
916 integration service provider
917 service provider that provides integration activities for an Automation Solution including design,
918 installation, configuration, testing, commissioning, and handover
919 NOTE 1: Integration service providers are often referred to as integrators or Main Automation Contractors (MAC).
920 3.1.141
921 integrity
922 ability to perform the intended function based on stated requirements under specific operating conditions
923 NOTE 1: In a formal security mode, integrity is often interpreted more narrowly to mean protection against unauthorized
924 modification or destruction of information.

925 3.1.142
926 interception
927 capture and disclosure of message contents or use of traffic analysis to compromise the
928 confidentiality of a communication system based on message destination or origin, frequency or
929 length of transmission, and other communication attributes

930 3.1.143
931 interested party
932 person or organization that can affect, be affected by, or perceive themselves to be affected by a
933 decision or activity

934 3.1.144
935 interface
936 logical entry or exit point that provides access to the module for logical information flows

937 3.1.145
938 interface requirement
939 specification of the ports connecting Solutions and parts of a Solution
940 NOTE 1: Optionally, the interface requirement may include the items that flow across the connector and the interface
941 constraints.
942 3.1.146
943 intermediate event
944 event that occurs after a process has started
945 NOTE 1: An intermediate event affects the flow of the process by showing where messages and delays are expected,
946 disrupting the normal flow through exception handling, or showing the extra flow required for compensation.
947 NOTE 2: An intermediate event does not start or directly terminate a process.
948 3.1.147
949 intrusion
950 unauthorized act of compromising a system (see “attack”).

951 3.1.148
952 intrusion detection
953 security service that monitors and analyzes system events for the purpose of finding, and
954 providing real-time or near real-time warning of, attempts to access system resources in an
955 unauthorized manner
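EXAMPLE (informative): The following Python sketch shows a minimal host-based monitor that scans an authentication log for repeated failed logins and prints a near real-time warning. The log format, the file path "auth.log" and the alert threshold are assumptions made for illustration only.

    import re
    from collections import Counter

    # Illustrative pattern for a failed login record; real log formats vary.
    FAILED_LOGIN = re.compile(r"Failed password for (?P<user>\S+) from (?P<ip>\S+)")
    THRESHOLD = 5  # illustrative: warn after this many failures from one source

    def scan_auth_log(path="auth.log"):
        """Count failed logins per source address and warn when the threshold is exceeded."""
        failures = Counter()
        with open(path, encoding="utf-8", errors="replace") as log:
            for line in log:
                match = FAILED_LOGIN.search(line)
                if match:
                    failures[match.group("ip")] += 1
        for ip, count in failures.items():
            if count >= THRESHOLD:
                print(f"WARNING: {count} failed logins from {ip} - possible unauthorized access attempt")

    if __name__ == "__main__":
        scan_auth_log()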

956 3.1.149
957 ISO
958 International Organization for Standardization

959 3.1.150
960 key distribution
961 transport of a key and other keying material from an entity that either owns the key or generates
962 the key to another entity that is intended to use the key

963 3.1.151
964 key management
965 process of handling and controlling cryptographic keys and related material (such as initialization
966 values) during their life cycle in a cryptographic system, including ordering, generating,
967 distributing, storing, loading, escrowing, archiving, auditing, and destroying the keys and related
968 material

969 3.1.152
970 key pair
971 public key and its corresponding private key used with a public key algorithm

972 3.1.153
973 latency
974 time interval between when a message is sent by one device and when it is received by a second device
975 NOTE 1: Latency and jitter are two key parameters that define the performance of a control
976 system. Increased latency in a control loop can be detrimental, since the dynamics of the asset
977 under control dictate the amount of latency that can be tolerated while keeping the control process safe and productive.
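EXAMPLE (informative): The following Python sketch estimates round-trip latency and jitter by timestamping a request/response exchange; send_and_wait_for_ack() is a hypothetical placeholder for the actual device communication call, and the simulated delay is illustrative only.

    import statistics
    import time

    def send_and_wait_for_ack(message: bytes) -> None:
        """Hypothetical placeholder: send a message to a device and wait for its reply."""
        time.sleep(0.005)  # simulate transmission and processing delay

    def measure_latency(samples: int = 20):
        """Return mean round-trip latency and jitter (standard deviation), in milliseconds."""
        round_trips = []
        for _ in range(samples):
            start = time.monotonic()
            send_and_wait_for_ack(b"poll")
            round_trips.append((time.monotonic() - start) * 1000.0)
        return statistics.mean(round_trips), statistics.stdev(round_trips)

    if __name__ == "__main__":
        mean_ms, jitter_ms = measure_latency()
        print(f"mean round-trip latency: {mean_ms:.2f} ms, jitter: {jitter_ms:.2f} ms")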

978 3.1.154
979 least privilege
980 basic principle that holds that users (humans, software processes or devices) should be assigned
981 the fewest privileges consistent with their assigned duties and functions
982 NOTE 1: Least privilege is commonly implemented as a set of roles in an IACS.
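EXAMPLE (informative): The following Python sketch shows least privilege implemented as a set of roles, as described in NOTE 1; the role names and privilege strings are illustrative assumptions and are not defined by this document.

    # Each role carries only the privileges required for its assigned duties.
    ROLE_PRIVILEGES = {
        "operator": {"acknowledge_alarm", "change_setpoint"},
        "engineer": {"acknowledge_alarm", "change_setpoint", "modify_control_algorithm"},
        "viewer":   {"view_process_data"},
    }

    def is_authorized(role: str, privilege: str) -> bool:
        """Grant an action only if the user's role explicitly includes the privilege."""
        return privilege in ROLE_PRIVILEGES.get(role, set())

    assert is_authorized("operator", "change_setpoint")
    assert not is_authorized("viewer", "modify_control_algorithm")
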
983 3.1.155
984 line
985 lower-level element of a manufacturing process that performs manufacturing, field device control,
986 or vehicle functions
987 NOTE 1: See “Cell”.
988 3.1.156
989 local access
990 any access to an organizational IACS by an IACS user communicating through an internal,
991 organization-controlled network (such as a local area network) or directly to the IACS without the
992 use of a network

993 3.1.157
994 local area network
995 communications network designed to connect computers and other intelligent devices in a limited
996 geographic area (typically less than 10 kilometers)

997 3.1.158
998 logical separation
999 separation from a digital data network perspective involving the absence of direct data
1000 communications (i.e. without any proxy or cybersecurity filtering device)

1001 3.1.159
1002 maintenance service provider
1003 service provider that provides support activities for an Automat ion Solution after handover
1004 NOTE 1: Maintenance is often considered to be distinguished from operation (e.g. in common colloquial language it is
1005 often assumed that an Automation Solution is either in operation or under maintenance). Maintenance service providers
1006 can perform support activities during operations, e.g. managing user accounts, security monitoring, and security
1007 assessments.
1008 3.1.160
1009 malicious code
1010 programs or code written for the purpose of gathering information about systems or users,
1011 destroying system data, providing a foothold for further intrusion into a system, falsifying system
1012 data and reports, or providing time-consuming irritation to system operations and maintenance
1013 personnel
1014 NOTE 1: Malicious code attacks can take the form of viruses, worms, Trojan horses, or other automated exploits.
1015 NOTE 2: Malicious code is also often referred to as “malware.”
1016 3.1.161
1017 Management System
1018 set of interrelated or interacting elements of an organization to establish policies and objectives
1019 and processes to achieve those objectives

1020 3.1.162
1021 man-in-the-middle
1022 form of an active wiretapping attack in which the attacker intercepts and selectively modifies
1023 communicated data in order to masquerade as one or more of the entities involved in a
1024 communication association.
1025 NOTE 1: This is sometimes loosely described as snooping and can be especially misleading and destructive in an IACS cyber-attack,
1026 since a control room operator’s screen may indicate safe and normal routine operation while havoc is wrought
1027 on the automated processes and assets in the field.
1028 3.1.163
1029 manufacturing operations
1030 collection of production, maintenance, and quality assurance operations and their relationship to
1031 other activities of a production facility
1032 NOTE 1: Manufacturing operations include: a) manufacturing or processing facility activities that coordinate the
1033 personnel, equipment, and material involved in the conversion of raw materials or parts into products. b) functions that
1034 may be performed by physical equipment, human effort, and information systems. c) managing information about the
1035 schedules, use, capability, definition, history, and status of all resources (personnel, equipment, and material) within
1036 the manufacturing facility.
1037 3.1.164
1038 measure
1039 variable to which a value is assigned as the result of measurement
1040 NOTE 1: This can be adopted as a definition of metric.
1041 3.1.165
1042 measurement
1043 process to determine a value

1044 3.1.166
1045 merge
1046 point in a process where two or more alternative sequence flow paths are combined into one
1047 sequence flow path

1048 3.1.167
1049 message
1050 object that depicts the content of a communication between two participants

1051 3.1.168
1052 message flow
1053 connecting object that shows the flow of messages between two participants

1054 3.1.169
1055 mobile code
1056 program transferred from a remote, possibly “untrusted”, system across a network or via
1057 removable media that can be executed unchanged on a local system without explicit installation
1058 or execution by the recipient
1059 NOTE 1: Examples of mobile code include JavaScript, VBScript, Java applets, ActiveX controls, Flash animations,
1060 Shockwave movies, and Microsoft Office macros.
1061 3.1.170
1062 network device
1063 device that facilitates data flow between devices, or restricts the flow of data, but does not
1064 directly interact with a control process
1065 NOTE 1: Typical attributes include embedded OS or firmware, no HMI, no real-time scheduler, and configuration through
1066 an external interface.
1067 3.1.171
1068 network layer protocol
1069 protocol for routing of messages through a complex network.
1070 NOTE 1: Layer 3 of the OSI reference model. Most modern industrial fieldbus protocols and SCADA protocols usually
1071 contain a network layer

1072 3.1.172
1073 nonconformity
1074 non-fulfilment of a requirement

1075 3.1.173
1076 non-repudiation
1077 ability to prove the occurrence of a claimed event or action and its originating entities
1078 NOTE 1: The purpose of non-repudiation is to resolve disputes about the occurrence or non -occurrence of the event or
1079 action and involvement of entities in the event.
1080 3.1.174
1081 normal flow
1082 flow that originates from a start event and continues through activities on alternate and parallel
1083 paths until reaching an end event

1084 3.1.175
1085 OPC
1086 set of specifications for the exchange of information in a process control environment
1087 NOTE 1: The abbreviation “OPC” originally came from “OLE for Process Control”, where “OLE” was short for “Object
1088 Linking and Embedding.”
1089 3.1.176
1090 organization
1091 person or group of people that has its own functions with responsibilities, authorities and
1092 relationships to achieve its objectives
1093 NOTE 1: The concept of organization includes but is not limited to sole-trader, company, corporation, firm, enterprise,
1094 authority, partnership, charity or institution, or part or combination thereof, whether incorporated or not, public or
1095 private.
1096 NOTE 2: For organizations with more than one operating unit, a single unit may be defined as an organization.
1097 3.1.177
1098 outsider
1099 person or group not “trusted” with inside access, who may or may not be known to the targeted
1100 organization (See “insider”)
1101 NOTE 1: Outsiders may or may not have been insiders at one time.
1102 3.1.178
1103 part under consideration
1104 specification used for analysis

1105 3.1.179
1106 participant
1107 role that controls or is responsible for a process

1108 3.1.180
1109 password
1110 string of characters (letters, numbers, and other symbols) used to authenticate an identity or to
1111 verify access authorization.
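EXAMPLE (informative): The following Python sketch shows how a password can be verified against a salted hash derived with the standard library, so that the password string itself need not be stored; the iteration count and example strings are illustrative assumptions.

    import hashlib
    import hmac
    import os

    def hash_password(password: str, salt: bytes = b"", iterations: int = 200_000):
        """Derive a salted hash from the password so the plaintext need not be stored."""
        salt = salt or os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
        return salt, digest

    def verify_password(password: str, salt: bytes, stored: bytes, iterations: int = 200_000) -> bool:
        """Recompute the hash and compare in constant time to authenticate the claimed identity."""
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
        return hmac.compare_digest(candidate, stored)

    salt, digest = hash_password("correct horse battery staple")
    assert verify_password("correct horse battery staple", salt, digest)
    assert not verify_password("wrong password", salt, digest)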

1112 3.1.181
1113 patch
1114 incremental software change made to address a security vulnerability, a bug, or a reliability or
1115 operability issue (update), or to add a new feature (upgrade)
1116 NOTE 1: Patches may also be called software updates, software upgrades, firmware upgrades, service packs, hotfixes,
1117 basic input output system (BIOS) updates, security advisories and other digital electronic program updates.

1118 3.1.182
1119 patch lifecycle
1120 period from the time that a patch is recommended or created until the patch is installed
1121 NOTE 1: In the context of this technical report, this lifecycle begins when the patch is created and made available.
1122 NOTE 2: Some feel that the patching lifecycle begins when the vulnerability has been disclosed. However, it is not
1123 possible for this technical report to provide all possible guidance for the mitigation of vulnerabilities for the period
1124 between disclosure of a vulnerability, the decision to create a patch and the availability of a patch. It is also at the
1125 discretion of the software developer or product supplier to determine whether to develop a patch.
1126 3.1.183
1127 patch management
1128 area of systems management that involves monitoring, acquiring, testing, scheduling and
1129 installing software patches (code changes) to a product
1130 NOTE 1: See ISA-TR62443-2-3 for additional information.
1131 NOTE 2: Patch management also applies to the process of keeping embedded third-party libraries current before
1132 releasing a product.
1133 3.1.184
1134 penetration
1135 successful unauthorized access to a protected system resource

1136 3.1.185
1137 performance requirement
1138 quantitative measure of the extent to which a Solution or a Solution part satisfies a required
1139 capability or condition objective

1140 3.1.186
1141 personal identification number (PIN)
1142 alphanumeric code or password used to authenticate an identity

1143 3.1.187
1144 physical layer protocol
1145 protocol for transmitting raw electrical signals over the communications channel.
1146 NOTE 1: Deals with transmission physics such as cabling, modulation, and transmission rates. Layer 1 of the OSI
1147 reference model.
1148 3.1.188
1149 physical requirement
1150 specification of the physical characteristics and physical constraints of a Solution or a Solution
1151 part

1152 3.1.189
1153 plaintext
1154 unencoded data that is input to and transformed by an encryption process, or that is output by a
1155 decryption process

1156 3.1.190
1157 point-to-point protocol (PPP)
1158 protocol defined in RFC 1661, the Internet standard for transmitting network layer datagrams
1159 (e.g. Internet Protocol (IP) packets) over serial point-to-point links, which is occasionally
1160 deployed in certain types of SCADA networks.

1161 3.1.191
1162 portable media
1163 portable devices that contain data storage capabilities that can be used to physically copy data
1164 from one piece of equipment and transfer it to another
1165 NOTE 1: Types of portable media include but are not limited to: CD/DVD/Blu-ray media, USB memory devices, smart
1166 phones, flash memory, solid state disks, hard drives, handhelds, and portable computers.
1167 3.1.192
1168 privilege
1169 authorization or set of authorizations to perform specific functions, especially in the context of a
1170 computer operating system
1171 NOTE 1: Examples of functions that are controlled using privilege include acknowledging alarms, changing setpoints,
1172 and modifying control algorithms.
1173 3.1.193
1174 process
1175 set of interrelated or interacting activities which transforms inputs into outputs
1176 NOTE 1: Inputs to a process are generally outputs of other processes.
1177 NOTE 2: Processes in an organization are generally planned and carried out under controlled conditions to add value.
1178 3.1.194
1179 process hazard analysis
1180 set of organized and systematic assessments of the potential hazards associated with an
1181 industrial process

1182 3.1.195
1183 product
1184 system, subsystem, or component that is manufactured/developed or refined for sale or reuse by
1185 other components or products
1186 NOTE 1: The processes required by the practices defined in this standard apply iteratively to all levels of product
1187 design (for example, from the system level to the component level).
1188 3.1.196
1189 product security context
1190 security provided to the product by the environment (end user deployment) in which the product is
1191 intended to be used.
1192 NOTE 1: The security provided to the product by its intended environment can effectively restrict the threats that are
1193 applicable to the product.
1194 3.1.197
1195 product supplier
1196 manufacturer of hardware and/or software product
1197 NOTE 1: Used in place of the generic word vendor to provide differentiation.
1198 NOTE 2: The product supplier includes the entity responsible for developing and maintaining a product which may
1199 include more than just the manufacturer (for example, integrator).
1200 3.1.198
1201 protection profile
1202 implementation-independent set of security requirements for a category of Targets of Evaluation
1203 that meet specific consumer needs.

1204 3.1.199
1205 protocol
1206 set of rules (i.e., formats and procedures) to implement and control some type of association
1207 (e.g., communication) between systems

1208 3.1.200
1209 pseudorandom number generator (PRNG)
1210 algorithm that produces a sequence of bits that are uniquely determined from an initial value
1211 called a seed.
1212 NOTE 1: The output of the PRNG “appears” to be random, i.e., the output is statistically indistinguishable from random
1213 values. A cryptographic PRNG has the additional property that the output is unpredictable, given that the seed is not
1214 known.
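EXAMPLE (informative): The following Python sketch implements a simple linear congruential generator whose output is uniquely determined by its seed. The constants are classic illustrative values; this generator is NOT cryptographically secure and is shown only to illustrate the definition.

    class SimpleLCG:
        """Minimal linear congruential generator: a deterministic bit stream from a seed.
        Illustrative only; not a cryptographic PRNG."""

        def __init__(self, seed: int):
            self.state = seed & 0xFFFFFFFF

        def next_bits(self, n: int = 32) -> int:
            # Output is fully determined by the seed and the (illustrative) constants.
            self.state = (1664525 * self.state + 1013904223) & 0xFFFFFFFF
            return self.state >> (32 - n)

    gen_a = SimpleLCG(seed=42)
    gen_b = SimpleLCG(seed=42)
    # The same seed always yields the same sequence, which is why seeds for
    # cryptographic PRNGs must be unpredictable and kept secret.
    assert [gen_a.next_bits() for _ in range(4)] == [gen_b.next_bits() for _ in range(4)]
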
1215 3.1.201
1216 public key
1217 cryptographic key used with a public key cryptographic algorithm that is uniquely associated with
1218 an entity and that may be made public

1219 3.1.202
1220 public key (asymmetric) cryptographic algorithm
1221 cryptographic algorithm that uses two related keys, a public key and a private key. The two keys
1222 have the property that deriving the private key from the public key is computationally infeasible.
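EXAMPLE (informative): The following Python sketch generates a key pair and signs and verifies a message using the third-party "cryptography" package; the choice of package, the RSA parameters and the message are illustrative assumptions.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Generate a key pair: the private key is kept secret, the public key may be published.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    message = b"close valve V-101"
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)
    signature = private_key.sign(message, pss, hashes.SHA256())

    # Any holder of the public key can verify the signature; deriving the private key
    # from the public key is computationally infeasible.
    public_key.verify(signature, message, pss, hashes.SHA256())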

1223 3.1.203
1224 public key certificate
1225 set of data that uniquely identifies an entity, contains the entity’s public key, and is digitally
1226 signed by a trusted party, thereby binding the public key to the entity.

1227 3.1.204
1228 public key infrastructure (PKI)
1229 framework that is established to issue, maintain, and revoke public key certificates.

1230 3.1.205
1231 record of action
1232 documentation of the results developed from the analysis of a specific action
1233 NOTE 1: Documentation requirements are a local matter that may include a weighting factor based on risk assessment.
1234 NOTE 2: A specific action may include, but is not limited to, one or more combinations of documentation review,
1235 analysis of procedures to determine effectiveness, and analysis of test results.
1236 3.1.206
1237 reference model
1238 framework for understanding significant relationships among the entities of some environment,
1239 and for the development of consistent standards or specifications supporting that environment.
1240 NOTE 1: A reference model is based on a small number of unifying concepts and may be used as a basis for education
1241 and explaining standards to a non-specialist.
1242 3.1.207
1243 reliability
1244 ability of a system to perform a required function under stated conditions for a specified period of
1245 time

1246 3.1.208
1247 remote access
1248 access to a control system by any user (human, software process or device) communicating from
1249 outside the perimeter of the zone being addressed
1250 NOTE 1: The exact definition of “remote” can vary according to situation. For example, access may come from a
1251 location that is remote to the specific zone, but still within the boundaries of a company or organization. This might
1252 represent a lower risk than access that originates from a location that is remote and outside of a company’s boundaries.
1253 NOTE 2: Examples of applications that support remote access include RDP, OPC, and Syslog.
1254 NOTE 3: In general, remote access applications and the Automation Solution will reside in different security zones as
1255 determined by the asset owner. See ISA-62443-3-2 for the application of zones and conduits to the Automation Solution
1256 by the asset owner.
1257 3.1.209
1258 remote session
1259 session initiated involving access to a control system by any user (human, software process or
1260 device) communicating from outside the perimeter of the zone being addressed

1261 3.1.210
1262 repudiation
1263 denial by one of the entities involved in a communication of having participated in all or part of
1264 the communication

1265 3.1.211
1266 requirement
1267 need or expectation that is stated, generally implied or obligatory
1268 NOTE 1: “Generally implied” means that it is custom or common practice for the organization and interested parties that
1269 the need or expectation under consideration is implied.
1270 NOTE 2: A specified requirement is one that is stated, for example in documented information.
1271 3.1.212
1272 requirement context
1273 specification of intended use
1274 NOTE 1: A common security intended use stated as a requirement context is that the service provider shall have the
1275 capability to implement a requirement objective. See ISA-62443-2-4 for examples.
1276 3.1.213
1277 requirement objective
1278 specification of the goal to be achieved
1279 NOTE 1: A common security goal stated as a requirement objective is a segmentation requirement.
1280 3.1.214
1281 resilience
1282 ability of an IACS organization, process, entity or system to resist being affected by disruptions

1283 3.1.215
1284 risk
1285 expectation of loss expressed as the probability that a particular threat will exploit a particular
1286 vulnerability with a particular consequence

1287 3.1.216
1288 risk assessment
1289 process that systematically identifies potential vulnerabilities to valuable system resources and
1290 threats to those resources, quantifies loss exposures and consequences based on probability of
1291 occurrence, and (optionally) recommends how to allocate resources to countermeasures to
1292 minimize total exposure
1293 NOTE 1: Types of resources include physical, logical and human.
1294 NOTE 2: Risk assessments are often combined with vulnerability assessments to identify vulnerabilities and quantify
1295 the associated risk. They are carried out initially and periodically to reflect changes in the organization's risk tolerance,
1296 vulnerabilities, procedures, personnel and technological changes.
1297 3.1.217
1298 risk management
1299 process of identifying and applying countermeasures commensurate with the value of the assets
1300 protected based on a risk assessment

1301 3.1.218
1302 role
1303 set of connected behaviors, privileges and obligations associated with all users (humans,
1304 software processes or devices) of an IACS
1305 NOTE 1: The privileges to perform certain operations are assigned to specific roles.
1306 NOTE 2: Role definitions must be distinguished into infrastructure role definitions (within a process), functional role
1307 definitions (part of an entity’s functions) and organizational role definitions (a person’s position). A functional role may be
1308 associated with privileges and confer responsibility and authority on a user assigned to that role.

1309 3.1.219
1310 root cause
1311 wide range of causes or sources of security-related issues, generally consisting of design,
1312 implementation or test weaknesses
1313 NOTE 1: These weaknesses often result from misapplication of best practices.
1314 3.1.220
1315 router
1316 gateway between two networks at OSI layer 3 that relays and directs data packets through
1317 that inter-network. The most common form of router passes Internet Protocol (IP) packets.

1318 3.1.221
1319 safeguard
1320 security control used to minimize the risk of compromise of an asset or resource

1321 3.1.222
1322 safety
1323 freedom from unacceptable risk

1324 3.1.223
1325 safety feature
1326 measure, control, or function specifically designed for safety purposes
1327 NOTE 1: The implementation of non-safety features can have a negative, neutral or even positive impact on safety. This is
1328 particularly the case for some cybersecurity features, as discussed in this document.

1334 3.1.225
1335 safety instrumented system
1336 system used to implement one or more safety-related functions
1337 NOTE 1: See IEC 61508 and IEC 61511 for more information on functional safety.
1338 3.1.226
1339 safety integrity level
1340 discrete level (one out of four) for specifying the safety integrity requirements of the safety-
1341 instrumented functions to be allocated to the safety-instrumented systems
1342 NOTE 1: Safety integrity level 4 has the highest level of safety integrity; safety integrity level 1 has the lowest.
1343 3.1.227
1344 secret
1345 condition of information being protected from being known by any system entities except those
1346 intended to know it

1347 3.1.228
1348 secret key
1349 cryptographic key, used with a secret key cryptographic algorithm, that is uniquely associated with
1350 one or more entities and should not be made public.

1351 3.1.229
1352 secret key (symmetric) cryptographic algorithm
1353 cryptographic algorithm that uses a single secret key for both encryption and decryption.
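EXAMPLE (informative): The following Python sketch encrypts and decrypts data with a single shared secret key using Fernet (an AES-based construction) from the third-party "cryptography" package; the package choice and the example data are illustrative assumptions.

    from cryptography.fernet import Fernet

    # The same secret key is used for both encryption and decryption and must be
    # shared by, and kept secret among, the entities involved.
    secret_key = Fernet.generate_key()
    cipher = Fernet(secret_key)

    ciphertext = cipher.encrypt(b"setpoint=42.0")
    assert cipher.decrypt(ciphertext) == b"setpoint=42.0"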

1354 3.1.230
1355 security
1356 condition of system resources being free from unauthorized access and from unauthorized or
1357 accidental change, destruction, or loss
1358 NOTE 1: Measures can be controls related to physical security (controlling physical access to computing assets) or
1359 logical security (capability to log in to a given system and application).
1360 3.1.231
1361 security architecture
1362 plan and set of principles that describe the security services that a system is required to provide
1363 to meet the needs of its users, the system elements required to implement the services, and the
1364 performance levels required in the elements to deal with the threat environment
1365 NOTE 1: In this context, security architecture would be an architecture to protect the control network from intentional or
1366 unintentional security events.
1367 3.1.232
1368 security audit
1369 independent review and examination of a system's records and activities to determine the
1370 adequacy of system controls, ensure compliance with established security policy and procedures,
1371 detect breaches in security services, and recommend any changes that are indicated for
1372 countermeasures

1373 3.1.233
1374 security compromise
1375 violation of the security of a system such that an unauthorized (1) disclosure or modification of
1376 information or (2) denial of service may have occurred
1377 NOTE 1: A security compromise represents a breach of the security of a system or an infraction of its security policies.
1378 It is independent of impact or potential impact to the system.
1379 3.1.234
1380 security control
1381 see “countermeasure”

1382 3.1.235
1383 security defect
1384 design or implementation deficiency that can be exploited to compromise an asset or resource

1385 3.1.236
1386 security domain
1387 system or subsystem of a control LAN or enterprise LAN that is under the authority of a single
1388 trusted authority. Security domains may be organized (e.g., hierarchically) to form larger domains.

1389 3.1.237
1390 security event
1391 occurrence in a system that is relevant to the security of the system

1392 3.1.238
1393 security incident
1394 security compromise that is of some significance to the asset owner or failed attempt to
1395 compromise the system whose result could have been of some s ignificance to the asset owner
1396 NOTE 1: The term “of some significance” is relative to the environment in which the security compromise is detected.
1397 For example, the same compromise may be declared as a security incident in one environment and not in another.
1398 Triage activities are often used by asset owners to evaluate security compromises and identify those that are significant
1399 enough to be considered incidents.
1400 NOTE 2: In some environments, failed attempts to compromise the system, such as failed login attempts, are
1401 considered significant enough to be classified as security incidents.

1402 3.1.239
1403 security level
1404 measure of confidence that the IACS or a component thereof is free from vulnerabilities and
1405 functions in the intended manner
1406 NOTE 1: Vulnerabilities can either be designed into the IACS, inserted at any time during its lifecycle or result from
1407 changing threats. Designed-in vulnerabilities may be discovered long after the initial deployment of the IACS, for
1408 example when an encryption technique has been broken or an improper policy for account management (such as not removing
1409 old user accounts) is in place. Inserted vulnerabilities may be the result of a patch or a change in policy that opens a new
1410 vulnerability.
1411 3.1.240
1412 security objective
1413 aspect of security whose achievement is the purpose and objective of using certain mitigation
1414 measures, such as confidentiality, integrity, availability, user authenticity, access authorization
1415 and accountability

1416 3.1.241
1417 security patch
1418 software patch that is relevant to the security of a software component
1419 NOTE 1: For the purpose of this definition, firmware is considered software.
1420 NOTE 2: Software patches may address known or potential vulnerabilities, or simply improve the security of the
1421 software component, including its reliable operation.
1422 3.1.242
1423 security perimeter
1424 logical or physical boundary of an IACS, surrounding all the resources that are controlled and
1425 protected by the system

1426 3.1.243
1427 security policy
1428 set of rules that specify or regulate how a system or organization provides security services to
1429 protect its assets

1430 3.1.244
1431 security procedures
1432 definitions of exactly how practices are implemented and executed

1433 3.1.245
1434 security program
1435 portfolio of security services, including integration services and maintenance services, and their
1436 associated policies, procedures, and products that are applicable to the IACS

1437 3.1.246
1438 security-related issue
1439 characteristic of the design or implementation of the product that can potentially affect the
1440 security of the product

1441 3.1.247
1442 security services
1443 mechanisms used to provide confidentiality, data integrity, authentication, or non-repudiation of
1444 information

1445 3.1.248
1446 security verification and validation testing
1447 testing performed to assess the overall security of a component, product or system when used in
1448 its intended product security context and to determine if a component, product or system satisfies
1449 the product security requirements
1450 NOTE 1: Security verification testing supplements security validation testing with additional testing focused on the
1451 product security context and defense-in-depth strategy.
1452 3.1.249
1453 security violation
1454 act or event that disobeys or otherwise breaches security policy through an intrusion or the
1455 actions of a well-meaning insider

1456 3.1.250
1457 security zone
1458 grouping of logical or physical assets that share common security requirements
1459 NOTE 1: All unqualified uses of the word “zone” in this standard should be assumed to refer to a security zone.
1460 3.1.251
1461 sensor
1462 measuring element connected to process equipment and to the control system

1463 3.1.252
1464 sequence flow
1465 connecting object that shows the order in which activities are performed in a process
1466 NOTE 1: Each flow has only one source and one target.
1467 3.1.253
1468 server
1469 device or application that provides information or services to client applications and devices

1470 3.1.254
1471 service provider
1472 organization (internal or external organization, manufacturer, etc.) that has agreed to undertake
1473 responsibility for providing a given support service and obtaining, when specified, supplies in
1474 accordance with an agreement
1475 NOTE 1: This term is used in place of the generic word “vendor” to provide differentiation.
1476 3.1.255
1477 session
1478 semi-permanent, stateful, interactive information interchange between two or more
1479 communicating devices
1480 NOTE 1: Typically, a session has a clearly defined start process and end process.
1481 3.1.256
1482 session ID
1483 identifier used to indicate a specific session entry

1484 3.1.257
1485 set point
1486 target value identified within a control system that controls one or more actions within the control
1487 system

1488 3.1.258
1489 sniffing
1490 see “interception”

1491 3.1.259
1492 software security update
1493 piece of software provided in a digital system Solution to fix one or several security vulnerabilities
1494 in a digital component, or to implement one or more cybersecurity features
1495 NOTE 1: Security patches are considered as software security updates.

1496 3.1.260
1497 Solution
1498 control system and any complementary hardware and software components that have been
1499 installed and configured to operate in an IACS
1500 NOTE 1: Commensurate with ISA-62443-2-4, Solution is used as a proper noun in this part of ISA -62443.
1501 NOTE 2: The difference between the control system and the Solution is that the control system is incorporated into the
1502 Solution design (e.g. a specific number of workstations, controllers, and devices in a specific configuration), which is
1503 then implemented. The resulting configuration is referred to as the Solution.
1504 NOTE 3: The Solution may comprise components from multiple suppliers, including the product supplier of the
1505 control system.
1506 3.1.261
1507 specification
1508 document that specifies in a complete, precise, and verifiable manner the requirements, design,
1509 behavior or other characteristics of a Solution or component and, often, the procedures for
1510 determining whether these provisions have been satisfied

1511 3.1.262
1512 specification of intended use
1513 common security intended use stated as a requirement context, such as that the service provider shall have
1514 the capability to implement a requirement objective
1515 NOTE 1: See IEC 62443-2-4 for examples.
1516 3.1.263
1517 spoof
1518 pretend to be an authorized user and perform an unauthorized action

1519 3.1.264
1520 stakeholder
1521 person or organization that can affect, be affected by, or perceive themselves to be affected by a
1522 decision or activity

1523 3.1.265
1524 start event
1525 event that indicates where a particular process starts
1526 NOTE 1: A start event starts the flow of the process; it does not have any incoming sequence flow but can have a
1527 trigger.
1528 3.1.266
1529 subcontractor
1530 service provider under contract to the integration or maintenance service provider or to another
1531 subcontractor that is directly or indirectly under contract to the integration or maintenance service
1532 provider

1533 3.1.267
1534 sub-process
1535 process that is included within another process
1536 NOTE 1: The sub-process can be a collapsed view that hides its details.
1537 NOTE 2: A sub-process can be an expanded view that shows its details with the view of a process that it is contained
1538 in.
1539 3.1.268
1540 supervisory control and data acquisition (SCADA) system
1541 type of loosely coupled distributed monitoring and control system commonly associated with
1542 electric power transmission and distribution systems, oil and gas pipelines, and water and
1543 sewage systems
1544 NOTE 1: Supervisory control systems are also used within batch, continuous, and discrete manufacturing plants to
1545 centralize monitoring and control activities for these sites.
1546 3.1.269
1547 symmetric key
1548 single cryptographic key that is used with a secret (symmetric) key algorithm; the key used to
1549 encrypt plaintext into ciphertext is identical to the key used to convert the ciphertext back
1550 into plaintext

1551 3.1.270
1552 symmetric key algorithm
1553 see “secret key (symmetric) cryptographic algorithm”

1554 3.1.271
1555 system
1556 interacting, interrelated, or interdependent elements forming a complex whole
1557 NOTE 1: A system may be packaged as a product.
1558 NOTE 2: In practice, the interpretation of its meaning is frequently clarified by the use of an adjective, such as control
1559 system. In the context of a control system, the elements are largely hardware and software elements.
1560 3.1.272
1561 system [Solution] under consideration
1562 collection of assets for the purpose of a security risk analysis
1563 NOTE 1: A SuC consists of one or more zones and related conduits. All assets within a SuC belong to either a zone or
1564 conduit.
1565 3.1.273
1566 system integrator
1567 person or company that specializes in bringing together component subsystems into a whole and
1568 ensuring that those subsystems perform in accordance with project specifications

1569 3.1.274
1570 system software
1571 special software designed for a specific computer system or family of computer systems to
1572 facilitate the operation and maintenance of the computer system and associated programs and
1573 data

1574 3.1.275
1575 system under consideration
1576 defined collection of IACS and related assets for the purpose of performing a security risk
1577 analysis
1578 NOTE 1: A SuC consists of one or more zones and related conduits. All assets within a SuC belong to either a zone or
1579 conduit.
1580 3.1.276
1581 task
1582 object within the logical unit representing the work associated with a command or group of linked
1583 commands

1584 3.1.277
1585 technical controls
1586 countermeasures that use technology-based contrivances in order to protect information systems
1587 from harm

1588 3.1.278
1589 threat
1590 circumstance or event with the potential to adversely affect operations (including mission,
1591 functions, image or reputation), assets, control systems or individuals via unauthorized access,
1592 destruction, disclosure, modification of data and/or denial of service

1593 3.1.279
1594 threat landscape
1595 summary of available threat information such as threat sources, threat vectors and trends that
1596 may affect a defined target (for example, company, facility or SuC)

1597 3.1.280
1598 threat source
1599 intent and method targeted at the intentional exploitation of a vulnerability, or a situation and
1600 method that may accidentally trigger a vulnerability

1601 3.1.281
1602 threat vector
1603 path or means by which a threat source can gain access to an organizational asset

1604 3.1.282
1605 throughput
1606 maximum continuous traffic rate that an IT or IACS device can handle without dropping a single
1607 packet

1608 3.1.283
1609 tolerable risk
1610 level of risk deemed acceptable to an organization in order that some particular benefit or
1611 functionality can be obtained

1612 3.1.284
1613 traffic analysis
1614 inference of information from observable characteristics of data flow(s), even when the data are
1615 encrypted or otherwise not directly available, including the identities and locations of source(s)
1616 and destination(s) and the presence, amount, frequency, and duration of occurrence

1617 3.1.285
1618 transaction
1619 sub-process that represents a set of coordinated activities carried out by independent, loosely-
1620 coupled systems in accordance with well-defined relationships
1621 NOTE 1: This coordination leads to an agreed, consistent, and verifiable outcome across all participants.
1622 3.1.286
1623 trigger
1624 mechanism that detects an occurrence and can cause additional processing in response
1625 NOTE 1: Triggers are associated with start events and intermediate events and can be of the type: message, timer,
1626 conditional, signal, link, and multiple.
1627 3.1.287
1628 Trojan horse
1629 computer program that appears to have a useful function, but also has a hidden and potentially
1630 malicious function that evades security mechanisms, sometimes by exploiting legitimate
1631 authorizations of a system entity that invokes the program

1632 3.1.288
1633 trust
1634 confidence that an operation, data transaction source, network or software process can be relied
1635 upon to behave as expected
1636 NOTE 1: An entity can be said to “trust” a second entity when it (the first entity) makes the assumption that the second
1637 entity will behave as the first entity expects.
1638 NOTE 2: Trust may apply only for some specific function.

1639 3.1.289
1640 trust boundary
1641 element of a threat model that depicts a boundary where authentication is required or a change in
1642 trust level occurs (higher to lower or vice versa); part of a defense-in-depth strategy that
1643 represents a line of demarcation where: a) users external to the trust boundary must have
1644 approval (be trusted) to use capabilities and/or resources within the trust boundary. b) data
1645 external to the trust boundary that crosses a trust boundary must be valid (be trusted) to be
1646 forwarded or consumed within the trust boundary
1647 NOTE 1: Trust boundary enforcement mechanisms for users typically include authentication (for example,
1648 challenge/response, passwords, biometrics, or digital signatures) and associated authorization (for example, access
1649 control rules).
1650 NOTE 2: Trust boundary enforcement mechanisms for data typically include source authentication (for example,
1651 message authentication codes, digital signatures) and/or content validation.
1652 NOTE 3: Trust boundaries are used to delimit ISA-62443-3-2 security zones.

1653 3.1.290
1654 trustworthiness
1655 attribute or trait of the system which causes it to be deserving of trust

1656 3.1.291
1657 unit
1658 lower-level element of a manufacturing process that performs manufacturing, field device control,
1659 or vehicle functions
1660 NOTE 1: See “Cell”.
1661 3.1.292
1662 unit testing
1663 verification that an individual unit of computer software or hardware performs as intended
1664 NOTE 1: Automated verification, or testing, is generally performed by computer test software.
1665 NOTE 2: What constitutes a unit of source code is a design decision. A unit is often designed as the smallest testable
1666 part of an application. It may include one or more computer program modules and may also include associated control
1667 data, usage procedures, and operating procedures. In procedural programming, a unit could be an entire module, but is
1668 more commonly an individual function or procedure. In object-oriented programming, a unit is often an entire interface,
1669 such as a class, but could be an individual method.
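EXAMPLE (informative): The following Python sketch shows automated verification of a single function using the built-in unittest framework; the function under test and its limits are illustrative assumptions.

    import unittest

    def clamp_setpoint(value: float, low: float = 0.0, high: float = 100.0) -> float:
        """Illustrative unit under test: keep a setpoint within configured limits."""
        return max(low, min(high, value))

    class ClampSetpointTest(unittest.TestCase):
        def test_value_within_limits_is_unchanged(self):
            self.assertEqual(clamp_setpoint(42.0), 42.0)

        def test_value_above_upper_limit_is_clamped(self):
            self.assertEqual(clamp_setpoint(150.0), 100.0)

        def test_value_below_lower_limit_is_clamped(self):
            self.assertEqual(clamp_setpoint(-5.0), 0.0)

    if __name__ == "__main__":
        unittest.main()
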
1670 3.1.293
1671 unlinkability
1672 assurance that a user may make multiple uses of resources or services without others being able
1673 to link these uses together

1674 3.1.294
1675 unmitigated cyber security risk
1676 level of cyber security risk that is present in a system before any countermeasures are
1677 considered
1678 NOTE 1: This level helps identify how much cyber security risk reduction is required to be provided by any
1679 countermeasure.
1680 3.1.295
1681 untraceability
1682 assurance that information cannot be used to track the time or location of a specific user

1683 3.1.296
1684 untrusted
1685 not meeting predefined requirements to be trusted
1686 NOTE 1: An entity may simply be declared as untrusted.

1687 3.1.297
1688 usability requirement
1689 specification of the fitness for use of a Solution for its users and other actors

1690 3.1.298
1691 use case
1692 technique for capturing potential functional requirements that employs the use of one or more
1693 scenarios that convey how the system should interact with the end user or another system to
1694 achieve a specific goal
1695 NOTE 1: Typically use cases treat the system as a black box, and the interactions with the system, including system
1696 responses, are as perceived from outside of the system . Use cases are popular because they simplify the description of
1697 requirements, and avoid the problem of making assumptions about how this functionality will be accomplished.
1698 3.1.299
1699 value type
1700 type of quantity
1701 NOTE 1: An attribute within an information object has a primitive, structured, or enumerated value type.
1702 3.1.300
1703 verify
1704 check that the specified requirement was met

1705 3.1.301
1706 virtual private network
1707 private network utilizing shared networks, such as a network based on a cryptographic tunneling
1708 protocol operating over another network infrastructure

1709 3.1.302
1710 virus
1711 self-replicating or self-reproducing program that spreads by inserting copies of itself into other
1712 executable code or documents

1713 3.1.303
1714 vulnerability
1715 flaw or weakness in a system's design, implementation, or operation and management that could
1716 be exploited to violate the system's integrity or security policy
1717 NOTE 1: Security policies typically include policies to protect confidentiality, integrity, and availability of system assets.
1718 3.1.304
1719 wide area network
1720 communications network designed to connect computers, networks and other devices over a
1721 large distance, such as across the country or world
1726 3.1.305
1727 wiretapping
1728 attack that intercepts and accesses data and other information contained in a flow in a
1729 communication system
1730 NOTE 1: Although the term originally referred to making a mechanical connection to an electrical conductor that links
1731 two nodes, it is now used to refer to reading information from any sort of medium used for a link or even directly from a
1732 node, such as a gateway or subnetwork switch.
1733 NOTE 2: “Active wiretapping” attempts to alter the data or otherwise affect the flow; “passive wiretapping” only
1734 attempts to observe the flow and gain knowledge of information it contains.

1735 3.1.306
1736 worm
1737 computer program that can run independently, can propagate a complete working version of itself
1738 onto other hosts on a network, and may consume computer resources destructively

1739 3.1.307
1740 zone
1741 collection of entities that represents a partitioning of a System under Consideration on the basis
1742 of their functional, logical and physical (including location) relationship
1743 NOTE 1: A zone has a clear border. The security policy of a zone is typically enforced by a combination of mechanisms
1744 both at the zone edge and within the zone.
1745 3.2 Abbreviated terms and acronyms
1746 The following abbreviated terms and acronyms are used throughout the ISA‑62443 series.

Abbreviation Definition
3DES Triple Data Encryption Standard
ACL Access Control List
AES Advanced Encryption Standard
AGA American Gas Association
ANSI American National Standards Institute
API Application Programming Interface
ASLR Address Space Layout Randomization
ASM Automated Software Management
AVA_VAN Common Criteria Class AVA Vulnerability Assessment
BCP Business Continuity Planning
BIA Business Impact Assessment
BIOS Basic Input Output System
BPCS Basic Process Control System
BR Base Requirement
CA Certification Authority
CCTS Core Components Technical Specification
CD Compact Disc
CERT Computer Emergency Readiness Team
CHAP Challenge Handshake Authentication Protocol
CIA Confidentiality, Integrity, and Availability
CIP Critical Infrastructure Protection
CIP ® Common Industrial Protocol (formerly Control and Information Protocol)
CMAC Cipher-based Message Authentication Code
CMMI Capability Maturity Model Integration
CMMI-DEV Capability Maturity Model Integration for Development
CMVP Cryptographic Module Validation Program
COTS Commercial Off The Shelf
CPNI [UK] Centre for Protection of National Infrastructure
CPU Central Processing Unit
CR Component Requirement
CRL Certificate Revocation List
CRS Cybersecurity Requirements Specification
CS Control System
CSMS Cybersecurity Management System
CVSS Common Vulnerability Scoring System
D2T Decommissioning and Disposal Test
DAC Discretionary Access Control
DC Domain Controller
DC Data Confidentiality
DCS Distributed Control System
DEP Data Execution Prevention
DHCP Dynamic Host Configuration Protocol
DHS [US] Department of Homeland Security
DM Defect Management
DMZ Demilitarized Zone
DNS Domain Name Service
DoS Denial-of-Service
DPA Differential Power Analysis
DRP Disaster Recovery Planning
DVD Digital Versatile Disc
EAL Evaluation Assurance Level
EC Elliptic Curve
ECC Elliptic Curve Cryptosystem
EDR Embedded Device Requirement
EICAR European Institute for Computer Antivirus Research
EMI Electromagnetic Interference
EULA End User License Agreement
EWS Engineering Workstation
FAN Field Area Network
FAQ Frequently Asked Questions
FAT Factory Acceptance Testing
FDA [US] Food and Drug Administration
FDIS Final Draft International Standard
FIPS Federal Information Processing Standards
FR Foundational Requirement
FS-PLC Functional Safety PLC
FTP File Transfer Protocol
GCM Galois/Counter Mode
GLONASS Global Navigation Satellite System
GMAC Galois Message Authentication Code
GPS Global Positioning System
HCM Host Configuration Management
HDR Host Device Requirement
HIDS Host Intrusion Detection System
HMI Human Machine Interface
HSE Health, Safety or Environment
HTTP Hyper Text Transfer Protocol
HTTPS Hyper Text Transfer Protocol Secure
IAC Identification and Authentication Control
IACS Industrial Automation and Control System
IACS-SMS IACS Security Management System
IAMS Instrument Asset Management System
IAONA Industrial Automation Open Networking Association
IATF Information Assurance Technical Framework
ICS-CERT [US DHS] Industrial Control Systems Cyber Emergency Response Team
ID Identifier
IDS Intrusion Detection System
IEC International Electrotechnical Commission
IED Intelligent Electronic Devices
IEEE Institute of Electrical and Electronics Engineers
IETF Internet Engineering Task Force
IM Instant Messaging
IP Internet Protocol
IPS Intrusion Prevention System
IPsec Internet Protocol Security
IS International Standard
ISA International Society of Automation
ISAC Information Sharing and Analysis Centers
ISMS Information Security Management System
ISO International Organization for Standardization
IT Information Technology
JTAG Joint Test Action Group
KPI Key Performance Indicator
LAN Local Area Network
LDAP Lightweight Directory Access Protocol
LSS Location Signature Sensor
MA Maturity Assessment
MAC Media Access Control
MBSE Model-Based System Engineering
MES Manufacturing Execution System
MIT Massachusetts Institute of Technology
MR Metric requirement ID
MSMUG Microsoft Manufacturing Users Group
MT Maintenance test
NAT Network Address Translation
NDR Network Device Requirement
NERC North American Electric Reliability Corporation
NFA Network Forensics and Analysis
NIDS Network Intrusion Detection System
NISCC [UK] National Infrastructure Security Co-ordination Centre
NIST [US] National Institute of Standards and Technology
NSA [US] National Security Agency
NX No Execute
OAGIS Open Applications Group Integration Specification
OCSP Online Certificate Status Protocol
OLE ® Object Linking and Embedding
OMG Object Management Group
OPC ® OLE for Process Control
OS Operating System
OWASP Open Web Application Security Project
PAS Publicly Available Specification
PC Personal Computer
PCN Process Control Network
PDA Personal Digital Assistant
PDF Portable Document Format
PGP ® Pretty Good Privacy ®
PHA Process Hazard Analysis
PII Personally Identifiable Information
PIN Personal Identification Number
PKI Public Key Infrastructure
PL Protection Level
PLC Programmable Logic Controller
PPP Point-to-Point Protocol
PRNG Pseudorandom Number Generator
PuC Part under Consideration
PUF Physically Unclonable Function
QAT Quality Assurance Test
RA Risk Assessment
RA Resource Availability
RACI Responsible, Accountable, Consulted, Informed
RADIUS Remote Authentication Dial-In User Service
RAID Redundant Array of Independent Disks
RAM Random Access Memory
RASCI Responsible, Accountable, Supportive, Consulted, Informed
RBAC Role-Based Access Control
RCA Requirements Coverage Analysis
RDF Restricted Data Flow
RDP Remote Desktop Protocol
RE Requirement Enhancement
RFC Request For Comment
RJ Registered Jack
RoA Record of Action
RSA ® Rivest, Shamir and Adleman
RTOS Real-time Operating System
RTU Remote Terminal Unit
SAM Security Accounts Manager
SAR Software Application Requirements
SAT Site Acceptance Test
SCA Static Code Analysis
SCADA Supervisory Control and Data Acquisition
SCC Security Control Class
SD Secure Design
SDL Secure Development Life-Cycle
SDLA Secure Development Life-Cycle Assessment
SFTP Secure FTP
SHA Secure Hash Algorithm
SI System Integrity
SIEM Security Information and Event Management
SIF Safety Instrumented Function
SIL Safety Integrity Level
SIS Safety Instrumented System
SL Security Level
SL-A Achieved Security Level
SL-C Capability Security Level
SL-T Target Security Level
SNMP Simple Network Management Protocol
SP Security Program
SP [US NIST] Special Publication
SQL Structured Query Language
SR System Requirement
SSH Secure Shell
SSID Service Set Identifier
SSL Secure Sockets Layer
STRIDE Spoofing, Tampering, Repudiation, Information disclosure, Denial of service,
Elevation of privilege
SUC System Under Consideration
SVV Security Verification and Validation
Sysdiff System Difference Packages
SysML System Modeling Language
TC Technical Committee
TCP Transmission Control Protocol
TCP/IP Transmission Control Protocol/Internet Protocol
TLS Transport Layer Security
TPM Trusted Platform Module
TR Technical Report or Traceability Requirement ID
TRE Timely Response to Events
TS Technical Specification
UC Use Control
UML Unified Modeling Language
UN United Nations
UN/CEFACT United Nations Centre for Trade Facilitation and Electronic Business
URI Uniform Resource Identifier
USB Universal Serial Bus
US-CERT United States Computer Emergency Readiness Team
VDS Virus Detection System
VLAN Virtual Local Area Network
VoIP Voice over Internet Protocol
VPC Vendor Patch Compatibility
VPN Virtual Private Network
WAN Wide Area Network
WEP Wired Equivalent Privacy
Wi-Fi Wireless Fidelity
WLAN Wireless Local Area Network
XML eXtensible Markup Language
XSD XML Schema Definition
ZCR Zone and Conduit Requirement
BIBLIOGRAPHY
NOTE Some of these references are normative references (see Clause 0), published documents, in development, or anticipated. They are all listed here for completeness of the anticipated parts of the ISA‑62443 series.

[1] ISO/IEC 27001 – Information technology – Security techniques – Information security management systems – Requirements

[2] ISO/IEC 27002 – Information technology – Security techniques – Code of practice for information security management; ANSI/ISA‑62443-1-1-2007, Security for industrial automation and control systems: Terminology, concepts and models

[3] ISO/IEC Directives, Part 2, Rules for the structure and drafting of International Standards

[4] ANSI/ISA‑62443-1-3, Security for industrial automation and control systems: System security compliance metrics

[5] ANSI/ISA‑62443-2-1-2009, Security for industrial automation and control systems: Establishing an industrial automation and control system security program

[6] ANSI/ISA‑TR62443-2-2, Security for industrial automation and control systems: Operating an industrial automation and control system security program

[7] ANSI/ISA‑TR62443-2-3, Security for industrial automation and control systems: Patch management in the IACS environment

[8] ANSI/ISA‑TR62443-3-1-2007, Security for industrial automation and control systems: Security technologies for industrial automation and control systems

[9] ANSI/ISA‑62443-3-2, Security for industrial automation and control systems: Target security assurance levels for zones and conduits

[10] ANSI/ISA‑62443-3-3, Security for industrial automation and control systems: System security requirements and security assurance levels

[11] ANSI/ISA‑62443-3-4, Security for industrial automation and control systems: Product development requirements

[12] ANSI/ISA‑62443-4-1, Security for industrial automation and control systems: Embedded devices

[13] ANSI/ISA‑62443-4-2, Security for industrial automation and control systems: Host devices

Developing and promulgating technically sound consensus standards and recommended practices is one of ISA's primary goals. To achieve this goal, the Standards and Practices Department relies on the technical expertise and efforts of volunteer committee members, chairmen, and reviewers. ISA is an American National Standards Institute (ANSI) accredited organization. ISA administers United States Technical Advisory Groups (USTAGs) and provides secretariat support for International Electrotechnical Commission (IEC) and International Organization for Standardization (ISO) committees that develop process measurement and control standards. To obtain additional information on the Society's standards program, please write:

ISA
Attn: Standards Department
67 Alexander Drive
P.O. Box 12277
Research Triangle Park, NC 27709

ISBN: To be defined

