
Contents

Acknowledgments vii

Preface ix

Chapter 1 Background 1

Chapter 2 Definitions 5

Chapter 3 Industrial Control System Descriptions 7

Chapter 4 Convergence of Industrial Control Systems and Information Technology 25

Chapter 5 Differences between Industrial Control Systems and Information Technology 29

Chapter 6 Electronic Threats to Industrial Control Systems 43

Chapter 7 Myths 53

Chapter 8 Current Personnel Status and Needs 59

Chapter 9 Information Sharing and Disclosure 63

Chapter 10 Industrial Control System Cyber Risk Assessments 87

Chapter 11 Selected Industry Activities 89

Chapter 12 Industrial Control System Security Trends and Observations 95

Chapter 13 Industrial Control System Cyber Security Demonstrations 101

imo-weiss-00fm.indd v 5/3/10 3:44 PM



Chapter 14 Selected Case Histories: Malicious Attacks 107

Chapter 15 Selected Case Histories: Unintentional Incidents 123

Chapter 16 Industrial Control System Incident Categorization 147

Chapter 17 Recommendations 159

Appendix 1 Acronyms 167

Appendix 2 Glossary 171

Appendix 3 Comparison of Key Definitions 197

Appendix 4 CSIS White Paper on Industrial Control Systems 205

Appendix 5 Typical Distributed Control System Procurement Specification 223

Notes 301

Further Reading 313

Index 317



Acknowledgments

I have worked with many people and organizations over the years in different areas of instru-
mentation and control. All have contributed to my knowledge and understanding of the subject.
Organizations whose work has contributed to my experiences include both formal and informal
groups: the Electric Power Research Institute Instrumentation and Control Advisory Commit-
tee, the International Society of Automation (ISA) S67 (Nuclear Plant Standards), ISA77 (Fossil
Plant Standards), ISA99, particularly the ISA99.05 Leadership Team, ISA100, and the various
Institute of Electrical and Electronics Engineers, International Electrotechnical Commission,
North American Electrical Reliability Corporation, and CIGRÉ (International Council on Large
Electric System) committees in which I have participated over the years.
There are several individuals I want to specifically acknowledge because of their extraordinary
contributions to the subject of electronic security of industrial control systems: Eric Cosman has
been an invaluable driver for S99; Jacob Brodsky has provided valuable insights into water and
wireless communications; Walt Boyes has allowed me to use his blog, http://www.controlglobal
.com/unfettered, to help with awareness; Marshall Abrams has been my guru on National Institute
of Standards and Technology and information technology–related issues; and Gary Seifert has been
a pioneer in supervisory control and data acquisition security efforts. I also want to acknowledge
others off whom I bounce ideas and who have acted as sanity checkers. They include Michael
Assante, Keith Christianson, Philip Craig, Jeff Dagle, Clifford Glantz, Mark Hadley, Louis Hat-
ton, Todd Hustrulid, Stuart Katzke, Howard Lipson, Perry Pederson, Michael Peters, David Rahn,
Ronald Ross, Marcus Sachs, Phyllis Schneck, Robert Sill, Jon Stanford, Robert Webb, and Mar-
jorie Widmeyer.
I also want to thank my wife, Linda; my daughter, Michelle; and my son, Jeffrey, for their sup-
port as well as for putting up with me on this labor of love and aggravation.

Joe Weiss




Preface

This book is meant to help both the novice and expert in information technology (IT) secu-
rity and industrial control systems (ICSs) gain a better understanding of protecting ICSs from
electronic threats. It illustrates that electronic threats to ICSs are real and have already caused
extensive plant and environmental damage, power outages, and even deaths. By popular demand,
it provides recommendations for securing these systems that will enable facilities to maintain their
reliability and safety. The book was also written to fill a hole that exists in academia—security is
taught in computer science departments, whereas control systems are taught in various engineering
departments. Traditional security approaches can impact, and have impacted, the performance of
control and even safety systems. This book can be used as an introduction to cyber security of
industrial control systems prior to teaching control system theory in engineering classes or secu-
rity classes in computer science.
As for an explanation of the title, the term “protecting” was chosen as this is not a book on
how to attack ICSs. From a cyber perspective, they are very brittle, and attacking them is not
rocket science. On the other hand, protecting them while at the same time maintaining their mis-
sion can be rocket science. The term “ICS” was chosen as ICSs include supervisory control and
data acquisition, distributed control systems, programmable logic controllers, remote terminal
units, intelligent electronic devices, field controllers, sensors, drives, emission controls, building
controls (including fire suppression, thermostats, and elevator controls), and meters (including
business and residential automated metering). For the purpose of this book, ICSs also include
safety systems. The term “electronic threats” was chosen rather than cyber security because there
are electronic threats to ICSs beyond traditional cyber threats.
Additionally, the book is about protecting the mission of the ICS—a compromise of a
computer that isn’t critical to the mission of the control system may be a cyber security event,
but it is not of importance. One may say that “it takes a village” to secure ICSs, as Operations
alone cannot do this. It takes a team with ICS expertise, IT security expertise, telecom and
networking knowledge, ICS and IT vendor support, and, most of all, senior management support to make
this work.
I hope you find the book of interest.




CHAPTER 1

Background

Industrial control systems (ICSs) operate the industrial infrastructures worldwide, including elec-
tric power, water, oil and gas, pipelines, chemicals, mining, pharmaceuticals, transportation, and
manufacturing. ICSs measure, control, and provide a view of the process (once only the domain
of the operator). The commonality of ICSs and their architecture enabled the International Soci-
ety of Automation (ISA) to establish one general-process industry committee for cyber security:
ISA99.1
These systems continue to be upgraded with advanced communication capabilities and net-
worked to improve process efficiency, productivity, regulatory compliance, and safety. This
networking can be within a facility or even between facilities that are continents apart. When
an ICS does not operate properly, it can result in impacts ranging from minor to catastrophic.
Consequently, there is a critical need to ensure that electronic impacts do not cause, or enable,
improper operation of ICSs.
Security is like a three-legged stool consisting of physical security, information technology (IT)
security, and ICS security (see Figure 1.1). The first leg is physical security. It is generally well
understood and often addressed by experts coming from the military or law enforcement. The

Figure 1.1. Three Legs of Security




second is IT security. It generally deals with traditional commercial, off-the-shelf (COTS) hard-
ware and software and connections to the Internet with experts coming from IT and the military.
There is little doubt that IT security is necessary and that systems are continuously being probed,
tested, and hacked. The third leg, ICS security, is much less understood, has few experts in its
field, and is often not considered critical. Those working in this area are generally either from the
IT security community with little knowledge of ICSs or ICS experts knowledgeable in the opera-
tion of systems but not security.
ICS cyber security was formally identified in the mid- to late 1990s with the publication of
Presidential Decision Directive (PDD)-63.2 It was at this time that the U.S. Department of
Energy’s (DOE) National Laboratories started performing cyber security assessments of utilities
on a confidential (but not classified) basis. As these assessments were not made public, there was
little knowledge of the results unless the utilities were willing to share their results.
During this time, I was a project manager at the Electric Power Research Institute (EPRI),
leading research and development efforts in improving the operation of fossil-fueled power plant
instrumentation and control and communication systems. At the time, ICS cyber security aware-
ness in the electric industry was very low and its perceived importance even lower. Generally, it
was viewed as a corporate IT issue with little direct impact on power plant or grid operation.
Moreover, it was viewed as a hindrance to ICS technology advancements. From a security per-
spective, ICSs were generally isolated networks, and the concept of “security by obscurity” was
alive and well. In fact, I was writing papers on the “evils of islands of automation” and the need
to integrate the various systems.3 As security was not a consideration, there was little reason to
question the need for tighter system integration.
There was another issue with both direct and indirect security ramifications. That was the
need for a center at a site to demonstrate and evaluate technology. The electric utility and other
industries take a posture of “we can’t wait to be second to install new technology after someone
else has gone through the pains of the initial installation." In the late 1990s, the electric industry
was trying to justify upgrading the old analog control systems to modern digital control systems.
They needed a facility to evaluate new technologies in an actual plant setting and document the
benefits. This was the premise for the EPRI Instrumentation and Control (I&C) Center at the
Tennessee Valley Authority’s Kingston Steam Plant. At the time we established the center (the
mid- to late 1990s), security was not a consideration. During this time, I was working with a
number of DOE National Laboratories (Oak Ridge National Laboratory, Sandia National Labo-
ratory, Livermore National Laboratory, Argonne National Laboratory, and Brookhaven National
Laboratory—the Idaho National Laboratory [INL] was not on that list) on different I&C proj-
ects. What was obvious at that time and would be very important later was that these laboratories
had control systems, but they weren’t commercially available systems. Rather, they were one-offs
for specific applications. Consequently, the utilities did not feel the laboratories understood their
needs in this area.
In 1997, the Y2K (the year 2000 problem or “millennium bug”) issue finally made it to the
ICS community. Because of my control system knowledge and contacts within the industry, I
became the EPRI Y2K Embedded Systems Technical Manager. Y2K was an unintentional cyber




issue focusing on the ability of digital systems’ clocks and basic input/output systems (BIOS) to
account for the century change. In the late 1990s, there was still a plethora of analog control
systems, with digital control systems just beginning to make a dent in the marketplace. Throughout
the Y2K efforts, it was not viewed as a “cyber security” issue, but rather as an unintentional BIOS
and clock rollover problem. With the focus on Y2K, there was little room left for addressing more
traditional cyber threats.
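The rollover problem is easy to illustrate. Below is a minimal sketch of the date-windowing ("pivot") workaround that many Y2K remediations applied to two-digit year fields; the pivot value of 70 is an illustrative assumption, as real systems varied.

```python
def expand_two_digit_year(yy: int, pivot: int = 70) -> int:
    """Date-windowing heuristic: two-digit years at or above the pivot
    are read as 19xx, the rest as 20xx. A clock or BIOS that stores only
    two digits cannot tell 1900 from 2000 without some such rule."""
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return (1900 if yy >= pivot else 2000) + yy

print(expand_two_digit_year(99))  # 1999
print(expand_two_digit_year(0))   # 2000, not 1900
```

A system that simply prefixed "19" to the stored digits would have computed the year after 1999 as 1900, which is exactly the clock and BIOS rollover failure the Y2K effort worked to prevent.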
There were two items in Y2K that appeared inconsequential at the time that have subsequently
had a significant negative impact on cyber security of ICSs. First, there was significant money
spent on Y2K with few “apparent” resulting impacts. Instead of viewing the Y2K program as
a success by preventing mass impacts, most senior managers viewed the lack of impact as an
indication it was nothing but FUD—fear, uncertainty, and doubt—created by the vendors and
consultants to sell their wares. Many in senior management continue to harbor the perception
that ICS security is like Y2K—that is, FUD. This is hurting the industry very badly.
The second issue with Y2K was more positive. Y2K created a once-in-a-lifetime environment
of information sharing within each company and between companies. Unfortunately, at the time
we did not realize it was a once-in-a-lifetime event. When we started the ICS Cyber Security Pro-
gram at EPRI in early 2000 (actually called the Enterprise Infrastructure Security Program), we
expected the same level of information sharing as had occurred during Y2K. Were we ever wrong!
In retrospect, there were different drivers between Y2K and ICS security. The biggest was liability.
For Y2K, officers and directors were personally liable. It was no wonder they took it so seriously.
The same liability issue has not been applied to ICS security.
Another issue was timing. When the ICS Cyber Security Program first started at EPRI in early
2000, cyber security was not viewed as a national security imperative by private industry. ICS
security was viewed as a business issue, as ICSs were critical to the “bottom line” of an industrial
company. In fact, on September 10, 2001, I held two panel sessions on ICS security at ISA Expo
in Houston, Texas. Attendees included electric utilities, water utilities, oil and gas utilities, and
pipelines. Attendees also included auto parts manufacturers and even a dog food manufacturer.
The next day the world changed forever. From that infamous date onward, the perception of ICS
security changed from a business issue to a national security imperative. This had the unfortunate
implication of shifting the onus from the end user to the government.
One other item that at the time seemed perfectly reasonable has in hindsight been very dis-
ruptive to securing the efficient operation of ICSs. That item was the name “cyber security.” We
did not realize the difference it would have made if we had used the phrases "critical
infrastructure protection,” “functional security,” or “control system electronic communication
reliability.” By calling the issue cyber security, the focus was transferred from maintaining control
system and process reliability regardless of computer status under the aegis of the Operations
organization to focusing on computers regardless of control system and system reliability under
the aegis of the IT organization. Consequently, the title of this book was changed from Cyber
Security of Industrial Control Systems to Protecting Industrial Control Systems from Electronic Threats.
I will now return to test beds, as they apply to security. In June 2002, I was invited by the
INL Security Test Bed Program to meet with them in Idaho Falls, Idaho, about the possibility of




their having an industry test bed. My first response to them was that I had not dealt with INL
in the past from an I&C perspective, so I had little knowledge of their capabilities. Based on my
experience, it was clear that for the test bed to be successful, it would require having commercial
ICSs similar to what end users actually had before they would consider a national laboratory test
bed to be relevant. As a result of that presentation, INL contracted KEMA, where I was then
employed, to help with outreach on their supervisory control and data acquisition (SCADA)
test bed. Among other outreach and industry status projects, I contacted ICS vendors about
participating in the SCADA Test Bed Program. The test bed concept continues to be refined. As
demonstrated by the Hatch Nuclear Plant incident (discussed in Chapter 15), cyber security is
not limited to the SCADA or distributed control system (DCS) human-machine interface but
extends throughout all of the system interconnections.
Many issues are coming together that will make electronic security of ICSs paramount—the
Smart Grid, stimulus funding, cyber security funding, terrorism, the “sick economy,” the “green
economy,” reducing carbon footprints, and so on. All of these can be impacted by electronic
security of ICSs.
The Smart Grid is being viewed as the Internet for electrons. The infusion of stimulus funding
and the view that the electric grid is just another IT network have resulted in an influx of “experts”
from the IT community with little understanding of how the electric grid works or the systems
that comprise it. The National Institute of Standards and Technology (NIST) issued the
Smart Grid cyber security report, also known as the NISTIR.4 At the same time, there is a plethora
of other ongoing activities, such as the document produced by the Institute of Electrical and
Electronics Engineers (IEEE) called P2030, the "Draft Guide for Smart Grid Interoperability
of Energy Technology and Information Technology Operation with the Electric Power System
(EPS), and End-Use Applications and Loads," as well as Smart Grid conferences almost every
week. As an example of the technical confusion, IEEE P1547.4/D10.0 "Draft Guide
for Design, Operation, and Integration of Distributed Resource Island Systems with Electric
Power Systems,” dated February 2010, makes no mention of cyber security. Meanwhile, there are
cyber security conferences for the U.S. Department of Defense monthly, or so it seems. It is very
difficult to tie all of these diverse activities together.
There are numerous “best practices,” vendor white papers, consultant white papers, books, and
Webinars on ICS cyber security. As you will see later in this book, many of these documents can
be more harmful than helpful. I believe NIST SP 800-82 is an excellent general reference on ICS
cyber security, and the EPRI Control System Cyber Security Primer provides a basic understanding
of the needs in this area, even though it is dated.
One last, but very critical, point is that the fundamental reason for securing ICSs is to main-
tain the mission of the systems, whether they produce or deliver electricity, make or distribute
gasoline, provide clean water, and so on. I do not believe it is possible to fully electronically secure
ICSs. However, we can make them more secure and also minimize the possibilities of uninten-
tional incidents that have already cost hundreds of millions of dollars as well as a number of lives.
The confusion created by these paradoxical and contradictory issues that you will read about in
this book is the reason for finally putting “fingers to keyboard.”

