
High Performance Computing and

Networking for Science

September 1989
Recommended Citation:
U.S. Congress, Office of Technology Assessment, High Performance Computing and
Networking for Science—Background Paper, OTA-BP-CIT-59 (Washington, DC: U.S.
Government Printing Office, September 1989).

Library of Congress Catalog Card Number 89-600758


For sale by the Superintendent of Documents
U.S. Government Printing Office, Washington, DC 20402-9325
(Order form can be found in the back of this report.)
Foreword
Information technology is fundamental to today’s research and development:
high performance computers for solving complex problems; high-speed data
communication networks for exchanging scientific and engineering information; very
large electronic archives for storing scientific and technical data; and new display
technologies for visualizing the results of analyses.
This background paper explores key issues concerning the Federal role in
supporting national high performance computing facilities and in developing a
national research and education network. It is the first publication from our
assessment, Information Technology and Research, which was requested by the House
Committee on Science and Technology and the Senate Committee on Commerce,
Science, and Transportation.
OTA gratefully acknowledges the contributions of the many experts, within and
outside the government, who served as panelists, workshop participants, contractors,
reviewers, detailees, and advisers for this document. As with all OTA reports,
however, the content is solely the responsibility of OTA and does not necessarily
constitute the consensus or endorsement of the advisory panel, workshop participants,
or the Technology Assessment Board.

Director

High Performance Computing and Networking for Science Advisory Panel
John P. (Pat) Crecine, Chairman
President, Georgia Institute of Technology

Charles Bender
Director
Ohio Supercomputer Center

Charles DeLisi
Chairman
Department of Biomathematical Science
Mount Sinai School of Medicine

Deborah L. Estrin
Assistant Professor
Computer Science Department
University of Southern California

Robert Ewald
Vice President, Software
Cray Research, Inc.

Kenneth Flamm
Senior Fellow
The Brookings Institution

Malcolm Getz
Associate Provost
Information Services & Technology
Vanderbilt University

Ira Goldstein
Vice President, Research
Open Software Foundation

Robert E. Kraut
Manager
Interpersonal Communications Group
Bell Communications Research

Lawrence Landweber
Chairman
Computer Science Department
University of Wisconsin-Madison

Carl Ledbetter
President/CEO
ETA Systems

Donald Marsh
Vice President, Technology
Contel Corp.

Michael J. McGill
Vice President
Technical Assessment & Development
OCLC Online Computer Library Center, Inc.

Kenneth W. Neves
Manager
Research & Development Program
Boeing Computer Services

Bernard O'Lear
Manager of Systems
National Center for Atmospheric Research

William Poduska
Chairman of the Board
Stellar Computer, Inc.

Elaine Rich
Director
Artificial Intelligence Lab
Microelectronics and Computer Technology Corp.

Sharon J. Rogers
University Librarian
Gelman Library
The George Washington University

William Schrader
President
NYSERNET

Kenneth Toy
Post-Graduate Research Geophysicist
Scripps Institution of Oceanography

Keith Uncapher
Vice President
Corporation for National Research Initiatives

Al Weis
Vice President
Engineering & Scientific Computer Data Systems Division
IBM Corp.

NOTE: OTA is grateful for the valuable assistance and thoughtful critiques provided by the advisory panel. The views expressed in
this OTA background paper, however, are the sole responsibility of the Office of Technology Assessment.


OTA Project Staff-High Performance Computing

John Andelin, Assistant Director, OTA


Science, Information, and Natural Resources Division

James W. Curlin, Program Manager


Communication and Information Technologies Program

Fred W. Weingarten, Project Director


Charles N. Brownstein, Senior Analyst1
Lisa Heinz, Analyst
Elizabeth I. Miller, Research Assistant

Administrative Staff
Elizabeth Emanuel, Administrative Assistant
Karolyn Swauger, Secretary
Jo Anne Price, Secretary

Other Contributors
Bill Bartelone
Legislative/Federal Program Manager
Cray Research, Inc.

Mervin Jones
Program Analyst
Defense Automation Resources Information Center

Timothy Lynagh
Supervisory Data and Program Analyst
General Services Administration
List of Reviewers
Janice Abraham
Executive Director
Cornell Theory Center
Cornell University

Lee R. Alley
Assistant Vice President for Information Resources Management
Arizona State University

James Almond
Director
Center for High Performance Computing
Balcones Research Center

Julius Archibald
Department Chairman
Department of Computer Science
State University of New York College of Plattsburgh

J. Gary Augustson
Executive Director
Computer and Information Systems
Pennsylvania State University

Philip Austin
President
Colorado State University

Steven C. Beering
President
Purdue University

Jerry Berkman
Fortran Specialist
Central Computing Services
University of California at Berkeley

Kathleen Bernard
Director for Science Policy and Technology Programs
Cray Research, Inc.

Justin L. Bloom
President
Technology International, Inc.

Charles N. Brownstein
Executive Officer
Computing & Information Science & Engineering
National Science Foundation

Eloise E. Clark
Vice President, Academic Affairs
Bowling Green University

Paul Coleman
Professor
Institute of Geophysics and Space Physics
University of California

Michael R. Dingerson
Associate Vice Chancellor for Research and Dean of the Graduate School
University of Mississippi

Christopher Eoyang
Director
Institute for Supercomputing Research

David Farber
Professor
Computer & Information Science Department
University of Pennsylvania

Sidney Fernbach
Independent Consultant

Susan Fratkin
Director, Special Programs
NASULGC

Doug Gale
Director, Computer Research Center
Office of the Chancellor
University of Nebraska-Lincoln

Robert Gillespie
President
Gillespie, Folkner & Associates, Inc.

Eiichi Goto
Director, Computer Center
University of Tokyo

C.K. Gunsalus
Assistant Vice Chancellor for Research
University of Illinois at Urbana-Champaign

Judson M. Harper
Vice President of Research
Colorado State University

Gene Hemp
Senior Associate V.P. for Academic Affairs
University of Florida

Nobuaki Ieda
Senior Vice President
NTT America, Inc.

Hiroshi Inose
Director General
National Center for Science Information System

Heidi James
Executive Secretary
United States Activities Board
IEEE

Russell C. Jones
University Research Professor
University of Delaware

Brian Kahin, Esq.
Research Affiliate on Communications Policy
Massachusetts Institute of Technology

Robert Kahn
President
Corporation for National Research Initiatives

Hisao Kanai
Executive Vice President
NEC Corporation

Hiroshi Kashiwagi
Deputy Director-General
Electrotechnical Laboratory

Lauren Kelly
Department of Commerce

Thomas Keyes
Professor of Chemistry
Boston University

Continued on next page

List of Reviewers (continued)

Doyle Knight
President
John von Neumann National Supercomputer Center
Consortium for Scientific Computing

Mike Levine
Co-director of the Pittsburgh Supercomputing Center
Carnegie Mellon University

George E. Lindamood
Program Director
Industry Service
Gartner Group, Inc.

M. Stuart Lynn
Vice President for Information Technologies
Cornell University

Ikuo Makino
Director
Electrical Machinery & Consumer Electronics Division
Ministry of International Trade and Industry

Richard Mandelbaum
Vice Provost for Computing
University of Rochester

Martin Massengale
Chancellor
University of Nebraska-Lincoln

Gerald W. May
President
University of New Mexico

Yoshiro Miki
Director, Policy Research Division
Science and Technology Policy Bureau
Science and Technology Agency

Takeo Miura
Senior Executive Managing Director
Hitachi, Ltd.

J. Gerald Morgan
Dean of Engineering
New Mexico State University

V. Rama Murthy
Vice Provost for Academic Affairs
University of Minnesota

Shoichi Ninomiya
Executive Director
Fujitsu Limited

Bernard O'Lear
Manager of Systems
National Center for Atmospheric Research

Ronald Orcutt
Executive Director
Project Athena
MIT

Tad Pinkerton
Director
Office of Information Technology
University of Wisconsin-Madison

Harold J. Raveche
President
Stevens Institute of Technology

Ann Redelf
Manager of Information Services
Cornell Theory Center
Cornell University

Glenn Ricart
Director
Computer Science Center
University of Maryland at College Park

Ira Richer
Program Manager
DARPA/ISTO

John Riganati
Director of Systems Research
Supercomputer Research Center
Institute for Defense Analyses

Mike Roberts
Vice President
EDUCOM

David Roselle
President
University of Kentucky

Nora Sabelli
National Center for Supercomputing Applications
University of Illinois at Urbana-Champaign

Steven Sample
President
SUNY, Buffalo

John Sell
President
Minnesota Supercomputer Center

Hiroshi Shima
Deputy Director-General for Technology Affairs
Agency of Industrial Science and Technology, MITI

Yoshio Shimamoto
Senior Scientist (Retired)
Applied Mathematics Department
Brookhaven National Laboratory

Charles Sorber
Dean, School of Engineering
University of Pittsburgh

Harvey Stone
Special Assistant to the President
University of Delaware

Dan Sulzbach
Manager, User Services
San Diego Supercomputer Center

Tatsuo Tanaka
Executive Director
Interoperability Technology Association for Information Processing, Japan

Ray Toland
President
Alabama Supercomputing Network Authority

Kenneth Tolo
Vice Provost
University of Texas at Austin

Kenneth Toy
Post-Graduate Research Geophysicist
Scripps Institution of Oceanography

August B. Turnbull, III
Provost & Vice President, Academic Affairs
Florida State University

Continued on next page


List of Reviewers (continued)

Gerald Turner
Chancellor
University of Mississippi

Douglas Van Houweling
Vice Provost for Information & Technology
University of Michigan

Anthony Villasenor
Program Manager
Science Networks
Office of Space Science and Applications
National Aeronautics and Space Administration

Hugh Walsh
Data Systems Division
IBM

Richard West
Assistant Vice President, IS&AS
University of California

Steve Wolff
Program Director for Networking
Computing & Information Science & Engineering
National Science Foundation

James Woodward
Chancellor
University of North Carolina at Charlotte

Akihiro Yoshikawa
Research Director
BRIE/IIS
University of California, Berkeley



Contents

                                                                     Page
Chapter 1: Introduction and Overview—Observations . . . . . . . . . . . 1
  RESEARCH AND INFORMATION TECHNOLOGY—A FUTURE SCENARIO . . . . . . . . 1
  MAJOR ISSUES AND PROBLEMS . . . . . . . . . . . . . . . . . . . . . . 1
  NATIONAL IMPORTANCE—THE NEED FOR ACTION . . . . . . . . . . . . . . . 3
    Economic Importance . . . . . . . . . . . . . . . . . . . . . . . . 3
    Scientific Importance . . . . . . . . . . . . . . . . . . . . . . . 4
    Timing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  4
    Users . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
    Collaborators . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
    Service Providers . . . . . . . . . . . . . . . . . . . . . . . . . 5
Chapter 2: High Performance Computers . . . . . . . . . . . . . . . . . 7
  WHAT IS A HIGH PERFORMANCE COMPUTER? . . . . . . . . . . . . . . . .  7
  HOW FAST IS FAST? . . . . . . . . . . . . . . . . . . . . . . . . . . 8
  THE NATIONAL SUPERCOMPUTER CENTERS . . . . . . . . . . . . . . . . .  9
    The Cornell Theory Center . . . . . . . . . . . . . . . . . . . . . 9
    The National Center for Supercomputing Applications . . . . . . .  10
    Pittsburgh Supercomputing Center . . . . . . . . . . . . . . . . . 10
    San Diego Supercomputer Center . . . . . . . . . . . . . . . . . . 10
    John von Neumann National Supercomputer Center . . . . . . . . . . 11
  OTHER HPC FACILITIES . . . . . . . . . . . . . . . . . . . . . . . . 11
    Minnesota Supercomputer Center . . . . . . . . . . . . . . . . . . 11
    The Ohio Supercomputer Center . . . . . . . . . . . . . . . . . .  12
    Center for High Performance Computing, Texas (CHPC) . . . . . . .  12
    Alabama Supercomputer Network . . . . . . . . . . . . . . . . . .  13
    Commercial Labs . . . . . . . . . . . . . . . . . . . . . . . . .  13
    Federal Centers . . . . . . . . . . . . . . . . . . . . . . . . .  13
  CHANGING ENVIRONMENT . . . . . . . . . . . . . . . . . . . . . . . . 13
  REVIEW AND RENEWAL OF THE NSF CENTERS . . . . . . . . . . . . . . .  15
  THE INTERNATIONAL ENVIRONMENT . . . . . . . . . . . . . . . . . . .  16
    Japan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .  16
    Europe . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
    Other Nations . . . . . . . . . . . . . . . . . . . . . . . . . .  19
Chapter 3: Networks . . . . . . . . . . . . . . . . . . . . . . . . .  21
  THE NATIONAL RESEARCH AND EDUCATION NETWORK (NREN) . . . . . . . . . 22
    The Origins of Research Networking . . . . . . . . . . . . . . . . 22
    The Growing Demand for Capability and Connectivity . . . . . . . . 23
    The Present NREN . . . . . . . . . . . . . . . . . . . . . . . . . 23
    Research Networking as a Strategic High Technology Infrastructure  25
    Federal Coordination of the Evolving Internet . . . . . . . . . .  25
    Players in the NREN . . . . . . . . . . . . . . . . . . . . . . .  26
    The NREN in the International Telecommunications Environment . . . 28
    Policy Issues . . . . . . . . . . . . . . . . . . . . . . . . . .  28
    Planning Amidst Uncertainty . . . . . . . . . . . . . . . . . . .  29
    Network Scope and Access . . . . . . . . . . . . . . . . . . . . . 29
    Policy and Management Structure . . . . . . . . . . . . . . . . .  31
    Financing and Cost Recovery . . . . . . . . . . . . . . . . . . .  32
    Network Use . . . . . . . . . . . . . . . . . . . . . . . . . . .  33
    Longer-Term Science Policy Issues . . . . . . . . . . . . . . . .  33
    Technical Questions . . . . . . . . . . . . . . . . . . . . . . .  34
    Federal Agency Plans: FCCSET/FRICC . . . . . . . . . . . . . . . . 34
    NREN Management Desiderata . . . . . . . . . . . . . . . . . . . . 35

Figures
Figure                                                               Page
1-1. An Information Infrastructure for Research . . . . . . . . . . . . 2
1-2. Distribution of Federal Supercomputers . . . . . . . . . . . . .  14

Tables
Table                                                                Page
2-1. Some Key Academic High Performance Computer Installations . . . . 12
3-1. Principal Policy Issues in Network Development . . . . . . . . .  30
3-2. Proposed NREN Budget . . . . . . . . . . . . . . . . . . . . . .  36
Chapter 1

Introduction and Overview: Observations

The Office of Technology Assessment is conducting an assessment of the effects of new information technologies—including high performance computing, data networking, and mass data archiving—on research and development. This background paper offers a midcourse view of the issues and discusses their implications for current discussions about Federal supercomputer initiatives and legislative initiatives concerning a national data communication network.

Our observations to date emphasize the critical importance of advanced information technology to research and development in the United States, the interconnection of these technologies into a national system (and, as a result, the tighter coupling of policy choices regarding them), and the need for immediate and coordinated Federal action to bring into being an advanced information technology infrastructure to support U.S. research, engineering, and education.

RESEARCH AND INFORMATION TECHNOLOGY—A FUTURE SCENARIO

Within the next decade, the desks and laboratory benches of most scientists and engineers will be entry points to a complex electronic web of information technologies, resources and information services, connected together by high-speed data communication networks (see figure 1-1). These technologies will be critical to pursuing research in most fields. Through powerful workstation computers on their desks, researchers will access a wide variety of resources, such as:

● an interconnected assortment of local campus, State and regional, national, and even international data communication networks that link users worldwide;
● specialized and general-purpose computers including supercomputers, minisupercomputers, mainframes, and a wide variety of special architectures tailored to specific applications;
● collections of application programs and software tools to help users find, modify, or develop programs to support their research;
● archival storage systems that contain specialized research databases;
● experimental apparatus—such as telescopes, environmental monitoring devices, seismographs, and so on—designed to be set up and operated remotely;
● services that support scientific communication, including electronic mail, computer conferencing systems, bulletin boards, and electronic journals;
● a “digital library” containing reference material, books, journals, pictures, sound recordings, films, software, and other types of information in electronic form; and
● specialized output facilities for displaying the results of experiments or calculations in more readily understandable and visualizable ways.

Many of these resources are already used in some form by some scientists. Thus, the scenario that is drawn is a straightforward extension of current usage. Its importance for the scientific community and for government policy stems from three trends: 1) the rapidly and continually increasing capability of the technologies; 2) the integration of these technologies into what we will refer to as an “information infrastructure”; and 3) the diffusion of information technology into the work of most scientific disciplines.

Few scientists would use all the resources and facilities listed, at least on a daily basis; and the particular choice of resources eventually made available on the network will depend on how the tastes and needs of research users evolve. However, the basic form, high-speed data networks connecting user workstations with a worldwide assortment of information technologies and services, is becoming a crucial foundation for scientific research in most disciplines.

MAJOR ISSUES AND PROBLEMS

Developing this system to its full potential will require considerable thought and effort on the part of government at all levels, industry, research institutions, and the scientific community itself. It will present policy makers with some difficult questions and decisions.

Figure 1-1—An Information Infrastructure for Research

[Figure: a schematic in which data networks link researchers’ workstations to mainframes, on-line experiments, electronic mail, bulletin boards, digital electronic libraries, data archives, and associated services.]

Scientific applications are very demanding on technological capability. A substantial R&D component will need to accompany programs intended to advance R&D use of information technology. To realize the potential benefits of this new infrastructure, research users need advances in such areas as:

● more powerful computer designs;
● more powerful and efficient computational techniques and software;
● very high-speed switched data communications;
● improved technologies for visualizing data results and interacting with computers; and
● new methods for storing and accessing information from very large data archives.

An important characteristic of this system is that different parts of it will be funded and operated by different entities and made available to users in different ways. For example, databases could be operated by government agencies, professional societies, non-profit journals, or commercial firms. Computer facilities could similarly be operated by government, industry, or universities. The network, itself, already is an assemblage of pieces funded or operated by various agencies in the Federal Government; by States and regional authorities; and by local agencies, firms and educational institutions.

Keeping these components interconnected technologically and allowing users to move smoothly among the resources they need will present difficult management and policy problems.

Furthermore, the system will require significant capital investment to build and maintain, as well as specialized technical expertise to manage. How the various components are to be funded, how costs are to be allocated, and how the key components such as the network will be managed over the long term will be important questions.

Since this system as envisioned would be so widespread and fundamental to the process of research, access to it would be crucial to participation in science. Questions of access and participation are crucial to planning, management, and policymaking for the network and for many of the services attached to it.

Changes in information law brought about by the electronic revolution will create problems and conflicts for the scientific community and may influence how and by whom these technologies are used. The resolution of broader information issues such as security and privacy, intellectual property protection, access controls on sensitive information, and government dissemination practices could affect whether and how information technologies will be used by researchers and who may use them.

Finally, to the extent that, over the long run, modern information technology becomes so fundamental to the research process, it will transform the very nature of that process and the institutions—libraries, laboratories, universities, and so on—that serve it. These basic changes in science would affect government both in the operation of its own laboratories and in its broader relationship as a supporter and consumer of research. Conflicts may also arise to the extent that government becomes centrally involved, both through funding and through management, with the traditionally independent and uncontrolled communication channels of science.

NATIONAL IMPORTANCE—THE NEED FOR ACTION

Over the last 5 years, Congress has become increasingly concerned about information technology and research. The National Science Foundation (NSF) has been authorized to establish supercomputer centers and a science network. Bills (S. 1067, H.R. 3131) are being considered in the Congress to authorize a major effort to plan and develop a national research and education network and to stimulate information technology use in science and education. Interest in the role information technology could play in research and education has stemmed, first, from the government’s major role as a funder, user, and participant in research and, secondly, from concern for ensuring the strength and competitiveness of the U.S. economy.

Observation 1: The Federal Government needs to establish its commitment to the advanced information technology infrastructure necessary for furthering U.S. science and education. This need stems directly from the importance of science and technology to economic growth, the importance of information technology to research and development, and the critical timing for certain policy decisions.

Economic Importance

A strong national effort in science and technology is critical to the long-term economic competitiveness, national security, and social well-being of the United States. That, in the modern international economy, technological innovation is concomitant with social and economic growth is a basic assumption held in most political and economic systems in the world these days; and we will take it here as a basic premise. It has been a basic finding in many OTA studies.¹ (This observation is not to suggest that technology is a panacea for all social problems, nor that serious policy problems are not often raised by its use.) Benefits from this infrastructure are expected to flow into the economy in three ways:

First, the information technology industry can benefit directly. Scientific use has always been a major source of innovation in computers and communications technology. Packet-switched data communication, now a widely used commercial offering, was first developed by the Defense Advanced Research Projects Agency (DARPA) to support its research community. Department of Energy (DOE) national laboratories have, for many years, made contributions to supercomputer hardware and software. New initiatives to develop higher speed computers and a national science network could similarly feed new concepts back to the computer and communications industry as well as to providers of information services.

¹For example, U.S. Congress, Office of Technology Assessment, Technology and the American Economic Transition, OTA-TET-283 (Washington, DC: U.S. Government Printing Office, May 1988) and Information Technology R&D: Critical Trends and Issues, OTA-CIT-268 (Washington, DC: U.S. Government Printing Office, February 1985).

Secondly, by improving the tools and methodologies for R&D, the infrastructure will impact the research process in many critical high technology industries, such as pharmaceuticals, airframes, chemicals, consumer electronics, and many others. Innovation and, hence, international competitiveness in these key R&D-intensive sectors can be improved. The economy as a whole stands to benefit from increased technological capabilities of information systems and improved understanding of how to use them. A National Research and Education Network could be the precursor to a much broader high capacity network serving the United States, and many research applications developed for high performance computers result in techniques much more broadly applicable to commercial firms.

Scientific Importance

Research and development is, inherently, an information activity. Researchers generate, organize, and interpret information, build models, communicate, and archive results. Not surprisingly, then, they are now dependent on information technology to assist them in these tasks. Many major studies by many scientific and policy organizations over the years—as far back as the President’s Science Advisory Committee (PSAC) in the middle 1960s, and as recently as a report by COSEPUP of the National Research Council published in 1988²—have noted these trends and analyzed the implications for science support. The key points are as follows:

● Scientific and technical information is increasingly being generated, stored and distributed in electronic form;
● Computer-based communications and data handling are becoming essential for accessing, manipulating, analyzing, and communicating data and research results; and,
● In many computationally intensive R&D areas, from climate research to groundwater modeling to airframe design, major advances will depend upon pushing the state of the art in high performance computing, very large databases, visualization, and other related information technologies. Some of these applications have been labeled “Grand Challenges.” These projects hold promise of great social benefit, such as designing new vaccines and drugs, understanding global warming, or modeling the world economy. However, for that promise to be realized in those fields, researchers require major advances in available computational power.
● Many proposed and ongoing “big science” projects, from particle accelerators and large array radio telescopes to the NASA EOS satellite project, will create vast streams of new data that must be captured, analyzed, archived, and made available to the research community. These new demands could well overtax the capability of currently available resources.

Timing

Government decisions being made now and in the near future will shape the long-term utility and effectiveness of the information technology infrastructure for science. For example:

● NSF is renewing its multi-year commitments to all or most of the existing National Supercomputing Centers.
● Executive agencies, under the informal auspices of the Federal Research Internet Coordinating Committee (FRICC), are developing a national “backbone” network for science. Decisions made now will have long term influence on the nature of the network, its technical characteristics, its cost, its management, services available on it, access, and the information policies that will govern its use.

²Panel on Information Technology and the Conduct of Research, Committee on Science, Engineering, and Public Policy, Information Technology and the Conduct of Research: The User’s View (Washington, DC: National Academy Press, 1989).

ices available on it, access, and the information interconnect and users must move smoothly among
policies that will govern its use. them, the system requires a high degree of coordina-
● The basic communications industry is in flux, tion rather than being treated as simply a conglomer-
as are the policies and rules by which gover- ation of independent facilities.
nment regulates it. However, if information technology resources for
● Congress and the Executive Branch are cur- science are treated as infrastructure, a major policy
rently considering, and in some cases have issue is one of boundaries. Who is it to serve; who
started, several new major scientific projects, are its beneficiaries? Who should participate in
including a space station, the Earth Orbiting designing it, building and operating it, providing
System, the Hubble space telescope, the super- services over it, and using it? The answers to these
conducting supercollider, human genome map- questions will also indicate to Congress who should
ping, and so on. Technologies and policies are be part of the policymaking and planning process;
needed to deal with these “firehoses of data. ” In they will govern the long term scale, scope, and the
addition, upgrading the information infrastruc- technological characteristics of the infrastructure
ture could open these projects and data streams itself; and they will affect the patterns of support for
to broad access by the research community. the facilities. Potentially interested parties include
Observation 2: Federal policy in this area needs to the following:
be more broadly based than has been traditional Users
with Federal science efforts. Plsnning, building,
and managing the information technology infra- Potential users might include academic and indus-
structure requires cutting across agency pro- trial researchers, teachers, graduate, undergraduate,
grams and the discipline and mission-oriented and high school students, as well as others such as
approach of science support. In addition, many the press or public interest groups who need access
parties outside the research establishment will to and make use of scientific information. Institu-
have important roles to play and stakes in the tions, such as universities and colleges, libraries, and
outcome of the effort. schools also have user interests. Furthermore, for-
eign scientists working as part of international
The key information technologies-high per- research teams or in firms that operate internatio-
formance computing centers, data communication nally will wish access to the U.S. system, which, in
networks, large data archives, along with a wide turn, will need to be connected with other nation’s
range of supporting software-are used in all research infrastructures.
research disciplines and support several different
agency missions. In many cases, economies of scale Collaborators
and scope dictate that some of these technologies Another group of interested parties include State
(e.g., supercomputers) be treated as common re- and local governments and parts of the information
sources. Some, such as communication networks, industry. We have identified them with the term
are most efficiently used if shared or interconnected “collaborators” because they will be participating in
in some way. funding, building, and operating the infrastructure.
There are additional scientific reasons to treat States are establishing State supercomputer centers
information resources as a broadly used infrastruc- and supporting local and regional networking, some
ture: fostering communication among scientists computer companies participate in the NSF National
between disciplines, sharing resources and tech- Supercomputer Centers, and some telecommunica-
niques, and expanding access to databases and tion firms are involved in parts of the science
software, for instance. However, there are very few network.
models from the history of Federal science support Service Providers
for creating and maintaining infrastructure-like re-
sources for science and technology across agency Finally, to the extent that the infrastructure serves
and disciplinary boundaries. Furthermore, since the as a basic tool for most of the research and
networks, computer systems, databases, and so on development community, information service pro-

viders will require access to make their products network while protecting privacy and valuable
available to scientific users. The service providers resources will require careful balancing of legal and
may include government agencies (which provide technological controls.
access to government scientific databases, for exam-
ple), libraries and library utilities, journal and Intellectual property protection in an electronic
text-book publishers, professional societies, and environment may pose difficult problems, Providers
private software and database providers. will be concerned that electronic databases, soft-
ware, and even electronic formats of printed journals
Observation 3: Several information policy issues and other writings will not be adequately protected.
will be raised in managing and using the network. In some cases, the product, itself, may not be well
Depending on how they are resolved, they could protected under existing law. In other cases elec-
sharply restrict the utility and scope of network tronic formats coupled with a communications
use in the scientific community. network erode the ability to control restrictions on
Security and privacy have already become of copying and disseminating.
major concern and will pose a problem. In general, Access controls may be called for on material that
users will want the network and the services on it to is deemed to be sensitive (although unclassified) for
be as open as possible; however, they will also want reasons of national security or economic competi-
the networks and services to be as robust and tiveness. Yet, the networks will be accessible
dependable as possible-free free deliberate or worldwide and the ability to identify and control
accidental disruption. Furthermore, different re- users may be limited.
sources will require different levels of security.
Some bulletin boards and electronic mail services The above observations have been broad, looking
may want to be as open and public as possible; others at the overall collection of information technology
may require a high level of privacy. Some databases resources for science as an integrated system and at
may be unique and vital resources that will need a the questions raised by it. The remaining portion of
very high level of protection, others may not be so this paper will deal specifically with high perform-
critical. Maintaining an open, easily accessible ance computers and networking.
Chapter 2
High Performance Computers

An important set of issues has been raised during the last 5 years around the topic of high performance computing (HPC). These issues stem from a growing concern in both the executive branch and in Congress that U.S. science is impeded significantly by lack of access to HPC¹ and by concerns over the competitiveness implications of new foreign technology initiatives, such as the Japanese “Fifth Generation Project.” In response to these concerns, policies have been developed and promoted with three goals in mind.

1. To advance vital research applications currently hampered by lack of access to very high speed computers.
2. To accelerate the development of new HPC technology, providing enhanced tools for research and stimulating the competitiveness of the U.S. computer industry.
3. To improve software tools and techniques for using HPC, thereby enhancing their contribution to general U.S. economic competitiveness.

In 1984, the National Science Foundation (NSF) initiated a group of programs intended to improve the availability and use of high performance computers in scientific research. As the centerpiece of its initiative, after an initial phase of buying and distributing time at existing supercomputer centers, NSF established five National Supercomputer Centers.

Over the course of this and the next year, the initial multiyear contracts with the National Centers are coming to an end, which has provoked a debate about whether and, if so, in what form they should be renewed. NSF undertook an elaborate review and renewal process and announced that, depending on agency funding, it is prepared to proceed with renewing at least four of the centers.² In thinking about the next steps in the evolution of the advanced computing program, the science agencies and Congress have asked some basic questions. Have our perceptions of the needs of research for HPC changed since the centers were started? If so, how? Have we learned anything about the effectiveness of the National Centers approach? Should the goals of the Advanced Scientific Computing (ASC) and other related Federal programs be refined or redefined? Should alternative approaches be considered, either to replace or to supplement the contributions of the centers?

OTA is presently engaged in a broad assessment of the impacts of information technology on research, and as part of that inquiry, is examining the question of scientific computational resources. It has been asked by the requesting committees for an interim paper that might help shed some light on the above questions. The full assessment will not be completed for several months, however; so this paper must confine itself to some tentative observations.

WHAT IS A HIGH PERFORMANCE COMPUTER?

The term “supercomputer” is commonly used in the press, but it is not necessarily useful for policy. In the first place, the definition of power in a computer is highly inexact and depends on many factors including processor speed, memory size, and so on. Secondly, there is not a clear lower boundary of supercomputer power. IBM 3090 computers come in a wide range of configurations, some of the largest of which are the basis of supercomputer centers at institutions such as Cornell, the Universities of Utah, and Kentucky. Finally, technology is changing rapidly and with it our conceptions of power and capability of various types of machines. We use the more general term, “high performance computers,” a term that includes a variety of machine types.

One class of HPC consists of very large, powerful machines, principally designed for very large numerical applications such as those encountered in science. These computers are the ones often referred to as “supercomputers.” They are expensive, costing up to several million dollars each.

¹Peter D. Lax, Report of the Panel on Large-Scale Computing in Science and Engineering (Washington, DC: National Science Foundation, 1982).
²One of the five centers, the John von Neumann National Supercomputer Center, has been based on ETA-10 technology. The Center has been asked to resubmit a proposal showing revised plans in reaction to the withdrawal of that machine from the market.


A large-scale computer’s power comes from a slower and, hence, cheaper processors. The problem
combination of very high-speed electronic compo- is that computational mathematicians have not yet
nents and specialized architecture (a term used by developed a good theoretical or experiential frame-
computer designers to describe the overall logical work for understanding in general how to arrange
arrangement of the computer). Most designs use a applications to take full advantage of these mas-
combination of “vector processing” and “parallel- sively parallel systems. Hence, they are still, by and
ism” in their design. A vector processor is an large, experimental, even though some are now on
arithmetic unit of the computer that produces a series the market and users have already developed appli-
of similar calculations in an overlapping, assembly cations software for them. Experimental as these
line fashion, (Many scientific calculations can be set systems may seem now, many experts think that any
up in this way.) significantly large increase in computational power
eventually must grow out of experimental systems
Parallelism uses several processors, assuming that such as these or from some other form of massively
a broken into large i n d e p e n d e n t
problem can be
parallel architecture.
pieces that can be computed on separate processors.
Currently, large, mainframe HPC’S such as those Finally, “workstations,” the descendants of per-
offered by Cray, IBM, are only modestly parallel, sonal desktop computers, are increasing in power;
having as few as two up to as many as eight new chips now in development will offer the
processors. 3 The trend is toward more parallel computing power nearly equivalent to a Cray 1
processors on these large systems. Some experts supercomputer of the late 1970s. Thus, although
anticipate as many as 512 processor machines top-end HPCs will be correspondingly more power-
appearing in the near future. The key problem to date ful, scientists who wish to do serious computing will
has been to understand how problems can be set up have a much wider selection of options in the near
to take advantage of the potential speed advantage of future,
larger scale parallelism. A few policy-related conclusions flow from this
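A minimal sketch of this decomposition, written by us in C with POSIX threads (a modern idiom, not something the report describes): each of a handful of processors sums an independent slice of an array, and the partial results are combined at the end.

    #include <pthread.h>
    #include <stdio.h>

    #define N     1000000
    #define NPROC 4                 /* a modestly parallel machine */

    static double x[N];
    static double partial[NPROC];

    /* Each thread works on an independent slice; no coordination is
       needed until the partial sums are combined. */
    static void *sum_slice(void *arg)
    {
        long p = (long)arg;
        double s = 0.0;
        for (long i = p * (N / NPROC); i < (p + 1) * (N / NPROC); i++)
            s += x[i];
        partial[p] = s;
        return NULL;
    }

    int main(void)
    {
        pthread_t tid[NPROC];
        double total = 0.0;

        for (long i = 0; i < N; i++)
            x[i] = 1.0;
        for (long p = 0; p < NPROC; p++)
            pthread_create(&tid[p], NULL, sum_slice, (void *)p);
        for (long p = 0; p < NPROC; p++) {
            pthread_join(tid[p], NULL);
            total += partial[p];
        }
        printf("total = %.0f\n", total);
        return 0;
    }

The hard part the text describes is exactly what this toy hides: real applications rarely split into such perfectly independent pieces.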
Several machines are now on the market that are based on the structure and logic of a large supercomputer, but use cheaper, slower electronic components. These systems make some sacrifice in speed, but cost much less to manufacture. Thus, an application that is demanding, but that does not necessarily require the resources of a full-size supercomputer, may be much more cost effective to run on such a “minisuper.”

Other types of specialized systems have also appeared on the market and in the research laboratory. These machines represent attempts to obtain major gains in computation speed by means of fundamentally different architectures. They are known by colorful names such as “Hypercubes,” “Connection Machines,” “Dataflow Processors,” “Butterfly Machines,” “Neural Nets,” or “Fuzzy Logic Computers.” Although they differ in detail, many of these systems are based on large-scale parallelism. That is, their designers attempt to get increases in processing speed by hooking together in some way a large number—hundreds or even thousands—of simpler, slower and, hence, cheaper processors. The problem is that computational mathematicians have not yet developed a good theoretical or experiential framework for understanding in general how to arrange applications to take full advantage of these massively parallel systems. Hence, they are still, by and large, experimental, even though some are now on the market and users have already developed applications software for them. Experimental as these systems may seem now, many experts think that any significantly large increase in computational power eventually must grow out of experimental systems such as these or from some other form of massively parallel architecture.

Finally, “workstations,” the descendants of personal desktop computers, are increasing in power; new chips now in development will offer computing power nearly equivalent to a Cray 1 supercomputer of the late 1970s. Thus, although top-end HPCs will be correspondingly more powerful, scientists who wish to do serious computing will have a much wider selection of options in the near future.

A few policy-related conclusions flow from this discussion:

● The term “supercomputer” is a fluid one, potentially covering a wide variety of machine types, and the “supercomputer industry” is similarly increasingly difficult to identify clearly.
● Scientists need access to a wide range of high performance computers, ranging from desktop workstations to full-scale supercomputers, and they need to move smoothly among these machines as their research needs dictate.
● Hence, government policy needs to be flexible and broadly based, not overly focused on narrowly defined classes of machines.

HOW FAST IS FAST?

Popular comparisons of supercomputer speeds are usually based on processing speed, the measure being “FLOPS,” or “Floating Point Operations Per Second.” The term “floating point” refers to a particular format for numbers within the computer that is used for scientific calculation; and a floating point “operation” refers to a single arithmetic step, such as adding two numbers, using the floating point format.

³To distinguish between this modest level and the larger scale parallelism found on some more experimental machines, some experts refer to this limited parallelism as “multiprocessing.”

point “operation” refers to a single arithmetic step, One can draw a few policy implications from
such as adding two numbers, using the floating point these observations on speed:
format, Thus, FLOPS measure the speed of the ● Since overall speed improvement is closely
arithmetic processor. Currently, the largest super-
linked with how their machines are actually
computers have processing speeds ranging up to prograrnmed and used, computer designers are
several billion FLOPS.
critically dependent on feedback from that part
of the user community which is pushing their
However, pure processing speed is not by itself a
machines to the limit.
useful measure of the relative power of computers. ● There is no “fastest” machine. The speed of a
To see why, let’s look at an analogy.
high performance computer is too dependent on
the skill with which it is used and programmed,
In a supermarket checkout counter, the calcula-
and the particular type of job it is being asked
tion speed of the register does not, by itself,
to perform.
determine how fast customers can purchase their ● Until machines are available in the market and
groceries and get out of the store. Rather, the speed have been tested for overall performance,
of checkout is also affected by the rate at which each
policy makers should be skeptical of announce-
purchase can be entered into the register and the
ments based purely on processor speeds that
overall time it takes to complete a transaction with
some company or country is producing “faster
a customer and start a new one. Of course, ulti- machines. ”
mately, the length of time the customer must wait in ● Federal R&D programs for improving high
line to get to the clerk may be the biggest determi-
performance computing need to stress software
nant of all.
and computational mathematics as well as
research on machine architecture.
Similarly, in a computer, how fast calculations
can be set up and presented to the processor and how THE NATIONAL
fast new jobs and their associated data can be moved
in, and completed work moved out of the computer, SUPERCOMPUTER CENTERS
determines how much of the processor’s speed can In February of 1985, NSF selected four sites to
actually be harnessed. (Some users refer to this as establish national supercomputing centers: The Uni-
“solution speed.”) In a computer, those speeds are versity of California at San Diego, The University of
determined by a wide variety of hardware and Illinois at Urbana-Champaign, Cornell University
software characteristics. And, similar to the store and the John von Neumann Center in Princeton. A
checkout, as a fast machine becomes busy, users fifth site, Pittsburgh, was added in early 1986. The
may have to wait a significant time to get their turn. five NSF centers are described briefly below.
From a user’s perspective, then, a theoretically fast
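The analogy can be made quantitative (again an illustrative calculation, not from the report): if a job needs 60 seconds of arithmetic on a processor rated at 1 billion FLOPS, but staging data in, moving results out, and waiting in the queue add another 240 seconds, the user sees a 300-second turnaround. The effective “solution speed” is one-fifth of the rated speed, and a machine with half the peak rating but faster input/output and a shorter queue could finish the job sooner.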
In order to fully test a machine’s speed, experts use what are called “benchmark programs,” sample programs that reproduce the actual work load. Since workloads vary, there are several different benchmark programs, and they are constantly being refined and revised. Measuring a supercomputer’s speed is, itself, a complex and important area of research. It lends insight not only into what type of computer currently on the market is best to use for particular applications; but carefully structured measurements can also show where bottlenecks occur and, hence, where hardware and software improvements need to be made.
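The flavor of a benchmark measurement can be suggested with a toy version in C (ours, and far simpler than the benchmark suites the text alludes to): time a representative kernel, then report the sustained, rather than peak, arithmetic rate.

    #include <stdio.h>
    #include <time.h>

    #define N    1000000
    #define REPS 100

    int main(void)
    {
        static double x[N], y[N];
        for (int i = 0; i < N; i++) { x[i] = 1.0; y[i] = 2.0; }

        clock_t t0 = clock();
        for (int r = 0; r < REPS; r++)
            for (int i = 0; i < N; i++)
                y[i] = 3.0 * x[i] + y[i];   /* 2 floating point ops */
        clock_t t1 = clock();

        double seconds = (double)(t1 - t0) / CLOCKS_PER_SEC;
        /* Print y[0] so the compiler cannot discard the loop entirely. */
        printf("y[0] = %g\n", y[0]);
        printf("sustained rate: %.3g FLOPS\n", 2.0 * N * REPS / seconds);
        return 0;
    }

A real benchmark would use kernels drawn from the actual workload, which is precisely why, as the text notes, there are many of them and they are constantly revised.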
One can draw a few policy implications from these observations on speed:

● Since overall speed improvement is closely linked with how their machines are actually programmed and used, computer designers are critically dependent on feedback from that part of the user community which is pushing their machines to the limit.
● There is no “fastest” machine. The speed of a high performance computer is too dependent on the skill with which it is used and programmed, and the particular type of job it is being asked to perform.
● Until machines are available in the market and have been tested for overall performance, policy makers should be skeptical of announcements based purely on processor speeds that some company or country is producing “faster machines.”
● Federal R&D programs for improving high performance computing need to stress software and computational mathematics as well as research on machine architecture.

THE NATIONAL SUPERCOMPUTER CENTERS

In February of 1985, NSF selected four sites to establish national supercomputing centers: the University of California at San Diego, the University of Illinois at Urbana-Champaign, Cornell University, and the John von Neumann Center in Princeton. A fifth site, Pittsburgh, was added in early 1986. The five NSF centers are described briefly below.

The Cornell Theory Center

The Cornell Theory Center is located on the campus of Cornell University. Over 1,900 users from 125 institutions access the center. Although Cornell does not have a center-oriented network, 55 academic institutions are able to utilize the resources at Cornell through special nodes. A 14-member Corporate Research Institute works within the center in a variety of university-industry cost sharing projects.

In November of 1985 Cornell received a 3084 computer from IBM, which was upgraded to a four-processor 3090/400VF a year later. The 3090/400VF was replaced by a six-processor 3090/600E in May, 1987.
In October, 1988 a second 3090/600E was added. The Cornell center also operates several other smaller parallel systems, including an Intel iPSC/2, a Transtech NT 1000, and a Topologix T1000. Some 50 percent of the resources of the Northeast Parallel Architecture Center, which include two Connection Machines, an Encore, and an Alliant FX/80, are accessed by the Cornell facility.

Until October of 1988, all IBM computers were “on loan” to Cornell for as long as Cornell retained its NSF funding. The second IBM 3090/600, procured in October, will be paid for by an NSF grant. Over the past 4 years, corporate support for the Cornell facility accounted for 48 percent of the operating costs. During those same years, NSF and New York State accounted for 37 percent and 5 percent respectively of the facility’s budget. This funding has allowed the center to maintain a staff of about 100.

The National Center for Supercomputing Applications

The National Center for Supercomputing Applications (NCSA) is operated by the University of Illinois at Urbana-Champaign. The Center has over 2,500 academic users from about 82 academic affiliates. Each affiliate receives a block grant of time on the Cray X-MP/48, training for the Cray, and help using the network to access the Cray.

The NCSA received its Cray X-MP/24 in October 1985. That machine was upgraded to a Cray X-MP/48 in 1987. In October 1988 a Cray-2S/4-128 was installed, giving the center two Cray machines. This computer is the only Cray-2 now at an NSF national center. The center also houses a Connection Machine 2, an Alliant FX/80 and FX/8, and over 30 graphics workstations.

In addition to NSF funding, NCSA has solicited industrial support. Amoco, Eastman Kodak, Eli Lilly, FMC Corp., Dow Chemical, and Motorola have each contributed around $3 million over a 3-year period to the NCSA. In fiscal year 1989 corporate support has amounted to 11 percent of NCSA’s funding. About 32 percent of NCSA’s budget came from NSF while the State of Illinois and the University of Illinois accounted for the remaining 27 percent of the center’s $21.5 million budget. The center has a full-time staff of 198.

Pittsburgh Supercomputing Center

The Pittsburgh Supercomputing Center (PSC) is run jointly by the University of Pittsburgh, Carnegie Mellon University, and Westinghouse Electric Corp. More than 1,400 users from 44 States utilize the center. Twenty-seven universities are affiliated with PSC.

The center received a Cray X-MP/48 in March of 1986. In December of 1988 PSC became the first non-Federal laboratory to possess a Cray Y-MP. Both machines were being used simultaneously for a short time; however, the center has phased out the Cray X-MP. The center’s graphics hardware includes a Pixar image computer, an Ardent Titan, and a Silicon Graphics IRIS workstation.

The operating projection at PSC for fiscal year 1990, a “typical year,” has NSF supporting 58 percent of the center’s budget while industry and vendors account for 22 percent of the costs. The Commonwealth of Pennsylvania and the National Institutes of Health both support PSC, accounting for 8 percent and 4 percent of budget respectively. Excluding working students, the center has a staff of around 65.

San Diego Supercomputer Center

The San Diego Supercomputer Center (SDSC) is located on the campus of the University of California at San Diego and is operated by General Atomics. SDSC is linked to 25 consortium members but has a user base in 44 States. At the end of 1988, over 2,700 users were accessing the center. SDSC has 48 industrial partners who use the facility’s hardware, software, and support staff.

A Cray X-MP/48 was installed in December, 1985. SDSC’s first upgrade, a Y-MP8/864, is planned for December, 1989. In addition to the Cray, SDSC has 5 Sun workstations, two IRIS workstations, an Evans and Sutherland terminal, 5 Apollo workstations, a Pixar, an Ardent Titan, an SCS-40 minisupercomputer, a Supertek S-1 minisupercomputer, and two Symbolics machines.

The University of California at San Diego spends more than $250,000 a year on utilities and services for SDSC. For fiscal year 1990 the SDSC believes NSF will account for 47 percent of the center’s operating budget.
The State of California currently provides $1.25 million per year to the center and in 1988 approved funding of $6 million over 3 years to SDSC for research in scientific visualization. For fiscal year 1990 the State is projected to support 10 percent of the center’s costs. Industrial support, which has given the center $12.6 million in donations and in-kind services, is projected to provide 15 percent of the total costs of SDSC in fiscal year 1990.

John von Neumann National Supercomputer Center

The John von Neumann National Supercomputer Center (JvNC), located in Princeton, New Jersey, is managed by the Consortium for Scientific Computing Inc., an organization of 13 institutions from New Jersey, Pennsylvania, Massachusetts, New York, Rhode Island, Colorado, and Arizona. Currently there are over 1,400 researchers from 100 institutes accessing the center. Eight industrial corporations utilize the JvNC facilities.

At present there are two Cyber 205s and two ETA-10s in use at the JvNC. The first ETA-10 was installed, after a 1-year delay, in March of 1988. In addition to these machines there are a Pixar II, two Silicon Graphics IRIS workstations, and video animation capabilities.

When the center was established in 1985 by NSF, the New Jersey Commission on Science and Technology committed $12.1 million to the center over a 5-year period. An additional $13.1 million has been set aside for the center by the New Jersey Commission for fiscal years 1991-1995. Direct funding from the State of New Jersey and university sources constitutes 15 percent of the center’s budget for fiscal years 1991-1995. NSF will account for 60 percent of the budget. Projected industry revenue and cost sharing account for 25 percent of costs.

Since the announcement by CDC to close its ETA subsidiary, the future of JvNC is uncertain. Plans have been proposed to NSF by JvNC to purchase a Cray Research Y-MP, eventually upgrading to a C-90. NSF is reviewing the plan and a decision on renewal is expected in October of 1989.

OTHER HPC FACILITIES

Before 1984 only three universities operated supercomputers: Purdue University, the University of Minnesota, and Colorado State University. The NSF supercomputing initiative established five new supercomputer centers that were nationally accessible. States and universities began funding their own supercomputer centers, both in response to growing needs on campus and to increased feeling on the part of State leaders that supercomputer facilities could be important stimuli to local R&D and, therefore, to economic development. Now, many State and university centers offer access to high performance computers;⁴ and the NSF centers are only part of a much larger HPC environment including nearly 70 Federal installations (see table 2-1).

Supercomputer center operators perceive their roles in different ways. Some want to be a proactive force in the research community, leading the way by helping develop new applications, training users, and so on. Others are content to follow in the path that the NSF National Centers create. These differences in goals/missions lead to varied services and computer systems. Some centers are “cycle shops,” offering computing time but minimal support staff. Other centers maintain a large support staff and offer consulting, training sessions, and even assistance with software development. Four representative centers are described below:

Minnesota Supercomputer Center

The Minnesota Supercomputer Center, originally part of the University of Minnesota, is a for-profit computer center owned by the University of Minnesota. Currently, several thousand researchers use the center, over 700 of which are from the University of Minnesota. The Minnesota Supercomputing Institute, an academic unit of the University, channels university usage by providing grants to the students through a peer review process.

The Minnesota Supercomputer Center received its first machine, a Cray 1A, in September, 1981. In mid 1985, it installed a Cyber 205; and in the latter part of that year, two Cray 2 computers were installed within 3 months of each other.

⁴The number cannot be estimated exactly. First, it depends on the definition of supercomputer one uses. Secondly, the number keeps changing as States announce new plans for centers and as large research universities purchase their own HPCs.

Minnesota bought its third Cray 2, the only one in use now, at the end of 1988, just after it installed its ETA-10. The ETA-10 has recently been decommissioned due to the closure of ETA. A Cray X-MP has been added, giving the center a total of two supercomputers. The Minnesota Supercomputer Center has acquired more supercomputers than anyone outside the Federal Government.

The Minnesota State Legislature provides funds to the University for the purchasing of supercomputer time. Although the University buys a substantial portion of supercomputing time, the center has many industrial clients whose identities are proprietary; they include representatives of the auto, aerospace, petroleum, and electronics industries. They are charged a fee for the use of the facility.

Table 2-1—Federal Unclassified Supercomputer Installations

Laboratory                                      Number of machines

Department of Energy
  Los Alamos National Lab ....................... 6
  Livermore National Lab, NMFECC ................ 4
  Livermore National Lab ........................ 7
  Sandia National Lab, Livermore ................ 3
  Sandia National Lab, Albuquerque .............. 2
  Oak Ridge National Lab ........................ 1
  Idaho Falls National Engineering .............. 1
  Argonne National Lab .......................... 1
  Knolls Atomic Power Lab ....................... 1
  Bettis Atomic Power Lab ....................... 1
  Savannah/DOE .................................. 1
  Richland/DOE .................................. 1
  Schenectady Naval Reactors/DOE ................ 2
  Pittsburgh Naval Reactors/DOE ................. 2
Department of Defense
  Naval Research Lab ............................ 1
  Naval Ship R&D Center ......................... 1
  Fleet Numerical Oceanography .................. 1
  Naval Underwater System Command ............... 1
  Naval Weapons Center .......................... 1
  Martin Marietta/NTB ........................... 1
  Air Force Weapons Lab ......................... 2
  Air Force Global Weather ...................... 1
  Arnold Engineering and Development ............ 1
  Wright Patterson AFB .......................... 1
  Aerospace Corp. ............................... 1
  Army Ballistic Research Lab ................... 2
  Army/Tacom .................................... 1
  Army/Huntsville ............................... 1
  Army/Kwajalein ................................ 1
  Army/WES (on order) ........................... 1
  Army/Warren ................................... 1
  Defense Nuclear Agency ........................ 1
NASA
  Ames .......................................... 5
  Goddard ....................................... 2
  Lewis ......................................... 1
  Langley ....................................... 1
  Marshall ...................................... 1
Department of Commerce
  National Inst. of Standards and Technology .... 1
  National Oceanic & Atmospheric Administration . 4
Environmental Protection Agency
  Raleigh, North Carolina ....................... 1
Department of Health and Human Services
  National Institutes of Health ................. 1
  National Cancer Institute ..................... 1

SOURCE: Office of Technology Assessment estimate.

The Ohio Supercomputer Center

The Ohio Supercomputer Center (OSC) originated from a coalition of scientists in the State. The center, located on Ohio State University's campus, is connected to 20 other Ohio universities via the Ohio Academic Research Network (OARNET). As of January 1989, three private firms were using the Center's resources.

In August 1987, OSC installed a Cray X-MP/24, which was upgraded to a Cray X-MP/28 a year later. The center replaced the X-MP in August 1989 with a Cray Research Y-MP. In addition to Cray hardware, there are 40 Sun graphics workstations, a Pixar II, a Stellar graphics machine, a Silicon Graphics workstation, and an Abekas Still Store machine. The Center maintains a staff of about 35 people.

The Ohio General Assembly began funding the center in the summer of 1987, appropriating $7.5 million. In March of 1988, the Assembly allocated $22 million for the acquisition of a Cray Y-MP. Ohio State University has pledged $8.2 million to augment the center's budget. As of February 1989 the State had spent $37.7 million in funding.5 OSC's annual budget is around $6 million (not including the purchase/leasing of its Cray).

Center for High Performance Computing,
Texas (CHPC)

The Center for High Performance Computing is located at The University of Texas at Austin. CHPC serves all 14 institutions, 8 academic institutions and 6 health-related organizations, in the University of Texas System.
5 "Ohio's Blazing Computer," Ohio, February 1989, p. 12.

The University of Texas installed a Cray X-MP/24 in March 1986, and a Cray 14se in November of 1988. The X-MP is used primarily for research. For the time being, the Cray 14se is being used as a vehicle for the conversion of users to the Unix system. About 40 people staff the center.

Original funding for the center and the Cray X-MP came from bonds and endowments from both The University of Texas System and The University of Texas at Austin. The annual budget of CHPC is about $3 million. About 95 percent of the center's operating budget comes from State funding and endowments; 5 percent of the costs are recovered from selling CPU time.

Alabama Supercomputer Network

The George C. Wallace Supercomputer Center, located in Huntsville, Alabama, serves the needs of researchers throughout Alabama. Through the Alabama Supercomputer Network, 13 Alabama institutions, university and government sites, are connected to the center. Under contract to the State, Boeing Computer Services provides the support staff and technical skills to operate the center. Support staff are located at each of the nodes to help facilitate the use of the supercomputer from remote sites.

A Cray X-MP/24 arrived in 1987 and became operational in early 1988. In 1987 the State of Alabama agreed to finance the center, allocating $2.2 million for the center and $38 million to Boeing Services for the initial 5 years. The average yearly budget is $7 million. The center has a support staff of about 25.

Alabama universities are guaranteed 60 percent of the available time at no cost, while commercial researchers are charged a user fee. The State's stated impetus for creating a supercomputer center was the technical edge a supercomputer would bring: it was expected to draw high-tech industry to the State, enhance interaction between industry and the universities, and promote research and the associated educational programs within the universities.

Commercial Labs

A few corporations, such as Boeing Computer Services, have been selling high performance computer time for years. Boeing operates a Cray X-MP/24. Other commercial sellers of high performance computing time include the Houston Area Research Center (HARC). HARC operates the only Japanese supercomputer in America, the NEC SX-2. The center offers remote services.

Computer Sciences Corp. (CSC), located in Falls Church, Virginia, has a 16-processor FLEX/32 from Flexible Computer Corp., a Convex 120 from Convex Computer Corp., and a DAP210 from Active Memory Technology. Federal agencies comprise two-thirds of CSC's customers.6 Power Computing Co., located in Dallas, Texas, offers time on a Cray X-MP/24. Situated in Houston, Texas, Supercomputing Technology sells time on its Cray X-MP/28. Opticom Corp., of San Jose, California, offers time on a Cray X-MP/24, a Cray 1-M, a Convex C220, and a Convex C1 XP.

Federal Centers

In an informal poll of Federal agencies, OTA identified 70 unclassified installations that operate supercomputers, confirming the commonly expressed view that the Federal Government still represents a major part of the market for HPC in the United States (see figure 2-1). Many of these centers serve the research needs of government scientists and engineers and are, thus, part of the total research computing environment. Some are available to non-Federal scientists; others are closed.

CHANGING ENVIRONMENT

The scientific computing environment has changed in important ways during the few years that NSF's Advanced Scientific Computing Programs have existed. Some of these changes are as follows:

The ASC programs, themselves, have not evolved as originally planned. The original NSF planning document for the ASC program proposed to establish 10 supercomputer centers over a 3-year period; only 5 were funded. Center managers have also expressed the strong opinion that NSF has not met many of its original commitments for funding in successive years of the contracts, forcing the centers to change their operational priorities and search for support in other directions.
6 Norris Parker Smith, "More Than Just Buying Cycles," Supercomputer Review, April 1989.

Figure 2-1—Distribution of Federal Supercomputers

[Bar chart of unclassified Federal supercomputer installations by agency: DOE, 33; DoD, 19; NASA, 10; Commerce, 5; HHS, 2; EPA, 1.]

SOURCE: Office of Technology Assessment, 1989.

Technology has changed. There has been a burst of innovation in the HPC industry. At the top of the line, Cray Research developed two lines of machines, the Cray 2 and the Cray X-MP (and its successor, the Y-MP), that are much more powerful than the Cray 1, which had been considered the leading edge of supercomputing for several years into the mid-1980s. IBM has delivered several 3090s equipped with multiple vector processors and has also become a partner in a project to develop a new supercomputer in a joint venture with SSI, a firm started by Steve Chen, a noted supercomputer architect previously with Cray Research.

More recently, major changes have occurred in the industry. Control Data has closed down ETA, its supercomputer operation. Cray Research has been broken into two parts, Cray Computer Corp. and Cray Research, each of which will develop and market a different line of supercomputers. Cray Research will, initially at least, concentrate on the Y-MP models, the upcoming C-90 machines, and their longer term successors. Cray Computer Corp., under the leadership of Seymour Cray, will concentrate on development of the Cray 3, a machine based on gallium arsenide electronics.

At the middle and lower end, the HPC industry has introduced several new so-called "minisupercomputers," many of them based on radically different system concepts, such as massive parallelism, and many designed for specific applications, such as high-speed graphics. New chips promise very high-speed desktop workstations in the near future.

Finally, three Japanese manufacturers, NEC, Fujitsu, and Hitachi, have been successfully building and marketing supercomputers that are reportedly competitive in performance with U.S. machines.7 While these machines have, as yet, not penetrated the U.S. computer market, they indicate the potential competitiveness of the Japanese computer industry in the international HPC markets, and raise questions for U.S. policy.

Many universities and State systems have established "supercomputer centers" to serve the needs of their researchers.8 Many of these centers have only recently been formed, and some have not yet installed their systems, so their operational experience is, at best, limited to date. Furthermore, some other centers operate systems that, while very powerful scientific machines, are not considered by all experts to be supercomputers. Nevertheless, these centers provide high performance scientific computing to the research community, and create new demands for Federal support for computer time.

Individual scientists and research teams are also getting Federal and private support from their sponsors to buy their own "minisupercomputers." In some cases, these systems are used to develop and check out software eventually destined to run on larger machines; in other cases, researchers seem to find these machines adequate for their needs. In either mode of use, these departmental or laboratory systems expand the range of possible sources researchers turn to for high performance computing. Soon, desktop workstations will have performance equivalent to that of supercomputers of a decade ago at a significantly lower cost.
7Si=, ~ *OW ~ve, com~g ~e ~wer ~d ~rfo~mce of suwrcomputers is a complex and arcane field, OTA will refrti from ~mp~g
or ranking systems in any absolute sense.
8sw N~o~ As~latjon of Smte Unlvemities ~d L~d.Gr~[ Co]]eges, SWerc~~ufi”ngj_~r t~ /W()’~: A Stied Responsibility (Washington,
DC: January 1989).
Finally, some important changes have occurred in national objectives or perceptions of issues. For example, the development of a very high capacity national science network (or "internet") has taken on a much greater significance. Originally conceived of in the narrow context of tying together supercomputer centers and providing regional access to them, the science network has now come to be thought of by its proponents as a basic infrastructure, potentially extending throughout (and, perhaps, even beyond) the entire scientific, technical, and educational community.

Science policy is also changing, as important new and costly projects have been started or are being seriously considered. Projects such as the supercollider, the space station, NASA's Earth Observing System (EOS) program, and the human genome mapping may seem at first glance to compete for funding with science networks and supercomputers. However, they will create formidable new demands for computation, data communications, and data storage facilities and, hence, constitute additional arguments for investments in an information technology infrastructure.

Finally, some of the research areas in the so-called "Grand Challenges"9 have attained even greater social importance, such as fluid flow modeling, which will help the design of faster and more fuel efficient planes and ships; climate modeling, to help understand long term weather patterns; and the structural analysis of proteins, to help understand diseases and design vaccines and drugs to fight them.

REVIEW AND RENEWAL
OF THE NSF CENTERS

Based on the recent review, NSF has concluded that the centers, by and large, have been successful and are operating smoothly. That is, their systems are being fully used, they have trained many new users, and they are producing good science. In light of that conclusion, NSF has tentatively agreed to renewal for the three Cray-based centers and the IBM-based Cornell Center. The John von Neumann Center in Princeton has been based on ETA-10 computers. Since ETA was closed down, NSF put the review of the JvNC on hold pending review of a revised plan that has now been submitted. A decision is expected soon.

Due to the environmental changes noted above, if the centers are to continue in their present status as special NSF-sponsored facilities, the National Supercomputer Centers will need to sharply define their roles in terms of: 1) the users they intend to serve, 2) the types of applications they serve, and 3) the appropriate balance between service, education, and research.

The NSF centers are only a few of a growing number of facilities that provide access to HPC resources. Assuming that NSF's basic objective is to assure researchers access to the most appropriate computing for their work, it will be under increasing pressure to justify dedicating funds to one limited group of facilities. Five years ago, few U.S. academic supercomputer centers existed. When scientific demand was less, managerial attention was focused on the immediate problem of getting equipment installed and of developing an experienced user community. Under those circumstances, some ambiguity of purpose may have been acceptable and understandable. However, in light of the proliferation of alternative technologies and centers, as well as growing demand by researchers, unless the purposes of the National Centers are more clearly delineated, the facilities are at risk of being asked to serve too many roles and, as a result, serving none well.

Some examples of possible choices are as follows:

1. Provide Access to HPC
● Provide access to the most powerful, leading edge supercomputers available.
● Serve the HPC requirements for research projects of critical importance to the Federal Government, for example, the "Grand Challenge" topics.
● Serve the needs of all NSF-funded researchers for HPC.
● Serve the needs of the (academic, educational, and/or industrial) scientific community for HPC.
9 "Grand Challenge" research topics are questions of major scientific importance that require for progress substantially greater computing resources than are currently available. The term was first coined by Nobel Laureate physicist Kenneth Wilson.

2. Educate and Train
● Provide facilities and programs to teach scientists and students how to use high performance computing in their research.

3. Advance the State of HPC Use in Research
● Develop applications and system software.
● Serve as centers for research in computational science.
● Work with vendors as test sites for advanced HPC systems.

As the use of HPC expands into more fields and among more researchers, what are the policies for providing access to the necessary computing resources? The Federal Government needs to develop a comprehensive analysis of the requirements of scientific researchers for high performance computing, Federal policies of support for scientific computing, and the variety of Federal and State/private computing facilities available for research.

We expect that OTA's final report will contribute to this analysis from a congressional perspective. However, the executive branch, including both lead agencies and OSTP, also needs to participate actively in this policy and planning process.

THE INTERNATIONAL
ENVIRONMENT

Since some of the policy debate over HPCs has involved comparison with foreign programs, this section includes a brief description of the status of HPC in some other nations.

Japan

The Ministry of International Trade and Industry (MITI), in October of 1981, announced the undertaking of two computing projects: one on artificial intelligence, the Fifth Generation Computer Project, and one on supercomputing, the National Superspeed Computer Project. The publicity surrounding MITI's announcement focused on fifth generation computers, but brought the more general subject of supercomputing to public attention. (The term "Fifth Generation" refers to computers specially designed for artificial intelligence applications, especially those that involve logical inference or "reasoning.")

Although in the eyes of many scientists the Fifth Generation project has fallen short of its original goals, eight years later it has produced some accomplishments in hardware architecture and artificial intelligence software. MITI's second project, dealing with supercomputers, has been more successful. Since 1981, when no supercomputers were manufactured by the Japanese, three companies have designed and produced supercomputers.

The Japanese manufacturers followed the Americans into the supercomputer market, yet in the short time since their entrance, late 1983 for Hitachi and Fujitsu, they have rapidly gained ground in HPC hardware. One company, NEC, has recently announced a supercomputer with processor speeds up to eight times faster than the present fastest American machine.10 Outside of the United States, Japan is the single biggest market for and supplier of supercomputers, although American supercomputer companies account for less than one-fifth of all supercomputers sold in Japan.11

In the present generation of supercomputers, U.S. machines have some advantages. One of the American manufacturers' major advantages is the availability of scientific applications software. The Japanese lag behind the Americans in software development, although resources are being devoted to research in software by the Japanese manufacturers and government, and there is no reason to think they will not be successful.

Another area in which American firms differ from the Japanese has been in their use of multiprocessor architecture (although this picture is now changing). For several years, American supercomputer companies have been designing machines with multiple processors to obtain speed. The only Japanese supercomputer that utilizes multiprocessors is the NEC system, which will not be available until the fall of 1990.
10 The NEC machine is not scheduled for delivery until 1990, at which time faster Cray computers may well be on the market also. See also the comments above about computer speed.
American firms have been active in the Japanese market, with mixed success.

Since 1979 Cray has sold 16 machines in Japan. Of the 16 machines, 6 went to automobile manufacturers, 2 to NTT, 2 to Recruit, 1 to MITI, 1 to Toshiba, 1 to Aichi Institute of Technology, and 1 to Mitsubishi Electric. None have gone to public universities or to government agencies.

IBM offers its 3090 with attached vector facilities. IBM does not make public its customer list, but reports that it has sold around 70 vector processor computers to Japanese clients. Some owners, or soon-to-be owners, include Nissan, NTT, Mazda, Waseda University, Nippon Steel, and Mitsubishi Electric.

ETA sold two supercomputers in Japan. The first was to the Tokyo Institute of Technology (TIT). The sale was important because it was the first sale of a CDC/ETA supercomputer to the Japanese as well as the first purchase of an American supercomputer by a Japanese national university. This machine was delivered late (it arrived in May of 1988) and had many operating problems, partially due to its being the first installation of an eight-processor ETA 10-E. The second machine was purchased (not delivered) on February 9, 1989 by Meiji University. How CDC will deal with the ETA 10 at TIT in light of the closure of ETA is unknown at this time.

Hitachi, Fujitsu, and NEC, the three Japanese manufacturers of supercomputers, are among the largest computer/electronics companies in Japan, and they produce their own semiconductors. Their size allows them to absorb the high initial costs of designing a new supercomputer, as well as to provide large discounts to customers. Japan's technological lead is in its very fast single-vector processors. Little is known, as yet, about what is happening with parallel processing in Japan, although NEC's recent product announcement for the SX-X states that the machine will have multiprocessors.

Hitachi's supercomputer architecture is loosely based on its IBM compatible mainframe. Hitachi entered the market in November of 1983. Unlike its domestic rivals, Hitachi has not entered the international market. All 29 of its ordered/installed supercomputers are located in Japan.

NEC's current supercomputer architecture is not based on its mainframe computer, and it is not IBM compatible. NEC entered the supercomputer market later than Hitachi and Fujitsu. Three NEC supercomputers have been sold or installed in foreign markets: an SX-2 machine at the Houston Area Research Center in the United States; one at the Laboratory of Aerospace Research in the Netherlands; and an SX-1 recently sold in Singapore. NEC's domestic users include five universities.

On April 10, 1989, in a joint venture with Honeywell Inc., NEC announced a new line of supercomputers, the SX-X. The most powerful machine is reported to be up to eight times faster than the Cray X-MP machine. The SX-X reportedly will run Unix-based software and will have multiprocessors. This machine is due to be shipped in the fall of 1990.

Fujitsu's supercomputer, like Hitachi's, is based on its IBM compatible mainframes. Its first machine was delivered in late 1983. Fujitsu had sold 80 supercomputers in Japan by mid-1989. An estimated 17 machines have been sold to foreign customers. An Amdahl VP-200 is used at the Western Geophysical Institute in London. In the United States, the Norwegian company GECO, located in Houston, has a VP-200 and two VP-100s. The most recent sale was to the Australian National University, a VP-100.

Europe

European countries that have (or have ordered) supercomputers include West Germany, France, England, Denmark, Spain, Norway, the Netherlands, Italy, Finland, Switzerland, and Belgium. Europe is catching up quickly with America and Japan in understanding the importance of high performance computing for science and industry. The computer industry is helping to stimulate European interest. For example, IBM has pledged $40 million towards a supercomputer initiative in Europe over the 2-year period between 1987 and 1989. It is creating a large base of followers in the European academic community by participating in such programs as the European Academic Supercomputing Initiative (EASI) and the Numerically Intensive Computing Enterprise (NICE). Cray Research also has a solid base in

academic Europe, supplying over 14 supercomputers to European universities.

The United Kingdom began implementing a high performance computing plan in 1985. The Joint Working Party on Advanced Research Computing's report in June of 1985, "Future Facilities for Advanced Research Computing," recommended a national facility for advanced research computing. This center would have the most powerful supercomputer available; upgrade the United Kingdom's networking system, JANET, to ensure communications to remote users; and house a national organization for advanced research computing to promote collaboration with foreign countries and within industry,12 ensuring the effective use of these resources. Following this report, a Cray X-MP/48 was installed at the Atlas Computer Center in Rutherford. A Cray 1s was installed at the University of London. Between 1986 and 1989, some $11.5 million was spent on upgrading and enhancing JANET.13

Alvey was the United Kingdom's key information technology R&D program. The program promoted projects in information technology undertaken jointly by industry and academics. The United Kingdom began funding the Alvey program in 1983. During the first 5 years, 350 million pounds were allocated to the Alvey program. The program was eliminated at the end of 1988. Some research was picked up by other agencies, and many of the projects that were sponsored by Alvey are now submitting proposals to Esprit (see below).

The European Community began funding the European Strategic Programme for Research in Information Technology (Esprit) in 1984, partly as a reaction to the poor performance of the European Economic Community in the information technology market and partly as a response to MITI's 1981 computer programs. The program, funded by the European Community (EC), intends to "provide the European IT industry with the key components of technology it needs to be competitive on the world markets within a decade."14 The EC has designed a program that forces collaboration between nations, develops recognizable standards in the information technology industry, and promotes pre-competitive R&D. The R&D focuses on five main areas: microelectronics, software development, office systems, computer integrated manufacturing, and advanced information.

Phase I of Esprit, the first 5 years, received $3.88 billion in funding.15 The funding was split 50-50 by the EC and its participants. This was considered the catch-up phase. Emphasis was placed on basic research, with the expectation that marketable goods would follow. Many of the companies that participated in Phase I were small experimental companies.

Phase II, which begins in late 1989, is called commercialization. Marketable goods will be the major emphasis of Phase II. This implies that the larger firms will be the main industrial participants, since they have the capital needed to put a product on the market. The amount of funds for Phase II will be determined by the world environment in information technology and the results of Phase I, but has been estimated at around $4.14 billion.16

Almost all of the high performance computer technologies emerging from Europe have been based on massively parallel architectures. Some of Europe's parallel machines incorporate the transputer. Transputer technology (basically a computer on a chip) is based on high density VLSI (very large-scale integration) chips. The T800, Inmos's transputer, has the same power as Intel's 80386/80387 chip pair, the difference being in size and price: the transputer is about one-third the size and price of Intel's chips.17 The transputer, created by the Inmos company, had its initial R&D funded by the British government. Eventually Thorn EMI bought Inmos and the rights to the transputer. Thorn EMI recently sold Inmos to a French-Italian joint venture company, SGS-Thomson, just as it was beginning to be profitable.
12 "Future Facilities for Advanced Research Computing," the report of a Joint Working Party on Advanced Research Computing, United Kingdom, July 1985.
13 Discussion paper on "Supercomputers in Australia," Department of Industry, Technology and Commerce, April 1988, pp. 14-15.
14 "Esprit," Commission of the European Communities, p. 5.
15 "Esprit," Commission of the European Communities, p. 21.
16 Simon Perry, "European Esprit Effort Breaks Ground in Software Standards," Electronic Business, Aug. 15, 1988, pp. 90-91.
17 Graham K. Ellis, "Transputers Advance Parallel Processing," Research and Development, March 1989, p. 50.

Some of the more notable high performance computer products and R&D efforts in Europe include:

● T.Node, formerly called Supernode P1085, is one of the more successful endeavors of the Esprit program. T.Node is a massively parallel machine that exploits the Inmos T800 transputer. A single node is composed of 16 transputers connected by two NEC VLSI chips and two additional transputers. The participants in the project are the University of Southampton, the Royal Signals and Radar Establishment, Thorn EMI (all British), and the French firm Telmat. The prototype of the French T.Node, Marie, a massively parallel MIMD (multiple instruction, multiple data) computer, was delivered in April of 1988. The product is now being marketed in America.

● Project 415, also funded by Esprit, has Philips, the Dutch electronics group, as its project leader. This project, which consists of six groups, focuses on symbolic computation, artificial intelligence (AI), rather than the "number crunching" (mathematical operations) of conventional supercomputers. Using parallel architectures, the project is developing operating systems and languages that the participants hope will be available in 5 years for the office environment.18

● The Flagship project, originally sponsored by the Alvey program, has created a prototype parallel machine using 15 processors. Its original participants were ICL, Imperial College, and the University of Manchester. Other Alvey projects worked with the Flagship project in designing operating systems and languages for the computer. By 1992 the project hopes to have a marketable product. Since cancellation of the Alvey program, Flagship has gained sponsorship from the Esprit program.

● The Suprenum Project of West Germany, with the help of the French Isis program, is currently creating machinery with massively parallel architecture. The parallelism, based on Intel's 80386 microprocessors, makes it one of Esprit's more controversial and ambitious projects. Originally the project was sponsored by the West German government in its supercomputing program. A computer prototype was recently shown at the industry fair in Hanover. It will be marketed in Germany by the end of the year for around $14 million.

● The Supercluster, produced and manufactured by Parsytec GmbH, a small private company, exemplifies Silicon Valley-style initiative occurring in West Germany. Parsytec has received some financial backing from the West German government for its venture. This start-up firm sells a massively parallel machine that rivals superminicomputers or low-end supercomputers. The Supercluster architecture exploits the 32-bit transputer from Inmos, the T800. Sixteen transputer-based processors in clusters of four are linked together. This architecture is less costly than conventional machines, costing between $230,000 and $320,000.19 Parsytec has just begun to market its product in America.

Other Nations

The Australian National University recently purchased a Fujitsu VP-100. A private service bureau in Australia, Leading Edge, possesses a Cray Research computer. At least two sites in India have supercomputers, one at the Indian Meteorological Centre and one at ISC University. Two Middle Eastern petroleum companies house supercomputers, and Korea and Singapore both have research institutes with supercomputers.

Over half a dozen Canadian universities have high performance computers from CDC, Cray Research, or IBM. Canada's private sector has also invested in supercomputers; around 10 firms possess high performance computers. The Alberta government, aside from purchasing a supercomputer and supporting associated services, has helped finance Myrias Computer Corp. A wholly owned U.S. subsidiary, Myrias Research Corp., manufactures the SP-2, a minisupercomputer.

One newly industrialized country is reported to be developing a minisupercomputer of its own. The

18 "European Transputer-based Projects Pose Challenge to U.S. Supercomputing Supremacy," Supercomputing Review, November 1988, pp. 8-9.
19 John Gosch, "A New Transputer Design From West German Startup," Electronics, Mar. 3, 1988, pp. 71-72.

first Brazilian minisupercomputer, claimed to be capable of 150 mips, is planned to be available by the end of 1989. The prototype is a parallel machine with 64 processors, each with 32-bit capacity. The machine will sell for $2.5 million. The Funding Authority of Studies and Projects (FINEP) financed the project, with annual investment of around $1 million.

Chapter 3
Networks

Information is the lifeblood of science; communication of that information is crucial to the advance of research and its applications. Data communication networks enable scientists to talk with each other, access unique experimental data, share results and publications, and run models on remote supercomputers, all with a speed, capacity, and ease that make possible the posing of new questions and the prospect of new answers. Networks ease research collaboration by removing geographic barriers. They have become an invaluable research tool, opening up new channels of communication and increasing access to research equipment and facilities. Most important, networking is becoming the indispensable foundation for all other use of information technology in research.

Research networking is also pushing the frontiers of data communications and network technologies. Like electric power, highways, and the telephone, data communications is an infrastructure that will be crucial to all sectors of the economy. Businesses demand on-line transaction processing, and financial markets run on globally networked electronic trading. The evolution of telephony to digital technology allows the merging of voice, data, and information services networking, although voice circuits still dominate the deployment of the technology. Promoting scientific research networking—dealing with data-intense outputs like satellite imaging and supercomputer modeling—should push networking technology that will find application far outside of science.

Policy action is needed if Congress wishes to see the evolution of a full-scale national research and education network. The existing "internet" of scientific networks is a fledgling. As this conglomeration of networks evolves from an R&D enterprise to an operational network, users will demand round-the-clock, high-quality service. Academics, policymakers, and researchers around the world agree on the pressing need to transform it into a permanent infrastructure. This will entail grappling with difficult issues of public and private roles in funding, management, pricing/cost recovery, access, security, and international coordination, as well as assuring adequate funding to carry out initiatives that are set by Congress.

Research networking faces two particular policy complications. First, since the network in its broadest form serves most disciplines, agencies, and many different groups of users, it has no obvious lead champion. As a common resource, its potential sponsors may each be pleased to use it but unlikely to give it the priority and funding required to bring it to its full potential. There is a need for clear central leadership, as well as coordination of governments, the private sector, and universities. A second complication is a mismatch between the concept of a national research network and the traditionally decentralized, subsidized, mixed public-private nature of higher education and science. The processes and priorities of mission agency-based Federal support may need some redesigning, as they are oriented towards supporting ongoing mission-oriented and basic research, and may work less well at fostering large-scale scientific facilities and infrastructure that cut across disciplines and agency missions.

In the near term, the most important step is getting a widely connected, operational network in place. But the "bare bones" networks are a small part of the picture. Information that flows over the network, and the scientific resources and data available through the network, are the important payoffs. Key long-term issues for the research community will be those that affect the sort of information available over the network, who has access to it, and how much it costs. The main issue areas for scientific data networking are outlined below:

● research—developing the technology required to transmit and switch data at very high rates;
● private sector participation—the role of the common carriers and telecommunication companies in developing and managing the network, and of private information firms in offering services;
● scope—who the network is designed to serve will drive its structure and management;
● access—balancing open use against security and information control, and determining who will be able to gain access to the network for what purpose;

● standards—the role of government, industry, users, and international organizations in setting and maintaining technical standards;
● management—public and private roles; degree of decentralization;
● funding—an operational network will require significant, stable, continuing investment; the financial responsibilities demarcated must reflect the stake of the various players, from individual colleges through the States and the Federal Government, in network operations and policies;
● economics—pricing and cost recovery for network use, central to the evolution and management of any infrastructure. Economics will drive the use of the network;
● information services—who will decide what types of services are to be allowed over the network, and who is allowed to offer them; and who will resolve information issues such as privacy, intellectual property, fair competition, and security;
● long-term science policy issues—the networks' impacts on the process of science, and on access to and dissemination of valuable scientific and technical information.

THE NATIONAL RESEARCH AND
EDUCATION NETWORK (NREN)

"A universal communications network connected to national and international networks enables electronic communication among scholars anywhere in the world, as well as access to worldwide information sources, special experimental instruments, and computing resources. The network has sufficient bandwidth for scholarly resources to appear to be attached to a world local area network."
EDUCOM, 1988.

". . . a national research network to provide a distributed computing capability that links the government, industry, and higher education communities."
OSTP, 1987.

"The goal of the National Research and Education Network is to enhance national competitiveness and productivity through a high-speed, high-quality network infrastructure which supports a broad set of applications and network services for the research and instructional community."
EDUCOM/NTTF, March 1989.

"The NREN will provide high-speed communication access to over 1300 institutions across the United States within five years. It will offer sufficient capacity, performance, and functionality so that the physical distance between institutions is no longer a barrier to effective collaboration. It will support access to high-performance computing facilities and services . . . and advanced information sharing and exchange, including national file systems and online libraries . . . the NREN will evolve toward fully supported commercial facilities that support a broad range of applications and services."
FRICC, Program Plan for the NREN, May 23, 1989.

This chapter of the background paper reviews the status of and issues surrounding data networking for science, in particular the proposed NREN. It describes current Federal activities and plans, and identifies issues to be examined in the full report, to be completed in summer 1990.

The existing array of scientific networks consists of a hierarchy of local, regional, and national networks, linked into a whole. In this paper, "NREN" will be used to describe the next generation of the national "backbone" that ties them together. The term "Internet" is used to describe a more specific set of interconnected major networks, all of which use the same data transmission protocols; the most important are NSFNET and its major regional subnetworks, ARPANET, and several other federally initiated networks such as ESNET and NASNET. The term internet is used fairly loosely: at its broadest, the more generic term internet can be used to describe the international conglomeration of networks, with a variety of protocols and capabilities, that have a gateway into the Internet; this could include such things as BITNET and MCI Mail.

The Origins of Research Networking

Research users were among the first to link computers into networks, to share information and broaden remote access to computing resources. DARPA created ARPANET in the 1960s for two purposes: to advance networking and data communications R&D, and to develop a robust communications network that would support the data-rich conversations of computer scientists.

Building on the resulting packet-switched network technology, other agencies developed specialized networks for their research communities (e.g., ESNET, CSNET, NSFNET). The telecommunications and electronics industries provided technology and capacity for these networks, but they were not policy leaders or innovators of new systems. Meanwhile, other research-oriented networks, such as BITNET and Usenet, were developed in parallel by academic and industry users who, not being grantees or contractors of Federal agencies, were not served by the agency-sponsored networks. These university and lab-based networks serve a relatively small number of specialized scientific users, a market that has been ignored by the traditional telecommunications industry. The networks sprang from the efforts of users—academic and other research scientists—and the Federal managers who were supporting them.1

The Growing Demand for Capability and Connectivity

Today there are thousands of computer networks in the United States. These networks range from temporary linkages between modem-equipped2 desktop computers linked via common carriers, to institution-wide area networks, to regional and national networks. Network traffic moves through different media, including copper wire and optical cables, signal processors and switches, satellites, and the vast common carrier system developed for voice communication. Much of this hodgepodge of networks has been linked (at least in terms of ability to interconnect) into the internet. The ability of any two systems to interconnect depends on their ability to recognize and deal with the form information flows take in each. These "protocols" are sets of technical standards that, in a sense, are the "languages" of communication systems. Networks with different protocols can often be linked together by computer-based "gateways" that translate the protocols between the networks (a schematic sketch of such a gateway appears below).

National networks have partially coalesced, where technology allows cost savings without losing connectivity. Over the past years, several agencies have pooled funds and plans to support a shared national backbone. The primary driver for this interconnecting and coalescing of networks has been the need for connectivity among users. The power of the whole is vastly greater than the sum of the pieces. Substantial costs are saved by extending connectivity while reducing duplication of network coverage. The real payoff is in connecting people, information, and resources. Linking brings users in reach of each other. Just as telephones would be of little use if only a few people had them, a research and education network's connectivity is central to its usefulness; this connectivity comes both from the ability of each network to reach the desks, labs, and homes of its users and from the extent to which the various networks are, themselves, interconnected.

The Present NREN

The national research and education network can be viewed as four levels of increasingly complex and flexible capability:

● physical wire/fiberoptic common carrier "highways";
● user-defined, packet-switched networks;
● basic network operations and services; and
● research, education, database, and information services accessible to network users.

In a fully developed NREN, all of these levels of service must be integrated. Each level involves different technologies, services, research opportunities, engineering requirements, clientele, providers, regulators, and policy issues. A more detailed look at the policy problems can be drawn by separating the NREN into its major components.
1 John S. Quarterman and Josiah C. Hoskins, "Notable Computer Networks," Communications of the ACM, vol. 29, No. 10, October 1986, pp. 932-971; John S. Quarterman, The Matrix: Networks Around the World, Digital Press, August 1989.
2 A "modem" converts information in a computer to a form that a communication system can carry, and vice versa. It also automates some simple functions, such as dialing and answering the phone, and detecting and correcting transmission errors.
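To make the protocol "gateway" idea described above concrete, the following is a minimal sketch in Python. Both message formats and all field names are invented for illustration; they do not correspond to the actual protocols of ARPANET, NSFNET, or any other network discussed in this paper.

    import json

    def parse_colon_protocol(raw: str) -> dict:
        """Parse a message in the first (invented) format,
        e.g. 'TO:alice;FROM:bob;BODY:hello'."""
        fields = {}
        for part in raw.split(";"):
            key, _, value = part.partition(":")
            fields[key.strip().upper()] = value
        return fields

    def emit_json_protocol(fields: dict) -> str:
        """Re-emit the same message in the second (equally invented) format."""
        return json.dumps({
            "recipient": fields.get("TO", ""),
            "sender": fields.get("FROM", ""),
            "payload": fields.get("BODY", ""),
        })

    def gateway(raw: str) -> str:
        """Translate one message from the first network's protocol
        to the second's; this translation step is the essence of a gateway."""
        return emit_json_protocol(parse_colon_protocol(raw))

    if __name__ == "__main__":
        print(gateway("TO:alice;FROM:bob;BODY:run model on remote supercomputer"))

A real gateway must also handle addressing, routing, and error recovery across the two networks, which is why gateway engineering and protocol standardization recur as policy issues throughout this chapter.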

Level 1: Physical wire/fiber optic common carrier highways

The foundation of the network is the physical conduits that carry digital signals. These telephone wires, optical fibers, microwave links, and satellites are the physical highways and byways of data transit. They are invisible to the network user. To provide the physical skeleton for the internet, government, industry, and university network managers lease circuits from public switched common carriers, such as AT&T, MCI, GTE, and NTN. In doing so they take advantage of the large system of circuits already laid in place by the telecommunications common carriers for other telephony and data markets. A key issue at this level is to what extent broader Federal agency and national telecommunications policies will promote, discourage, or divert the evolution of a research-oriented data network.

Level 2: User-defined subnetworks

The internet is a conglomeration of smaller foreign, regional, State, local, topical, private, government, and agency networks. Generally, these separately managed networks, such as SURANET, BARRNET, BITNET, and EARN, evolved along naturally occurring geographic, topical, or user lines, or mission agency needs. Most of these logical networks emerged from Federal research agency (including the Department of Defense) initiatives. In addition, there are more and more commercial, State and private, regional, and university networks (such as Accunet, Telenet, and Usenet), at once specialized and interlinked. Many have since linked through the Internet, while keeping to some extent their own technical and socioeconomic identity. This division into small, focused networks offers the advantage of keeping network management close to its users, but demands standardization and some central coordination to realize the benefits of interconnection.

Networks at this level of operations are distinguished by independent management and technical boundaries. Networks often have different standards and protocols, hardware, and software. They carry information of different sensitivity and value. The diversity of these logical subnetworks matters to institutional subscribers (who must choose among network offerings), to regional and national network managers (who must manage and coordinate these networks into an internet), and to users (who can find the variety of alternatives confusing and difficult to deal with). A key issue is the management relationship among these diverse networks: to what extent is standardization and centralization desirable?

Level 3: Basic network operations and services

A small number of basic maintenance tools keeps the network running and accessible by diverse, distributed users. These basic services are software-based, provided for the users by network operators and computer manufacturers in operating systems. They include software for password recognition, electronic mail, and file transfer. These are core services necessary to the operation of any network. These basic services are not consistent across the current range of computers used by researchers. A key issue is to what extent these services should be standardized and, as important, who should make those decisions.

Level 4: Value-added superstructure: links to research, education, and information services

The utility of the network lies in the information, services, and people that the user can access through the network. These value-added services provide specialized tools, information, and data for research and education. Today they include specialized computers and software, library catalogs and publication databases, archives of research data, conferencing systems, and electronic bulletin boards and publishing services that provide access to colleagues in the United States and abroad. These information resources are provided by volunteer scientists and by non-profit, for-profit, international, and government organizations. Some are amateur, poorly maintained bulletin boards; others are mature information organizations with well-developed services. Some are "free"; others recover costs through user charges.

Core policy issues are the appropriate roles for various information providers on the network. If the network is viewed as public infrastructure, what is "fair" use of this infrastructure? If the network eases access to sensitive scientific data (whether raw research data or government regulatory databases), how will this stress the policies that govern the relationships of industry, regulators, lobbyists, and experts? Should profit-seeking companies be allowed to market their services? How can we ensure that technologies needed for network maintenance, cost accounting, and monitoring will not be used inappropriately or intrusively? Who should set prices for various users and services? How will intellectual property rights be structured for electronically available information? Who is responsible for the quality and integrity of the data provided and used by researchers on the network?

Research Networking as a Strategic
High Technology Infrastructure

Research networking has dual roles. First, networking is a strategic, high technology infrastructure for science. More broadly applied, data networking enables research, education, business, and manufacturing, and improves the Nation's knowledge competitiveness. Second, networking technologies and applications are themselves a substantial growth area, meriting focused R&D.

Knowledge is the commerce of education and research. Today networks are the highways for information and ideas. They expand access to computing, data, instruments, the research community, and the knowledge it creates. Data are expensive (relative to computing hardware) and are increasingly created in many widely distributed locations, by specialized instruments and enterprises, and then shared among many separate users. The more effectively research information is disseminated to other researchers and to industry, the more effective is scientific progress and the social application of technological knowledge. An internet of networks has become a strategic infrastructure for research.

The research networks are also a testbed for data communications technology. Technologies developed through the research networks are likely to enhance the productivity of all economic sectors, not just university research. The federally supported Internet has not only sponsored frontier-breaking network research, but has pulled data-networking technology with it. ARPANET catalyzed the development of packet-switching technology (sketched below), which has expanded rapidly from R&D networking to multibillion-dollar data handling for business and financial transactions. The generic technologies developed for the Internet—hardware (such as high-speed switches) and software for network management, routing, and user interfaces—will transfer readily into general data-networking applications. Government support for applied research can catalyze and integrate R&D, decrease risk, create markets for network technologies and services, transcend economic and regulatory barriers, and accelerate early technology development and deployment. This would not only bolster U.S. science and education, but would fuel industry R&D and help support the market and competitiveness of the U.S. network and information services industry.

Governments and private industries the world over are developing research networks, to enhance R&D productivity and to create testbeds for highly advanced communications services and technologies. Federal involvement in infrastructure is motivated by the need for coordination and nationally oriented investment, to spread financial burdens, and to promote social policy goals (such as furthering basic research).3 Nations that develop markets in network-based technologies and services will create information industry-based productivity growth.

Federal Coordination of the Evolving Internet

NREN plans have evolved rapidly. Congressional interest has grown; in 1986, Congress requested the Office of Science and Technology Policy (OSTP) to report on options for networking for research and supercomputing.4 The resulting report, completed in 1987 by the interagency Federal Coordinating Council for Science, Engineering, and Technology (FCCSET), called for a new Federal program to create an advanced national research network by the year 2000.5 This vision incorporated two objectives: 1) providing vital computer-communications network services for the Nation's academic research community, and 2) stimulating networking and communications R&D which would fuel U.S. industrial technology and commerce in the growing global data communications market.

The 1987 FCCSET report, building on ongoing Federal activities, addressed near-term questions over the national network's scope, purposes, agency authority, performance targets, and budget. It did not resolve issues surrounding the long-term operation of a network, the role of commercial services in providing network operations and services, or the interface with broader telecommunications policies.
3 Congressional Budget Office, New Directions for the Nation's Public Works, September 1988, p. xiii; CBO, Federal Policies for Infrastructure Management, June 1986.
4 P.L. 99-383, Aug. 21, 1986.
5 OSTP, A Research and Development Strategy for High Performance Computing, Nov. 20, 1987.
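The packet-switching technique referred to above, ARPANET's principal technical legacy, can be sketched in the same illustrative spirit. In the minimal Python fragment below, the packet size, the sample message, and the header layout (a bare sequence number) are invented for the example; no real protocol such as TCP/IP is modeled.

    import random

    PACKET_SIZE = 16  # bytes of payload per packet; an invented figure

    def packetize(message: bytes) -> list:
        """Split a message into (sequence number, payload) packets."""
        return [
            (seq, message[i:i + PACKET_SIZE])
            for seq, i in enumerate(range(0, len(message), PACKET_SIZE))
        ]

    def reassemble(packets: list) -> bytes:
        """Restore the original message regardless of arrival order,
        by sorting packets on their sequence numbers."""
        return b"".join(payload for _, payload in sorted(packets))

    if __name__ == "__main__":
        text = b"Networks ease research collaboration by removing geographic barriers."
        packets = packetize(text)
        random.shuffle(packets)  # simulate packets taking different routes
        assert reassemble(packets) == text
        print(len(packets), "packets delivered out of order and reassembled correctly")

The essential idea is that each packet carries enough header information to travel independently and still be reassembled at the destination, which is what allows many users to share the same circuits.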

A 1988 National Research Council report praised ongoing activities and emphasized the need for coordination, stable funding, broadened goals and design criteria, integrated management, and increased private sector involvement.6

FCCSET's Subcommittee on Networking has since issued a plan to upgrade and expand the network.7 In developing this plan, agencies have worked together to improve and interconnect several existing networks. Most regional networks were joint creations of NSF and regional consortia, and have been part of the NSFNET world since their inception. Other quasi-private, State, and regional networks (such as CICNET, Inc., and CERFNET) have been started.

Recently, legislation has been reintroduced to authorize and coordinate a national research network.8 As now proposed, a National Research and Education Network would link universities, national laboratories, non-profit institutions and government research organizations, private companies doing government-supported research and education, and facilities such as supercomputers, experimental instruments, databases, and research libraries. Network research, as a joint endeavor with industry, would create and transfer technology for eventual commercial exploitation, and serve the data-networking needs of research and higher education into the next century.

Players in the NREN

The current Internet has been created by Federal leadership and funding, pulling together a wide base of university commitment, national lab and academic expertise, and industry interest and technology. The NREN involves many public and private actors, and their roles must be better delineated for effective policy. Each of these actors has vested interests and spheres of capability. Key players are:

● universities, which house most end users;
● the networking industry: the telecommunications, data communications, computer, and information service companies that provide networking technologies and services;
● State enterprises devoted to economic development, research, and education;
● industrial R&D labs (network users); and
● the Federal Government, primarily the national labs and research-funding agencies.

Federal funding and policy have stimulated the development of the Internet. Federal initiatives have been well complemented by States (through funding State networking and State universities' institutional and regional networking), universities (by funding campus networking), and industry (by contributing networking technology and physical circuits at sharply reduced rates). End users have experienced a highly subsidized service during this "experimental" stage. As the network moves to a bigger, more expensive, more established operation, how might these relative roles change?

Universities

Academic institutions house teachers, researchers, and students in all fields. Over the past few decades universities have invested heavily in libraries, local computing, campus networks, and regional network consortia. The money invested in campus networking far outweighs the investment in the NSFNET backbone. In general, academics view the NREN as the fulfillment of a longstanding ambition to build a national system for the transport of information for research and education. EDUCOM has long labored from the "bottom" up, bringing together researchers and educators who used networks (or believed they could use them) for both research and teaching.

Networking Industry

There is no simple unified view of the NREN in the fragmented telecommunications "industry." The long-distance telecommunications common carriers generally see the academic market as too specialized and risky to offer much of a profit opportunity.
6 National Research Council, Toward a National Research Network (Washington, DC: National Academy Press, 1988), especially pp. 25-37.
7 FCCSET (Federal Coordinating Council for Science, Engineering, and Technology), The Federal High Performance Computing Program (Washington, DC: OSTP, Sept. 8, 1989).
8 S. 1067, "The National High-Performance Computer Technology Act of 1989," introduced by Mr. Gore in May 1989; hearings were held on June 21, 1989. H.R. 3131, "The National High-Performance Computer Technology Act of 1989," introduced by Mr. Walgren.
However, companies have gained early experience with new technologies and applications by participating in university R&D; it is for this reason that industry has jointly funded the creation and development of NSFNET.

Various specialized value-added common carriers offer packet-switched services. They could in principle provide some of the same services that the NREN would provide, such as electronic mail. They are not, however, designed to meet the capacity requirements of researchers, such as transferring vast files of supercomputer-generated visualizations of weather systems, simulated airplane test flights, or econometric models. Nor can common carriers provide the "reach" to all researchers.

States

The interests of States in research, education, and economic development parallel Federal concerns. Some States have also invested in information infrastructure development. Many States have invested heavily in education and research networking, usually based in the State university system and encompassing, to varying degrees, private universities, State government, and industry. The State is a "natural" political boundary for network financing. In some States, such as Alabama, New York, North Carolina, and Texas, special initiatives have helped create statewide networks.

Industry Users

There are relatively few industry users of the internet; most are very large R&D-intensive companies such as IBM and DEC, or small high-technology companies. Many large companies have internal business and research networks which link their offices and laboratories within the United States and overseas; many also subscribe to commercial services such as MCI Mail. However, these proprietary and commercial networks do not provide the internet's connectivity to scientists or the high bandwidth and services so useful for research communications. Like universities and national labs, companies are a part of the Nation's R&D endeavor; and being part of the research community today includes being "on" the internet. Appropriate industry use of the NREN should encourage interaction of industry, university, and government researchers, and foster technology transfer. Industry internet users bring with them their own set of concerns, such as cost accounting, proper network use, and information security. Other non-R&D companies, such as business analysts, also are likely to seek direct network connectivity to universities, government laboratories, and R&D-intensive companies.

Federal

Three strong rationales drive a substantial, albeit changing, Federal involvement: support of mission and basic science, coordination of a strategic national infrastructure, and promotion of data-networking technology and industrial productivity. Another more modest goal is to rationalize duplication of effort by integrating, extending, and modernizing existing research networks. That is in itself quite important in the present Federal budgetary environment. The international nature of the network also demands a coherent national voice in international telecommunications standardization. The Internet's integration with foreign networks also justifies Federal concern over the international flow of militarily or economically sensitive technical information. The same university-government-industry linkages on a domestic scale drive Federal interests in the flow of information.

Federal R&D agencies' interest in research networking is to enhance their external research support missions. (Research networking is a small, specialized part of agency telecommunications. It is designed to meet the needs of the research community, rather than agency operations and administrative telecommunications, which are addressed in FTS 2000.) The hardware and software communications technologies involved should be of broad commercial importance. The NREN plans reflect national interest in bolstering a serious R&D base and a competitive industry in advanced computer communications.

The dominance of the Federal Government in network development means that Federal agency interests have strongly influenced its form and shape. Policies can reflect Federal biases; for instance, the limitation of access to the early ARPANET to ARPA contractors left out many academics, who consequently created their own grass-roots, lower-capability BITNET.
International actors are also important. As with the telephone system, the internet is inherently international. These links require coordination, for example for connectivity standards, higher level network management, and security. This requirement implies the need for Federal-level management and policy.

The NREN in the International Telecommunications Environment

The nature and economics of an NREN will depend on the international telecommunications context in which it develops. Research networks are a leading edge of digital network technologies, but are only a tiny part of the communications and information services markets.

The 1990s will be a predominantly digital world; historically different computing, telephony, and business communications technologies are evolving into new information-intensive systems. Digital technologies are promoting systems and market integration. Telecommunications in the 1990s will revolve around flexible, powerful, "intelligent" networks. However, regulatory change and uncertainty, market turbulence, international competition, the explosion in information services, and significant changes in foreign telecommunications policies all are making telecommunications services more turbulent. This will cloud the research network's long-term planning.

High-bandwidth, packet-switched networking is at present a young market in comparison to commercial telecommunications. Voice overwhelmingly dominates other services (e.g., fax, e-mail, on-line data retrieval). While flexible, hybrid voice-data services are being introduced in response to business demand for data services, the technology base is optimized for voice telephony.

Voice communications brings to the world of computer telecommunications complex regulatory and economic baggage. Divestiture of the AT&T regulated monopoly opened the telecommunications market to new entrants, who have slowly gained long-haul market share and offered new technologies and information services. In general, however, the post-divestiture telecommunications industry remains dominated by the descendants of old AT&T, and most of the impetus for service innovation comes from the voice market. One reason is uncertainty about the legal limits, for providing information services, imposed on the newly divested companies. (In comparison, the computer industry has been unregulated. With the infancy of the technology, and open markets, computer R&D has been exceptionally productive.) A crucial concern for long-range NREN planning is that scientific and educational needs might be ignored among the regulations, technology priorities, and economics of a telecommunications market geared toward the vast telephone customer base.

POLICY ISSUES

The goal is clear; but the environment is complex, and the details will be debated as the network evolves.

There is substantial agreement in the scientific and higher education community about the pressing national need for a broad-reaching, broad-bandwidth, state-of-the-art research network. The existing Internet provides vital communication, research, and information services, in addition to its concomitant role in pushing networking and data handling technology. Increasing demand on network capacity has quickly saturated each network upgrade. In addition, the fast-growing demand is overburdening the current informal administrative arrangements for running the Internet. Expanded capability and connectivity will require substantial budget increases. The current network is adequate for broad e-mail service and for more restricted file transfer, remote logon, and other sophisticated uses. Moving to gigabit bandwidth, with appropriate network services, will demand substantial technological innovation as well as investment.

There are areas of disagreement and even broader areas of uncertainty in planning the future national research network. There are several reasons for this: the immaturity of data network technology, services, and markets; the Internet's nature as strategic infrastructure for diverse users and institutions; and the uncertainties and complexities of overriding telecommunications policy and economics.

First, the current Internet is, to an extent, an experiment in progress, similar to the early days of the telephone system. Technologies, uses, and potential markets for network services are still nascent.
Patterns of use are still evolving, and a reliable network has reached barely half of the research community. Future uses of the network are difficult to identify; each upgrade over the past 15 years has brought increased value and use as improved network capacity and access have made new applications feasible.

The Internet is a conglomeration of networks that grew up ad hoc. Some, such as ARPANET, CSNET, and MFENET, were high-quality national networks supported by substantial Federal funding. Other smaller networks were built and maintained by the late-night labors of graduate students and computer center operators. One of these, BITNET, has become a far-reaching and widely used university network, through the coordination of EDUCOM and the support of IBM. The Internet has since become a more coherent whole, under Federal coordination led by NSF and DARPA and advised by the Internet Activities Board. Improvements in service and connectivity have been astounding. Yet the patchwork nature of the Internet still dominates; some campus and regional networks are high quality and well maintained; others are lower speed, less reliable, and reach only a few institutions in their region. Some small networks are gatewayed into the Internet; others are not. This patchwork nature limits the effectiveness of the Internet, and argues for better planning and stronger coordination.

Second, the network is a strategic infrastructure, with all the difficulties in capitalizing, planning, financing, and maintaining that seem to attend any infrastructure.9 Infrastructures tend to suffer from a "commons" problem, leading to continuing underinvestment and conflict over centralized policy. By its nature the internet has many diverse users, with diverse interests in and demands on the network. The network's value is in linking and balancing the needs of these many users, whether they want advanced supercomputer services or merely e-mail. Some users are network-sophisticated, while many users want simple, user-friendly communications. This diversity of users complicates network planning and management. The scope and offerings of the network must be at least sketched out before a management structure appropriate to the desired mission is established.

Third, the network is part of the telecommunications world, rampant with policy and economic confusion. The research community is small, with specialized data needs that are subsidiary to larger markets. It is not clear that science's particular networking needs will be met.

Planning Amidst Uncertainty

Given these three large uncertainties, there is no straightforward or well-accepted model for the "best" way to design, manage, and upgrade the future national research network. Future network use will depend on cost recovery and charging practices, about which very little is understood. These uncertainties should be accommodated in the design of network management as well as of the network itself.

One way to clarify NREN options might be to look at experiences with other infrastructures (e.g., waterways, telephones, highways) for lessons about how different financing and charging policies affect who develops and deploys technology, how fast technology develops, and who has access to the infrastructure. Additionally, some universities are beginning trials in charging for network services; these should provide experience in how various charging practices affect usage, technology deployment and upgrading, and the impacts of network use policies on research and education at the level of the institution.

Table 3-1 lists the major areas of agreement and disagreement in various "models" of the proper form of network evolution.

Network Scope and Access

Scope

Where should an NREN reach: beyond research-intensive government laboratories and universities to all institutions of higher education? High schools? Nonprofit and corporate labs? Many believe that eventually, perhaps in 20 years, de facto data networking will provide universal linkage, akin to a sophisticated phone system.
9 Congressional Budget Office, New Directions for the Nation's Public Works, September 1988; National Council on Public Works Improvement, Fragile Foundations: A Report on America's Public Works (Washington, DC, February 1988).
Table 3-1. Principal Policy Issues in Network Development

Scope and access

Major areas of agreement:
1. The national need for a broad state-of-the-art research network that links basic research, government, and higher education.

Major areas of disagreement and uncertainty:
1a. The exact scope of the NREN; whether and how to control domestic and foreign access.
1b. Hierarchy of network capability. Cost and effort limit the reach of state-of-the-art networking; an "appropriate networking" scenario would have the most intensive users on a leading edge network and less demanding users on a lower-cost network that suffices for their needs. Where should those lines be drawn, and who should draw them? How can the Federal Government ensure that the gap between leading edge and casual is not too large, and that access is appropriate and equitable?

Policy and management structure

Major areas of agreement:
2. The need for a more formal mechanism for planning and operating the NREN, to supersede and better coordinate informal interagency cooperation and ad hoc university and State participation, and for international coordination.

Major areas of disagreement and uncertainty:
2a. The form and function of an NREN policy and management authority; the extent of centralization, particularly the role of the Federal Government; the extent of participation of industry users, the networking industry, common carriers, and universities in policy and operations; mechanisms for standard setting.

Financing and cost recovery

Major areas of agreement:
3. The desirability of moving from the current "market-establishing" environment of Federal and State grants and subsidies, with services "free" to users, to more formal cost recovery, shifting more of the cost burden and financial incentives to end users.

Major areas of disagreement and uncertainty:
3a. How the transition to commercial operations and charging can and should be made; more generally, Federal-private sector roles in network policy and pricing; how pricing practices will shape access, use, and demand.

Network use

Major areas of agreement:
4. The desirability of realizing the potential of a network; the need for standards and policies to link to information services, databases, and nonresearch networks.

Major areas of disagreement and uncertainty:
4a. Who should be able to use the network for what purposes, and at what entry cost; the process of guiding the economic structure of services, subsidies, and pricing for multi-product services; intellectual property policies.

SOURCE: Office of Technology Assessment, 1989.

The appropriate breadth of the network is unlikely to be fully resolved until more user communities gain more experience with networking, and a better understanding is gained of the risks and benefits of various degrees of network coverage. A balance must be struck in network scope, which provides a small network optimized for special users (such as scientists doing full-time, computationally intensive research) and also a broader network serving more diverse users. The scope of the internet, and the capabilities of the networks encompassed in the internet, will need to balance the needs of specialized users without diluting the value for top-end and low-end users. NREN plans, standards, and technology should take into account the possibility of later expansion and integration with other networks and other communities currently not linked up. After-the-fact technical patches are usually inefficient and expensive. This may require more government participation in standard-setting to make it feasible for currently separated communities, such as high schools and universities, to interconnect later on.

Industry-academic boundaries are of particular concern. Interconnection generally promotes research and innovation. Companies are dealing with the risk of proprietary information release by maintaining independent corporate networks and by restricting access to open networks. How can funding and pricing be structured to ensure that for-profit companies bear an appropriate burden of network costs?

Access

Is it desirable to restrict access to the internet? Who should control access? Open access is desired by many, but there are privacy, security, and commercial arguments for restricting access. Restricting access is difficult, and is determined more by access controls (e.g., passwords and monitoring)
on the computers that attach users to the network than by the network itself. Study is needed on whether and how access can be controlled by technical fixes within the network, by computer centers attached to the network, by informal codes of behavior, or by laws.

Another approach is not to limit access, but to minimize the vulnerability of the network, and of its information resources and users, to accidents or malice. In comparison, essentially anyone who has a modest amount of money can install a phone, or use a public phone, or use a friend's phone, and access the national phone system. However, criminal, fraudulent, and harassing uses of the phone system are illegal. Access is unrestricted, but use is governed.

Controlling International Linkages

Science, business, and industry are international; their networks are inherently international. It is difficult to block private telecommunications links with foreign entities, and public telecommunications is already international. However, there is a fundamental conflict between the desire to capture information for national or corporate economic gain, and the inherent openness of a network. Scientists generally argue that open network access fosters scientifically valuable knowledge exchange, which in turn leads to commercially valuable innovation.

Hierarchy of Network Capability

Investment in expanded network access must be balanced continually with the upgrading of network performance. As the network is a significant competitive advantage in research and higher education, access to the "best" network possible is important. There are also technological considerations in linking networks of various performance levels and various architectures. There is already a consensus that there should be a separate testbed or research network for developing and testing new network technologies and services, which will truly be at the cutting edge (and therefore also have the weaknesses of cutting edge technology, particularly unreliability and difficulty of use).

Policy and Management Structure

Possible management models include federally chartered nonprofit corporations, single lead agencies, an interagency consortium, government-owned contractor operations, and commercial operations; precedents include the Tennessee Valley Authority, the Atomic Energy Commission, the NSF Antarctic Program, and Fannie Mae. What are the implications of various scenarios for the nature of traffic and users?

Degree of Centralization

What is the value of centralized, federally accountable management for network access control, traffic management and monitoring, and security, compared to the value of decentralized operations, open access, and open traffic? There are two key technical questions here: To what extent does network technology limit the amount of control that can be exerted over access and traffic content? To what extent does technology affect the strengths and weaknesses of centralized and decentralized management?

Mechanisms for Interagency Coordination

Interagency coordination has worked well so far, but with the scaling up of the network, more formal mechanisms are needed to deal with larger budgets and to more tightly coordinate further development.

Coordination With Other Networks

National-level resource allocation and planning must coordinate with interdependent institutional and mid-level networking (the other two legs of networking).

Mechanisms for Standard Setting

Who should set standards, when should they be set, and how overarching should they be? Standards at some common denominator level are absolutely necessary to make networks work. But excessive standardization may deter innovation in network technology, applications and services, and other standards.

Any one set of standards usually is optimal for some applications or users, but not for others. There are well-established international mechanisms for formal standards-setting, as well as strong international involvement in more informal standards
development. These mechanisms have worked well, albeit slowly. Early standard-setting by agencies and their advisers accelerated the development of U.S. networks. In many cases the early established standards have become, with some modification, de facto national and even international standards. This is proving the case with ARPANET's protocol suite, TCP/IP. However, many have complained that agencies' relatively precipitous and closed standards determination has resulted in less-than-satisfactory standards. NREN policy should embrace standards-setting. Should it, however, encourage wider participation, especially by industry, than has been the case? U.S. policy must balance the need for international compatibility with the furthering of national interests.

Financing and Cost Recovery

How can the capital and operating costs of the NREN be met? Issues include subsidies, user or access charges, cost recovery policies, and cost accounting. As an infrastructure that spans disciplines and sectors, the NREN is outside the traditional grant mechanisms of science policy. How might NREN economics be structured to meet costs and achieve various policy goals, such as encouraging widespread yet efficient use, ensuring equity of access, pushing technological development while maintaining needed standards, protecting intellectual property and sensitive information while encouraging open communication, and attracting U.S. commercial involvement and third-party information services?

Creating a Market

One of the key issues centers around the extent to which deliberate creation of a market should be built into network policy, and into the surrounding science policy system. There are those who believe that it is important that the delivery of network access and services to academics eventually become a commercial operation, and that the current Federal subsidy and apparently "free" services will get academics so used to free services that there will never be a market. How do you gradually create an information market, for networks or for network-accessible value-added services?

Funding and Charge Structures

Financing issues are akin to ones in more traditional infrastructures, such as highways and waterways. These issues, which continue to dominate infrastructure debates, are Federal-private sector roles and the structure of Federal subsidies and incentives (usually to restructure payments and access to infrastructure services). Is there a continuing role for Federal subsidies? How can university accounting, OMB Circular A-21, and cost recovery practices be accommodated?

User fees for network access are currently charged as membership/access fees to institutions. End users generally are not charged. In the future, user fees may combine access/connectivity fees and use-related fees. They may be secured via a trust fund (as is the case with national highways, inland waterways, and airports), or be returned directly to operating authorities. A few regional networks (e.g., CICNET, Inc.) have set membership/connectivity fees to recover full costs. Many fear that user fees are not adequate for full funding/cost recovery.

Industry Participation

Industry has had a substantial financial role in network development. Industry participation has been motivated by a desire to stay abreast of data-networking technology as well as a desire to develop a niche in potential markets for research networking. It is thus desirable to have significant industry participation in the development of the NREN. Industry participation does several things: industry cost sharing makes the projects financially feasible; industry has the installed long-haul telecommunications base to build on; and industry involvement in R&D should foster technology transfer and, generally, the competitiveness of the U.S. telecommunications industry. Industry in-kind contributions to NSFNET, primarily from MCI and IBM, are estimated at $40 million to $50 million, compared to NSF's 5-year, $14 million budget.10 It is anticipated that the value of industry cost sharing (e.g., donated switches, lines, or software) for the NREN would be on the order of hundreds of millions of dollars.
10 Eliot Marshall, "NSF Opens High-Speed Computer Network," Science, p. 22.
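The funding-structure choice discussed above, institutional access fees versus use-related fees, can be made concrete with a toy calculation. The sketch below (written in Python for compactness) is illustrative only: the fee levels and traffic volumes are invented for this example and come neither from the report nor from any actual network's tariff.

    # Toy comparison of the two charging approaches discussed above:
    # a flat membership/connectivity fee versus metered, use-related fees.
    # All dollar figures and volumes are invented for illustration.

    FLAT_ANNUAL_FEE = 20_000.0   # hypothetical institutional fee, $/year
    METERED_RATE = 2.0           # hypothetical use charge, $/gigabyte

    def annual_cost(gigabytes_per_year: float, metered: bool) -> float:
        """Institution's yearly network cost under each charging scheme."""
        if metered:
            return METERED_RATE * gigabytes_per_year
        return FLAT_ANNUAL_FEE

    for gb in (1_000, 10_000, 100_000):
        print(f"{gb:>7,} GB/yr: flat ${annual_cost(gb, False):>9,.0f}"
              f"  metered ${annual_cost(gb, True):>9,.0f}")

    # Output:
    #   1,000 GB/yr: flat $   20,000  metered $    2,000
    #  10,000 GB/yr: flat $   20,000  metered $   20,000
    # 100,000 GB/yr: flat $   20,000  metered $  200,000

Even this toy model exposes the policy tension: flat fees leave heavy users subsidized by light ones but impose no disincentive to experiment, while metered charges recover costs in proportion to use but may suppress exactly the exploratory traffic that the subsidized "experimental" stage was meant to encourage.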
Network Use

Network service offerings (e.g., databases and database searching services, news, publication, and software) will need some policy treatment. There need to be incentives to encourage development of and access to network services, yet not unduly subsidize such services or compete with private business, while maintaining quality control. Many network services used by scientists have been "free" to the end user.

Economic and legal policies will need to be clarified for reference services, the commercial information industry, Federal data banks, university data resources, libraries, publishers, and generally all potential services offered over the network.11 These policies should be designed to encourage use of services, while allowing developers to capture the potential benefits of network services, and to ensure legal and economic incentives to develop and market network services.

Longer Term Science Policy Issues

The near-term technical implementation of the NREN is well laid out. However, longer-term policy issues will arise as the national network affects more deeply the conduct of science, such as:

● patterns of collaboration, communication and information transfer, education, and apprenticeship;
● intellectual property, the value and ownership of information;
● export control of scientific information;
● publishing of research results;
● the "productivity" of research and attempts to measure it;
● communication among scientists, particularly across disciplines and between university, government, and industry scientists;
● potential economic and national security risks of international scientific networking, collaboration, and scientific communication;
● equity of access to scientific resources, such as facilities, equipment, databases, research grants, conferences, and other scientists. (Will a fully implemented NREN change the concentration of academic science and Federal funding in a limited number of departments and research universities, and of corporate science in a few large, rich corporations? What might be the impacts of networks on traditional routes to scientific priority and prestige?)
● controlling scientific information flow. (What technologies and authority are needed to control network-resident scientific information? How might these controls affect misconduct, quality control, economic and corporate proprietary protection, national security, and preliminary release of tentative or confidential research information that is scientifically or medically sensitive?)
● the cost and capitalization of doing research; to what extent might networking reduce the need for facilities or equipment?
● oversight and regulation of science, such as quality control, investigations of misconduct, research monitoring, awarding and auditing of government grants and contracts, data collection, accountability, and regulation of research procedures.12 Might national networking enable or encourage new oversight roles for governments?
● the access of various publics to scientists and research information;
● the dissemination of scientific information, from raw data, research results, and drafts of papers through finished research reports and reviews; might some scientific journals be replaced by electronic reports?
● legal issues, data privacy, ownership of data, and copyright. How might national networking interact with trends already underway in the scientific enterprise, such as changes in the nature of collaboration, sharing of data, and impacts of commercial potential on scientific research? Academic science traditionally has emphasized open and early communication, but some argue that pressures from competition for research grants and increasing potential for commercial value from basic research have
11 OMB Circular A-130, 50 Federal Register 52730 (Dec. 24, 1985); also H.R. 2381, "The Information Policy Act of 1989," which restates the role of OMB and policies on government information dissemination.
12 U.S. Congress, Office of Technology Assessment, The Regulatory Environment for Science, OTA-TM-SET-34 (Washington, DC: U.S. Government Printing Office, February 1986).
dampened free communication. Might networks counter, or strengthen, this trend?

Technical Questions

Several unresolved technical challenges are important to policy because they will help determine who has access to the network for what purposes. Such technical challenges include:

● standards for networks and network-accessible information services;
● requirements for interfaces to common carriers (local through international);
● requirements for interoperability across many different computers;
● improving user interfaces;
● reliability and bandwidth requirements;
● methods for measuring access and usage and for charging users, which will determine who is most likely to pay for network operating costs; and
● methods to promote security, which will affect the balance between network and information vulnerability, privacy, and open access.

Federal Agency Plans: FCCSET/FRICC

A recently released plan by the Federal Research Internet Coordinating Committee (FRICC) outlines a technical and management plan for the NREN.13 This plan has been incorporated into the broader FCCSET implementation plan. The technical plan is well thought through and represents further refinement of the NREN concept. The key stages are:

Stage 1: upgrade and interconnect existing agency networks into a jointly funded and managed T1 (1.5 Mb/s) National Networking Testbed.14
Stage 2: integrate national networks into a T3 (45 Mb/s) backbone by 1993.
Stage 3: push a technological leap to a multigigabit NREN starting in the mid-1990s.

The proposal identifies two parts of an NREN: an operational network and networking R&D. A service network would connect about 1,500 labs and universities by 1995, providing reliable service and rapid transfer of very large data streams, such as are found in interactive computer graphics, in apparent real time. The currently operating agency networks would be integrated under this proposal, to create a shared 45 Mb/s service net by 1992. The second part of the NREN would be R&D on a gigabit network, to be deployed in the latter 1990s. The first part is primarily an organizational and financial initiative, requiring little new technology. The second involves major new research activity in government and industry.

The "service" initiative extends present activities of Federal agencies, adding a governance structure which includes the non-Federal participants (regional and local networking institutions and industry) in a national networking council. It formalizes what are now ad hoc arrangements of the FRICC, and expands its scale and scope. Under this effort, virtually all of the Nation's research and higher education communities will be interconnected. Traffic and traffic congestion will be managed via priority routing, with service for participating agencies guaranteed via "policy" routing techniques. The benefits will be in improving productivity for researchers and educators, and in creating and demonstrating the demand for networks and network services to the computing, telecommunications, and information industries.

The research initiative (called stage 3 in the FCCSET reports) is more ambitious, seeking support for new research on communications technologies capable of supporting a network that is at least a thousand times faster than the 45 Mb/s net. Such a net could use the currently unused capabilities of optical fibers to vastly increase effective capability and capacity, which are congested by today's technology for switching and routing, and support the next generation of computers and communications applications. This effort would require a substantial Federal investment, but could invigorate the national communication technology base, and boost the long-term economic competitiveness of the telecommunications and computing industries.
13 FRICC, Program Plan for the National Research and Education Network, May 23, 1989. FRICC has members from DHHS, DOE, DARPA, USGS, NASA, NSF, and NOAA, and observers from the Internet Activities Board. FRICC is an informal committee that grew out of agencies' shared interest in coordinating related network activities and avoiding duplication of resources. As the de facto interagency coordination forum, FRICC was asked by NSF to prepare the NREN program plan.
14 See NYSERNET NOTE, vol. 1, No. 1, Feb. 6, 1989. NYSERNet has been awarded a multimillion-dollar contract from DARPA to develop the National Networking Testbed.
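To make the bandwidth tiers of the three stages concrete, the short calculation below (expressed in Python for compactness) converts each line rate into the time needed to move one large research file. The 1-gigabyte file size is an assumed example, not a figure from the FRICC plan, and protocol overhead is ignored; 1 Gb/s is used as a conservative floor for the multigigabit stage.

    # Rough transfer times for one large dataset at each NREN stage.
    # The 1-gigabyte file is an assumed example; overhead is ignored.

    STAGES_MBPS = {
        "Stage 1, T1 (1.5 Mb/s)":   1.5,
        "Stage 2, T3 (45 Mb/s)":   45.0,
        "Stage 3, gigabit":      1000.0,   # plan targets multigigabit rates
    }

    FILE_MEGABITS = 1.0 * 1024 * 8   # 1 gigabyte expressed in megabits

    for stage, mbps in STAGES_MBPS.items():
        minutes = FILE_MEGABITS / mbps / 60
        print(f"{stage:<26} {minutes:6.1f} minutes")

    # Output (approximate):
    #   Stage 1, T1 (1.5 Mb/s)       91.0 minutes
    #   Stage 2, T3 (45 Mb/s)         3.0 minutes
    #   Stage 3, gigabit              0.1 minutes

The three-orders-of-magnitude spread is the crux of the plan: a file that would occupy a T1 link for an hour and a half moves in seconds on a gigabit network, which is what makes "apparent real time" handling of very large data streams plausible only at stage 3.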
The gigabit network demonstration can be considered similar to the Apollo project for communications technologies, albeit on a smaller and less spectacular scale. Technical research needed would involve media, switches, network design and control software, operating systems in connected computers, and applications.

There are several areas where the FRICC management plan (and other plans) is unclear. It calls for, but does not detail, any transition to commercial operations. It does not outline potential structures for long-term financing or cost recovery. And the national network council's formal area of responsibility is limited to Federal agency operations. While this scope is appropriate for a Federal entity, and the private sector has participated influentially in past Federal FRICC plans, the proposed council does not encompass all the policy actors that need to participate in a coordinated national network. The growth of non-Federal networks demonstrates that some interests, such as smaller universities on the fringes of Federal-supported R&D, have not been served. The FRICC/FCCSET implementation plan for networking research focuses on the more near-term management problems of coordinated planning and management of the NREN. It does not deal with two extremely important and complex interfaces. At the most fundamental level, that of the common carriers, the network is part of the larger telecommunications labyrinth with all its attendant regulations, vested interests, and powerful policy combatants. At the top level, the network is a gateway into a global information supermarket. This marketplace of information services is immensely complex as well as potentially immensely profitable, and policy and regulation have not kept up with the many new opportunities created by technology.

The importance of institutional and mid-level networking to the performance of a national network, and the continuing fragmentation and regulatory and economic uncertainty of lower-level networking, signal a need for significant policy attention to coordinating and advancing lower-level networking. While there is a formal advisory role for universities, industry, and other users in the FRICC plan, it is difficult to say how and how well their interests would be represented in practice. It is not clear what form this may take, or whether it will necessitate some formal policy authority, but there is a need to accommodate the interests of universities (or some set of universities), industry research labs, and States in parallel to a Federal effort. The concerns of universities and the private sector about their role in the national network are reflected in EDUCOM's proposal for an overarching Federal-private nonprofit corporation, and to a lesser extent in NRI's vision. The FRICC plan does not exclude such a broader policy-setting body, but the current plan stops with Federal agency coordination.

Funding for the FRICC NREN, based on the analysis that went into the FCCSET report, is proposed at $400 million over 5 years, as shown below. This includes all national backbone Federal spending on hardware, software, and research, which would be funneled through DARPA and NSF and overseen by an interagency council. It includes some continued support for mid-level or institutional networking, but not the value of any cost sharing by industry, or specialized network R&D by various agencies. This budget is generally regarded as reasonable and, if anything, modest considering the potential benefits (see table 3-2).15

NREN Management Desiderata

All proposed initiatives share the policy goal of increasing the Nation's research productivity and creating new opportunities for scientific collaboration. As a technological catalyst, an explicit national NREN initiative would reduce unacceptably high levels of risk for industry and help create new markets for advanced computer-communications services and technologies. What is needed now is a sustained Federal commitment to consolidate and fortify agency plans, and to catalyze broader national involvement. The relationship between science-oriented data networking and the broader telecommunications world will need to be better sorted out before the NREN can be made into a partly or fully commercial operation. As the engineering challenge of building a fully national data network is surmounted, management and user issues of economics, access, and control of scientific information will rise in importance.
15 For example, National Research Council, Toward a National Research Network (Washington, DC: National Academy Press, 1988), pp. 23-31.
Table 3-2. Proposed NREN Budget ($ millions)

                                        FY90  FY91  FY92  FY93  FY94
FCCSET Stage 1 & 2 (upgrade; NSF)         14    23    55    50    50
FCCSET Stage 3 (gigabit+; DARPA)          16    27    40    55    60
  Total                                   30    50    95    95   110
S. 1067 authorization                     50    50   100   100   100
H.R. 3131 authorization                   50    50   100   100   100

SOURCE: Office of Technology Assessment, 1989.
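As a quick arithmetic check (again in Python, for concreteness), the two program lines of table 3-2 sum as follows; the combined 5-year total is consistent with the roughly $400 million figure cited above.

    # Five-year sums of the two program lines of table 3-2 ($ millions).
    stage_1_2 = [14, 23, 55, 50, 50]   # Stage 1 & 2 (upgrade; NSF)
    stage_3   = [16, 27, 40, 55, 60]   # Stage 3 (gigabit+; DARPA)

    print(sum(stage_1_2))                 # 192
    print(sum(stage_3))                   # 198
    print(sum(stage_1_2) + sum(stage_3))  # 390, i.e., roughly $400 million

    # Note: the FY93 components as printed sum to 105, while the table's
    # "Total" row shows 95; one of the printed figures appears to be a
    # typographical artifact carried over from the source.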

The NREN is a strategic, complex infrastructure which requires long-term planning. Consequently, network management should be stable (insulated from too much politics and budget vagaries), yet allow for accountability, feedback, and course correction. It should be able to leverage funding, maximize cost efficiency, and create incentives for commercial networks. Currently, there is no single entity that is big enough, risk-protected enough, and regulation-free enough to make a proper national network happen. While there is a need to formalize current policy and management, there is concern that setting a strong federally focused structure in place might prevent a move to a more desirable, effective, and appropriate management system in the long run. There is need for greater stability in NREN policy. The primary vehicle has been a voluntary coordinating group, the FRICC, consisting of program officers from research-oriented agencies, working within agency missions with loose policy guidance from the FCCSET. The remarkable cooperation and progress made so far depend on a complex set of agency priorities and budget fortunes, and continued progress must be considered uncertain.

The pace of the resolution of these issues will be controlled initially by the Federal budget of each participating agency. While the bulk of the overall investment rests with mid-level and campus networks, it cannot be integrated without strong central coordination, given present national telecommunications policies and market conditions for the required network technology. The relatively modest investment proposed by the initiative can have major impact by providing a forum for public-private cooperation for the creation of new knowledge, and a robust and willing experimental market to test new ideas and technologies.

For the short term there is a clear need to maintain the Federal initiative, to sustain the present momentum, to improve the technology, and to coordinate the expanding networks. The initiative should accelerate the aggregation of a sustainable domestic market for new information technologies and services. These goals are consistent with a primary purpose of improving the data communications infrastructure for U.S. science and engineering.