
Current Developments in DETER

Cybersecurity Testbed Technology

Terry Benzel*, Bob Braden*, Ted Faber*, Jelena Mirkovic*,
Steve Schwab†, Karen Sollins+, and John Wroclawski*
*USC Information Sciences Institute, †Sparta, Inc., and +MIT CSAIL
Talk Outline

Today’s DETER Testbed

Key New Research and Development:


• Dynamic Federation
• Experiment Health Support
• Risky Experiment Management

Next Generation Capabilities


Why DETER?

• Lack of experimental infrastructure for cyber-security research
– Medium-scale, e.g., 100’s of nodes, 1 Gbps links.
– Open facility, researcher- and vendor-neutral.
• Lack of effective tools and methodologies to foster
rigorous and effective evaluation
– Repository of test data, test topologies, traffic, and
metrics.
– Foster good science in cyber-security area.

DETER Program Goals

• A functioning and supported cyber-security testbed
• A set of experiment methodologies and
supporting tools
• Research initiatives to advance the art
• A community of users

DETER Experimenters and User
Organizations (representative)

DETERLab: The Facility

DETERlab: The DETER Facility

• Cyber Security testbed located at USC/ISI and UC Berkeley


– Funded by NSF and DHS, started in 2003
– 400 Nodes - 200 each at ISI and UC Berkeley
– Based on Emulab software, with focus on security experimentation
• The DETER testbed infrastructure
– Tool libraries
– SEER workbench
– Operations and Management
– Community building and Outreach
DETER Testbed: Attributes

• Shareable – concurrent “logical testbeds”
• Isolation of “logical testbeds” (experiments)
• Flexible – full access to bare hardware
• Rapidly configurable
• Remotely accessible (to non-local experimenters)
• Security and safety for “risky” experiments

DETER Testbed –
Cluster Architecture

[Figure: cluster architecture. Users reach the testbed over the Internet through an external firewall. A ‘User’ server (user accounts, files) and a ‘Boss’ server (web/DB/SNMP, data logging, switch management) sit on control VLANs behind a router/isolator. Control hardware – firewall, VLAN control, power controller, and serial line server – manages the nodes over a per-experiment control network VLAN. Each node’s N x 4 @1000bT data ports connect to a programmable patch panel (VLAN switches).]
DETER Testbed –
Inter-cluster Architecture

Testbed comprises:
– A number of clusters
– Interconnected by
secure tunnels, using
– Shared or separate
physical networks

Extended Hardware Capability

• Original Emulab concept: PC-based experimental nodes
– Flexible and programmable, but restricted performance
• Extended Hardware Facility allows addition of non-PC production and custom nodes to a cluster
– High performance and/or dedicated function
– Ability to incorporate commercial devices into expts.
• Objects integrated as first-class Emulab nodes

Experiment Methodology and The
SEER Facility

• Palettes capture high-level “design patterns” for well-formed experiments: Topology, Background and Attack Traffic, and Packet Capture and Instrumentation. Skeleton palettes for original and customized experiments are also available.
• Methodology Engine frames standard, systematic questions that guide an experimenter in selecting and combining the right elements.
• Experiment Automation increases repeatability and efficiency by integrating the process into the DETER testbed environment.
SPARTA DDoS Experiment
September 2005

• Background Traffic: REPLAY | NTCG | HARPOON – high-fidelity traffic
• Topology: AS-11357 core; BUILDING-BLOCKS | JUNIPER ROUTER CORE – realistic connectivity and scale-down
• Attack Traffic: DETER-INTEGRATED ATTACK SCRIPTING – automation of the variety of scenarios under study
• Instrumentation: PACKET AND HOST STATISTICS CAPTURE | SPECTRAL ANALYSIS | METRICS CALCULATION | INTEGRATED VISUALIZATION – toolbox for rigorous investigation of results
A General Purpose Facility

• Flexibility:
– Topologies shown are
from representative real
experiments
• Sharability
– Many of these topologies
can coexist simultaneously
in separate experiments

Beyond the Facility -
Creating new tools for cyberscience

Federation

Scale, realism, and decentralization
Dynamic Federation

• On-demand creation of
experiments spanning
multiple, independently
controlled facilities

• Why?
– Scale
– Unusual facilities
– Data & knowledge sharing
– Information hiding -
multiparty scenarios
Elements of Federation

• Experiment decomposition
– Creation and embedding of per-federant sub-experiments
• Create coherent distributed environment (embedding)
• Guide experimenter about potential choices and effects
• Trust, policy, and security analysis mechanisms
– Manage federated resources within local policies
• Access / Authorization (who can use?)
• Constrain use (how can they use?)
• Provide unified runtime environment to researcher and experiment
– Shared file system, scheduling and event system, control hooks, etc.
– Failure management model
• Secure implementation mechanisms
– Inter-facility communication, namespace mapping, etc.
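The decomposition step above can be sketched as follows. This is an illustrative Python sketch, not DETER's actual embedding algorithm: the node-placement map and the tunnel handling are invented for the example.

```python
# Split one logical topology into per-federant sub-experiments, one per
# testbed, with cross-testbed links realized as tunnel endpoints on
# both sides. Placement and link data are hypothetical.

placement = {"n0": "DETER", "n1": "DETER", "n2": "WAIL"}
links = [("n0", "n1"), ("n1", "n2")]

def decompose(placement, links):
    subs = {}
    for node, testbed in placement.items():
        subs.setdefault(testbed, {"nodes": [], "tunnels": []})
        subs[testbed]["nodes"].append(node)
    for a, b in links:
        ta, tb = placement[a], placement[b]
        if ta != tb:  # cross-testbed link -> tunnel endpoint on each side
            subs[ta]["tunnels"].append((a, b))
            subs[tb]["tunnels"].append((a, b))
    return subs

subs = decompose(placement, links)
print(sorted(subs["WAIL"]["nodes"]))  # ['n2']
print(subs["DETER"]["tunnels"])       # [('n1', 'n2')]
```

Links internal to one testbed stay inside its sub-experiment; only the cross-testbed link becomes a tunnel, which is what keeps the distributed environment coherent for the experimenter.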
DFA System Architecture

[Figure: experiment decomposition tools – driven by experiment requirements, testbed properties, and experiment topology – feed experiment creation tools that emit a standard experiment representation in CEDL (“assembly code”); the Federator consumes the CEDL description and instantiates the experiment across the member testbeds.]
CEDL
Canonical Experiment Description Language

• Standard low-level experiment representation – “assembly code”
• Output of all tools / input to Federator
• Expressiveness (today):
– Core semantics:
• Logical {nodes, links, elements}
• Topology (Emulab/ns2)
– Annotations: logical attributes (e.g., node type)
• Type information: router, switch, etc.
• Physical selection: map to specific instance
– Annotations: physical attributes
• “Escapes” to allow physical configuration of hardware
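An experiment description along these lines could be modeled as below. This Python sketch mirrors only the core semantics listed above (logical nodes and links plus attribute annotations); the class names, attribute names, and node identifiers are hypothetical, not actual CEDL syntax.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    attrs: dict = field(default_factory=dict)  # logical/physical annotations

@dataclass
class Link:
    ends: tuple                                # (node_name, node_name)
    attrs: dict = field(default_factory=dict)

@dataclass
class Experiment:
    nodes: list
    links: list

    def annotate(self, node_name, **attrs):
        # Attach annotations (e.g., node type, physical mapping) to a node.
        for n in self.nodes:
            if n.name == node_name:
                n.attrs.update(attrs)

# A two-node topology: a logical type annotation plus a physical
# selection mapping the node to a specific instance.
exp = Experiment(
    nodes=[Node("n0"), Node("n1")],
    links=[Link(("n0", "n1"), {"bandwidth": "1Gbps"})],
)
exp.annotate("n0", type="router")           # logical attribute
exp.annotate("n0", physical="deter-pc042")  # map to a specific instance
```

The point of the "assembly code" framing is that every higher-level tool can emit this one flat representation, and the Federator never has to understand the tools themselves.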
DFA Authorization

[Figure: flexible principal IDs and attested attributes feed the authorization decision.]

• Currently: Simple Emulab Generalization
• Moving To: General Attribute-Based Control
Authorization for Dynamic
Federated Testbed Environments
(with Steve Schwab, SPARTA)

• Decentralized, collaborative/competitive environment. Alliances form/break frequently
– Semantics appropriate for testbed federation
• Explicit, visible decision making
– Corollary: clear auditing and understanding
• Multiple trust creation models, independent of mechanism
– Examples: Hierarchical PKI, PGP web of trust, etc.
• Minimize unnecessary communication
– For disconnected operation
• Control and limit revelation of info (credentials, etc.)
– Corollary: potential multi-step negotiation
Attribute Based AC

• We build on Attribute-Based Access Control


– Work by Winsborough, Li, Mitchell, others in turn

• Basic model:
– Principals
• In our work, user’s identity established by local authority’s local means -
Kerberos, certs, passwords, …
– Principals have attributes
• Established by digitally signed credentials…
… through which credential issuers assert judgments about the attributes of
principals
• Expressed in a formal language
– Attributes and Rules drive a reasoning engine
• Authorization decisions are based on applying rules to attributes of the requestor
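The basic model above can be illustrated with a toy decision procedure: attributes asserted by credentials, and rules applied to those attributes by a (here trivial) reasoning engine. Credential signature checking is stubbed out, and the issuer, attribute, and permission names are invented for the example.

```python
# Illustrative attribute-based access control sketch: credentials
# assert attributes of principals; authorization applies rules to the
# requestor's attributes. Not SPARTA's actual formal language.

credentials = [
    # (issuer, principal, attribute) -- assumed already signature-checked
    ("DETER", "alice", "deter.user"),
    ("WAIL", "alice", "wail.guest"),
]

rules = [
    # required attribute -> permission granted
    ("deter.user", "create-experiment"),
    ("wail.guest", "view-experiment"),
]

def attributes_of(principal):
    # Gather every attribute any issuer has attested for this principal.
    return {attr for (_iss, p, attr) in credentials if p == principal}

def authorize(principal, permission):
    # Grant iff some rule maps an attribute the principal holds to
    # the requested permission.
    attrs = attributes_of(principal)
    return any(perm == permission and attr in attrs for attr, perm in rules)

print(authorize("alice", "create-experiment"))  # True
print(authorize("bob", "create-experiment"))    # False
```

Because the decision depends only on locally held credentials and rules, this style of check can run during disconnected operation, which is one of the design goals listed above.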
Revisiting the Architecture

[Figure: users submit CEDL configurations and authentication credentials to the DETER Federator, which applies a generalized authorization model and plug-in federation interfaces to create sub-experiments on each federant – e.g., DETER testbeds running SEER, and a GMPLS DRAGON network reached over the Internet.]
Federation: Status

• Operating today
– Hundreds of nodes
– Emulab-based testbeds {DETER, WAIL, Emulab}
– Simple / simplistic ID based authorization
– Scale-limited: primarily by filesystem and event model
• Extending to
– Full attribute-based auth/auth framework
– Removal of scaling limitations
– Richer resource discovery / integration with higher level tools
– Additional testbed architectures
• NSF GENI program
• DRAGON
Experiment Health

Structured experiments and scientific rigor
Supporting Rigorous and
Structured Experiments

• Virtually all experiments have invariants: things that must be true for the experiment to be valid
• But what if “invariants” aren’t true?
• Happens a lot in cybersecurity research
– Worst-case operating points...
– Complex, often multi-party scenarios...
– Test repeatability and teaching scenarios

Maintaining Experiment Invariants:
The DETER Experiment Health System

• Uses higher level knowledge about the experiment – the desired invariants
• Takes corrective or notification action if an invariant is violated
– Monitor invariants
– Trigger actions
• Captures invariants in exportable form for
experiment reuse, repeatability and validation, etc.

Experiment Health: Benefits

• Sophisticated users benefit in case of large-scale or batch experiments
– Very difficult to monitor these manually or to discover problems in a large result set
• Novice users benefit always, but especially when re-running somebody else’s code (e.g., students)
– Early signal that something is wrong, motivates understanding of experimentation mechanism

Experiment Health System:
Design Goals

• Support an open-ended set of expectations


– Some internal, e.g., verifying that a chosen traffic generator can output the desired traffic rate
– Some external, e.g., verifying that all nodes are up
• On expectation failure, execute arbitrary recovery actions
– Automatic adaptation or repair
– Alert user for manual intervention
– Terminate experiment
• Support both low-level and high-level sources of knowledge
about invariants

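The expectation/recovery pattern described in the design goals might look like this in outline. All names are illustrative; the real Experiment Health System's interfaces are not shown here.

```python
# Minimal sketch: an Expectation pairs a check (the invariant) with a
# recovery action to run when the check fails -- repair, alert the
# user, or terminate the experiment.

class Expectation:
    def __init__(self, name, check, on_failure):
        self.name = name
        self.check = check            # () -> bool: does the invariant hold?
        self.on_failure = on_failure  # () -> None: recovery action

def monitor(expectations):
    """One monitoring pass; in practice this would run periodically."""
    violated = []
    for e in expectations:
        if not e.check():
            violated.append(e.name)
            e.on_failure()
    return violated

# Example: an external "all nodes are up" expectation whose recovery
# action is to alert the experimenter.
nodes_up = {"n0": True, "n1": False}
alerts = []
expect = Expectation(
    "all-nodes-up",
    check=lambda: all(nodes_up.values()),
    on_failure=lambda: alerts.append("node down: notify experimenter"),
)
print(monitor([expect]))  # ['all-nodes-up']
```

Because each Expectation is self-describing data, a set of them can be exported with the experiment, which is what enables the reuse, repeatability, and validation benefits listed above.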
Continuing Development

• Develop intelligent expectation build engine


– Reduce user burden and help novice users
– Infer expectations from user actions and observed outcomes
– Refine as the experiment is repeated
– Offer the inferred scenario to user to label and modify
– Potential to build a timeline of events/measurements
• Structured expectation templates – combine and integrate
groups of low-level expectations, support experiment classes
• Develop a library of tools, expectations and measurements
for common experiment classes
– Enable sharing between users
– Public (populated by us initially) and private archives
Risky Experiment Management

Experimental cybersecurity in a
complex world

Domain of interest

• “Malware Containment” → “Risky Experiment Management”

– Malware study
– “Active Defense” research
– Stress and failure testing
– Agent based measurement and monitoring
– …

Problem formulation
“Classical Formulation”

[Figure: a containment boundary separates the outside world (“no bad stuff”) from the experiment (“bad stuff”).]

• Focus on isolation
• De facto emphasis of initial DETER work
Problem formulation
More accurate formulation

[Figure: the experiment interacts with the outside world through controlled interaction, subject to observation/monitoring and experiment control.]

• Key issue:
– Not “isolation and containment” but “understanding and assurance”
Model

• Two-stage approach:
– T1, the experiment behavior constraint transform, maps unconstrained malware/experiment behavior into constrained malware/experiment behavior
– T2, the testbed behavior constraint transform, maps constrained malware/experiment behavior into assured/constrained external behavior

Behavioral composition model: External behavior == T2(T1(experiment))
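The composition model above can be made concrete with a toy example in which T1 and T2 are filters over a set of behaviors the running experiment could exhibit. The behavior names are invented for illustration.

```python
# Toy sketch of the two-stage model: T1 (experiment-level constraints)
# and T2 (testbed-level constraints) each remove behaviors, and the
# externally visible behavior is their composition T2(T1(experiment)).

behaviors = {"scan-testbed", "scan-internet", "send-attack-traffic"}

def t1(bs):
    # Experiment-level constraint: the malware may not originate attacks.
    return {b for b in bs if b != "send-attack-traffic"}

def t2(bs):
    # Testbed-level constraint: nothing crosses the testbed boundary
    # toward the Internet.
    return {b for b in bs if not b.endswith("internet")}

external = t2(t1(behaviors))  # External behavior == T2(T1(experiment))
print(external)  # {'scan-testbed'}
```

Separating the two transforms is what makes the verifiability argument on the next slides work: T2 is shared across experiments and can be tested once, while T1 is tailored per experiment.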
Why is this a good idea?

• Richer function
– Simplifies reasoning about a wider range of potentially
risky experiments/circumstances
– Simplifies tradeoff between different experimenter goals

• Increased verifiability
– T2 (testbed) constraints common to many experiments,
can be extensively tested
– Behavior of T2(T1()) may be subject to formal analysis

Why is this a good idea (2)

• Simplified experiment design, increased reusability


– Defined T1 “invariants” (experiment constraints) simplify design of
experiments with known external properties

• Separation of concerns:
– Experimenter best equipped to understand impact of constraints on
experiment validity - express T1 for experimenters
– Testbed designer best equipped to understand impact of constraints
on testbed and external behavior - express T2 for testbed
designers/implementors

Near Term: Constraint sets

• Constraint sets are
– pre-established sets of T1 constraints that,
– when met, allow useful, well behaved, well understood experiments to be run.
• It is useful to imagine defining more than one set:
– Very weak T1 constraints → very strong T2 restrictions → complex, locked-down experiment
– Stronger T1 constraints → weaker T2 restrictions → simpler, more flexible experiment
– Strongest T1 constraints → weakest T2 restrictions → simplest, richest experiment
Long Term: Formal Verification

• It may be [in some cases, “is”] possible to reason formally about the overall behavior of the T2(T1(exp)) system.

• This might allow fine-grain, possibly automatic derivation of experiment (T1) and testbed configuration (T2) constraints to limit a particular experiment’s potential external behavior without damaging its experimental value

• Such a tool would provide a highly general facility for limiting the
risk of risky experiments
A Common Theme

• Interesting to note …
• ... underlying concept of establishing,
understanding, monitoring, enforcing
experiment/test invariants is common to
– Experiment Health
– Risky Experiment Management
– Intelligent, high-level design of experiments
• Direction: unified, shared mechanisms for invariant
management to serve these goals
Summary
DETER Cyber Security Testbed

• Facility
– Tool libraries, Workbench, Operations,
Community
• Tools for Cyber Science
– Federation, Experiment Health, Risky Experiment
• Foundation for next generation testbeds
– GENI, National Cyber Range
• More info – www.isi.edu/deter or www.deterlab.net
Beyond the Facility -
Creating new tools for cyber science

• Dynamic Federation
– On-demand creation of experiments spanning multiple, independently
controlled facilities
• High Level User/Workflow Tools
– User-friendly user interface for configuring, launching, monitoring,
and analyzing experiments
• Experiment Health Support
– Supporting Rigorous and Structured Experiments
• Risky Experiment Management
– Experimental cybersecurity in a complex world
