
Secure SDLC

Quality
Key elements of ensuring software quality
Code review
Interface testing
Misuse case testing
Reasons for lack of security in SDLC
Lack of education on security
Operating systems and applications were not built on secure
architectures
Software development procedures have not been security oriented
Integrating security as an afterthought
Software Development Lifecycle
5 Phases

Requirements Gathering - why, what, whom

Design - how

Development - coding

Testing and validation - verify and validate the software

Release/maintenance - deploy and manage


Project Management
Project management is an important part of Product
development
Security management is an important part of project
management
Statement of Work (SOW)
Describes the product and customer requirements
Helps ensure the requirements are clearly understood
Work Breakdown Structure (WBS)
Project management tool designed to group a project's individual
elements in an organized manner
Decomposition of a project into tasks and subtasks that result in clearly
defined deliverables
Requirements Gathering Phase
Understand why the project is needed and what the scope is
Examine the software's requirements and proposed functionality
Evaluate products currently in the market and identify
demands not currently met
The following shall be accomplished
Security requirements

Security risk assessment

Privacy risk assessment

Risk-level acceptance
Design Phase
Outlines how the product is going to accomplish the
requirements
Process used to describe the requirements and internal
behavior of the software
Software requirements come from 3 models:
Informational model - dictates the type of information and how it will
be processed
Functional model - outlines the tasks the application will carry out
Behavioral model - explains the states the application will be in during
and after specific transitions
The following should be accomplished
Attack surface analysis
Threat modeling
Development Phase
This is the phase where actual development happens

Programmers are deeply involved

The design created is broken down into defined deliverables and
programmers develop code

Computer-aided software engineering (CASE) tools are used by
programmers to generate code, test and carry out debugging
activities
Testing / Validation Phase
It is important to map the security risks to test cases and code

Tests should be conducted in an environment that mirrors the
production environment

Security attacks and penetration testing take place in this phase

Tests should be run to validate against vulnerabilities identified
during earlier stages
Testing Types
Common testing approach
Unit testing - testing individual components in a controlled environment, where programmers
validate data structures, logic and boundary conditions
Integration testing - verifying that components work together as outlined
Acceptance testing - ensuring that the code meets customer requirements
Regression testing - after a change to a system takes place, retesting to ensure
functionality, performance and protection
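
As an illustration of unit testing logic and boundary conditions, here is a minimal JUnit 5 sketch; the DiscountCalculator class and its limits are hypothetical and stand in for any component under test.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;
import org.junit.jupiter.api.Test;

// Hypothetical unit under test: rejects out-of-range input, applies a percentage discount.
class DiscountCalculator {
    static double apply(double price, int percent) {
        if (percent < 0 || percent > 100) {
            throw new IllegalArgumentException("percent out of range");
        }
        return price * (100 - percent) / 100.0;
    }
}

class DiscountCalculatorTest {
    @Test
    void zeroDiscountReturnsOriginalPrice() {        // boundary: lower edge
        assertEquals(100.0, DiscountCalculator.apply(100.0, 0), 0.001);
    }

    @Test
    void quarterDiscountIsApplied() {                // ordinary logic check
        assertEquals(150.0, DiscountCalculator.apply(200.0, 25), 0.001);
    }

    @Test
    void negativeDiscountIsRejected() {              // boundary violation must be refused
        assertThrows(IllegalArgumentException.class,
                () -> DiscountCalculator.apply(100.0, -5));
    }
}
```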

Fuzzing
Technique used to discover flaws and vulnerabilities in software by sending large amounts of
malformed, unexpected or random data in order to trigger failures
Helps in identifying buffer overflows, DoS vulnerabilities, injection weaknesses and validation flaws
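
A minimal fuzzing sketch is shown below; parseRecord() is a hypothetical parser standing in for the component under test, and random byte-flipping is only one simple way to generate malformed input.

```java
import java.nio.charset.StandardCharsets;
import java.util.Random;

public class MiniFuzzer {

    // Hypothetical target: expects "name:age" and misbehaves on malformed input.
    static void parseRecord(byte[] input) {
        String s = new String(input, StandardCharsets.UTF_8);
        String[] parts = s.split(":");
        Integer.parseInt(parts[1].trim());   // throws on malformed data
    }

    public static void main(String[] args) {
        byte[] seed = "alice:30".getBytes(StandardCharsets.UTF_8);
        Random rnd = new Random(42);

        for (int i = 0; i < 10_000; i++) {
            byte[] mutated = seed.clone();
            // Flip a few random bytes to produce malformed/unexpected input.
            for (int j = 0; j < 1 + rnd.nextInt(3); j++) {
                mutated[rnd.nextInt(mutated.length)] = (byte) rnd.nextInt(256);
            }
            try {
                parseRecord(mutated);
            } catch (RuntimeException e) {
                // Each unhandled exception is a candidate robustness/validation flaw.
                System.out.printf("Failure #%d: %s%n", i, e.getClass().getSimpleName());
            }
        }
    }
}
```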
Software Development Models
Build and Fix Model

Waterfall Model

V-Shaped Model

Prototyping

Incremental Model

Spiral Model

Rapid Application Development

Agile Model
Build and Fix Model
Not a formal SDLC model

No architecture design is carried out

Development takes place immediately with no planning involved

Problems are dealt with as they occur

No formal feedback mechanisms

Not a proactive method for ensuring quality

This approach gets the software product out as soon as possible


Waterfall Model
Uses a linear-sequential life-cycle approach

Each phase must be completed in entirety before the next phase begins

Review takes place at the end of each phase to make sure the project is on
the right path

All requirements are gathered in the initial phase and there is no formal
way to integrate changes

It is a very rigid approach

Could be used for small projects that have all requirements understood
V-Shaped Model
Emphasizes the verification and validation of the product at each phase and provides a formal
method of developing testing plans as each coding phase is executed

Lays out a sequential path of execution process.

Each phase must be completed before the next phase begins

Requires testing throughout the development phases

It is also a rigid model; it does not allow for flexibility, so adapting to changes is more difficult
and expensive

Does not allow for handling of events concurrently

Does not integrate iterations of phases

It does not contain risk analysis activities

Best used when requirements are well understood and scope changes are very small
Prototyping
A sample product is developed to explore a specific approach to a
problem before investing expensive time and resources.

3 main model types

Rapid Prototyping
A prototype is quickly created to test the validity of the current understanding of the project
requirements
It is a quick and dirty method of creating the prototype
It is not developed to be built upon, rather discarded after use

Evolutionary Prototype
They are built with a goal of incremental update
The prototype is continuously improved upon until it reaches the final product stage
Feedback gained in each phase is used to improve the prototype

Operational Prototype
Extension of the evolutionary prototype
It is designed to be implemented within a production environment as it is being tweaked
Feedback is obtained and changes are made within the working site
Incremental Model
Allows developers to carry out multiple development cycles on a piece of software
throughout its development process

Similar to a multi-waterfall cycle, as the software matures through the development
stages

A working version is produced after the first iteration

Advantages
Allows for changes to take place early, testing uncovers issues quickly, each iteration is easily
manageable, cost is lower, customer gets the functionality earlier, risk of critical changes being
introduced is lower.

Best used when risk, program complexity, funding and functionality requirements need to
be understood early in the development cycle
Spiral Model
Uses an iterative approach to software development

Places emphasis on risk analysis

Has 4 main phases


Determine objectives
Risk analysis
Development and test
Plan the next iteration

Advantages
As more information about the project is gathered, it is integrated into the risk analysis process; the
prototype is improved and tested; testing takes place early; and new requirements can be addressed
as they are uncovered

Best suited for complex projects that have fluid requirements


Rapid Application Development
Relies more on the use of rapid prototyping than on extensive
upfront planning

Planning is interleaved with the process of developing the software

Delivery of software can happen in less time compared to the waterfall
method

Combines the use of prototyping and iterative development methods

Model provides input to allow for the improvement of the prototype

Allows for customer to be involved during the development phases


Agile Models
It focuses on incremental and iterative development methods that promote cross-functional teamwork and
continuous feedback mechanisms

It is considered lightweight - it is nimble and flexible enough to adapt to change

The model focuses on small increments of functional code that are created based upon business need

Focuses on individual interactions instead of process and tools.

Promotes customer collaboration instead of customer negotiation

Has ability to respond to change

It breaks the product down into individual features that are constantly being delivered

It focuses on user stories

Development team can take pieces and parts of all the available SDLC models and combine them in a
manner that best suits the project requirement
Agile Model - Scrum
Most widely adopted Agile methodology

It is lean and customer focused

It acknowledges the fact that customer needs will change over time

It focuses on team collaboration, customer involvement and continuous delivery

Allows product features to be added, removed, modified at clearly defined


points.
Change points happen at the conclusion of each sprint ~ fixed duration development interval

Customer is intimately involved in the development process

Allows product to be iteratively developed and changed as it is being built


Agile Model Extreme Programming
Development methodology in which code reviews happen
continuously

This is accomplished using pair programming ~ one developer
dictates and the other developer types

This significantly reduces the incidence of errors and improves the
overall quality of the code

It relies on test-driven development ~ unit tests are written before
the code is developed
Agile Model Kanban Methodology
Stresses visual tracking of all tasks so that the team knows which
one to prioritize at what point in time in order to deliver the right
product

Developed by Toyota

Allows the development team to better react to unforeseen
requirements
Other models
Exploratory Model
Used in instances where clearly defined project objectives are not available
Relies on covering a set of specifications likely to affect the final product's functionality
Testing is an important part of exploratory development
Joint Application Development (JAD)
Uses a team approach in a workshop-oriented environment
Distinguishes itself by the inclusion of members other than coders in the team
Reuse Model
Approaches development by using progressively developed models
Reusable models are evolved by gradually modifying pre-existing prototypes to customer
specifications
Clean Room
Attempts to prevent errors or mistakes by following structured and formal methods of
developing and testing
Used for high-quality and mission-critical applications that will be put through a strict
certification process
DevOps
It is the practice of incorporating development, IT and quality
assurance staff into software development projects to align their
incentives and enable frequent, efficient and reliable releases of
software products
It is about changing the culture of the organization
Advantages
Increases trust within the organization
Increases job satisfaction
Improves the morale of project managers
Allows the multifunctional team to identify potential defects, vulnerabilities,
and frictional points early enough to resolve them
CMMI
Capability Maturity Model Integration (CMMI) is a comprehensive,
integrated set of guidelines for developing products and software
It can be used to evaluate the security engineering practices and
identify ways to improve them
Describes principles, procedures that underlie software development
process maturity
Provides a more disciplined and repeatable method to improve
software quality
Provides best practices for an organization to develop a
standardized approach to software development
CMMI Levels

Initial
Development process is ad hoc
No assurance of consistency and quality

Repeatable
Formal management structure, change control and quality assurance are in place
Company can properly repeat processes throughout each project
Does not have a formal process model defined

Defined
Formal procedures are in place
Has a way to allow for quantitative process improvement

Managed
Has formal processes in place to collect and analyze quantitative data
Metrics are defined and fed into the process improvement program

Optimized
Company has budgeted and integrated plans for continuous improvement
Change control Process
Make a request for change
Analyze the request
Develop the implementation strategy
Calculate the cost of implementation
Review security implications
Record the change request
Submit the change request for approval
Develop the change
Report results to management
Software Configuration Management
Identifies the attributes of software at various points in time and
performs methodical control of changes for the purpose of maintaining
software integrity and traceability throughout the SDLC
SCM should provide the following
Concurrency Management
Helps prevent uncontrolled changes when multiple users simultaneously access the
same file
Versioning
Deals with keeping track of file revisions; it should create log reports of who made
changes, when and what
Synchronization
Allows individuals to check out complete or partial copies of the repositories and work on
the files as needed
Programming Languages
Machine Language
Format that the computers processor can understand and work with directly
It is represented in binary and considered the most primitive form of
programming.
First generation programming language
Assembly Language
Considered a low-level programming language; it is a symbolic representation of
machine language
One level above machine language, programmers can use commands
instead of binary codes
Assemblers are used to convert this language code into machine
understandable format
Programs written are hardware specific and are not portable
Programming Languages
High-level Language
Third-generation language
They use abstract statements (eg. If-then-else)
They are easier to work with - programs are easier to write and mistakes are easier
to identify
They are processor independent
They are portable and can be used on many different system types
Very High-level languages
4th Generation language
It enhances the natural language approach
Focuses on highly abstract algorithms that allow straightforward programming
implementation
The manual coding required may be ten times less than for the same task in a
third-generation language
Programming Languages
Natural language
5th generation of programming language
Program creation happens by defining the constraints for achieving a
specified result
Goal is to create software that can solve problems by itself instead of a
programmer having to develop code to deal with individual and specific
problems
Programming Language
Assemblers
Tools that convert assembly language source code into machine code
Compilers
Tools that convert high level language statements to machine code for
specific processors
Allows developers to create software once in a high-level language and
compile it for various platforms
Interpreter
Performs the last step of transforming high-level code to machine-level code
It improves portability
Major advantage is that the platform independence and memory
management functions are part of the interpreter
Major disadvantage is that the program cannot run as a stand-alone
application but requires the interpreter to be installed on the local machine
Object Oriented Concepts
Benefits of Object Oriented Programming
Modularity
Deferred commitment
Reusability
Naturalness

It also provides functional independence ~ each module addresses a specific
sub-function of requirements and has an interface that is easily understood by
other parts of the application

Data hiding is provided by encapsulation, which protects an object's private data
from outside access

Abstraction is the capability to suppress unnecessary details so that important,
inherent properties can be examined and reviewed
Polymorphism
Polymorphism refers to a programming language's ability to process
objects differently depending on their data type or class

The application will determine what to use at the time of execution

It takes place when different objects respond to the same command,
input or message in different ways
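
A minimal sketch of polymorphism in Java; the Shape, Circle and Square class names are illustrative only.

```java
// The same render() message produces different behavior depending on the
// runtime type of the object receiving it.
abstract class Shape {
    abstract String render();
}

class Circle extends Shape {
    @Override String render() { return "drawing a circle"; }
}

class Square extends Shape {
    @Override String render() { return "drawing a square"; }
}

public class PolymorphismDemo {
    public static void main(String[] args) {
        Shape[] shapes = { new Circle(), new Square() };
        // The call site is identical; the behavior is chosen at execution time.
        for (Shape s : shapes) {
            System.out.println(s.render());
        }
    }
}
```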


Software Development Concepts
Data Modelling
Considers data independently of both the way the data is processed and the
components that process the data
It follows the data from input to the end and ensures the output is correct
Data modelling can be used to provide insights into the data and the
relationships that govern it (used in Databases)
Data Structure
Data Structure is a way of collecting and organising data in such a way that
we can perform operations on these data in an effective way
Data structures are about rendering data elements in terms of some
relationship
They are structures programmed to store ordered data, so that various
operations can be performed on it easily
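
As a small illustration, the sketch below implements one such structure, a fixed-capacity stack whose elements are kept in a last-in, first-out relationship; the class is hypothetical and exists only for illustration.

```java
import java.util.NoSuchElementException;

class IntStack {
    private final int[] items;   // ordered storage for the elements
    private int size;

    IntStack(int capacity) { this.items = new int[capacity]; }

    void push(int value) {
        if (size == items.length) throw new IllegalStateException("stack full");
        items[size++] = value;
    }

    int pop() {
        if (size == 0) throw new NoSuchElementException("stack empty");
        return items[--size];    // last element pushed is the first one returned
    }
}

class IntStackDemo {
    public static void main(String[] args) {
        IntStack stack = new IntStack(4);
        stack.push(1);
        stack.push(2);
        System.out.println(stack.pop());   // prints 2 - elements come back in reverse order
    }
}
```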
Cohesion and coupling
Cohesion
Refers to how many different types of tasks a module can carry out
High cohesion ~ carries one task or tasks that are similar
High cohesion is better. Any change in the task can be done without
impacting other tasks
Coupling
Is a measurement that indicates how much interaction one module
requires to carry out its tasks
Low (loose) coupling is better, High (tight) coupling means a module
depends upon many other modules
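
A small sketch contrasting the two ideas; the class names are illustrative. TaxCalculator is highly cohesive (it does one thing), and InvoicePrinter is loosely coupled because it depends only on the narrow interface handed to it.

```java
// High cohesion: TaxCalculator computes tax and nothing else.
class TaxCalculator {
    double taxFor(double amount, double rate) {
        return amount * rate;
    }
}

// Low (loose) coupling: InvoicePrinter depends only on the small TaxCalculator
// object passed in, not on databases, file formats, or other modules.
class InvoicePrinter {
    private final TaxCalculator calculator;

    InvoicePrinter(TaxCalculator calculator) {
        this.calculator = calculator;
    }

    String print(double amount, double rate) {
        double tax = calculator.taxFor(amount, rate);
        return String.format("amount=%.2f tax=%.2f total=%.2f", amount, tax, amount + tax);
    }
}

class CouplingDemo {
    public static void main(String[] args) {
        InvoicePrinter printer = new InvoicePrinter(new TaxCalculator());
        System.out.println(printer.print(100.0, 0.20));
    }
}
```

With this arrangement, a change to how tax is computed stays inside TaxCalculator and does not ripple into InvoicePrinter.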
Application Programming Interface
It specifies the manner in which a software component interacts
with other components

It encourages software reuse

Makes software more maintainable by localizing the changes that
need to be made

It provides reduced effort and improved maintainability

A software library is a collection of components that do specific
things that are useful to many other components
Distributed Computing Environment
It is a standard developed by the Open Group
It is a client/server framework that is available to many vendors to use within
their products
It provides RPC service, security service, directory service, time service and
distributed file support
It is a layer of software that sits on top of the network layer and provides services
to the applications above it
DCE and DCOM offer much of the same functionality
DCOM uses a globally unique identifier (GUID), while DCE uses a
universally unique identifier (UUID)
Common Object Request Broker Architecture
(CORBA)
Open object-oriented standard architecture developed by the Object
Management Group

Enables applications to communicate with one another no matter where
the application resides or who developed it

It defines the APIs, communication protocol and client/server communication
methods to allow heterogeneous applications to work together

In this model, clients request services from objects.

The model provides standards to build a distributed environment


Common Object Request Broker Architecture
(CORBA)
Contains two parts
System-oriented components
Object request brokers (ORBs) and object services
Application-oriented components
Application objects and common facilities

The ORB manages all communications between the components and enables them to interact in a
heterogeneous and distributed environment

The ORB works independently of the platforms where the objects reside, which provides greater
interoperability

It is the middleware that allows the client/server communication to take place

The ORB provides communication between distributed objects

It provides portability for applications


Component Object Model (COM)
It allows interprocess communication within one application or between
applications within the same system

DCOM supports the same model, and also supports distributed
interprocess communication

DCOM enables the programmer to focus on coding, while it has the
library to take care of session handling, synchronization, buffering, fault
identification and handling, and data format translation

DCOM provides ORB-like services


Object Linking and Embedding (OLE)
It provides a way for objects to be shared on a local computer, using
COM as their foundation

OLE enables objects to be embedded into documents

The capability of one program to call another program is called
linking

Embedding is the capability to place a piece of data inside a foreign
program or document
Java Platform
Java EE defines a client/server model that is object oriented and
platform independent

It is an enterprise Java computing platform


Meaning it is a framework that is used to develop enterprise software written
mainly in the Java programming language
It provides APIs for networking services, fault tolerance, security and web
services for large-scale, multitiered network applications.
It takes advantage of the Write once, Run Anywhere capability of Java
It can handle scalability, concurrency, transactions and various security services
for the client
Service Oriented Architecture (SOA)
It provides standardized access to the most needed services for
many different applications at one time

Services within an SOA are usually provided through web services

Applications access one centralized place, which provides the
functionality they require

For a service to be usable it must be
modular, autonomous, loosely coupled, follow standardized service
identification and categorization, and provide provisioning and delivery
SOA Web Services
A web service allows for web based communication to happen
seamlessly using web-based standards, like SOAP, HTTP, WSDL
They provide the functional building blocks for the SOA
Web Services Description Language (WSDL)
Provides a machine readable description of the specific operations provided by
the service

Universal Description, Discovery and Integration (UDDI)


It is an XML based registry that lists available services
It provides a method for services to be registered by service providers and
located by service consumers
SOA
Mashup
A mashup is a technique by which a website or Web application uses data,
presentation or functionality from two or more sources to create a new service
Open APIs and data sources are commonly aggregated and combined to provide
a more useful and powerful resource

SOAP
It is an XML based protocol that encodes messages in a web service
environment
It actually defines an XML schema of how communication is going to take place
The schema defines how objects communicate directly
SOA
Cloud Computing
Method of providing computing as a service rather than as a physical
product

It provides processing and computation capabilities, storage and software,
without the end user needing to know or worry about the physical location or
configuration of the devices
Mobile Code
Code that can be transmitted across the network to be executed
by a system at the other end

Mobile code has been a regular attack vector. It should be used
cautiously

Java Applets
Object-oriented, platform-independent programming language

It is used to write small components, applets, which commonly run in a
user's web browser
Mobile Code
ActiveX Controls
They are self-sufficient programs that can be executed in Windows
environments

Can be reused by many applications within one system or by different
systems within an environment

Allows the web browser to execute other software applications within the
browser (media files, PDF etc.)

They can be automatically downloaded and executed by the web browser

They have greater access to the user's system compared to Java applets
ActiveX vs Java applets

ActiveX: It uses Authenticode technology, which relies on digital
certificates and trusting certificate authorities

Java applet: It sets up a sandbox for the applet code to execute in, thus
reducing the code's access to resources within the user's computer
Web Environment Threats Input Validation
Attacks
Path or Directory traversal
Also known as the dot-dot-slash attack
It is inserting the characters ../ several times into the URL to traverse into directories that weren't
supposed to be accessible from the web
Access to the command shell allows extensive access to the attacker

Unicode encoding
Web servers use Unicode to support different character sets.
An attacker using Unicode could make the same directory traversal attack without using ../, but with
the Unicode representations of those characters

URL Encoding
URL encoding replaces unsafe ASCII characters with a "%" followed by two hexadecimal digits. URL
encoding normally replaces a space with a plus (+) sign or with %20
Attackers found they could bypass filtering techniques and make requests by representing characters
differently
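
A server-side check against dot-dot-slash traversal might look like the sketch below; the base directory and file names are illustrative assumptions, and in practice the user-supplied value would first be URL/Unicode-decoded before this check is applied.

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class SafePathResolver {
    // Assumed document root for the example.
    private static final Path BASE = Paths.get("/var/www/files").normalize();

    static Path resolve(String userSupplied) {
        // Resolve against the base directory and collapse any ../ segments.
        Path candidate = BASE.resolve(userSupplied).normalize();
        if (!candidate.startsWith(BASE)) {
            // The normalized path escaped the document root: reject the request.
            throw new IllegalArgumentException("path traversal attempt: " + userSupplied);
        }
        return candidate;
    }

    public static void main(String[] args) {
        System.out.println(resolve("report.pdf"));          // allowed
        System.out.println(resolve("../../etc/passwd"));    // rejected (throws)
    }
}
```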
Web Environment Threats Input Validation
Attacks
Client-side validation
When the input validation is done at the client before it is even sent to the
server to process

It avoids incomplete requests being sent to the server and the server
having to send back an error message to the user

If a server relies only on client-side validation, it introduces the risk of
depending on the client's capability to address the threat. It is always
beneficial to also have server-side validation to some degree
Cross-site Scripting (XSS)
XSS enables attackers to inject client-side scripts into web pages
viewed by other users

A cross-site scripting vulnerability may be used by attackers to


bypass access controls

3 different XSS vulnerabilities

Nonpersistent XSS
Persistent XSS
DOM (Document Object Model) XSS


Cross-site Scripting (XSS)
Nonpersistent XSS Vulnerability
also known as Reflected XSS
The malicious code is executed by the victim's browser,
and the payload is not stored anywhere; instead, it is
returned as part of the response HTML that the server
sends.
The victim is tricked into sending malicious code to
the vulnerable web application, which is then reflected back
to the victim's browser where the XSS payload executes
It is the most commonly carried out XSS attack
It is also called Type 1 XSS because the attack is carried
out through a single request / response cycle

[Diagram: reflected XSS request/response flow - source: acunetix.com]
Cross-site Scripting (XSS)
Persistent XSS Vulnerability
Also known as stored or second-order vulnerability
Targeted against websites that allow users to input data that is stored in a
database in the server
The attacker posts some text that contains malicious JavaScript, and when
other users later view the posts, their browsers render the page and execute the
attacker's JavaScript
Persistent XSS is less frequent than Non-Persistent XSS
The damage that Persistent XSS can do is more devastating than the damage
done by Non-Persistent XSS
Cross-site Scripting (XSS)
Document Object Model (DOM) XSS
Also referred to as local cross-site scripting

DOM is the standard structural layout used to represent HTML and
XML documents in the browser

The attacker uses the DOM environment to modify the original client-side
JavaScript

This causes the victim's browser to execute the resulting abusive JavaScript
code
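
A common mitigation across all three XSS variants is to encode untrusted data before writing it into the page. The sketch below is a hand-rolled escaper for illustration only; production code would normally rely on a vetted encoding library.

```java
public class HtmlEscaper {

    // Replace HTML metacharacters so injected markup is rendered as text,
    // not executed by the victim's browser.
    static String escape(String untrusted) {
        StringBuilder out = new StringBuilder(untrusted.length());
        for (char c : untrusted.toCharArray()) {
            switch (c) {
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '&':  out.append("&amp;");  break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#39;");  break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String payload = "<script>alert('xss')</script>";
        // The script tags arrive at the browser as inert text.
        System.out.println("<p>Hello " + escape(payload) + "</p>");
    }
}
```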
Parameter Validation
It is where the application validates the input values before the server
application processes them within the system
The difference from input validation is whether the application was
expecting the user to input a value

It includes pre-validation and post-validation controls

Pre-validation: input controls verifying data is in the appropriate format and
in line with application specifications prior to submission to the application

Post-validation: ensuring application output is as expected
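
A pre-validation control might look like the sketch below; the parameter name, allow-list pattern and length limit are illustrative assumptions.

```java
import java.util.regex.Pattern;

public class ParameterValidator {
    // Allow-list: only letters, digits and underscores, 3 to 20 characters.
    private static final Pattern USERNAME = Pattern.compile("^[A-Za-z0-9_]{3,20}$");

    static String requireUsername(String value) {
        if (value == null || !USERNAME.matcher(value).matches()) {
            throw new IllegalArgumentException("username does not match expected format");
        }
        return value;
    }

    public static void main(String[] args) {
        System.out.println(requireUsername("alice_01"));        // accepted
        System.out.println(requireUsername("alice'; DROP--"));  // rejected (throws)
    }
}
```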


Web Application Security Principles
These principles help in governing programming practices,
regulate programming styles and
strategically reduce the chances of repeating known bugs

The first step is to analyse the website architecture
The clearer and simpler the website is, the easier it is to analyse its various security aspects

Using encryption
Helps secure the input/output operations of a web application

Failing Securely
Systems that fail securely display friendly error messages without revealing internal system details

Maintaining an equilibrium between functionality and security
The best security measures are those that are simple and acceptable
Database Management
A database is a collection of data that is organized such that it can
be useful for the intended purpose(s)

A database management system (DBMS) is a software application
that interacts with the user, other applications and the database itself

It also enforces access control, provides data integrity and redundancy,
and sets up procedures for data manipulation

They help store, organize and retrieve information effectively and
efficiently
Database Characteristics
Any type of database should have the following:
Consistency among the data held on several servers

Easier backup procedures

Transaction persistence

Fault tolerance

Sharing

Controls that implement integrity checking, access control, and the


necessary level of confidentiality
Database Models
It defines the relationship between different data elements
Dictates how data can be accessed
Defines acceptable operations
The type of integrity offered, and how the data is organized
Types of database
Relational
Hierarchical
Network
Object-oriented
Object-relational
Relational Database Model
The most widely used model today

It uses attributes (columns) and tuples (rows) to contain and organize data

Data is represented in the form of tables

Each table contains unique rows, columns and cells

Each cell contains only one data value that represents a specific attribute value
within a given tuple

A Primary key is the field that links all the data within a record to a unique value

A foreign key is defined in a second table, but it refers to the primary key in the first
table.
Hierarchical Database Model
It is the first type of database model that was created, but it is not common now

It combines records and fields that are related in a logical tree structure

In this model, parents can have one child, many children or no children

It is not as flexible in creating relationships between data elements as a relational
database

They are useful in mapping one-to-many relationships

It does not use indexes for search procedures, and links cannot be established
between different branches and leaves on different layers

LDAP is the most common Hierarchical model


Network Database Model
It is built upon the hierarchical data model

It allows each data element to have multiple parent and child
records, forming a redundant network-like structure

It uses the constructs of records and sets

Records contain fields, which may be laid out in a hierarchical structure
Sets define the one-to-many relationships between the different records

Application has to use its own procedures to obtain data from the
database and then process the data for its needs
Object-oriented Database Model
It is designed to handle a variety of complex data types (images, audio, documents, video)
Data is stored and managed as objects rather than as rows and columns

It has classes to define the attributes and procedures of its objects


When an application queries for data, it not only returns the data but also the
code to carry out procedures on this data
It does not depend upon SQL for interactions
Object-relational Database Model
It is a relational database with a software front end that is written
in an OOP language

The front end allows business logic procedures to be
used by requesting applications against the data within the database
Database Programming Interfaces
Open Database Connectivity (ODBC)

Object Linking and Embedding Database (OLE DB)

ActiveX Data Objects (ADO)

Java Database Connectivity (JDBC)


Open Database Connectivity (ODBC)
Open standard application API for accessing databases

It allows programs to use SQL requests that will access databases
without having to know the proprietary interfaces to the
databases.

ODBC is a core component of Microsoft Windows Open Services


Architecture
Object Linking and Embedding Database (OLE DB)
Separates data into components that run as middleware on a client
or server

Characteristics
It is a replacement for ODBC

COM-based interfaces provide applications with uniform access to data

It is limited to being used by Microsoft

OLE DB services can be accessed through ADO

It allows different applications to access different types and sources of data


ActiveX Data Objects (ADO)
It is an API that allows applications to access back-end databases

It is a set of ODBC interfaces that exposes the functionality of data sources through
accessible objects

It uses the OLE DB interface to connect with the database and can be developed
with many different scripting languages

Characteristics
It is a high-level data access programming interface
It is a set of COM objects for accessing data sources, not just databases
It allows developers to write programs that access data without knowing how the database is
implemented
SQL commands are not required to access a database when using ADO
Java Database Connectivity (JDBC)
It allows a Java application to communicate with a database

The application can bridge through ODBC or connect directly to the
database

Characteristics
Provides the same functionality as ODBC but is specifically designed for
use by Java database applications
Has database-independent connectivity between the Java platform and a
wide range of databases
It enables Java programs to execute SQL statements
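
A minimal JDBC sketch is shown below; the JDBC URL, credentials, table and column names are placeholders (a PostgreSQL driver on the classpath is assumed), and the parameterized query also illustrates keeping user input out of the SQL string.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class JdbcExample {
    public static void main(String[] args) throws SQLException {
        String url = "jdbc:postgresql://localhost:5432/appdb";   // assumed example database

        try (Connection conn = DriverManager.getConnection(url, "app_user", "secret");
             // Parameterized query: the value is bound, not concatenated into the SQL.
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT name, email FROM customers WHERE id = ?")) {
            ps.setLong(1, 42L);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("name") + " " + rs.getString("email"));
                }
            }
        }
    }
}
```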
Relational Database Components
Component Description

Data Definition Language (DDL) - Defines the schema and structure of the database, access
operations and integrity procedures

Data Manipulation Language (DML) - Examines the data and defines how the data can be
manipulated within the database. Contains all the commands that enable a user to view,
manipulate and use the database

Data Control Language (DCL) - Defines the internal organization of the database

Query Language (QL) - Enables users to make requests to the database

Report Generator - Produces user-defined printouts


Data Dictionary
It is a central collection of data element definitions, schema objects and
reference keys

It can contain the default values for columns, integrity information, the
names of users, the privileges and roles for users, and auditing
information

It is a tool used to centrally manage parts of a database by controlling
data about the data

It provides cross-references between groups of data elements and the
databases
Integrity Services
Three main types of integrity services provided by Database:
Semantic Integrity:
Makes sure structural and semantic rules are followed.

Referential Integrity:
Mechanism to ensure no foreign key contains a reference to a primary key of a
nonexistent record or null value

Entity Integrity:
Guarantees that tuples are uniquely identified by primary key values; every tuple must
contain one primary key value
Integrity Protection Mechanisms
Mechanism Operations

Rollback - Operation that ends the current transaction and cancels the current changes to the
database. The database returns to its previous state

Commit - Completes a transaction and executes all changes just made by the user. This
ensures that partial changes do not take place and that data is not corrupted

Savepoints - Help make sure that if a system failure occurs, or if an error is detected, the
database can attempt to return to a point before the system crashed.
Having too many savepoints can degrade performance

Checkpoints - When the database software fills up a certain amount of memory, a checkpoint
is initiated. It saves the data from the memory segment to a temporary file. It is
similar to savepoints

Two-phase commit - The requests for database changes are put into a queue and activated all at
once. A pre-commit ensures all databases are ready before the commit
command is sent to each database
Database Security Issues
Aggregation
Act of combining information from separate sources. This combination
provides new information which the subject does not have the necessary
rights to access

The combined information has a sensitivity that is greater than that of the
individual parts

Inference
The ability to derive information not explicitly available
Database Security Issues Control mechanism
Content and Context Dependent access control

Content-dependent access control is based on the data

Context-dependent access control is based on the actions

Some techniques
Cell suppression hiding specific cells that contain information

Partitioning dividing database into different parts and controlling access

Noise and Perturbation - inserting bogus information to mislead the attacker

Polyinstantiation two instances of same object have different attributes


Online Transaction Processing
It is used when databases are clustered to provide fault tolerance
and higher performance

It provides mechanisms that watch for problems and deal with


them appropriately when they occur

The main goal is to ensure that transactions either happen properly or don't
happen at all

It can load-balance incoming requests if necessary.


OLTP - ACID Test
It provides integrity via ACID
Atomicity:
Either all changes are done or the database is rolled back

Consistency:
All data is consistent in the different databases

Isolation:
Transactions execute in isolation until completed, without interacting with other transactions

Durability:
Once the transaction is verified as accurate, it is committed and the database cannot be
rolled back
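
Atomicity can be illustrated with a JDBC transaction sketch; the connection details and table layout are illustrative assumptions, and the point is that both updates commit together or the whole transaction is rolled back.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class TransferExample {
    public static void main(String[] args) throws SQLException {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/bank", "app_user", "secret")) {
            conn.setAutoCommit(false);            // start an explicit transaction
            try (PreparedStatement debit = conn.prepareStatement(
                         "UPDATE accounts SET balance = balance - ? WHERE id = ?");
                 PreparedStatement credit = conn.prepareStatement(
                         "UPDATE accounts SET balance = balance + ? WHERE id = ?")) {
                debit.setLong(1, 100); debit.setLong(2, 1); debit.executeUpdate();
                credit.setLong(1, 100); credit.setLong(2, 2); credit.executeUpdate();
                conn.commit();                    // both changes become durable together
            } catch (SQLException e) {
                conn.rollback();                  // database returns to its previous state
                throw e;
            }
        }
    }
}
```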
Data Warehouse
Data warehousing combines data from multiple databases into a
large database

The data warehouse is developed to carry out analysis

It provides base data that is then processed and presented in a
more useful and meaningful way
Data Mining
Process of massaging the data held in the data warehouse into more
useful information
They are used to find an association and correlation in data to produce
metadata
Metadata can reveal abnormal patterns not previously apparent
Metadata is the result of storing data within a data warehouse and mining the
data with tools
Goal is to extract information to gain knowledge about activities and
trends that were not visible earlier
Data Mining - KDD
It is also known as Knowledge discovery in database

It is a combination of techniques to identify valid and useful


patterns

Three approaches used in KDD are


Classification: Groups together data according to shared similarities
Probabilistic: Identifies data interdependencies and applies probabilities to
their relationships
Statistical: Identifies relationships between data elements and uses rule
discovery
Malicious Software
Type Description

Virus Application or code that infects software; Main function is to replicate and deliver its payload; It
requires a host application to replicate

Macro virus Written in one of the macro languages and it is platform independent
It is one of the more common virus types
Boot sector virus Infects the boot sector and either move data within boot sector or overwrite the sector with new
information

Stealth virus It hides the modifications it makes

Polymorphic virus It produces varied but operational copies of itself, it can use different encryption schemes
requiring different routines
Has the capability to change its own code
Multipart virus Has several components to it and can be distributed to different parts of the system

Meme virus It is actually not a virus, but a type of email message that is continuously forwarded by users

Script virus The virus carries out the instructions that are integrated into the script

Tunneling virus Attempts to install itself under the antimalware program and remain stealthy
Malicious Software
Type Description

Worm They can reproduce on their own and are self-contained programs

Rootkit A bundle of tools that allow an attacker backdoor entry into the victim system; rootkits commonly
include log scrubbers, which remove traces of the attacker's activities from the system logs.
Rootkit detection may be difficult because the rootkit may be able to subvert the AV solution.
The best remediation is to reinstall the OS
Spyware Malware that is covertly installed to gather sensitive information about a victim

Adware Software that automatically generates advertisements

Botnets Group of bots (compromised hosts) under the control of a Command and Control server to
perform activities planned by the attacker
Logic Bomb It executes a program when a certain set of conditions is met

Trojan Horse It is a program that is disguised as another program. They perform useful functionality in addition
to the malicious functionality in the background. It is one of the fastest growing malware types
Antimalware software
Type Description

Signature based Effective for detecting known malware; it cannot address new variants

Heuristic detection Analyzes the overall structure of the code, evaluates the coded instructions and logic
functions, and then assesses the likelihood of it being malicious. It helps detect unknown malware
Immunizer An immunizer attaches code to a file, which would fool a virus into thinking the file is already infected
Karthikeyan Dhayalan
MD & Chief Security Partner

www.cyintegriti.com
