
RSA Adaptive Authentication (On-Premise) 7.1
Performance Guide
Contact Information
Go to the RSA corporate web site for regional Customer Support telephone and fax numbers: www.rsa.com
Trademarks
RSA, the RSA Logo and EMC are either registered trademarks or trademarks of EMC Corporation in the United States and/or
other countries. All other trademarks used herein are the property of their respective owners. For a list of RSA trademarks, go
to www.rsa.com/legal/trademarks_list.pdf.
License agreement
This software and the associated documentation are proprietary and confidential to EMC, are furnished under license, and
may be used and copied only in accordance with the terms of such license and with the inclusion of the copyright notice
below. This software and the documentation, and any copies thereof, may not be provided or otherwise made available to any
other person.
No title to or ownership of the software or documentation or any intellectual property rights thereto is hereby transferred. Any
unauthorized use or reproduction of this software and the documentation may be subject to civil and/or criminal liability.
This software is subject to change without notice and should not be construed as a commitment by EMC.
Note on encryption technologies
This product may contain encryption technology. Many countries prohibit or restrict the use, import, or export of encryption
technologies, and current use, import, and export regulations should be followed when using, importing or exporting this
product.
Distribution
Use, copying, and distribution of any EMC software described in this publication requires an applicable software license.

EMC believes the information in this publication is accurate as of its publication date. The information is subject to change
without notice.

THE INFORMATION IN THIS PUBLICATION IS PROVIDED "AS IS." EMC CORPORATION MAKES NO
REPRESENTATIONS OR WARRANTIES OF ANY KIND WITH RESPECT TO THE INFORMATION IN THIS
PUBLICATION, AND SPECIFICALLY DISCLAIMS IMPLIED WARRANTIES OF MERCHANTABILITY OR
FITNESS FOR A PARTICULAR PURPOSE.

Copyright 2013 EMC Corporation. All Rights Reserved.

Printed: July 2013



Contents
Preface................................................................................................................................... 7
About This Guide................................................................................................................ 7
RSA Adaptive Authentication (On-Premise) Documentation ............................................ 7
Support and Service ............................................................................................................ 8
Before You Call Customer Support............................................................................. 8

Chapter 1: Performance Test Results Introduction .................................. 9


Purpose of this Guide.......................................................................................................... 9
Performance Test Environment .......................................................................................... 9
Performance Test Process ................................................................................................... 9
Performance Test Objectives ............................................................................................ 10

Chapter 2: Performance Test Environment...................................................11


Lab and Test Machine Specifications ................................................................................11
Test Environment Preconditions....................................................................................... 12
Performance Test Conditions............................................................................................ 12
WebSphere Server Hardware Specifications .................................................................... 12
JVM Configuration .................................................................................................... 13
Data Source Connection Pool Configuration............................................................. 14
Thread Pool Configuration ........................................................................................ 14
Apache Tomcat Server Hardware Specifications ............................................................. 15
Configuring JVM Options ......................................................................................... 16
Configuring the Data Source Connection Pool.......................................................... 16
Configuring the Thread Pool ..................................................................................... 16
WebLogic Server Hardware Specifications ...................................................................... 16
JVM Configuration .................................................................................................... 17
Data Source Connection Pool Configuration............................................................. 18
Thread Pool Configuration ........................................................................................ 19
JBoss Server Hardware Specifications ............................................................................. 19
JVM Configuration .................................................................................................... 20
Data Source Connection Pool Configuration............................................................. 21
Thread Pool Configuration ........................................................................................ 21

Chapter 3: Test Results for WebSphere ......................................................... 23


WebSphere Performance Test Statistics ........................................................................... 23
Statistics Summary .................................................................................................... 23
Transaction Summary ................................................................................................ 23
Average Response Time ............................................................................................ 24
Test Results Showing Transactions Per Second ............................................................... 25
Total Transactions per Second................................................................................... 25
Transactions per Second ............................................................................................ 26
Transaction Summary ................................................................................................ 27
Test Results Showing Response Time .............................................................................. 28


User Influence............................................................................................................ 28
Overlay of Running Vusers and Average Transaction Response Time .................... 29
Transaction Response Time (Percentile) ................................................................... 30
Server Side Monitors ........................................................................................................ 31
Throughput................................................................................................................. 31
Server Performance.................................................................................................... 32

Chapter 4: Test Results for Apache Tomcat ............................................... 35


Apache Tomcat Performance Test Statistics .................................................................... 35
Statistics Summary .................................................................................................... 35
Transaction Summary ................................................................................................ 35
Average Response Time ............................................................................................ 36
Test Results Showing Transactions Per Second ............................................................... 36
Total Transactions per Second................................................................................... 37
Transactions per Second ............................................................................................ 38
Transaction Summary ................................................................................................ 39
Test Results Showing Response Time .............................................................................. 40
User Influence............................................................................................................ 40
Overlay of Running Vusers and Average Transaction Response Time .................... 41
Transaction Response Time (Percentile) ................................................................... 42
Server Side Monitors ........................................................................................................ 43
Throughput................................................................................................................. 43
Server Performance.................................................................................................... 44
Apache Tomcat Resource Usage ............................................................................... 45

Chapter 5: Test Results for WebLogic ............................................................ 47


WebLogic Performance Test Statistics ............................................................................. 47
Statistics Summary .................................................................................................... 47
Transaction Summary ................................................................................................ 47
Average Response Time ............................................................................................ 48
Test Results Showing Transactions Per Second ............................................................... 49
Total Transactions per Second................................................................................... 49
Transactions per Second ............................................................................................ 50
Transaction Summary ................................................................................................ 51
Test Results Showing Response Time .............................................................................. 52
User Influence............................................................................................................ 52
Overlay of Running Vusers and Average Transaction Response Time .................... 53
Transaction Response Time (Percentile) ................................................................... 54
Server Side Monitors ........................................................................................................ 55
Throughput................................................................................................................. 55
Server Performance.................................................................................................... 56

Chapter 6: Test Results for JBoss ..................................................................... 59


JBoss Performance Test Statistics..................................................................................... 59
Statistics Summary .................................................................................................... 59
Transaction Summary ................................................................................................ 59


Average Response Time ............................................................................................ 60


Test Results Showing Transactions Per Second ............................................................... 61
Total Transactions per Second................................................................................... 61
Transactions per Second ............................................................................................ 62
Transaction Summary ................................................................................................ 63
Test Results Showing Response Time .............................................................................. 64
User Influence............................................................................................................ 64
Overlay of Running Vusers and Average Transaction Response Time .................... 65
Transaction Response Time (Percentile) ................................................................... 66
Server Side Monitors ........................................................................................................ 67
Throughput................................................................................................................. 67
Server Performance.................................................................................................... 68


Preface

About This Guide


This guide contains information on managing performance of RSA Adaptive
Authentication (On-Premise) 7.1. It is intended for administrators and other trusted
personnel. Do not make this guide available to the general user population.

RSA Adaptive Authentication (On-Premise) Documentation


For more information about RSA Adaptive Authentication (On-Premise), see the
following documentation:
Authentication Plug-In Developers Guide. Describes the Authentication Plug-In
development process that enables external authentication providers to integrate
their products with RSA Adaptive Authentication (On-Premise).
Back Office Users Guide. Provides an overview of the following Back Office
applications: Policy Management, Case Management, Access Management,
Customer Service Administration, and the Report Viewer.
Bait Credentials Setup and Implementation Guide. Describes how to set up and
implement RSA bait credentials, which help provide you with accelerated fraud
detection and prevention capabilities.
Best Practices for Challenge Questions. Describes the best practices related to
challenge questions that RSA has evolved through experience at multiple
deployments.
Installation and Upgrade Guide. Describes detailed procedures on how to install,
upgrade, and configure RSA Adaptive Authentication (On-Premise).
Integration Guide. Describes how to integrate and deploy
RSA Adaptive Authentication (On-Premise).
Operations Guide. Provides information on how to administer and operate
RSA Adaptive Authentication (On-Premise) after upgrade. This guide also
describes how to configure RSA Adaptive Authentication (On-Premise) within
the Configuration Framework.
Performance Guide. Provides information about performance testing and
performance test results for the current release version of
RSA Adaptive Authentication (On-Premise).
Product Overview Guide. Provides a high-level overview of
RSA Adaptive Authentication (On-Premise), including system architecture.
Release Notes. Provides information about what is new and changed in this
release, as well as workarounds for known issues. It also includes the supported
platforms and work environments for platform certifications. The latest version of
the Release Notes is available on RSA SecurCare Online at
https://knowledge.rsasecurity.com.

Security Best Practices Guide. Provides recommendations for configuring your
network and RSA Adaptive Authentication (On-Premise) securely.
Web Services API Reference Guide. Describes RSA Adaptive Authentication
(On-Premise) web services API methods and parameters. This guide also
describes how to build your own web services clients and applications that use
the web services API to integrate and utilize the capabilities of
RSA Adaptive Authentication (On-Premise).
What's New. Highlights new features and enhancements in
RSA Adaptive Authentication (On-Premise) 7.1.
Workflows and Processes Guide. Describes the workflows and processes that
allow end users to interact with your system and that allow your system to interact
with RSA Adaptive Authentication (On-Premise).

Support and Service


RSA SecurCare Online https://knowledge.rsasecurity.com

Customer Support Information www.emc.com/support/rsa/index.htm

RSA Solution Gallery https://gallery.emc.com/community/marketplace/rsa?view=overview

RSA SecurCare Online offers a knowledgebase that contains answers to common
questions and solutions to known problems. It also offers information on new releases,
important technical news, and software downloads.
The RSA Solution Gallery provides information about third-party hardware and
software products that have been certified to work with RSA products. The gallery
includes Secured by RSA Implementation Guides with step-by-step instructions and
other information about interoperation of RSA products with these third-party
products.

Before You Call Customer Support


Make sure that you have direct access to the computer running the RSA Adaptive
Authentication (On-Premise) software.
Please have the following information available when you call:
Your RSA Customer/License ID.
RSA Adaptive Authentication (On-Premise) software version number.
The make and model of the machine on which the problem occurs.
The name and version of the operating system under which the problem occurs.

Chapter 1: Performance Test Results Introduction


Purpose of this Guide
Performance Test Environment
Performance Test Process
Performance Test Objectives

Purpose of this Guide


This guide provides the results of the performance tests of Adaptive Authentication
(On-Premise) 7.1 on commonly used supported platforms.
The performance tests measure system capacity and limitations within the application
landscape. They provide details around robustness under various load conditions. The
guide includes information about the test setup and related software configurations.
This information may be useful to you as you set up your own test environment.

Note: The test results described in this guide are provided for reference only and may
differ from test results obtained in other test environments.

Performance Test Environment


The performance test environment uses commercial off-the-shelf load generation
software that provides the following items:
Utilities that facilitate script writing and script execution for the purpose of
running specific load test scenarios.
Software for monitoring resource consumption, response times, system health
measurements, and other system parameters, and then outputting the collected
data in graphs and tables.

Performance Test Process


The test plan phase determines which tests to run based on changes made to the
Adaptive Authentication (On-Premise) software.
The performance testing team develops scripts to drive the load generators through
various tests. To prepare the test platforms for testing, the team also develops scripts to
populate the database with enough real-world user records and event logs to simulate a
production system. A large volume of user records and event logs is essential for
avoiding constant reuse of the same user records and event logs which can lead to data
collisions and deadlocks.


The team runs scripts up to three times on each platform to provide reliable test
results. The team collects and correlates the results to arrive at conclusions.
During product development, performance test results help detect potential
performance issues. The development team addresses the issues, and the software is
retested to determine whether the issues are resolved. Finally, performance tests are
run on the final product version. The final test results for representative environments
are provided in this guide.
For details about the Adaptive Authentication (On-Premise) performance test
environment, see Performance Test Environment on page 11.

Performance Test Objectives


Performance testing has the following objectives:
Detect any performance issues resulting from new features or other software
changes.
Detect system or component failures which might have occurred as a result of
high load on the system.
Detect congestion within the system.
Avoid exhaustion of server resources during high load on the system.
Quantify the customer experience (response time, transactions per second) during
high load.
Quantify the performance of online API requests and offline tasks.
Determine the database upgrade time.
Determine whether the system performance meets or exceeds the performance
results of the previous version of the same product.


Chapter 2: Performance Test Environment


Lab and Test Machine Specifications
Test Environment Preconditions
Performance Test Conditions
WebSphere Server Hardware Specifications
Apache Tomcat Server Hardware Specifications
WebLogic Server Hardware Specifications
JBoss Server Hardware Specifications
This chapter describes the test environment used to achieve the test results described
in Test Results for WebSphere, Test Results for Apache Tomcat, Test Results for
WebLogic, and Test Results for JBoss. You may use the information in this chapter as
a guideline to establish your own test environment.

Lab and Test Machine Specifications


The following diagram shows test control machines and the data flows to the
machines under test.

[Figure: Test control virtual machines in the RSA test laboratory (Performance Test
Manager, Load Controller, Server Monitor, and Load Generators) apply load to the
Adaptive Authentication server, which reads from and writes to the DB server
(Solaris 10 5/08 s10s_u5wos_10 SPARC). The test control machines also monitor the
Adaptive Authentication systems.]


Test Environment Preconditions


Test environment preconditions include loading the database with event logs and user
records to simulate a production environment. A high data volume and wide data
variation, such as the volumes shown in this section, simulate a production setting. A
low number of event logs and user records can lead to overuse of the data in tables,
resulting in system congestion and data collisions.
For both Solaris and Windows servers, the database is pre-populated as follows:
100 million records in the EVENT_LOG table
1 million records in the USERS table

Performance Test Conditions


The performance tests are performed under the following conditions:
Machines are fully reimaged with operating system and Adaptive Authentication
(On-Premise) software.
Load clients are used to perform load tests on different platforms of the Adaptive
Authentication (On-Premise) product.

WebSphere Server Hardware Specifications


The following diagram shows the tested WebSphere environment.

[Figure: Adaptive Authentication server (Solaris 10, WebSphere, 64-bit) connected to
the DB server (Oracle with partitions)]

The following table describes the application server software configuration for the
Solaris 10 with WebSphere test machines.

WebSphere Application Server Software Configuration

OS Name Solaris 10

OS Version Solaris 10 5/08 s10s_u5wos_10 SPARC

System Model Sun T5220

Processor(s) 1 SUNW, UltraSPARC-T2, 8 Core, 1.2 GHz, 1167 MHz

Number of Threads 8


Total Physical Memory 32 GB

Web Server Software WebSphere 8.0 (64-bit)

JVM See JVM Configuration

The following table describes the database server software configuration for the
Solaris 10 with WebSphere test machines.

Oracle Database Server Software Configuration

OS Name Solaris 10

OS Version Solaris 10 5/08 s10s_u5wos_10 SPARC

System Model Sun T5220

Processor(s) 1 SUNW, UltraSPARC-T2, 8 Core, 1.2 GHz, 1167 MHz

Number of Threads 8

Total Physical Memory 32 GB

Database Software Oracle 11 with partitions

JVM Configuration
A single WebSphere 8.0 instance is run on the application machine using the
following JVM (Java Virtual Machine) configuration:
JVM Version
java version "1.6.0_31"
Java 2 Runtime Environment, Standard Edition (IBM J6_26 build 1.6.0_31-b04
29_Feb_2012_23_18 solaris sparcv9 (SR1 FP1))
Java HotSpot Server VM (build 20.6-b01, mixed mode)
IBM Java ORB build orb626sr1fp1-20120206.00
XML build XL TXE Java 1.0.27
XML build IBM JAXP 1.4.6
XML build XML4J 4.5.17
JVM Options
initialHeapSize="3024"
maximumHeapSize="3024"
ache.xerces.parsers.XML11Configuration
-Djavax.management.builder.initial=
-Dcom.sun.management.jmxremote.port=9004
-Dcom.sun.management.jmxremote.authenticate=false


-Dcom.sun.management.jmxremote.ssl=false -server
-XX:-UseAdaptiveSizePolicy -XX:+UseParallelGC
-XX:NewSize=1268m -XX:MaxNewSize=1268m -XX:+DisableExplicitGC
-XX:SurvivorRatio=8 -XX:MaxPermSize=512M
-XX:ParallelGCThreads=5 -XX:+PrintGCDetails
-XX:+PrintGCTimeStamps

Data Source Connection Pool Configuration


The following table lists the recommended data source connection pool properties.

Property Recommended Value

Scope Cell scope

Connection Timeout 1,800

Maximum Connections 40

Minimum Connections 40

Reap Time 180

Unused Timeout 1,800

Aged Timeout 0

Purge Policy FailingConnectionOnly

Statement Cache 100

Note: When conducting the performance test, adjust the exact values to the expected
volumes. Generally, the data source connection pool size should be almost twice the
thread pool size.

Thread Pool Configuration


The following table lists the recommended thread pool properties.

Property Recommended Thread Pool Value

Minimum Size 10

Maximum Size 100

Thread Inactivity Timeout 60,000


Apache Tomcat Server Hardware Specifications


RSA Adaptive Authentication (On-Premise) 7.1 is tested on Windows 2008, using
Apache Tomcat 7.0 web server and MS SQL 2008 R2. This section describes the
Apache Tomcat performance environment settings. The following figure shows the
Windows test platform.

[Figure: Adaptive Authentication server (Microsoft Windows Server, Apache Tomcat)
connected to the DB server (Microsoft SQL 2008 R2)]

The following table describes the application server software configuration for the
Microsoft Windows with Apache Tomcat test machines.

Apache Tomcat Application Server Software Configuration

OS Name Microsoft Windows Server 2008 R2 Enterprise

OS Version 6.1.7600 N/A Build 7600

System Model Dell PowerEdge R610

System Type x64-based PC

Processor(s) 2 Intel64 Family 6 Model 44 Stepping 2 GenuineIntel ~2660 Mhz

BIOS Version Dell Inc. 3.0.0, 1/31/2011

Total Physical Memory 65,523 MB

Web Server Software Apache Tomcat 7.0 installed on the Adaptive Authentication
server system

JVM Java HotSpot 64-Bit Server VM version 16.0-b13

The following table describes the database server software configuration for the
Microsoft Windows with Apache Tomcat test machines.

MS SQL Database Server Software Configuration

OS Name Microsoft Windows Server 2008 R2 Enterprise

OS Version 6.1.7600 N/A Build 7600

System Model Dell PowerEdge R610

System Type x64-based PC


Processor(s) 2 Intel64 Family 6 Model 44 Stepping 2 GenuineIntel ~2660 Mhz

BIOS Version Dell Inc. 3.0.0, 1/31/2011

Total Physical Memory 65,523 MB

Database Software Microsoft SQL 2008 R2 installed on the database system

Configuring JVM Options


A single Apache Tomcat 7.0 instance was run on the application machine using JVM
version Java HotSpot 64-Bit Server VM 16.0-b13 and the following options:

JVM Options
-server -Xms2576m -Xmx2576m
-XX:PermSize=256m -XX:MaxPermSize=512m -XX:NewSize=1024m
-XX:MaxNewSize=1024m -XX:+AggressiveHeap -XX:+PrintGCDetails
-XX:+PrintGCTimeStamps -verbose:gc
-Xloggc:%CATALINA_HOME%\logs\gc.log

Configuring the Data Source Connection Pool


The following JDBC connector settings were used:
<Resource auth="Container"
driverClassName="com.microsoft.sqlserver.jdbc.SQLServerDriver"
maxActive="100" maxIdle="30" maxWait="10000" name="jdbc/PassMarkDB"
password="rsa_core_user" type="javax.sql.DataSource"
url="jdbc:sqlserver://servername:1433;databaseName=db_name"
username="user_name"/>

Configuring the Thread Pool


The following default thread settings were used:
<Connector port="8080" protocol="HTTP/1.1"
connectionTimeout="20000"
redirectPort="8443" />
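The default connector above uses Tomcat's built-in per-connector thread pool. If you want to pin explicit minimum and maximum thread counts, comparable to the thread pool values recommended for the other application servers in this chapter, one option in Tomcat 7.0 is a shared executor. The following fragment is an illustrative sketch, not part of the tested configuration; the executor name is an arbitrary example:

```xml
<!-- Illustrative sketch only: an explicit shared thread pool for Tomcat 7.0.
     The executor name "tomcatThreadPool" is an arbitrary example. -->
<Executor name="tomcatThreadPool" namePrefix="catalina-exec-"
          minSpareThreads="10" maxThreads="100"
          maxIdleTime="60000" />

<!-- The connector references the executor instead of its built-in pool. -->
<Connector executor="tomcatThreadPool" port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           redirectPort="8443" />
```

Both elements go in conf/server.xml inside the same Service element.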

WebLogic Server Hardware Specifications


The following diagram shows the tested WebLogic environment.

[Figure: Adaptive Authentication server (Solaris 10, WebLogic, 64-bit) connected to
the DB server (Oracle with partitions)]


The following table describes the application server software configuration for the
Solaris 10 with WebLogic test machines.

WebLogic Application Server Software Configuration

OS Name Solaris 10

OS Version Solaris 10 5/08 s10s_u5wos_10 SPARC

System Model Sun T5220

Processor(s) 1 SUNW, UltraSPARC-T2, 8 Core, 1.2 GHz, 1167 MHz

Number of Threads 8

Total Physical Memory 32 GB

Web Server Software WebLogic 10.3 (64-bit)

JVM See JVM Configuration

The following table describes the database server software configuration for the
Solaris 10 with WebLogic test machines.

Oracle Database Server Software Configuration

OS Name Solaris 10

OS Version Solaris 10 5/08 s10s_u5wos_10 SPARC

System Model Sun T5220

Processor(s) 1 SUNW, UltraSPARC-T2, 8 Core, 1.2 GHz, 1167 MHz

Number of Threads 8

Total Physical Memory 32 GB

Database Software Oracle 10 with partitions

JVM Configuration
A single WebLogic 10.3 instance is run on the application machine using the
following JVM configuration:
JVM Version
java version "1.6.0_31"
Java 2 Runtime Environment, Standard Edition (IBM J6_26 build 1.6.0_31-b04
29_Feb_2012_23_18 solaris sparcv9 (SR1 FP1))
Java HotSpot Server VM (build 20.6-b01, mixed mode)
IBM Java ORB build orb626sr1fp1-20120206.00
XML build XL TXE Java 1.0.27


XML build IBM JAXP 1.4.6
XML build XML4J 4.5.17
JVM Options
Xmx 3072M
Xms 4096M
ache.xerces.parsers.XML11Configuration
-Djavax.management.builder.initial=
-Dcom.sun.management.jmxremote.port=9004
-Dcom.sun.management.jmxremote.authenticate=false
-Dcom.sun.management.jmxremote.ssl=false -server
-XX:-UseAdaptiveSizePolicy -XX:+UseParallelGC
-XX:NewSize=1268m -XX:MaxNewSize=1268m -XX:+DisableExplicitGC
-XX:NewRatio=4 -XX:SurvivorRatio=12 -XX:MaxPermSize=512M
-XX:ParallelGCThreads=5 -XX:+PrintGCDetails
-XX:+PrintGCTimeStamps

Data Source Connection Pool Configuration


The following table lists the recommended data source connection pool properties.

Property Recommended Value

Init jdbc Pools Size 100

Max jdbc Pools Size 100

Statement Cache 100

Note: When conducting the performance test, adjust the exact values to the expected
volumes. Generally, the data source connection pool size should be almost twice the
thread pool size.
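In WebLogic 10.3, pool properties of this kind are set through the connection pool parameters of a JDBC data source. As a hedged sketch only, the recommended values above might appear in a JDBC module descriptor as follows; the surrounding data source definition and driver parameters are omitted and would come from your own configuration:

```xml
<!-- Sketch only: connection pool fragment of a WebLogic JDBC module,
     using the recommended values from the table above. -->
<jdbc-connection-pool-params>
  <initial-capacity>100</initial-capacity>
  <max-capacity>100</max-capacity>
  <statement-cache-size>100</statement-cache-size>
</jdbc-connection-pool-params>
```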


Thread Pool Configuration


The following table lists the recommended thread pool properties.

Property Recommended Value

Minimum Size 10

Maximum Size 100

Thread Inactivity Timeout 60,000

JBoss Server Hardware Specifications


The following diagram shows the Linux environment tested on a JBoss server.
[Figure: Adaptive Authentication Server (Linux RedHat, JBoss, 64-bit) connected to the DB Server (Oracle 11g).]

The following table describes the application server software configuration for the
Linux with JBoss test machines.

JBoss Application Server Software Configuration

OS Name Linux RedHat

OS Version 6

System Model Dell

Processor(s) x86-64, 1.2 GHz

Number of Threads 64

Total Physical Memory 32 GB

Web Server Software JBoss 5.1

JVM See JVM Configuration


The following table describes the database server software configuration for the Linux
with JBoss test machines.

Oracle Database Server Software Configuration

OS Name Linux RedHat

OS Version 6

System Model Dell

Processor(s) x86-64, 2.8 GHz

Number of Threads 2

Total Physical Memory 16 GB

Database Software Oracle 11g

JVM Configuration
A single instance of JBoss 5.1 was run on the application machine using the following
JVM configuration:
JVM Version
java version "1.6.0_31"
Java 2 Runtime Environment, Standard Edition (IBM J6_26 build 1.6.0_31-b04
29_Feb_2012_23_18 solaris sparcv9 (SR1 FP1))
Java HotSpot Server VM (build 20.6-b01, mixed mode)
IBM Java ORB build orb626sr1fp1-20120206.00
XML build XL TXE Java 1.0.27
XML build IBM JAXP 1.4.6
XML build XML4J 4.5.17
JVM Options
-Xmx2512M
-Xms2512M
-server
-XX:+UseConcMarkSweepGC
-XX:NewRatio=2 -XX:SurvivorRatio=20 -XX:MaxPermSize=256M


Data Source Connection Pool Configuration


The following table lists the recommended data source connection pool properties.

Property Recommended Value

Initial JDBC Pool Size 100

Maximum JDBC Pool Size 100

Statement Cache

Note: When conducting the performance test, adjust the exact values to the expected
volumes. Generally, the data source connection pool size should be roughly twice the
thread pool size.

Thread Pool Configuration


The following table lists the recommended thread pool properties.

Property Recommended Value

Minimum Size 10

Maximum Size 100

Thread Inactivity Timeout 60,000


3 Test Results for WebSphere


WebSphere Performance Test Statistics
Test Results Showing Transactions Per Second
Test Results Showing Response Time
Server Side Monitors
This chapter provides performance test results for RSA Adaptive Authentication
(On-Premise) 7.1 on a Solaris 10 system running a WebSphere 8.0 web server and an
Oracle 11 database with partitions.

WebSphere Performance Test Statistics


Performance tests on Solaris demonstrate that Adaptive Authentication (On-Premise)
7.1 meets or exceeds established performance requirements in this environment.

Statistics Summary

Parameter Value

Maximum Running Vusers 50

Total Throughput 2,516,031,536 bytes

Average Throughput 307,809 bytes/second

Total Hits 1,013,171

Maximum Transactions per Second 196

Average response time for peak TPS 260 ms

Maximum Application Server CPU 45%

Maximum Database Server CPU 10%
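The summary above lists total and average throughput but not the test duration; as a rough cross-check, the duration follows from dividing the two. The figures below are taken directly from the table; the derived duration is an approximation, not a value stated in the source.

```python
# Approximate test duration derived from the statistics summary above.
total_bytes = 2_516_031_536       # Total Throughput
avg_bytes_per_sec = 307_809       # Average Throughput

duration_sec = total_bytes / avg_bytes_per_sec
print(round(duration_sec))            # roughly 8,174 seconds
print(round(duration_sec / 3600, 2))  # about 2.27 hours of load
```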

Transaction Summary
Performance tests yielded the following total passed, failed, and stopped transactions:
Total Passed: 1,013,170
Total Failed: 1
Total Stopped: 0


Average Response Time


The following table shows response time statistics for the various transaction types
used for testing. 90th percentile is used to mitigate the effects of statistical outliers.

Transaction Name   Min. (sec)   Ave. (sec)   Max. (sec)   Std. Dev.   90th Percentile   Pass   Fail   Stop

Analyze 0.129 0.216 7.485 0.059 0.316 729,436 0 0

Authenticate 0.106 0.169 0.609 0.049 0.234 68,925 0 0

Challenge 0.031 0.071 0.516 0.031 0.1 68,925 0 0

CreateUser 0.117 0.188 0.531 0.051 0.266 17,035 0 0

FailedSignIn 0.135 0.22 3.531 0.06 0.315 68,925 1 0

Payment 0.133 0.219 8.5 0.07 0.315 42,889 0 0

UpdateUser 0.047 0.1 0.438 0.036 0.129 17,035 0 0
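The 90th-percentile statistic in the table above can be computed as sketched below, using the nearest-rank definition (one of several common conventions; the source does not specify which its tooling uses). The sample times are illustrative, not taken from the table.

```python
import math

def percentile_90(samples):
    """Nearest-rank 90th percentile of a list of response times."""
    ordered = sorted(samples)
    rank = math.ceil(0.9 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Illustrative response times (seconds): one outlier inflates the mean,
# but barely moves the 90th percentile, which is why it is reported.
times = [0.13, 0.15, 0.16, 0.18, 0.19, 0.20, 0.22, 0.25, 0.31, 7.49]
print(percentile_90(times))  # 0.31
```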


Test Results Showing Transactions Per Second


Total Transactions per Second
Transactions per Second
Transaction Summary

Total Transactions per Second


The Total Transactions per Second test data displays the total number of completed
transactions (both successful and unsuccessful) performed during each second of a
load test. This type of graph helps determine the actual transaction load on the system
at any given point in time.
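The per-second totals behind such a graph can be derived as sketched below: bucket each transaction's completion time (seconds since test start) into one-second bins, counting passed and failed transactions alike. This is a minimal illustration, not the load-test tool's actual implementation.

```python
from collections import Counter

def transactions_per_second(completion_times):
    """Count completed transactions per 1-second bin."""
    return Counter(int(t) for t in completion_times)

# Illustrative completion timestamps in seconds since test start.
ends = [0.2, 0.7, 1.1, 1.4, 1.9, 2.3]
tps = transactions_per_second(ends)
print(tps[1])  # 3 transactions completed during second 1
```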


Transactions per Second


The Transactions per Second test data displays the number of completed transactions
(both successful and unsuccessful) performed during each second of a load test. This
type of graph helps determine the actual transaction load on the system at any given
point in time.


Transaction Summary
The Transaction Summary test data displays the number of transactions that passed,
failed, stopped, or ended with errors.


Test Results Showing Response Time


User Influence
Overlay of Running Vusers and Average Transaction Response Time
Transaction Response Time (Percentile)

User Influence
The User Influence test data displays the average transaction response time relative to
the number of Vusers running at any given point during the load test. This type of
graph helps view the general impact of Vuser load on performance time and is most
useful when analyzing a load test which is run with a gradual load.


Overlay of Running Vusers and Average Transaction Response Time


The Overlay of Running Vusers and Average Transaction Response Time test data
shows how increasing the number of Vusers (virtual users) increases the average
response times for various transaction types.


Transaction Response Time (Percentile)


The Transaction Response Time test data displays the percentage of transactions that
are performed within a given time range. This type of graph helps determine the
percentage of transactions that meet the performance criteria defined for your system.
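The percentage-within-a-range view can be sketched as a simple cumulative check: the share of transactions completing within a given response-time threshold. The threshold and sample times below are illustrative assumptions.

```python
def fraction_within(samples, threshold):
    """Share of transactions completing within `threshold` seconds."""
    return sum(1 for t in samples if t <= threshold) / len(samples)

# Illustrative response times (seconds) against a 0.25 s criterion.
times = [0.05, 0.08, 0.11, 0.13, 0.20, 0.45]
print(round(fraction_within(times, 0.25), 2))  # 0.83
```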


Server Side Monitors


Throughput
Server Performance

Throughput
The Throughput test data displays the amount of throughput (in bytes) on the web
server during the load test. Throughput represents the amount of data that the Vusers
receive from the server during any given second. This type of graph helps you to
evaluate the amount of load Vusers generate, in terms of server throughput.


Server Performance
The Server Performance test data displays a summary of the System UNIX Resources
usage for each UNIX based host.


The following table describes the measurements used in the Server Performance test
data summary.

Measurement Description

Average load Average number of processes simultaneously in the Ready state
during the most recent minute of testing. A process in the Ready
state is ready to run but cannot run because the CPU is busy. The
monitor samples data every five seconds, so the average is taken
over the most recent twelve samples.

Context switch rate Number of switches between processes or threads, per second.

CPU Utilization Percentage of time that the CPU is utilized.

Swap-In Rate Number of processes swapped into memory, per second.

System mode CPU Utilization Percentage of time the CPU is utilized in system mode.

User mode CPU Utilization Percentage of time the CPU is utilized in user mode.
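The "Average load" sampling described in the table above (one sample every five seconds, averaged over the most recent minute, i.e. twelve samples) can be sketched as follows. The class and method names are illustrative, not part of any monitoring product.

```python
from collections import deque

class LoadMonitor:
    """Sliding-window average of Ready-state process counts."""
    def __init__(self, window: int = 12):
        self.samples = deque(maxlen=window)  # keeps only the latest minute

    def record(self, ready_processes: int) -> None:
        """Add one 5-second sample; older samples fall out of the window."""
        self.samples.append(ready_processes)

    def average_load(self) -> float:
        return sum(self.samples) / len(self.samples) if self.samples else 0.0

monitor = LoadMonitor()
for count in [3, 4, 5]:        # three 5-second samples
    monitor.record(count)
print(monitor.average_load())  # 4.0
```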


4 Test Results for Apache Tomcat


Apache Tomcat Performance Test Statistics
Test Results Showing Transactions Per Second
Test Results Showing Response Time
Server Side Monitors
This chapter provides the performance test results for RSA Adaptive Authentication
(On-Premise) 7.1 server software running on Microsoft Windows Server 2008 R2
Enterprise with an Apache Tomcat 7.0 web server and a Microsoft SQL Server 2008
R2 database.

Apache Tomcat Performance Test Statistics


Performance tests on Apache Tomcat demonstrate that Adaptive Authentication
(On-Premise) 7.1 meets or exceeds established performance requirements in this
environment.

Statistics Summary

Parameter Value

Maximum Running Vusers 30

Total Throughput 4,582,918,650 bytes

Average Throughput 467,931 bytes/second

Total Hits 1,939,265

Maximum Transactions per Second 328

Average response time for peak TPS 130 ms

Maximum Application Server CPU 85%

Maximum Database Server CPU 45%

Transaction Summary
Performance tests yielded the following total passed, failed, and stopped transactions:
Total Passed: 1,939,240
Total Failed: 25
Total Stopped: 0


Average Response Time


The following table shows response time statistics for the various transaction types
used for testing. 90th percentile is used to mitigate the effects of statistical outliers.

Transaction Name   Min. (sec)   Ave. (sec)   Max. (sec)   Std. Dev.   90th Percentile   Pass   Fail   Stop

Analyze 0.016 0.07 2.859 0.059 0.113 1,397,289 0 0

Authenticate 0.016 0.074 2.031 0.061 0.113 131,334 0 0

Challenge 0 0.03 1.984 0.034 0.043 131,334 0 0

CreateUser 0.016 0.051 2 0.046 0.063 32,719 0 0

FailedSignIn 0.016 0.076 2.063 0.062 0.113 131,334 25 0

Payment 0.016 0.072 2.031 0.061 0.103 82,511 0 0

UpdateUser 0 0.03 1.969 0.032 0.043 32,719 0 0

Test Results Showing Transactions Per Second


Total Transactions per Second
Transactions per Second
Transaction Summary


Total Transactions per Second


The Total Transactions per Second test data displays the total number of completed
transactions (both successful and unsuccessful) performed during each second of a
load test. This type of graph helps determine the actual transaction load on the system
at any given point in time.


Transactions per Second


The Transactions per Second test displays the number of completed transactions (both
successful and unsuccessful) performed during each second of a load test. This type of
graph helps determine the actual transaction load on the system at any given point in
time.


Transaction Summary
The Transaction Summary test data displays the number of transactions that passed,
failed, stopped, or ended with errors.


Test Results Showing Response Time


User Influence
Overlay of Running Vusers and Average Transaction Response Time
Transaction Response Time (Percentile)

User Influence
The User Influence test data displays the average transaction response time relative to
the number of Vusers running at any given point during the load test. This type of
graph helps view the general impact of Vuser load on performance time and is most
useful when analyzing a load test which is run with a gradual load.


Overlay of Running Vusers and Average Transaction Response Time


The Overlay of Running Vusers and Average Transaction Response Time test data
shows how increasing the number of Vusers increases the average response times for
various transaction types.


Transaction Response Time (Percentile)


The Transaction Response Time test data displays the percentage of transactions that
are performed within a given time range. This type of graph helps determine the
percentage of transactions that meet the performance criteria defined for your system.


Server Side Monitors


Throughput
Server Performance
Apache Tomcat Resource Usage

Throughput
The Throughput test data displays the amount of throughput (in bytes) on the web
server during the load test. Throughput represents the amount of data that the Vusers
receive from the server during any given second. This type of graph helps you to
evaluate the amount of load Vusers generate, in terms of server throughput.


Server Performance
The Server Performance test data displays a summary of the system resource usage for
each Windows based host. The resource consumption decrease occurring after 2 hours
and 25 minutes is due to the load test termination.


Apache Tomcat Resource Usage


The Apache Tomcat Resource Usage test data displays the resources consumed by the
Apache Tomcat web server during the test.


5 Test Results for WebLogic


WebLogic Performance Test Statistics
Test Results Showing Transactions Per Second
Test Results Showing Response Time
Server Side Monitors
This chapter provides performance test results for RSA Adaptive Authentication
(On-Premise) 7.1 on a Solaris 10 system running a WebLogic 10 web server and an
Oracle 10 database with partitions.

WebLogic Performance Test Statistics


Performance tests on Solaris demonstrate that Adaptive Authentication (On-Premise)
7.1 meets or exceeds established performance requirements in this environment.

Statistics Summary

Parameter Value

Maximum Running Vusers 38

Total Throughput 730,393,883 bytes

Average Throughput 221,331 bytes/second

Total Hits 312,450

Maximum Transactions per Second (TPS) 151

Maximum Application Server CPU Usage 44%

Maximum Database Server CPU Usage 15.8%

Transaction Summary
Performance tests yielded the following total passed, failed, and stopped transactions:
Total Passed: 312,458
Total Failed: 3
Total Stopped: 0


Average Response Time


The following table shows response time statistics for the various transaction types
used for testing. 90th percentile is used to mitigate the effects of statistical outliers.

Transaction Name   Min. (sec)   Ave. (sec)   Max. (sec)   Std. Dev.   90th Percentile   Pass   Fail   Stop

Analyze 0.125 0.195 1.85 0.041 0.233 224,996 0 0

Authenticate 0.158 0.266 1.867 0.061 0.333 21,209 0 0

Challenge 0.083 0.128 1.072 0.033 0.163 21,214 0 0

CreateUser 0.131 0.184 0.479 0.035 0.223 5,321 0 0

FailedSignIn 0.158 0.229 1.851 0.046 0.273 21,215 3 0

Payment 0.139 0.2 0.703 0.041 0.243 13,182 0 0

UpdateUser 0.062 0.098 0.391 0.027 0.133 5,321 0 0


Test Results Showing Transactions Per Second


Total Transactions per Second
Transactions per Second
Transaction Summary

Total Transactions per Second


The Total Transactions per Second test data displays the total number of completed
transactions (both successful and unsuccessful) performed during each second of a
load test. This type of graph helps determine the actual transaction load on the system
at any given point in time.


Transactions per Second


The Transactions per Second test data displays the number of completed transactions
(both successful and unsuccessful) performed during each second of a load test. This
type of graph helps determine the actual transaction load on the system at any given
point in time.


Transaction Summary
The Transaction Summary test data displays the number of transactions that passed,
failed, stopped, or ended with errors.


Test Results Showing Response Time


User Influence
Overlay of Running Vusers and Average Transaction Response Time
Transaction Response Time (Percentile)

User Influence
The User Influence test data displays the average transaction response time relative to
the number of Vusers running at any given point during the load test. This type of
graph helps view the general impact of Vuser load on performance time and is most
useful when analyzing a load test which is run with a gradual load.


Overlay of Running Vusers and Average Transaction Response Time


The Overlay of Running Vusers and Average Transaction Response Time test data
shows how increasing the number of Vusers increases the average response times for
various transaction types.


Transaction Response Time (Percentile)


The Transaction Response Time test data displays the percentage of transactions that
are performed within a given time range. This type of graph helps determine the
percentage of transactions that meet the performance criteria defined for your system.


Server Side Monitors


Throughput
Server Performance

Throughput
The Throughput test data displays the amount of throughput (in bytes) on the web
server during the load test. Throughput represents the amount of data that the Vusers
receive from the server during any given second. This type of graph helps you to
evaluate the amount of load Vusers generate, in terms of server throughput.


Server Performance
The Server Performance test data displays a summary of the System UNIX Resources
usage for each UNIX based host.

The following table describes the measurements used in the Server Performance test
data summary.

Measurement Description

CPU Utilization Percentage of time that the CPU is utilized.

Disk Traffic Rate of disk transfers, in transfers per second.


6 Test Results for JBoss


JBoss Performance Test Statistics
Test Results Showing Transactions Per Second
Test Results Showing Response Time
Server Side Monitors
This chapter provides performance test results for RSA Adaptive Authentication
(On-Premise) 7.1 on a Linux RedHat system running a JBoss 5.1 web server and an
Oracle 11 database with partitions.

JBoss Performance Test Statistics


Performance tests on Linux RedHat demonstrate that Adaptive Authentication
(On-Premise) 7.1 meets or exceeds established performance requirements in this
environment.

Statistics Summary

Parameter Value

Maximum Running Vusers 14

Total Throughput 440,781,135 bytes

Average Throughput 367,318 bytes/second

Total Hits 186,524

Maximum Transactions per Second (TPS) 221

Maximum Application Server CPU Usage 68%

Maximum Database Server CPU Usage 21%

Transaction Summary
Performance tests yielded the following total passed, failed, and stopped transactions:
Total Passed: 186,512
Total Failed: 12
Total Stopped: 0


Average Response Time


The following table shows response time statistics for the various transaction types
used for testing. 90th percentile is used to mitigate the effects of statistical outliers.

Transaction Name   Min. (sec)   Ave. (sec)   Max. (sec)   Std. Dev.   90th Percentile   Pass   Fail   Stop

Analyze 0.016 0.044 3.891 0.066 0.053 134,060 0 0

Authenticate 0 0.043 2.797 0.048 0.053 12,769 0 0

Challenge 0 0.023 3.844 0.054 0.033 12,763 6 0

CreateUser 0.016 0.039 2.766 0.059 0.053 3,123 0 0

FailedSignIn 0.031 0.051 3.891 0.082 0.063 12,763 6 0

Payment 0.027 0.044 1.735 0.044 0.053 7,911 0 0

UpdateUser 0 0.023 0.078 0.009 0.031 3,123 0 0


Test Results Showing Transactions Per Second


Total Transactions per Second
Transactions per Second
Transaction Summary

Total Transactions per Second


The Total Transactions per Second test data displays the total number of completed
transactions (both successful and unsuccessful) performed during each second of a
load test. This type of graph helps determine the actual transaction load on the system
at any given point in time.


Transactions per Second


The Transactions per Second test data displays the number of completed transactions
(both successful and unsuccessful) performed during each second of a load test. This
type of graph helps determine the actual transaction load on the system at any given
point in time.


Transaction Summary
The Transaction Summary test data displays the number of transactions that passed,
failed, stopped, or ended with errors.


Test Results Showing Response Time


User Influence
Overlay of Running Vusers and Average Transaction Response Time
Transaction Response Time (Percentile)

User Influence
The User Influence test data displays the average transaction response time relative to
the number of Vusers running at any given point during the load test. This type of
graph helps view the general impact of Vuser load on performance time and is most
useful when analyzing a load test which is run with a gradual load.


Overlay of Running Vusers and Average Transaction Response Time


The Overlay of Running Vusers and Average Transaction Response Time test data
shows how increasing the number of Vusers increases the average response times for
various transaction types.


Transaction Response Time (Percentile)


The Transaction Response Time test data displays the percentage of transactions that
are performed within a given time range. This type of graph helps determine the
percentage of transactions that meet the performance criteria defined for your system.


Server Side Monitors


Throughput
Server Performance

Throughput
The Throughput test data displays the amount of throughput (in bytes) on the web server during the
load test. Throughput represents the amount of data that the Vusers receive from the server during any
given second. This type of graph helps you to evaluate the amount of load Vusers generate, in terms of
server throughput.


Server Performance
The Server Performance test data displays a summary of the System UNIX Resources
usage for each UNIX based host.

The following table describes the measurements used in the Server Performance test
data summary.

Measurement Description

CPU Utilization Percentage of time that the CPU is utilized.

Disk Traffic Rate of disk transfers, in transfers per second.

