RSA Adaptive Authentication (On-Premise) 7.1
Performance Guide
Contact Information
Go to the RSA corporate web site for regional Customer Support telephone and fax numbers: www.rsa.com
Trademarks
RSA, the RSA Logo and EMC are either registered trademarks or trademarks of EMC Corporation in the United States and/or
other countries. All other trademarks used herein are the property of their respective owners. For a list of RSA trademarks, go
to www.rsa.com/legal/trademarks_list.pdf.
License agreement
This software and the associated documentation are proprietary and confidential to EMC, are furnished under license, and
may be used and copied only in accordance with the terms of such license and with the inclusion of the copyright notice
below. This software and the documentation, and any copies thereof, may not be provided or otherwise made available to any
other person.
No title to or ownership of the software or documentation or any intellectual property rights thereto is hereby transferred. Any
unauthorized use or reproduction of this software and the documentation may be subject to civil and/or criminal liability.
This software is subject to change without notice and should not be construed as a commitment by EMC.
Note on encryption technologies
This product may contain encryption technology. Many countries prohibit or restrict the use, import, or export of encryption
technologies, and current use, import, and export regulations should be followed when using, importing or exporting this
product.
Distribution
Use, copying, and distribution of any EMC software described in this publication requires an applicable software license.
EMC believes the information in this publication is accurate as of its publication date. The information is subject to change
without notice.
THE INFORMATION IN THIS PUBLICATION IS PROVIDED "AS IS." EMC CORPORATION MAKES NO
REPRESENTATIONS OR WARRANTIES OF ANY KIND WITH RESPECT TO THE INFORMATION IN THIS
PUBLICATION, AND SPECIFICALLY DISCLAIMS IMPLIED WARRANTIES OF MERCHANTABILITY OR
FITNESS FOR A PARTICULAR PURPOSE.
Contents
Preface................................................................................................................................... 7
About This Guide................................................................................................................ 7
RSA Adaptive Authentication (On-Premise) Documentation ............................................ 7
Support and Service ............................................................................................................ 8
Before You Call Customer Support............................................................................. 8
RSA Adaptive Authentication (On-Premise) 7.1 Performance Guide
User Influence............................................................................................................ 28
Overlay of Running Vusers and Average Transaction Response Time .................... 29
Transaction Response Time (Percentile) ................................................................... 30
Server Side Monitors ........................................................................................................ 31
Throughput................................................................................................................. 31
Server Performance.................................................................................................... 32
Preface
Security Best Practices Guide. Provides recommendations for configuring your
network and RSA Adaptive Authentication (On-Premise) securely.
Web Services API Reference Guide. Describes the RSA Adaptive Authentication
(On-Premise) web services API methods and parameters. This guide also
describes how to build your own web services clients and applications that use the
web services API to integrate with and utilize the capabilities of RSA Adaptive Authentication (On-Premise).
What's New. Highlights new features and enhancements in
RSA Adaptive Authentication (On-Premise) 7.1.
Workflows and Processes Guide. Describes the workflows and processes that
allow end users to interact with your system and that allow your system to interact
with RSA Adaptive Authentication (On-Premise).
Note: The test results described in this guide are provided for reference only and may
differ from test results obtained in other test environments.
The team runs scripts up to three times on each platform to ensure reliable test
results, then collects and correlates the results to draw conclusions.
During product development, performance test results helped detect potential
performance issues. The development team addressed these issues, and the software
was retested to determine whether the issues were resolved. Finally, performance
tests were run on the final product version. The final test results for representative
environments are provided in this guide.
For details about the Adaptive Authentication (On-Premise) performance test
environment, see Performance Test Environment on page 11.
[Figure: Performance test environment, showing the Performance Test Manager and Load Controller with a Server Monitor, the Load Generators applying load, and the Adaptive Authentication server and DB server under monitoring.]
Solaris / Oracle with Partitions / WebSphere
The following table describes the application server software configuration for the
Solaris 10 with WebSphere test machines.
OS Name: Solaris 10
Number of Threads: 8
The following table describes the database server software configuration for the
Solaris 10 with WebSphere test machines.
OS Name: Solaris 10
Number of Threads: 8
JVM Configuration
A single WebSphere 8.0 instance is run on the application machine using the
following JVM (Java Virtual Machine) configuration:
JVM Version
java version "1.6.0_31"
Java 2 Runtime Environment, Standard Edition (IBM J6_26 build 1.6.0_31-b04
29_Feb_2012_23_18 solaris sparcv9 (SR1 FP1))
Java HotSpot Server VM (build 20.6-b01, mixed mode)
IBM Java ORB build orb626sr1fp1-20120206.00
XML build XL TXE Java 1.0.27
XML build IBM JAXP 1.4.6
XML build XML4J 4.5.17
JVM Options
initialHeapSize="3024"
maximumHeapSize="3024"
ache.xerces.parsers.XML11Configuration
-Djavax.management.builder.initial=
-Dcom.sun.management.jmxremote.port=9004
-Dcom.sun.management.jmxremote.authenticate=false
-Dcom.sun.management.jmxremote.ssl=false -server
-XX:-UseAdaptiveSizePolicy -XX:+UseParallelGC
-XX:NewSize=1268m -XX:MaxNewSize=1268m -XX:+DisableExplicitGC
-XX:SurvivorRatio=8 -XX:MaxPermSize=512M
-XX:ParallelGCThreads=5 -XX:+PrintGCDetails
-XX:+PrintGCTimeStamps
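The GC-related flags above lend themselves to a quick programmatic sanity check. The sketch below (an illustrative helper, not part of the product or any RSA tooling) parses `-XX` style options into a dictionary so that, for example, the young generation size can be compared against the total heap:

```python
# Illustrative helper (hypothetical, not RSA tooling): parse -XX JVM
# options into a dictionary. Boolean flags (-XX:+Flag / -XX:-Flag) map
# to True/False; key=value options keep their string value.
def parse_xx_options(args):
    opts = {}
    for arg in args:
        if arg.startswith("-XX:+"):
            opts[arg[5:]] = True
        elif arg.startswith("-XX:-"):
            opts[arg[5:]] = False
        elif arg.startswith("-XX:") and "=" in arg:
            key, _, value = arg[4:].partition("=")
            opts[key] = value
    return opts

# Options taken from the WebSphere configuration listed above.
opts = parse_xx_options([
    "-XX:-UseAdaptiveSizePolicy", "-XX:+UseParallelGC",
    "-XX:NewSize=1268m", "-XX:MaxNewSize=1268m",
    "-XX:SurvivorRatio=8", "-XX:MaxPermSize=512M",
    "-XX:ParallelGCThreads=5",
])
print(opts["UseParallelGC"], opts["NewSize"])  # -> True 1268m
```

A check such as `int(opts["NewSize"].rstrip("m")) <= 3024` then confirms the young generation fits inside the configured heap.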
Maximum Connections: 40
Minimum Connections: 40
Aged Timeout: 0
Note: When conducting the performance test, adjust the exact values to the expected
volumes. As a general rule, the data source connection pool should contain about
twice as many connections as the thread pool has threads.
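The sizing rule of thumb above can be expressed as a small helper. This is an illustrative sketch only; the function name is invented, and the factor of two comes from the guideline, not from any product API:

```python
# Illustrative sketch: derive a starting data source connection pool size
# from the application server thread pool count, using the rule of thumb
# that the pool should hold roughly twice as many connections as threads.
def recommended_pool_size(thread_pool_size: int, factor: float = 2.0) -> int:
    """Return a starting value for min/max data source connections."""
    if thread_pool_size <= 0:
        raise ValueError("thread pool size must be positive")
    return int(thread_pool_size * factor)

# Example: a 20-thread pool suggests about 40 connections, matching the
# Maximum/Minimum Connections values of 40 listed above.
print(recommended_pool_size(20))  # -> 40
```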
Minimum Size: 10
Adaptive Authentication Server: Microsoft Windows Server, Apache Tomcat
DB Server: Microsoft Windows Server, Microsoft SQL Server
The following table describes the application server software configuration for the
Microsoft Windows with Apache Tomcat test machines.
Web Server Software: Apache Tomcat 7.0, installed on the Adaptive Authentication
server system
The following table describes the database server software configuration for the
Microsoft Windows with Apache Tomcat test machines.
JVM Options
-server -Xms2576m -Xmx2576m
-XX:PermSize=256m -XX:MaxPermSize=512m -XX:NewSize=1024m
-XX:MaxNewSize=1024m -XX:+AggressiveHeap -XX:+PrintGCDetails
-XX:+PrintGCTimeStamps -verbose:gc
-Xloggc:%CATALINA_HOME%\logs\gc.log
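The `-Xloggc` option above writes a GC log that is useful when analyzing pauses during a load test. The following sketch extracts timestamps and pause durations from such a log. The line format shown is an assumption (it varies between JVM versions and GC flags), and the sample line is invented for illustration:

```python
import re

# Illustrative parser for gc.log lines produced with -XX:+PrintGCDetails
# and -XX:+PrintGCTimeStamps. The format below is an assumed example;
# real output differs between JVM versions.
GC_LINE = re.compile(r"^(?P<ts>\d+\.\d+): \[.*?, (?P<secs>\d+\.\d+) secs\]")

def gc_pauses(lines):
    """Yield (seconds_since_jvm_start, pause_seconds) for matching lines."""
    for line in lines:
        m = GC_LINE.match(line)
        if m:
            yield float(m.group("ts")), float(m.group("secs"))

# Invented sample line, for illustration only.
sample = ["12.345: [GC [PSYoungGen: 253440K->10240K(2621440K)] "
          "253440K->10240K(2621440K), 0.0123456 secs]"]
print(list(gc_pauses(sample)))
```

Summing the extracted pauses over a test run gives a rough measure of how much wall-clock time the load test lost to garbage collection.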
Solaris / Oracle with Partitions / WebLogic
The following table describes the application server software configuration for the
Solaris 10 with WebLogic test machines.
OS Name: Solaris 10
Number of Threads: 8
The following table describes the database server software configuration for the
Solaris 10 with WebLogic test machines.
OS Name: Solaris 10
Number of Threads: 8
JVM Configuration
A single WebLogic 10.3 instance is run on the application machine using the
following JVM configuration:
JVM Version
java version "1.6.0_31"
Java 2 Runtime Environment, Standard Edition (IBM J6_26 build 1.6.0_31-b04
29_Feb_2012_23_18 solaris sparcv9 (SR1 FP1))
Java HotSpot Server VM (build 20.6-b01, mixed mode)
IBM Java ORB build orb626sr1fp1-20120206.00
XML build XL TXE Java 1.0.27
Note: When conducting the performance test, adjust the exact values to the expected
volumes. As a general rule, the data source connection pool should contain about
twice as many connections as the thread pool has threads.
Minimum Size: 10
Linux Red Hat / Oracle / JBoss
The following table describes the application server software configuration for the
Linux with JBoss test machines.
OS Version: 6
Number of Threads: 64
The following table describes the database server software configuration for the Linux
with JBoss test machines.
OS Version: 6
Number of Threads: 2
JVM Configuration
A single instance of JBoss 5.1 was run on the application machine using the following
JVM configuration:
JVM Version
java version "1.6.0_31"
Java 2 Runtime Environment, Standard Edition (IBM J6_26 build 1.6.0_31-b04
29_Feb_2012_23_18 solaris sparcv9 (SR1 FP1))
Java HotSpot Server VM (build 20.6-b01, mixed mode)
IBM Java ORB build orb626sr1fp1-20120206.00
XML build XL TXE Java 1.0.27
XML build IBM JAXP 1.4.6
XML build XML4J 4.5.17
JVM Options
-Xmx2512M
-Xms2512M
-server
-XX:+UseConcMarkSweepGC
-XX:NewRatio=2 -XX:SurvivorRatio=20 -XX:MaxPermSize=256M
Statement Cache
Note: When conducting the performance test, adjust the exact values to the expected
volumes. As a general rule, the data source connection pool should contain about
twice as many connections as the thread pool has threads.
Minimum Size: 10
Statistics Summary
Parameter Value
Transaction Summary
Performance tests yielded the following total passed, failed, and stopped transactions:
Total Passed: 1,013,170
Total Failed: 1
Total Stopped: 0
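For reference, the pass rate implied by these totals can be computed directly (a simple illustrative calculation, not part of the test tooling):

```python
# Illustrative arithmetic: pass rate implied by the totals above
# (1,013,170 passed, 1 failed, 0 stopped).
def pass_rate(passed: int, failed: int, stopped: int = 0) -> float:
    """Percentage of attempted transactions that passed."""
    total = passed + failed + stopped
    return 100.0 * passed / total

print(round(pass_rate(1_013_170, 1), 4))  # -> 99.9999
```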
Transaction Summary
The Transaction Summary test data displays the number of transactions that passed,
failed, stopped, or ended with errors.
User Influence
The User Influence test data displays the average transaction response time relative to
the number of Vusers running at any given point during the load test. This type of
graph helps you view the general impact of Vuser load on performance and is most
useful when analyzing a load test that is run with a gradual load.
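The overlay described above can be derived from raw load test samples as follows. This is a hypothetical sketch with invented sample data, not the guide's measurements:

```python
from collections import defaultdict

# Illustrative sketch: average transaction response time grouped by the
# number of running Vusers, as plotted in a User Influence overlay.
def avg_response_by_vusers(samples):
    """samples: iterable of (running_vusers, response_time_seconds)."""
    sums = defaultdict(lambda: [0.0, 0])
    for vusers, rt in samples:
        entry = sums[vusers]
        entry[0] += rt
        entry[1] += 1
    return {v: total / count for v, (total, count) in sorted(sums.items())}

# Invented sample data for illustration only.
samples = [(100, 0.20), (100, 0.24), (200, 0.35), (200, 0.45)]
print(avg_response_by_vusers(samples))
```

Plotting the resulting averages against the Vuser counts reproduces the overlay described above.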
Throughput
The Throughput test data displays the amount of throughput (in bytes) on the web
server during the load test. Throughput represents the amount of data that the Vusers
receive from the server during any given second. This type of graph helps you
evaluate the amount of load that the Vusers generate, in terms of server throughput.
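Computed from cumulative byte counts, per-second throughput looks like this (an illustrative sketch with hypothetical sample data):

```python
# Illustrative sketch: throughput in bytes per second, computed from
# cumulative bytes received by all Vusers at successive timestamps.
def throughput_bps(samples):
    """samples: list of (elapsed_seconds, cumulative_bytes), ascending."""
    rates = []
    for (t0, b0), (t1, b1) in zip(samples, samples[1:]):
        rates.append((b1 - b0) / (t1 - t0))
    return rates

# Invented sample data for illustration only.
samples = [(0, 0), (1, 450_000), (2, 930_000)]
print(throughput_bps(samples))  # -> [450000.0, 480000.0]
```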
Server Performance
The Server Performance test data displays a summary of UNIX system resource
usage for each UNIX-based host.
The following table describes the measurements used in the Server Performance test
data summary.
Measurement Description
Context switch rate: Number of switches between processes or threads, per second.
System mode CPU utilization: Percentage of time the CPU is utilized in system mode.
User mode CPU utilization: Percentage of time the CPU is utilized in user mode.
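The CPU utilization measurements above can be reproduced from two snapshots of raw tick counters, in the style of /proc/stat on UNIX-like systems. The three-field layout below is an assumption for illustration:

```python
# Illustrative calculation (assumed field layout, modeled on /proc/stat
# style counters): user- and system-mode CPU utilization between two
# snapshots of (user_ticks, system_ticks, idle_ticks).
def cpu_utilization(before, after):
    du = after[0] - before[0]  # user-mode ticks elapsed
    ds = after[1] - before[1]  # system-mode ticks elapsed
    di = after[2] - before[2]  # idle ticks elapsed
    total = du + ds + di
    return {"user_pct": 100.0 * du / total, "system_pct": 100.0 * ds / total}

# Invented snapshot values for illustration only.
print(cpu_utilization((1000, 500, 8500), (1600, 700, 9700)))
```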
Statistics Summary
Parameter Value
Transaction Summary
Performance tests yielded the following total passed, failed, and stopped transactions:
Total Passed: 1,939,240
Total Failed: 25
Total Stopped: 0
Transaction Summary
The Transaction Summary test data displays the number of transactions that passed,
failed, stopped, or ended with errors.
User Influence
The User Influence test data displays the average transaction response time relative to
the number of Vusers running at any given point during the load test. This type of
graph helps you view the general impact of Vuser load on performance and is most
useful when analyzing a load test that is run with a gradual load.
Throughput
The Throughput test data displays the amount of throughput (in bytes) on the web
server during the load test. Throughput represents the amount of data that the Vusers
receive from the server during any given second. This type of graph helps you
evaluate the amount of load that the Vusers generate, in terms of server throughput.
Server Performance
The Server Performance test data displays a summary of system resource usage for
each Windows-based host. The decrease in resource consumption that occurs after
2 hours and 25 minutes is due to the load test termination.
Statistics Summary
Parameter Value
Transaction Summary
Performance tests yielded the following total passed, failed, and stopped transactions:
Total Passed: 312,458
Total Failed: 3
Total Stopped: 0
Transaction Summary
The Transaction Summary test data displays the number of transactions that passed,
failed, stopped, or ended with errors.
User Influence
The User Influence test data displays the average transaction response time relative to
the number of Vusers running at any given point during the load test. This type of
graph helps you view the general impact of Vuser load on performance and is most
useful when analyzing a load test that is run with a gradual load.
Throughput
The Throughput test data displays the amount of throughput (in bytes) on the web
server during the load test. Throughput represents the amount of data that the Vusers
receive from the server during any given second. This type of graph helps you
evaluate the amount of load that the Vusers generate, in terms of server throughput.
Server Performance
The Server Performance test data displays a summary of UNIX system resource
usage for each UNIX-based host.
The following table describes the measurements used in the Server Performance test
data summary.
Measurement Description
Statistics Summary
Parameter Value
Transaction Summary
Performance tests yielded the following total passed, failed, and stopped transactions:
Total Passed: 186,512
Total Failed: 12
Total Stopped: 0
Transaction Summary
The Transaction Summary test data displays the number of transactions that passed,
failed, stopped, or ended with errors.
User Influence
The User Influence test data displays the average transaction response time relative to
the number of Vusers running at any given point during the load test. This type of
graph helps you view the general impact of Vuser load on performance and is most
useful when analyzing a load test that is run with a gradual load.
Throughput
The Throughput test data displays the amount of throughput (in bytes) on the web
server during the load test. Throughput represents the amount of data that the Vusers
receive from the server during any given second. This type of graph helps you
evaluate the amount of load that the Vusers generate, in terms of server throughput.
Server Performance
The Server Performance test data displays a summary of UNIX system resource
usage for each UNIX-based host.
The following table describes the measurements used in the Server Performance test
data summary.
Measurement Description