
TEMENOS T24 R11 On Oracle Exadata

High Water Benchmark Report

Information in this document is subject to change without notice.


No part of this document may be reproduced or transmitted in any form or by any means,
for any purpose, without the express written permission of TEMENOS HEADQUARTERS SA.
© 2010 Temenos Headquarters SA. All rights reserved.

T24 R11 Benchmark Report

Table of Contents
Document History
Introduction
Management Summary
Scope
Test Team
Test Environment
Software Used
Hardware Configuration
App Server Configuration
Exadata Machine Configuration
T24 Data Requirements
Acceptance Criteria
Close of Business
Environment Setup
Database
Application Server
Test Results
COB Test Results
Observation
Conclusion
Appendix
Tools used for Performance Monitoring
Definitions, Acronyms and Abbreviations

Performance and Sizing Team | Technology & Research


Document History

Author      Version   Date
P&S Team    1.0       14 Jan 2011
P&S Team    1.1       20 Jan 2011
S.Henman    1.2       26 Jan 2011

Comments:
V1.0: Initial draft
V1.1: Correction to cores present
V1.2: Updated for R11, not PB




Introduction
This document describes the business and operating model of the TEMENOS T24 R11 High Water Benchmark (HWBM) performance and sizing exercise, which was executed at the Oracle Solution Centre, Linlithgow. The entire project was carried out offsite from Chennai; remote connectivity to the Linlithgow centre was established via a gateway server, through which access to the other servers was enabled.

Management Summary
T24 R11 high water mark tests were performed from 17th November 2010 to 24th December 2010 at the Oracle Solution Centre in Linlithgow. Tests were performed using multiple Sun Fire machines as application servers and a Full-Rack Oracle Exadata Machine X2-2 as the database server.
The primary objective of the high water mark test is to test the scalability and resilience of the T24 R11 application on Oracle Exadata hardware under high-volume retail banking transactions. The test database comprises 15 million customers and 25 million accounts. Test cases and their mix percentages were compiled from existing and prospective clients to simulate real-world scenarios for a large retail bank with the following characteristics:

15,000,000 customers
25,000,000 accounts (1,000,000 of which are foreign currency)
2,000 branches (10 concurrent tellers per branch during peak hour)
32,400,000 transactions per day
1 month of full transaction history

The following results were achieved from the R11 high water mark testing.

Test Name                              TPS Achieved
Interest Accrual and Capitalisation    6165

As this was on 25M accounts, a total window of less than 75 minutes is required for full accrual and
capitalisation of all accounts.
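The 75-minute window follows directly from the measured throughput; a quick arithmetic check, using only figures quoted in this report:

```python
# Cross-check: time to capitalise 25 million accounts at the measured 6165 TPS.
ACCOUNTS = 25_000_000
TPS = 6165  # measured COB CAP throughput

seconds = ACCOUNTS / TPS
minutes = seconds / 60
print(f"{minutes:.1f} minutes")  # ~67.6 minutes, inside the 75-minute window
```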
This is the highest throughput achieved so far by T24 on any platform, and suggests that T24 is ideally suited to run on the Oracle Exadata platform.


Scope
The scope of the exercise was driven by the Temenos retail committee team.

Test Team
The offsite test team consisted of Performance and Sizing team members.

Test Environment
To run the benchmark, large volumes of data in T24 were required; therefore data preparation was needed prior to the actual test run. Data was prepared using application servers with TAFC and T24 installed, and a staging server for the Oracle database was introduced for this purpose. Scripts for 15 million customers, 25 million accounts, and the balances for these accounts were prepared and executed using the T24 TAFCOnlineLoader tool. Once the entire data set was ready on the staging server, close of business (COB) runs were executed to progress to the month-end state for capitalisation. Backups were taken using RMAN when required. Finally, an export dump was taken so that the data could be propagated to the Exadata (production) server for the actual testing.

Software Used
The following is a list of software components used in the benchmark.

Name          Version                 Used For
TEMENOS T24   R11 (PB 201010)         T24 Application
Oracle DB     11.2.0.2                Database
TAFC          PB201010 Change 91153   Application Framework (runtime)


Hardware Configuration
For this test, the following hardware was utilised:
Server                      Purpose
C1718-3-110 - C1718-3-118   T24 Application R11 (PB 201010)
ed2jcomp01 - ed2jcomp08     Oracle Database 11.2.0.2

App Server Configuration


Server     Machine Type        CPU                              Cores per CPU   Memory   Networks (in use)              OS Version
7 x Apps   Sun Fire X4470      4 x Intel Xeon X7560 @ 2.27GHz   8               256GB    2 x igb 1000BaseT interfaces   Solaris 10 update 9
2 x Apps   Sun Fire X4270 M2   2 x Intel Xeon X5670 @ 2.93GHz   6               32GB     2 x igb 1000BaseT interfaces   Solaris 10 update 9

A total of 9 app servers was used for this benchmark.

Exadata Machine Configuration


Database Server Configuration

System Type           CPU                              Cores per CPU   Memory   Network                                                OS
8 x Database Server   2 x Intel Xeon X5670 @ 2.93GHz   6               96GB     2 x QDR InfiniBand (private), 1 x 1000BaseT (public)   Enterprise Linux 5.5

A total of 8 database servers was used, each with 12 cores (24 hardware threads with hyper-threading enabled).

Storage Server Configuration

System Type           CPU                              Cores per CPU   Memory   Network                                                Disk                              Flash Cache
14 x Storage Server   2 x Intel Xeon X5640 @ 2.26GHz   6               24GB     2 x QDR InfiniBand (private), 1 x 1000BaseT (public)   12 x 600GB 15,000 RPM SAS disks   384GB

T24 Data Requirements

1. 15,000,000 customers
2. 25,000,000 accounts
3. Local currency: USD
4. Number of branches: 2000


Acceptance Criteria
Close of Business
A close of business was required to be run for batch end-of-month processing.
A COB capitalisation with a target throughput of 6,000 TPS (transactions per second) was the acceptance criterion.

Environment Setup
Database
The T24 R11 database was built on Oracle Enterprise Edition 11.2.0.2 with 15 million customers and 25 million accounts using the staging server. The data was then imported to the Exadata machine for the capitalisation test.

Application Server
The application server stack consists of TAFC, T24 R11 (Build 201010), and the XmlOracle drivers. Nine application servers were used for the COB test.


Test Results
COB Test Results
A close of business capitalisation (COB CAP) was run with 25 million accounts.
The COB CAP was run with the following configuration:
a. 9 app servers were used for this test, totalling 248 available cores (7 x 32 + 2 x 12).
b. One app server (32 cores) was dedicated to the locking daemon.
c. 8 DB servers were running Oracle, with a total of 96 cores (8 x 12).
d. 600 agents were used for this run of COB.
e. Hyper-threading was enabled on all servers.
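The core totals and the per-agent throughput quoted in this section can be verified with simple arithmetic, using only figures from this report:

```python
# Arithmetic check of the COB CAP configuration figures.
app_cores = 7 * 32 + 2 * 12   # seven 32-core and two 12-core app servers
db_cores = 8 * 12             # eight 12-core database servers
agents = 600
tps = 6165                    # measured throughput

tps_per_agent = tps / agents
print(app_cores, db_cores)     # 248 96
print(f"{tps_per_agent:.3f}")  # 10.275 TPS per agent (10.28 as quoted)
```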

DB Cores   App Cores   Agents   TPS    TPS/Agent   CAP Time
96         248         600      6165   10.28       1:14:21

Exadata Utilisation

DB Server   CPU-ALL   Total Cores
DB Node-1   63%       12
DB Node-2   61%       12
DB Node-3   62%       12
DB Node-4   61%       12
DB Node-5   62%       12
DB Node-6   61%       12
DB Node-7   63%       12
DB Node-8   62%       12

AppServer Utilisation

Server           CPU-ALL   Total Cores
Locking Server   14%       32
App-1            40%       32
App-2            40%       32
App-3            39%       32
App-4            38%       32
App-5            38%       32
App-6            37%       32
App-7            88%       12
App-8            89%       12

Observation

a. Total CAP throughput was 6165 TPS.
b. CPU utilisation across the DB servers averaged around 62%.
c. The JDLS locking server utilised 14% of its 32 cores.
d. The total processing time for 25 million accounts was under 1 hour and 15 minutes.
e. Application server utilisation averaged around 40% on the 32-core servers and around 89% on the 12-core servers.
f. Network bandwidth between the app and DB servers was 501 MB/sec.
g. Network bandwidth between the app servers and the locking server was 58 MB/sec.
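As a cross-check of the DB utilisation observation, the per-node CPU figures from the Exadata Utilisation table average out to just under 62%:

```python
# Average CPU utilisation across the eight Exadata database nodes (CPU-ALL %).
node_cpu = [63, 61, 62, 61, 62, 61, 63, 62]  # DB Node-1 .. DB Node-8
avg = sum(node_cpu) / len(node_cpu)
print(f"{avg:.3f}%")  # 61.875%
```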


Conclusion
TEMENOS T24 on Sun Solaris i86pc, with Oracle 11g on Exadata, achieved 6165 transactions per second, processing 25 million account capitalisations in under 1 hour and 15 minutes.

This shows that T24, in conjunction with this architecture, can satisfactorily process volumes much greater than 25 million accounts.


Appendix
Tools used for Performance Monitoring
The following tools were used for monitoring during the test.

mw42
Used to determine the active processes and the current program and line number.
f: full view
-u <user name>: active processes for a particular user

TEC (Temenos Enterprise Console)
Used to determine current locks and the possibility of lock collisions in a multiple-app-server environment (using JDLS).

sar / sadc
Resource utilisation monitoring and reporting tools.

OEM (Oracle Enterprise Manager)
Graphical illustration of current operations on the database; used to check for locks, contention, max hits, DB usage, etc.

jdls -dvL
Used to check for any active locks during online processing and COB.



Definitions, Acronyms and Abbreviations


Name   Definition
TXN    Transaction
TPS    Transactions per second - the number of requests made to the server by virtual users in a second
COB    T24 Close of Business
HWBM   High Water Benchmark
