
Insight Warehouse

Release R15.000
June 2015

©2015 Temenos Headquarters SA - all rights reserved.

Warning: This document is protected by copyright law and international treaties. Unauthorised reproduction of this document, or any portion of it, may
result in severe and criminal penalties, and will be prosecuted to the maximum extent possible under law.
Table of Contents
Introduction 3
Purpose of this Document 3
Intended Audience 3
Insight Warehouse 4
Insight’s Data Modeling 4
How the ETL Process Works 5
Abstraction Views 6
Purging Data 7
Scheduled Jobs 8

Insight Warehouse - Release R15.000 - Page 2 of 8


Introduction

Purpose of this Document


The purpose of this document is to describe the Data Modeling concepts and the ETL process in Insight.

Intended Audience
This document is intended for users who maintain and use the Insight Data Warehouse. To use this document, the user should be familiar with
relational database and data warehouse concepts.



Insight Warehouse

Insight Warehouse is a SQL Server database built on a Kimball-based data warehousing structure of facts and dimensions, and it contains all of the
historic snapshots of source data. Insight Warehouse integrates information from across a widespread global enterprise, combining data from
various business units such as banking, wealth management and credit cards into one reporting and analysis source. The Data Warehouse deals
with multiple entities and fiscal years, and maintains complete historical information for reporting. It sources the operational system data and
transforms it into table structures optimized for efficient reporting.

The Insight Warehouse data is extracted and stored using Microsoft SQL Server database tools. The process is referred to as ‘Extract, Transform and
Load’, or ETL. The Extract phase copies the source system data onto the Data Warehouse server. The Transform phase manipulates the data
into common values and terms using specified business rules and Master Data grouping. For example, a customer with both a banking system
customer number and an investment system customer number will be combined into one data warehouse customer. The Load phase populates
the transformed data into the final data warehouse tables. These Data Warehouse tables are queried using SQL Server Analysis Services cubes
and reports/queries, and the results can be presented via dashboards.
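The customer-conflation example in the Transform phase can be sketched as follows. This is an illustrative sketch only: the record layouts, the mapping table and all names are hypothetical, not the actual Insight procedures.

```python
# Sketch of the Transform step described above: two source systems hold
# the same customer under different numbers, and a Master Data mapping
# collapses them into one warehouse customer. All names are hypothetical.

banking_customers = [{"source_id": "BNK-1001", "name": "A. Smith"}]
investment_customers = [{"source_id": "INV-9001", "name": "A. Smith"}]

# Master Data mapping: source-system customer number -> warehouse customer key
master_map = {"BNK-1001": 1, "INV-9001": 1}

warehouse_customers = {}
for row in banking_customers + investment_customers:
    key = master_map[row["source_id"]]
    entry = warehouse_customers.setdefault(key, {"customer_key": key, "source_ids": []})
    entry["source_ids"].append(row["source_id"])
    entry["name"] = row["name"]

print(warehouse_customers[1]["source_ids"])  # ['BNK-1001', 'INV-9001']
```

Both source customer numbers end up on a single warehouse customer record, which is the essence of the Master Data grouping applied during Transform.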

Insight’s Data Modeling


Insight Warehouse’s star schema is made up of two sets of tables: Fact Tables and Dimensions. Fact Tables contain measurement data, usually
numeric. For example, in Wealth Management the fact tables may include account balances. Dimensions are the ways a user might look at the
data in the Fact Table. For example, a business user may want to analyze information by date. Each dimension carries all of the attributes
that might be needed for sorting, grouping or filtering the information in the fact table.
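A minimal star-schema sketch of this structure, using SQLite for illustration. The table and column names are illustrative, not the actual Insight Warehouse schema.

```python
import sqlite3

# One fact table (account balances) keyed to a date dimension; a numeric
# measure (Balance) is analyzed by a dimension attribute (MonthName).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE DimDate (DateKey INTEGER PRIMARY KEY, CalendarDate TEXT, MonthName TEXT);
    CREATE TABLE FactAccount (AccountKey INTEGER, DateKey INTEGER, Balance REAL);
    INSERT INTO DimDate VALUES (20150601, '2015-06-01', 'June');
    INSERT INTO FactAccount VALUES (1, 20150601, 5000.0), (2, 20150601, 7500.0);
""")
row = con.execute("""
    SELECT d.MonthName, SUM(f.Balance)
    FROM FactAccount f JOIN DimDate d ON f.DateKey = d.DateKey
    GROUP BY d.MonthName
""").fetchone()
print(row)  # ('June', 12500.0)
```

The fact table holds the measures; the dimension supplies the attributes used for grouping and filtering.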

Generally, OLAP cubes are used to store aggregated data from the data warehouse, and they are a popular approach for analyzing the data stored
in the data warehouse. The OLAP databases contain two basic types of data: Facts (or measures), which are the numeric data, quantities and
averages that are used to make informed business decisions, and Dimensions, which are the categories that are used to organize these facts or
measures.

In Insight, the dimension tables can play different roles in a fact table depending on the context. For example, times appear in most types of
analysis, because business activities happen in a timeframe and objects exist in time. Time is almost always used when calculating and
evaluating measures in a fact table, and the time information may be required in different formats, such as ‘Time of Day’, ‘Time in minutes’ or ‘Time
with AM or PM’. To define different views of the same dimension, Role Playing Dimension tables are formed.
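One common way to form a role-playing dimension is to expose a single physical date table through one view per role. The sketch below is illustrative; the view and column names are assumptions, not the actual Insight objects.

```python
import sqlite3

# A single DimDate table plays two roles against the fact table (account
# open date and transaction date), each exposed through its own view.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE DimDate (DateKey INTEGER PRIMARY KEY, CalendarDate TEXT);
    INSERT INTO DimDate VALUES (20150601, '2015-06-01'), (20150615, '2015-06-15');
    CREATE TABLE FactAccount (AccountKey INTEGER, OpenDateKey INTEGER, TranDateKey INTEGER);
    INSERT INTO FactAccount VALUES (1, 20150601, 20150615);
    -- Two role-specific views over the same physical dimension
    CREATE VIEW DimOpenDate AS SELECT DateKey, CalendarDate AS OpenDate FROM DimDate;
    CREATE VIEW DimTranDate AS SELECT DateKey, CalendarDate AS TranDate FROM DimDate;
""")
row = con.execute("""
    SELECT o.OpenDate, t.TranDate
    FROM FactAccount f
    JOIN DimOpenDate o ON f.OpenDateKey = o.DateKey
    JOIN DimTranDate t ON f.TranDateKey = t.DateKey
""").fetchone()
print(row)  # ('2015-06-01', '2015-06-15')
```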

The most basic and fundamental facts are stored in the Transaction Fact tables. The grain associated with a transaction fact table is usually
specified as "one row per line in a transaction", e.g. every line on a receipt. Typically a transaction fact table holds data at the most detailed
level, causing it to have a great number of dimensions associated with it.

However, there are also tables that represent business performance at the end of each regular, predictable time period. These are
called Periodic Snapshot Fact tables; they take a snapshot over a defined period of time, for example a performance
summary of product sales over the previous month. A periodic snapshot table depends on an underlying transaction fact table,
as it needs the detailed data held in the transaction fact table in order to deliver the chosen performance output.
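The derivation of a periodic snapshot from a transaction fact table can be sketched as follows; the table names, columns and period grain are illustrative only.

```python
import sqlite3

# Detailed transaction rows are rolled up into one snapshot row per
# product per month, the kind of summary the text describes.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE FactTransaction (ProductKey INTEGER, TranDate TEXT, Amount REAL);
    INSERT INTO FactTransaction VALUES
        (1, '2015-05-03', 100.0), (1, '2015-05-20', 250.0), (2, '2015-05-07', 80.0);
""")
# Monthly snapshot: one summary row per product per period
con.execute("""
    CREATE TABLE FactMonthlySnapshot AS
    SELECT ProductKey, substr(TranDate, 1, 7) AS Period,
           SUM(Amount) AS TotalSales, COUNT(*) AS TranCount
    FROM FactTransaction
    GROUP BY ProductKey, Period
""")
rows = con.execute("SELECT * FROM FactMonthlySnapshot ORDER BY ProductKey").fetchall()
print(rows)  # [(1, '2015-05', 350.0, 2), (2, '2015-05', 80.0, 1)]
```

The snapshot cannot exist without the transaction fact table: every summary value is computed from the detailed rows.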

Additionally, a Bridge table is used to resolve many-to-many relationships between a fact and a dimension. Bridge tables typically contain
only two columns: the key columns from the dimension and fact tables in the many-to-many relationship. For example, if there is a many-to-many
relationship between the Customer dimension and Fact Account, then a bridge table is utilized. A Bridge table can be created by
joining the two tables, Customer and Account, using surrogate keys (e.g. the Customer ID on the Account table). Bridge tables are fact-less,
with no measures.
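A fact-less bridge table of this kind can be sketched as below, with a joint account linked to two customers. The schema is illustrative, not the actual Insight Warehouse layout.

```python
import sqlite3

# The bridge carries only the two surrogate keys and no measures,
# resolving the many-to-many between customers and accounts.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE DimCustomer (CustomerKey INTEGER PRIMARY KEY, Name TEXT);
    CREATE TABLE FactAccount (AccountKey INTEGER PRIMARY KEY, Balance REAL);
    CREATE TABLE BridgeCustomerAccount (CustomerKey INTEGER, AccountKey INTEGER);
    INSERT INTO DimCustomer VALUES (1, 'A. Smith'), (2, 'B. Jones');
    INSERT INTO FactAccount VALUES (10, 1000.0);
    -- A joint account: one fact row linked to two customers
    INSERT INTO BridgeCustomerAccount VALUES (1, 10), (2, 10);
""")
rows = con.execute("""
    SELECT c.Name, f.Balance
    FROM DimCustomer c
    JOIN BridgeCustomerAccount b ON c.CustomerKey = b.CustomerKey
    JOIN FactAccount f ON b.AccountKey = f.AccountKey
    ORDER BY c.Name
""").fetchall()
print(rows)  # [('A. Smith', 1000.0), ('B. Jones', 1000.0)]
```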



How the ETL Process Works

The above diagram represents how the ETL process updates the components of Insight, from the T24 data extraction
process through to the presentation of user-defined reports, cubes and dashboards in the Insight Vision interface.

The data is extracted from T24 using a T24 application called DW.EXPORT and a set of extraction batch jobs, which are normally run on a
daily basis, either online or just after Close of Business. Based on the parameters of DW.EXPORT, the batches export data to a folder in .csv format.
DW.EXPORT can be modified to export any tables from T24, but it has a standard, out-of-the-box configuration which is specific to Insight.

The CSV files are loaded into the Insight Import database. In Insight Import, data from the CSV files is properly data-typed, parsed (multi-
values, sub-values and local reference fields) and loaded into the Insight Data Warehouse architecture.
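The multi-value parsing step can be sketched as below: a single exported field holding several values is split into one row per value so it can be loaded relationally. The '|' and '~' delimiters are assumptions for illustration; the actual DW.EXPORT file format may differ.

```python
# Illustrative multi-value/sub-value parsing, assuming '|' separates
# multi-values and '~' separates sub-values. Not the real extract format.

def parse_multivalue(record_id, field):
    """Split a multi-valued field into (id, mv_index, sv_index, value) rows."""
    rows = []
    for mv_index, value in enumerate(field.split("|"), start=1):
        for sv_index, sub in enumerate(value.split("~"), start=1):
            rows.append((record_id, mv_index, sv_index, sub))
    return rows

rows = parse_multivalue("CUST-1", "phone1~phone2|email1")
print(rows)
# [('CUST-1', 1, 1, 'phone1'), ('CUST-1', 1, 2, 'phone2'), ('CUST-1', 2, 1, 'email1')]
```

Keeping the multi-value and sub-value indices preserves the positional associations that T24 records rely on.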

The data flows from the source systems through the Insight Landing, Insight Source and Insight Staging databases before reaching the Insight
Warehouse database. From there it is available to the various reporting options. Insight Landing is a compressed SQL Server database that contains
historical copies of source systems, in their native format, kept at varying frequencies, and is used for reprocessing data; it can also be used as a
reporting data source for reporting off raw source system data. Insight Source is another SQL Server database; it contains all the source data
required to process the current ETL, and the integration support structure for different source system data, for just one business date.

Insight Master Data is where all user-defined business logic lives, including how to map source codes to meaningful classifications and how to
group values into meaningful buckets (e.g. Age=23 means Age Group=18 to 25). This database is also the repository for the GL Chart of
Accounts for Financial Intelligence. Furthermore, it contains translations for Reports & Cubes (including object & column names and formats,
label names, and report prompts), Business Rules, Dimension Hierarchies and manually entered data: all information that is key to the
operation of a business. It is integrated with the ETL.
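The Age Group example above is a bucketing rule, and can be sketched as follows. The bucket boundaries and labels here are illustrative, not the actual Master Data configuration.

```python
import bisect

# Master Data grouping sketch: map a raw value (Age=23) into a
# meaningful bucket (Age Group='18 to 25'). Boundaries are illustrative.
AGE_BOUNDS = [18, 26, 41, 66]  # lower bound of each group after the first
AGE_LABELS = ["Under 18", "18 to 25", "26 to 40", "41 to 65", "66 and over"]

def age_group(age):
    """Return the Age Group label for a given age."""
    return AGE_LABELS[bisect.bisect_right(AGE_BOUNDS, age)]

print(age_group(23))  # 18 to 25
```

In practice such rules would live as mapping tables in the Master Data database rather than in code, so business users can maintain them without a release.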

Insight Staging is another SQL Server database, where all data is brought together and staged to be populated into Insight Warehouse. The
sources for Insight Staging are Insight Source, Insight Master Data and, optionally, Insight Pricing. Insight Pricing contains a funds transfer
pricing model and calculates profitability for all accounts and customers as of the latest business day processed in the ETL. This database is used to
power the interactive Pricing web front end (an ASP.Net web application used to manage the FTP model and create/manage interactive
customer pricing scenarios).

The Microsoft Analysis Services cubes are refreshed to include the new or updated data, and for ad hoc analysis the Insight Cubes can be used
to view the data warehouse data. The cubes are multi-dimensional databases, optimized and aggregated for efficient data retrieval. Most analysis
is done by connecting directly to the cubes using Microsoft Excel pivot tables, but they can also be a source for reports.

Abstraction Views
Views are set up that pre-join the dimension and fact tables. The main purpose of a view is to abstract away the complexity of creating a specific
result set. In large relational databases, it is often necessary to join many tables together to get useful data. By creating views, any user can
access that data without needing to work directly with the underlying tables. Often, views are used for security or simplicity purposes (i.e. to
encapsulate business logic or computations in something that is simple to query). For security, they can restrict access to sensitive data by
filtering (restricting the rows available) or by masking off sensitive fields from the underlying table.

In Insight, the key views for most reporting are:



- v_Account – shows account details such as classifications, balances, dates etc. An account record can be linked to a customer using the customerID column
- v_Customer – presents users with customer attributes and account summaries, such as total balance by classification and number of accounts
- v_AcctTran – shows transaction details for an account
- v_GL – shows GL information
- v_GLTran – contains all of the general ledger transactions; in the case of T24, it has all of the system transactions
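A view in the spirit of v_Account can be sketched as below: it pre-joins the fact and dimension tables so a report writer queries one object. The schema is illustrative, not the actual Insight Warehouse layout.

```python
import sqlite3

# Abstraction-view sketch: the view hides the fact/dimension join and
# exposes the customerID column used to link accounts to customers.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE DimCustomer (CustomerKey INTEGER PRIMARY KEY, CustomerID TEXT);
    CREATE TABLE FactAccount (AccountKey INTEGER, CustomerKey INTEGER,
                              Classification TEXT, Balance REAL);
    INSERT INTO DimCustomer VALUES (1, 'CUST-1');
    INSERT INTO FactAccount VALUES (10, 1, 'Savings', 5000.0);
    CREATE VIEW v_Account AS
        SELECT f.AccountKey, c.CustomerID, f.Classification, f.Balance
        FROM FactAccount f JOIN DimCustomer c ON f.CustomerKey = c.CustomerKey;
""")
row = con.execute("SELECT * FROM v_Account").fetchone()
print(row)  # (10, 'CUST-1', 'Savings', 5000.0)
```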

Purging Data
Occasionally it is necessary to remove large amounts of data from a data warehouse to make room for new data, and sometimes certain
outdated or specific data must be removed. Suppose that a customer previously held accounts at XYZ Bank and has subsequently closed
them. The business users of the Insight warehouse may decide that they no longer need to see any data related to those accounts, so this
data should be deleted.

In Insight Warehouse, users can execute the DataDelete procedure to purge data, with options to specify the start date and end date and to
exclude the Month End dates.
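The purge behaviour can be sketched as below: delete rows in a date range while preserving Month End dates. This only mimics the DataDelete options described above; the actual procedure's parameters and logic may differ.

```python
import sqlite3

# Sketch of a date-range purge that keeps month-end rows.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE FactAccount (BusinessDate TEXT, Balance REAL);
    INSERT INTO FactAccount VALUES
        ('2015-01-15', 100.0), ('2015-01-31', 200.0), ('2015-02-10', 300.0);
""")

def purge(con, start_date, end_date, keep_month_end=True):
    sql = "DELETE FROM FactAccount WHERE BusinessDate BETWEEN ? AND ?"
    if keep_month_end:
        # A date is a month end if the next day falls in a different month,
        # so only delete rows where the next day is in the same month.
        sql += (" AND strftime('%m', BusinessDate) ="
                " strftime('%m', date(BusinessDate, '+1 day'))")
    con.execute(sql, (start_date, end_date))

purge(con, '2015-01-01', '2015-01-31')
rows = con.execute("SELECT BusinessDate FROM FactAccount ORDER BY BusinessDate").fetchall()
print(rows)  # [('2015-01-31',), ('2015-02-10',)]
```

The mid-month row inside the range is removed, while the month-end row and everything outside the range survive.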



Scheduled Jobs
Insight_ETL
This scheduled job runs daily or monthly depending on the warehouse installation. It has steps to bring in and store source data into Insight
Landing and Insight Source, then runs the extract/transform/load (ETL) procedures to update Insight Staging and Insight Warehouse. As part
of the ETL, the Insight Pricing database is updated (if the module is installed). The final steps enable the business date and process the cubes.
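The step order described above can be summarised as follows. The step names here are illustrative; the real job is a SQL Server Agent job, not Python.

```python
# Illustrative ordering of the Insight_ETL job's steps as described
# above. Step names are hypothetical labels, not the real job steps.
ETL_STEPS = [
    "load_insight_landing",    # store source data historically
    "load_insight_source",     # stage the current business date's data
    "run_etl_to_staging",      # transform into Insight Staging
    "load_insight_warehouse",  # populate the warehouse tables
    "update_insight_pricing",  # only if the Pricing module is installed
    "enable_business_date",    # final steps: publish the business date...
    "process_cubes",           # ...and process the Analysis Services cubes
]

def run_job(pricing_installed=False):
    """Return the steps that would execute for this installation."""
    return [s for s in ETL_STEPS
            if pricing_installed or s != "update_insight_pricing"]

print(run_job())  # pricing step skipped when the module is absent
```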

Insight_Landing_Maintenance
This scheduled job runs weekly or monthly. For daily data installations, this job purges old daily data from Insight Landing. For SQL Server
Enterprise installations, it compresses any non-compressed objects in the Insight Landing database, to reduce the storage requirements of
the historical database.

Insight_DB_Maintenance
This scheduled job runs weekly or monthly. This job performs regular database maintenance such as rebuilding indexes and statistics, and
checking database integrity.

Insight_Warehouse_Purge
This scheduled job runs monthly. This job is for installations running daily that choose to purge daily data older than a specified time period
from Insight Warehouse.
