
Testing in Power Center

Introduction

This article covers the following:

Different testing types: unit testing, integration testing, and UAT (user acceptance testing).

Testing Architecture & Processes

Testing tools available with Informatica

Testing facilities available with Informatica PowerCenter

Unit Testing

Unit testing can be broadly classified into two categories.

Quantitative Testing


Validate your Source and Target

a) Ensure that your connectors are configured properly.
b) If you are using a flat file, make sure you have sufficient read/write permissions
on the file share (see the sketch below).
c) Document all connector information.
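
As a quick illustration of point (b), here is a minimal sketch that verifies read/write access on a flat-file share before a session runs. The file path and the idea of running this as a pre-check are assumptions for illustration, not part of PowerCenter itself.

```python
import os

# Placeholder path: replace with the actual flat-file location used by the session.
FLAT_FILE = "/mnt/etl_share/sales_extract.dat"

def check_flat_file_access(path):
    """Report whether the ETL service account can read the file and write to its share."""
    directory = os.path.dirname(path)
    checks = {
        "file exists": os.path.exists(path),
        "file readable": os.access(path, os.R_OK),
        "directory writable": os.access(directory, os.W_OK),
    }
    for name, ok in checks.items():
        print(f"{name}: {'OK' if ok else 'FAILED'}")
    return all(checks.values())

if __name__ == "__main__":
    check_flat_file_access(FLAT_FILE)
```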

Analyze the Load Time


a) Execute the session and review the session statistics.
b) Check the Read and Write counters and note how long the load takes.
c) Use the session and workflow logs to capture the load statistics.
d) Document all load timing information (a sketch follows this list).
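
To support point (d), here is a small, hedged sketch for documenting load timings per run. The session name, timestamps, and counter values are illustrative; in practice they would be copied from the session statistics in the Workflow Monitor or the session log.

```python
from datetime import datetime
import csv

# Illustrative values; copy the real ones from the session statistics / session log.
session_stats = {
    "session_name": "s_m_load_sales_fact",   # hypothetical session name
    "start_time": "2024-01-15 02:00:05",
    "end_time": "2024-01-15 02:14:47",
    "rows_read": 1_250_000,
    "rows_written": 1_248_900,
}

fmt = "%Y-%m-%d %H:%M:%S"
elapsed = (datetime.strptime(session_stats["end_time"], fmt)
           - datetime.strptime(session_stats["start_time"], fmt)).total_seconds()

# Throughput gives a baseline to compare against future runs.
session_stats["elapsed_seconds"] = elapsed
session_stats["rows_per_second"] = round(session_stats["rows_written"] / elapsed, 2)

# Append to a simple CSV so load timings are documented per run (point d).
with open("load_timing_log.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=session_stats.keys())
    if f.tell() == 0:
        writer.writeheader()
    writer.writerow(session_stats)

print(session_stats)
```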


Analyze the success rows and rejections.


a) Use customized SQL queries against the source and target to perform record count
verification (a record-count sketch follows this list).

b) Analyze the rejections and build a process to handle them. This requires a clear
business requirement on how to treat rejected data: should it be reloaded, or rejected
and reported? These decisions must be discussed and an appropriate process put in place.
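
A minimal record-count verification sketch for point (a). The table names, the load-date filter, and the use of sqlite3 as a stand-in driver are all placeholders; in practice the queries are the customized SQL checks written per mapping and run against the real source and target databases.

```python
import sqlite3  # stand-in for the actual source/target database drivers

# Placeholder queries; in practice these are the customized SQL checks per mapping.
SOURCE_COUNT_SQL = "SELECT COUNT(*) FROM src_sales WHERE load_date = ?"
TARGET_COUNT_SQL = "SELECT COUNT(*) FROM dwh_sales_fact WHERE load_date = ?"

def record_count_check(src_conn, tgt_conn, load_date):
    """Compare source and target row counts for one load date."""
    src_count = src_conn.execute(SOURCE_COUNT_SQL, (load_date,)).fetchone()[0]
    tgt_count = tgt_conn.execute(TARGET_COUNT_SQL, (load_date,)).fetchone()[0]
    rejected = src_count - tgt_count
    print(f"source={src_count} target={tgt_count} difference={rejected}")
    # A non-zero difference should reconcile with the reject/bad-file counts in the session log.
    return rejected == 0
```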


Performance Improvement

a) Network performance
b) Session performance
c) Database performance
d) Analyze and, if required, define the Informatica and database partitioning requirements.


Qualitative Testing

Analyze and validate your transformation business rules; this is closer to functional
testing.

a) Review field by field from source to target and ensure that the required
transformation logic is applied.

b) If you are changing existing mappings, make use of the data lineage feature
available in Informatica PowerCenter. This will help you find the consequences of
altering or deleting a port in an existing mapping.

c) Ensure that appropriate dimension lookups have been used and that your
development is in sync with the business requirements.


Integration Testing
Once unit testing is complete, it forms the basis for starting integration testing. Integration testing
should cover both the initial and incremental loading of the data warehouse.


Integration testing involves the following:

1. The sequence of ETL jobs in a batch.
2. Initial loading of records into the data warehouse.
3. Incremental loading of records at a later date to verify newly inserted or updated data.
4. Testing of rejected records that don't fulfil the transformation rules.
5. Error log generation.

Integration testing covers end-to-end testing of the data warehouse. The coverage includes the following:


Count Validation

Record count verification: run DWH backend/reporting queries against the source and target
as an initial check.

Control totals: to ensure accuracy in data entry and processing, the system can compare
control totals with manually entered or otherwise calculated control totals, using data
fields such as quantities, line items, documents, or dollars, or simple record counts.

Hash totals: a technique for improving data accuracy whereby totals are obtained
on identifier fields (i.e., fields for which it would logically be meaningless to construct a
total), such as account number, social security number, part number, or employee
number. These totals have no significance other than for internal system control purposes.

Limit checks: the program tests specified data fields against defined high or low value
limits (e.g., quantities or dollars) for acceptability before further processing.
(A combined sketch of these checks follows this list.)
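
The checks above can also be scripted outside PowerCenter as part of a test harness. The sketch below is illustrative only: the column names, the sample rows, and the limit range are assumptions, and the real data would come from source/target queries.

```python
rows = [
    # Illustrative extract; in practice this comes from a source/target query.
    {"account_number": "1001", "quantity": 10, "amount": 250.00},
    {"account_number": "1002", "quantity": 4,  "amount": 99.50},
    {"account_number": "1003", "quantity": 7,  "amount": 180.25},
]

# Control totals: sum meaningful fields (quantities, dollars) or simply count records.
control_totals = {
    "record_count": len(rows),
    "total_quantity": sum(r["quantity"] for r in rows),
    "total_amount": round(sum(r["amount"] for r in rows), 2),
}

# Hash total: total an identifier field purely as an internal control value.
hash_total = sum(int(r["account_number"]) for r in rows)

# Limit check: flag quantities outside an agreed high/low range before further processing.
LOW, HIGH = 1, 1000
limit_violations = [r for r in rows if not (LOW <= r["quantity"] <= HIGH)]

print(control_totals, hash_total, limit_violations)
```

Comparing these totals between source and target (and across runs) gives a quick, repeatable accuracy check before deeper field-by-field validation.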


Dimensional Analysis

Data integrity between the various source tables and relationships.


Statistical Analysis

Validation of various calculations.

o When validating calculations, you do not need to load all the rows into the target
and validate them.

o Instead, use the Enable Test Load feature available in Informatica PowerCenter.




Property: Enable Test Load
Description: You can configure the Integration Service to perform a test load.
With a test load, the Integration Service reads and transforms data without writing to
targets. The Integration Service generates all session files and performs all pre- and
post-session functions, as if running the full session.
The Integration Service writes data to relational targets, but rolls back the data when
the session completes. For all other target types, such as flat file and SAP BW, the
Integration Service does not write data to the targets.
Enter the number of source rows you want to test in the Number of Rows to Test field.
You cannot perform a test load on sessions using XML sources.
You can perform a test load for relational targets when you configure a session for
normal mode. If you configure the session for bulk mode, the session fails.

Property: Number of Rows to Test
Description: Enter the number of source rows you want the Integration Service to
test load. The Integration Service reads the number of rows you configure for the test load.



Data Quality Validation
Check for missing data, negative values, and consistency. Field-by-field data verification can
be done to check the consistency of source and target data.

Overflow checks: This is a limit check based on the capacity of a data field or data file
area to accept data. This programming technique can be used to detect the truncation of
a financial or quantity data field value after computation (e.g., addition, multiplication,
and division). Usually, the first digit is the one lost.

Format checks: These are used to determine that data are entered in the proper mode,
as numeric or alphabetical characters, within designated fields of information. The proper
mode in each case depends on the data field definition.

Sign test: This is a test for a numeric data field containing a designation of an algebraic
sign, + or -, which can be used to denote, for example, debits or credits for financial
data fields.

Size test: This test can be used to test the full size of the data field. For example, a
social security number in the United States should have nine digits.
(A combined sketch of these field-level checks follows.)
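
Below is a small sketch of the field-level checks described above (format, sign, and size tests). The field names, patterns, and the sample record are illustrative assumptions; the real rules come from the data field definitions.

```python
import re

def format_check(value, pattern):
    """Format check: the value must match the expected mode, e.g. numeric-only."""
    return re.fullmatch(pattern, value) is not None

def sign_test(value, negatives_allowed=False):
    """Sign test: reject negative amounts unless credits/debits are expected."""
    return value >= 0 or negatives_allowed

def size_test(value, expected_length):
    """Size test: e.g. a US social security number should have nine digits."""
    return len(value) == expected_length

# Illustrative record; in practice values come from the target table being verified.
record = {"ssn": "123456789", "amount": 149.99, "zip_code": "30301"}

results = {
    "ssn format":  format_check(record["ssn"], r"\d{9}"),
    "ssn size":    size_test(record["ssn"], 9),
    "zip format":  format_check(record["zip_code"], r"\d{5}"),
    "amount sign": sign_test(record["amount"]),
}
print(results)
```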


Granularity

Validate at the lowest granular level possible.

Other validations

Audit trails, transaction logs, error logs, and validity checks.


Note: Based on your project and business needs you might have additional testing
requirements.

UAT (User Acceptance Test)


In this phase you involve the users to test the end results and ensure that the business
is satisfied with the quality of the data.

Any changes to the business requirements follow the change management process, and
those changes must eventually go through the SDLC process.



Testing Architecture


From my perspective, there are two broad categories of testing architecture:

1. Unsecured

2. Secured

Unsecured

Even now, many organizations choose an unsecured testing architecture because it requires a
smaller budget and less maintenance.

Assume you have a sales data warehouse and must implement a change request for which you
need one year's worth of data from production.

In this case you develop a mapping to read the data from the production warehouse, load it
into development, and proceed with the development work.

This means a developer can see the production data as it is. Some organizations perform data
masking before bringing data from production into the UAT or development environment.

Secured

In this case, production data is always masked before it is made available in the DEV environment.
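
As an illustration of the masking step, here is a minimal sketch that hashes sensitive columns before rows are copied into DEV. The column list, salt handling, and masking scheme are assumptions for illustration, not a prescribed approach.

```python
import hashlib

# Columns treated as sensitive in this illustration; the real list comes from the
# data classification exercise for the warehouse.
SENSITIVE_COLUMNS = ["customer_name", "email", "ssn"]
SALT = "replace-with-a-secret-salt"  # placeholder; manage via a secrets store

def mask_value(value):
    """Deterministic one-way mask so referential integrity across tables is preserved."""
    digest = hashlib.sha256((SALT + str(value)).encode("utf-8")).hexdigest()
    return digest[:16]

def mask_row(row):
    """Mask only the sensitive columns; leave everything else intact."""
    return {col: (mask_value(val) if col in SENSITIVE_COLUMNS else val)
            for col, val in row.items()}

# Example: masking one production row before it lands in the DEV environment.
prod_row = {"customer_id": 42, "customer_name": "Jane Doe",
            "email": "jane@example.com", "ssn": "123456789", "sales_amount": 512.75}
print(mask_row(prod_row))
```

A deterministic mask keeps joins between masked tables working, which matters when the subset spans several related tables.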

Informatica Data Subset


Informatica also provides a tool called Informatica Data Subset.

The following overview of the Data Subset product is taken from the Informatica website.

Informatica Data Subset is a flexible enterprise data growth solution that automates the process of
creating smaller, targeted databases from large, complex databases. With referentially intact subsets
of production data, IT organizations can dramatically reduce the amount of time, effort, and disk
space necessary to support nonproduction systems.

Informatica Data Subset helps IT organizations untangle complex transactional systems, separating
out only functionally related data.

Informatica Data Subset is ideal for:

Optimize Development, Testing, and Training Systems

Dramatically accelerate development and test cycles and reduce storage costs by
creating fully functional, smaller targeted data subsets for development, testing, and
training systems, while maintaining full data integrity

Quickly build and update nonproduction systems with a small subset of production data
and replicate current subsets of nonproduction copies faster

Simplify test data management and shrink the footprint of nonproduction systems to
significantly reduce IT infrastructure and maintenance costs

Reduce application and upgrade deployment risks by properly testing configuration
updates with up- to-date, realistic data before introducing them into production

Easily customize provisioning rules to meet each organization's changing business
requirements

Lower training costs by standardizing on one approach and one infrastructure

Train employees effectively using reliable, production-like data in training systems

Support Corporate Divestitures and Reorganizations


Untangle complex operational systems and separate data along business lines to
quickly build the divested organization's system

Accelerate the provisioning of new systems by using only data that's relevant to
the divested organization

Decrease the cost and time of data divestiture with no reimplementation costs

Reduce the Total Cost of Storage Ownership

Dramatically increase an IT team's productivity by reusing a comprehensive list of data
objects for data selection and updating processes across multiple projects, instead of coding
by hand, which is expensive, resource intensive, and time consuming

Accelerate application delivery by decreasing R&D cycle time and
streamlining test data management

Improve the reliability of application delivery by ensuring IT teams have ready access
to updated quality production data

Lower administration costs by centrally managing data growth solutions across all
packaged and custom applications

Substantially accelerate time to value for subsets of packaged applications

Decrease maintenance costs by eliminating custom code and scripting


Testing Processes


Concentrate on the following for any testing requirements that you have:

1. Understand the application data and business requirements.

2. Identify sensitive information and develop processes to protect it.

3. Understand the data requirements for test and development.

In any organization there are parallel activities going on: BAs want to test functionality
in UAT, for which they need data from production, while a developer wants to perform unit
testing, which also requires production data.

For such requests we need to create a data load matrix and prioritize the needs.

Data Load Matrix


Project Team | Request Description | Priority | Data Refresh Cycle | Request Type
QA           |                     | HIGH     | Monthly            | One-Off
Development  |                     | LOW      | Daily              | Regular


4. Define data selection criteria and data masking rules.

5. Test and validate.

6. Audit and secure.



Informatica PowerCenter Testing


Debugger: a very useful tool for debugging a valid mapping to gain troubleshooting
information about data and error conditions. Refer to the Informatica documentation for
more details on the Debugger.


Test Load Options for Relational Targets:



Property: Enable Test Load
Description: You can configure the Integration Service to perform a test load.

With a test load, the Integration Service reads and transforms data without writing
to targets. The Integration Service generates all session files and performs all pre-
and post-session functions, as if running the full session.

The Integration Service writes data to relational targets, but rolls back the data
when the session completes. For all other target types, such as flat file and SAP
BW, the Integration Service does not write data to the targets.

Enter the number of source rows you want to test in the Number of Rows to Test
field. You cannot perform a test load on sessions using XML sources.

You can perform a test load for relational targets when you configure a session for
normal mode. If you configure the session for bulk mode, the session fails.

Property: Number of Rows to Test
Description: Enter the number of source rows you want the Integration Service to test load.
The Integration Service reads the number of rows you configure for the test load.


Running the Integration Service in Safe Mode


o Test a development environment. Run the Integration Service in safe
mode to test a development environment before migrating to production.

o Troubleshoot the Integration Service. Configure the Integration Service to fail
over in safe mode and troubleshoot errors when you migrate or test a production
environment configured for high availability. After the Integration Service fails over in
safe mode, you can correct the error that caused the Integration Service to fail over.


Syntax Testing: test your customized queries in the Source Qualifier before
executing the session (a sketch follows).
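
One way to exercise a Source Qualifier SQL override before running the session is to run the same statement directly against the source with a small fetch. In the sketch below the query, table names, and sqlite3 stand-in driver are placeholders; substitute the driver and override that match your source.

```python
import sqlite3  # stand-in driver; use the driver matching your source database

# Placeholder for the SQL override configured in the Source Qualifier.
SQ_OVERRIDE_SQL = """
    SELECT order_id, customer_id, order_amount
    FROM src_orders
    WHERE order_date >= '2024-01-01'
"""

def syntax_check(conn, sql, sample_rows=10):
    """Run the override with a small fetch to surface syntax errors and eyeball a sample."""
    cursor = conn.execute(sql)
    sample = cursor.fetchmany(sample_rows)
    print(f"query parsed and returned {len(sample)} sample row(s)")
    return sample
```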

Performance Testing for identifying the following bottlenecks:

o Target

o Source

o Mapping
o Session
o System

Use the following methods to identify performance bottlenecks:

Run test sessions. You can configure a test session to read from a flat file source or to
write to a flat file target to identify source and target bottlenecks.

Analyze performance details. Analyze performance details, such as performance
counters, to determine where session performance decreases.

Analyze thread statistics. Analyze thread statistics to determine the optimal
number of partition points.

Monitor system performance. You can use system monitoring tools to view the
percentage of CPU use, I/O waits, and paging to identify system bottlenecks. You can
also use the Workflow Monitor to view system resource usage.

Use the PowerCenter conditional filter in the Source Qualifier to improve performance.

Share metadata. You can share metadata with a third party. For example, you want to
send a mapping to someone else for testing or analysis, but you do not want to disclose
repository connection information for security reasons. You can export the mapping to an
XML file and edit the repository connection information before sending the XML file. The
third party can import the mapping from the XML file and analyze the metadata.
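
For example, here is a rough sketch of scrubbing connection details from an exported mapping/workflow XML before sharing it. The attribute names used below (CONNECTIONNAME, DSN, USERNAME) are assumptions about the export format; inspect your own PowerCenter export to confirm which attributes actually carry connection information.

```python
import xml.etree.ElementTree as ET

def scrub_connection_info(in_path, out_path,
                          attr_names=("CONNECTIONNAME", "DSN", "USERNAME")):
    """Blank out attributes that look like connection details before sharing the XML.

    The attribute names are illustrative assumptions; verify them against a real export.
    """
    tree = ET.parse(in_path)
    for element in tree.iter():
        for attr in attr_names:
            if attr in element.attrib:
                element.set(attr, "REDACTED")
    tree.write(out_path, encoding="utf-8", xml_declaration=True)

# Usage (paths are placeholders):
# scrub_connection_info("m_load_sales_fact.xml", "m_load_sales_fact_scrubbed.xml")
```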
