

Contact: +91 xxxxxxxxx


Career Objective:
Seeking assignments in Software Development / Informatica MDM with a growth-oriented
organization of high repute.
Professional Profile:

5+ years of experience with Informatica Master Data Management (MDM) and Informatica PowerCenter tools.
Expertise in Master Data Management concepts and methodologies, and the ability to apply this knowledge
in building MDM solutions.
Experience in installation and configuration of core Informatica MDM Hub components such as Hub
Console, Hub Store, Hub Server, Cleanse Match Server, and Cleanse Adapter on Windows.
Hands-on experience with Informatica MDM Hub configuration: data mappings (landing, staging,
and base objects), data validation, match and merge rules, and customizing/configuring Informatica
Data Director (IDD) applications.
Experience in defining and configuring landing tables, staging tables, base objects, lookups, query
groups, queries/custom queries, packages, hierarchies and foreign-key relationships.
Expertise in Informatica MDM Hub match and merge rules, batch jobs, and batch groups.

Knowledge of implementing hierarchies, relationship types, packages, and profiles for
hierarchy management in MDM Hub implementations.

Experience in configuring entity base objects, entity types, relationship base objects, relationship
types, and profiles using the Hierarchy tool.
Experience in several facets of MDM implementations including Data Profiling, metadata acquisition,
data migration, validation, reject processing and pre-landing processing.
Exposure to preparing technical design documents, test cases, and unit test documents.
Experience in Business Intelligence on various types of projects ranging from implementation, design
and development to post production support and maintenance.
Extensive experience in design and development of ETL in Informatica 9.1.0/8.6.1, DB2, and SQL
Server 2005.
Strong exposure to data warehousing concepts such as star schema and snowflake schema.
Good domain knowledge in Finance, Sales, and HR.

Good understanding of RDBMS concepts and experience in writing queries using DB2 and SQL Server.

Worked on development and customization of Informatica mappings, mapplets, and workflows with
worklets and tasks, using various transformations to load data from multiple sources into data
warehouse targets.

Good exposure to using parameters, variables, and user-defined functions.

Implemented various performance-tuning techniques on targets, sources, mappings, and sessions in
all versions of Informatica.
Involved in Informatica version upgrade activities.
Sound knowledge of performance tuning of database queries and indexes; supported applications
with critical SLAs.
Good experience in UNIX shell scripting
Strong exposure to the Control-M scheduling tool.
Excellent client handling ability with good presentation skills
Excellent communication and interpersonal skills; ability to grasp new concepts quickly, both
technical and business-related.

Master of Computer Applications (MCA) from Osmania University.

Technical Skills:
EIM Tools             : Informatica Multidomain MDM 9.5.0
Cleanse Adapters      : Informatica Data Quality 9.1 (IDQ), Trillium
Business Intelligence : Informatica PowerCenter 9.1.0/8.6.1, Cognos 8.4
Databases             : DB2, MS SQL Server 2005, Oracle 9i/10g
SQL Editors           : IBM Quest Central, Advanced Query Tool
Languages             : SQL, PL/SQL, C, C++, Java, UNIX shell scripting
Operating Systems     : UNIX, Windows 2000/XP
Other Tools           : Control-M, Maestro, HP Quality Center, Remedy

Professional Experience:
Working with [Company Name] from June 2012 to date.

Project #1
Project Title : Product Information Management (PIM)
Client        : Nokia Corporation, Finland
Role          : Developer
Environment   : Informatica MDM 9.5.0, Informatica MDM Data Director 9.1.0, Informatica Data Quality (IDQ)
9.5.0, Informatica PowerCenter 9.1.0, AQT 9.0, Erwin, JBoss, HP Quality Center 9.2, Windows XP
Duration      : Jan 2013 to date
Overview: Product Master enables Nokia to uniquely identify and collect product data with logical relations
across different business areas. This is done by storing and referencing data within the Product Master tool and
by implementing data governance, data quality, and data standards, providing guidance to improve the data
source processes where data is collected and maintained. The aim was to build a single data hub to serve as the
"best version of the truth" about product data. Product Master is built on the Informatica Hub Console.

Strong understanding of product life cycle, data governance, data quality, and data security for
products across the enterprise, spanning many disparate systems.

Planned and executed in Scrum mode following Agile methodology.

Delivered bug-free deliverables in all sprints; successfully managed error handling.

Interacted with business users to identify, prioritize, and resolve numerous data issues.

Created project plans and contributed to design and development.

Built a data model according to business needs by interacting with the data architect.

Analyzed the source systems for erroneous, duplicate, and integrity issues in the data.
Implemented the pre-land and land process of loading the dataset into the Informatica MDM Hub.

Configured the three source systems from which approximately 1.2 million records are used in the
landing process.

Configured Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, query
groups, packages and custom cleanse functions

Defined system trust and validation rules for base object columns.

Configured search strategy, match rule sets, and match columns for the match and merge process,
improving data quality through match analysis.

Created Queries, Query Groups and packages in MDM Hub Console

Implemented an insert/update strategy using a nightly batch process, SOAP services for near-real-time
data, and IDD.
Configured Informatica Data Director (IDD) to support data governance by business users, IT managers,
and data stewards.
Created and deployed new applications in Informatica Data Director and bound each application to a
specific ORS.
Used the Metadata Manager for validating, promoting, importing, and exporting ORS repositories
across environments.
Configured master data for downstream systems such as PIM Online, the data warehouse, and reporting tools.
Performed testing of data validation, batch jobs, and batch groups.
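As an illustration of the match-and-merge process described above, the sketch below mimics a match rule plus trust-based survivorship in plain Python. It is a hypothetical simplification: the field names, source systems, and trust scores are assumptions for the example only, and the actual logic runs inside the Informatica MDM Hub, not in custom code.

```python
def match(a, b):
    """Match rule sketch: same normalized name and same postal code."""
    return (a["name"].strip().lower() == b["name"].strip().lower()
            and a["zip"] == b["zip"])

# Illustrative per-source trust scores; higher trust wins per column.
TRUST = {"CRM": 80, "ERP": 60, "LEGACY": 40}

def merge(records):
    """Consolidate matched records column by column by source trust,
    skipping sources that have no value for a column (survivorship)."""
    merged = {}
    for col in ("name", "zip", "phone"):
        best = max((r for r in records if r.get(col)),
                   key=lambda r: TRUST[r["source"]])
        merged[col] = best[col]
    return merged
```

For example, two records that match on name and zip would merge so that each column survives from the most trusted source that actually populated it.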

Project #2
Project Title : Finance Platform & Accounting (FP&A)
Client        : AON Hewitt
Role          : Developer
Environment   : DB2 9.7, Informatica 8.6.1, Informatica 9.1, Cognos 8.4, PeopleSoft 9.0,
Sun Solaris, Windows 2000/XP/7
Duration      : June 2012 to date
Overview: Global Finance Platform Program Phase 2 is a project to implement PeopleSoft Financials v9.0 for
Hewitt's North American entities. The modules in scope are General Ledger, Asset Management,
Procurement, e-Procurement, Accounts Payable, Contracts, Projects, Time & Expense, Billing, and Accounts
Receivable. Architecturally, EPM will be replaced with a new Finance Data Warehouse, which will be the central
collection point for data from the various sources to be used by the DSS applications.
Documented the analysis of functional requirements and prepared to create the mappings and
transformations in Informatica under the guidance of a senior developer.
Prepared technical design documents before developing ETL mappings.
Reviewed mappings, workflows, and output data; verified the data loaded into targets with the client;
performed unit testing of each mapping.
Extracted source data and transformed it through transformations as per business logic in mappings
using Informatica PowerCenter 8.6.1.
Involved end to end in the design and implementation of mappings for the GL, PC, and Billing modules.

Project #3
Project Title : AON Corporation Metrics Data Mart
Client        : AON Hewitt
Role          : Developer
Environment   : DB2 9.7, Informatica 9.1, Control-M, Sun Solaris
Duration      : Aug 2012 to date
Overview: Created a metrics system to show the trend of open and closed tickets for Corporate Systems. Data
from three ticketing systems needed to be merged into a common repository to facilitate reporting.
Prepared technical design documents before developing ETL mappings.
Developed mappings in Informatica PowerCenter.
Prepared unit test documents for the mappings I implemented.
Got them reviewed and approved by the tech leads.
Involved in the end-to-end process.

Project #4
Project Title : Informatica upgrade to 9.1 for FDW and Sales Data Mart
Environment   : DB2, Informatica 8.6.1, PeopleSoft, Control-M, Remedy, HP Quality Center, Sun Solaris, Windows
Client        : Aon Hewitt
Duration      : July 2012 to Oct 2012
Project Scope & Responsibilities:
This was an internal upgrade project for the Informatica tool in the FDW and Sales domains.
Managed the project from offshore; this was the first time a project was managed entirely from
offshore, with the offshore team responsible for designing and implementing the project.
Provided weekly status updates to the Portfolio Manager and GMs.
Identified milestones for deployment and kept the communication channel open with business,
management, and users.
Delivered the project successfully in two waves.


Worked with [Company Name], Bangalore from April 2010 to June 2012.

Project #1
Project Name : Meta Data Service Bureau
Role         : Team Member
Tools Used   : Informatica 8.6, Oracle 10g, Windows XP, UNIX
Duration     : Apr 2011 to June 2012

Description: The client is a world-leading, research-based pharmaceutical company with a powerful
combination of skills and resources that provides a platform for delivering strong growth in today's
rapidly changing healthcare environment. The design specification addresses the integration of the
new source systems into the central data warehouse.
Roles and Responsibilities:

Understood the business data mapping requirements and prepared the technical design document.

Translated business processes into Informatica mappings.

Developed Informatica PowerCenter mappings using transformations such as Source Qualifier,
Aggregator, Expression, Lookup, Filter, Router, Rank, and Sequence Generator.
Developed complex mappings using transformations such as Source Qualifier, Joiner, Aggregator,
Expression, Connected Lookup, Unconnected Lookup, and Router.
Created mapplets using the Mapplet Designer for reuse in mappings.
Developed Informatica workflows and sessions associated with the mappings using the Workflow Manager.
Carried out reviews and unit testing of the mappings developed.
Fixed defects from UAT and production, if any.

Project #2
Project Name : Jefferson Mort Data
Role         : Team Member
Tools Used   : Informatica 8.6, Oracle 9i, UNIX, Toad 8.6, Windows XP
Duration     : Apr 2010 to Dec 2011

Description: Jefferson is a multinational financial institution in corporate and mortgage banking; it has
the largest branch network in Dallas and has established itself as the third-largest banking group in Texas.
The Jefferson Group relies on its IT infrastructure to conduct personalized services and transactions at more
than 1,200 points of presence across the country. Its product offerings range from credit cards and
loans to savings accounts; so far, the bank has used traditional banking practices
and various media for promoting its schemes. At a high level, decision makers
can use the data warehouse to decipher historical business trends, business performance, and
possible fraud causes.
The bank needs an analytical data warehouse that will give the business an enterprise-wide
view of information that can be used for analysis and fraud detection.
Role and Responsibilities:

Translated business processes into Informatica mappings.

Understood the SR and BR and prepared the LLD.
Designed the LLD ETL design specification.
Developed Informatica PowerCenter mappings using transformations such as Source Qualifier,
Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator, and
Update Strategy.
Developed complex mappings to implement Type 2 slowly changing dimensions using
transformations such as Source Qualifier, Joiner, Aggregator, Expression, Connected
Lookup, Unconnected Lookup, Filter, Router, Union, Sequence Generator, and Update Strategy.
Created mapplets using the Mapplet Designer for reuse in mappings.
Developed Informatica workflows and sessions associated with the mappings using
Workflow Manager.

Carried out reviews and unit testing of the mappings developed. Fixed defects from UAT
and production, if any.
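The Type 2 slowly changing dimension handling described above can be sketched outside Informatica as plain Python. This is a minimal illustration under assumed names (customer_id, address, is_current, valid_from/valid_to); in the project itself this logic lives in the mappings' Lookup and Update Strategy transformations.

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Type 2 SCD sketch: expire the current dimension row when a tracked
    attribute changes, and insert a new row as the current version.
    Keys and columns here are hypothetical, for illustration only."""
    today = today or date.today()
    for row in incoming:
        current = next((d for d in dimension
                        if d["customer_id"] == row["customer_id"]
                        and d["is_current"]), None)
        if current is None:
            # New key: insert as the first (current) version.
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif current["address"] != row["address"]:
            # Tracked attribute changed: close the old version, open a new one.
            current["valid_to"] = today
            current["is_current"] = False
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
        # else: no change, so the current row is left untouched.
    return dimension
```

The key design point the sketch shows is that history is preserved: every attribute change produces a new row, and exactly one row per business key is flagged current at any time.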