
PROFESSIONAL SUMMARY:

Extensively worked on data extraction, transformation, and loading from various
sources such as Oracle, SQL Server, and flat files.
Responsible for all activities related to the development, implementation, administration
and support of ETL processes for large scale data warehouses using Informatica Power
Center.
Strong experience in Data Warehousing and ETL using Informatica Power Center 8.6.
Experience in data modeling using Erwin, including Star Schema and Snowflake
modeling, Fact and Dimension tables, and physical and logical modeling.
Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL
processes.
Knowledge of the Kimball and Inmon methodologies.
Hands-on experience in tuning mappings and in identifying and resolving
performance bottlenecks at various levels: sources, targets, mappings, and sessions.
Extensive experience in ETL design, development and maintenance using Oracle SQL,
PL/SQL, SQL Loader, Informatica Power Center v 5.x/6.x/7.x/8.x.
Experience in testing the Business Intelligence applications developed in Qlikview.
Well versed in developing complex SQL queries involving unions and multiple-table
joins, with experience using views.
Experience in database programming in PL/SQL (Stored Procedures, Triggers and
Packages).
Well versed in UNIX shell scripting.
Experienced in creating effective test data and developing thorough unit test cases
to ensure successful data loads, and used pager notifications to send alerts on
successful completion.
Excellent communication, documentation and presentation skills using tools like Visio
and PowerPoint.
TECHNICAL SKILLS:
Data warehousing Tools : Informatica Power Center 8.6/8.1, Data Stage
Databases : Oracle10g/9i/ 8i/ 8.0/ 7.x, MS SQL Server 2005/ 2000/ 7.0/ 6.0, MS Access,
MySQL, Sybase.
Programming GUI : SQL, PL/SQL, SQL Plus, Java, HTML, C and UNIX Shell Scripting
BI Tools : QlikView 8.x
Tools/Utilities : TOAD, Benthic Golden, PL/SQL Developer
Operating Systems : Windows XP/NT/2003, UNIX
Configuration Management Tool : Surround SCM, Visual Source Safe
EDUCATION:
Master of Science in Computer Science.
PROFESSIONAL EXPERIENCE:
Confidential
sanofi-aventis - NJ Oct 11 - till date
Informatica Developer
The USMM implementation project is the upgrade of the current sanofi-aventis 1.x series MCO
medical reps Quest application to the latest 4.x .NET series of applications. In this project the
database was upgraded and an enterprise data warehouse was implemented for the MCO reps.
Distributed data comes from heterogeneous sources such as SQL Server and Oracle, and in flat
files from the clients.
Responsibilities:
Analyzed the business requirements and functional specifications.
Extracted data from the Oracle database and spreadsheets, staged it in a single
location, and applied business logic to load it into the central Oracle database.
Used Informatica Power Center 8.6 for extraction, transformation and load (ETL) of data
in the data warehouse.
Extensively used transformations such as Router, Aggregator, Normalizer, Joiner,
Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
Developed complex mappings in Informatica to load the data from various sources.
Implemented performance tuning logic on targets, sources, mappings, sessions to provide
maximum efficiency and performance.
Parameterized the mappings and increased the re-usability.
Used Informatica Power Center Workflow manager to create sessions, workflows and
batches to run with the logic embedded in the mappings.
Created procedures to truncate data in the target before the session run.
Extensively used Toad utility for executing SQL scripts and worked on SQL for
enhancing the performance of the conversion mapping.
Used the PL/SQL procedures for Informatica mappings for truncating the data in target
tables at run time.
Extensively used Informatica debugger to figure out the problems in mapping. Also
involved in troubleshooting existing ETL bugs.
Created a list of the inconsistencies in the data load on the client side so as to review and
correct the issues on their side.
Created the ETL exception reports and validation reports after the data was loaded
into the warehouse database.
Wrote documentation describing program development, logic, coding, testing,
changes, and corrections.
Created Test cases for the mappings developed and then created integration Testing
Document.
Followed Informatica recommendations, methodologies and best practices.
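The truncate-before-load steps above can be sketched as a small pre-session shell script; the staging table names and the sqlplus connection details are illustrative assumptions, not the project's actual objects:

```shell
#!/bin/sh
# Pre-session truncate sketch. STG_* names and the sqlplus call are
# placeholders; the real script would run on the ETL server.
STAGE_TABLES="STG_MCO_REP STG_ACCOUNT STG_SALES"

: > truncate_stage.sql                       # start with an empty script
for t in $STAGE_TABLES; do
    echo "TRUNCATE TABLE $t;" >> truncate_stage.sql
done
echo "EXIT;" >> truncate_stage.sql

cat truncate_stage.sql
# As a pre-session command task this would then be executed with:
# sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" @truncate_stage.sql
```

Generating the SQL file first keeps the table list in one place and leaves an audit trail of exactly what was truncated before each run.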
Environment: Informatica Power Center 8.6.1, Oracle 10g/ 9i, MS-SQL Server, Toad, HP
Quality Center, Windows XP and MS Office Suite
Confidential, Aug 07- Sep10
sanofi-aventis- NJ
Informatica Developer
The Sales Force Automation (SFA) system is a CRM solution that provides sales forces with a robust
set of customer relationship management capabilities, promoting team selling, multi-channel
customer management, information sharing, field reporting, and analytics, all within a life-
science-tailored mobile application that is easy to use. The purpose of this project is to maintain
a data warehouse that enables the home office to make corporate decisions. A decision
support system was built to compare and analyze their products against competitor products, along
with sales information at the territory, district, region, and area levels.
Responsibilities:
Created mappings and sessions to implement technical enhancements for the data
warehouse by extracting data from sources such as Oracle and delimited flat files.
Development of ETL using Informatica 8.6.
Applied slowly changing dimensions like Type 1 and 2 effectively to handle the delta
Loads.
Prepared various mappings to load the data into different stages like Landing, Staging
and Target tables.
Used various transformations like Source Qualifier, Expression, Aggregator, Joiner,
Filter, Lookup, and Update Strategy while designing and optimizing the mappings.
Developed Workflows using task developer, worklet designer, and workflow designer in
Workflow manager and monitored the results using workflow monitor.
Created various tasks like Session, Command, Timer and Event wait.
Modified several of the existing mappings based on the user requirements and maintained
existing mappings, sessions and workflows.
Tuned the performance of mappings by following Informatica best practices and also
applied several methods to get best performance by decreasing the run time of
workflows.
Prepared SQL Queries to validate the data in both source and target databases.
Worked on TOAD and Oracle SQL Developer to develop queries and create procedures
and packages in Oracle.
Worked extensively on PL/SQL as part of the process to develop several scripts to handle
different scenarios.
Created Test cases for the mappings developed and then created integration Testing
Document.
Prepared the error handling document to maintain the error handling process.
Automated the Informatica jobs using UNIX shell scripting.
Closely worked with the reporting team to ensure that correct data is presented in the
reports.
Interacted with the offshore team daily on development activities.
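The shell automation of Informatica jobs mentioned above might look like the following minimal cron wrapper; the service, domain, folder, and workflow names are placeholders, and pmcmd itself is only present on a PowerCenter node, so the call is guarded:

```shell
#!/bin/sh
# Hypothetical wrapper for one Informatica workflow; names below are
# assumptions. The password would normally come from an encrypted
# variable rather than the command line.
INFA_USER=${INFA_USER:-etl_user}
FOLDER=SFA_DW
WORKFLOW=wf_daily_sales_load

CMD="pmcmd startworkflow -sv IS_PROD -d Domain_Prod -u $INFA_USER -f $FOLDER -wait $WORKFLOW"
echo "launching: $CMD"

if command -v pmcmd >/dev/null 2>&1; then
    $CMD || { echo "workflow $WORKFLOW failed" >&2; exit 1; }
else
    echo "pmcmd not on PATH; run this on the Informatica server" >&2
fi
```

The `-wait` flag makes pmcmd block until the workflow finishes, so the script's exit code can drive alerting from the scheduler.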
Environment: Informatica Power Center 8.1, Oracle 9i, MS-SQL Server, PL/SQL Developer,
Bourne shell, Windows XP,TOAD, MS Office and Delimited Flat files
Confidential, Dec05-Jul07
Chicago- IL
Data warehouse Developer

The American Medical Association (AMA) plays a key information management role by
collecting, maintaining, and disseminating primary-source physician data, which supports
the development and implementation of AMA policy and a variety of data-driven products
and services. This repository of physician information is created, maintained, and customized for the DEA.
Responsibilities:
As a member of the warehouse design team, assisted in creating fact and dimension
tables based on specifications provided by managers.
Loaded operational data from Oracle, SQL Server, flat files, and Excel worksheets into
various data marts such as PMS and DEA.
Designed and created complex source to target mappings using various transformations
inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier,
Expression, Sequence Generator, and Router Transformations.
Implemented effective date range mapping (Slowly Changing dimension type2)
methodology for accessing the full history of accounts and transaction information.
Designed complex mappings involving constraint-based loading and target load order.
Used the debugger to identify bugs in existing mappings by analyzing data flow and
evaluating transformations, and created mapplets that provide reusability in mappings.
Involved in enhancement and maintenance activities of the data warehouse, including
performance tuning and rewriting of stored procedures for code enhancements.
Designed workflows with many sessions with decision, assignment task, event wait, and
event raise tasks, used Informatica scheduler to schedule jobs.
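The effective-date-range (SCD Type 2) logic above can be sketched as the SQL such an update-strategy mapping effectively performs; dim_account / stg_account and their columns are assumed names for illustration, not the project's actual model:

```shell
#!/bin/sh
# SCD Type 2 sketch: close the current dimension row on change, then
# insert a new open-ended version. Table and column names are made up.
cat > scd2_account.sql <<'EOF'
-- Close the current version when a tracked attribute changed
UPDATE dim_account d
   SET d.eff_end_dt = SYSDATE, d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_account s
                WHERE s.account_nk = d.account_nk
                  AND s.status <> d.status);

-- Insert a new open-ended version for new or changed accounts
INSERT INTO dim_account (account_sk, account_nk, status,
                         eff_start_dt, eff_end_dt, current_flag)
SELECT dim_account_seq.NEXTVAL, s.account_nk, s.status,
       SYSDATE, TO_DATE('9999-12-31', 'YYYY-MM-DD'), 'Y'
  FROM stg_account s
 WHERE NOT EXISTS (SELECT 1 FROM dim_account d
                    WHERE d.account_nk = s.account_nk
                      AND d.current_flag = 'Y'
                      AND d.status = s.status);
EOF
cat scd2_account.sql
```

Keeping an open-ended `eff_end_dt` plus a `current_flag` lets queries fetch either the full history or just the current version of each account.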
Environment: Informatica Power Center 6.2, Oracle, Business Objects 6.x, Windows 2000,
SQL Server 2000, Microsoft Excel, SQL * Plus
Confidential, Sep04-Nov05
AXA - NY
ETL Consultant
A single electronic solution provides employees of the AXA Pacific and AXA Assurance surety
companies access to a centralized system. It also provides an intranet interface to all
AXA surety bonds users across Canada.
Responsibilities:
Designed and developed the data transformations for source system data extraction; data
staging, movement and aggregation; information and analytics delivery; and data quality
handling, system testing, performance tuning.
Created Informatica mappings to build business rules to load data, using transformations
such as Source Qualifier, Aggregator, Expression, Joiner, Connected and Unconnected
Lookup, Filter, Sequence Generator, External Procedure, Router, and Update Strategy.
Stored reformatted data from relational, flat-file, and XML sources using Informatica (ETL).
Worked on Dimensional modeling to design and develop STAR schemas using ER-win
4.0, Identifying Fact and Dimension Tables.
Created various batch Scripts for scheduling various data cleansing scripts and loading
process.
Extensively worked on Mapping Variables, Mapping Parameters and Session Parameters.
Created post-session and pre-session shell scripts and mail-notifications
Created Data Breakpoints and Error Breakpoints for debugging the mappings using
Debugger Wizard.
Created Several Stored Procedures to update several tables and insert audit tables as part
of the process.
Extensively used joins, triggers, stored procedures, and functions for interaction with
the backend database using PL/SQL.
Wrote UNIX shell scripts for moving data from various source systems to the Data
Warehouse.
Performed performance tuning by optimizing the sources, targets, mappings, and sessions.
Tested mappings and sessions using various test cases in the test plans.
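A data-cleansing batch script of the kind described above might be as simple as the following; the file name and record layout are fabricated for illustration:

```shell
#!/bin/sh
# Pre-load cleansing sketch: strip carriage returns and the blanks
# around delimiters before the extract lands in the Informatica
# source directory. Sample file below is made up.
printf 'ACC001,  John ,100\r\nACC002, Mary ,200\r\n' > axa_extract.csv

tr -d '\r' < axa_extract.csv \
  | sed 's/[[:space:]]*,[[:space:]]*/,/g' \
  > axa_extract.clean.csv

cat axa_extract.clean.csv
```

Normalizing the delimiters in a shell pre-step keeps the Source Qualifier simple and avoids trimming logic inside every mapping.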
Environment: Informatica 6.2, Oracle 8i, TOAD, Windows NT and UNIX
Confidential, India Nov02-Aug04
QA Consultant
HDFC Bank offers various products such as Accounts & Deposits, Loans, Insurance, and Premium
Banking.
This project is mainly to test the Accounts & Deposits module functionality. Accounts &
Deposits covers different account types: Savings, Salary, Current, Demat, Deposits, Rural, and
Safe Deposit Locker.
Responsibilities:
Analyzed the business requirements and functional specifications.
Understanding the requirements specification and use case documents.
Creation of test plan, test strategy and test approach.
Created Test scripts, Traceability Matrix and mapped the requirements to the test cases.
Performed peer review of test scripts and prepared the QA measurement forms.
Attended the QA Audit meetings and test artifacts review.
Participated in the Integration testing and Unit Testing along with the Development team.
Conducted system, GUI, smoke, and regression testing; identified application errors and
interacted with developers to resolve technical issues.
Extensively used SQL scripts/queries for data verification at the backend.
Executed SQL queries, stored procedures and performed data validation as a part of
backend testing.
Used SQL to test various reports and ETL job loads in development, testing, and
production.
Performed negative testing to test the application for invalid data, verified and validated
for the error messages generated.
Responsible for generating weekly progress reports and updates to the project lead,
including test scenario status, concerns, and outstanding functionality.
Involved in Daily and Weekly Status meetings.
Creation of Test Summary report, traceability Matrix, Test Script Index.
Monitored the testing project in Quality Center, ensuring defects were entered,
tested, and closed.
Analyzed, documented and maintained Test Results and Test Logs.
Environment: Java, HTML, DHTML, Oracle8i, Data stage, Clarify and Benthic Golden.

Summary:
IT professional with 8+ years of experience in the Development and Implementation of
Data warehousing with Informatica Power Center, OLTP and OLAP using Data
Extraction, Data Transformation, Data Loading and Data Analysis.
Exposure to the installation and configuration of Informatica tools.
Experience in all phases of the Data warehouse life cycle involving Analysis, Design,
Development and Testing of Data warehouses using ETL.
Strong Data warehousing experience using Informatica Power Center 9.x/8.x/7.x.
Extensively used Informatica tools such as Informatica Server and Client tools like
Designer, Workflow manager, Workflow Monitor, Repository Manager.
Experience in implementing the complex business rules by creating transformations, re-
usable transformations (Expression, Aggregator, Filter, Connected and Unconnected
Lookup, Router, Rank, Joiner, Update Strategy), and developing complex Mapplets
and Mappings, and SQL Stored Procedure, and Triggers.
Extensive knowledge in architecture design of Extract, Transform, Load environment
using Informatica Power Center.
Developed Slowly Changing Dimension Mappings of type I, II and type III (version,
flag and time stamp).
Used Incremental Aggregation to update the values in Summary tables.
Developed mapping using Parameters and Variables. Extensively used parameter file to
pass mapping and session variables.
Experience in Performance Tuning of sources, targets, mappings, transformations and
sessions, by implementing various techniques like partitioning techniques and pushdown
optimization, and also identifying performance bottlenecks.
Extensive experience in implementing CDC using Informatica Power Exchange
8.x/7.x.
Expertise in doing Unit Testing, Integration Testing, System Testing and Data
Validation for Developed Informatica Mappings.
Experience in using the Informatica command line utilities like pmcmd to control
workflows in non-windows environments.
Well Experienced in doing Error Handling and Troubleshooting using various log
files.
Experience using Repository Manager to create Repository, User groups, Users and
managed users by setting up their privileges and profiles.
Skilled in troubleshooting with the help of error logs generated by the Informatica
server.
Extracted data from various sources like Oracle, flat files, SQL Server, and XML files,
and loaded it into the Oracle database.
Strong experience in using Teradata loading and unloading utilities such as
FastExport, FastLoad, and MultiLoad.
Strong Knowledge in Relational Database Concepts, Entity Relation Diagrams,
Normalization and De normalization Concepts.
Experience with UNIX shell scripting for File validation and SQL programming.
Practical understanding of Star Schema and Snowflake Schema Methodology using Data
Modeling tool Erwin 4.0/4.2.
Hands-on experience in writing, testing, and implementing Triggers, Stored
Procedures, Functions, and Packages at the database and form level using SQL.
Highly adaptive to team environments, with a proven ability to work in fast-paced
settings and excellent communication skills.
Excellent problem-solving and interpersonal skills, with experience working in
fast-paced, challenging, and dynamic environments. Highly motivated with a
passion for learning.
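The file-validation shell scripting mentioned above typically means reconciling a feed's trailer record against its detail rows before the load runs. A minimal sketch, assuming a common layout in which a trailer "T|<count>" states the expected number of detail ("D") rows (the sample feed is fabricated):

```shell
#!/bin/sh
# File-validation sketch: reject a feed whose detail-row count does
# not match its trailer record. Layout and file name are assumptions.
printf 'H|20100215\nD|row1\nD|row2\nT|2\n' > feed.dat

expected=$(awk -F'|' '/^T/ { print $2 }' feed.dat)
actual=$(grep -c '^D' feed.dat)

if [ "$expected" -eq "$actual" ]; then
    echo "feed.dat OK: $actual detail rows"
else
    echo "feed.dat REJECTED: trailer says $expected, file has $actual" >&2
fi
```

Running this check before the workflow starts catches truncated transfers early, instead of failing halfway through a session.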
Technical Skills
Operating Systems : Windows 2000/XP, UNIX.
ETL Tools : Informatica Power Center v9.x/v8.x/v7.x, Informatica PowerExchange v8.x,
Informatica PowerConnect.
Data Modeling Tool : Erwin 4.0/4.2
Databases : Oracle 8i/9i/10g/11g, SQL Server 2000/2005/2008, Teradata, DB2
Languages : C, C++, SQL, HTML
BI Tools : Crystal Reports, Business Objects XI 3.1
Database utilities : TOAD, SQL*Loader, PL/SQL Developer, Oracle SQL Developer
Education
Bachelor of Technology
Certification:
Informatica certified Developer
Professional Experience
Confidential, Georgia Feb 2011 - Till Date
Role: Informatica Developer
T-Mobile operates the nation's most reliable and largest wireless voice and data network,
including the largest 3G broadband network. The scope of the project is to develop operational
and analytical reports for each module. The ETL work is handled using Informatica Power
Center. Oracle was the database used, although Facets supports all major relational databases.
Responsibilities:
Interacted with the Business users to identify the process metrics and various key
dimensions and measures. Involved in the complete life cycle of the project.
Developed FRD (Functional requirement Document) and data architecture document and
communicated with the concerned stakeholders. Conducted Impact and feasibility
analysis.
Worked on dimensional modeling to design and develop STAR schemas by identifying
the facts and dimensions. Designed logical models as per business requirements using
Erwin.
Worked with Power Center Designer tools in developing mappings and Mapplets to
extract and load the data from flat files, XML files and Oracle (source) and loaded into
Oracle (target).
Prepared software requirement specifications through interaction with business analysts
and designed Star schema, logical and physical database designs.
Created reusable mapplets and transformations to load data from the operational data
source to the Data Warehouse, and was involved in capacity planning and data storage.
Created different transformations like Source Qualifier, Joiner transformation, Update
Strategy, Lookup transformation, Rank Transformations, Expressions, Aggregator,
Sequence Generator for loading the data into targets.
Developed complex mappings such as Slowly Changing Dimensions Type 2, Type 3,
type 1 - Time Stamping in the Mapping Designer.
Used Informatica Workflow Manager to create Workflows, database connections,
sessions, and batches to run the mappings.
Monitored the workflow performance and the status with Workflow Monitor.
Developed various command tasks to automate the pre-session jobs. Performed
performance tuning to improve the load. Wrote complex SQL queries involving multiple
tables with joins.
Involved in writing of Triggers, Functions, and Packages.
Converted SQL/Procedures and SQL Loader scripts to Informatica mappings.
Used the Target Load Ordering with Stored Procedures to update database.
Used Variables and Parameters in the mappings to pass the values between sessions.
Worked with Informatica Debugger to debug the mappings in Designer.
Scheduled Informatica jobs using Informatica Scheduler.
Identified various bottlenecks and successfully eliminated them to a great extent.
Performed Unit Testing and Involved in tuning the Session and Workflows for better
Performance.
Implemented Optimization/performance tuning techniques to identify bottlenecks- Query
tuning (Explain Plan, SQL Trace), table partitioning, memory tuning and cache
management.
Debugged and sorted out the errors and problems encountered in the production
environment.
Generated ad hoc reports using Business Objects.
Developed Desktop Intelligence and Web Intelligence reports Using Business objects.
Created graphical representation of reports such as Bar charts, Pie charts as per
requirements.
Created & tested reports using Business Objects functionality like queries, sections,
breaks, variables, formulae, etc.
Documented the entire process. The documents included the mapping document, unit
testing document and system testing document among others.
Environment: Informatica 9.0.1/8.6, Oracle 11g/10g, MS SQL Server 2008, flat files, Windows
XP, Business Objects XI, Power Exchange 8.6.1, UNIX.
Confidential, Richmond, VA Dec 2009 - Dec 2010
Role: ETL Developer
Magellan Health Services is a specialty health care management company that delivers
innovative solutions in collaboration with health plans, corporations, and government agencies
and their members nationwide. Magellan is a Fortune 1000 company serving over 52 million
members, with 41 health plans and several pharmaceutical manufacturers and state Medicaid
programs.
Responsibilities:
Worked with business analysts on requirement gathering, business analysis, testing,
and project coordination.
Responsible for development, support and maintenance of the ETL (Extract, Transform
and Load) processes using Informatica Power Center.
Developed an ETL Informatica mapping in order to load data into staging area. Extracted
from flat files and databases and loaded into Oracle 10g target database.
Designed Mapping document, which is a guideline to ETL Coding.
Created mappings using the transformations like Source Qualifier, Aggregator,
Expression, Look Up, Router, Filter, Update Strategy, Joiner, Sequence Generators
and Stored Procedure
Worked with various Informatica Power Center tools Source Analyzer, Target
designer, Mapping Designer & Mapplet, Transformations.
Used Slowly Changing Dimension Mappings of type II and I.
Created reusable transformations and Mapplets and used them in complex mappings.
Wrote Stored Programs (Procedures & Functions) to do Data Transformations and
integrate them with Informatica programs and the existing application.
Used Workflow Manager for Creating, Validating, Testing and Running the sequential
and concurrent Batches and Sessions, scheduled them to run at a specified time.
Used mapping Parameters and Variables.
Parameterized all variables and connections at all levels on Windows NT.
Created Sessions, Worklets and Workflows for carrying out test loads.
Analyzed Session log files to resolve errors in mapping and managed session
configuration.
Created, configured, scheduled and monitored the sessions and workflows on the basis of
run on demand, run on time using Informatica Power Center Workflow Manager.
Involved in migrating the ETL application from development environment to testing
environment.
Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages, and
Triggers.
Used PL/SQL Developer and TOAD to run SQL queries and validate the data.
Worked on Teradata SQL Assistant to analyze the existing data and implemented new
business rules to handle various source data anomalies.
Extensively used Teradata utilities like Fast load, Multiload to load data into target
database.
Documented and presented the production/support documents for the components
developed, when handing-over the application to the production support team.
Prepared ETL standards, Naming conventions and wrote ETL flow documentation for
Stage and ODS.
Developed advanced reports using Report Studio, including tabular objects, sub-
reports, cascading/customized prompts, cross tabs, lists, charts, and drill-through
reports.
Customized data by adding Calculations, Summaries and Functions.
Experienced in writing complex reports, performing SQL performance tuning, and
providing operational support for OLTP and OLAP applications.
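Driving the Teradata FastLoad utility mentioned above is usually done from a shell wrapper that generates the control script. A sketch, where the logon string, staging table, and column list are placeholders and the actual `fastload` call requires the Teradata client utilities:

```shell
#!/bin/sh
# FastLoad driver sketch; every name in the control script below is a
# placeholder, not Magellan's actual schema.
cat > claims.fl <<'EOF'
.LOGON tdprod/etl_user,etl_pass;
.SET RECORD VARTEXT "|";
DEFINE claim_id (VARCHAR(18)), member_id (VARCHAR(18)), paid_amt (VARCHAR(12))
FILE = claims.dat;
BEGIN LOADING stg.claims ERRORFILES stg.claims_e1, stg.claims_e2;
INSERT INTO stg.claims VALUES (:claim_id, :member_id, :paid_amt);
END LOADING;
.LOGOFF;
EOF

cat claims.fl
# fastload < claims.fl    # run where the Teradata utilities are installed
```

FastLoad targets empty staging tables and uses the two error tables to capture constraint and conversion rejects, which is why loads typically land in stg.* before any merge.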
Environment: Informatica Power Center 8.6.1, Teradata V2R6, Oracle 11g/10g, SQL Server
2008, TOAD, Business Objects XI, Linux, Windows XP.
Confidential, CLEVELAND, OH Feb 2009 - Nov 2009
Role: Informatica Developer
PNC Bank offers a wide range of services to clients ranging from individuals and small businesses
to corporations and government entities. I worked on the Risk Analytics team. The main scope of
the project was to capture capital and independent exposures to counterparties with which PNC
does business, which helped the users in their package approval process and served several other
purposes.
Responsibilities:
Involved in the design, analysis, and development of the data warehouse using Informatica
Power Center tools. The project involved creating a data warehouse/data mart that is
further used for reporting purposes.
Coordinated with Business system analyst (BSA) to understand and gather the
requirements. Documented user requirements, translated requirements into system
solutions and developed implementation plan and schedule.
Analyzed Mapping, Session, Source, Target and System Bottlenecks to improve and
tuned Performance of various ETL jobs.
Designed and developed complex Informatica mappings including Type-II slowly
changing dimensions.
Worked extensively on different types of transformations like Source Qualifier,
Aggregator, Joiner, Lookup, and Sequence Generator, Stored Procedure, Expression,
Normalizer, Filter and Rank transformations.
Categorized different alerts and exception types to be built from Users Data Validation
list and built mappings to generate exceptions and alerts accordingly.
Coordinated with OBIEE Team regarding the reporting data for Answers and BI
Publisher reports and also for the Dashboard Designs.
Created UNIX scripts for file transfers, to remove unwanted characters from source files,
and update the time dimension table that was maintained for reporting purposes.
Involved in maintaining the data model, creating the tables and granting the privileges for
read and write operations
Made extensive use of shared folders, project (non-shared) folders, and shortcuts to
maintain integrity between the Development, Test, Data Validation, and Performance environments.
Frequently used the import and export utility to migrate sessions from developers'
folders to the subject folder.
Extensively used parameter file to pass Mapping and Session Variables.
Involved in many technical decisions and prepared the technical design document along
with Visio diagrams for ETL.
Worked individually to develop most of the process, as well as within a team,
coordinating with other team members.
Performed Unit Testing for all developed ETL mappings.
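The "remove unwanted characters from source files" step described above often amounts to dropping stray control bytes that break a fixed-width or delimited parser. A sketch with a fabricated sample file:

```shell
#!/bin/sh
# Cleansing sketch: keep printable characters plus tabs and newlines,
# discarding any embedded control bytes. Sample data is made up.
printf 'CPTY001\tAAA\001\007\nCPTY002\tBBB\n' > exposure.src

tr -cd '[:print:]\t\n' < exposure.src > exposure.txt
cat exposure.txt
```

`tr -cd` deletes the complement of the listed set, so everything outside the printable range (other than tab and newline) is removed in one pass, which is cheap enough to run on every inbound transfer.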
Environment: Informatica Power Center 8.6, SQL*Plus, Toad 8.0, Oracle 9i, OBIEE
10.1.x, UNIX shell scripting, PL/SQL, CA-7 scheduling.
Confidential, FL Feb 2008 - Jan 2009
Role: ETL Developer

The Web Software Developer collaborates with the technical team to develop software-as-a-
service products. This position collaborates with internal technical teams to design, develop,
and implement databases for our software applications, and is responsible for understanding
end-to-end requirements and providing technical expertise in designing, developing, and
maintaining our database architecture. Neighborhood America is seeking talented, hands-on
software engineers to join our team building enterprise social networking solutions.
Responsibilities:
Analyzed the business requirements and framing the Business Logic for the ETL
Process.
Extensively used ETL to Load Data from fixed width as well as Delimited Flat files.
Worked extensively on different types of transformations like Normalizer, Expression,
Union, Filter, Aggregator, Update Strategy, Lookup, Stored Procedure, Sequence
Generator and Joiner.
Designed and Developed complex mappings, Reusable Transformations for ETL using
Informatica Power Center.
Loaded data from .CSV files into Oracle and Teradata.
Designed Mappings between sources to operational staging targets, using Star Schema,
Implemented logic for Slowly Changing Dimensions.
Developed and tested all the Informatica mappings involving complex Router, lookups
and update strategies.
Extensively wrote custom SQL overrides for the generated SQL queries in
Informatica.
Created workflows and worklets for Designed Mappings.
Implemented variables and parameters in the mappings.
Scheduled the loads at the required frequency by setting up batches and sessions using
the Power Center Workflow Manager, pmcmd, and scheduling tools.
Generated completion messages and status reports using workflow manager.
Worked with Workflow Manager to import/export metadata, jobs, and routines from
repository, and also created data elements
Worked on Dimension as well as Fact tables, developed mappings and loaded data on to
the relational database.
Extensively worked in Performance Tuning of programs, ETL procedures and
processes. Also used debugger to Troubleshoot Logical Errors.
Wrote ETL specifications and unit test plans for the mappings.
Environment: Informatica Power Center 8.1, Oracle 9i,Teradata, SQL, PL/SQL, MS SQL
Server 2005, TOAD, Erwin 4.1, UNIX Shell Script, Windows 2003
Confidential, IL Dec 2006 - Jan 2008
Role: ETL Developer

The Healthcare business of Thomson Reuters is the leading provider of decision support
solutions that help organizations across the healthcare industry improve clinical and business
performance. Our products and services help professionals and stakeholders understand
healthcare markets, access medical and drug information, manage costs, and improve the quality
of healthcare. The Healthcare business of Thomson Reuters offers business solutions for
clinicians, hospitals and healthcare providers, employers, health plans, government agencies,
pharmaceutical companies and researchers.
Responsibilities:
Involved in the team during the entire ETL process and development of data marts using
Informatica Power Center.
Developed Transformation logic and created various Complex Mappings and Mapplets
using the designer.
Created set of Reusable Transformations and Mapplets.
Implemented Type III Slowly Changing Dimensions.
Debugged a valid mapping to gain Troubleshooting information about Data and Error
Conditions.
Employed Informatica Power Center Workflow Manager for session management,
database connection management and scheduling of jobs to be run in the batch process.
Worked with Variables and Parameters in the mappings to pass the values between
sessions.
Worked on Partitioning of the Sessions and Performance Tuning of Informatica.
Created Multiple Sessions and executed workflows using Workflow designer and
viewed the results using workflow monitor.
Involved in Unit Testing for the validity of the data from different data sources
Fine tuned transformation and mappings for better performance.
Worked with PMCMD command line program to talk with the Informatica server.
Environment: Informatica Power Center 7.9.3, Windows 2000, Oracle 8i, SQL Server
2005/2000, flat files, Toad, Cognos 7.0
Confidential, India Feb 2004- Nov 2006
Role: Software Developer

Version IT is a Hyderabad-based firm that offers a range of services including custom
application/software development and hardware and networking solutions. It is a leading
software institute focused on delivering the best and most cost-effective solutions to clients in
areas such as educational institutions, e-business, healthcare, hotels, and industry. The
objective of this project was to build a flexible real-time data warehouse.
Responsibilities:
Extracted Data from Different Sources by using Informatica.
Extensively used the Informatica client tools: Source Analyzer, Warehouse Designer,
Mapping Designer, and Mapplet Designer.
Extracted data from different sources of databases. Created staging area to cleanse the
data and validated the data.
Designed and developed complex Aggregator, Expression, Filter, Joiner, Router, Lookup,
and Update Strategy transformation rules.
Developed schedules to automate the update processes and Informatica sessions and
batches.
Analyzed, designed, constructed and implemented the ETL jobs using Informatica.
Developed mappings/Transformations/mapplets by using mapping designer,
transformation developer and mapplet designer in Informatica Power Center.
Developed Shell scripts to setup runtime environment, and to run stored procedures,
packages to populate the data in staging tables.
Created users, user groups, and database connections, and managed user privileges using
Supervisor.
Environment: Informatica Power Center 7.1.1, Oracle 8i, MS SQL SERVER 2000, SQL,
PL/SQL, SQL*Loader, UNIX Shell Script.
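The staging-and-cleansing pattern described above can be sketched with Python's built-in sqlite3 standing in for the Oracle/SQL Server staging schema; all table and column names are invented for illustration.

```python
import sqlite3

# Sketch of a staging-area cleanse: load raw rows, trim/normalize fields,
# and move only valid rows into the target table. sqlite3 stands in for
# the real staging database; tables and rules are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_customer (cust_id TEXT, name TEXT, email TEXT)")
cur.execute("CREATE TABLE dim_customer (cust_id INTEGER, name TEXT, email TEXT)")

raw_rows = [("101", "  Alice  ", "alice@example.com"),
            ("102", "Bob", None),              # rejected: missing email
            ("bad", "Carol", "c@example.com")]  # rejected: non-numeric key
cur.executemany("INSERT INTO stg_customer VALUES (?, ?, ?)", raw_rows)

# Validation rules: numeric key and non-null email; names are trimmed.
cur.execute("""
    INSERT INTO dim_customer
    SELECT CAST(cust_id AS INTEGER), TRIM(name), email
    FROM stg_customer
    WHERE cust_id GLOB '[0-9]*' AND email IS NOT NULL
""")
conn.commit()
print(cur.execute("SELECT * FROM dim_customer").fetchall())
# -> [(101, 'Alice', 'alice@example.com')]
```

Rejected staging rows would normally be written to an error table for reporting rather than silently dropped.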

PROFESSIONAL SUMMARY:
7+ years of total IT experience and technical proficiency in building Data
Warehouses, Data Marts, Data Integration, Operational Data Stores and ETL
processes for clients in Financial (Equities, Futures, Options, Commodities, SPOTs,
Swaps, Bonds, Credit Risk, Market Risk, Operational Risk) and HealthCare
(Providers, Customers, Organizations, Plans, Claims, and Extracts) domains.
5+ years of strong experience in working with large scale Data Warehouse
implementations using Informatica PowerCenter 8.x/7.x/6.x, Oracle, DB2, SQL
Server on UNIX and Windows platforms.
Strong knowledge in OLAP systems, Kimball, and Inmon methodology & models,
Dimensional modeling using Star and Snowflake schema.
Extensive experience in Extraction, Transformation, and Loading (ETL) data from
various data sources into Data Warehouse and Data Marts using Informatica
PowerCenter tools (Repository Manager, Designer, Workflow Manager, Workflow
Monitor, and Informatica Administration Console).
Expertise in implementing complex business rules by creating robust Mappings,
Mapplets, Sessions and Workflows using Informatica PowerCenter.
Experience in performance tuning of Informatica mappings and sessions to improve
performance of the large volume projects.
Experience in Migration, Configuration and Administration of Informatica
PowerCenter.
Experience in integration of various data sources like Oracle, DB2, SQL Server, Flat
Files, Mainframes, XML files into Data Warehouse and also experienced in Data
Cleansing and Data Analysis.
Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages,
Cursors, Triggers, Views, and Indexes in distributed environment.
Excellent expertise with different types of data load strategies and scenarios, such as
historical dimensions, surrogate keys, and summary facts.
Worked extensively in all stages of SDLC, from gathering requirements to testing,
implementation and support.
Experience in preparing documentation such as high-level design, system requirement,
and technical specification documents.
Strong experience in writing UNIX Shell scripts, SQL Scripts for development,
automation of ETL process, error handling, and auditing purposes. Experience in
using UC4, Autosys, and Control-M scheduling tools to organize and schedule jobs.
Good knowledge of generating various complex reports using OBIEE, MicroStrategy,
and Business Objects.
Experience in using IBM Clear Quest to track defects and document test cases.
Good knowledge of TIBCO Rendezvous and IBM MQSeries.
Worked with cross-functional teams such as QA, DBA and Environment teams to
deploy code from development to QA and Production server.
Experience in project management, estimations, and resource management activities.
Excellent analytical and problem-solving skills, with a strong technical background and
interpersonal skills.
TECHNICAL SKILL SET:
Operating Systems UNIX, Linux, Windows XP/2000
Databases Oracle 11g/10g/9i, DB2 V8.01, SQL Server 2008/2005, MS Access
Database Tools TOAD, SQL Navigator
Load Utilities SQL Loader
ETL Tool Informatica PowerCenter 8.6.1/8.5.1/8.1.1/7.x/6.x
BI Reporting Tools Business Objects, OBIEE, MicroStrategy 8.2
Programming Languages C, C++, HTML, XML, COBOL, PL/ SQL, Java, J2EE, JSP
Scripting Languages Shell Scripting, Perl Scripting
Tools UC4, Cron, Control-M, Autosys
Application Servers Web Logic 10.x/9.x, Tomcat
Middleware TIBCO Rendezvous, IBM MQ Series
Test Management Tools IBM Clear Quest, Quality Center, JIRA
PROFESSIONAL EXPERIENCE:
Confidential, Chicago IL Jul 09 - Present
Data Warehouse Consultant
Project: Clearing Positions
Confidential is the world's leading and most diverse derivatives marketplace. The main
objective of the project is to build a distributed environment which would be a primary source
for trade processing, position management, performance bond, settlement, asset management,
banking, and deliverables information.
Roles & Responsibilities:
Interacting with business owners to gather both functional and technical requirements.
Documenting the business requirements and framing the business logic for the ETL
process.
Developing technical specifications and other helpful ETL documents following CME
Group's standards.
Involved in creating logical and physical data models using CA ERwin data modeler.
Generating the DDL scripts for the physical data model.
Use Agile methodology for SDLC and utilize scrum meetings for creative and productive
work.
Design and develop PL/SQL packages, stored procedure, tables, views, indexes, and
functions; implement best practices to maintain optimal performance.
Design, develop, and test Informatica mappings, workflows, worklets, reusable objects,
SQL queries, and Shell scripts to implement complex business rules.
Load historical and intraday trades, settlements, positions, and product data into the
Oracle data warehouse to enable business analysts to better understand, monitor, and
analyze the liquidity-generating performance of Market Maker firms trading CME
Group's products.
Migrating historical data from DB2 to the Oracle data warehouse.
Transferring the data from various sources like XML, flat files, DB2 into Oracle data
warehouse.
Extensively worked on SCD Type 2 using the Lookup transformation.
Identifying bottlenecks/issues and fine tuning them for optimal performance.
Oversaw unit and system tests and assisted users with acceptance testing.
Upgraded Informatica repository to 8.6.1 within the timeframe.
Responsible for capturing, reporting, and correcting error data.
Used Business Objects XI R2 to programmatically generate reports and gather necessary
information about report instances.
Performed/automated many ETL related tasks including data cleansing, conversion, and
transformations to load Oracle 10G based Data Warehouse.
Work with DBAs and systems support personnel in elevating and automating successful
code to production. Used UC4 for job scheduling, workload automation and for
generating reports.
Developed Shell/Perl scripts to transfer files using FTP/SFTP and to automate ETL
jobs.
Used WebLogic for hosting the servers.
Provide on-call support to production system to resolve any issues.
Conducting code walkthroughs and review peer code and documentation.
Playing role in design of scalable, reusable, and low maintenance ETL templates.
Environment: Informatica Power Center 8.6.1/8.5.1, Oracle 11g/10g RAC, DB2 v8.01, UC4,
RHEL 5.4, Windows XP, SQL, PL/SQL, Shell/Perl Scripting, BO XI, ERwin, TIBCO, Web
Logic 10.3.4, IBM Clear Case & Clear Quest 7.0.1, SQL Developer, TOAD 9.0.
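The SCD Type 2 handling mentioned above (look up the current dimension row, expire it on change, insert a new version) can be sketched as follows; sqlite3 stands in for the Oracle warehouse, and the table, column, and date values are hypothetical.

```python
import sqlite3

# Sketch of SCD Type 2: when a tracked attribute changes, expire the current
# dimension row and insert a new version with a fresh surrogate key.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_product (
    product_sk  INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    product_id  TEXT,   -- natural key
    price       REAL,   -- tracked attribute
    eff_date    TEXT, end_date TEXT, current_flag TEXT)""")
cur.execute("INSERT INTO dim_product (product_id, price, eff_date, end_date, current_flag) "
            "VALUES ('P1', 9.99, '2010-01-01', '9999-12-31', 'Y')")

def apply_scd2(cur, product_id, new_price, load_date):
    """Expire the current row if the price changed, then insert the new version."""
    row = cur.execute("SELECT product_sk, price FROM dim_product "
                      "WHERE product_id = ? AND current_flag = 'Y'",
                      (product_id,)).fetchone()
    if row and row[1] == new_price:
        return  # no change: nothing to do
    if row:  # change detected: close out the existing version
        cur.execute("UPDATE dim_product SET end_date = ?, current_flag = 'N' "
                    "WHERE product_sk = ?", (load_date, row[0]))
    cur.execute("INSERT INTO dim_product (product_id, price, eff_date, end_date, current_flag) "
                "VALUES (?, ?, ?, '9999-12-31', 'Y')", (product_id, new_price, load_date))

apply_scd2(cur, "P1", 12.49, "2011-06-01")
print(cur.execute("SELECT product_id, price, current_flag FROM dim_product "
                  "ORDER BY product_sk").fetchall())
# -> [('P1', 9.99, 'N'), ('P1', 12.49, 'Y')]
```

In Informatica the lookup/compare step corresponds to a Lookup transformation on the natural key feeding an Update Strategy.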
Confidential, Wilmington DE Mar 08 - Jun 09
Informatica Consultant
The objective of the project is to provide timely, accurate, and consistent sales information. This
involves developing and implementing a solution that addresses the technology challenges and
the business and organizational issues, and that provides a more complete picture of sales: the
ability to measure accurate sales results and identify growth potential. The proposed technology
solution is to implement a data warehouse that would be the primary source for all business
reporting.
Data is extracted from several source systems and consolidated into an enterprise data store
known as the Operational Data Store (ODS). Data is then extracted from the ODS and transformed
into various data marts, such as the Spending Report (SPR) data mart, which is used to track the
total spending of different groups/departments.
Roles & Responsibilities:
Interacted with business users to gather business requirements and designed
user-friendly templates to communicate further enhancements to be implemented.
Coordinated with Data Modelers for designing the dimensional model.
Extensively worked in Credit Cards billing and payments subject area.
Involved in documenting Functional Specifications, Design Specifications documents
and created ETL Specifications documents and updated them as and when needed.
Created System Interface Agreement (SIA) between source system and target systems,
which has the escalation procedures in case of issues and SLA.
Designed ETL specifications with transformation rules, following ETL best practices for
good performance, maintainability of the code, and efficient restartability.
Designed reusable objects such as mapplets and reusable transformations in Informatica.
Experienced in developing mappings using transformations such as Source Qualifier,
Aggregator, Lookup, Filter, Sequence Generator, Expression, Router, Update
Strategy, Rank, XML SQ/Parser/Generator, Normalizer, etc., to load data from
different sources like Oracle, Flat Files, Excel Spread Sheets, XML, COBOL files to
the target Data Warehouse.
Experience in implementing Type II changes in Slowly Changing Dimension Tables.
Designed and developed the UNIX shell scripts for the automation of ETL jobs.
Performed data validation in the target tables using complex SQLs to make sure all the
modules are integrated properly.
Involved in cleansing raw data in the staging area using stored procedures in pre- and
post-session routines.
Tested and tuned the SQL queries for better performance. Identified the bottlenecks in
mapping logic and resolved performance issues.
Worked closely with Business Intelligence (BI) team and assisted them to develop
reports using Business Objects reporting tool.
Conducted code reviews to make sure the business requirements are met and the coding
standards are followed.
Experience in working with third-party data cleansing tools such as Trillium.
Coordinated with System support team to setup the system test environment for code
migration and code execution process in QA environment.
Environment: Informatica Power Center 8.1/7.1, Erwin, BO XI, SQL Server 2008/2005,
PL/SQL, Shell Scripting, COBOL, IBM MQSeries, Trillium, Autosys, Tomcat, TOAD,
Sun Solaris 2.7, Windows XP.
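The target-table validation described above commonly reduces to reconciling row counts and key aggregates between source and target; here is a minimal sketch, again with sqlite3 and invented table names.

```python
import sqlite3

# Sketch of post-load validation: compare the row count and an amount total
# between a source view and the loaded target table. sqlite3 stands in for
# the warehouse; tables, columns, and data are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_payments (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_payments (id INTEGER, amount REAL)")
rows = [(1, 100.0), (2, 250.5), (3, 75.25)]
cur.executemany("INSERT INTO src_payments VALUES (?, ?)", rows)
cur.executemany("INSERT INTO tgt_payments VALUES (?, ?)", rows)  # the "load"

src_cnt, src_sum = cur.execute("SELECT COUNT(*), SUM(amount) FROM src_payments").fetchone()
tgt_cnt, tgt_sum = cur.execute("SELECT COUNT(*), SUM(amount) FROM tgt_payments").fetchone()
assert (src_cnt, src_sum) == (tgt_cnt, tgt_sum), "source/target mismatch"
print("validated:", tgt_cnt, "rows, total", tgt_sum)
```

Real validation SQL would also compare distinct-key counts and spot-check transformed columns, not just totals.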
Confidential, Maryland Mar 06 - Feb 08
Informatica Consultant
COMSORTer Application: Comsort Inc. is a wholly owned subsidiary of Merck Inc., one of the
world's largest pharmaceutical companies. Comsort creates and sends surveys to physicians to
nominate specialists in their respective fields, which Merck's sales and marketing team uses to
target physicians for drug promotion and to invite them as speakers at physician conferences
where they can promote Merck's drugs. The project involved creating an online application that
Comsort could use to enter the survey data and to generate lists of physicians based on the
surveys and the nominations provided.
Responsibilities:
The COMSORTer application was Oracle-based, and the existing data was stored on SQL
Server and DB2.
Migrated the data from SQL Server and DB2 to Oracle.
Created a mapping document that outlines the sources mapped to the targets.
Created a document outlining the plan of action for the entire process.
Created views to select data from the existing SQL Server databases.
Created DTS packages to generate flat files from the views created.
Designed mappings to load first the Staging tables and then the destination tables.
Designing mappings using transformations such as Source Qualifier, Joiner, Expression,
Lookup, Filter, Router etc.
Created different transformations using Informatica for loading the data into SQL Server
database.
Transferred the data from a combination of different input files like XML, Flat files to
Oracle.
Created, optimized, reviewed, and executed Complex SQL queries to validate
transformation rules used in source to target mappings/source views, and to verify data in
target tables.
Created Functional and Technical Specification documentation, and documented the issues
found in end-to-end testing.
Extensively worked with DBAs during the performance testing phase for our database.
Generated SQL Loader scripts and Shell scripts for automated daily load processes.
Developed triggers and stored procedures for data verification and processing.
Extensively worked on database performance tuning techniques and modifying the
complex join statements.
Identifying bottlenecks, optimizing SQL, reducing unnecessary caches, etc.
Creating workflows with the Event Wait task to control when the workflow loads the
tables: the Event Wait task waits for an indicator file dropped onto the Informatica
server by the ColdFusion front end, then transfers control to the rest of the workflow
to load the data.
Used existing UNIX scripts and modified them to load the Oracle tables.
Designed mappings to load the Surveys, Questions, Projects and other tables related to
Surveys.
Involved in the smooth transition from Informatica 7.1 to Informatica 8.0. Worked as an
Informatica Administrator to migrate the mappings, sessions, workflows, repositories into
the new environment.
Configured and Administered Informatica Servers.
Designed and developed scripts for administrative tasks like backups, tuning and
periodically refreshing the test databases from the production databases.
Created views and designed mappings to load test data for UAT of the application.
Extensively used the Debugger to test data and applied breakpoints.
Provided production support for Business Users and documented problems and solutions
for running the workflow.
Created various geographical and time dimension reports.
Moving the mappings and workflows from Dev to QA and QA to Production
environment and unit testing the process at every level.
Documented detailed steps for migrating the code.
Supporting the application in Production environment by monitoring the ETL process
everyday during the nightly loads.
Conducted knowledge transfer (KT) of the entire process to the production support members.
Environment: Informatica Power Center 8.1/8.0/7.1, SQL Server 2000, MicroStrategy 8, TOAD
7.6, Oracle 10g/9i, SQL Loader, DB2 UDB v8.1, PL/SQL, T-SQL, Erwin, Control-M, UNIX
AIX 4.2, Shell Scripts, Windows XP/2000.
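The Event Wait pattern described above, where a workflow blocks until an upstream process drops an indicator file, can be sketched as a simple polling loop; the file name, timeout, and polling interval are hypothetical.

```python
import os
import tempfile
import time

# Sketch of an Event Wait: poll for an indicator file dropped by an upstream
# process (here, a front-end application), then proceed with the load.
def wait_for_indicator(path, timeout=10.0, poll=0.1):
    """Return True once the indicator file appears, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.exists(path):
            return True
        time.sleep(poll)
    return False

# Demo: the indicator is created up front, so the wait returns immediately.
with tempfile.TemporaryDirectory() as d:
    flag = os.path.join(d, "load_ready.ind")
    open(flag, "w").close()      # upstream process drops the indicator
    if wait_for_indicator(flag):
        os.remove(flag)          # consume the flag so reruns wait again
        print("indicator received - starting load")
```

Informatica's Event Wait task provides this behavior natively; the sketch only illustrates the control flow it implements.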
Confidential, Hyderabad India May 04 - Feb 06
Developer
Ceeyes is a leading provider of intellectual property software cores in areas such as Layer 2/7
networking, wireless, and embedded real-time software. The company has expertise in product
design and development for embedded systems software and system integration.
Project: Annual Maintenance Contract (AMC)
Roles & Responsibilities:
Identified functional requirements, analyzed the system, provided suggestions, designed
as per requirements, and tested the design.
Coordinated with team members in analyzing the business requirements.
Used Agile methodology in the design and development of the application.
Developed conceptual design document with prototyping of UI, involved in estimation
and detailed scheduling of various modules.
Identifying database requirements and was involved in designing of database for various
modules.
Created stored procedures, functions, scripts, and packages for applying the business
rules.
Performance tuning and optimization achieved through the management of indices, table
partitioning, and optimizing the SQL scripts.
Created generic packages useful for other team members.
Environment: PL/SQL, SQL, J2EE (JSP, Servlets), XML, Oracle 9i, UNIX, Windows 2000.
Educational Qualification:
Bachelor's Degree in Computer Science & Information Technology
