
VENKATA ANURADHA VARANASI

Ph: (972) 788-5044

Email: varanasi.anuradha1@gmail.com

SUMMARY

• Over 7 years of IT and data warehousing experience, including business/software analyst roles.

• Self-motivated and results-oriented, with a desire for constant learning.

• Strong analytical, pattern-detection, and problem-solving abilities.

• Ability to meet deadlines and handle pressure while coordinating multiple tasks.

• Conducted training programs and mentored team members, with an observed increase in performance.

• Strong customer-interfacing skills and customer orientation.

• Comprehensive knowledge of the Software Development Life Cycle (SDLC), with a thorough understanding of phases such as Requirements, Analysis/Design, Development, and Testing.

• Extensive experience in gathering, managing, and documenting business requirements, transforming them into functional requirements, and building test plans and test cases.

• Experience creating reports against large databases such as DB2, Oracle, and Oracle Financials.

• Expertise in documentation and training using Word and PowerPoint.

• Extensive Experience and Knowledge in Mainframes.

• Strong knowledge of ETL design and development using Ab Initio and Informatica.

TECHNICAL SKILL SET:

ETL Tools: Ab Initio GDE 1.15.5, Co>Operating System 2.15, Informatica 7.1

Databases: DB2, Oracle 8.x/9.x, Teradata, MS SQL Server 2000/7.0, MS Access

Languages: SQL, PL/SQL, Java, C, C++, VB 6.0

Operating Systems: z/OS, Mainframes, Windows 9x/ME/2000/XP, MS-DOS, UNIX, AIX

Reporting Tools: MS Excel, Oracle Forms 4.5 and Reports 2.5, Business Objects, Web Intelligence

Packages/Others: MS Office 97/2003, Acrobat Reader, WSA, FTP

Mainframe Technologies: JCL, SAS, QMF, SPUFI, BMC for DB2, PLATINUM, FILE-AID for DB2, FILE-AID for datasets

EDUCATION: Master of Science in Nuclear Physics, India.

Post Graduate Course in Computer Applications (PGCCA), CMC Ltd, India.

PROFESSIONAL EXPERIENCE

Client: Bank of America (Financial/Performance Management), Charlotte, NC

Software/Data Analyst and Sr. ETL Developer Date: Oct '09 – Present
Project description:
The Account Level Profit Metric and Customer Level Profitability Reporting Alignment applications extract data from various sources (Teradata, Sybase, Oracle, DB2, and GDG files) located on mainframe and midrange servers for BANK CARD, one of Bank of America's largest products, apply transformation rules to various finance fields, and load the results into the Oracle Financials database. The loaded data is tested at several stages and business rules are applied before promotion to production. An outbound process then loads the data into the warehouse.

Responsibilities:

• Gathered the requirements, interacted with users for functional and technical specifications.

• Analyzed, designed, and enhanced preexisting processes, and built new processes per the requirements document.

• Extensively used Ab Initio with EME, SQL and UNIX scripts

• Built various reports for the users to analyze the data

• Performed unit, system, and user acceptance testing.

• Identified gaps in the existing process, designed new processes, and identified areas where the process could be automated to reduce cost and improve efficiency.

• Tuned graphs and SQL queries for maximum efficiency and performance.

• Provided 24x7 production support.

• Documented the process end to end.

Environment: Ab Initio 2.15, Flat Files, GDG, DB2, UNIX, AIX, SQL, SQL*Plus, SQL Assistant for Teradata

Client: Bank of America (TCOE Data Management), Dallas, TX

Data Analyst and Sr. Ab Initio Developer Date: Mar '06 – Oct '09

Project description:

This group manages the data in all test platforms. This involves extracting data from production on a periodic basis, based on selection criteria defined by various application teams, downsizing the volume, and fictionalizing/masking Non-Public Personal Information (NPPI) with test values so that customer information is protected. The group is also responsible for handling data-related defects, data conditioning, generating reports for on-the-fly requests, and data prep activities.

Responsibilities:
• Involved in requirements gathering, analysis, design, test, implementation, user training and
documentation.
• Wrote test plans and test cases for process changes and for the team.
• Identified and bridged gaps in the existing process, designed new processes, and identified areas where the process could be automated to reduce cost and improve efficiency.
• Analyzed business requirements to understand what the business does and how it does it.
• Designed IT system features by creating functional specifications and implemented new features through technical design.
• Designed and developed various reports for TCOE Data Management using SQL.
• Developed highly generic graphs for code reusability, driven entirely by parameters.
• Developed graphs to ensure application referential integrity (RI) is maintained after fictionalization.
• Extensively used VSAM, GDG, Flat files and DB2 tables.
• Developed graphs with various Ab Initio components such as Reformat, Join, Rollup, Lookup, Normalize, and Scan to perform the required business transformations.
• Developed UNIX shell scripts as a wrapper for file manipulation and to automate the process.
• Provided 24x7 support for data-related issues.
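One of the bullets above mentions UNIX shell wrapper scripts for file manipulation and process automation. A minimal sketch of that wrapper pattern follows, shown in Python rather than shell for readability; the directory names, file pattern, and job command are hypothetical illustrations, not the project's actual scripts:

```python
import shutil
import subprocess
from pathlib import Path

def run_with_wrapper(inbox, work, archive, reject, job_cmd):
    """Stage each incoming file, run the underlying job on it, then file
    the input under archive/ on success or reject/ on a nonzero exit."""
    for p in (work, archive, reject):
        Path(p).mkdir(parents=True, exist_ok=True)
    for f in sorted(Path(inbox).glob("*.dat")):      # hypothetical pattern
        staged = Path(work) / f.name
        shutil.move(str(f), str(staged))             # stage the input file
        result = subprocess.run(job_cmd + [str(staged)])  # run the real job
        dest = Path(archive) if result.returncode == 0 else Path(reject)
        shutil.move(str(staged), str(dest / staged.name))
```

The same stage/run/archive-or-quarantine flow is what a shell wrapper around an Ab Initio or JCL-submitted job typically automates.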
Environment: Ab Initio 2.13/2.14, Flat Files, VSAM, GDG, DB2, UNIX, BMC, Platinum, QMF, SQL, JCL, FILE-
AID for MVS datasets, FILE-AID for DB2, DB2I.

Client: COCA-COLA, Atlanta, GA

ETL Developer Date: Sep '04 – Dec '05


Project description:

This project was developed to provide a common solution that multiple Global Customer Teams could use to analyze point-of-sale (scan) data as well as data collected from other sources. Data can come from sales of items as well as Fountain, Hydration, and Minute Maid products. The initial implementations, however, focused on retail scan data.

Responsibilities:

• Involved in the requirement definition and analysis in support of Data Warehousing efforts.

• Involved in designing the procedures for getting the data from all systems to Data Warehousing
system.

• Standardized the data to store various Business Units' information in tables.

• Configured Workflow Manager tasks such as Assignment, Command, Control, Event, Timer, and Decision.

• Created various transformations such as Update Strategy, Look Up, Joiner, Filter and Router
transformations.

• Developed mappings to load data into slowly changing dimensions.

• Used Type 2 mappings to update slowly changing dimension tables while keeping full history.

• Created sessions and batches for various mappings in Informatica Server Manager and optimized the loading process with partitioning and caching.
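The Type 2 slowly-changing-dimension bullets above describe expiring the current row and inserting a new versioned row instead of overwriting history. A hedged Python sketch of that logic follows; the table layout, the single tracked attribute `city`, and the dates are illustrative assumptions, not the project's actual dimension design:

```python
from datetime import date

# Open-ended "current row" end date, a common SCD Type 2 convention.
HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dimension, incoming, today):
    """Merge incoming source rows into the dimension, Type 2 style:
    expire the current version of a changed row and append a new one."""
    for row in incoming:
        current = [d for d in dimension
                   if d["key"] == row["key"] and d["end_date"] == HIGH_DATE]
        if current and current[0]["city"] == row["city"]:
            continue                       # no change; keep current row
        if current:
            current[0]["end_date"] = today  # expire the old version
        dimension.append({"key": row["key"], "city": row["city"],
                          "start_date": today, "end_date": HIGH_DATE})
    return dimension

dim = [{"key": 1, "city": "Dallas", "start_date": date(2004, 1, 1),
        "end_date": HIGH_DATE}]
apply_scd2(dim, [{"key": 1, "city": "Atlanta"}], date(2005, 6, 1))
# dim now holds two rows for key 1: the expired Dallas row and the
# current Atlanta row, so full history is retained.
```

In Informatica this expire-and-insert decision is what an Update Strategy transformation drives inside a Type 2 mapping.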

Environment: Informatica PowerCenter 7.1, Oracle 7.x, SQL Server, UNIX, PL/SQL, Windows NT.

CLIENT: Health Care of America (HCA), Nashville, TN

ETL Developer Date: Nov’03 – Aug’04

Project description:

This project involved modifying and testing the existing BI environment to incorporate new Order Change History functionality from EOM (Enterprise Output Management). The requirement was aimed at improving the quality of orders in EOM and their fulfillment. Created a data mart for Order Change History using Informatica 6.1. The project improves the transfer rate of orders, resulting in earlier cash generation, and helps measure order quality and on-time transfer for HCA.

Responsibilities:

• Involved in requirements gathering to identify the dimensions and measures needed for cube modeling; performed systems study and analysis.

• Worked with Informatica tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations) and modeled data warehouse data marts using a star schema.

• Performed configuration and connectivity of the target Informatica server with Informatica Server Manager.

• Designed standards, policies, procedures, and best practices for using Informatica.

• Identified the significant fields to extract and was extensively involved in extracting from flat files and Oracle RDBMS according to the business specifications.

• Worked extensively with complex mappings using Expression, Aggregator, Filter, and Lookup transformations and stored procedures to feed the data warehouse.

• Created mappings using reusable transformations and mapplets, and Source Qualifier queries to filter the source data.

• Tuned mapping logic to provide maximum efficiency and performance.

• Created tasks and workflows in the Task Developer and Workflow Designer provided with Workflow Manager, and monitored the workflows in Workflow Monitor.

Environment: Informatica PowerCenter 6.1, Windows 2000


CMC Ltd, INDIA

Trainee, Oracle Application Developer Date: May '00 – May '02

Database Developer Date: Jun '02 – May '03

Responsibilities:

• Provided technical support for a team, performing technical and quality reviews of program code and PL/SQL blocks for optimization and for maintaining standards and guidelines.

• Developed database triggers to enforce complicated business logic and integrity constraints and to enhance data security at the database level.

• Developed custom forms to view and edit the data in custom interface tables and event-handling tables using template forms.

• Worked on the complete database architecture, including developing stored procedures and functions to create shortcuts for frequently repeated tasks.

• Promoted new code to production and performed unit and system testing. Created reports using Oracle Reports.

• As an Oracle implementation technical resource, wrote custom forms and tuned custom reports using hints.

Environment: Oracle 7.1, SQL, SQL*Plus, PL/SQL, Developer 2000 (Forms 4.5, Reports 2.5)

EMPLOYMENT STATUS: CITIZEN

References will be provided on request