
DURGA AKELLA

DataStage (ETL) Developer


Cell # +91 94404 71503
Email: durga_akella@yahoo.co.in

SUMMARY:
3+ years of work experience in Data analysis, design, development and
implementation of Data warehousing applications using IBM DataStage (ETL) in
DW/BI Technologies.

Core Qualifications:
3 years of core experience in DataStage development.
Working knowledge of SQL, UNIX shell scripting, data warehousing concepts and PL/SQL coding.
Experience with QualityStage.
Strong problem-solving skills.
Proven ability to work in high-pressure situations.
Open to learning and using new systems across platforms.
Excellent communication, interpersonal and analytical skills, and a strong
ability to perform as part of a team.

TECHNICAL SKILLS:
Extensive experience in designing, developing, documenting and testing ETL
jobs and mappings, in both server and parallel jobs, using DataStage to populate
tables in data warehouses and data marts.
Proficient in developing Extraction, Transformation and Loading (ETL) strategies.
Strong understanding of the principles of Data Warehousing using fact
tables, dimension tables and star/snowflake schema modeling.
Expert in designing parallel jobs using various stages such as Join, Merge, Lookup,
Remove Duplicates, Filter, Data Set, Lookup File Set, Complex Flat File, Modify,
Aggregator and XML.
Expert in designing Server jobs using various types of stages like Sequential
file, ODBC, Aggregator, Transformer, Sort, Link Partitioner and Link Collector.
Enjoy challenging and thought-provoking work, with a strong desire to learn
and progress (motivated to self-learn) and the ability to pick up new
technology independently.
Expertise in UNIX shell scripting (KornShell) for the automation of processes
(a brief sketch appears after the Tools list below).
Proficient in writing, implementing and testing triggers, procedures and
functions in PL/SQL.
Wrote moderate to complex queries, including views, for retrieving data
from databases.

Tools: IBM InfoSphere DataStage v8.5 and v9.1, QualityStage, UNIX, PL/SQL, DB2,
Oracle, shell scripting, C, data warehousing.
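
The UNIX shell scripting noted above is the kind of process automation illustrated
by the following minimal KornShell sketch, which polls a landing directory for an
expected daily source file before downstream processing starts; the file name, paths
and wait limits are hypothetical placeholders.

    #!/bin/ksh
    # Minimal illustrative sketch; file name, paths and wait limits are hypothetical.

    LANDING=/data/etl/landing
    EXPECTED_FILE=source_$(date +%Y%m%d).txt
    MAX_TRIES=12        # poll up to 12 times
    SLEEP_SECS=300      # wait 5 minutes between polls

    tries=0
    while [[ ! -s "$LANDING/$EXPECTED_FILE" ]]; do
        tries=$((tries + 1))
        if [[ $tries -gt $MAX_TRIES ]]; then
            echo "$EXPECTED_FILE did not arrive in time" >&2
            exit 1
        fi
        sleep $SLEEP_SECS
    done

    echo "$EXPECTED_FILE is available; downstream processing can start."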

WORK EXPERIENCE

Software Engineer at Tech Mahindra, Hyderabad (09/2013-present)

Project 1 Details:
Client : Bank of Nova Scotia, Canada
Team Size : 04
Technology : DataStage, QualityStage, UNIX, SQL
Database : DB2
Role : DataStage Developer

Project 1 Summary:

Scotiabank is the business name of the Bank of Nova Scotia in Canada. The project
is FINTRAC, which is part of the Anti-Money Laundering unit. FINTRAC (Financial
Transactions and Reports Analysis Centre of Canada) is a Canadian government
agency that requires banks and other designated institutions to report particular
types of transactions for its analysis.
Our project is to transform data as per FINTRAC requirements and provide files to
the downstream system that reports to FINTRAC. We also had a requirement to use
QualityStage in this project. As the project started from scratch, I had a lot of
scope to learn, and I had direct interaction with the client throughout the project.

Roles & Responsibilities:

Understanding the business functionality and analysing the client's business
requirements by interacting with the client.
Worked on DataStage tools such as DataStage Designer, DataStage Director and
DataStage Administrator.
Provided technical support to the team as the ETL developer and promoted
best practices.
Used QualityStage stages such as Standardize in the jobs to ensure data quality.
Created jobs in DataStage to import data from heterogeneous data sources
such as DB2, text files and SQL Server.
Involved in creating and maintaining job sequences.
Acted as an SME for the testing teams and extended support in providing KT
(knowledge transfer) of business requirements.
Extensively worked on job sequences to control the execution of the job flow
using various activities and triggers (conditional and unconditional), such as Job
Activity, Wait For File, Email Notification, Sequencer, Exception Handler and
Execute Command.
Created various standard/reusable, simple to complex jobs in DataStage using
various active and passive stages such as Sort, Lookup, Filter, Join, Transformer,
Aggregator, Funnel, Column Import, Sequential File and Data Set.
Created UNIX shell scripts to run the DataStage jobs, to pull source files from
and push extract files to the remote server, and to archive and back up the
data (a minimal sketch follows this list).
Prepared unit test case documents and the GDD.
Also worked on maintenance releases of the project after production
implementation.
Analysed defects in production, provided solutions to business users and
worked on fixes.
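
The shell-script item above refers to wrapping the DataStage command-line interface;
the following minimal KornShell sketch illustrates that pattern. The project, job,
host and path names are hypothetical placeholders, and the dsenv location shown is
only a typical engine-tier install path.

    #!/bin/ksh
    # Minimal illustrative sketch; project, job, host and path names are hypothetical.

    # Source the DataStage environment (typical engine-tier install path).
    . /opt/IBM/InformationServer/Server/DSEngine/dsenv
    DSBIN=/opt/IBM/InformationServer/Server/DSEngine/bin

    PROJECT=FINTRAC_PROJ               # hypothetical DataStage project
    JOB=seq_Load_Daily_Extract         # hypothetical sequence job
    RUNDATE=$(date +%Y%m%d)
    LANDING=/data/etl/landing
    EXTRACT=/data/etl/extract
    ARCHIVE=/data/etl/archive
    LOGDIR=/data/etl/logs

    # Pull today's source file from the remote server (placeholder host and path).
    scp etluser@sourcehost:/outbound/source_${RUNDATE}.txt "$LANDING/"

    # Run the DataStage job and wait for it; with -jobstatus the dsjob exit code
    # reflects the job status (conventionally 1 = finished OK, 2 = finished with warnings).
    "$DSBIN/dsjob" -run -jobstatus "$PROJECT" "$JOB" > "$LOGDIR/${JOB}_${RUNDATE}.log" 2>&1
    STATUS=$?

    if [[ $STATUS -eq 1 || $STATUS -eq 2 ]]; then
        # Push the extract produced by the job to the downstream server.
        scp "$EXTRACT/extract_${RUNDATE}.dat" etluser@downstreamhost:/inbound/

        # Archive and compress the processed source file for backup.
        mv "$LANDING/source_${RUNDATE}.txt" "$ARCHIVE/" && gzip "$ARCHIVE/source_${RUNDATE}.txt"
    else
        echo "DataStage job $JOB failed with status $STATUS" >&2
        exit 1
    fi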

Project 2 Details:
Client : Scotiabank, Canada
Team Size : 03
Technology : DataStage, UNIX, SQL
Database : DB2
Role : DataStage Developer

Project 2 Summary:

Scotiabank is the business name of the Bank of Nova Scotia in Canada. The project
is RDC, which is part of the Anti-Money Laundering unit. The Remote Deposit Capture
(RDC) project is an initiative within the Operational Efficiencies & Digitization,
Banking Solution, under the Canadian Paper Cheque Processing Program.
The implementation of Remote Deposit Capture across channels gives customers a
more robust choice of deposit methods and increased convenience when using
cheques, but it also gives those with malicious intentions more methods and
opportunities for fraud. To mitigate the risk of fraud associated with this new
feature, Scotiabank has partnered with FIS and will be deploying the FIS Memento
Bank Data Manager for RDC cheque fraud prevention, producing fraud alerts and
analytics from bank data files.
Our project is to transform data as per FIS Memento Bank Data Manager requirements
and provide files to the downstream system that produces fraud alerts and analytics
from bank data files. As the project started from scratch, I had a lot of scope to
learn, and I had direct interaction with the client throughout the project.

Roles & Responsibilities:


Understanding the client's requirements by interacting with the client.
Designed simple to complex jobs using the DataStage tool.
Involved in creating and maintaining job sequences.
Extensively worked on job sequences to control the execution of the job flow
using various activities and triggers (conditional and unconditional), such as Job
Activity, Wait For File, Email Notification, Sequencer, Exception Handler and
Execute Command.
Created UNIX shell scripts to run the DataStage jobs, to pull source files from
and push extract files to the remote server, and to archive and back up the
data.
Prepared unit test case documents and the GDD.
Also worked on maintenance releases of the project after production
implementation.
Analysed defects in production, provided solutions to business users and
worked on fixes.

EDUCATION:
Bachelor of Technology in Electronics & Communication Engineering;
Percentage: 68.16%; (2010 to 2013)
Kakinada Institute of Engineering and Technology, Korangi, Kakinada

Engineering Diploma in Electronics & Communication Engineering;
Percentage: 76.03%; (2007 to 2010)
Govt. Polytechnic College for Women, Kakinada

SSC: Blue Bells School, Kakinada; Percentage: 84.67%; (2006 to 2007)

SCHOLARSHIPS AND AWARDS:

Received the BRAVO (May 2015) and PAT ON BACK (Dec 2015) Star Awards in
appreciation of my work at Tech Mahindra.
Received the top rating for two consecutive years.
Received many appreciation mails from the client.
Secured the Aryabhatta Prathibha Puraskaram Award 2007 at the regional level.
ECE diploma supported for 3 years by the Mahindra & Mahindra Educational
Trust on a merit basis.
NSF & Dhanwantari Foundation scholarship holder
