
Ramya Chethireddy

Datastage Developer
Over seven years of experience in information technology and software development, with hands-on experience in ETL, data migration, and data warehousing.
Strong work experience with IBM Information Server 8.1, DataStage 7.5.x/7.1/7.0, SQL, PL/SQL, UNIX (AIX, Solaris, RHEL), Oracle 10g/9i/8i, and DB2 UDB; knowledgeable about SQL Server.
Involved in all phases of the SDLC: requirements gathering, development, testing, debugging, and deployment.
Proficient in data analysis, data cleansing, data transformation, and data migration.
Expertise in translating business requirements into data warehouse and data mart designs and developing the corresponding ETL logic in DataStage.
Worked extensively with DataStage stages such as Oracle, DB2, Data Set, Aggregator, Transformer, Merge, Join, Lookup, Sort, Pivot, Change Capture, Remove Duplicates, Sequential File, Copy, Peek, and Shared Containers (Server & Parallel).
Experienced in unit testing, system integration testing, implementation, maintenance, and performance tuning.
Used DataStage Director extensively to validate, run, schedule, monitor, debug, and test applications in development and to obtain performance statistics.
Good knowledge of star schema and snowflake modeling, fact and dimension table design, data marts, and data warehouses.
Proficient in Microsoft technologies, RDBMS, data management, ETL, SQL, and various operating systems.
Proficient in C, C++, HTML, and object-oriented programming.
Experienced in using shell scripts for automation and FTP/SFTP for transferring files between environments.
Excellent oral and written communication skills; able to understand project development issues at all levels, with strong analytical, organizational, presentation, and problem-solving skills.
Technical Skills:
ETL Tools: IBM Information Server 8.1/7.5 (DataStage PX)
Databases: MS Access, MySQL, SQL Server, Oracle 10g/9i/8i, DB2, DB2 UDB
GUI & Other Tools: Toad 7.4, SQL*Plus, SQL*Loader, Aqua Data Studio, AQT, PuTTY, SSH, Test Director, Cognos 7, Business Objects
Programming Languages: C, C++, C#.NET, HTML, XHTML, CSS, PL/SQL, Shell, JavaScript
Application Software: ASP.NET, ADO.NET, UML, Microsoft Visual Studio

Work Experience

Datastage Developer
Epsilon October 2009 to May 2011
Epsilon, an Alliance Data company, is a leading provider of multi-channel marketing services, technologies, and database solutions. Its offerings include marketing technology, direct marketing, strategic consulting, database and loyalty technology, proprietary data, predictive modeling, and email communications and marketing. It serves clients in the consumer packaged goods, financial, insurance, healthcare and pharmaceutical, and retail industries across North America, Europe, Asia, and Australia.

Responsibilities:

Documented business processes and code designs; interacted with business analysts to gather requirements and draft guidelines.
Extensively used DataStage to extract, transform, and load data from different sources, such as databases and flat files, to produce the desired output.
Designed, developed, documented, and tested jobs according to requirements.
Extensively used parallel stages such as Join, Merge, Lookup, Filter, Aggregator, Modify, Copy, Sort, Funnel, Change Data Capture, Remove Duplicates, Surrogate Key Generator, Row Generator, and Column Generator for development and debugging.
Created several job sequences for the maintenance of DataStage jobs using Job Activity, Execute Command, Exception Handler, Job Sequencer, and User Variable stages.
Developed job flows with various logic, and performance-tuned parallel jobs using performance statistics and the dump score.
Loaded the report mart to facilitate report generation and business intelligence.
Built scalable architecture to support effective solutions for clients in various industries.
Created source-to-target mapping documents and developed prototype designs for different types of ETL processes.
Extensively used job parameters and environment variables to keep job designs flexible.
Designed multi-instance job flows and shared containers to promote code reusability and reduce job complexity.
Carried out code migration from the development environment to test, and then to production for deployment.
Followed DataStage best practices and coding standards while designing job flows.
Used custom-developed SQL in various database stages to distribute the load between the database server and the DataStage server.
Developed shell scripts to automate the DataStage job process and scheduled them in the UNIX crontab and JCS to facilitate automated job runs.
Wrote shell scripts using SFTP to transfer files between various client servers and company servers.

Environment: IBM Information Server 8.1, Oracle 10g, RHEL, Cognos, Aqua Data Studio, Windows XP, PuTTY, JCS, FileZilla.
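SFTP transfer scripts like the ones described above commonly build a batch file of commands and hand it to `sftp -b` for a non-interactive run. A minimal sketch, where the directory names, host, and `.dat` file pattern are illustrative assumptions, not details from the résumé:

```shell
#!/bin/sh
# Build an sftp batch file that pushes every .dat extract in a local
# directory to a remote landing directory.
build_sftp_batch() {            # $1 = local dir, $2 = remote dir, $3 = batch file
    : > "$3"                    # truncate/create the batch file
    printf 'cd %s\n' "$2" >> "$3"
    for f in "$1"/*.dat; do
        [ -e "$f" ] || continue # glob matched nothing; skip
        printf 'put %s\n' "$f" >> "$3"
    done
    printf 'bye\n' >> "$3"
}

build_sftp_batch /data/outbound /incoming /tmp/sftp_batch.txt
# The actual transfer would then be a single non-interactive call
# (commented out so the sketch stays side-effect free):
# sftp -b /tmp/sftp_batch.txt etluser@client.example.com
```

Driving `sftp` from a batch file keeps the transfer scriptable from cron and makes the command list easy to log and retry.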

Datastage Developer
Citco Fund Services November 2008 to August 2009
Citco Fund Services is a global leader offering a complete range of accounting, administration, shareholder, and corporate services to hedge funds and collective investment schemes. Citco Bank, well known across Europe and the Americas, launched a pilot project named "Nextgen bank database" to process the information it receives from various commercial data sources (CDS).

Responsibilities:
Involved in developing job naming standards and modifying the ETL mapping document according to the requirements.
Developed DataStage jobs to parse the CDS dump file and create individual source files for the subsequent jobs.
Developed parameter-driven ETL jobs; defined reference lookups, joins, aggregations, constraints, and derivations to populate the final tables in the target.
Developed delete-record jobs to remove processed records from the database based on a time comparison.
Created a master-level controlling sequencer for the loading and deleting jobs using the DataStage job sequencer.

Involved in performance tuning of the jobs by carefully monitoring visual performance statistics and the new monitor in the Director.
Created effective test data by modifying source files; involved in unit testing and end-to-end testing of the design.
Populated the DataStage repository by importing source file schemas and target database schemas using DataStage Manager, and created new job categories in the project.
Extensively used the Director to validate and run jobs with run-time parameters, carefully analyzing the job log to fix defects.
Worked with the Administrator client component to create the project and users and to grant user rights.

Environment: Ascential DataStage 7.5.1, Oracle 9i, Erwin 4.1, Windows XP, Solaris 2.x, PuTTY, SQL*Plus.

Datastage Developer
Phoenix Insurance - Phoenix, AZ January 2008 to September 2008
Phoenix Insurance launched a project to develop a comprehensive enterprise data warehouse called the "Life and Annuity Financial Management Data Warehouse" (LAW). The primary purpose of the LAW project is to deliver data from six administrative systems to the enterprise repository, from which a MARTS layer is built to provide OLAP cubes, reports, and ad-hoc querying capabilities to business end users.

Responsibilities:
In charge of multiple admin systems end to end: extracting data from CSV trailer files, transforming it according to business requirements, and loading it to the target.
Designed jobs involving various cross-reference lookups and joins, and shared containers reusable across multiple jobs.
Created job-level sequencers that include multiple jobs, and a layer-level sequence that includes all job-level sequences.
Extensively used DataStage Director to validate, run, schedule, and monitor jobs, and followed the job log carefully to debug them.
Carefully monitored performance statistics and fine-tuned jobs for improved processing time.
Developed UNIX scripts for file transfers (FTP) and for calling DataStage jobs.
Involved in fine tuning, troubleshooting, bug fixing, defect analysis, and enhancement of the multiple admin systems' DataStage jobs.
Involved in designing the marts and the dimension and fact tables.

Environment: Ascential DataStage 7.5.1, DB2 UDB 8.1.9, Erwin 4.1, Windows NT 4.0/2000, IBM AIX 5.2, Control Center, PuTTY.
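Calling DataStage jobs from UNIX scripts, as mentioned above, is typically done through the engine's `dsjob` command-line client. A sketch of such a wrapper; the `DSHOME` path, project name, job name, and parameter are hypothetical placeholders, and the command is echoed rather than executed so the sketch stays inert:

```shell
#!/bin/sh
# Assumed engine location; real installs vary by DataStage version.
DSHOME="${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}"

# Build a dsjob invocation: -run starts the job, -jobstatus makes the
# exit code reflect the job's finishing status.
run_ds_job() {                  # $1 = project, $2 = job, rest = name=value params
    project="$1"; job="$2"; shift 2
    cmd="$DSHOME/bin/dsjob -run -jobstatus"
    for p in "$@"; do
        cmd="$cmd -param $p"
    done
    cmd="$cmd $project $job"
    echo "$cmd"                 # print the command instead of launching it
    # eval "$cmd"               # uncomment to actually run the job
}

run_ds_job LAW_PROJ SeqLoadPolicy RUN_DATE=2008-06-30
```

Using `-jobstatus` lets the calling script (or a scheduler) branch on the job's success or failure via the shell exit code.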

ETL Developer
E-Matrix Software Solutions - Hyderabad, ANDHRA PRADESH, IN July 2005 to November 2007
As part of the E-Matrix technology delivery team, worked on a data management initiative for a growing pharmaceutical company. The primary purpose of this project is to deliver data from different source systems to the enterprise data repository, from which a MARTS layer is built to provide reporting capabilities to business users.

Responsibilities:
Involved in requirements gathering and in designing the ETL mapping document and the end-to-end process flow from source system to target database.

Extensively used DataStage to extract data from different source systems, comprising flat files of various formats and databases, using customized SQL.
Performed data validation, transformation, cleansing, and parsing of source data according to business requirements.
Used stages such as Oracle, Lookup, Join, Filter, Aggregator, Transformer, Merge, Remove Duplicates, Sort, and Sequential File.
Involved in thorough unit testing, SIT, and UAT of the fully developed code, resolving defects raised during the various testing phases and modifying the code accordingly.
Used DataStage Manager to import schemas, build the repository, and migrate code between environments.
Developed UNIX scripts for file manipulation, secure file transfers (SFTP), and automation of DataStage jobs.
Designed and executed DDL to create tables, and prepared SQL queries to evaluate the loaded data for testing purposes.
Worked closely with a Cognos developer on report design and provided SQL assistance to retrieve data from the data mart.

Environment: IBM DataStage 7.5, Oracle, RHEL, Cognos, Aqua Data Studio, Crontab, Microsoft Office, Shell Scripting.

SQL Developer/Programmer Analyst


iLogic Technologies - Hyderabad, ANDHRA PRADESH, IN February 2004 to June 2005
Responsibilities:
As part of the development team, involved in various phases of the development cycle, performing multiple tasks.
Gathered requirements from the client and performed gap analysis to identify potential issues.
Designed and executed DDL to create tables and loaded them using SQL*Loader.
Wrote SQL to perform data profiling, extract data from tables, and perform joins.
Maintained the directory structure in Linux and performed data management by archiving old data.
Developed simple shell scripts to transfer files between servers using SFTP.
Scheduled shell scripts in crontab to facilitate automated job runs.
Designed SQL to extract data from tables and provide customized reports to the client according to requirements.

Environment: Oracle, SQL*Plus, Linux, MS Office, PuTTY
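The "archiving old data" housekeeping mentioned above is often a small cron-driven script built around `find -mtime`. A minimal sketch; the directory names and the 30-day retention window are illustrative assumptions:

```shell
#!/bin/sh
# Move data files older than a retention window (in days) from a working
# directory into an archive directory.
archive_old_files() {           # $1 = data dir, $2 = archive dir, $3 = days
    mkdir -p "$2"
    # -maxdepth 1: only this directory; -mtime +N: modified more than N days ago
    find "$1" -maxdepth 1 -type f -mtime +"$3" -exec mv {} "$2"/ \;
}

# Usage: archive anything in /data/reports untouched for 30 days.
# archive_old_files /data/reports /data/archive 30
```

A script like this would then be run unattended via a crontab entry along the lines of `30 2 * * * /home/etl/bin/archive_old.sh` (path and schedule hypothetical).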

Additional Information

Operating Systems: AIX 5.2, Windows 2000/NT/XP, MS-DOS, Mac OS X, RHEL 4.x, Sun Solaris 2.x
Scheduling Tools: Crontab, JCS, Autosys
