
Summary

- 8 years of experience in IT, including implementation of data warehousing projects with Teradata.
- Strong understanding of the data warehouse project development life cycle; expertise in Teradata database design, implementation and maintenance, mainly in data warehouse environments.
- 5+ years of experience working with Teradata SQL Assistant, Teradata Administrator, PMON, and utilities such as BTEQ, FastLoad, MultiLoad, XML import and FastExport; exposure to TPump on UNIX/Windows/Mainframe environments and to running batch processes for Teradata CRM.
- Understanding of the Teradata MPP architecture (shared-nothing design, Nodes, AMPs, BYNET, partitioning, primary indexes, etc.) and 3+ years of experience in Teradata production support.
- Good working knowledge of Teradata V2R5/V2R6/12.0 and sound knowledge of Oracle 10g/11g, MS SQL Server 2000, MS Access 2000, PL/SQL, SQL*Plus and DB2 7.0.
- In-depth understanding and usage of Teradata OLAP functions; proficient in Teradata SQL, stored procedures, macros, views and indexes (primary, secondary, PPI, join indexes, etc.).
- Expert in the ETL tool Informatica, including Power Center, Power Exchange, Power Connect, Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server Administration Console, IDE (Informatica Data Explorer) and IDQ (Informatica Data Quality).
- Experienced with Informatica features such as pushdown optimization and partitioning; implemented Slowly Changing Dimension Type 1 and Type 2 methodology to preserve the full history of account and transaction information.
- Experienced in creating complex mappings and developing strategies for Extraction, Transformation and Loading (ETL) using Informatica 8.x.
- 4+ years of experience working with Crystal Reports, business intelligence tools (Business Objects) and Informatica as an ETL tool.
- Expertise in gathering, analyzing and documenting business requirements, functional requirements and data specifications for Business Objects universes and reports.
- Experienced in developing Desktop Intelligence, Web Intelligence / Rich Client and Crystal Reports against different data sources.
- Skilled in large-scale, multi-terabyte initial database loads and ongoing update techniques, backup and recovery requirements, SQL, etc.
- Provided performance tuning and physical/logical database design support on projects for Teradata systems.
- Strong data modeling experience in ODS and dimensional data modeling methodologies such as star schema and snowflake schema.
- Designed and developed OLAP models consisting of multidimensional cubes and drill-through functionality for data analysis.
- Involved in an upgrade from V2R5 to V2R6; implemented efficiencies in existing processing by utilizing the performance enhancements delivered in this upgrade.
- Proven record of success in the design, development and implementation of software applications using object-oriented technology.
- Good exposure to multiple flavors of UNIX, including Sun Solaris, IBM AIX and HP-UX; well versed in writing UNIX shell scripts.
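The summary above mentions implementing Slowly Changing Dimension Type 2 to preserve full history. As an illustration only, a minimal sketch of the expire-then-insert Type 2 pattern in Teradata SQL; the table and column names (dim_account, stg_account, balance_tier, etc.) are hypothetical, not taken from any project described here.

```sql
-- Hypothetical SCD Type 2 sketch: expire the current row, then insert the new version.

-- Step 1: close out current rows whose tracked attribute changed in the staging feed.
UPDATE dim_account
SET end_date = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE current_flag = 'Y'
  AND account_id IN (
      SELECT s.account_id
      FROM stg_account s
      JOIN dim_account d
        ON d.account_id = s.account_id
       AND d.current_flag = 'Y'
      WHERE d.balance_tier <> s.balance_tier);

-- Step 2: insert new versions for changed accounts (just expired above)
-- and brand-new accounts; unchanged accounts still have a current row and are skipped.
INSERT INTO dim_account (account_id, balance_tier, eff_date, end_date, current_flag)
SELECT s.account_id, s.balance_tier, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_account s
LEFT JOIN dim_account d
  ON d.account_id = s.account_id
 AND d.current_flag = 'Y'
WHERE d.account_id IS NULL;
```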

Education /Certifications/Training
Teradata Certified Professional V2R5

Technical Skills
Operating System: UNIX, Sun Solaris, WINDOWS 98/XP/2000, Mainframes

Databases/Sources: Teradata V2R5/V2R6/12.0, Oracle 10g/11g, SQL Server 2008/2005, DB2, Sybase, XML
Programming Languages: C, C++, Shell Scripting (K-Shell, C-Shell), SQL
Tools & Utilities: Teradata SQL Assistant, Teradata Administrator, Queryman, BTEQ, MultiLoad, FastLoad, FastExport, TPump
ETL/Reporting Tools: Business Objects XI 3.1 (R3.1)/R2, Crystal Reports XI, Informatica 8.x
Data Modeling: Erwin, Visio
Web Technologies: HTML/XHTML, XML, JavaScript, CSS, Dreamweaver, PHP

Professional Experience
XXXX, Richmond VA ETL / Teradata Developer/DBA Mar 2010 - Till Date

XXXX is a relatively new company that emerged out of Philip Morris; the strategic reasoning for the creation of this entity is a matter of speculation (the company's webpage provides one explanation). The rebranding of YYYYY Companies to XXXX began in 2003.

Responsibilities:
- Designed the ETL process to extract, transform and load data from OLAP sources into the Teradata data warehouse.
- Used the BTEQ and SQL Assistant front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
- Performed data transformations using various Informatica transformations such as Union, Joiner, Expression, Lookup, Aggregator, Filter, Router, Normalizer and Update Strategy.
- Performance-tuned Informatica mappings, mapplets and reusable transformations, and analyzed the target-based commit interval for optimum session performance.
- Developed data extraction, transformation and loading jobs from flat file, Oracle, SAP and Teradata sources into Teradata using BTEQ, FastLoad, MultiLoad and stored procedures.
- Designed process-oriented UNIX scripts and ETL processes for loading data into the data warehouse.
- Using stored procedures, created database automation scripts to create databases in different environments.
- Generated space reports in Teradata Manager to analyze different kinds of issues.
- Provided ongoing support by developing processes and executing object migrations, setting up security and access privileges, and performing active performance monitoring.
- Used Visual Explain, Index Wizard and Statistics Wizard to tune bad queries, analyze their plans and implement the recommendations to gain performance.
- Tuned Teradata performance via EXPLAIN, PPI, AJIs, indexes, collecting statistics and rewriting code.
- Developed BTEQ scripts to load data from the Teradata staging area to the Teradata data mart.
- Developed scripts to load high-volume data into empty tables using the FastLoad utility.
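FastLoad, mentioned above for high-volume loads into empty tables, is driven by a control script. A hedged sketch of such a script follows; the TDPID, credentials, database, table and file names are all illustrative placeholders.

```sql
/* Hypothetical FastLoad control script: bulk-load a pipe-delimited flat file
   into an empty staging table. All object names are placeholders. */
LOGON tdprod/etl_user,password;

DROP TABLE stage_db.err_sales_1;
DROP TABLE stage_db.err_sales_2;

BEGIN LOADING stage_db.sales_stg
  ERRORFILES stage_db.err_sales_1, stage_db.err_sales_2
  CHECKPOINT 100000;

SET RECORD VARTEXT "|";
DEFINE sale_id  (VARCHAR(18)),
       sale_dt  (VARCHAR(10)),
       amount   (VARCHAR(18))
  FILE = /data/in/sales.dat;

INSERT INTO stage_db.sales_stg (sale_id, sale_dt, amount)
VALUES (:sale_id, :sale_dt, :amount);

END LOADING;
LOGOFF;
```

The two error tables capture constraint and conversion rejects; the checkpoint interval lets a failed load restart without reprocessing the whole file.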
- Created reports using BO functions such as drill-down, prompts, and dimension and measure variables to show accurate results.
- Wrote SQL queries and matched the data between the database and reports.
- Tuned and enhanced universes with SQL queries to improve report performance.
- Created complicated reports, including sub-reports, graphical reports, and formula-based and well-formatted reports, according to user requirements.
- Worked with SSA requestor responsibilities assigned for both project and support requests.
- Worked on different data stores and file formats in web services.
- Used the FastExport utility to extract large volumes of data at high speed from the Teradata warehouse.
- Performed performance tuning of Teradata SQL statements against huge volumes of data.
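The tuning work described above leans on EXPLAIN plans and collected statistics. A small illustration of that loop, with hypothetical table and column names:

```sql
-- Inspect the optimizer's plan for a slow query (names are illustrative).
EXPLAIN
SELECT c.region, SUM(s.amount)
FROM sales_fact s
JOIN customer_dim c
  ON c.cust_id = s.cust_id
GROUP BY c.region;

-- If the plan reports low-confidence row estimates, refresh statistics on the
-- join and grouping columns, then re-run EXPLAIN and compare the plans.
COLLECT STATISTICS ON sales_fact   COLUMN (cust_id);
COLLECT STATISTICS ON customer_dim COLUMN (cust_id);
COLLECT STATISTICS ON customer_dim COLUMN (region);
```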

- Created FastLoad, FastExport, MultiLoad, TPump and BTEQ scripts to load data from the Oracle database and flat files into the primary data warehouse.
- Developed a number of Informatica mappings, mapplets and transformations to load data from relational and flat file sources into the data mart.
- Created UNIX scripts for various purposes such as FTP transfers, archiving files and creating parameter files; the scripts were run through UNIX shell programs in batch scheduling.
- Created procedures to delete duplicate records from warehouse tables.
- Used Informatica debugging techniques to debug the mappings, and used session log files and bad files to trace errors that occurred while loading.
- Responsible for troubleshooting, identifying and resolving data problems; worked with analysts to determine data requirements, identify data sources and provide estimates for task duration.
- Gathered information from different data warehouse systems and loaded it into the warehouse using FastLoad, FastExport, XML import, MultiLoad, BTEQ, Teradata Parallel Transporter (TPT) and UNIX shell scripts.
- Generated Business Objects reports involving complex queries, subqueries, unions and intersections.
- Involved in unit testing, systems testing, integration testing, data validation and user acceptance testing.
- Involved in 24x7 production support.
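Deleting duplicate records from warehouse tables, mentioned above, is often done by copying distinct rows through a work table. A minimal sketch with an illustrative table name:

```sql
-- Dedup-through-a-work-table sketch (wh_orders is a placeholder name):
-- copy distinct rows aside, empty the target, and reload it.
CREATE TABLE wh_orders_tmp AS wh_orders WITH NO DATA;

INSERT INTO wh_orders_tmp
SELECT DISTINCT *
FROM wh_orders;

DELETE FROM wh_orders;

INSERT INTO wh_orders
SELECT *
FROM wh_orders_tmp;

DROP TABLE wh_orders_tmp;
```

This approach keeps exactly one copy of each fully identical row; when only a key column must be unique, the DISTINCT would instead be replaced by a rule for choosing which version to keep.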

Environment: Teradata V12.0, Informatica 8.1.1, Business Objects XI R3.1, Crystal Reports, Oracle 10g, DB2, Teradata SQL Assistant, SQL Server, flat files, TOAD 9.x, SQL, Erwin, Linux, Shell Scripting.

XXXX, Texas ETL / Teradata Developer/DBA Jan 2009 - Feb 2010

XXXX is a technology company that operates in more than 170 countries around the world. We explore how technology and services can help people and companies address their problems and challenges, and realize their possibilities, aspirations and dreams. We apply new thinking and ideas to create simpler, more valuable and trusted experiences with technology, continuously improving the way our customers live and work.

Responsibilities:
- Analyzed the business requirements and system specifications to understand the application.
- Imported data from source flat files using Teradata load utilities such as FastLoad, MultiLoad and TPump; created ad hoc reports using FastExport and BTEQ.
- Designed Informatica mappings to propagate data from various legacy source systems to Oracle; the interfaces were staged in Oracle before loading to the data warehouse.
- Performed data transformations using various Informatica transformations such as Union, Joiner, Expression, Lookup, Aggregator, Filter, Router, Normalizer and Update Strategy.
- Responsible for tuning report queries and ad hoc queries.
- Wrote transformations to convert data into the required form based on client requirements, using Teradata ETL processes.
- Tuned SQL statements and procedures to enhance load performance in various schemas across databases, and tuned queries to improve report refresh times.
- Created customized Web Intelligence reports from various sources of data.
- Involved in performance tuning on the source and target databases for querying and data loading.
- Developed MultiLoad scripts and shell scripts to move data from source systems to staging, and from staging to the data warehouse, in batch processing mode.
- Exported data from the Teradata database using Teradata FastExport.
- Used UNIX scripts to run Teradata DDL in BTEQ and write the results to a log table.
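BTEQ scripts like those above are typically written so that a failed statement stops the run with a non-zero return code the calling shell script can test, and an audit row is written to a log table. A hedged sketch; all object names (mart_db, stage_db, audit_db.load_log) are hypothetical.

```sql
.LOGON tdprod/etl_user,password;

/* Hypothetical daily load into a mart table. */
INSERT INTO mart_db.daily_sales
SELECT sale_dt, store_id, SUM(amount)
FROM stage_db.sales_stg
GROUP BY sale_dt, store_id;

/* If the load failed, exit with a non-zero return code for the shell wrapper. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

/* Record the successful run in an illustrative audit/log table. */
INSERT INTO audit_db.load_log (job_name, run_ts, status)
VALUES ('daily_sales_load', CURRENT_TIMESTAMP, 'OK');

.LOGOFF;
.QUIT 0;
```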

- Created, loaded and materialized views to extend the usability of data.
- Automated UNIX shell scripts to verify the count of records added every day by the incremental data load for some of the base tables, in order to check for consistency.
- Made modifications as required for the reporting process by understanding the existing data model, and was involved in retrieving data from relational databases.
- Worked with SSA requestor responsibilities assigned for both project and support requests.
- Managed queries by creating, deleting, modifying and viewing them, and by enabling and disabling rules.
- Loaded data into the warehouse from different flat files.
- Performed database testing by writing and executing SQL queries to ensure that entered data had been uploaded correctly into the database.
- Transferred files across various platforms using the secure FTP protocol.
- Involved in creating unit test plans and testing the data for various applications.
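In Teradata, "materializing" a view for reporting usually means pairing a plain view with a join index the optimizer can substitute for the underlying join. A sketch of that pairing, with illustrative database, table and column names:

```sql
-- Plain view extending the usability of base data (names are illustrative).
REPLACE VIEW rpt_db.v_order_detail AS
SELECT o.order_id, o.order_dt, c.cust_name, o.amount
FROM wh_db.orders o
JOIN wh_db.customer c
  ON c.cust_id = o.cust_id;

-- A join index pre-materializes the same join; the optimizer may answer
-- queries against the view (or base tables) from this structure directly.
CREATE JOIN INDEX wh_db.ji_order_cust AS
SELECT o.order_id, o.order_dt, c.cust_name, o.amount
FROM wh_db.orders o
JOIN wh_db.customer c
  ON c.cust_id = o.cust_id
PRIMARY INDEX (order_id);
```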

Environment: Teradata V12.0, Informatica, Business Objects XI R3.1, Crystal Reports, Teradata Utilities (MultiLoad, FastLoad, FastExport, BTEQ, TPump), SQL Server 2000, Sybase, DB2, Oracle, FTP, CVS, Windows XP, UNIX, Pentium Server.

XXXX, NJ ETL / Teradata Developer Sep 2007 - Dec 2008

XXXX is an insurance-based financial services provider with a global network that focuses its activities on its key markets in North America and Europe. In North America, Zurich is a leading commercial property-casualty insurance provider serving the global corporate, large corporate, middle market, small business (not offered in Canada), specialties and programs sectors. Behind this success is a culture of continuous improvement and a pride in excellence.

Responsibilities:
- Parsed high-level design specifications into simple ETL coding and mapping standards.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Developed mappings, sessions, workflows and worklets, and scheduled the workflows in Maestro.
- Used external loaders such as MultiLoad and FastLoad to load data into the Teradata database.
- Designed the complete workflow for all the extracts (mappings) to serve the business requirements with a dependency hierarchy.
- Tuned Informatica mappings and sessions for optimum performance.
- Worked on and resolved production data issues arising from the migration of the data warehouse from Teradata to Netezza for NFS.
- Worked extensively with Teradata utilities such as BTEQ, FastExport and all the load utilities.
- Developed shell scripts for event automation and scheduling.
- Implemented duplicate-removal logic in existing code to avoid duplicates in production loads.

Environment: Informatica PowerCenter 8.6, Oracle 10g, Teradata V2R6, Netezza, SQL Server 2005, Teradata SQL Assistant, TOAD, Management Studio, Windows XP, Sun Solaris, UNIX, Maestro Scheduler.

XXXX, IL ETL / Teradata Developer Aug 2006 - Aug 2007

XXXX receives Supply Chain Management data from worldwide locations into a global access database through Manugistics, Web Methods and flat files. The data is loaded into the data warehouse; building a data mart enhances the business analysis capabilities of the existing SCM system and supports worldwide Sales & Operations Planning.

Responsibilities:
- Understood the specifications and analyzed data according to client requirements.
- Worked extensively on data extraction, transformation and loading from source to target systems using BTEQ, FastLoad and MultiLoad.
- Developed OLAP applications using the Cognos suite of tools, and extracted data from the enterprise data warehouse to support analysis and reporting for all corporate business units.
- Designed and developed PL/SQL procedures, packages and triggers to be used for automation and testing.
- Involved in performance tuning on the source and target databases for querying and data loading.
- Developed MultiLoad scripts and shell scripts to move data from source systems to staging, and from staging to the data warehouse, in batch processing mode.
- Involved in writing scripts for loading data to the target data warehouse using BTEQ, FastLoad and MultiLoad.
- Performed error handling and performance tuning in Teradata queries and utilities.
- Performed data reconciliation across various source systems and Teradata.
- Involved in unit testing and preparing test cases; involved in peer-to-peer reviews.

Environment: Teradata V2R6, Teradata Utilities (MultiLoad, FastLoad, FastExport, BTEQ, TPump), SQL Server 2000, Oracle, FTP, CVS, Windows XP, UNIX, Pentium Server.

XXXX, India Developer Mar 2005 - Jul 2006

The project enables an organization to manage its ongoing financial performance. Foremost are the budgeting and forecasting processes, for which endless streams of data must be gathered from existing sources as well as from constantly generated new ones.

Responsibilities:
- Gathered user requirements and performed module testing.
- Developed and tested the system using Oracle; implemented the backend database using Oracle Server.
- Developed reports as per management requirements.
- Tested different modules for their functionality; developed and executed test plans and test cases.
- Analyzed the project and created the database.

Environment: Visual Basic 5.0, Oracle 7.x, Erwin, SQL, PL/SQL, HTML, DHTML, Windows NT 4.0

XXXX, India Developer Sep 2003 - Feb 2005

This system enables traders and senior managers to mitigate the credit and market risk associated with non-standard trades and to maintain regulatory compliance, by monitoring and managing activities and displaying transaction audit trails. The secondary goal is to reduce the work effort associated with constant changes in regulatory rules related to equity trading.

Responsibilities:
- Actively involved in gathering requirements from business users, converting them into system requirement specifications, and creating detailed use cases and design documents.

- Designed, developed and managed the workflow processes to reflect business requirements, with several adapters, exceptions and rules.
- Involved in data modeling; designed data flows using UML.
- Designed and developed User Group Management modules to implement complex business rules for permissions.
- Coordinated setting up the development, test, production and contingency environments.
- Designed, developed and managed the database star schema, with various hierarchical and lookup tables.
- Developed and maintained complex stored procedures.
- Involved in setting up the application server clustered environment.
- Underwent training in the Standard Software Process for implementing CMM Level 5 in an enterprise organization.

Environment: Oracle 8i, Shell Scripts, UML, Test Director, SunOS
