Experience in the design and implementation of Server and Parallel jobs, including Batches and Job Sequencers. Excellent communication and interpersonal skills; versatile team player with proven problem-solving skills. Conducted DataStage training at the IBM level. Have been teaching DataStage classes independently for the last 3 years, to students from India as well as the USA and Australia.
Skill Set:
Operating Systems: Windows XP/98, 2000, 2003
Languages: PL/SQL, Java, RDBMS, HTML, DHTML, JavaScript
Databases: Oracle, SQL Server 2000, Teradata 13.0, DB2
Tools: DataStage 7.5.1/7.5.2/7.5.3/8.0/8.1/8.5, Informatica 8.x, SSIS, SSRS, Oracle Streams, InfoSphere QualityStage
Education: B.Tech in Information Technology, Institute of Engineering and Management (Kolkata, WB)
Technical certifications:
Professional Experience
March 2010 - Till date: Senior Developer, IBM India Pvt. Ltd., India
A.P. Moller-Maersk is a shipping company; the project maintains a data warehouse of all container shipments and related activities. Our client developed a package that supports a server application running on 4 nodes. The data governance team used the package to maintain the complete historical data of the enterprise for analysis and reporting. Maersk Line is one of the leading liner shipping companies in the world, serving customers all over the globe. Maersk has activities in a variety of business sectors, primarily transportation and energy. It is the largest container ship operator and supply vessel operator in the world. Maersk is based in Copenhagen, Denmark, with subsidiaries and offices in more than 135 countries worldwide and around 120,000 employees. It ranked 106 on the Fortune Global 500 list for 2009.

Responsibilities:
- Extensively used Ascential DataStage Manager, Designer, and Director for creating and implementing jobs.
- Migrated jobs from DB2 to Teradata and from Oracle to Teradata.
- Worked with onshore/offshore development teams to resolve DataStage performance tuning and job abort issues.
- Developed various jobs using the Sequential File, Sort, Link Collector, DB2 UDB API, Oracle OCI, and Teradata stages.
- Involved in extracting, transforming, loading, and testing data from DB2, Oracle OCI, and Teradata API using DataStage.
- Created and set up DataStage projects in the Dev, Test, and Prod environments.
- Migrated jobs from the Development to the Test environment using DataStage Manager.
- Created, optimized, and executed Teradata SQL test queries to validate target mappings/source views and to verify data in target tables.
- Analyzed existing SQL and DataStage jobs for better performance.
- Imported table definitions and metadata using DataStage Manager.
- Created user-defined environment variables using DataStage Administrator.
- Involved in administration and support of the ETL platform.
- Scheduled DataStage jobs in DataStage Director and cron.
- Monitored DataStage jobs, cleared leftover job files created while developing jobs, and unlocked jobs from DataStage Administrator and Linux.
- Developed shell scripts for running DataStage jobs.
- Performed unit testing on developed jobs to ensure they meet the requirements.
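A shell wrapper for running DataStage jobs can be sketched as below. This is a minimal illustration only, assuming the standard dsjob command-line client; the project and job names are placeholders, and exit-code conventions may differ between DataStage versions.

```shell
#!/bin/sh
# Minimal sketch of a wrapper for running a DataStage job from the shell.
# Assumes the standard dsjob client is on PATH; allow overriding the binary
# (e.g. for testing) via the DSJOB environment variable.
DSJOB="${DSJOB:-dsjob}"

run_job() {
    project="$1"
    job="$2"
    # -run starts the job; -jobstatus waits for completion and exits with a
    # code derived from the final job status (conventionally 1 = run OK,
    # 2 = run with warnings, 3 = aborted; may vary by version).
    "$DSJOB" -run -jobstatus "$project" "$job"
    status=$?
    case "$status" in
        0|1|2)
            echo "Job $job finished (status $status)"
            return 0 ;;
        *)
            echo "Job $job failed (status $status)" >&2
            return 1 ;;
    esac
}

# Invoke as: run_datastage.sh <project> <job>
if [ "$#" -eq 2 ]; then
    run_job "$1" "$2"
fi
```

Such a wrapper is what cron would call on a schedule, so the same success/failure check applies to both ad hoc and scheduled runs.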
Environment: DataStage 8.0/7.5.1/7.5.2, IBM Information Server, IBM Information Server FastTrack, WebSphere Business Glossary, WebSphere Information Analyzer, DB2, IBM WebSphere Information Services Director, Oracle 11g, Aqua Data Studio, Siperian, Teradata 13, UNIX, Linux, Windows 2003, Teradata ARC, Teradata Parallel Transporter, Teradata Warehouse Miner
Team Size: 5 to 7
Developer
The enterprise data warehouse receives data from various sources such as legacy systems and SAP. Data is then loaded into different tables through DataStage jobs. Once the tables are loaded, reports are generated by fetching values from the different tables. We use the third-party scheduler Maestro to schedule our jobs.
Responsibilities:
- Identified and documented the data sources and transformation rules required to populate and maintain data warehouse content.
- Used InfoSphere DataStage 8.1 EE as the ETL tool to extract data from source systems such as Oracle, flat files, and SQL Server databases to the target system.
- Responsible for monitoring the production system; handled production issues on a daily basis.
- Developed complex ETL jobs involving multiple data sources, including several RDBMS tables and flat files.
- Used different stages such as Merge, Join, Filter, Lookup, and Aggregator to transform the data.
- Used parameter files and variables in mappings and sessions.
- Created reusable objects such as containers and routines to handle code reusability.
- Extensively involved in re-designing the load flow by eliminating intermediate steps, and in optimizations to achieve significant ETL performance gains.
- Involved in quality checking, investigating and standardizing data with the help of QualityStage before moving it to the final target.
- Extensively worked on writing SQL queries (subqueries, join conditions, and correlated subqueries).
- Monitored production DataStage jobs and resolved incidents and tasks according to their severities.
- Designed and implemented Type 1 and Type 2 slowly changing dimensions.
- Fully involved in performance tuning of the DataStage jobs.
- Used the DataStage Director and its run-time engine to schedule the solution, test and debug its components, and monitor the resulting executables (on an ad hoc or scheduled basis).
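The Type 2 slowly-changing-dimension pattern mentioned above can be illustrated with a small script. This is a hypothetical sketch using SQLite standing in for Teradata/Oracle; the table and column names are made up for the example.

```shell
#!/bin/sh
# Hypothetical Type 2 slowly-changing-dimension update, illustrated with
# SQLite (in-memory) standing in for Teradata/Oracle.
rows=$(sqlite3 <<'SQL'
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city        TEXT,
    eff_date    TEXT,
    end_date    TEXT,
    is_current  INTEGER
);
-- Existing current version of customer 1
INSERT INTO dim_customer VALUES (1, 'Kolkata', '2009-01-01', '9999-12-31', 1);

-- Incoming change: customer 1 moved to Copenhagen on 2010-03-01.
-- Type 2 step 1: expire the current row instead of overwriting it
-- (a Type 1 dimension would simply UPDATE city in place and lose history).
UPDATE dim_customer
   SET end_date = '2010-02-28', is_current = 0
 WHERE customer_id = 1 AND is_current = 1;

-- Type 2 step 2: insert the new version as the current row
INSERT INTO dim_customer VALUES (1, 'Copenhagen', '2010-03-01', '9999-12-31', 1);

SELECT customer_id, city, is_current FROM dim_customer ORDER BY eff_date;
SQL
)
echo "$rows"
# prints:
# 1|Kolkata|0
# 1|Copenhagen|1
```

Both history rows survive, so reports can join facts against the customer version that was current at the time of the transaction.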
Environment: IBM DataStage and QualityStage 8.0, DataStage 7.5.1, DataStage 7.5.2, Teradata, Oracle
References: Available upon request