
Amit Kumar Singh

Mobile: (+91) 9831798822 | Home: +91-033-25001745

Email: amitiemcal@gmail.com

Profile

5+ years of IT experience in the analysis, design, development, implementation and testing of data warehouse applications.


- Awarded the IBM Service Excellence award.
- Extensively involved in the development of DataStage and QualityStage ETL processes for extracting data from different data sources, transforming the data and loading it into the data warehouse and data marts.
- Extensively worked on DataStage and QualityStage versions 7.5.1, 7.5.2, 8.0, 8.1 and 8.5.
- Good knowledge of the other components of InfoSphere Information Server.
- Excellent knowledge of and experience in the data warehouse development life cycle, dimensional modeling, repository management and administration, and the implementation of Star and Snowflake schemas and slowly changing dimensions.
- Expertise in designing parallel and server jobs using various stages such as Hash, Link Collector, Sort, Link Partitioner, Row Splitter, DB2, Teradata FastLoad, Teradata API and Aggregator, as well as investigating and standardizing data.
- Used Ascential DataStage EE/Parallel Extender as an ETL tool to extract data from sources such as DB2, XML and flat files and load it into target tables.
- Demonstrated strong technical writing skills and a proven ability to quickly learn new technologies.
- Experience in data modeling, including the creation of Star and Snowflake dimensional data marts.
- Experience in writing, testing and implementing triggers, procedures and functions at the database level.
- Extensive experience with DataStage best practices.
- Experience in client/server technology and RDBMS.
- Strong knowledge of the software development life cycle.
- Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently with strong communication skills.

- Experience in the design and implementation of server and parallel jobs, including batches and job sequencers.
- Excellent communication and interpersonal skills; versatile team player with proven problem-solving skills.
- Conducted DataStage training within IBM.
- Has been teaching DataStage classes independently for the last 3 years and has taught many students from India, the USA and Australia.

Skill Set
- Operating systems: Windows XP/98, 2000, 2003
- Languages: PL/SQL, Java, RDBMS, HTML, DHTML, JavaScript
- Databases: Oracle, SQL Server 2000, Teradata 13.0, DB2
- Tools: DataStage 7.5.1/7.5.2/7.5.3/8.0/8.1/8.5, Informatica 8.x, SSIS, SSRS, Oracle Streams, InfoSphere QualityStage

Education: B.Tech in Information Technology, Institute of Engineering and Management (Kolkata, WB)

Technical certifications:

- Oracle SQL Certified
- IBM InfoSphere DataStage 8.0 Certified

Professional Experience

March 2010 - Till date: Senior Developer, IBM India Pvt. Ltd., India

A.P. Moller-Maersk is a shipping company for which the project maintains a data warehouse of all container shipments and related activities. Our client developed a package that supports a server application running on 4 nodes. The data governance team used the package to maintain the complete historical data of the enterprise for analysis and reporting. Maersk Line is one of the leading liner shipping companies in the world, serving customers all over the globe. Maersk has activities in a variety of business sectors, primarily within the transportation and energy sectors. It is the largest container ship operator and supply vessel operator in the world. Maersk is based in Copenhagen, Denmark, with subsidiaries and offices in more than 135 countries worldwide and around 120,000 employees. It ranked 106 on the Fortune Global 500 list for 2009.

Responsibilities:
- Extensively used Ascential DataStage Manager, Designer and Director for creating and implementing jobs.
- Migrated jobs from DB2 to Teradata and from Oracle to Teradata.
- Worked with onshore/offshore development teams to resolve DataStage performance tuning and job abort issues.
- Developed various jobs using stages such as Sequential File, Sort, Link Collector, DB2 UDB API, Oracle OCI and Teradata.
- Involved in extracting, transforming, loading and testing data from DB2, Oracle OCI and Teradata API stages using DataStage.
- Created and set up DataStage projects in Dev, Test and Prod environments.
- Migrated jobs from the Development to the Test environment using DataStage Manager.
- Created, optimized and executed Teradata SQL test queries to validate target mappings/source views and to verify data in target tables (a sample reconciliation query is sketched after this list).
- Analyzed existing SQL and DataStage jobs for better performance.
- Imported table definitions and metadata using DataStage Manager.
- Created user-defined environment variables using DataStage Administrator.
- Involved in the administration and support of the ETL platform.
- Scheduled DataStage jobs in DataStage Director and cron.
- Monitored DataStage jobs, cleared the slave processes created by jobs during development, and unlocked jobs from DataStage Administrator and Linux.
- Developed shell scripts for running DataStage jobs.
- Performed unit testing of the developed jobs to ensure they meet the requirements.
- Developed jobs using IBM Information Server FastTrack.
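As an illustration of the Teradata validation queries mentioned above, the sketch below compares a row count and a simple total between a source view and its target table; the object and column names (stg.shipment_src_vw, dw.shipment_fact, container_qty) are hypothetical placeholders, not the actual project schema.

    -- Hypothetical source-vs-target reconciliation check (illustrative names only).
    SELECT 'SOURCE' AS side,
           COUNT(*)           AS row_cnt,
           SUM(container_qty) AS total_containers
    FROM   stg.shipment_src_vw
    UNION ALL
    SELECT 'TARGET' AS side,
           COUNT(*)           AS row_cnt,
           SUM(container_qty) AS total_containers
    FROM   dw.shipment_fact;

A mismatch between the two rows returned by such a query would point to dropped or duplicated records in the load.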

Environment: DataStage 8.0, 7.5.1, 7.5.2; IBM Information Server; IBM Information Server FastTrack; WebSphere Business Glossary; WebSphere Information Analyzer; DB2; IBM WebSphere Information Services Director; Oracle 11g; Aqua Data Studio; Siperion; Teradata 13; UNIX; Linux; Windows 2003; Teradata ARC; Teradata Parallel Transporter; Teradata Warehouse Miner.
Team size: 5 to 7

July 2007 - Sep 2009: Developer, LNT Infotech, Mumbai, India

The enterprise data warehouse receives data from various sources such as legacy systems and SAP. The data is then loaded into different tables through DataStage jobs. Once the tables are loaded, reports are generated by fetching values from the different tables. The third-party scheduler MAESTRO is used to schedule the jobs.

Responsibilities:
- Identified and documented the data sources and transformation rules required to populate and maintain the data warehouse content.
- Used InfoSphere DataStage 8.1 EE as the ETL tool to extract data from source systems such as Oracle, flat files and SQL Server databases into the target system.
- Responsible for monitoring the production system and handling production issues on a daily basis.
- Developed complex ETL jobs involving multiple data sources, including several RDBMS tables and flat files.
- Used different transformations such as Merge, Join, Filter, Lookup and Aggregator to transform the data.

- Used parameter files and variables in mappings and sessions.
- Created reusable objects such as containers and routines to handle the reusability of the code.
- Extensively involved in redesigning the load flow by eliminating intermediate steps, and in optimizations to achieve significant ETL conversion performance gains.
- Involved in quality checking, such as investigating and standardizing data with the help of QualityStage before moving it to the final target.
- Extensively worked on writing SQL queries (subqueries, join conditions and correlated subqueries).
- Monitored production DataStage jobs and resolved incidents and tasks as per their severities.
- Designed and implemented Type 1 and Type 2 slowly changing dimensions (a minimal SQL sketch of the Type 2 logic follows this list).
- Fully involved in the performance tuning of the DataStage jobs.
- Used the DataStage Director and its run-time engine to schedule running the solution, test and debug its components, and monitor the resulting executable versions (on an ad hoc or scheduled basis).
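A minimal SQL sketch of the Type 2 slowly changing dimension handling referred to above, shown as plain SQL rather than the actual DataStage job design; the table and column names (dim_customer, stg_customer, address) and the date/flag conventions are assumptions made for illustration.

    -- Step 1: expire the current dimension row when the tracked attribute has changed
    -- (correlated subquery against the staging load).
    UPDATE dim_customer
    SET    effective_end_dt = CURRENT_DATE,
           current_flag     = 'N'
    WHERE  current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = dim_customer.customer_id
                   AND    s.address    <> dim_customer.address);

    -- Step 2: insert a fresh current row for customers that no longer have one
    -- (new customers plus those expired in step 1).
    INSERT INTO dim_customer
           (customer_id, address, effective_start_dt, effective_end_dt, current_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y');

Run in this order, the two statements keep exactly one current row per customer while preserving the full change history.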

Environment: IBM DataStage and QualityStage 8.0, DataStage 7.5.1, DataStage 7.5.2, Teradata, Oracle

References: Available upon request
