
Mohammed Sheik Dawood

Technical Lead
Mobile: +918608103384
E-mail: sjsheikdawood@gmail.com

PROFILE SUMMARY
 4.3 years of experience in data modernization, data migration, data quality, analysis, and building data warehouses using ETL tools such as Talend and Informatica
 Extensively involved in moving data from legacy databases to modern databases
 Engaged in developing solutions in the Manufacturing Logistics, Energy and Utilities, Insurance, and Finance domains
 Working at Cognizant as an ETL lead developer since Nov 2015
TECHNICAL SKILLS
Databases :- Snowflake, Azure SQL DB, MS SQL Server, Oracle 11g
ETL :- Talend Data Fabric (DI and Big Data), TAC, Talend Cloud, TMC,
Informatica PowerCenter, Informatica Cloud
Scripting :- Linux/Batch & Python
Cloud :- Amazon (S3, Redshift, EMR) & Azure Data Lake
Replication Tool :- Attunity
CICD :- Jenkins, Nexus, Maven, GIT
Scheduling Tool :- Autosys
PROJECTS UNDERTAKEN

Hartford Cloud Enablement Oct/2019 to Present


Role :- ETL Lead Developer
Client :- Hartford
Team size :- 05
Methodology :- Agile
Project Objective:
Hartford is one of the leading insurance companies. This project sets up the ETL cloud infrastructure for the
Hartford team. The Talend Cloud infrastructure will be leveraged by different business applications within the
organization to move from the traditional on-premise environment to the cloud.
Roles and Responsibilities:
 Talend Cloud integration with on-premise Hadoop/RDBMS, EMR, S3, the artifact repository, Snowflake, and
the CI/CD/source code repository servers
 Configuring users, user groups, projects, and security on TMC
 Developing data integration jobs to test the integration between Snowflake, Oracle, and flat files
 Developing Big Data jobs to test the integration between HDFS, Hive, and S3 through native as well as Spark jobs
 Developing jobs to load/unload semi-structured data such as Avro, Parquet, JSON, and XML files into HDFS paths
 Developing a Python script to automate triggering TMC tasks from Autosys (see the sketch after this list)
 Installing and configuring the Remote Engines
 Preparing unit test case documents and capturing the results
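A minimal sketch of the Autosys-to-TMC trigger idea is below, assuming the Talend Cloud REST API is called with a personal access token; the endpoint URL, region, and environment variable names are illustrative assumptions, not values from the project.

```python
#!/usr/bin/env python3
"""Sketch: trigger a Talend Management Console (TMC) task from an Autosys job.
The API endpoint, region, and variable names below are assumptions."""
import os
import sys

import requests

# Assumed endpoint -- the real region and API path would come from the TMC account.
TMC_API = "https://api.us.cloud.talend.com/processing/executions"
TASK_ID = os.environ["TMC_TASK_ID"]      # task to run, injected by the Autosys job
TOKEN = os.environ["TMC_ACCESS_TOKEN"]   # personal access token, kept out of code


def trigger_task() -> str:
    """Request an execution of the task and return the execution id."""
    resp = requests.post(
        TMC_API,
        json={"executable": TASK_ID},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("executionId", "")


if __name__ == "__main__":
    try:
        print(f"Started TMC execution {trigger_task()}")
    except requests.RequestException as exc:
        # A non-zero exit code lets Autosys mark the job as FAILED.
        sys.exit(f"TMC trigger failed: {exc}")
```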
Tools and Technologies used in project
Operating System :- Windows, Linux
ETL Tool :- Talend Cloud, Talend Data Fabric
Cloud :- AWS S3, EMR
CICD Tools :- Nexus, GIT, Jenkins, Autosys
Job Scheduler :- TMC, Autosys
Scripting :- Python

Xerox Business Solutions May/2019 to Sep/2019


Role :- ETL Developer
Client :- Xerox
Team size :- 09
Methodology :- Waterfall
Project Objective:
Xerox is a leading manufacturer of photocopier machines. This project supports the Xerox business team and
helps the client consolidate data from various on-premise databases into a cloud environment. Talend Big Data and
Snowflake were used to create a framework that extracts data from the different on-premise databases into a single
cloud environment.
Roles and Responsibilities:
 Involved in developing the Talend jobs/framework, performing unit testing, and preparing unit test documents
and detailed design documents
 Developing a Talend validation framework to validate data counts once the ingestion framework has completed
(a count-check sketch follows this list)
 Developing Azure SQL DB procedures to handle the ingestion framework
 Creating a PowerShell script to fetch data counts from the on-premise database and FTP them to the validation framework
 Developing jobs to load/retrieve data from Azure Data Lake
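The count-validation idea can be sketched as follows; the project implemented it with PowerShell and Talend, so the Python below, along with the file layout, table names, and connection string, is purely illustrative.

```python
"""Sketch of the count-validation idea (the project used PowerShell/Talend;
Python is used here for illustration). File layout, table names, and the
connection string are assumptions."""
import csv

import pyodbc  # assumes an ODBC driver for Azure SQL DB is installed

# Assumed layout: one "table_name,row_count" line per on-premise table,
# produced on-premise and FTPed over for the validation step.
SOURCE_COUNTS_FILE = "onprem_counts.csv"
AZURE_CONN = "DSN=AzureSqlDb;UID=etl_user;PWD=***"  # placeholder connection string


def main() -> None:
    with pyodbc.connect(AZURE_CONN) as conn:
        cur = conn.cursor()
        with open(SOURCE_COUNTS_FILE, newline="") as fh:
            for table, src_count in csv.reader(fh):
                # Table names come from the trusted counts file, not user input.
                cur.execute(f"SELECT COUNT(*) FROM {table}")
                tgt_count = cur.fetchone()[0]
                status = "OK" if int(src_count) == tgt_count else "MISMATCH"
                print(f"{table}: source={src_count} target={tgt_count} -> {status}")


if __name__ == "__main__":
    main()
```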
Tools and Technologies used in project
Database :- Snowflake
Cloud :- Azure SQL DB, Azure Data Lake
Operating System :- Windows
ETL Tool :- Talend Big Data
Replication Tool :- Attunity
Job Scheduler :- TAC
Scripting :- PowerShell & Batch Scripting

TGP MI Reporting Dec/2017 to Apr/2019


Role :- Team Member
Client :- Total Gas and Power (TGP)
Team size :- 03
Methodology :- Waterfall
Project Objective:
Total Gas & Power is a leading business energy provider. This project supports the TGP procurement team and
helps the client make decisions by forecasting future power and gas demand. Talend was used to extract data from
different sources and load it into the data warehouse for business analysis.
Roles and Responsibilities:
 Data cleansing, data quality tracking, and process-balancing checkpoints
 Created Type 1 and Type 2 dimensions, fact tables, and star schema and snowflake schema designs (a Type 2
sketch follows this list)
 Created Oracle procedures and functions as per business needs
 Scheduled jobs in the Informatica scheduler
 Created batch automation scripts for Talend code migration from one environment to another
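The difference between the Type 1 and Type 2 handling mentioned above can be illustrated with a small sketch; the project built this in Talend and Oracle, so the in-memory Python below and its field names are assumptions for illustration only.

```python
"""Sketch of Type 1 vs Type 2 slowly changing dimensions (illustrative only;
the project implemented this in Talend/Oracle). Field names are assumptions."""
from datetime import date


def apply_scd(dim_rows: list, key: str, new_attrs: dict, scd_type: int = 2) -> None:
    """Apply an incoming record to an in-memory dimension table.

    Type 1 overwrites the attributes in place (history is lost); Type 2
    expires the current row and inserts a new version with fresh dates."""
    current = next(
        (r for r in dim_rows if r["key"] == key and r["is_current"]), None
    )
    if current and current["attrs"] == new_attrs:
        return  # no change, nothing to do
    if scd_type == 1 and current:
        current["attrs"] = new_attrs  # overwrite, keeping a single row per key
        return
    if current:  # Type 2: close out the old version
        current["is_current"] = False
        current["end_date"] = date.today()
    dim_rows.append({
        "key": key,
        "attrs": new_attrs,
        "start_date": date.today(),
        "end_date": None,
        "is_current": True,
    })
```

For example, re-running apply_scd with changed attributes leaves the old row closed with an end date and adds a new current row alongside it, which is what preserves history in a Type 2 dimension.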
Tools and Technologies used in project
Database :- Oracle
Operating System :- Windows
ETL Tool :- Talend
Reporting Tool :- SAP BO
Job Scheduler :- TAC

NAM Data Lake Mar/2017 to Dec/2017


Role :- ETL Developer
Client :- Schneider Electric
Team size :- 12
Methodology :- Agile
Project Objective:
This project supports Schneider Electric in bringing data from all of its sources into a data lake, to increase
the market for its products and services, raise customer satisfaction, and help the client make decisions to grow
its business. Informatica Cloud is used to extract data from source systems such as Oracle, SQL Server, and
Salesforce and load it into an Amazon S3 bucket for business analysis.
Roles and Responsibilities:
 Developed Informatica Cloud jobs to connect to various sources, extract the data, and load it into the target layer
 Developed a curl script to trigger Informatica Cloud jobs from Informatica PowerCenter (a Python equivalent is
sketched after this list)
 Validated the data once it was loaded into the S3 bucket
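A sketch of that trigger call is below; the project used a curl shell script invoked from a PowerCenter command task, so the Python, the region URL, and the task details here are assumptions based on the general shape of the Informatica Cloud v2 REST API.

```python
"""Sketch: trigger an Informatica Cloud task over its REST API (the project
used a curl script from a PowerCenter command task; Python and the values
below are illustrative assumptions)."""
import requests

LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"  # assumed region


def run_task(username: str, password: str, task_name: str) -> None:
    # Log in to obtain a session id and the org-specific server URL.
    login = requests.post(
        LOGIN_URL,
        json={"@type": "login", "username": username, "password": password},
        timeout=30,
    )
    login.raise_for_status()
    body = login.json()

    # Start the task ("MTT" denotes a mapping task in the assumed v2 API).
    start = requests.post(
        f"{body['serverUrl']}/api/v2/job",
        json={"@type": "job", "taskName": task_name, "taskType": "MTT"},
        headers={"icSessionId": body["icSessionId"]},
        timeout=30,
    )
    start.raise_for_status()
    print(f"Triggered {task_name}")
```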
Tools and Technologies used in project
Database :- SQL Server, Oracle.
Cloud :- Amazon S3, Salesforce
Operating System :- Linux
ETL Tool :- Informatica Cloud
Job Scheduler :- Informatica Scheduler

NAM Data Warehouse Mar/2016 to Feb/2017


Role :- Team Member
Client :- Schneider Electric
Team size :- 12
Methodology :- Waterfall
Project Objective:
Schneider Electric is one of the largest electrical/electronics manufacturing companies. This project supports
Schneider Electric's sales process and provides self-assessment capability to increase the market for its products
and services, raise customer satisfaction, and help the client make decisions to grow its business. Informatica was
used to extract data from source systems such as Oracle, SQL Server, and Salesforce and load it into the data
warehouse for business analysis.
Roles and Responsibilities:
 Responsibilities included requirement gathering, FRS clarification, and preparing high-level and low-level design
documents
 Communicated daily with onsite counterparts
 Developed Informatica mappings, performed unit testing, and prepared unit test documents and detailed design
documents
 Developed shell scripts for the FTP process (a Python illustration follows this list)
 Primary role: ETL development
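The FTP hand-off can be illustrated as below; the project used shell scripts, so the Python standard-library version, along with the host, credentials, and paths, is an assumption for illustration.

```python
"""Sketch of the FTP hand-off (the project used shell scripts; Python's
standard ftplib is shown for illustration). Host, credentials, and paths
are placeholders."""
from ftplib import FTP


def push_file(local_path: str, remote_name: str) -> None:
    # Placeholder host/credentials -- real values lived in the script's environment.
    with FTP("ftp.example.com") as ftp:
        ftp.login(user="etl_user", passwd="***")
        ftp.cwd("/inbound")  # assumed landing directory
        with open(local_path, "rb") as fh:
            ftp.storbinary(f"STOR {remote_name}", fh)


push_file("sales_extract.dat", "sales_extract.dat")
```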

Tools and Technologies used in project


Database :- SQL Server, Oracle
Operating System :- Linux
ETL Tool :- Informatica
Job Scheduler :- Informatica Scheduler

CERTIFICATIONS & TRAINING


 Certified Talend Data Integration v7 Developer
 Completed the induction training program conducted for the freshers' batch at Cognizant Technology Solutions
from 24-11-2015 to 10-03-2016
 Underwent Informatica, Oracle, and Teradata training

ACADEMIC CREDENTIALS

Examination    Year of Passing    University/Board               Percentage    Class

B.E (EEE)      2015               B S Abdur Rahman University    84.8%         First
H.S.C.         2011               State Board                    93.33%        First
S.S.C.         2009               State Board                    82.4%         First

PERSONAL DETAILS

Date of Birth :- 03/12/1993


Address :- 7/11,5th street, Masjid colony, Kamarajapuram, Chennai-73
