
Nagarjuna Bellamkonda

Hadoop Developer - Sapient Corporation Pvt Ltd


Email me on Indeed: indeed.com/r/Nagarjuna-Bellamkonda/061372fe7282d751
2+ years of experience in meeting critical software engineering challenges.
Experience with Cloudera and Hortonworks clusters.
Experience in Hadoop-related technologies such as HDFS, MapReduce, Hive, Pig, Flume, Sqoop, HBase, and Cassandra.
Configured and worked with an 11-node Hadoop cluster; installed and configured Hadoop, HBase, Hive, Pig, Sqoop, and Flume.
Imported and exported data between RDBMS and HDFS/Hive using Sqoop.
Experience in writing Hive queries and Pig scripts.
Expertise in the MapReduce, Sqoop, HBase, Hive, and Pig components of Hadoop.
Experience with MySQL and Oracle.
Good programming knowledge of object-oriented languages such as Core Java.
Strong analytical, conceptual, and problem-solving abilities; excellent communication, presentation, and organizational skills.
Experience with NoSQL databases (HBase and Cassandra).

WORK EXPERIENCE

Hadoop Developer
Sapient Corporation Pvt Ltd - September 2013 to Present

Project#1
Client Name: Log Rhythm Networks
Project Name: Network Log Analysis

Environment:
Database: Oracle 10g, HBase, MySQL
Languages: Hive, Pig, HBase, Flume, MapReduce, Sqoop
Tools: Hadoop
Operating system: Ubuntu and CentOS
Project Description:
Network Log Analysis is a project to analyze the data in the company's network logs. Analyzing the data collected from network devices helps increase network security. The system converts logs from various network and host devices into detailed reports: top sources, top destinations, top unique pairs, top services, country-wise top sources and destinations, blacklisted IPs, and user bandwidth. The ultimate goal of the analysis is to watch for any network attacks. In this project, we do the following in the backend (a sketch of the load-and-summarize steps follows this list):
Store the log files in HDFS
Load the data into Hive tables
Summarize the data with Hive and Pig
Generate the reports
Send notifications to customers
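As referenced above, a minimal sketch of the load and summarize steps, submitted through the Hive JDBC driver against HiveServer2. The table name (raw_logs), its columns, the HDFS path, and the server address are hypothetical placeholders, not details from the actual project.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class LogSummary {
    public static void main(String[] args) throws Exception {
        // Hive JDBC driver for HiveServer2; host/port are placeholders.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "", "");
             Statement stmt = conn.createStatement()) {

            // Hypothetical schema for the parsed network logs.
            stmt.execute("CREATE TABLE IF NOT EXISTS raw_logs (" +
                    "ts STRING, src_ip STRING, dst_ip STRING, service STRING, bytes BIGINT) " +
                    "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','");

            // Load the CSV log file already stored in HDFS into the Hive table.
            stmt.execute("LOAD DATA INPATH '/data/logs/2014-01-01.csv' INTO TABLE raw_logs");

            // Summarize: e.g. the "top sources" report.
            ResultSet rs = stmt.executeQuery(
                    "SELECT src_ip, COUNT(*) AS hits FROM raw_logs " +
                    "GROUP BY src_ip ORDER BY hits DESC LIMIT 10");
            while (rs.next()) {
                System.out.println(rs.getString("src_ip") + "\t" + rs.getLong("hits"));
            }
        }
    }
}
```

Summaries like this one would feed the generated reports and customer notifications in the later pipeline steps.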
Responsibilities:
Created an 11-node Hadoop cluster environment using Cloudera CDH4.
Involved in importing data in various formats, such as JSON and XML, into the HDFS environment.
Involved in configuring Hadoop ecosystem components (MapReduce, Hive, Sqoop, Flume, HBase, Pig) and in their performance testing and benchmarking.
Implemented a MapReduce program for converting log files to CSV format (see the sketch after this list).
Involved in transferring data from post-log tables into HDFS and Hive using Sqoop.
Involved in creating Hive tables, then applying HiveQL to those tables for data validation.
Developed and tested MapReduce jobs in Java to analyze the data.
Implemented Hive partitions and Hive joins.
Involved in cluster maintenance, cluster monitoring, and troubleshooting.
Involved in managing and reviewing data backups and log files.
Involved in the implementation of Flume for loading logs into HDFS.
Involved in HBase configuration and the HBase Java API.
Implemented queries in Pig scripts for analyzing the data.
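As referenced above, a minimal sketch of the log-to-CSV conversion as a map-only MapReduce job. The assumption that raw records are whitespace-separated fields is mine, not the project's actual log format.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class LogToCsv {

    // Map-only job: each input line is one log record; no reducer is needed.
    public static class LogMapper
            extends Mapper<LongWritable, Text, Text, NullWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString().trim();
            if (line.isEmpty()) {
                return; // skip empty lines
            }
            // Assumed raw format: whitespace-separated fields; re-emit with commas.
            String[] fields = line.split("\\s+");
            context.write(new Text(String.join(",", fields)), NullWritable.get());
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "log-to-csv");
        job.setJarByClass(LogToCsv.class);
        job.setMapperClass(LogMapper.class);
        job.setNumReduceTasks(0); // map-only: mapper output is written directly
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(NullWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Setting the reduce task count to zero skips the shuffle entirely, which is the cheapest way to run a pure record-by-record transformation at scale.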

Hadoop Developer
Sapient Corporation Pvt Ltd - Bangalore, Karnataka - August 2012 to September 2013

Project#2
Client Name: Alliance BI Asia
Project Name: Alliance BI Asia

Environment:
Database: Oracle 10g, MySQL, HBase
Languages: MapReduce, Hive, Pig, Sqoop
Tools: Hadoop
Operating system: Ubuntu, CentOS
Project Description:
Alliance is a tyre manufacturer based in Clermont-Ferrand in the Auvergne region of France. It is one of the two largest tyre manufacturers in the world, along with Bridgestone. Alliance manufactures tyres for automobiles, heavy equipment, aircraft, motorcycles, and bicycles. The goal of this project is to improve and certify the quality of key business data used in the Asia zone. To prepare for the future deployment of the BI initiative in the Asia zone, the BI ASIA project focuses on the daily sales and inventory data first and provides a consolidated daily view of sales.
Responsibilities:
Involved in importing data in various formats, such as JSON and XML, into the HDFS environment.
Implemented a MapReduce program for converting data files to CSV format.
Processed large data sets using Hive.
Worked with NoSQL databases such as HBase.
Implemented queries in Pig scripts for analyzing the data.
Involved in the integration of Hive and HBase (see the sketch after this list).
Involved in transferring data from RDBMS into HDFS and Hive using Sqoop.
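As referenced above, a minimal sketch of one common way to integrate Hive and HBase: a Hive table mapped onto an HBase table through Hive's HBase storage handler, issued here over the same Hive JDBC connection. The table name, columns, and column-family mapping are hypothetical, not the project's actual schema.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HiveHbaseIntegration {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "", "");
             Statement stmt = conn.createStatement()) {
            // Map a Hive table onto an HBase table via the HBase storage handler.
            // ":key" maps the first Hive column to the HBase row key; "f:qty" is a
            // hypothetical column family/qualifier for the daily sales quantity.
            stmt.execute(
                "CREATE TABLE IF NOT EXISTS daily_sales_hbase (sale_id STRING, qty BIGINT) " +
                "STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' " +
                "WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,f:qty') " +
                "TBLPROPERTIES ('hbase.table.name' = 'daily_sales')");
            // Rows written through Hive are now visible in HBase, and vice versa,
            // so HiveQL reports can run directly over the HBase-backed sales data.
        }
    }
}
```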

EDUCATION

B.Tech
Jawaharlal Nehru Technological University - Hyderabad, Andhra Pradesh
December 2008

TECHNICAL SKILLS

Programming Languages: C++, Java
IDE: Eclipse
Database / File System: Hadoop, MS-SQL Server, Oracle
Hadoop tools: Hadoop, MapReduce, HDFS, Pig, Sqoop, Hive, HBase, Flume, Cassandra
Web Technologies: HTML, XML
Operating Systems: Ubuntu, CentOS, Windows XP
