
ANIL NEVASE    Mob: +91 8055678094

EXECUTIVE SUMMARY
Databricks / O'Reilly Certified Apache Spark Developer
Completed AWS Cloud Developer Certification Training from Edureka
Completed Big Data Hadoop Developer Course from Edureka
10+ years of IT experience
7+ years of Spring Framework experience
1 year of Apache Spark Framework experience
Sun Certified Programmer for Java 2 Platform, Standard Edition 6.0 (SCJP 6.0)
Big Data Technology: Hadoop MR, HDFS, Hive, HBase, Apache Spark
Java development skills: Core Java, Spring 4.1, Spring MVC, Spring Batch 3.0, Spring Web Services 2.0, Spring JMS, Spring Data, Spring Integration, Spring Integration Batch, Spring Scheduler, JPA, Hibernate, SOAP, JAXB, XML, JDBC, Swing
Experience encompasses software design, development, and maintenance of custom application software
Good exposure to version control tools such as IBM Rational ClearCase and ClearQuest, Visual SourceSafe 6.0, and SVN
Quick learner and excellent team player, with the ability to meet deadlines and work under pressure.

EXPERIENCE

Technical Project Lead


Inautix Technologies - Pune, Maharashtra
November 2015 to Present

1) Project Name: SR14-LiquidPlus Reconciliation System


Description: This is a scheduler-based reconciliation system used to post missed transactions in the event of a server crash, power failure, or other technical issue. It fetches data from Oracle using Spark SQL, compares two database tables based on record timestamps, and posts the missed transactions to an Apache Kafka topic/JMS/REST. Digital Pulse inserts this XML data into an HP Vertica database, which is further used for generating liquidity flow graphs and for data analysis.
Technologies used: Apache Spark, Spark SQL, Apache Kafka, Spring Integration, JAXB, XML, Maven, Tomcat 7.0 and
Oracle 11g.
IDE: MyEclipse Blue
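
For illustration, a minimal sketch of the compare-and-post step described above. The table names, the timestamp key column, and the Kafka topic are placeholder assumptions (the sketch also uses the Spark 2.x SparkSession API for brevity):

```java
// Sketch only: TXN_SOURCE, TXN_POSTED, TXN_TS and the "dp-liquidity" topic are illustrative names.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class ReconciliationJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("LiquidPlusRecon").getOrCreate();

        Properties jdbc = new Properties();
        jdbc.put("user", "lqm_user");                       // assumed credentials
        jdbc.put("password", "secret");
        String url = "jdbc:oracle:thin:@//dbhost:1521/LQM"; // assumed connection string

        // Load the source table and the already-posted table from Oracle via Spark SQL.
        Dataset<Row> source = spark.read().jdbc(url, "TXN_SOURCE", jdbc);
        Dataset<Row> posted = spark.read().jdbc(url, "TXN_POSTED", jdbc);

        // Rows whose timestamp key exists in the source but not in the posted table
        // are the transactions missed during the outage.
        Dataset<Row> missed = source.join(posted,
                source.col("TXN_TS").equalTo(posted.col("TXN_TS")), "left_anti");

        Properties kafkaProps = new Properties();
        kafkaProps.put("bootstrap.servers", "kafka:9092");
        kafkaProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        kafkaProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(kafkaProps)) {
            for (Row row : missed.collectAsList()) {        // assumes a small reconciliation delta
                producer.send(new ProducerRecord<>("dp-liquidity", row.mkString("|")));
            }
        }
        spark.stop();
    }
}
```

The left-anti join keeps only source rows with no matching timestamp on the posted side, which is one straightforward way to express "missed transactions" in Spark SQL.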

Job Responsibilities:
Proposed design solutions for various change requests.
Interacted with offshore and on-site management.
Analyzed business requirements and technical requirements.
Developed code based on requirements provided by the Business Analyst.
Used SVN for version control.
Maintained the DEV, SIN, UAT, ITE1, ITE2, QA and Production deployment regions

Team Size: 1

2) Project Name: SR14-Digital Pulse Client (DP) - Liquidity Management


Description: This application is used to post the data generated during payment/forecast processing in all LQM applications, such as Payment Release Throttler, Liquidity Calculator, Forecast Message Feeder, Golden Copy Processor, and Batch Engine. All data is available in Oracle database tables. After reading the data from the tables, it generates an XML message, which is posted to the Digital Pulse application using the Apache Kafka/JMS/REST API. Digital Pulse inserts this XML data into an HP Vertica database, which is further used for generating liquidity flow graphs and for data analysis.
Technologies used: Core Java, Apache Kafka, Core Spring 4.1, Spring Integration 4.1, Spring JMS, Spring Rest, Spring
JDBC, JAXB, XML, Maven, Tomcat 7.0 and Oracle 10g.
IDE: MyEclipse Blue
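
A minimal sketch of the read-marshal-post flow described above. The PaymentEvent payload, the LQM_PAYMENTS table, and the "digital-pulse" topic are illustrative assumptions, not the project's actual schema:

```java
// Sketch only: PaymentEvent, LQM_PAYMENTS, and the "digital-pulse" topic are illustrative names.
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlRootElement;
import java.io.StringWriter;
import java.util.Map;
import org.springframework.jdbc.core.JdbcTemplate;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class DigitalPulseFeeder {

    @XmlRootElement
    public static class PaymentEvent {          // assumed payload shape
        public String account;
        public double amount;
    }

    private final JdbcTemplate jdbcTemplate;
    private final KafkaProducer<String, String> producer;

    public DigitalPulseFeeder(JdbcTemplate jdbcTemplate, KafkaProducer<String, String> producer) {
        this.jdbcTemplate = jdbcTemplate;
        this.producer = producer;
    }

    /** Reads unposted rows, marshals each one to XML with JAXB, and posts it to the DP topic. */
    public void publishPending() throws JAXBException {
        Marshaller marshaller = JAXBContext.newInstance(PaymentEvent.class).createMarshaller();

        for (Map<String, Object> row : jdbcTemplate.queryForList(
                "SELECT ACCOUNT, AMOUNT FROM LQM_PAYMENTS WHERE POSTED = 'N'")) {
            PaymentEvent event = new PaymentEvent();
            event.account = (String) row.get("ACCOUNT");
            event.amount = ((Number) row.get("AMOUNT")).doubleValue();

            StringWriter xml = new StringWriter();
            marshaller.marshal(event, xml);     // object -> XML
            producer.send(new ProducerRecord<>("digital-pulse", xml.toString()));
        }
    }
}
```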

Job Responsibilities:
Analysis and Design.
Involved in system design and development in Core Java using Spring and multithreading.
Used Eclipse as the IDE to develop the application.
Used Spring OXM module for XML processing.
Involved in product architecture design.
Maintained the DEV, SIN, UAT, ITE1, ITE2, QA and Production deployment regions
Team Size: 2

3) Project Name: SR14-Digital Pulse Client (DP) - Fund Transfer Securities


Description: The Fund Transfer Securities application sends data as fixed-length text JMS messages. FTS has three different payload types, such as WIRES and CHIPSINQ, identified based on message content. This application splits the fixed-length message into tokens using a mapping provided in a property file. After the split, it generates an XML message, which is posted to the Digital Pulse application using the Apache Kafka/JMS/REST API. Digital Pulse inserts this XML data into an HP Vertica database, which is further used for generating liquidity flow graphs and for data analysis.
Technologies used: Core Java, Apache Kafka, Core Spring 4.1, Spring Integration 4.1, Spring JMS, Spring Rest, Spring
JDBC, JAXB, XML, Maven, Tomcat 7.0
IDE: MyEclipse Blue
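
A minimal sketch of splitting a fixed-length message into named tokens using offsets read from a property file; the "name=start,length" property format is an assumption for illustration:

```java
// Sketch only: the field-layout property format (name=start,length) is an assumption.
import java.io.FileInputStream;
import java.io.IOException;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Properties;

public class FixedLengthMessageSplitter {

    /** Splits a fixed-length FTS message into named tokens using offsets from a property file. */
    public static Map<String, String> tokenize(String message, String layoutFile) throws IOException {
        Properties layout = new Properties();
        try (FileInputStream in = new FileInputStream(layoutFile)) {
            layout.load(in);
        }

        Map<String, String> tokens = new LinkedHashMap<>();
        for (String field : layout.stringPropertyNames()) {
            // Each property value is "startIndex,length", e.g. amount=20,12
            String[] range = layout.getProperty(field).split(",");
            int start = Integer.parseInt(range[0].trim());
            int length = Integer.parseInt(range[1].trim());
            tokens.put(field, message.substring(start, start + length).trim());
        }
        return tokens;
    }
}
```
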
Job Responsibilities:
Involved in product architecture design.
Developed code based on requirements provided by the Business Analyst.
Used Spring OXM module for XML processing.
Integration Testing
Used SVN for version control.
Maintained the DEV, SIN, UAT, ITE1, ITE2, QA and Production deployment regions
Team Size: 2

Associate
Deutsche Bank - Pune, Maharashtra - August 2015 to November 2015

1) Project Name: FATCA


Description: FATCA was reportedly enacted for the purpose of detecting the non-U.S. financial accounts of U.S.
domestic taxpayers rather than to identify non-resident U.S. citizens and enforce collections. There might be thousands of
resident U.S. citizens with non-U.S. assets, such as astute investors, dual citizens, or legal immigrants.
Technologies used: Core Java, Core Spring 4.1.6, Spring Batch, Maven
IDE: Eclipse
Job Responsibilities:
Worked on an Agile-based project
Wrote SQL queries to fetch data from Oracle
Handled environment setup
Worked on bug fixing and enhancements on change requests.
Team Size: 15

Project Lead
Inautix Technologies - Pune, Maharashtra - November 2012 to August 2015

1) Project Name: EPH-LQM (Liquidity Manager) -Payment Release Throttler


Description: EPH stands for Enterprise Payment Hub, and the LQM (Liquidity Management) applications are part of EPH. Payment Release Throttler is a payment processing application. It receives payments in XML format from OPF (Open Payment Framework, Clear2Pay). It parses the message against an XSD and validates the business date, account number, market status, clearing channel, and available liquidity. If all validations pass, the payment request goes to the Liquidity Calculator; otherwise, it responds back to OPF with a proper exception message so that OPF can take action.
Technologies used: Core Java, Core Spring 4.1, Spring JMS, Hibernate, JPA, JAXB, XML, Maven, Tomcat 7.0 and
Oracle 10g.

IDE: MyEclipse Blue
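
A minimal sketch of the XSD-validation step described above, assuming a placeholder schema file; the business checks (date, account, market status, clearing channel, liquidity) would follow this structural check:

```java
// Sketch only: the schema path and usage are placeholders, not the actual OPF contract.
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import java.io.File;
import java.io.StringReader;

public class PaymentMessageValidator {

    private final Schema schema;

    public PaymentMessageValidator(String xsdPath) throws Exception {
        SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        schema = factory.newSchema(new File(xsdPath));
    }

    /** Returns true when the incoming payment XML conforms to the XSD; business checks follow. */
    public boolean isStructurallyValid(String paymentXml) {
        try {
            Validator validator = schema.newValidator();
            validator.validate(new StreamSource(new StringReader(paymentXml)));
            return true;
        } catch (Exception e) {
            return false;   // reject and respond to OPF with the exception detail
        }
    }
}
```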


Job Responsibilities:
Handled technical issues
Involved in product architecture design.
Implemented this application single-handedly
Involved in the development of data access layer using JPA.
Developed POJOs using JPA
Responsible for the analysis, design, construction and testing of the application
Maintained the DEV, SIN, UAT, ITE1, ITE2, QA and Production deployment regions
Team Size: 1

2) Project Name: EPH-LQM- Liquidity Calculator


Description: This application deals directly with liquidity. Based on liquidity availability, it releases payments. It also processes adjustments, balances, MDOC limits, opening balances, and target minimum cash balances.
Technologies used: Core Java, Core Spring 3.2, Spring JMS, Hibernate, JPA, Maven, Tomcat 7.0 and Oracle 10g.

IDE: MyEclipse Blue


Job Responsibilities:
Implemented this application single-handedly
Involved in the development of data access layer using JPA.
Defect Analysis.
Integration Testing
Used SVN for version control
Maintained the DEV, SIN, UAT, ITE1, ITE2, QA and Production regions
Team Size: 2

3) Project Name: EPH-LQM- Forecast Message Feeder


Description: This application is used to process intraday messages for forecasting/projecting money coming in or going out. It helps project the liquidity status ahead of time. The messages processed by this application are IMMS, GSF, GSP, and OPF, in XML or fixed-length format.
Technologies used: Core Java, Core Spring 3.2, Spring JMS, Hibernate, JPA, JAXB, XML, Maven, Tomcat 7.0 and
Oracle 10g.
IDE: MyEclipse Blue
Job Responsibilities:
Involved in capturing the business requirements, design, development and testing of the application.
Developed code based on requirements provided by the Business Analyst.
Worked on bug fixing and enhancements on change requests.
Used SVN for version control
Maintained the DEV, SIN, UAT, ITE1, ITE2, QA and Production deployment regions
Team Size: 3

4) Project Name: EPH-LQM- Batch Initiator


Description: LQM receives feeds from IMMS (International Money Management System) as flat files for different bank branches, such as 10, 04, and 11, containing records in a fixed-length text format. This application is used to split each file into multiple files based on a set number of records. Splitting one file into multiple smaller files reduces the risk of a processing failure to a minimal number of records.
Technologies used: Core Java, Core Spring 4.1, Spring Batch 3.0, Spring CRON Scheduler, Hibernate, JPA, Maven,
Tomcat 7.0 and Oracle 10g.
IDE: MyEclipse Blue
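
A minimal sketch of the file-splitting step described above; the chunk size and output-file naming are illustrative assumptions, since the actual record count per split is not stated here:

```java
// Sketch only: the 5,000-records-per-chunk figure and ".partN" naming are assumptions.
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class FeedFileSplitter {

    /** Splits one IMMS flat file into chunk files of at most maxRecords lines each. */
    public static void split(Path input, Path outputDir, int maxRecords) throws IOException {
        Files.createDirectories(outputDir);
        try (BufferedReader reader = Files.newBufferedReader(input, StandardCharsets.UTF_8)) {
            String line;
            int count = 0;
            int chunk = 0;
            BufferedWriter writer = null;
            while ((line = reader.readLine()) != null) {
                if (writer == null || count == maxRecords) {
                    if (writer != null) writer.close();
                    chunk++;
                    writer = Files.newBufferedWriter(
                            outputDir.resolve(input.getFileName() + ".part" + chunk),
                            StandardCharsets.UTF_8);
                    count = 0;
                }
                writer.write(line);
                writer.newLine();
                count++;
            }
            if (writer != null) writer.close();
        }
    }

    public static void main(String[] args) throws IOException {
        split(Paths.get("IMMS_BRANCH10.txt"), Paths.get("chunks"), 5000);
    }
}
```
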
Job Responsibilities:
Understood requirements provided by the Business Analyst
Coded as per requirements and tested with the BA

Involved in the development of data access layer using JPA.


Defect Analysis.
Integration Testing
Used SVN for version control
Team Size: 1

5) Project Name: EPH-LQM- Batch Engine


Description: This application processes the split files from the Batch Initiator one by one, using a single-threaded model. Its job is to insert all records into the Oracle database, which helps predict all Nostro account balances and forecast the balance for the next day. These files contain fixed-length format records.
Technologies used: Core Java, Core Spring 3.2, Spring Batch 2.5, Spring CRON Scheduler, Hibernate, JPA, Maven,
Tomcat 7.0 and Oracle 10g.
IDE: MyEclipse Blue
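
A minimal sketch of how such fixed-length records could be read with Spring Batch before being inserted into Oracle; the field names and column ranges are placeholder assumptions, not the actual IMMS layout:

```java
// Sketch only: BalanceRecord, its fields, and the column ranges are illustrative assumptions.
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.FixedLengthTokenizer;
import org.springframework.batch.item.file.transform.Range;
import org.springframework.core.io.FileSystemResource;

public class BalanceRecordReaderFactory {

    /** Record holder for one fixed-length IMMS line; fields are illustrative. */
    public static class BalanceRecord {
        private String account;
        private String amount;
        public String getAccount() { return account; }
        public void setAccount(String account) { this.account = account; }
        public String getAmount() { return amount; }
        public void setAmount(String amount) { this.amount = amount; }
    }

    /** Builds a reader that parses each fixed-length line into a BalanceRecord. */
    public static FlatFileItemReader<BalanceRecord> reader(String path) {
        FixedLengthTokenizer tokenizer = new FixedLengthTokenizer();
        tokenizer.setNames(new String[] {"account", "amount"});
        tokenizer.setColumns(new Range[] {new Range(1, 20), new Range(21, 35)}); // assumed layout

        BeanWrapperFieldSetMapper<BalanceRecord> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(BalanceRecord.class);

        DefaultLineMapper<BalanceRecord> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(fieldSetMapper);

        FlatFileItemReader<BalanceRecord> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource(path));
        reader.setLineMapper(lineMapper);
        return reader;
    }
}
```
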
Job Responsibilities:
Involved in product architecture design
Involved in the development of data access layer using JPA.
Defect Analysis.
Integration Testing
Used SVN for version control
Maintained the DEV, SIN, UAT, ITE1, ITE2, QA and Production deployment regions
Team Size: 2

Product Engineer
Ixsight Technologies - Pune, Maharashtra
September 2009 to November 2012

1) Project Name: Scrubbix


Description: This is a product used by banks and telecom companies for cleansing, standardization, formatting, and enrichment of large volumes of data within challenging deadlines. It assures aesthetics through intelligent title casing and abbreviation treatment.
Technologies used: Core Java, Core Spring 3.0, Spring MVC, Spring Batch 2.0, Hibernate, MySQL, JAXB, XML, Ant.
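
A minimal sketch of the intelligent title-casing idea mentioned above; the abbreviation list is an illustrative assumption, not the product's actual dictionary:

```java
// Sketch only: the abbreviation set is an illustrative assumption.
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class TitleCaser {

    // Tokens kept fully upper-cased rather than title-cased (assumed examples).
    private static final Set<String> ABBREVIATIONS =
            new HashSet<>(Arrays.asList("LLC", "PVT", "LTD", "ICICI", "HDFC"));

    /** Title-cases a raw name while preserving known abbreviations, e.g. "icici bank ltd" -> "ICICI Bank LTD". */
    public static String titleCase(String raw) {
        StringBuilder out = new StringBuilder();
        for (String token : raw.trim().split("\\s+")) {
            if (token.isEmpty()) {
                continue;
            }
            String upper = token.toUpperCase();
            if (ABBREVIATIONS.contains(upper)) {
                out.append(upper);
            } else {
                out.append(Character.toUpperCase(token.charAt(0)))
                   .append(token.substring(1).toLowerCase());
            }
            out.append(' ');
        }
        return out.toString().trim();
    }
}
```
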
Job Responsibility:
Analysis and Design.
Worked on bug fixing and enhancements on change requests.
Helped junior team members with coding and with understanding requirements
Used SVN for version control
Team Size: 2

2) Project Name: Cloud Scrubbix


Description: This is an online data cleansing product used by banks, telecom companies, and data entry operators for online cleansing, standardization, formatting, validation, and enrichment of data.
Technologies used: Core Java, Core Spring, Spring MVC, Spring Batch 2.0, Spring Web Services 2.0, SOA, Hibernate, MySQL, SOAP, JAXB, Tomcat 6.0.
Job Responsibility:
Analysis and Design.
Integrated the Scrubbix product with SOA using Spring Web Services
Developed web services
Used SVN for version control
Team Size: 2

Software Engineer
Lionbridge Technologies - Powai - Mumbai, Maharashtra

February 2007 to July 2009

Project Name: Pearson's Production Assets and Publishing Application (PAPA)


Description: Pearson's Production Assets and Publishing Application is a content management system for production
assets created at Pearson Digital Learning. This tool enables reuse of production assets across all curriculum products and
provides search mechanisms to view (and change) attributes for existing assets.
Examples of these production assets/media include:
images/animation created by the art group
voiceovers/ sound effects recorded by the production group
Technologies used: Core Java, Swing, Oracle 9i.
Role:
Implementation and maintenance of a Java GUI application using Swing.
Wrote SQL queries and used JDBC connectivity to access the database.

Web Developer/integrator
Praxis Interactive Technologies - Mumbai, Maharashtra - July 2006 to January 2007

Project Name: NETG


Description: NETG is a leading provider of e-learning and performance support solutions for global enterprises,
government, education and small to medium-sized businesses. Skill-soft enables business organizations to maximize
business performance through a combination of comprehensive e-learning content, on-line information resources, and
flexible learning technologies and support services.
Role:
Integrating various modules such as content and .swf files
Handling Audio Process
Handling Critical Pulse Delivery Process and XML

Technical Summary
Programming Languages: Java 6
Java Technologies: Core Java, Core Spring, Spring MVC, Spring Batch, Spring Web Services, Spring JMS, Spring Data, Spring Integration, SOAP, JAXB, JDBC, Sockets, Swing
Internet: XML, HTML
Databases: Oracle 10g, MySQL
Web Servers: Tomcat 7.0
IDEs: Eclipse 3.6, MyEclipse Blue
Big Data Technology: Hadoop MR, HDFS, Hive, HBase, Apache Spark
Operating Systems: Windows 2000/NT, Windows 95-98, Linux (Ubuntu 14.0)
Version Control Tools: Visual SourceSafe 6.0 (VSS), IBM Rational ClearQuest, IBM Rational ClearCase, SVN
Skills: Apache Spark (1 year), Spring (7 years), Java (10+ years), HDFS (1 year), HBase (1 year), Hibernate (5 years), Hadoop (1 year)
Certifications: Databricks Certified Apache Spark Developer, Sun Certified Java Programmer

EDUCATION

B. Sc in Information Technology
Mumbai University - Mumbai, Maharashtra
2005
SSC in Maths
Pune University - Pune, Maharashtra
