TMG Health, Scranton, PA

April 2015 – Present

Sr Splunk Developer/Administrator

TMG Health is in the business of business process outsourcing (BPO). The company is a fast-growing provider of BPO services to health plans and insurers nationwide that need help with Medicare and Medicaid management. TMG Health provides both administrative services and computer-systems integration to help organizations deliver public health benefits to their enrollees. Managed and supported the Splunk infrastructure: 16 indexers, approximately 200 users, and about 1 TB of daily data.

Responsibilities:

Applied expertise in Splunk UI/GUI development and in operations roles.


Played a major role in understanding log and server data and in providing insight into that data for users.

Worked as a Splunk admin: created and managed apps, created users and roles, and set permissions on knowledge objects.

Created alerts for specific tasks at the request of managers.

Analyzed security-based events, risks, and reporting instances.

Implemented saved searches with job scheduling.

Prepared, arranged, and tested Splunk search strings and operational strings, writing regular expressions for field extraction.
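For illustration, a minimal sketch of the kind of search string this covered, saved with a cron schedule per the scheduling bullet above; the index, sourcetype, and field names are hypothetical:

    index=app_logs sourcetype=app_errors
    | rex "errorCode=(?<error_code>\d+)"
    | stats count by error_code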

Created dashboards for various types of business users in the organization.

Created Admin, Power User, and User roles for the application and configured app-sharing permissions for each role.

Assisted internal Splunk users in designing and maintaining production-quality dashboards.

Wrote complex IFX, rex, and multikv commands to extract fields from log files.
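A sketch of the extraction styles described here; sourcetypes and field names are hypothetical. rex pulls fields out of raw text with a regex, while multikv handles tabular command-style output:

    sourcetype=vendor_app
    | rex "user=(?<user_id>\w+)\s+code=(?<error_code>\d+)"
    | stats count by user_id, error_code

    sourcetype=unix_ps
    | multikv fields USER PID pctCPU
    | table USER, PID, pctCPU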

Helped the UNIX and Splunk administrators deploy Splunk across UNIX and Windows environments.

Worked with administrators to ensure Splunk was actively and accurately running and monitoring the current infrastructure.

Created alerts on critical parameters that trigger emails to the operations team.
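A hedged sketch of what such an alert looks like in savedsearches.conf; the search, threshold, and address are placeholders:

    [Critical Error Volume]
    search = index=app_logs log_level=ERROR | stats count
    dispatch.earliest_time = -15m
    enableSched = 1
    cron_schedule = */15 * * * *
    alert_type = number of events
    alert_comparator = greater than
    alert_threshold = 100
    action.email = 1
    action.email.to = ops-team@example.com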

Interacted with the data warehousing team on data extraction and suggested standard data formats so that Splunk could identify most of the fields automatically.

Applied knowledge of Splunk architecture and its components (indexer, forwarder, search head, deployment server), heavy and universal forwarders, and the license model.

Standardized Splunk forwarder deployment, configuration, and maintenance across UNIX and Windows platforms.
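A sketch of the standardized universal forwarder configuration pattern this implies; hostnames and ports are placeholders:

    # deploymentclient.conf - every forwarder phones home to one deployment server
    [target-broker:deploymentServer]
    targetUri = deploy.example.com:8089

    # outputs.conf - events are load-balanced across the indexer tier
    [tcpout]
    defaultGroup = primary_indexers

    [tcpout:primary_indexers]
    server = idx1.example.com:9997, idx2.example.com:9997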

Optimized the search performance of Splunk queries and reduced dashboard load times.

Integrated Ganglia with Nagios to set up a monitoring and alert notification system.

Managed parsing, indexing, and the hot, warm, cold, and frozen bucket lifecycle.
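The bucket lifecycle is driven by per-index settings in indexes.conf; a sketch with placeholder index name, paths, and retention:

    [app_logs]
    homePath   = $SPLUNK_DB/app_logs/db        # hot and warm buckets
    coldPath   = $SPLUNK_DB/app_logs/colddb    # cold buckets
    thawedPath = $SPLUNK_DB/app_logs/thaweddb  # restored frozen data
    maxWarmDBCount = 300
    frozenTimePeriodInSecs = 7776000           # roll to frozen after ~90 days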

Environment: Splunk Enterprise 6.2/6.3/6.4, Apache Tomcat, JavaScript, Windows, Eclipse IDE, Core Java, J2EE, Servlets, EJB, Rational Rose, Nagios, Zabbix, JUnit, Log4j, JDBC, Oracle, PostgreSQL, jQuery.

Nationwide Insurance, Columbus, OH

Oct 2013 – March 2015

Splunk Administrator and Developer

Nationwide is a leading US property/casualty insurer that also provides life insurance and retirement products through its Nationwide Financial Services subsidiary. Its property/casualty products range from general personal and commercial coverage -- including auto, home, and business owners policies -- to such specialty lines as professional liability, workers' compensation, agricultural, loss-control, pet insurance, and other coverage. The project was to design, deploy, and integrate Splunk Enterprise with the existing system infrastructure and to set up configuration parameters for logging, monitoring, and alerting.

Responsibilities:

Optimized Splunk for peak performance by splitting Splunk indexing and search activities across different machines.

Extracted complex fields from different types of log files using regular expressions.

Created search commands to retrieve multiline log events as a single transaction, given a start line and an end line as inputs.
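This is the shape of Splunk's transaction command; the start and end markers below are hypothetical:

    sourcetype=batch_log
    | transaction startswith="Job started" endswith="Job completed"
    | table _time, duration, eventcount

transaction then reports the elapsed duration and event count for each assembled transaction.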

Created HTML dashboards with third-party JavaScript and CSS for rich visualizations.

Performed field extraction using IFX, the rex command, and regular expressions.

Guaranteed high availability and performance through horizontal scaling and load-balanced components.

Prepared, arranged, and tested Splunk search strings and operational strings; created and configured management reports and dashboards.

Created eval functions where necessary to derive new fields at search run time.
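A sketch of run-time field derivation with eval; the field names are illustrative:

    sourcetype=access_combined
    | eval response_sec = response_ms / 1000
    | eval status_class = if(status >= 500, "server_error", "ok")
    | stats avg(response_sec) by status_class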

Provided input on best-fit architecture and deployment solutions for the Splunk project.

Splunk Engineer/Dashboard Developer responsible for the end-to-end event monitoring infrastructure of business-aligned applications.

Set up dashboards in Splunk for senior management and production support.

Maintained and managed assigned systems and Splunk-related issues, coordinating with administrators.

Worked on Splunk DB Connect configuration for Oracle, MySQL, and MSSQL.
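With DB Connect in place, database rows can be pulled at search time with its dbxquery command; the connection name and query here are placeholders:

    | dbxquery connection="oracle_claims" query="SELECT claim_id, status FROM claims"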

Created many proof-of-concept dashboards for IT operations and service owners, used to monitor application and server health.

Environment: Splunk Enterprise Server 5.x.x/6.x.x, Splunk Universal Forwarder 5.x.x/6.x.x, Shell, Python scripting, MS SQL Server 2012, SQL.

Norfolk Southern Railways - Roanoke, VA

May 2012 – Sep 2013

SQL BI Developer

Norfolk Southern Railways is one of North America's largest Class I railroads and a subsidiary of Norfolk Southern Corp. Norfolk Southern also operates an extensive intermodal network and is a major transporter of coal and industrial products. CTMS-NG (Commodity Transportation Management System - Next Generation) is a Coal Business Group (CBG) sponsored Track 2012 initiative that will provide NS with an advanced decision support system. The project involved creating an ETL process to update the CTMS-NG database from the legacy CTMS database on a regular basis, and interacting with other business groups to gather valid business requirements.

Responsibilities:

Assisted with development of the new CTMS-NG application.

Worked closely with the project leads, project business analysts, project data modeler, and other NS personnel to process data requirements into target databases.

Extensively used joins and sub-queries for complex queries involving multiple tables from different databases.
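A representative shape for such queries, with purely illustrative table and column names:

    -- shipments above the average tonnage for their commodity group
    SELECT s.shipment_id, c.commodity_name, s.tonnage
    FROM dbo.Shipments AS s
    JOIN dbo.Commodities AS c
        ON c.commodity_id = s.commodity_id
    WHERE s.tonnage > (SELECT AVG(s2.tonnage)
                       FROM dbo.Shipments AS s2
                       WHERE s2.commodity_id = s.commodity_id);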

Pulled data from different sources such as CSV files, text files, Excel spreadsheets, DB2, SQL Server, and Teradata.

Updated the CTMS-NG database regularly from the legacy CTMS database and other databases, per business requirements.

Used SSIS transformations such as Conditional Split and Derived Column for data scrubbing and validation checks during staging, before loading the data into the database.

Created SSIS packages to load data into the database using various SSIS tasks: Execute SQL Task to run SQL statements and return result sets for use by other SSIS components, Data Profiling Task to ensure the data being imported is valid, Bulk Insert Task, Data Flow Task, File System Task, and the For Loop and Foreach Loop containers.

Created Data Flow Tasks with Merge Joins to combine data from multiple sources on multiple servers, Lookups to access and retrieve data from a secondary dataset, Row Count to track statistics on rows inserted and updated, and Script Tasks to run custom code.

Worked with package variables, event handlers to react to the appropriate error conditions, SSIS parallelism, and package properties to make packages more efficient.

Created complete documentation using Pragmatic Works BI Documenter for both Control Flow and Data Flow Task components and transformations, including properties, input columns, and output columns.

Increased performance through monitoring, tuning, and index optimization.

Created and monitored scheduled jobs on DEV, QA, and production servers, and created alerts for successful or unsuccessful completion of scheduled jobs.

Deployed SSIS packages into production and used package configurations to export package properties.

Actively participated in interactions with users, the team lead, DBAs, and the technical manager to fully understand the requirements of the system.

Worked closely with DBAs to improve database performance by applying appropriate indexes, triggers, and MQTs.

Worked with stored procedures, user-defined functions, views, and SQL scripting for complex business logic.

Environment: SQL Server 2008 R2 RTM edition, IBM DB2 v9.7 for Windows, IBM Data Studio for DB2, SSIS, VB.Net, Passport, Pragmatic Works BI Documenter.

Corning Incorporated - Corning, NY

August 2010 – April 2012

SQL Server/Business Intelligence Developer

Corning Incorporated is the world leader in specialty glass and ceramics. Drawing on more than 160 years of materials science and process engineering knowledge, Corning creates and makes keystone components that enable high-technology systems for consumer electronics, mobile emissions control, telecommunications, and life sciences. This project involved complete Business Intelligence development, from requirements gathering through ETL and OLAP cubes to reporting.

Responsibilities:

Analyzed business requirements and was involved in the design and implementation of design documents and approach documents.

Created database objects such as procedures, functions, packages, triggers, indexes, and views using T-SQL in development and production environments for SQL Server 2005/2008.

Developed complex stored procedures, views, and temporary tables as per the requirements.

Wrote T-SQL statements for data retrieval and was involved in performance tuning of T-SQL queries and stored procedures.
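A minimal sketch of such a procedure; the object and column names are illustrative:

    CREATE PROCEDURE dbo.usp_GetOrdersByStatus
        @Status varchar(20)
    AS
    BEGIN
        SET NOCOUNT ON;  -- suppress row-count messages for cleaner client calls
        SELECT order_id, customer_id, order_date
        FROM dbo.Orders
        WHERE status = @Status;
    END;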
Worked extensively with Team Foundation Server source control.

Created reports using Microsoft SQL Server 2008/2005 Reporting Services (SSRS), with proficiency in both Report Designer and Report Builder.

Created ad-hoc reports, drill-down reports, pivot, PowerPivot, and tabular reports.

Created complex stored procedures to generate reports with better performance.

Managed Report Server content with admin privileges, using Report Manager for subscriptions, snapshots, and security content on the reports.

Helped the Quality Assurance team generate test plans and test executions.

Used SSIS to create ETL packages to validate, extract, transform, and load data into the data warehouse and data marts.

Worked extensively with SSIS to import, export, and transform data between linked servers.

Scheduled the packages and jobs to keep extracting data from OLTP systems at specific time intervals.

Handled performance tuning and optimization, with strong analytical and troubleshooting skills for quick issue resolution in large-scale production environments located globally.

Developed end-to-end solutions using the SQL Server Business Intelligence stack.

Used SQL Server Agent for scheduling jobs and alerts.
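Such jobs can also be scripted against msdb rather than configured by hand; a hedged sketch with placeholder names (dbo.usp_RunNightlyLoad is hypothetical):

    EXEC msdb.dbo.sp_add_job      @job_name = N'Nightly ETL Load';
    EXEC msdb.dbo.sp_add_jobstep  @job_name = N'Nightly ETL Load',
                                  @step_name = N'Run load procedure',
                                  @subsystem = N'TSQL',
                                  @command = N'EXEC dbo.usp_RunNightlyLoad;';  -- hypothetical proc
    EXEC msdb.dbo.sp_add_schedule @schedule_name = N'Daily 2am',
                                  @freq_type = 4,          -- daily
                                  @freq_interval = 1,
                                  @active_start_time = 020000;
    EXEC msdb.dbo.sp_attach_schedule @job_name = N'Nightly ETL Load',
                                     @schedule_name = N'Daily 2am';
    EXEC msdb.dbo.sp_add_jobserver   @job_name = N'Nightly ETL Load';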

Created SSIS packages with custom error handling, as well as complex packages using various data transformations such as Lookup, Script Task, Conditional Split, Derived Column, Fuzzy Lookup, Partition Processing, and the Foreach Loop container; scheduled these packages by creating job tasks.

Worked with SQL Server Analysis Services (SSAS), which delivers Online Analytical Processing (OLAP) and data mining functionality for business intelligence applications.

Understood the OLAP process for changing and maintaining the warehouse, optimizing dimensions and hierarchies, and adding aggregations to the cube.

Defined and processed facts, dimensions, and cubes using MS OLAP.

Involved in generating multidimensional cubes and reports using OLAP Services.

Environment: MS SQL Server 2008/2005, SSIS 2008, SSRS 2008, SSAS 2008, MS Visual Studio 2008/2010, Visual Studio Team Foundation Server, SQL Server Profiler, Windows Server 2008, SQL Server Management Studio, SQL Server Business Intelligence Development Studio (BIDS), Oracle SQL Developer, Oracle SQL*Plus.
