CORE: Nov. 2011

Service Level Agreements
Contract and Performance Management
Intellectual Property of the Centre for Outsourcing Research and
Education (CORE). May be used with permission of CORE.
Agenda: Service Levels for BPO
Preliminary Matters
Use of Weighting Factors
Use of Severity Levels
Use of the Balanced Scorecard
Preliminary Matters for All BPOs
Precisely define the services to be provided (the
Services)
Since it is not practical to measure performance for all Services, carefully identify which of the Services are most important to the business and should be measured (Measured Services)
Determine whether Measured Services are currently being tracked and calculated internally
If they are, determine whether existing performance levels (Service Levels) meet the needs of the business or whether they require improvement
If so, determine what level of improvement is required
If not, determine what Service Levels are required to
meet the needs of the business
Use of the Service Level Agreement (the SLA)
Used to document with the service provider the Measured Service Levels required
Used to document with the service provider the amounts payable (the Service Level Credits) for failure to achieve the Measured Service Levels
Used to permit the customer to terminate the Master Agreement
When there are significant or ongoing failures to achieve Service Levels (Service Level Failures)
Discussed further under Termination Service Level Failures below
Prioritizing Measured Services
First step is to prioritize the importance of the
Measured Services
Two frequently used approaches
A. Weighting Factors (Mathematical)
B. Severity Levels (Non-Mathematical)
A. Using Weighting Factors
Weighting Factors
Based on importance of portion of Service (Service
Element) being measured
Examples of Weighting Factors might be:
.50
.25
.15
.10
Failure of a Service Level triggers a payment of the Weighting Factor against the monthly fees; Weighting Factors are cumulative (see the Example of Cumulative Weighting Factors below)
Weighting Factors and Limits on Credits
Customer must negotiate the relative weights (e.g., .50, .25, .10) because comparative importance is critical
Because higher aggregate Weighting Factors result in higher Service Level Credits, there is almost always a debate over the aggregate, as the service provider seeks to limit Service Level Credits
Example of Cumulative Weighting Factors
Measured Service Element         | Weighting Factor
Payroll Administration System    | .50
HR Administration System         | .25
Staffing Administration System   | .25
Training System                  | .10
External Job Posting System      | .25
External Candidate Résumé System | .25
Candidate Screening System       | .10
Total                            | 1.70
The At-Risk Amount
Using Weighting Factors also requires a cap on the amounts payable as Service Level Credits
This liability cap is sometimes called the At-Risk Amount
Percentages vary between service providers
For large-value transactions (over $250M), the range is between 8% and 12% of monthly fees
The lower the value, the higher the cap should be, so as to make the payment meaningful (painful)
Limits using Weighting Factors
Service Level Credit = A x B, where:
A = the sum of the Weighting Factors for all Service Levels not met in the month, not to exceed the Weighting Factor Limit; and
B = the At-Risk Amount for the month (8% to 12% of monthly fees, as discussed above)
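A minimal sketch of this calculation in Python (the function and variable names are illustrative, not part of the deck):

```python
def service_level_credit(missed_weighting_factors, at_risk_amount, weighting_factor_limit):
    """Service Level Credit = A x B: A is the sum of Weighting Factors for Service
    Levels missed in the month, capped at the Weighting Factor Limit; B is the
    monthly At-Risk Amount."""
    a = min(sum(missed_weighting_factors), weighting_factor_limit)
    return a * at_risk_amount
```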
Example 1
Assume
The charges for the Services are $1M per month
The At-Risk Amount is 10% ($100K)
The negotiated Weighting Factor Limit is 1.5
The sum of the Weighting Factors for all Measured Service
Elements not met in the month is 1.7 (i.e., sum of .50, .25
etc.)
Then, the Service Level Credit, which would be 1.7 x $100K = $170K, is reduced to $150K because the Weighting Factor Limit is 1.5
This represents a potential loss of Service Level Credits to the customer of up to 12 (months) x $20K = $240K per year
Example 2
Assume:
The charges for the Services are $1M per month
The At-Risk Amount is 10% ($100K)
The negotiated Weighting Factor Limit is 1.3
The sum of the Weighting Factors for all Measured Service
Levels not met in the month is 1.7 (i.e., sum of .50, .25
etc.)
Then, the Service Level Credit that would be 1.7 x $100K
= $170K is reduced to $130K because the Weighting
Factor Limit is 1.3
This represents a potential loss of Service Level Credits to the customer of up to 12 (months) x $40K = $480K per year (compared to $240K in Example 1)
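Applying the sketch above to the assumptions in Examples 1 and 2 (again, purely illustrative numbers):

```python
missed = [.50, .25, .25, .10, .25, .25, .10]  # sums to 1.7, as in the examples
at_risk = 100_000                             # 10% of $1M in monthly charges

print(service_level_credit(missed, at_risk, 1.5))  # Example 1: 150000.0
print(service_level_credit(missed, at_risk, 1.3))  # Example 2: 130000.0
```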
B. Using Severity Levels
Severity Levels
Also based on importance of Service Element being
measured
Severity level types might be:
Essential
Major
Minor
Severity Level Examples in HR BPO:
Payroll Administration System - Essential
HR Administration System - Major
Training System - Minor
Others, such as the External Job Posting System, will vary
Example of Use of Severity Levels
Measured Service Element               | Severity Level | Availability Service Level
Employee Services                      |                |
Payroll Administration System          | Essential      | 99.85%
Human Resources Administration System  | Major          | 97.85%
Staffing Administration System         | Major          | 97.85%
Training System                        | Minor          | 90%
Employment Services                    |                |
External Job Posting System            | Major          | 97.85%
External Candidate Résumé System       | Major          | 97.85%
Candidate Screening System             | Minor          | 90%
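An availability Service Level of this kind is typically measured as uptime over the measurement period; a small illustrative check (a 30-day month is assumed, and the names are hypothetical):

```python
def availability_pct(downtime_minutes, total_minutes=30 * 24 * 60):
    # Percentage of the month during which the system was available
    return 100.0 * (total_minutes - downtime_minutes) / total_minutes

print(availability_pct(60))            # ~99.86, meets the 99.85% Essential target
print(availability_pct(60) >= 99.85)   # True
```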

Severity Levels and Service Level Credits
Having established agreed-upon Severity Levels, the parties must then determine Service Level Credits
Need to address multiple scenarios:
Single Failure
Repetitive Failures
The following table is one example, using the same At-Risk Amount of $100K per month as above
Use of Severity Levels
Service Level Failed                                      | Service Level Credit/Remedy
Essential Severity Level in any 1 month                   | 12% of the At-Risk Amount for the Affected Service Element
Essential Severity Level in any 2 of 3 consecutive months | 24% of the At-Risk Amount for the Affected Service Element
Major Severity Level in any 1 month                       | 7% of the At-Risk Amount for the Affected Service Element
Major Severity Level in any 2 of 3 consecutive months     | 14% of the At-Risk Amount for the Affected Service Element
Minor Severity Level in any 1 month                       | Best Efforts to Repair within 45 days
Minor Severity Level in any 2 of 3 consecutive months     | Best Efforts to Repair within 30 days
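A minimal sketch of how the credit side of this table could be encoded (the percentages follow the table above; the structure and names are illustrative, not from the deck):

```python
# Fraction of the monthly At-Risk Amount for the affected Service Element;
# Minor failures carry a best-efforts repair remedy rather than a credit.
CREDIT_TABLE = {
    ("Essential", "single"):     0.12,
    ("Essential", "repetitive"): 0.24,  # any 2 of 3 consecutive months
    ("Major", "single"):         0.07,
    ("Major", "repetitive"):     0.14,
}

def severity_credit(severity, recurrence, at_risk_amount):
    pct = CREDIT_TABLE.get((severity, recurrence))
    return None if pct is None else pct * at_risk_amount

print(severity_credit("Essential", "repetitive", 100_000))  # 24000.0
```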
Differences Between the Two Methods
As seen from the above example:
Using Severity Levels to calculate Service Level Credits
is different from using Weighting Factors
Critical difference - no need to develop Weighting
Factors for each Measured Service Element
Rather, the parties simply establish Severity Levels for
use with the At-Risk Amount for each Measured Service
Element
Termination Service Level Failures
Irrespective of the measurement method used,
there must be a level of failure that permits the
customer to terminate the Master Services
Agreement
Can be based on the severity of a single event
(e.g., an Essential Severity Level falls below a very low minimum)
Can be based on repetitive failures
(e.g., a Major Severity Level fails for 3 consecutive months or 3 out of any 6 rolling months)
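A sketch of how such repetitive-failure triggers might be checked against a month-by-month failure history (thresholds and names are illustrative):

```python
def termination_right(monthly_failures, consecutive=3, window=6, in_window=3):
    """monthly_failures: list of booleans, True = Service Level Failure that month."""
    run = 0
    for failed in monthly_failures:
        run = run + 1 if failed else 0
        if run >= consecutive:          # e.g. 3 consecutive monthly failures
            return True
    for i in range(len(monthly_failures) - window + 1):
        if sum(monthly_failures[i:i + window]) >= in_window:  # e.g. 3 of any 6 rolling months
            return True
    return False

print(termination_right([True, False, True, False, False, True]))  # True (3 of 6)
```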
Call Centre BPOs
Need to establish Measured Service Levels
Will be different from HR and other BPO Measured Service Levels due to the nature of the business function
There are both Objective measurements and
Subjective measurements
Call Centre Objective Measurements
Sample objective measurements
Number of rings to pick-up of call
Wait time on hold
Percentage of abandoned calls
Percentage of problems resolved by first call
Degree of escalation to 2nd and 3rd level support
Other? Suggestions?
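Two of the percentage measurements above reduce to simple ratios; an illustrative sketch (the field names are hypothetical):

```python
def abandoned_call_pct(calls_offered, calls_abandoned):
    return 100.0 * calls_abandoned / calls_offered

def first_call_resolution_pct(problems_reported, resolved_on_first_call):
    return 100.0 * resolved_on_first_call / problems_reported

print(abandoned_call_pct(12_000, 540))           # 4.5
print(first_call_resolution_pct(8_000, 6_200))   # 77.5
```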
Deceptive Solutions
Need to identify deceptive solutions proposed
by service provider such as:
Additional staff
Additional lines
Additional infrastructure
They are not viable solutions, since all they really do is add cost for the customer
That said, a change in scope (e.g., an added country) may create a genuine need for one of these changes
Call Centre Subjective Measurements
Satisfaction Surveys
To ensure that survey results are not ambiguous or skewed, the customer must be very careful about the variables of the survey methodology
Variables to consider include:
Survey frequency
Survey sample size
Nature and scope of survey questions (the right questions)
Types of employees surveyed (exempt* vs. non-exempt)
Geographical factors in survey questions (cultural & language)
(* exempt from overtime)
Key Recommendations for Surveys
Hire a qualified and experienced survey designer to work with in-house staff to develop surveys
Possibly put out an RFP to determine the experience of survey designers in the customer's industry (or type of survey)
Service Level Reporting
The nature and frequency of the reports are critical to ensure sufficient information to permit calculation of Service Level Credits
Beware of a provision that the customer must report Service Level Failures
Usually, the customer does not have the relevant
information
Obligation should be on the service provider to
provide the data to determine Service Level
Credits
Sample Service Level Report Table
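As an illustration only (not the deck's actual sample table), a monthly report row would need at least the fields required to verify a credit, for example:

```python
def report_row(service_element, target, actual, weighting_factor, at_risk_amount):
    # Hypothetical row; the monthly total of credits would still be subject
    # to the Weighting Factor Limit discussed earlier.
    met = actual >= target
    credit = 0.0 if met else weighting_factor * at_risk_amount
    return {"Measured Service Element": service_element, "Target": target,
            "Actual": actual, "Met": met, "Service Level Credit": credit}

print(report_row("Payroll Administration System", 99.85, 99.60, .50, 100_000))
```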
Background of the Balanced Scorecard
Conceived by Robert Kaplan and David Norton
Published in the Harvard Business Review in 1992
Entitled "The Balanced Scorecard - Measures That Drive Performance"
Urged companies to complement financial
measures with operational measures of customer
satisfaction, internal processes and innovation
and improvement activities
Now used in BPO transactions in conjunction with
performance measurements discussed previously
Use of the Balanced Scorecard
In the outsourcing context, the balanced scorecard is
simply a method of measuring improvements achieved by
the service provider using non-traditional, non-financial
measurements
The balanced scorecard measures improvements in performance by comparing the performance of agreed-upon measurable parameters (Metrics), selected by the customer for inclusion in the balanced scorecard, in one period against the performance of the same Metrics in previous periods
Development of Metrics
The Metrics in any particular outsourcing
agreement will invariably vary depending on the
intent of the parties
However, for the analysis in this presentation,
only the three Metrics suggested by Kaplan and
Norton are used
This is sufficient because the approach used in
developing a Balanced Scorecard Model is
essentially the same irrespective of the number
and types of Metrics actually used in a particular
transaction
Success Categories
For this presentation, the three non-financial
elements of the Balanced Scorecard Model are
treated as distinct success categories to be
measured (Success Categories) so that the
customer may prioritize them
One of the most common methods of doing this
is for the customer to apply Weighting Factors to
each of the Success Categories as illustrated on
the next slide
Weighting and Success Categories
Weighting Factors are used in the same way as with the more traditional measurements discussed above
Success Category                      | Weighting Factor
Customer Satisfaction                 | .50
Internal Processes                    | .20
Innovation and Improvement Activities | .30
Total                                 | 1.0
Weighting and Success Categories
Because there are often diverse objectives for the
three Success Categories, the next step is to
identify the specific business objectives
(Operational Measures) for each Success
Category
Success Category                      | Operational Measures
Customer Satisfaction                 | Customer Retention
                                      | Employee Satisfaction
Internal Processes                    | Management Effectiveness
                                      | Improved Functionality of Supported Application Software
Innovation and Improvement Activities | New Functionality of Supported Application Software
Metric Categories
Because there are different Metrics for the Operational
Measures, the next step is to break down the Operational
Measures into different sub-categories of logical groupings
Then the customer must specify the relative weights to be
applied to those groupings in order to determine how the
results will be built into a total set of measurements for the
Operational Measure
The table below illustrates this assignment of a weighting percentage to each of the measurements that comprise a Metric for a Success Category and its Operational Measures (Metric Categories)
Metric Categories
Success Category                      | Operational Measures                                      | Metric Category
Customer Satisfaction                 | Customer Retention                                        | Semi-Annual Survey (60%)
                                      |                                                           | Customer Loss Reviews (40%)
                                      | Employee Satisfaction                                     | Annual Survey (75%)
                                      |                                                           | Exit Interview Information (25%)
Internal Processes                    | Management Effectiveness                                  | Performance of Supported Application Software (50%)
                                      |                                                           | On-time Delivery of Supported Application Software Percentage (25%)
                                      |                                                           | Application Software Error Correction Targets Met (25%)
                                      | Improved Functionality of Supported Application Software | Timeliness (25%)
                                      |                                                           | Quality (75%)
Innovation and Improvement Activities | New Functionality of Supported Application Software      | Timeliness (25%)
                                      |                                                           | Quality (75%)
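A minimal sketch of how such a roll-up might be computed, collapsing the Operational Measure level for brevity (all weights and scores below are illustrative only):

```python
SUCCESS_CATEGORY_WEIGHTS = {
    "Customer Satisfaction": .50,
    "Internal Processes": .20,
    "Innovation and Improvement Activities": .30,
}

def composite_score(scores):
    """scores: {success_category: [(metric_weight, metric_score_0_to_100), ...]}"""
    total = 0.0
    for category, metrics in scores.items():
        category_score = sum(w * s for w, s in metrics)
        total += SUCCESS_CATEGORY_WEIGHTS[category] * category_score
    return total

example = {
    "Customer Satisfaction": [(.60, 82), (.40, 75)],          # e.g. survey, loss reviews
    "Internal Processes": [(.50, 90), (.25, 70), (.25, 80)],
    "Innovation and Improvement Activities": [(.25, 65), (.75, 85)],
}
print(composite_score(example))  # ~80.1 out of 100
```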

Implementation of the Balanced Scorecard
To implement the Balanced Scorecard, the
customer would first supply the service provider with comprehensive information about all the functions to be included in the Services
The parties would then agree upon which
business functions of the customer will be
affected by the Services and thus subject to the
Balanced Scorecard measurements.
Implementation of the Balanced Scorecard
The parties would then agree upon a time frame
to finalize:
The Success Categories
The relative Weighting Factors for the Success
Categories
The Operational Measures
The Metric Categories
The methodology for measuring the Metrics
Finally, the parties would agree upon the
frequency of the measurements to establish a
regular period of measurement (Measurement
Period)
Baseline Results & Scorecard Targets
Once these items are agreed upon, the service provider would be obligated to perform an initial performance measurement on the affected business functions using the agreed-upon Balanced Scorecard, the results of which would constitute the baseline results (Baseline Results)
Once the Baseline Results have been analyzed, the parties
would set targets for improvements across the various
Metric Categories (Scorecard Targets) that the service
provider would be expected to achieve, both over the entire
term of the agreement and year over year during the term
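A small illustrative check of a period's result against a Scorecard Target expressed as improvement over the Baseline Result (the numbers are hypothetical):

```python
def improvement_over_baseline(baseline, current):
    return 100.0 * (current - baseline) / baseline

def scorecard_target_met(baseline, current, target_improvement_pct):
    return improvement_over_baseline(baseline, current) >= target_improvement_pct

print(scorecard_target_met(baseline=72.0, current=78.5, target_improvement_pct=5.0))  # True (~9% improvement)
```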
Remedies
As with the traditional SLA, the final step would
be to develop specific credits to be paid by the
service provider to the customer for the failure of
the service provider to achieve the agreed-upon
Scorecard Targets
As with SLAs, these will vary from transaction to
transaction
As a rule, the Balanced Scorecard would also deal
with payments, termination rights and critical
failures, all of which would need to be customized
to align with the approach of the Balanced
Scorecard
Cautions in Using the Balanced Scorecard
Excessive Scorecard Targets
Avoid a large number of Scorecard Targets
There is a danger that using too many may result in the
customer losing sight of the critical Scorecard Targets
A better approach is to establish only those Scorecard Targets that are truly essential to the business operations and to construct the selected ones carefully
Cautions in Using the Balanced Scorecard
Irrelevant Scorecard Targets
The customer should also bear in mind that constructing irrelevant or inaccurate Scorecard Targets may result in credits being paid when the service provider fails to meet them, but will not result in key business requirements actually being addressed
Cautions in Using the Balanced Scorecard
Any organization contemplating using the
Balanced Scorecard in an outsourcing should
keep in mind Kaplan and Norton's warning that:
"Even an excellent set of balanced scorecard measures does not guarantee a winning strategy. The balanced scorecard can only translate a company's strategy into specific measurable objectives."
Other SLA Topics of Interest
Ramp-Up of Service Levels
Annual changes to Service Levels (Annual Planning Process)
Ad-Hoc changes to Service Levels (Change Order Process)
Improvements to Service Levels (Annual Planning Process)
Bonuses for Achievement above Service Levels
If applicable, a portion of the amount should be re-invested by the service provider in some agreed-upon manner to improve the Services
Relief from Service Levels for customer-caused failures
Force Majeure and Disaster Recovery Service Levels (Not
suspension of Service Levels)
Service Level Agreements
Adam D. Vereshack
Barrister & Solicitor
adam@adam-vereshack.com