

Service quality dimensions of hybrid services
Shirshendu Ganguli and Sanjit Kumar Roy
Marketing Department, IBS Hyderabad, IFHE University, Hyderabad, India

Managing Service Quality, Vol. 20 No. 5, 2010, pp. 404-424. Emerald Group Publishing Limited, 0960-4529. DOI 10.1108/09604521011073713

Abstract
Purpose – This paper aims to identify the dimensions of service quality in the case of hybrid services.
Design/methodology/approach – The service quality dimensions are identified using an
exploratory factor analysis (EFA). Next the reliability and validity of the factors are established
through confirmatory factor analysis (CFA) using AMOS.
Findings – The paper identifies nine service quality dimensions in the hybrid services – customer
service, staff competence, reputation, price, tangibles, ease of subscription, technology security and
information quality, technology convenience, and technology usage easiness and reliability.
Practical implications – The various dimensions of service quality should be viewed as levers for improving perceived service quality in the minds of a firm's current customers. Identifying the service quality dimensions in hybrid contexts can offer service providers valuable insights into which aspects of the service to focus on in order to improve customer satisfaction, loyalty, and commitment to the firm.
Originality/value – This paper introduces the concept of hybrid services, wherein a mix of
technology and human interaction is used to produce and deliver services. Furthermore, since hybrid
services have received little attention in the literature, the study addresses this gap by identifying a set
of dimensions that are relevant for measuring service quality in hybrid contexts.
Keywords Factor analysis, Customer services quality
Paper type Research paper

Introduction
In the present business scenario, information technology (IT) is extensively used in delivering services to consumers. Human-human interactions are increasingly being replaced by human-technology interactions. The rise of information technologies, and the internet in particular, has made human-human interactions in service delivery redundant for many services (Bitner et al., 2000; Li et al., 2003). However, not all types of services are affected uniformly. In technology-based services like e-retail and online gaming, human-human interactions have been completely replaced by human-technology interactions. On the other hand, there are still conventional services like restaurants and barber shops which continue to rely on human-human interactions in order to deliver their services. However, a new category of services has arisen and grown, which can be termed hybrid services. The distinguishing characteristic of this service category is that customers' interactions with a firm are a mix of human and technology interactions. Information technology tools are utilized to increase the efficiency and effectiveness of service delivery (Marshall, 2006), but these services are still not totally devoid of human interactions (Aldrich, 2000). Examples of such services include telecommunication, banking, insurance, air travel, public transportation and utilities.
A significant body of research has focused on the measurement of service quality in conventional services (Bolton and Drew, 1991; Parasuraman et al., 1985; Parasuraman et al., 1988).
SERVQUAL (Parasuraman et al., 1988) and SERVPERF (Cronin and Taylor, 1992) are examples of scales that have been developed to measure service quality in such contexts. However, further research has identified additional dimensions of service quality besides those extracted in SERVQUAL and SERVPERF (Brady and Cronin, 2001; Levesque and McDougall, 1996). With the rise of technology-enabled services, research has also captured the technology-related dimensions of service quality (Collier and Bienstock, 2006; Parasuraman et al., 2005). E-S-QUAL is an example of one such scale (Parasuraman et al., 2005). In this context too, further research has shown that additional dimensions of service quality exist (Collier and Bienstock, 2006; Joseph and Stone, 2003). Also, self-service technology (SST) and call centers are increasingly being used for service delivery and hence affect customers' perceptions of overall service quality (Curran and Meuter, 2005; Dean, 2002).
As little is known about the service quality dimensions of hybrid services and
appropriate measures of such dimensions, we need to empirically identify these
dimensions. A mere combination of SERVQUAL (or SERVPERF), E-S-QUAL, or any similar scale is not sufficient to judge the service quality of hybrid services. This is because the extant academic literature on service quality has shown that perceived service quality has more dimensions than those identified by the scales mentioned above. Moreover, the same SERVQUAL and E-S-QUAL items, when used in a different service context, yield factors that differ from the original. Thus we incorporate a comprehensive list of measurement items (drawn from the literature) relating to conventional and technology-enabled services and use them to provide empirical evidence of hybrid service quality dimensions. Hence, this research makes a significant contribution to the services literature by identifying the service quality dimensions (from the consumers' perspective) in the case of hybrid services. Our research
findings can offer useful and practical guidelines to the managers of hybrid services
because effective management of services requires an in-depth understanding of
customers’ mental representations of consumption experiences (Oliver, 1993).
The remainder of the paper is organized as follows: first, we review the literature on
the conceptualization and measurement of service quality dimensions in conventional
and technology-enabled services. Next, the article discusses the research methodology
used in the paper as well as the survey instrument used in data collection. Finally, we
present the results of our analysis followed by a discussion of the findings.

Literature review
Service quality concept
Service quality has been conceptualized as an overall assessment of service by the
customers. It is a key decision criterion in service evaluation by the customers (Lewis
and Booms, 1983). Perceived service quality is believed to result from a comparison between customers' prior expectations about the service and their perceptions after the actual experience (Asubonteng et al., 1996). Besides service outcomes,
service quality perceptions also involve evaluation of the service delivery process
(Parasuraman et al., 1985). Hence, conceptualization of service quality ought to include
both the process as well as the service outcomes (Lehtinen and Lehtinen, 1991). In fact,
Lehtinen and Lehtinen (1991) offered a comprehensive model with three dimensions of
service quality: physical, interactive and corporate. Physical quality concerns the quality of the physical products involved in service delivery and consumption. The interactive dimension refers to the interaction between customers and the service organization's employees. Corporate quality is the customer-perceived corporate image. A firm's ability to serve customer needs as well as to maintain its competitive advantage also affects customer perceptions of service quality (Yoo and Park, 2007).

Service quality in conventional services


In conventional services, human interactions (interactions between customers and
service firm employees) during delivery and consumption of a service are the major
elements for measuring service quality. SERVQUAL (Parasuraman et al., 1988) is
perhaps the most widely-known and researched scale of service quality. It focuses on
human interactions during the service encounter. It consists of five dimensions:
reliability, tangibility, responsiveness, assurance and empathy. However, Cronin and
Taylor (1992) criticized SERVQUAL and proposed an alternative scale called
SERVPERF. It includes all the SERVQUAL scale dimensions, but uses only service
performance (perception) as a measure of customer perceived service quality instead of
the gap (between expectation and perception) approach of SERVQUAL. Further research has been carried out with SERVQUAL, either modifying the dimensions or adding new dimensions to the original five in order to accommodate the uniqueness of different types of service settings (Asubonteng et al., 1996; Babakus and
Boller, 1992; Buttle, 1996; Carman, 1990; Lai et al., 2007).
In the context of conventional services, most of the studies have focused on
human-human interactions in measuring service quality. Service quality dimensions
obtained for retail banking were core quality, relational quality and tangibles
(Levesque and McDougall, 1996). Caruana et al. (2000) identified reliability, assurance
and responsiveness as factors of service quality for audit firms. In the context of retail
stores, physical aspects, reliability, personal interaction, problem solving, policy,
convenience, product quality and selection emerged as dimensions of service quality
(Burke, 2002; Dabholkar et al., 1996). In the case of hotels and restaurants, service quality
dimensions included perceived authenticity in the interaction, mutual understanding,
provision of extra attention or personal services, meeting customer expectations,
service provider competence, service settings, recovery from failure, price and
performance perception (Gilbert et al., 2004; Hoffman et al., 1995; Matilla and Enz, 2002;
Voss et al., 1998). Fodness and Murray (2007) found that the customers use the
dimensions of effectiveness, efficiency, productivity, décor, maintenance and
interaction for judging service quality of airport services. Brady and Cronin (2001)
conducted a multi-industry study and concluded that service quality consists of the dimensions of outcome (waiting time and tangibles), employee interactions and
environmental quality (ambient and social conditions and facility design).
Elements of a firm's marketing mix have even been used as proxies for service quality. Customers' perceptions of price, corporate image and the firm's promotional activities
have been used as indicators of service quality (Andreassen and Lindestad, 1998;
Grönroos, 1984; Kirmani and Wright, 1989; Moorthy and Hawkins, 2005; Rotfeld and
Rotzoll, 1976). Studies have also focused on specific aspects of service delivery such as
environment. Bitner (1992) identified three dimensions of the physical environment (termed the servicescape): ambient conditions; spatial layout and functionality; and signs, symbols and artifacts. Researchers have also identified and measured certain factors, such as delays in service delivery (e.g. flight delays), that affect customers' perceptions of service quality (Taylor and Claxton, 1994). So, in the conventional service context, measurements of service quality have focused primarily on the interactions of consumers with firm employees (human-human encounters), besides using some marketing mix variables.

Service quality in technology-enabled services


In the case of technology-enabled services, the conventional methods of measuring service quality were no longer adequate. As a result, research has identified new dimensions of
service quality, such as automated search, communication among customers,
information acquisition, content, mass customization, and ease of use (Bailey and
Pearson, 1983; Doll and Torkzadeh, 1988; Peterson et al., 1997). Consumers’ perception
of technology related service quality is also affected by their willingness to use and
adapt to the new technologies. Thus, unique scales such as Technology Anxiety
(Meuter et al., 2003) and Technology Readiness Index (Parasuraman, 2000) are being
used for the measurement of service quality in technology-enabled services. Also
Parasuraman et al. (2005) developed E-S-QUAL, which is a multi-item scale for
assessment of electronic service quality. The four dimensions of E-S-QUAL are
efficiency, fulfillment, system availability and privacy. As service recovery is an
important aspect affecting service quality perception of customers, Parasuraman et al.
(2005) also developed a scale for electronic service recovery quality (E-RecS-QUAL),
which consists of three dimensions - responsiveness, compensation and contact.
Van Riel et al. (2001) identified user interface, core service and supplementary
services as the crucial dimensions of e-service quality in the case of internet-enabled
businesses. Collier and Bienstock (2006) found that e-service quality consists of the dimensions of process, outcome and recovery quality. Other dimensions of e-quality
are web site appearance, ease of use, linkage, layout and content, reliability, efficiency,
support, communication, security, incentives, performance, feature, storage capability,
serviceability, trust, responsiveness, customization, web store policies, reputation,
assurance and empathy (Madu and Madu, 2002; Santos, 2003). In the case of electronic banking, researchers have identified accuracy, feedback/complaint management, efficiency,
queue management, accessibility, customization, customer service, secure and flexible
service, ease and convenience, quality of ATM, telephone and internet banking,
product portfolio and price as important dimensions of service quality (Al-Hawari et al.,
2005; Joseph et al., 1999; Joseph and Stone, 2003; Yang et al., 2004). In the
telecommunications sector the quality dimensions identified by researchers are
complaint handling, reliability, assurance, network quality, pricing, customer service,
product performance, branch network, billing and corporate image (Athanassopoulos
and Iliakopoulos, 2003; Aydin and Ozer, 2005; Kim et al., 2004; Sharma and Ojha, 2004;
Wang and Lo, 2002; Woo and Fock, 1999).
Other important research areas related to technology-enabled services are self-service technology (SST) and call centers (customer service). Depending on the technology interface, SSTs have been categorized into telephone, internet, interactive kiosk (e.g. ATM) and video/CD types (Meuter et al., 2000). Consumer
perceptions of service quality vary depending on the specific type of self-service used
(Curran and Meuter, 2005). Service quality dimensions identified for SST based
services are ease of use, fun, performance, solving an intense need, saving time and money, avoiding service persons, as well as technology anxiety (Dabholkar and Bagozzi, 2002; Meuter et al., 2000; Meuter et al., 2003). For call centers, dimensions used to judge quality are adaptiveness, assurance, offering of explanations, empathy, authority, educating customers and personalization (Burgers et al., 2000; Rafaeli et al., 2008). Besides that, customer feedback, customer focus and time taken to respond have also been used to measure service quality of call centers (Danaher and Gallagher, 1997; Dean, 2002, 2004). So, for technology-enabled services, measures of service quality focus mainly on consumer interactions with the technology, although in certain situations, such as customer service (call centers), human-based interactions are also considered.

Objectives of the study


The literature review in the previous section clearly highlights that a large body of research exists on the service quality dimensions of conventional as well as technology-enabled services. However, not much attention has been paid to the dimensions of hybrid services. Hence, the objectives of this study are twofold:
(1) To identify the dimensions of service quality of hybrid services.
(2) To confirm the factor structure of the same using confirmatory factor analysis.

Methodology
Measurement instrument
The survey instrument was developed based on literature review. As shown in Table I,
the variables included in the study have been adapted from the existing literature. As
no standard scale is available which includes items of service quality in both
technology-human and human-human interactions, the items used were adopted from
different studies. The measurement instrument consists of two sections:
(1) 18 items related to technology-enabled service quality (including SST related
items); and
(2) 38 items related to conventional service quality (including customer service or call center items).

The statements in the questionnaire were refined based on the hybrid service
(Banking) chosen for this study.

Sampling and data collection


The banking industry was chosen for data collection as it is an ideal example of a hybrid service, with both human (branch, call center) and technology (ATM, phone banking, online) interactions present. Data were collected using self-administered questionnaires from existing customers of banks in three Indian cities, namely Hyderabad, Kolkata and Delhi. Respondents were selected randomly from the list of employees of an educational institution having centers in the above-mentioned cities. Although the respondents were employees of the same institution, they were customers of different banks. Given the characteristics of the three locations, the generalisability of our findings is significantly enhanced. In all, 1,200 questionnaires were sent, out of which 950 were returned. Of the 950 responses, 750 were usable, resulting in a 62.5 percent response rate, which is sufficient for a survey of this type.
Table I. Measurement instrument

Constructs: Service quality items (technology)
x1. The technology provided by my bank is easy to use (Chen and Hitt, 2002; Dabholkar and Bagozzi, 2002; Doll and Torkzadeh, 1988; Joseph et al., 1999; Meuter et al., 2000)
x2. The technology provided by my bank is user-friendly (Doll and
Torkzadeh, 1988; Parasuraman et al., 2005; Van Riel et al., 2001)
x3. The technology provided by my bank works accurately and is error-free
(Doll and Torkzadeh, 1988; Joseph et al., 1999; Van Riel et al., 2001; Yang
et al. 2004)
x4. My bank’s technology is reliable (Dabholkar and Bagozzi, 2002; Joseph
et al., 1999; Yang et al., 2004)
x5. My bank’s technology never fails (Meuter et al., 2000; Parasuraman, 2000)
x6. I feel safe using my bank’s technology (Parasuraman, 2000; Yang et al.,
2004)
x7. I feel the risk associated with my bank’s technology is low (Yang et al.,
2004)
x8. My personal information exchanged while using technology is not
misused by my bank (Kim and Lim, 2001; Parasuraman et al., 2005; Yang
et al., 2004)
x9. My bank’s technology is personalized (Bitner et al., 2000; Chen and Hitt,
2002; Joseph et al., 1999)
x10. My bank’s technology recognizes me by name (Bitner et al., 2000; Joseph
et al., 1999)
x11. My bank’s technology provides the precise information I need (Doll and
Torkzadeh, 1988; Kim and Lim, 2001; Van Riel et al., 2001)
x12. My bank’s technology provides sufficient information (Doll and
Torkzadeh, 1988; Kim and Lim, 2001; Van Riel et al., 2001)
x13. My bank’s technology provides the reports I need (Doll and Torkzadeh,
1988)
x14. My bank’s technology is accessible beyond regular business hours
(Joseph et al., 1999; Meuter et al., 2000; Parasuraman, 2000; Parasuraman
et al., 2005)
x15. My bank’s technology gives me more freedom of mobility (Meuter et al.,
2000; Parasuraman, 2000; Yang and Fang, 2004)
x16. I find it more convenient to use technology than interacting with branch
employees (Meuter et al., 2000)
x17. My bank’s technology allows me to complete transactions quickly ( Joseph
et al., 1999; Kim and Lim, 2001; Parasuraman et al., 2005)
x18. My bank’s technology saves me a lot of time, especially when I am pressed
for time (Meuter et al., 2000)
Constructs: Service quality items (traditional)
x19. It is easy to open a new bank account with my bank (Athanassopoulos and Iliakopoulos, 2003; Lai et al., 2007; Olorunniwo and Hsu, 2006)
x20. It is convenient and hassle-free to open a new bank account with my bank
(Athanassopoulos and Iliakopoulos, 2003; Lai et al., 2007; Olorunniwo and
Hsu, 2006)
x21. My bank employees are neat in appearance (Caruana, 2002; Cronin et al.,
2000; Host and Andersen, 2004; Levesque and McDougall, 1996;
Parasuraman et al., 1988)
x22. My bank’s physical facilities are visually appealing (Caruana, 2002;
Dabholkar et al., 1996; Host and Andersen, 2004; Johnson and Sirikit, 2002;
Parasuraman et al., 1988)
x23. My bank’s printed materials (e.g. brochures) are visually appealing
(Caruana, 2002; Host and Andersen, 2004; Levesque and McDougall, 1996;
Parasuraman et al., 1988)
x24. My bank has a good reputation (Andreassen and Lindestad, 1998;
Athanassopoulos and Iliakopoulos, 2003; Aydin and Ozer, 2005; Grönroos,
1990; Lehtinen and Lehtinen, 1991)
x25. My bank’s promotional campaigns are effective in building a positive
reputation (Aydin and Ozer, 2005; Crosby, 1991; Ndubisi and Wah, 2005;
Rotfeld and Rotzoll, 1976)
x26. My bank clearly explains its service charges (Al-Hawari et al., 2005)
x27. The fees that my bank charges are acceptable and reasonable (Al-Hawari
et al., 2005; Voss et al., 1998)
x28. My bank fees are competitive (Al-Hawari et al., 2005; Voss et al., 1998;
Levesque and McDougall, 1996)
x29. My bank offers a wide range of services (Al-Hawari et al., 2005; Bitner
et al., 2000; Yang et al., 2004)
x30. Within each basic service, my bank provides a variety of options
(Al-Hawari et al., 2005; Bitner et al., 2000; Yang et al., 2004)
x31. My bank fulfills its promises (Dabholkar et al., 1996; Parasuraman et al.,
1988; Wang and Lo, 2002)
x32. My bank performs all services right, the first time (Caruana, 2002; Host
and Andersen, 2004; Sureshchandar et al., 2002; Yang et al., 2004)
x33. My bank performs its services reliably, consistently and dependably
(Caruana, 2002; Cronin et al., 2000)
x34. My bank employees are trustworthy (Caruana, 2002; Cronin et al., 2000)
x35. My bank employees are competent (Brady and Cronin, 2001; Cronin et al.,
2000; Host and Andersen, 2004; Madu and Madu, 2002; Sureshchandar
et al., 2002; Yang et al., 2004)
x36. My bank employees are easily approachable (Cronin et al., 2000; Danaher
and Gallagher, 1997)
x37. My bank employees are courteous, polite and respectful (Caruana, 2002;
Cronin et al., 2000; Danaher and Gallagher, 1997; Host and Andersen,
2004)
x38. My bank employees are willing to help customers (Brady and Cronin,
2001; Caruana, 2002; Cronin et al., 2000; Host and Andersen, 2004; Madu
and Madu, 2002; Woo and Fock, 1999)
x39. My bank employees are pleasant and friendly (Brady and Cronin, 2001;
Caruana, 2002; Danaher and Gallagher, 1997)
x40. My bank employees are caring (Brady and Cronin, 2001; Caruana, 2002;
Danaher and Gallagher, 1997)
x41. My bank understands my specific needs (Brady and Cronin, 2001;
Caruana, 2002; Host and Andersen, 2004; Levesque and McDougall, 1996;
Parasuraman et al., 1988)
x42. My bank pays personal attention to me (Caruana, 2002; Dabholkar et al.,
1996; Johnson and Sirikit, 2002; Levesque and McDougall, 1996;
Parasuraman et al., 1988; Sureshchandar et al., 2002)
x43. My bank offers its services promptly with very little waiting time (Brady
and Cronin, 2001; Dabholkar et al., 1996; Danaher and Gallagher, 1997;
Host and Andersen, 2004; Olorunniwo and Hsu, 2006; Parasuraman et al.,
1988; Sureshchandar et al., 2002; Yang et al., 2004)
x44. My bank branches and ATMs are sufficiently available in many locations
(Aydin and Ozer, 2005; Lai et al., 2007; Sureshchandar et al., 2002; Woo
and Fock, 1999)
x45. My bank’s operating hours are convenient for me (Athanassopoulos and
Iliakopoulos, 2003; Johnson and Sirikit, 2002; Wang and Lo, 2002; Woo
and Fock, 1999)
x46. My bank’s statements and other documents are accurate (Lai et al., 2007;
Levesque and McDougall, 1996; Sharma et al., 1999; Woo and Fock, 1999)
x47. My bank’s statements and other documents are easy to understand (Lai
et al., 2007; Levesque and McDougall, 1996; Sharma et al., 1999; Woo and
Fock, 1999)
x48. When I contact my bank’s customer service (call centre), my requests are
always anticipated properly (Burgers et al., 2000; Rafaeli et al., 2008)
x49. When I contact my bank’s customer service (call centre), I am offered
proper explanations (Burgers et al., 2000; Dean, 2004; Rafaeli et al., 2008)
x50. When I contact my bank’s customer service (call centre), the
representatives are supportive (Dean, 2004; Rafaeli et al., 2008)
x51. When I contact my bank’s customer service (call centre), the
representatives offer personalized information (Burgers et al., 2000;
Rafaeli et al., 2008)
x52. When I contact my bank’s customer service (call centre), my calls are
always answered promptly (Danaher and Gallagher, 1997; Dean, 2004)
x53. When there are problems, my bank is sympathetic and reassuring (Bitner,
1990; Gilbert et al., 2004; Johnson and Sirikit, 2002; Lai et al., 2007)
x54. My bank employees are knowledgeable enough to resolve the problems
(Bitner, 1990; Yang et al., 2004)
x55. My bank resolves my complaints quickly (Athanassopoulos and
Iliakopoulos, 2003; Joseph et al., 1999; Parasuraman et al., 2005; Sharma
et al., 1999; Yang et al., 2004)
x56. My bank offers a fair compensation for its mistakes (Bitner, 1990;
Dabholkar et al., 1996; Parasuraman et al., 2005)

Respondents were asked to state their level of agreement with the series of statements listed in Table I using a seven-point Likert scale ranging from “strongly
disagree” to “strongly agree.” The detailed sample characteristics are shown in
Table II.

Data analysis and results


Data analysis proceeded in two steps. First, exploratory factor analysis was used to identify the underlying dimensions of service quality for hybrid services. For this, the sample was split into two approximately equal sub-samples: sample 1 (n = 380) and sample 2 (n = 370). This was done by randomly selecting approximately 50 percent of the cases using the filtering algorithm in SPSS. Next, an exploratory factor analysis was performed on the 56 items of the measurement scale using principal component analysis with varimax rotation. An orthogonal rotation was chosen for the sake of simplicity (Nunnally and Bernstein, 1994). Then, confirmatory factor analysis was used to confirm the factor structure of the service quality dimensions. The descriptive statistics of the 56 service quality items are shown in Table III.
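For readers who want to reproduce this split-and-extract step outside SPSS, a minimal sketch in Python is given below. It assumes the 750 usable responses are stored in a file named hybrid_sq.csv containing only the item columns x1-x56; the file name, the random seed and the use of the open-source factor_analyzer package are illustrative assumptions rather than the authors' actual procedure.

```python
# A minimal sketch of the split-sample EFA step described above (assumptions:
# the 750 usable responses sit in "hybrid_sq.csv" with only the item columns
# x1..x56; the seed 42 is arbitrary; the study itself used SPSS).
import pandas as pd
from factor_analyzer import FactorAnalyzer

data = pd.read_csv("hybrid_sq.csv")
sample1 = data.sample(frac=0.5, random_state=42)   # ~50 percent of the cases
sample2 = data.drop(sample1.index)                 # held back for the CFA

# Principal-components extraction with an orthogonal (varimax) rotation;
# nine factors are requested because nine emerged in the study.
fa = FactorAnalyzer(n_factors=9, rotation="varimax", method="principal")
fa.fit(sample1)

eigenvalues, _ = fa.get_eigenvalues()
print("factors with eigenvalue > 1:", int((eigenvalues > 1).sum()))

loadings = pd.DataFrame(fa.loadings_, index=sample1.columns)
print(loadings.round(2))                           # rotated factor loadings
print("cumulative variance explained:", round(fa.get_factor_variance()[2][-1], 2))
```

In a fresh analysis one would first inspect the printed eigenvalues (and the conceptual soundness of the solution, as the paper stresses) before fixing the number of factors.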
Table II. Demographic profiles and usage patterns of respondents

Gender: Male (46.2 percent); Female (53.8 percent)
Age (average 35.5 years): 21 years and less (16.93 percent); >21 to 30 years (34.73 percent); >30 to 40 years (42.07 percent); greater than 40 years (6.27 percent)
Monthly income: Less than $1,000 (85.41 percent); between $1,000-$1,500 (9.96 percent); between $1,501-$2,000 (3.21 percent); greater than $2,000 (1.42 percent)
Period for which respondents are customers of their bank: Less than six months (2.1 percent); between 6-12 months (4.9 percent); more than one and up to three years (39.86 percent); more than three years (53.14 percent)
Frequency of monthly usage: Up to five times (25 percent); more than five and up to ten times (38.54 percent); more than ten and up to 20 times (21.88 percent); more than 20 times (14.58 percent)
Service types used: Salary savings account (98.97 percent); savings account (1.03 percent); home banking through internet (46.55 percent); telephone banking (11.38 percent); stock trading (1.72 percent); auto loan (2.07 percent); home loan (0.34 percent); others, i.e. credit card, ATMs, bill payments (12.76 percent)
Percentage of technology use (respondents' use of banks' technology): Less than 50 percent (6.52 percent); 50-75 percent (25 percent); more than 75 percent and up to 90 percent (40.22 percent); more than 90 percent (28.26 percent)

Exploratory factor analysis


In the first stage, an exploratory factor analysis was performed on sample 1 using the 56 variables related to the service quality of hybrid services. The criteria used for factor extraction were twofold: the eigenvalue should be greater than one, and, more importantly, the factor structure should be meaningful, useful and conceptually sound (Pett et al., 2003). Results of the factor analysis are shown in Table IV.
As can be seen from Table IV, nine factors were extracted, accounting for 73 percent of the total variance explained. In total, 48 items loaded properly on the factors. Three items, namely “promotions”, “geographical presence” and “operation hours”, were removed because they did not load on any of the factors. Also, based on the Cronbach’s alpha criterion, five items were deleted from the original 56 items: “tech-never fail”, “tech-low risk”, “tech-recognize by name”, “prompt service” and “specific needs”. We retained items with factor loadings greater than 0.35 for further analysis. Reliability of the factors was calculated using Cronbach’s alpha. A Cronbach’s alpha value greater than or equal to 0.7 is considered acceptable for a factor to be reliable (Hair et al., 2006). In our case, all the factors had satisfactory values of Cronbach’s alpha; hence the factors are reliable.
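To make the reliability criterion concrete, the short sketch below computes Cronbach's alpha for one extracted factor from its raw item scores. The item grouping (the four "technology usage easiness and reliability" items) is taken from Table IV, while the sample1 data frame and column names are assumptions carried over from the earlier sketch.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Standard formula: (k / (k - 1)) * (1 - sum of item variances / variance of the total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Items loading on "technology usage easiness and reliability" (Table IV);
# sample1 is the data frame from the earlier sketch (an assumption).
tech_ease_rel = sample1[["x1", "x2", "x3", "x4"]]
print(round(cronbach_alpha(tech_ease_rel), 2))   # values >= 0.70 are treated as reliable
```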
On examining the content of the items making up each of the dimensions (factors)
we label the factors as shown in Table IV and provide concise definitions for the
dimensions:
. Customer service: the service provided to customers during problem situations and through call centers.
Table III. Descriptive statistics

Variables    Mean    Variance
x1. Tech-EasyToUse 6.05 0.866
x2. Tech-User-friendly 5.98 0.951
x3. Tech-Accuracy 5.67 1.245
x4. Tech-Reliable 5.81 1.072
x5. Tech-NeverFail 5.19 1.919
x6. Tech-Safe 5.91 0.975
x7. Tech-LowRisk 5.70 1.306
x8. Tech-InfoMisuse 5.81 1.040
x9. Tech-Personalize 5.43 1.492
x10. Tech-RecognizeByName 5.39 1.873
x11. Tech-PreciseInfo 5.62 1.197
x12. Tech-SufficientInfo 5.74 0.967
x13. Tech-ProvideReportNeeded 5.72 0.977
x14. Tech-BeyondBusinessHours 6.03 1.137
x15. Tech-Mobility 5.92 1.149
x16. TechOverEmployee 5.86 1.540
x17. Tech-QuickTransactions 6.04 0.953
x18. Tech-SavesTime 5.96 1.209
x19. EasyToOpenAccount 5.67 1.144
x20. ConvenientToOpenAccount 5.55 1.169
x21. EmployeesNeat 5.78 0.920
x22. PhysicalFacilities 5.61 1.152
x23. PrintedMaterials 5.55 1.162
x24. Reputation 5.93 0.892
x25. Promotions 5.49 1.275
x26. ExplainCharges 4.74 2.665
x27. FeesReasonable 4.61 2.695
x28. FeesCompetitive 5.00 1.844
x29. ServiceRange 5.69 0.968
x30. ServiceOptions 5.48 1.108
x31. PromisesFulfilled 5.54 1.371
x32. RightFirstTime 5.31 1.591
x33. Reliable-Consistent-Dependable 5.57 1.153
x34. Employee-Trustworthy 5.63 1.113
x35. Employee-Competent 5.62 1.220
x36. Employee-Approachable 5.66 1.090
x37. Employee-Courteous 5.75 1.011
x38. Employee-Helping 5.75 1.087
x39. Employee-Friendly 5.68 1.214
x40. Employee-Caring 5.52 1.323
x41. SpecificNeeds 5.37 1.597
x42. PersonalAttention 5.09 2.022
x43. PromptService 5.29 1.701
x44. GeographicalPresence 5.39 3.158
x45. OperationHours 5.07 2.283
x46. DocumentsAccurate 5.86 0.897
x47. DocumentsEasyToUnderstand 5.75 1.116
x48. CS-RequestAnticipated 5.22 1.859
x49. CS-Explanations 5.29 1.751
x50. CS-Supportive 5.33 1.697
x51. CS-PersonalizedInfo 5.29 1.668
x52. CS-PromptAnswer 4.92 2.322
x53. SympatheticProblemSolving 5.10 1.797
x54. Problem-EmployeeKnowledge 5.41 1.468
x55. Problem-QuickResolve 5.19 1.848
x56. Problem-Compensation 4.89 2.057
Table IV. Rotated factor matrix for hybrid service quality

Factors    Measurement items    Factor loadings    Cronbach's alpha

Technology security and Tech-Safe 0.652 0.89


information quality Tech-InfoMisuse 0.677
(TechInfoSecure) Tech-Personalize 0.606
Tech-PreciseInfo 0.802
Tech-SufficientInfo 0.759
Tech-ProvideReportNeeded 0.666
Technology convenience Tech-BeyondBusinessHours 0.696 0.88
(TechConven) Tech-Mobility 0.714
TechOverEmployee 0.756
Tech-QuickTransactions 0.802
Tech-SavesTime 0.767
Technology usage easiness and Tech-EasyToUse 0.832 0.90
reliability (TechEaseRel) Tech-User-friendly 0.844
Tech-Accuracy 0.641
Tech-Reliable 0.728
Customer service (Cust Service) PersonalAttention 0.517 0.95
CS-RequestAnticipated 0.755
CS-Explanations 0.778
CS-Supportive 0.789
CS-PersonalizedInfo 0.739
CS-PromptAnswer 0.700
SympatheticProblemSolving 0.746
Problem-EmployeeKnowledge 0.612
Problem-QuickResolve 0.692
Problem-Compensation 0.524
Staff competence (Staff Compt) Employee-Trust 0.689 0.96
Employee-Competent 0.733
Employee-Approachable 0.808
Employee-Courteous 0.841
Employee-Helping 0.808
Employee-Friendly 0.798
Employee-Caring 0.764
Image or reputation (Reputation) Reputation 0.624 0.92
ServiceRange 0.543
ServiceOptions 0.541
PromisesFulfilled 0.613
RightFirstTime 0.567
Reliable-Consistent-Dependable 0.658
DocumentsAccurate 0.438
DocumentsEasyToUnderstand 0.374
Price ExplainCharges 0.715 0.89
FeesReasonable 0.775
FeesCompetitive 0.772
Tangibles EmployeesNeat 0.520 0.81
PhysicalFacilities 0.797
PrintedMaterials 0.770
Subscription ease (Esubscription) EasyToOpenAccount 0.875 Corr. Coeff. = 0.843 (sig. at 0.01 level)
ConvenientToOpenAccount 0.845
. Staff competence: the expertise and nature of employees.
. Reputation: image of the service provider through different actions and options.
. Price: easy to understand, reasonable and competitive pricing.
. Tangibles: physical facilities, materials and appearance of employees.
. Ease of subscription: convenience and ease of subscribing to a service.
. Technology security and information quality: safety in using technology, proper handling of information and quality of information.
. Technology convenience: convenience of using technology over the employees as well as speed and time of using technology.
. Technology usage easiness and reliability: how reliable and easy to use the technology is.

Confirmatory factor analysis


After identifying nine clear factors through exploratory factor analysis, the next stage
is to confirm the factor structure on sample 2. Structural equation modeling (SEM)
using AMOS 16.0 was used to perform the confirmatory factor analysis. Confirmatory
factor analysis revealed that the measurement items loaded in accordance with the
pattern revealed in the exploratory factor analysis.

Model fit
The measurement model indicated an acceptable fit to the data (χ² = 2946.37, df = 1044, p < 0.001; χ²/df = 2.822 (< 5); CFI = 0.91; TLI = 0.90; IFI = 0.91; NFI = 0.90; PNFI = 0.77; PCFI = 0.83; PRATIO = 0.92; RMSEA = 0.06) (Anderson and Gerbing, 1988). In addition, all the indicators loaded significantly on the latent constructs. The values of the fit indices indicate a reasonable fit of the measurement model to the data (Byrne, 2001, pp. 79-86). In short, the measurement model conforms to the nine-factor structure of the service quality instrument.
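The confirmatory analysis itself was run in AMOS, a graphical package. Purely as an illustration of the same idea, the sketch below specifies part of the nine-factor measurement model in the lavaan-style syntax of the open-source semopy package and requests its fit statistics; the package choice, the file name, the abridged three-factor model and the exact set of indices reported are assumptions, not the authors' procedure.

```python
# Illustrative CFA of part of the nine-factor model in semopy (an open-source
# SEM library); the study itself used AMOS 16.0.
import pandas as pd
import semopy

model_desc = """
TechEaseRel =~ x1 + x2 + x3 + x4
Price       =~ x26 + x27 + x28
Tangibles   =~ x21 + x22 + x23
"""

sample2 = pd.read_csv("hybrid_sq_sample2.csv")   # hold-out sample (assumed file)
model = semopy.Model(model_desc)
model.fit(sample2)

print(semopy.calc_stats(model).T)                # chi-square, CFI, TLI, RMSEA, ...
print(model.inspect(std_est=True))               # standardized loadings, as in Table V
```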

Reliability of the service quality instrument


The Cronbach’s alpha for the service quality instrument was 0.89, which is acceptable and shows that the instrument is reliable. Further evidence of the reliability of the scale is provided in Table V, which shows the composite reliability and average variance extracted scores of the different factors obtained (Fornell and Larcker, 1981; Hair et al., 2006). The composite reliability (CR) of all the latent variables is greater than the acceptable limit of 0.70 (Carmines and Zeller, 1988). The average variance extracted (AVE) for all the factors is greater than or equal to 0.5, which is acceptable (Fornell and Larcker, 1981). This shows the internal consistency of the instrument used in the study.
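Composite reliability and AVE are typically computed from the standardized loadings reported in Table V. The sketch below implements the usual Fornell and Larcker (1981) formulas, using the three Price loadings from Table V as a worked example; because the paper does not show its exact computation, the rounded results are illustrative and need not match the tabled values.

```python
from typing import Sequence

def composite_reliability(loadings: Sequence[float]) -> float:
    # CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of indicator error variances)
    total = sum(loadings)
    error = sum(1.0 - l ** 2 for l in loadings)
    return total ** 2 / (total ** 2 + error)

def average_variance_extracted(loadings: Sequence[float]) -> float:
    # AVE = mean of the squared standardized loadings
    return sum(l ** 2 for l in loadings) / len(loadings)

price = [0.793, 0.934, 0.849]          # standardized loadings of x26-x28 (Table V)
print(round(composite_reliability(price), 2))       # acceptable if > 0.70
print(round(average_variance_extracted(price), 2))  # acceptable if >= 0.50
```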

Construct validity
Construct validity is the extent to which a set of measured variables actually reflects
the latent construct they are designed to measure (Hair et al., 2006). Construct validity
is established in this study by assessing face validity, convergent validity and discriminant validity.
Face validity was established by adopting the measurement items used in the study
from the existing literature and adapting the same to the present research context.
Table V. Confirmatory factor analysis results

Constructs    Measurement items    Standardized estimates    p-value    AVE    CR

Cust Service x56 0.717 *
Cust Service x55 0.832 *
Cust Service x54 0.807 * 0.52 0.91
Cust Service x53 0.839 *
Cust Service x52 0.769 *
Cust Service x51 0.869 *
Cust Service x50 0.865 *
Cust Service x49 0.869 *
Cust Service x48 0.863 *
Cust Service x42 0.729 *
Staff Compt x40 0.874 *
Staff Compt x39 0.911 * 0.75 0.96
Staff Compt x38 0.910 *
Staff Compt x37 0.932 *
Staff Compt x36 0.900 *
Staff Compt x35 0.840 *
Staff Compt x34 0.807 *
Reputation x24 0.692 *
Reputation x29 0.671 * 0.53 0.90
Reputation x30 0.671 *
Reputation x31 0.863 *
Reputation x32 0.858 *
Reputation x33 0.891 *
Reputation x46 0.671 *
Reputation x47 0.695 *
Price x26 0.793 *
Price x27 0.934 * 0.65 0.95
Price x28 0.849 *
Tangibles x21 0.728 *
Tangibles x22 0.829 * 0.57 0.80
Tangibles x23 0.752 *
Esubscription x20 0.951 *
Esubscription x19 0.886 * 0.82 0.90
TechInfoSecure x13 0.804 *
TechInfoSecure x12 0.865 *
TechInfoSecure x11 0.820 *
TechInfoSecure x8 0.636 * 0.50 0.86
TechInfoSecure x6 0.652
TechInfoSecure x9 0.636 *
TechConven x14 0.692 *
TechConven x15 0.751 * 0.56 0.86
TechConven x16 0.709 *
TechConven x17 0.918 *
TechConven x18 0.817 *
TechEaseRel x1 0.932 *
TechEaseRel x2 0.977 * 0.67 0.88
TechEaseRel x3 0.633 *
TechEaseRel x4 0.732 *

Note: * Significant at p < 0.001
Convergent validity was assessed by examining the factor loadings and average variance extracted of the constructs, as suggested by Fornell and Larcker (1981). All the indicators had significant loadings on their respective latent constructs (p < 0.001), with values varying between 0.63 and 0.97 (Table V). In addition, the average variance extracted (AVE) for each construct is greater than or equal to 0.50, which further supports the convergent validity of the constructs.
Fornell and Larcker (1981) state that discriminant validity can be assessed by comparing the average variance extracted (AVE) with the corresponding inter-construct squared correlation estimates. The AVE values of all the service quality factors are greater than the corresponding squared inter-construct correlations, which supports the discriminant validity of the constructs. Thus, the measurement model reflects good construct validity and desirable psychometric properties.
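As a concrete illustration of this criterion, the sketch below compares each construct's AVE with the squared correlation between construct scores for every pair of constructs. Approximating construct scores by item means, as done here, and the names items_by_factor and ave are assumptions for illustration; the paper itself works with the latent correlations from the CFA.

```python
from itertools import combinations
import pandas as pd

def fornell_larcker(data: pd.DataFrame,
                    items_by_factor: dict[str, list[str]],
                    ave: dict[str, float]) -> None:
    # Approximate each construct score by the mean of its retained items.
    scores = pd.DataFrame({f: data[cols].mean(axis=1)
                           for f, cols in items_by_factor.items()})
    corr = scores.corr()
    for a, b in combinations(items_by_factor, 2):
        shared = corr.loc[a, b] ** 2     # squared inter-construct correlation
        ok = ave[a] > shared and ave[b] > shared
        print(f"{a} vs {b}: r^2 = {shared:.2f} -> {'supported' if ok else 'check'}")

# Hypothetical usage with two factors and their AVEs from Table V:
# fornell_larcker(sample2,
#                 {"Price": ["x26", "x27", "x28"], "Tangibles": ["x21", "x22", "x23"]},
#                 {"Price": 0.65, "Tangibles": 0.57})
```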

Discussion and managerial implications


The current research makes an important contribution to the field of services marketing by identifying the service quality dimensions for hybrid services from the customers' perspective. These dimensions will act as guidelines for the managers of hybrid services, as they will help them to understand the particular dimensions that customers consider while evaluating the quality delivered by these service providers. The various dimensions of service quality identified in this study should be viewed as levers for improving a bank's perceived service quality in the minds of its customers. However, the degree of emphasis placed on these dimensions depends on the objectives of the banks.
The service quality dimensions identified in this study clearly show two different sets
– one group (three dimensions) is related to the technology aspects of service quality
(Al-Hawari et al., 2005; Collier and Bienstock, 2006; Joseph et al., 1999; Madu and Madu,
2002; Parasuraman et al., 2005); and the other group (six dimensions) is about
human-human interactions and marketing mix items or the traditional service quality
dimensions (Andreassen and Lindestad, 1998; Bolton and Drew, 1991; Grönroos, 1984;
Kirmani and Wright, 1989; Levesque and McDougall, 1996; Parasuraman et al., 1985,
1988; Rotfeld and Rotzoll, 1976).
Among the conventional service quality related dimensions the first one is labeled
as “customer service” as it consists of the items related to call center quality (Burgers
et al., 2000; Dean, 2004; Rafaeli et al., 2008) and problem solving (Athanassopoulos and
Iliakopoulos, 2003; Bitner, 1990; Dabholkar et al., 1996; Johnson and Sirikit, 2002;
Joseph et al., 1999). This dimension highlights the importance placed by customers on
the service provided to them through call centers and during problem situations. In any
service company the importance of employee expertise is very crucial as they interact
with customers for providing service. So the next dimension is labeled as “staff
competence” which consists of the items related to the nature and competency of the
employees (Brady and Cronin, 2001; Caruana, 2002; Cronin et al., 2000; Host and
Andersen, 2004; Sureshchandar et al., 2002; Woo and Fock, 1999; Yang et al., 2004).
The third quality dimension is named as “reputation” of the service provider, which
is consistent with the existing literature (Andreassen and Lindestad, 1998;
Athanassopoulos and Iliakopoulos, 2003; Aydin and Ozer, 2005; Grönroos, 1990;
Lehtinen and Lehtinen, 1991). The measurement items loading on reputation are about
service options, easy and accurate documents, and promises. Price, a marketing mix
variable, has often been used as a proxy for service quality (Al-Hawari et al., 2005; Voss et al., 1998), and in this research too we found one dimension related to price. Our next dimension is “tangibles”, which is an important service quality dimension (Parasuraman et al., 1988). Physical facilities, employees’ appearance and materials make up tangibles, and they are crucial in offsetting the intangible nature of services (Caruana, 2002; Cronin et al., 2000; Dabholkar et al., 1996; Host and Andersen, 2004; Johnson and Sirikit, 2002). The last conventional service quality related dimension is “ease of subscription”, which is perceived by customers to be an important evaluation parameter. This is consistent with the existing literature (Athanassopoulos and Iliakopoulos, 2003; Lai et al., 2007; Olorunniwo and Hsu, 2006).
Coming to the dimensions related to technology-based transactions, one of the most crucial hurdles to the adoption of technology is customers’ concerns about security as well as the quality and proper handling of information (Doll and Torkzadeh, 1988; Kim and Lim, 2001; Parasuraman, 2000; Parasuraman et al., 2005; Van Riel et al., 2001; Yang et al., 2004), and so the first dimension has been labeled “technology security and information quality”. The second dimension is labeled “technology convenience”, which consists of items showing the usefulness of the technology channel over the other channels (Joseph et al., 1999; Meuter et al., 2000; Parasuraman, 2000; Parasuraman et al., 2005). The last dimension consists of items related to the ease of using the technology and its reliability (Dabholkar and Bagozzi, 2002; Doll and Torkzadeh, 1988; Joseph et al., 1999; Meuter et al., 2000; Parasuraman, 2000; Van Riel et al., 2001), and thus we named it “technology usage easiness and reliability”.
The above discussion on different dimensions obtained in this study highlights that
service quality has some universal aspects as demonstrated by the dimensions (for
example, “tangibles”, and “privacy”) similar to those obtained in SERVQUAL or
E-S-QUAL in their specific service contexts. However, variations exist regarding the
complexity (i.e. the factor structure) of service quality in different service contexts. As
hybrid services constitute an altogether different service context from that of conventional services (as used in SERVQUAL) or that of technology-enabled services (as in
E-S-QUAL), the factor structure (dimensions) obtained in this study makes a
significant contribution to the literature by identifying the dimensions of service
quality in hybrid services.
The current study provides some useful insights for managers of hybrid services.
The service quality dimensions identified in this study consist of conventional service quality factors as well as dimensions related to technology-based interactions. This is useful for the manager of a hybrid service, who can measure the
overall perceptions of service quality on these dimensions to get a broad indication of
the firm’s service quality performance. Second, the positive impact of service quality
dimensions on customer satisfaction and customer loyalty has been highlighted in the
literature (Dagger et al., 2007). Hence, a fair understanding of these dimensions may
help the service managers of hybrid services to see their impact on customer
satisfaction and customer loyalty in due course of time. Third, the findings of this
study can be used as a means to compare the service quality of a bank vis-à-vis its
competitors. Similarly, these dimensions can be used to track the relative performance
of various branches of a multi-branch bank over a period of time. Finally, these
dimensions can also be used to segment the customers based on their perceptions
about service quality. This will help the service managers to design relationship
marketing strategies for different sets of customers with varied requirements.
Limitations and future research directions
Studying service quality in hybrid contexts is conceptually interesting. As this research has considered both human-human and human-technology interactions, it can guide further research in this area, for example by testing whether a multi-tier service quality model is possible in hybrid services. Little is known about how a combination of human-human and human-technology interactions affects the overall outcome of the service encounter.
Therefore, a logical extension of this research is to empirically assess the effect of these
quality dimensions on different customer metrics like satisfaction, loyalty and
word-of-mouth activity of customers in the case of hybrid services. The application of the service quality dimensions identified in this study cannot be generalized, as we have examined only one industry (banking). So, to confirm their applicability in other hybrid services such as telecommunications and insurance, and to build a universal model of service quality dimensions in hybrid services, future research should carry out the same study in various other hybrid service industries.

References
Al-Hawari, M., Hartley, N. and Ward, T. (2005), “Measuring banks’ automated service quality:
a confirmatory factor analysis approach”, Marketing Bulletin, Vol. 16, May, pp. 1-19.
Aldrich, D. (2000), “The new value chain”, Information Strategy: The Executive’s Journal, Vol. 16
No. 3, pp. 39-41.
Anderson, J. and Gerbing, D.W. (1988), “Structural equation modeling in practice: a review and
recommended two-step approach”, Psychological Bulletin, Vol. 103 No. 3, pp. 411-23.
Andreassen, T.W. and Lindestad, B. (1998), “Customer loyalty and complex services: the impact
of corporate image on quality, customer satisfaction and loyalty for customers with
varying degrees of service expertise”, International Journal of Service Industry
Management, Vol. 9 No. 1, pp. 7-23.
Asubonteng, P., McCleary, K.J. and Swan, J.E. (1996), “SERVQUAL revisited: a critical review of
service quality”, The Journal of Services Marketing, Vol. 10 No. 6, pp. 62-81.
Athanassopoulos, A.D. and Iliakopoulos, A. (2003), “Modeling customer satisfaction in
telecommunications: assessing the effects of multiple transaction points on the perceived
overall performance of the provider”, Production and Operations Management, Vol. 12
No. 2, pp. 224-45.
Aydin, S. and Ozer, G. (2005), “National customer satisfaction indices: an implementation in the
Turkish mobile telephone market”, Marketing Intelligence & Planning, Vol. 23 No. 5,
pp. 486-504.
Babakus, E. and Boller, G.W. (1992), “An empirical assessment of the SERVQUAL scale”, Journal
of Business Research, Vol. 24 No. 3, pp. 253-68.
Bailey, J.E. and Pearson, S.W. (1983), “Development of a tool for measuring and analyzing
computer user satisfaction”, Management Science, Vol. 29 No. 5, pp. 530-45.
Bitner, M.J. (1990), “Evaluating service encounters: the effect of physical surroundings and
employee responses”, Journal of Marketing, Vol. 54 No. 2, pp. 69-82.
Bitner, M.J. (1992), “Servicescapes: the impact of physical surroundings on customers and
employees”, Journal of Marketing, Vol. 56 No. 2, pp. 57-71.
Bitner, M.J., Brown, S.W. and Meuter, M.L. (2000), “Technology infusion in service encounters”,
Journal of the Academy of Marketing Science, Vol. 28 No. 1, pp. 138-49.
Bolton, R.N. and Drew, J.H. (1991), “A longitudinal analysis of the impact of service changes on customer attitudes”, Journal of Marketing, Vol. 55 No. 1, pp. 1-9.
Brady, M.K. and Cronin, J.J. Jr (2001), “Some new thoughts on conceptualizing perceived service
quality: a hierarchical approach”, Journal of Marketing, Vol. 65 No. 3, pp. 34-49.
Burgers, A., Ruyter, K.D., Keen, C. and Streukens, S. (2000), “Customer expectation dimensions of
voice-to-voice service encounters: a scale-development study”, International Journal of
Service Industry Management, Vol. 11 No. 2, pp. 142-61.
Burke, R.R. (2002), “Technology and the customer interface: what consumers want in the
physical and virtual store”, Journal of the Academy of Marketing Science, Vol. 30 No. 4,
pp. 411-32.
Buttle, F. (1996), “SERVQUAL: review, critique, research agenda”, European Journal of
Marketing, Vol. 30 No. 1, pp. 8-32.
Byrne, B.M. (2001), Structural Equation Modeling with AMOS: Basic Concepts, Applications and
Programming, Lawrence Erlbaum Associates, Mahwah, NJ.
Carman, J.M. (1990), “Consumer perceptions of service quality: an assessment of the SERVQUAL
dimensions”, Journal of Retailing, Vol. 66 No. 1, pp. 33-55.
Carmines, E.G. and Zeller, R.A. (1988), Reliability and Validity Assessment, Sage, Beverly Hills,
CA.
Caruana, A. (2002), “Service loyalty – the effects of service quality and the mediating role of
customer satisfaction”, European Journal of Marketing, Vol. 36 Nos 7/8, pp. 811-28.
Caruana, A., Money, A.H. and Berthon, P.R. (2000), “Service quality and satisfaction –
the moderating role of value”, European Journal of Marketing, Vol. 34 Nos 11/12,
pp. 1338-52.
Chen, P.Y. and Hitt, L.M. (2002), “Measuring switching costs and the determinants of customer
retention in internet-enabled businesses: a study of the online brokerage industry”,
Information Systems Research, Vol. 13 No. 3, pp. 255-74.
Collier, J.E. and Bienstock, C.C. (2006), “Measuring service quality in e-retailing”, Journal of
Service Research, Vol. 8 No. 3, pp. 260-75.
Cronin, J.J. Jr and Taylor, S.A. (1992), “Measuring service quality: a reexamination and
extension”, Journal of Marketing, Vol. 56 No. 3, pp. 55-68.
Cronin, J.J. Jr, Brady, M.K. and Hult, G.T.M. (2000), “Assessing the effects of quality, value, and
customer satisfaction on consumer behavioral intentions in service environments”, Journal
of Retailing, Vol. 76 No. 2, pp. 193-218.
Crosby, L.A. (1991), “Building and maintaining quality in the service relationship”, in Brown, S.W.,
Gummesson, E., Edvardsson, B. and Gustavsson, B. (Eds), Service Quality – Multidisciplinary
and Multidimensional Perspectives, Lexington Books, Lexington, MA, pp. 269-87.
Curran, J.M. and Meuter, M.L. (2005), “Self-service technology adoption: comparing three
technologies”, Journal of Services Marketing, Vol. 19 No. 2, pp. 103-13.
Dabholkar, P.A. and Bagozzi, R.P. (2002), “An attitudinal model of technology-based self-service:
moderating effects of consumer traits and situational factors”, Journal of the Academy of
Marketing Science, Vol. 30 No. 3, pp. 184-201.
Dabholkar, P.A., Thorpe, D.I. and Rentz, J.O. (1996), “A measure of service quality for retailing
stores: scale development and validation”, Journal of the Academy of Marketing Science,
Vol. 24 No. 1, pp. 3-16.
Dagger, T.S., Sweeney, J.C. and Johnson, L.W. (2007), “A hierarchical model of health service
quality: scale development and investigation of an integrated model”, Journal of Service
Research, Vol. 10 No. 2, pp. 123-42.
Danaher, P.J. and Gallagher, R.W. (1997), “Modeling customer satisfaction in Telecom New Zealand”, European Journal of Marketing, Vol. 31 No. 2, pp. 122-33.
Dean, A.M. (2002), “Service quality in call centers: implications for customer loyalty”, Managing
Service Quality, Vol. 12 No. 6, pp. 414-23.
Dean, A.M. (2004), “Rethinking customer expectations of service quality: are call centers
different?”, Journal of Services Marketing, Vol. 18 No. 1, pp. 60-77.
Doll, W.J. and Torkzadeh, G. (1988), “The measurement of end-user computing satisfaction”, MIS
Quarterly, Vol. 12 No. 2, pp. 259-74.
Fodness, D. and Murray, B. (2007), “Passengers’ expectations of airport service quality”, Journal
of Services Marketing, Vol. 21 No. 7, pp. 492-506.
Fornell, C. and Larcker, D.F. (1981), “Evaluating structural equation models with unobservable
variables and measurement error”, Journal of Marketing Research, Vol. 18 No. 1, pp. 39-50.
Gilbert, G.R., Veloutsou, C., Goode, M.M.H. and Moutinho, L. (2004), “Measuring customer
satisfaction in the fast food industry: a cross-national approach”, Journal of Services
Marketing, Vol. 18 No. 5, pp. 371-83.
Grönroos, C. (1984), “A service quality model and its marketing implications”, European Journal
of Marketing, Vol. 18 No. 4, pp. 36-44.
Grönroos, C. (1990), Service Management and Marketing: Managing the Moments of Truth in
Service Competition, Lexington Books, Lexington, MA.
Hair, J.F. Jr, Black, C.W., Babin, J.B., Anderson, R.E. and Tatham, L.R. (2006), Multivariate Data
Analysis, 6th ed., Prentice-Hall, Upper Saddle River, NJ.
Hoffman, K.D., Kelley, S.W. and Rotalsky, H.M. (1995), “Tracking service failures and employee
recovery efforts”, Journal of Services Marketing, Vol. 9 No. 2, pp. 49-61.
Host, V. and Andersen, M.K. (2004), “Modeling customer satisfaction in mortgage credit
companies”, The International Journal of Bank Marketing, Vol. 22 No. 1, pp. 26-42.
Johnson, W.C. and Sirikit, A. (2002), “Service quality in the Thai telecommunication industry:
a tool for achieving a sustainable competitive advantage”, Management Decision, Vol. 40
No. 7, pp. 693-701.
Joseph, M. and Stone, G. (2003), “An empirical evaluation of US bank customer perceptions of the
impact of technology on service delivery in the banking sector”, International Journal of
Retail & Distribution Management, Vol. 31 No. 4, pp. 190-202.
Joseph, M., McClure, C. and Joseph, B. (1999), “Service quality in the banking sector: the impact of
technology on service delivery”, International Journal of Bank Marketing, Vol. 17 No. 4,
pp. 182-91.
Kim, M.K., Park, M.C. and Jeong, D.H. (2004), “The effects of customer satisfaction and switching
barrier on customer loyalty in Korean mobile telecommunication services”,
Telecommunications Policy, Vol. 28 No. 2, pp. 145-59.
Kim, S.Y. and Lim, Y.J. (2001), “Consumers’ perceived importance of and satisfaction with
internet shopping”, Electronic Markets, Vol. 11 No. 3, pp. 148-54.
Kirmani, A. and Wright, P. (1989), “Money talks: perceived advertising expense and expected
product quality”, Journal of Consumer Research, Vol. 16 No. 3, pp. 344-53.
Lai, F., Hutchinson, J., Li, D. and Bai, C. (2007), “An empirical assessment and application of
SERVQUAL in mainland China’s mobile communications industry”, International Journal
of Quality & Reliability Management, Vol. 24 No. 3, pp. 244-62.
Lehtinen, U. and Lehtinen, J.R. (1991), “Two approaches to service quality dimensions”,
The Service Industries Journal, Vol. 11 No. 3, pp. 287-303.
Levesque, T. and McDougall, G.H.G. (1996), “Determinants of customer satisfaction in retail banking”, International Journal of Bank Marketing, Vol. 14 No. 7, pp. 12-20.
Lewis, R.C. and Booms, B.H. (1983), “The marketing aspects of service quality”, in Berry, L.L.,
Shostack, G.L. and Upah, G.D. (Eds), Emerging Perspectives on Services Marketing,
American Marketing Association, Chicago, IL, pp. 99-104.
Li, Y.N., Tan, K.C. and Xie, M. (2003), “Factor analysis of service quality dimension shifts in the
information age”, Managerial Auditing Journal, Vol. 18 No. 4, pp. 297-302.
Madu, C.N. and Madu, A.A. (2002), “Dimensions of e-quality”, International Journal of Quality &
Reliability Management, Vol. 19 No. 3, pp. 246-58.
Marshall, L. (2006), “Flying high on service automation”, Customer Relationship Management,
February, pp. 42-3.
Matilla, A.S. and Enz, C.A. (2002), “The role of emotions in service encounters”, Journal of Service
Research, Vol. 4 No. 4, pp. 268-77.
Meuter, M.L., Ostrom, A.L., Bitner, M.J. and Roundtree, R. (2003), “The influence of technology
anxiety on consumer use and experiences with self-service technologies”, Journal of
Business Research, Vol. 56 No. 11, pp. 899-906.
Meuter, M.L., Ostrom, A.L., Roundtree, R.I. and Bitner, M.J. (2000), “Self-service technologies:
understanding customer satisfaction with technology-based service encounters”, Journal
of Marketing, Vol. 64 No. 3, pp. 50-64.
Moorthy, S. and Hawkins, S.A. (2005), “Advertising repetition and quality perception”, Journal of
Business Research, Vol. 58 No. 3, pp. 354-60.
Ndubisi, N.O. and Wah, C.K. (2005), “Factorial and discriminant analyses of the underpinnings of
relationship marketing and customer satisfaction”, The International Journal of Bank
Marketing, Vol. 23 No. 7, pp. 542-57.
Nunnally, J.C. and Bernstein, I.H. (1994), Psychometric Theory, McGraw-Hill, New York, NY.
Oliver, R. (1993), “Cognitive, affective and attribute bases of the satisfaction response”, Journal
of Consumer Research, Vol. 20 No. 3, pp. 418-30.
Olorunniwo, F. and Hsu, M.K. (2006), “A typology analysis of service quality, customer
satisfaction and behavioral intentions in mass services”, Managing Service Quality, Vol. 16
No. 2, pp. 106-23.
Parasuraman, A. (2000), “Technology readiness index (TRI) – a multiple-item scale to measure
readiness to embrace new technologies”, Journal of Service Research, Vol. 2 No. 4,
pp. 307-20.
Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1985), “A conceptual model of service quality
and its implications for future research”, Journal of Marketing, Vol. 49 No. 4, pp. 41-50.
Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1988), “SERVQUAL: a multiple-item scale for
measuring consumer perceptions of service quality”, Journal of Retailing, Vol. 64 No. 1,
pp. 12-40.
Parasuraman, A., Zeithaml, V.A. and Malhotra, A. (2005), “E-S-QUAL: a multiple-item scale for
assessing electronic service quality”, Journal of Service Research, Vol. 7 No. 3, pp. 213-33.
Peterson, R.A., Balasubramanian, S. and Bronnenberg, B.J. (1997), “Exploring the implications of
the internet for consumer marketing”, Journal of the Academy of Marketing Science, Vol. 25
No. 4, pp. 329-46.
Pett, M.A., Lackey, N.R. and Sullivan, J.J. (2003), Making Sense of Factor Analysis: The Use of
Factor Analysis for Instrument Development in Health Care Research, Sage Publications,
New Delhi.
Rafaeli, A., Ziklik, L. and Doucet, L. (2008), “The impact of call center employees’ customer
orientation behaviors on service quality”, Journal of Service Research, Vol. 10 No. 3,
pp. 239-55.
Rotfeld, H.J. and Rotzoll, K.B. (1976), “Advertising and product quality: are heavily advertised
products better?”, Journal of Consumer Affairs, Vol. 10 No. 1, pp. 33-47.
Santos, J. (2003), “E-service quality: a model of virtual service quality dimensions”, Managing
Service Quality, Vol. 13 No. 3, pp. 233-46.
Sharma, N. and Ojha, S. (2004), “Measuring service performance in mobile communications”,
The Service Industries Journal, Vol. 24 No. 6, pp. 109-28.
Sharma, S., Niedrich, R.W. and Dobbins, G. (1999), “A framework for monitoring customer
satisfaction: an empirical illustration”, Industrial Marketing Management, Vol. 28 No. 3,
pp. 231-43.
Sureshchandar, G.S., Rajendran, C. and Anantharaman, R.N. (2002), “The relationship between
service quality and customer satisfaction – a factor-specific approach”, Journal of Services
Marketing, Vol. 16 No. 4, pp. 363-79.
Taylor, S. and Claxton, J.D. (1994), “Delays and dynamics of service evaluations”, Journal of the
Academy of Marketing Science, Vol. 22 No. 3, pp. 254-64.
Van Riel, A.C.R., Liljander, V. and Jurriens, P. (2001), “Exploring consumer evaluations of
e-services: a portal site”, International Journal of Service Industry Management, Vol. 12
No. 4, pp. 359-77.
Voss, G.B., Parasuraman, A. and Grewal, D. (1998), “The roles of price, performance, and
expectations in determining satisfaction in service exchanges”, Journal of Marketing,
Vol. 62 No. 4, pp. 46-61.
Wang, Y. and Lo, H.P. (2002), “Service quality, customer satisfaction and behavior intentions –
evidence from China’s telecommunication industry”, Info, Vol. 4 No. 6, pp. 50-60.
Woo, K.S. and Fock, H.K.Y. (1999), “Customer satisfaction in the Hong Kong mobile phone
industry”, The Service Industries Journal, Vol. 19 No. 3, pp. 162-74.
Yang, Z. and Fang, X. (2004), “Online service quality dimensions and their relationships with
satisfaction – a content analysis of customer reviews of securities brokerage services”,
International Journal of Service Industry Management, Vol. 15 No. 3, pp. 302-26.
Yang, Z., Jun, M. and Peterson, R.T. (2004), “Measuring customer-perceived online service
quality – scale development and managerial implications”, International Journal of
Operations & Production Management, Vol. 24 No. 11, pp. 1149-74.
Yoo, D.K. and Park, J.A. (2007), “Perceived service quality – analyzing relationships among
employees, customers, and financial performance”, International Journal of Quality &
Reliability Management, Vol. 24 No. 9, pp. 908-26.

About the authors


Shirshendu Ganguli is an Assistant Professor in the Marketing Department at IBS Hyderabad,
IFHE University. He was a Visiting Research Scholar from August 2008 to June 2009 at Bentley
University, Boston, MA, USA. He has published in both international and national journals of
repute, namely, The Marketing Management Journal, Abhigyan, GITAM Journal of Management,
Icfaian Journal of Management Research, Icfai Journal of Services Marketing and The IUP
Journal of Marketing Management. He has presented research papers at various national and
international conferences. His teaching interests include marketing management, customer
relationship management, services marketing and marketing research. His research interests
include relationship marketing, customer satisfaction and customer loyalty, and service quality.
Sanjit Kumar Roy is an Assistant Professor in the Marketing Department at IBS Hyderabad,
IFHE University. He was a Visiting Research Scholar from August 2007 to June 2008 at Bentley
University, Boston, MA, USA. He has published in both international and national journals of
repute, namely, International Journal of Bank Marketing, The Marketing Management Journal,
Computers in Human Behavior, South Asian Journal of Management, Case Studies in Business,
Industry and Government Statistics, IIMB Management Review, GITAM Journal of
Management, Global Management Review, Icfai Journal of Management Research, and Icfai
Journal of Brand Management. He has presented research papers at various national and
international conferences. His teaching interests include marketing management, customer
relationship management, services marketing, marketing research and brand management. His
research interests include relationship marketing, customer satisfaction and customer loyalty,
stakeholder marketing, and electronic marketing.
