
Simplify your analytics strategy, by Narendra Mulani

While interest in analytics and the resulting benefits is increasing by the day, some businesses are
challenged by the complexity and confusion that analytics can generate. Companies can get stuck trying
to analyze all that's possible and all that they could do through analytics, when they should be taking the
next step of recognizing what's important and what they should be doing for their customers,
stakeholders, and employees. Discovering real business opportunities and achieving desired outcomes can
be elusive. To overcome this, companies should pursue a simpler path to uncovering the insight in their
data and making insight-driven decisions that add value. Following are steps that we have seen work in a
number of companies to simplify their analytics strategy and generate insight that leads to real outcomes:
Accelerate the data: Fast data = fast insight = fast outcomes. Liberate and accelerate data by creating a
data supply chain built on a hybrid technology environment: a data service platform combined with
emerging big data technologies. Such an environment enables businesses to move, manage, and mobilize
the ever-increasing amount of data across the organization for consumption faster than previously
possible. Real-time delivery of analytics speeds up the execution velocity and improves the service
quality of an organization. For example, a U.S. bank adopted such a technology environment to more
efficiently manage increasing data volumes for its customer analytics projects. As a result, the firm
improved processing time by several hours, generating quicker insights and a faster reaction time.
Delegate the work to your analytics technologies. Uncovering data insights doesn't have to be difficult.
Here are ways to delegate the work to your analytics technologies:
Next-Gen Business Intelligence (BI) and data visualization. At its core, next-gen business intelligence is
bringing data and analytics to life to help companies improve and optimize their decision-making and
organizational performance. BI does this by turning an organization's data into an asset by having the
right data, at the right time and place (mobile, laptop, etc.), displayed in the right visual form (heat
map, charts, etc.) for each individual decision-maker, so they can use it to reach their desired outcome.
When the data is presented to decision-makers in such a visually appealing and useful way, they are
enabled to chase and explore data-driven opportunities more confidently.
For example, a financial services company applied BI and data visualization to see the different buckets
of risk across its entire loan portfolio. After analyzing its key data and displaying the results via
visualizations, the firm identified the areas in the U.S. where there were high delinquency rates, explored
tranches based on lenders, loan purposes, and loan channels, and viewed bank loan portfolios. Users were
also able to interact with the results and query the data based on their needs: selecting different date
ranges and FICO scores, comparing lenders and loan types, and so on. Due to the flexibility and data exploration
capabilities of the interactive BI and visualization solution, insight-driven decisions could be made and
actions could be pursued that would benefit the business.
Data discovery. Data discovery can take place alongside outcome-specific data projects. Through the use
of data discovery techniques, companies can test and play with their data to uncover data patterns that
aren't clearly evident. When more insights and patterns are discovered, more opportunities to drive value
for the business can be found. For instance, a resources company was able to predict which pipelines were
most at risk from both physical and atypical threats through data discovery techniques. Due to the insights
gained, the firm was able to prioritize where it should invest funds for counter-failure measures and
maintenance repairs.

Analytics applications. Applications can simplify advanced analytics as they put the power of analytics
easily and elegantly into the hands of the business user to make data-driven business decisions. They can
also be industry-specific, flexible, and tailored to meet the needs of the individual users across
organizations from marketing to finance, and levels from C-suite to middle management. For example,
an advanced analytics app can help a store manager optimize inventory, and a CMO could use an app
to optimize the company's global marketing spend.
Machine learning and cognitive computing. Machine learning is an evolution of analytics that removes
much of the human element from the data modeling process to produce predictions of customer behavior
and enterprise performance. As described in the Intelligent Enterprise trend in the Accenture Technology
Vision 2015 report: "With an influx of big data, and advances in processing power, data science and
cognitive technology, software intelligence is helping machines make even better-informed decisions." As
an example, a retailer combined data from multiple sales channels (mobile, store, online, and more) in
near real-time and used machine learning to improve its ability to make more personalized
recommendations to customers. With this data-driven approach, the company was able to target customers
more effectively and boost its revenues.
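As a hedged sketch of the kind of logic behind such personalized recommendations (not the retailer's actual system), the snippet below builds a minimal item-to-item co-occurrence recommender in Python. All basket data, item names, and channels are invented for illustration.

```python
from collections import defaultdict

# Hypothetical baskets combining purchases across channels (store, online, mobile).
baskets = [
    {"shoes", "socks", "belt"},   # store
    {"shoes", "socks"},           # online
    {"belt", "wallet"},           # mobile
    {"shoes", "belt"},
]

# Count how often each pair of items is bought together.
co_counts = defaultdict(lambda: defaultdict(int))
for basket in baskets:
    for a in basket:
        for b in basket:
            if a != b:
                co_counts[a][b] += 1

def recommend(item, k=2):
    """Return up to k items most often co-purchased with `item`."""
    ranked = sorted(co_counts[item].items(), key=lambda kv: -kv[1])
    return [name for name, _ in ranked[:k]]

print(recommend("shoes"))
```

A production system would replace the raw counts with a trained model and update in near real-time as new transactions arrive, but the structure — pool cross-channel data, learn associations, rank suggestions per customer — is the same.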
Recognize that each path to data insight is unique. The path to insight doesn't come in one single
form. There are many different elements in play, and they are always changing: business goals,
technologies, data types, and data sources are all in a state of flux. Another main component of a
company's analytics journey is the company's culture itself: is it more conservative, or willing to
take chances? Does it have a plethora of existing data and analytics technologies to work with, or is it just
starting out with its first analytics project? No matter what combination of culture and technology exists
for a business, each path to analytics insight should be individually paved with an outcome-driven approach.
To do this, companies can take two approaches depending on the nature of the business problem. First, for
a known problem with a known solution, such as customer segmentation and propensity modeling for
targeted marketing campaigns, the company could take a hypothesis-based approach: start with the
outcome (e.g., cross-sell/up-sell to existing customers), pilot and test the solution with a control group,
and then scale broadly across the customer base. Second, for a known problem area, fraud for example,
but with an unknown solution, the company could take a discovery-based approach and look for patterns in
the data to find interesting correlations that may be predictive. For instance, a bank found that the speed
at which fields were filled out on its online forms was highly correlated with fraudulent behavior. Of note,
when determining which problem to address, a company should first focus on the one that can offer the
highest value; it can then choose a hypothesis-based or discovery-based approach based on the degree of
institutional knowledge it has for solving that kind of problem.
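The bank's discovery-based finding can be sketched as a simple correlation check: with a binary fraud label, the Pearson coefficient reduces to the point-biserial correlation. The fill times and labels below are invented for illustration, not the bank's data.

```python
from math import sqrt

# Hypothetical data: seconds to fill a form, and whether the session was fraudulent.
fill_seconds = [4, 5, 6, 40, 45, 50, 3, 55, 60, 5]
is_fraud     = [1, 1, 1, 0,  0,  0,  1, 0,  0,  1]  # 1 = fraudulent

def pearson(xs, ys):
    """Pearson correlation; with a 0/1 label this is the point-biserial r."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(fill_seconds, is_fraud)
print(round(r, 2))  # strongly negative here: faster fills go with fraud
```

A strong correlation found this way is a candidate predictor, not a confirmed one; the hypothesis-based step (pilot with a control group) would still follow.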
Once insights are uncovered, the next step is for the business, of course, to make the data-driven decisions
that place action behind the data. It is possible to uncover the business opportunities in your data and
increase data equity, simply.

Interview: Prof. Purba Rao, author of the book Business Analytics: An Application Focus
Posted by: Analytics India Magazine September 15, 2013 in Interviews
AIM (Analytics India Magazine): How did you decide to write a book on Business Analytics?
PR (Purba Rao): For the last twenty years or so I have been teaching analytical modeling
in business decision making in different business schools, to different audiences of
MBA students. The students seemed to appreciate the topic and, what was very
rewarding to me, they applied these approaches to their real-life projects and got
meaningful results. They also realized that these approaches could be applied to
virtually any business process, whether relating to a business with a niche product, a
commodity, a monopoly, or a competitive market. After graduating from business
school, some of them would write back to me telling me that they were indeed
applying these techniques in decision-making situations in their respective organizations.
Now in all my teaching, whether the course was called Quantitative
Analysis, Advanced Marketing Research, or Predictive Modeling, I could not find a
textbook that my students could use. So I taught basically from the class notes
which I prepared after lots of research from journal papers, books on multivariate
analysis, the internet, and computer software manuals. These materials were quite
technical and perhaps not so user friendly to the heterogeneous audience that I saw
in business schools. You see, in business school, on one side you come across
engineers, math majors, and hard-core finance majors who follow any mathematical
equation you write on the blackboard. However, on the other side you also come
across humanities majors, literature graduates, human resource professionals, and
even medical practitioners. When I am teaching such a heterogeneous class, I have
to make sense to all of them and at the same time help them appreciate that what I
am trying to teach can very well be applied to their line of business.
So I taught my analytical modeling in a real-life context, with the help of cases,
which I often wrote based on examples I read about and on the projects that the
students worked on and submitted to me.
Also, in place of advanced mathematical modeling I used software: sometimes
Excel, sometimes SPSS, Crystal Ball, AnswerTree, and AMOS. Where software was
concerned, my students took huge interest and demonstrated great understanding.
Sometimes I would just show them the basic steps, and students would come up and
tell me how to use the software better. In general, my heterogeneous audience
learnt my analytics topics, applied them very well, and felt great satisfaction. To
give an example, I had a student in my Quantitative Analysis class who was a male
nurse in a hospital before he joined the MBA class. He was terribly afraid that he
would never pass my course. However, he did pass, and he took our class discussions
and applied them to a decision-making scenario in his hospital setting.

All along, my objective in teaching was to make full sense to the students I
was addressing. To achieve this, I relied mainly on my class notes, along with
a few reference books on basic statistics, multivariate analysis, etc. Then a time
came when I thought: why not put all my class notes together, make them into a binder,
and make that available to the students? Around that time, one day, I was sitting in my
office when the marketing director of a large publishing house walked in and said he
was looking for business school faculty to write books for them. I told him that I had
no time to think of writing a book just then, but he said he would consider my class
notes and model them to form a book. I said I would think about that idea. Later in
the day I talked about this with my son, who had just graduated from Wharton
Business School and was thoroughly familiar with MBA education. He insisted
that I work on the book-writing project with the publishing house and come up with
a book which would bring the wonderful world of Business Analytics to my past,
present, and future students. That was 2008, when I remodeled my class notes to
form a book called, at the time, Predictive Modeling for Strategic Marketing. Five
years have passed since then, and now I have come up with this new book, Business
Analytics: An Application Focus. I consider this new book a far more
comprehensive description of the topics I have been discussing over the last five
years. I am also very happy that I have been able to include many real-life case
studies which my students worked on as part of my Business Analytics
elective, describing how analytical modeling applies so effectively in various
managerial decision-making situations. The effort will be worthwhile if the
various audiences the book addresses see value in reading it and gain a
significant understanding of how complex analytical models can be applied so
lucidly in business and managerial situations.
AIM: How is this book different from other similar books in market today?
PR: There are no books on Business Analytics in the market that can be used as a
textbook in a course on Business Analytics.
AIM: How did you start your career in analytics?
PR: I have always been a math person, with an undergraduate degree in Mathematics honors
(Presidency College, Kolkata), a Masters in Applied Mathematics (Science College, University
of Calcutta), and a doctorate (Fellow in Management from IIMC) in Operations
Research. So when I taught, I always taught quantitative/analytics subjects. Even
when I worked at BHEL and the Railway Board, I was given work pertaining to analytics.
AIM: What do you suggest to new graduates aspiring to get into analytics?
PR: Ideally, graduates should have some kind of quantitative aptitude. However,
many students who have never had an analytical aptitude do get interested in
analytics subjects later in their careers. I think that to get into the analytics space, people
should read user-friendly managerial journals and management books. Attending
conferences and seminars on analytics would also greatly help.

AIM: How do you see Analytics evolving today in the industry as a whole?
What are the most important contemporary trends that you see emerging
in the Analytics space across the globe?
PR: Interest in and awareness of Business Analytics is quite a recent
phenomenon in India. Even three or four years back it had not emerged as a field of
study with high visibility. A few large companies such as GE and American Express
had started widespread application of the modeling procedures falling under Business
Analytics, applying them to the huge databases they had created. All the same,
industry by and large, though waking up to the fact that it had started
creating extensive business databases which could perhaps be harnessed for
optimal decision making, was not really thinking of an entirely new field of managerial
decision making based on and rooted in quantitative modeling, which could be termed
Business Analytics. What indeed was happening in our country was the widespread use
of computers and the internet, leading to what can be called an explosion of
data. Outside of the internet, retailers, telecom companies, healthcare, airlines,
hotels, and even the sports industry were collecting and analyzing massive amounts
of data. Today most of the biggest international companies have centers in India,
and many Indian companies have sprung up to offer analytics to global
clients. Analytics is one of the fastest-growing segments in the KPO (knowledge
process outsourcing) industry in our country. However, properly harnessing the data
has become a challenge for businesses. It is here that Business Analytics can help
companies synthesize data into insights that can be applied for the benefit of the
business. (Rao et al., "Business Analytics: A Perspective," International Journal of
Business Analytics and Intelligence, Vol. 1, Issue 1, Publishing India Group)

COL Business Analytics Solutions

Unlock Huge Business Data to Make
Timely Fact-based Decisions
COL Business Analytics is an innovative analytics tool that puts business users in
control of their data at their fingertips. It brings a whole new level of analysis, insight,
and value to existing data stores, with user interfaces that are clear, simple, and
straightforward. Understanding the unique business needs of different industries,
the tool provides industry-specific business analytics solutions, including Decision
Sales, Decision and Decision Insurance, with pre-built dashboards incorporating industry
know-how to visualize the power of multi-dimensional analysis and draw valuable
insights for strategic business decisions.

Key Features

In-Memory Technology: Innovation is now a reality. The tool overcomes the limitations of
traditional BI tools that deliver static and prepackaged data, enabling business users to
generate dynamic data as they wish.

Business Discovery: User-driven technology empowers users from different fields, including
HR, retail, catering, and insurance, to explore and unlock business data in 360 degrees to
make rapid, flexible, and informed decisions at their fingertips.

Powerful Analytics: Pre-built dashboards with multi-dimensional analysis help decision
makers gain actionable, specific insights into business performance through graphical
views, data drill-down, and associative search.

Web-based Access
Total Mobility
Pre-built Dashboards with Industry Know-how
Graphical Views
Data Drill-down
Associative Search
Any Source of Data
Fast Deployment
Data Extraction to Excel
Graphical Report Printout


A sales analytics tool specially designed for retail and catering, leveraging data to make
informed decisions that drive profitable growth

Enable management, shop-in-charge, and buyer/product managers to analyze up-to-date
business performance within minutes
Historical data and trend insights leverage a 360-degree view with comparative sales
based on year, month, day, season, festive day, weather, etc.
Provide analysis on promotion campaigns and customer profiles to measure the
effectiveness of marketing initiatives and customer relationship management
Provide data drill-down on business data from macro to micro perspectives,
e.g. from business unit to shop outlet, from product category to individual
item, from year to hour, etc.
Pre-built dashboards for easy access to Key Performance Indicators (KPIs)
such as revenue, gross margin, average spending, revenue per square foot,
productivity per staff, and more

A human resources analytics tool for HR professionals and

management to make timely and accurate decisions

Provide a valid, comprehensive and reliable evaluation of current HR

practices within organizations
Visualize, demonstrate, and simulate "what if" scenarios to provide
accurate workforce forecasts
Multi-dimensional analysis on headcount, compensation, performance,
turnover and absence data
Multiple selections of country, company and department levels for analysis
and comparison
Draw useful workforce insights to inform strategic planning and decisions on
talent strategy, organization development and human capital management

An insurance policy and claim analytics tool tailored for sales

performance and risk management among insurance institutions

Pre-built dashboards to provide business analysis on policy sales performance
and claim loss at their fingertips
Enable insurers to keep track of policy sales performance by product type,
broker, and agent
Measure broker performance based on sales amount, number of policies,
premiums, claims paid, and commission
Process provider claims by practice type and claims paid for risk management
Provide analysis on policyholder profiles for relationship management

IT Auditing and Controls: Information Technology Basics

Information Technology Basics
In its most basic form, information technology (IT) can be reduced down to IPO. No, that's not an Initial
Public Offering, but rather Input-Processing-Output. Think of it this way: you're lying in bed, sleeping,
and your ears pick up a distinct ringing (INPUT). Your ears send a message to the brain: there's ringing
going on out here. Your brain processes the signal (PROCESSING) and sends a signal to your arm to
reach out and hit the snooze button (OUTPUT). In its most basic form, IT takes INPUT, runs it through
some program (PROCESSING), and then sends OUTPUT.
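The IPO model can be sketched in a few lines of code. This is a minimal illustration, not a real payroll system; the input values and the 20% tax rate are invented.

```python
def read_input():
    # INPUT: in practice this might come from a form, a file, or a sensor.
    return [1000.0, 2500.0, 4000.0]  # hypothetical gross pay amounts

def process(gross_amounts, tax_rate=0.20):
    # PROCESSING: apply the business rule to each record.
    return [round(g * (1 - tax_rate), 2) for g in gross_amounts]

def write_output(net_amounts):
    # OUTPUT: here a printout; it could equally be a file update or a check run.
    for net in net_amounts:
        print(f"Net pay: {net:.2f}")

write_output(process(read_input()))
```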
INPUT can take on many varied forms, from data fields entered on a web page to an analog circuit
sensing a rise in temperature in a room. PROCESSING can also come in many different forms depending
upon the program being executed. OUTPUT can be as simple as a field in a file being updated, to a
payroll check, to a voluminous report being printed on a high-speed printer.
Computers are generally categorized following several criteria, mainly based on their processing power,
size and architecture. Some common types of computers are: Supercomputers, Mainframes, High-end
and midrange servers, Personal computers (PCs), thin client computers, notebook/laptop computers and
then on to smart phones and personal digital assistants (PDAs). Now here's the oddity: I had the
opportunity of seeing a supercomputer at a major university constructed entirely of Apple Mac laptops.
In my lifetime, the first mainframe I worked on had less memory, less disk space, and less processing
power than the laptop which I currently use to write these articles. It also took up several thousand square
feet of raised floor space and required special air conditioning, humidity control, electrical control, and so
forth. My laptop, on the other hand, fits comfortably in my lap while I sit in the lazy-boy rocker writing
this article. Disk storage too has been dramatically reduced in size. Now I can buy a thumb drive with
more storage capacity than an array of hard drives attached to that old mainframe.
However, the IT audit issues remain the same, so let's get to some of those before I bore you completely
out of your skull.
Some of the key control points in today's IT environment which need to be identified and categorized are
those areas which directly affect confidentiality, integrity, and availability. For example, who has access
to the hardware, logical or physical? Who has access to the data, and are they authorized to make changes
to the data? Who's reviewing those changes to the data to see if each change was authorized? Let me
reintroduce the principle of least privilege here. A person should only have access to the data, systems,
hardware, etc., that they need to be able to do their job, no more. This access should be reviewed
periodically, no less than annually, and by all means when a change of employment occurs. That's
confidentiality; now let's look at integrity. How do you know, when you receive an email, say an offer-of-employment
letter with a substantial salary quoted, that the letter has integrity? That the letter wasn't
tampered with en route to your email inbox? Here you are thinking, "WOW, they must be really
impressed if they're going to offer me that much money," when in fact one of your co-workers intercepted
your email, made some modifications (increased the salary by adding an extra zero at the end), and then
forwarded it on to you. Ensuring that the email has not been tampered with, and ensuring that the calculations
of your net pay are as they should be, are examples of data integrity. Availability means just that: the data
and/or system is available when it is needed. As an IT auditor this means you want to look at disaster
recovery plans, recovery time objectives, recovery point objectives, and so forth, but we'll get into DRP
details in the seventh article.
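The least-privilege review described above can be partly automated: compare each user's granted entitlements against what their role requires, and flag the excess for the periodic review. This is a hedged sketch; the role names, users, and privilege strings are all invented for illustration.

```python
# Hypothetical role definitions: what each role should have, and no more.
ROLE_REQUIRED = {
    "payroll_clerk": {"payroll_read", "payroll_write"},
    "auditor": {"payroll_read", "gl_read"},
}

# Hypothetical access snapshot: user -> (role, privileges actually granted).
GRANTED = {
    "alice": ("payroll_clerk", {"payroll_read", "payroll_write"}),
    "bob":   ("auditor", {"payroll_read", "gl_read", "payroll_write"}),
}

def excess_privileges(granted, role_required):
    """Return {user: privileges beyond the user's role} for audit review."""
    findings = {}
    for user, (role, privs) in granted.items():
        extra = privs - role_required.get(role, set())
        if extra:
            findings[user] = extra
    return findings

print(excess_privileges(GRANTED, ROLE_REQUIRED))
```

Here the check would flag that bob, an auditor, has write access to payroll, exactly the kind of finding the annual review is meant to surface.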

Some of the basics of computer hardware architecture include the IPO as we discussed before. The I and
O are the Input/Output components and include such things as this keyboard I'm typing on, or a mouse,
both of which are I only. That is, they only input, whereas a touchscreen is both input and output. On the
O-only side you have things like printers. Disk drives are examples of both input and output devices. The
P in IPO is the processing component, typically referred to as a CPU or central processing unit, which
consists of three primary pieces: an arithmetic logic unit (ALU), a control unit, and internal memory.
Now there are a lot of other components of a computer, and I'm sure you've heard of several, like the
motherboard (but you'll never hear of a fatherboard), RAM, ROM, DRAM, SDRAM, and the letters go on
and on. Memory comes in all different types and sizes, and what you need to know as an IT auditor is that
even when powered off, some memory still contains sensitive data and needs to be protected.
When we talk about hardware, several things come to mind that I as an IT auditor want to make sure of.
First, is the hardware being maintained? Does the vendor still support the hardware, or has it reached EOL
(end-of-life) and become unsupported, which can be interpreted as: if it breaks it can't be fixed, so Mr. Client,
you're out of luck because the system can't be recovered. Next, just like software, the hardware has
something called the BIOS (Basic Input Output System), which is updated by the vendor, typically to
support new functions and/or features. Your questions are: Is the BIOS current, and is the BIOS
protected? Why ask if it is protected? Because if hackers can get to the BIOS, they can alter the
boot sequence, boot from a Linux CD, and copy all your sensitive information off to a gigabyte thumb
drive in less than ten minutes. Reboot your machine, and you'll have no evidence that the data has been
copied, except an entry in the log that says the machine was rebooted, and that might not even be there if
the hackers cover their tracks.
But enough about hardware; let's look at software, programming, and processing. This too must be
protected from malicious mischief. As an example, let me use the following well-known case of a
disgruntled application programmer. This individual, who will remain unnamed, had not gotten a pay
raise for the last two years, although in this economy that's not uncommon. Since this individual was
responsible for the payroll system, they added some code to the check-generating program to
do a simple thing. Each time a net amount ended in $.13 (thirteen cents), a second check made out to
this individual's alter ego would be added to the check run, then direct-deposited to a separate account at
an out-of-state bank. Since this didn't happen every payroll, it was almost a year before the company
discovered the error and still another year before they could identify the guilty party. So as an IT auditor, where
is your concern? Unauthorized software changes, lack of change management, lack of reconciliation
(wait a minute, that's the financial auditor's job), and separation of duties (the individual had to add his own
alter-ego employee record to the master file and add the out-of-state bank direct deposit). So you can see
some of the concerns when it comes to software, programming, and processing.
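A reconciliation check for the scheme above is straightforward to express: scan the check run for net amounts ending in thirteen cents. The snippet below is an illustrative sketch, not a real payroll audit tool; the check records and account labels are invented.

```python
# Hypothetical check run: employee ID, net pay, and deposit account.
checks = [
    {"employee": "E100", "net": 1523.13, "account": "OUT-OF-STATE-77"},
    {"employee": "E101", "net": 1523.13, "account": "OUT-OF-STATE-77"},
    {"employee": "E102", "net": 2100.00, "account": "LOCAL-01"},
]

def suspicious_checks(check_run):
    """Flag checks whose net amount ends in $.13."""
    flagged = []
    for c in check_run:
        # Compare in integer cents to avoid floating-point surprises.
        if round(c["net"] * 100) % 100 == 13:
            flagged.append(c["employee"])
    return flagged

print(suspicious_checks(checks))
```

A fuller audit script would also cross-reference deposit accounts shared by multiple employee records, which would have caught the separation-of-duties failure directly.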
Distributed systems and client/server technology offer the IT auditor some distinct challenges when it
comes to CIA (Confidentiality, Integrity, and Availability). Take for example a point-of-sale system in a
retail store, and let's suppose that the POS system allows for data entry even when connectivity to the
host server at the home office is interrupted. How do you as an IT auditor ensure confidentiality of the data
(the customer paid with a credit card) when the data is transmitted once the home office system comes back
online? How do you ensure integrity of the data being transmitted? And last but not least, what do you need
to check to make sure the local store system is available if the home office system goes down? These are
just a few of the considerations you as an IT auditor will need to be familiar with in a distributed
environment and in a client/server architecture. Just as an example of how easy it is to social-engineer a
distributed system, consider this. In the shoe department of a retailer, the store clerk is signed on
to the cash register; the customer asks the clerk to get a specific size and style from the back room,
states they are in a hurry, and flashes a wad of cash, saying, "There's an extra $20 if you can hurry." The
clerk literally runs to the back room, but forgets to sign off. When they come back, not only is the
customer gone, so is the cash from the cash drawer. The IT audit concern? The obvious one here is
security awareness training.
You'll hear a lot about man-in-the-middle (MITM) attacks as they relate to network connectivity, and they
represent a real security issue. But that's just one aspect of network connectivity. Another, even
more common, basic security issue is remote access. With the price of gasoline approaching $4.50/gallon,
a lot of employees are putting pressure on their bosses to let them work from home, to access the system
remotely, and to let them take their company laptop home with them so they can work more efficiently.
OK, so put on your IT auditor hat and let's look at some issues:

Was remote access approved?

Is remote access via an encrypted communication link?
Is the laptop protected (key lock, hard disk encryption)?
Is there a screen saver and is it password protected?
Is two-factor authentication required to access the network?
Is network access restricted from known locations?
If modems are used, is call-back employed?

When we talk about IT system maintenance, patch management, and security, we speak of change
management and configuration control. Are the systems being maintained, and are patches being applied
(after they have been tested, of course) in a timely manner? What is the security that surrounds system
maintenance? Can anyone, for example, update the anti-virus definition file? Who checks to see if this is
done, assuming that the end-user is the one responsible for updating the .dat file? And the most basic and
fundamental question of all: do all, and I repeat ALL, changes go through change management?
Remember, your IT audit strategy is to look at what the company says they are going to do; then
perform an audit to see if they are in fact doing what they said they were going to do, including
asking the company or auditee to prove that they did the action; and then to report any discrepancies to
management. However, you should also keep in mind that it might be in the company's best interest if
you include in your report anything that is not being done that could potentially affect the
confidentiality, integrity, and availability of the company's systems and data. Just because a company
says, "We're not going to review login attempts because it's a waste of time," doesn't mean you shouldn't
point out the risk of not doing this activity. After all, isn't that what IT auditing is all about: making
management aware of the risks they are taking by doing or not doing a particular action?

Risk Indicators for Computer Systems Assisted Financial Examination

Kuang-Hsun Shih | Pages 97-106 | Received 17 Oct 2009, Accepted 14 Jan 2010, Published online: 11 Dec
Financial examination is the most important link in financial supervision. In response to increased daily
financial transactions, it has become an inexorable trend to apply the processing, storage, statistical, and
analytical functions of computer information systems and auditing software to assist financial examiners.
Based on current computer-assisted financial examination conditions, Basel Accord II, and relevant
auditing standards, this study reviews the literature and classifies the risk factors of computer-assisted
implementation of financial examination tasks into various basic perspectives: internal, credit risk,
operating risk, examination operations, computer-assisted, information system environment,
management, and supervision perspectives. Then, by adopting the method of Ou Yang et al. [14], which
integrates the DEMATEL (Decision Making Trial and Evaluation Laboratory) and ANP (Analytic Network
Process) methods, this study discusses the cause-effect relationships and the relative importance among
computer-assisted financial examination risk factors.
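The core DEMATEL computation mentioned here can be illustrated briefly: normalize a direct-relation matrix A of pairwise influence scores, then compute the total-relation matrix T = X(I - X)^-1, whose row and column sums separate cause factors from effect factors. The 3x3 influence scores below are invented for illustration and do not come from the study.

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_inv(M):
    """Invert a small square matrix by Gauss-Jordan elimination."""
    n = len(M)
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(M)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[pivot] = aug[pivot], aug[col]
        p = aug[col][col]
        aug[col] = [v / p for v in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [v - f * w for v, w in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

# Hypothetical direct-relation matrix: influence of factor i on factor j (0-4 scale).
A = [[0, 3, 2],
     [1, 0, 3],
     [2, 1, 0]]

s = max(sum(row) for row in A)                      # normalize by largest row sum
X = [[v / s for v in row] for row in A]
I = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
IminusX = [[I[i][j] - X[i][j] for j in range(3)] for i in range(3)]
T = mat_mul(X, mat_inv(IminusX))                    # total-relation matrix

# D - R (row sums minus column sums) > 0 marks a cause factor, < 0 an effect factor.
D = [sum(T[i]) for i in range(3)]
R = [sum(T[i][j] for i in range(3)) for j in range(3)]
print([round(D[i] - R[i], 3) for i in range(3)])
```

The ANP step would then weight these factors against each other; the study combines both to rank the examination risk indicators.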
The findings of this study suggest that, among all the computer-assisted financial examination risk
factors, six indicators, including the auditee's operational flow structure, internal auditing performance,
selection of risk assessment model, client default risk management, possible errors and omissions in
checking transaction tracks by the financial examiner, and lack of a computer-assisted auditing system,
are the most important factors, accounting for 50.89% of the significance of all risk factor perspectives.
When a financial supervisory unit uses a computer system to conduct financial examinations, experienced
financial examiners conduct preliminary checks on the auditee's regulation compliance, organizational
management structure, employee risk perception education, and VaR (Value at Risk) method of
estimation; and then, the computer-assisted audit system is employed for effective auditing to enhance the
quality of financial examination. In addition, strengthening hardware and software equipment, professional training of financial examiners, internal management, and supervision procedures would effectively reduce examination risks and considerably improve the quality of computer-assisted financial examinations.

A Review of Expert Systems in Information System Audit

A. B. Devale# and Dr. R. V. Kulkarni*
# Arts, Commerce & Science College, Palus, Maharashtra, India
* Shahu Institute of Business Education & Research (SIBER), Kolhapur, Maharashtra, India
The information system auditor's main objective is to formulate an opinion about the effectiveness and the contribution of information systems to enterprise objectives. His or her judgment can be influenced by factors such as knowledge of the organization's information systems and the degree of risk of misstatement through errors. More generally, the purpose of an information technology (IT) audit is to evaluate IT controls. The IT auditor assesses and advises on aspects of information technology such as effectiveness, efficiency, and exclusiveness, among others. Although there is no common understanding regarding the appropriate evaluation theory, three main concepts structure the audit process: information systems processes and domains, audit criteria, and the audit framework. The IT auditor should have IT, financial, and operational audit experience. The ideal IT auditor should be able to discuss IP routing with the network folks in one hour and financial statement disclosures with the controller in the next.
And, as with all audit positions, communication and other soft skills are crucial as well. The growing complexity and vulnerability of computer networks require that all auditors have some degree of technical expertise.
The use of Computer Aided Audit Techniques in an IS Audit
The Information Systems Audit Standards require that, during the course of an audit, the IS auditor obtain
sufficient, reliable and relevant evidence to achieve the audit objectives. The audit findings and
conclusions are to be supported by the appropriate analysis and interpretation of this evidence. Computer
Assisted Audit Techniques are important tools for the IS auditor in performing audits. They include many
types of tools and techniques, such as generalized audit software, utility software, test data, application
software tracing and mapping, and audit expert systems. Computer Aided Audit Techniques may be used in performing various audit procedures, including:

Tests of details of transactions and balances (substantive tests)

Analytical review procedures

Compliance tests of IS general controls
Compliance tests of IS application controls

Computer Aided Audit Techniques may produce a large proportion of the audit evidence developed on IS
audits and, as a result, the IS auditor should carefully plan for and exhibit due professional care in the use
of Computer Aided Audit Techniques. The main difficulty in modelling and formalizing knowledge in the audit field is the complexity of information system auditing, which requires expertise drawn from separate or interrelated fields of knowledge. The purpose of this paper is to
examine the current state of expert systems and decision support systems in auditing. In so doing we will
examine completed or prototype expert systems and decision support systems in both external and
internal auditing, including special areas of focus such as EDP auditing and governmental auditing. This
paper focuses on those auditing-based systems that have appeared in the literature, have been presented at a conference, or of which the authors are aware.
Denning [1987]
He discussed a system designed to protect the operating system. That system is based on the
hypothesis that exploitation of systems involves abnormal use of the system. Thus, by detecting abnormal
use of the system, security violations can be detected.
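Denning's hypothesis, that misuse shows up as statistically abnormal use of the system, can be illustrated with a toy profile-based check. The login counts and the three-standard-deviation threshold below are illustrative assumptions, not details of Denning's model:

```python
import statistics

# Hypothetical per-day failed-login counts forming a user's normal profile
history = [2, 3, 1, 2, 4, 3, 2, 3]
today = 15  # today's observed count

mean = statistics.mean(history)
stdev = statistics.stdev(history)

# Flag the day as anomalous if it deviates far from the established profile
is_anomalous = abs(today - mean) > 3 * stdev
print(is_anomalous)  # True: 15 is far outside the historical profile
```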
Kelly [1984, 1987]
He developed a prototype model ICE (Internal Control Evaluation) to aid in the audit planning process.
ICE featured a knowledge hierarchy of three different levels. The first level included knowledge about the
industry, economy, management and the audit history. The second level focused on the client
environment, the organization, planning manuals and accounting procedures. The third level focused on
internal control functions in the purchasing process. ICE was developed using LISP. Unlike most expert
systems, ICE made use of both frames and rules.
Selfridge and Biggs [1988]
Through their research, it was reported that there were six categories of knowledge, including events, inter-event causality, company function (financial model and operations model), events/financial performance causality, measures of financial performance, and going-concern problems. In that model there were 140 event frames and 215 entity frames.
Dillard and Mutchler [1986]
They have done extensive work in the area of modelling the auditor's going concern opinion decision.
Their system was developed on a DEC 2060 using a menu shell, XINFO. The system apparently employs
approximately 450 decision frames or nodes in a decision tree. The intelligence in the system is in the
decision structure and hierarchy.
Bailey et al. [1985]
They proposed TICOM, the first auditing-based system to implement artificial intelligence techniques. TICOM (The Internal Control Model) is an analytic tool that aids the auditor in
modelling the internal control system and querying the model in order to aid the auditor in evaluating the
internal control system. TICOM was implemented in Pascal.
Arthur Young (1985)
Arthur Young took a single-product, multiple-component, middle-out strategy in the development of its decision support system, AY/ASQ. AY/ASQ is software designed to automate the audit process for manufacturing environments. AY/ASQ was developed in an Apple Macintosh environment. The operation of each application is similar to the others. The system consists of several modules, including Decision Support, Office, Trial Balance, Time Control and Data
His research addresses the problem of determining "Which organizations should be audited to achieve the maximum collection of monies due to the state of Pennsylvania?" The general research goal of the paper
is to determine how a computer program can be programmed to learn. In order to accomplish that goal
they chose a genetic learning approach. Genetic algorithms learn by employing different combining rules
on responses, such as inversion and mutation. For example, the system may combine the two sets of
characteristics abc and cde to form abe, in its search for a better set of characteristics.
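The combining rules described above can be sketched as simple string operations. The crossover point is fixed at two characters so that the text's abc/cde example reproduces, and the alphabet of audit characteristics is a hypothetical stand-in:

```python
import random

def crossover(parent_a, parent_b, point=2):
    """Single-point crossover: splice the head of one rule onto the tail of another."""
    return parent_a[:point] + parent_b[point:]

def mutate(rule, alphabet="abcdefgh", rng=random):
    """Mutation: replace one randomly chosen characteristic."""
    i = rng.randrange(len(rule))
    return rule[:i] + rng.choice(alphabet) + rule[i + 1:]

child = crossover("abc", "cde")
print(child)  # abe, matching the text's example
```

A full genetic learner would repeat these operators over a population, keeping the characteristic sets that score best against the collection objective.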
Vasarhelyi [1988]
He argued that recent advances in hardware and software technology are engendering increasingly
complex information systems environments, thus, requiring increasingly complex audit approaches. He
suggests that audit management utilize decision support systems, management information systems and
management science models to identify and project the deterioration of controls that can occur between
audit engagements.
Denna, E. L., Hansen, J. V., and Meservy, R. D. [1991]
The authors evaluate research and development in the design of expert systems for the audit domain,
providing an overview of the domain of expert judgment involved in the audit process. A framework used
to present and analyse work to date and to guide future efforts is constructed. Methods of knowledge
acquisition being used to develop audit applications are examined. The authors address knowledge
representation for the audit domain.
Zong-pu Jia, Zhi-lin Yao, [2005]
They carried out network security research, an important aspect of CSCW (computer-supported cooperative work) that helps make the work environment secure and reliable. Access control security technology mainly includes firewall technology, intrusion detection technology, and security auditing technology. Although these technologies have matured to some degree, they still have problems and shortcomings. An expert system for preventing and auditing intrusion is a set of software and hardware systems for reducing the risk to computer network security.
K. Kozhakhmet, G. Bortsova, A. Inoue, L. Atymtayeva [1996]
They commented that information security auditing plays a key role in providing any organization a good security level. Because of its high cost and its time and human-resource intensiveness, optimizing audit expenses becomes a pressing issue. One solution could be the development of software that reduces the cost of, speeds up, and facilitates the information security audit process. They suggest that fuzzy expert system techniques can offer significant benefits when applied to this area.
Davis et al. (1997)
They presented the construction of a prototype that integrated an ES and an ANN. The rules contained in the ES modelled basic CRA heuristics, thus allowing for efficient use of well-known control variable relationships. The ANN provided a way to recognize patterns in the large number of control variable inter-relationships that even experienced auditors could not express as a logical set of specific rules. The ANN was trained using actual case decisions of practicing auditors. The input variables were judgment cues/variables from the general environment, computer processing, and general computer and accounting controls. The ANN model provided the auditor with information on how close a risk-category border was.
Ramamoorti (1999)
He used both quantitative (26 variables) and qualitative (19 variables) risk factors as input variables in the
models. The risk was defined in an internal auditing context. The models were in the context of a public
state university. The quantitative data were downloaded from the University of Illinois Financial and
Administration System. The qualitative risk factor values were elicited from audit staff using a predefined scale from 0 to 9. The eventual number of variables selected to construct the models was in the 7 to 18 range. The research project included a Delphi study and a comparison with statistical approaches,
and presented preliminary results, which indicated that internal auditors could benefit from using ANN
technology for assessing risk.
Moore (1995)
He suggested that developing an ANN to serve as either a handwritten-character or speech-recognition device, and integrating it with existing software (for example, a word processor or spreadsheet), might be useful for authority checking. An auditor may analyze minutes and other documents of the entity with an ANN.
Curry and Peel (1998)
They provided an overview of the ANN modelling approach and the performance of ANNs, relative to
conventional ordinary least squares (OLS) regression analysis, in predicting the cross-sectional variation
in corporate audit fees (AF). The data was derived from a sample of 128 unquoted UK companies
operating in the electronic industrial sector.
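The conventional OLS baseline that the ANNs were compared against can be sketched in a few lines. The company-size predictor and fee figures below are invented for illustration and are not from the 128-company UK sample:

```python
import numpy as np

# Hypothetical predictor (log of total assets) and audit fees in thousands
log_assets = np.array([4.0, 5.0, 6.0, 7.0, 8.0])
audit_fee  = np.array([10.0, 14.0, 19.0, 23.0, 28.0])

# Ordinary least squares fit: fee = slope * log_assets + intercept
slope, intercept = np.polyfit(log_assets, audit_fee, 1)

predicted = slope * log_assets + intercept
residuals = audit_fee - predicted  # what the linear model fails to explain
```

An ANN's advantage over this baseline would show up as systematically smaller residuals when the fee relationship is non-linear.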

Moeller (1988)
He relates the audit of expert systems to existing audit literature and discusses the relationship of Statements on Auditing Standards (SAS) as part of his discussion; he notes that "while there is a growing body of other literature covering the auditor's use of expert systems, there is very little published material on audit techniques for reviewing expert systems." Moeller's paper summarizes many techniques and points to a body of work on those techniques that has developed recently: the literature on verification and validation.
Kick (1989)
He discusses some of the risk exposures associated with expert systems resulting from a loss of strategic
or competitive position, an inability to sustain growth, and a loss of strategic knowledge. The primary

emphasis in his report is on ensuring that the auditor examine expert system applications to determine
whether they are properly applied; are deployed to gain strategic advantage; are cost-effective; are well
designed and operationally efficient; minimize exposure to fraud, poor decision making, and other
consequences; are used by individuals who are properly trained; are easy to maintain; and are continually
Watne and Turney (1990)
They briefly analyzed expert systems as a target of an audit. They suggested that systems that directly
impact the balances in the financial statements or that provide information to the auditor individually be
the potential targets of audits. They also analyze some of the controls in expert systems using a structure
based on general and application controls.
McKee (1991)
His research suggested that the AICPA Statements on Standards for Attestation Engagements (SSAE),
Attestation Standards might play a critical role in the audit of expert systems, and although it applies to
independent CPAs, it may also provide useful guidance to internal auditors. Attestation Standards
indicates that an audit should be done by someone who has adequate technical knowledge and
proficiency, in this case in expert systems verification and validation, and in the specific domain.

IS Audit Basics: What Every IT Auditor Should Know About Computer-generated Reports
Tommie Singleton, CISA, CGEIT, CPA
So many times, auditors of all types use a computer-generated report to perform some aspect of
assurance. For example, financial auditors may pull a computer-generated list of accounts receivable (i.e.,
subsidiary listing) and use it to confirm receivables. IT auditors sometimes do the same thing with lists of
access, logs or other reports relevant to IT audits. A popular use today is to generate data sets (a similar
resulting object) to conduct data mining or data analytics.
It is tempting to look at a neat report that came from a computer and to have a leap of faith as to the
veracity and reliability of the information of that report. Standard setters have realized the fallacy of that
thinking and have issued guidance to auditors regarding computer-generated reports. The Public
Company Accounting Oversight Board (PCAOB) inspection reports show that one major area of
deficiency in financial audits of issuers is not gaining assurance regarding the accuracy and completeness of the report's information.
The Goal
The goal has been stated by standard setters. It is the completeness and accuracy of the information in the
report upon which the auditor is relying. Accuracy alone is insufficient. One needs to obtain assurance
about both the completeness and the accuracy.
The US Government Accountability Office (GAO) uses data reliability to refer to the accuracy and completeness of data. It rates data as "sufficiently reliable," "not sufficiently reliable," or "of undetermined reliability." A determination of data reliability should lead to the

assessment of assurance on accuracy and completeness of a computer-generated report from these data,
although it may be necessary to couple that with another test for report settings. It is possible for data to
be reliable for one particular purpose but not reliable for another because of differences in data fields.
Consider the example of a financial audit. The financial auditor might use a key report from the
information system (i.e., computer) as the key information or an important audit procedure. In this case,
the reliance upon the information is critical to the conclusions about the assertion of the account balance,
class of transactions or disclosure being tested.
Consider an IT audit. The same result is true. If the IT auditor is using a computer-generated list of credit
card charges (or similar financial data), or a list of users and accesses, the conclusion after testing is
highly dependent on the accuracy and completeness of the information being used.
Therefore, the IT auditor will want to first look at the computer-generated report and figure out why it is appropriate, with specificity, to rely on the report and why the completeness and accuracy of the report can be relied upon.
There are generally two ways to gain assurance for completeness and accuracy. One is to compare the report to information or data external to the system, and the other is to compare the report to the internal database.
The best way to get assurance from a computer-generated report is to compare it for completeness and
accuracy against data/information independent of the computer system. For instance, if the entity
produces something and has a standard rate or formula for billing, there is operational data to support the
amount reflected in those billings. That information could be used for completeness and accuracy of a
listing of billings by making some simple calculations. It is possible external information exists in other
repositories as well. Other similar tests would include tests such as the following:

Trace a suitable sample of transactions in the system to the source documents for accuracy.

Trace a suitable sample of source documents to the data for completeness.

Look for existing internal tests, reconciliations or reviews of data and/or reports that could
support accuracy and/or completeness of the data and/or report:

Reconciliations are valid only if variances are researched, explained and resolved in a
timely manner.

A review of reconciliations can substantiate accuracy of the reconciliation.
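The two tracing directions above (report-to-source for accuracy, source-to-report for completeness) can be sketched as simple set comparisons. The invoice IDs and amounts are hypothetical:

```python
# Hypothetical source documents and the computer-generated report, keyed by invoice ID
source_docs = {"INV-001": 100.00, "INV-002": 250.00, "INV-003": 75.00}
report      = {"INV-001": 100.00, "INV-002": 250.00}

# Accuracy: each report line must match its source document
accuracy_exceptions = [inv for inv, amt in report.items()
                       if source_docs.get(inv) != amt]

# Completeness: each source document must appear in the report
completeness_exceptions = [inv for inv in source_docs
                           if inv not in report]

print(accuracy_exceptions)      # []           - report lines all agree with sources
print(completeness_exceptions)  # ['INV-003']  - a billing is missing from the report
```

Note that a clean accuracy pass says nothing about completeness: the missing invoice only surfaces in the second, source-to-report direction.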

When external information is not readily available, the comparison would need to be the report against
the database in the system. The following are examples of how that can be accomplished:

Take a suitable sample of transactions from the report and trace them to the internal transactions
for accuracy. (Completeness would need to be a different test.)

Test application control(s) over the transactions for completeness and/or accuracy depending on
the nature of the control(s).

Test internal controls where the purpose is to ensure data reliability on a target data file.

Examine the report settings, especially queries and custom reports, for correctness as to
completeness. Accuracy may need to be assessed via a different test:

Test for the data file being used to make sure it is the appropriate one.

Test filters being used for types of transactions, dates, etc.

Test the change management process for the report, if applicable.

Use data analytics to determine the reliability of the underlying data:


Test key fields to identify issues with the fields that would materially affect accuracy
and/or completeness. For example, verify that all sales records contained valid types of

Establish some criteria for expected results and compare to actual results for accuracy.

Consider separate verification of proper settings for the report itself.
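The key-field test in the list above can be sketched as a rule-based scan of the underlying data. The record layout, the valid-type list, and the amount rule are assumptions for illustration:

```python
# Hypothetical sales records underlying a computer-generated report
sales = [
    {"id": 1, "type": "retail",    "amount": 120.50},
    {"id": 2, "type": "wholesale", "amount": 980.00},
    {"id": 3, "type": "retale",    "amount": 45.25},   # misspelled type
    {"id": 4, "type": "online",    "amount": -10.00},  # impossible amount
]

VALID_TYPES = {"retail", "wholesale", "online"}

# Flag records whose key fields would distort accuracy or completeness
exceptions = [r["id"] for r in sales
              if r["type"] not in VALID_TYPES or r["amount"] <= 0]

print(exceptions)  # [3, 4]
```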

Sometimes a test performed might provide assurance for completeness but not accuracy, and vice versa.
For instance, in confirming receivables, the auditor may not have assurance of accuracy and completeness
over the list of subsidiary accounts and decide to confirm a high percentage of accounts (for example, 80
percent) as a compensating test. However, that test only confirms accuracy and not completeness.
Also, a paperless transaction or system will not have source documents from which to test data or the report. In the latter case, internal controls are critical to obtaining assurance about accuracy and completeness.
Finally, sometimes one cannot attain sufficient assurance about the accuracy and completeness of the data
and report, as indicated by the ratings the GAO uses for data reliability. When that happens, what do
auditors do with the report?
They select an alternative approach. For instance, there are two ways to confirm receivables. The first,
confirmation letters, uses a list of subsidiary accounts. If accuracy and completeness of that list cannot be
attained, the alternative confirmation is subsequent payments.
With the combined growth in computer-generated reports, and the growing attention by reviewers and
standard setters on the accuracy and completeness of reports used in audits, there is a need to understand
the situation and to develop a framework for obtaining that assurance. Obviously, the key is to, first, use a
valid source for testing and, second, obtain assurance for both.

Tommie Singleton, CISA, CGEIT, CPA, is the director of consulting for Carr Riggs & Ingram, a large
regional public accounting firm. His duties involve forensic accounting, business valuation, IT assurance
and service organization control engagements. Singleton is responsible for recruiting, training, research,
support and quality control for those services and the staff that perform them. He is also a former
academic, having taught at several universities from 1991 to 2012. Singleton has published numerous
articles, coauthored books and made many presentations on IT auditing and fraud.


Computerized information systems, particularly since the arrival of the Web and mobile computing, have
had a profound effect on organizations, economies, and societies, as well as on individuals whose lives
and activities are conducted in these social aggregates.
Organizational impacts of information systems
Essential organizational capabilities are enabled or enhanced by information systems. These systems
provide support for business operations; for individual and group decision making; for innovation through
new product and process development; for relationships with customers, suppliers, and partners; for
pursuit of competitive advantage; and, in some cases, for the business model itself (e.g., Google).
Information systems bring new options to the way companies interact and compete, the way organizations
are structured, and the way workplaces are designed. In general, use of Web-based information systems
can significantly lower the costs of communication among workers and firms and cost-effectively
enhance the coordination of supply chains or webs. This has led many organizations to concentrate on
their core competencies and to outsource other parts of their value chain to specialized companies. The
capability to communicate information efficiently within a firm has led to the deployment of flatter
organizational structures with fewer hierarchical layers.
Nevertheless, information systems do not uniformly lead to higher profits. Success depends both on the
skill with which information systems are deployed and on their use being combined with other resources
of the firm, such as relationships with business partners or superior knowledge in the industrial segment.
The use of information systems has enabled new organizational structures. In particular, so-called virtual
organizations have emerged that do not rely on physical offices and standard organizational charts. Two
notable forms of virtual organizations are the network organization and the cluster organization.
In a network organization, long-term corporate partners supply goods and services through a central hub
firm. Together, a network of relatively small companies can present the appearance of a large corporation.
Indeed, at the core of such an organization may be nothing more than a single entrepreneur supported by
only a few employees. Thus, network organization forms a flexible ecosystem of companies, whose
formation and work is organized around Web-based information systems.
In a cluster organization, the principal work units are permanent and temporary teams of individuals with
complementary skills. Team members, who are often widely dispersed around the globe, are greatly
assisted in their work by the use of Web resources, corporate intranets, and collaboration systems. Global
virtual teams are able to work around the clock, moving knowledge work electronically to "follow the sun." Information systems delivered over mobile platforms have enabled employees to work not just outside the corporate offices but virtually anywhere. "Work is the thing you do, not the place you go to" became the slogan of the emerging new workplace. Virtual workplaces include home offices, regional work centres, customers' premises, and mobile offices of people such as insurance adjusters. Employees who work in virtual workplaces outside their company's premises are known as teleworkers.
The role of consumers has changed, empowered by the Web. Instead of being just passive recipients of
products, they can actively participate with the producers in the cocreation of value. By coordinating their
collective work using information systems, individuals created such products as open-source software and
online encyclopedias. The value of virtual worlds and massively multiplayer online games has been
created largely by the participants. The electronic word-of-mouth in the form of reviews and opinions
expressed on the Web can make or break products. In sponsored cocreation, companies attract their
customers to generate and evaluate ideas, codevelop new products, and promote the existing goods and
services. Virtual customer communities are created online for these purposes.
Information systems in the economy and society
Along with the global transportation infrastructure, network-based information systems have been a factor
in the growth of international business and corporations. A relationship between the deployment of
information systems and higher productivity has been shown in a number of industries when these
systems complement other corporate resources. Electronic commerce has moved many relationships and
transactions among companies and individuals to the Internet and the Web, with the resulting expansion
of possibilities and efficiencies. The development of the Internet-based ecosystem, accompanied by the low cost of hardware and telecommunications, the availability of open-source software, and mass global access to mobile phones, has led to a flowering of entrepreneurial activity and the emergence to prominence and significant market value of numerous firms based on new business models. Among the examples are electronic auction firms, search engine firms, electronic malls, social network platforms, and online
game companies. Because of the vast opportunities for moving work with data, information, and
knowledge in electronic form to the most cost-effective venue, a global redistribution of work has been
taking place.
As the use of information systems became pervasive in advanced economies and societies at large, several
societal and ethical issues moved into the forefront. The most important are issues of individual privacy,
property rights, universal access and free speech, information accuracy, and quality of life.
Individual privacy hinges on the right to control one's personal information. While invasion of privacy is
generally perceived as an undesirable loss of autonomy, government and business organizations do need
to collect data in order to facilitate administration and exploit sales and marketing opportunities.
Electronic commerce presents a particular challenge to privacy, as personal information is routinely
collected and potentially disseminated in a largely unregulated manner. The ownership of and control over
the personal profiles, contacts, and communications in social networks are one example of a privacy issue
that awaits resolution through a combination of market forces, industry self-regulation, and possibly
government regulation. Preventing invasions of privacy is complicated by the lack of an international
legal standard.
Intellectual property, such as software, books, music, and movies, is protected, albeit imperfectly, by
patents, trade secrets, and copyrights. However, such intangible goods can be easily copied and
transmitted electronically over the Web for unlawful reproduction and use. Combinations of legal statutes
and technological safeguards, including antipiracy encryption and electronic watermarks, are in place, but
much of the abuse prevention relies on the ethics of the user. The means of protection themselves, such as
patents, play a great role in the information society. However, the protection of business methods

(e.g., Amazon's patenting of one-click ordering) is being questioned, and the global enforcement of
intellectual property protection encounters various challenges.
Access to information systems over the Web is necessary for full participation in modern society. In
particular, it is desirable to avoid the emergence of digital divides between nations or regions and between
social and ethnic groups. Open access to the Web as a medium for human communication and as a
repository for shared knowledge is treasured. Indeed, many people consider free speech a universal
human right and the Internet and Web the most widely accessible means to exercise that right. Yet,
legitimate concerns arise about protecting children without resorting to censorship. Technological
solutions, such as software that filters out pornography and inappropriate communications, are only partially effective.
Of concern to everyone is the accuracy and security of information contained in databases and data
warehouses, whether in health and insurance data, credit bureau records, or government files, as
misinformation or privileged information released inappropriately can adversely affect personal safety,
livelihood, and everyday life. Individuals must cooperate in reviewing and correcting their files, and
organizations must ensure appropriate security, access to, and use of such files.
Information systems have affected the quality of personal and working lives. In the workplace,
information systems can be deployed to eliminate tedious tasks and give workers greater autonomy, or
they can be used to thoughtlessly eliminate jobs and subject the remaining workforce to pervasive
electronic surveillance. Consumers can use the Web for shopping, networking, and entertainment, but at
the risk of contending with spam (unsolicited e-mail), interception of credit card numbers, and attack
by computer viruses.
Information systems can expand participation of ordinary citizens in government through electronic
elections, referendums, and polls and also can provide electronic access to government services and
information, permitting, for instance, electronic filing of taxes, direct deposit of government checks, and
viewing of current and historical government documents. More transparent and beneficial government
operations are possible by opening the data collected by and about governments to public scrutiny in a
searchable and easy-to-use form. With the Web, the public sphere of deliberation and self-organization
can expand and give voice to individuals. However, information systems have also conjured Orwellian
images of government surveillance and business intrusion into private lives. It remains for society to
harness the power of information systems by strengthening legal, social, and technological means.
With the exponentially growing power of computers, driven by Moore's law, and the development of ever more-sophisticated software, in particular systems deploying the techniques of artificial intelligence (AI), job markets and professions have been affected. Flexible and inexpensive robotics reduces some opportunities in the labour markets. Cognitive computing, with systems relying on AI techniques, such as computer learning, pattern recognition in multiple media, and massive amounts of stored information, has emerged as a competitor to human professionals.
The emergence of the on-demand economy, enabled by information system platforms, has raised
concerns about the quality of jobs. Providing instant access to services, such as transportation, the
platforms (for example, Uber and Lyft) connect the service suppliers, usually individuals, with those
seeking the service. Although such business models are said to erode stable workplaces, they offer
flexibility, a larger measure of independence to the suppliers, and convenience to consumers.
Information systems as a field of study

Information systems is a discipline of study that is generally situated in business schools. The essential
objective of the discipline is to develop and study the theories, methods, and systems of using information
technology to operate and manage organizations and to support their marketplace offerings. The
discipline employs a sociotechnical approach, placing the study of information technology in the context
of management, organizations, and society. The academic study of information systems originated in the
1960s. The scholarly society furthering the development of the discipline is the Association for
Information Systems (AIS).

Business Analytics: Just another passing Fad?

Jane Griffin, Principal, Deloitte Consulting LLP

Business analytics has the potential to be a fad, but only if that's how your organization approaches it. If
you go just far enough to check the business analytics box on a strategic plan (buying some software,
adding analytics staff here and there), you'll get exactly what you put into it, or less. Window dressing
that delivers shallow capabilities is on the fast track to irrelevance.
The executives I talk to every day are wrestling with business decisions where a better understanding of
data at a very deep level can make all the difference. Take risk management and fraud detection, for
example. It used to be that banks were the companies that had advanced analytics capabilities in place to
keep fraud in check. But now other industries such as health care, life sciences and defense are using
advanced analytics in similar ways to detect patterns that show something is amiss. Corrupt practices,
money laundering, embezzling, IP theft and more. Low-level analytics just won't get you there.
Work your way through the list of ground-shaking developments in business today: none are areas where
companies can continue to shoot from the hip. Pricing. Workforce trends. Health reform. Even security
and terrorism threats. These are all complex challenges where advanced signal detection capabilities are
critical. And the new generation of business analytics tools can bring those capabilities within easier reach
than ever, as long as you don't treat them like just the latest round of business hype. When business
analytics technologies are hardwired into your business processes, the result can be a sharper view of the
patterns and signals buried deep below the surface of your data. That's no fad. That's a serious
competitive advantage.

A retail perspective
Mary Delk, Director, Deloitte Consulting LLP

The retailers I talk with think business analytics is the real deal. Their problem is figuring out how to
make it work in organizations that were built with a whole different approach in mind. Retail has always
been about intuition: in a world of fickle customer desires, the person who can predict the next big thing
is the one who wins. No algorithm could ever replace that. Right? Instinct still has a part to play in retail,
but the kind of predictive insight that can be obtained from business analytics is already proving to be a
game-changer for some of the leading retail companies. Here are some of the things they've learned:
Link analytics goals to business drivers. For instance, if your strategy is to be the low-cost provider, it
may make sense to focus on supply chain analytics to reduce cycle times and cost. Wherever you decide
to begin in applying deep analytics, keep it relevant, especially when you're getting started.
Make the case. A business case will help focus your efforts and help ensure you're getting the returns
you expect.
Know your data. Good data is the fuel for insight. Make sure you know what you have and what you
don't have from the outset. It can save you a lot of grief down the road.
Start simple. While it's tempting to apply analytics in every corner of your business, start by picking
your battles carefully. Put your energy into projects that can deliver measurable benefits quickly. A lot of
retailers we work with start with pricing or customer analytics.
Use what you have. You've been collecting a lot of data already, right? Feed it into your new business
analytics tools.
Run a pilot. Don't feel you need to roll out an enterprise initiative when you're just getting started.
Create manageable pilots to test hypotheses in the early stages and build out from there.
Analytics can help retailers make smarter choices that lead to real business value. Organic sales growth.
Margin increases. Reductions in costs or spending. Talent retention. You name it and business analytics
can probably help. But not if you sit on the sidelines and wait for others to show the way.

An insurance perspective
Tami Frankenfield, Specialist Leader, Deloitte Consulting LLP

A passing fad or the real deal? To some extent, we have been doing one or more forms of business
analytics in the insurance industry for some time. But have we really been leveraging our capabilities to
gain both insight and foresight into our business? Business analytics is as much about looking into the
future as it is about evaluating the past.
For the insurance industry, business analytics is about embedding insight into our operations to drive
business strategy. So, what does that mean? It means that we can leverage the wealth of data we have to
guide and evolve our decision-making processes: processes such as underwriting a policy, establishing a
reserve for a claim, or presenting the next product to up-sell or cross-sell to a customer.
Through business analytics, we can analyze segments of our business with a Business Intelligence tool,
perhaps providing insight into pockets of our business that indicate adverse selection. We can then
leverage this information to hone our underwriting guidelines. We can do this directly in the underwriting
process through embedded analytics.
Through business analytics, we can leverage predictive models derived from claims analysis to refine the
way we reserve claims at first notice of loss. Those same capabilities (and data) can be leveraged to detect
claims fraud. Better yet, we can detect the indication of fraud directly in the loss report process.
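The fraud-flagging idea above can be sketched as a simple outlier check at first notice of loss: compare the reported amount against historical claims of the same type and flag anything far outside the norm. This is an illustrative z-score rule with invented numbers and threshold, not any insurer's actual model; real fraud detection would draw on many more signals than the claim amount alone.

```python
from statistics import mean, stdev

def flag_suspicious(historical_amounts, new_amount, z_threshold=3.0):
    """Flag a newly reported claim whose z-score against historical
    claims of the same type exceeds the threshold."""
    mu = mean(historical_amounts)
    sigma = stdev(historical_amounts)
    z = (new_amount - mu) / sigma
    return z > z_threshold, z

# Toy history of windshield-claim amounts (invented numbers).
history = [420, 455, 390, 510, 470, 445, 430, 480, 460, 440]
suspicious, z = flag_suspicious(history, 2500)
print(suspicious)  # an amount this far above the norm gets flagged
```

Embedding even a crude rule like this directly in the loss-report workflow is what lets the indication of fraud surface at intake rather than weeks later in an audit.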
Through business analytics, we can identify the buying patterns of our existing customers and derive
propensity models. What better way to leverage that information than use it to predict the behaviors of our
prospective customers? Take it one step further and we can combine the propensity scores with behavior
analytics and embed these into our sales process, increasing the likelihood of presenting the right product
to the right customer at the right time.
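As a sketch of the propensity-modeling idea, the snippet below fits a tiny logistic regression by gradient descent on invented purchase data and scores a prospect. The features, labels, and numbers are all hypothetical; a production model would use a proper ML library and far richer behavioral data.

```python
import math

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Fit weights w and bias b for P(buy) = sigmoid(w.x + b)
    using stochastic gradient descent on log loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def propensity(x, w, b):
    """Score a customer: predicted probability of buying."""
    return 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, x)) + b)))

# Each row: [has_auto_policy, tenure_years / 10]; label: bought umbrella policy.
X = [[1, 0.2], [1, 0.8], [0, 0.1], [0, 0.5], [1, 0.6], [0, 0.9]]
y = [1, 1, 0, 0, 1, 0]
w, b = train_logistic(X, y)
print(propensity([1, 0.7], w, b))  # long-tenured auto customer scores high
```

The same scoring function can then be called inline in the sales workflow, which is what "embedding these into our sales process" means in practice.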
Business analytics allows us to leverage our information assets to gain insights, detect exceptions and
identify patterns to drive integrated decision-making. Is the road ahead really that steep? The fact is that
many insurance companies have the data foundation in place to enable business analytics. The next step is
to capitalize on that foundation and take action through business analytics.

A life sciences perspective

W. Scott Evangelista, Principal and National Life Sciences Commercial Solutions Leader, Deloitte
Consulting LLP

Without committed leadership from the top of an organization, the best answers will not find the light of
day, and the "my number is better than yours" issue will persist. The shackles of the past (standard reports
with standard data) will inevitably bind companies to increasingly failing strategies. I believe it is time
leadership embraces predictive modeling to enable more effective decision making.
So many companies, when faced with gradual market shifts and increasing competition or strengthening
barriers, keep turning to old solutions and don't recognize they are in the midst of new problems.
Leadership at many companies reacts so slowly to change that the companies are often in dire straits
before the mandate for change comes, usually from a newly appointed CEO.
The one advantage pharmaceutical companies have in leveraging predictive analytics is that most of them
are so far behind the state of what is possible that they can learn quickly from other industries and adopt
tested technologies to facilitate their rapid adoption.
If pharmaceutical leadership is not investing today in building broad capabilities in predictive analytics,
they will soon be looking to leverage their vast experience on the lecture circuit. Leadership needs to
embrace the notion that analytics can help them create and find insights that will yield competitive
advantage, and even if they don't embrace it, they should at least be willing to test the concept. In
many ways, companies have been outsourcing these capabilities for years to vendors that do very robust
statistical modeling and make recommendations on how resources should be allocated. This results in
many companies getting the same advice (most use the same few vendors) and
having no meaningful advantage.
It's time for a change; the lecture circuit isn't that interesting!

Roundup of Analytics, Big Data & Business Intelligence Forecasts and Market Estimates, 2015
Salesforce (NYSE:CRM) estimates adding analytics and Business Intelligence (BI) applications will
increase their Total Addressable Market (TAM) by $13B in FY2014.
89% of business leaders believe Big Data will revolutionize business operations in the same way the
Internet did.
83% have pursued Big Data projects in order to seize a competitive edge.
Despite the varying methodologies used in the studies mentioned in this roundup, many share a common
set of conclusions: gaining greater insight into customers and their needs, simplifying sales cycles, and
streamlining customer service are all high priorities. The most successful Big Data use cases illustrate
how enterprises are getting beyond the constraints that hold them back from being more attentive and
responsive to customers.

Presented below is a roundup of recent forecasts and estimates:

Wikibon projects the Big Data market will top $84B in 2026, attaining a 17% Compound Annual
Growth Rate (CAGR) for the forecast period 2011 to 2026. The Big Data market reached $27.36B
in 2014, up from $19.6B in 2013. These and other insights are from Wikibon's excellent research of
Big Data market adoption and growth. The graphic below provides an overview of their Big Data
Market Forecast. Source: Executive Summary: Big Data Vendor Revenue and Market Forecast.
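As a sanity check on figures like these, CAGR is just the geometric mean of annual growth: CAGR = (end/start)^(1/years) - 1. A minimal helper, applied to the Wikibon dollar figures quoted above (the implied 2011 market size is back-of-envelope arithmetic, not a number from the report):

```python
def cagr(start_value, end_value, years):
    """Compound Annual Growth Rate: the constant yearly rate that
    takes start_value to end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Wikibon figures quoted above: $19.6B in 2013 -> $27.36B in 2014.
print(round(cagr(19.6, 27.36, 1) * 100, 1))  # one-year growth, in percent
# Implied 2011 market size if $84B in 2026 reflects 17% CAGR over 15 years:
print(round(84 / (1 + 0.17) ** 15, 1))       # in $B
```

Note how a single strong year (2013 to 2014) can sit well above the long-run CAGR of the full forecast period.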

IBM and SAS are the leaders of the Big Data predictive analytics market according to the latest
Forrester Wave: Big Data Predictive Analytics Solutions, Q2 2015. The latest Forrester Wave is
based on an analysis of 13 different big data predictive analytics providers, including Alpine Data
Labs, Alteryx, Angoss Software, Dell, FICO, IBM, Microsoft, Oracle, Predixion
Software, RapidMiner, SAP, and SAS. Forrester specifically called out Microsoft, noting that Azure
Machine Learning is an impressive new entrant that shows the potential for Microsoft to be a
significant player in this market. Gregory Piatetsky (@KDNuggets) has done an excellent analysis of
the Forrester Wave Big Data Predictive Analytics Solutions Q2 2015 report here. Source: Courtesy of
Predixion Software: The Forrester Wave: Big Data Predictive Analytics Solutions, Q2 2015 (free, no opt-in).
IBM, KNIME, RapidMiner and SAS are leading the advanced analytics platform market
according to Gartner's latest Magic Quadrant. Gartner's latest Magic Quadrant for advanced
analytics evaluated 16 leading providers of advanced analytics platforms that are used to build
solutions from scratch. The following vendors were included in Gartner's analysis: Alpine Data
Labs, Alteryx, Angoss, Dell, FICO, IBM, KNIME, Microsoft, Predixion, Prognoz, RapidMiner,
Revolution Analytics, Salford Systems, SAP, SAS and Tibco Software. Gregory Piatetsky
(@KDNuggets) provides excellent insights into shifts in the Magic Quadrant for Advanced Analytics
Platforms rankings here.

Source: Courtesy of RapidMiner: Magic Quadrant for Advanced Analytics Platforms, published
19 February 2015, Analyst(s): Gareth Herschel, Alexander Linden, Lisa Kart (reprint; free, no opt-in).

Salesforce estimates adding analytics and Business Intelligence (BI) applications will increase
their Total Addressable Market (TAM) by $13B in FY2014. Adding new apps in analytics is
projected to increase their TAM to $82B for calendar year (CY) 2018, fueling an 11% CAGR in
their total addressable market from CY 2013 to 2018. Source: Building on Fifteen Years of
Customer Success, Salesforce Analyst Day 2014 Presentation (free, no opt-in).
89% of business leaders believe big data will revolutionize business operations in the same way
the Internet did. 85% believe that big data will dramatically change the way they do business. 79%
agree that companies that do not embrace Big Data will lose their competitive position and may
even face extinction. 83% have pursued big data projects in order to seize a competitive edge. The
top three areas where big data will make an impact in their operations include: impacting customer
relationships (37%); redefining product development (26%); and changing the way operations is
organized (15%). The following graphic compares the top six areas where big data is projected to
have the greatest impact in organizations over the next five years. Source: Accenture, Big Success
with Big Data: Executive Summary (free, no opt-in).

The global Big Data market is projected to reach $122B in revenue by 2025. Frost & Sullivan also
forecasts that global data traffic will cross 100 Zettabytes annually by 2025. Source: World's Top
Global Mega Trends to 2025 and Implications to Business, Society and Cultures.
The global text analytics market has the potential to reach $6.5B by 2020, attaining a CAGR of
25.2% from 2014 to 2020. Customer Relationship Management (CRM), predictive analytics and
brand reputation are the top three projected applications. The following graphic provides an
overview of key findings from the report. Source: Global Text Analytics Market: Text Analytics,
Visualizing and Analyzing Open-Ended Text Data.

Customer analytics (48%), operational analytics (21%), and fraud & compliance (21%) are the
top three use cases for Big Data. Datameer's analysis of the market also found that the global
Hadoop market will grow from $1.5B in 2012 to $50.2B in 2020, and that financial services, technology
and telecommunications are the leading industries using big data solutions today. Source: Big Data:
A Competitive Weapon for the Enterprise.

37% of Asia Pacific manufacturers are using Big Data and analytics technologies to improve
production quality management. IDC found manufacturers in this region are relying on these
technologies to reduce costs, increase productivity, and attract new customers. Source: Big Data and
Analytics Core to Next-Gen Manufacturing.

Supply chain visibility (56%), geo-location and mapping data (47%) and product traceability data
(42%) are the top three potential areas of Big Data opportunity for supply chain management.
Transport management, supply chain planning, and network modeling and optimization are the three
most popular applications of Big Data in supply chain initiatives. Source: Supply Chain Report,
February 2015.
Finding correlations across multiple disparate data sources (48%), predicting customer behavior
(46%) and predicting product or services sales (40%) are the three factors driving interest in Big
Data analytics. These and other fascinating findings from InformationWeek's 2015 Analytics & BI
Survey provide a glimpse into how enterprises are selecting analytics applications and platforms.
Source: InformationWeek 2015 Analytics & BI Survey.

Banking, communications, media, utilities and wholesale trade increased their use of Big Data
analytics the most in the last twelve months. Gartner's latest survey of big data adoption, completed
in the 4th quarter of 2014, found that columnar/in-memory databases, high-capacity data warehouses,
search-based indexes and log data analytics all experienced greater than 5% growth in references
reporting active use. Source: Survey Analysis: BI and Analytics Spending Intentions, 2015, 13 May
2015, Report G00274741, Analyst(s): Josh Parenteau & Rita L. Sallam (client access).
IDC predicts the global Big Data and Analytics market will reach $125B in hardware, software
and services revenue this year. The professional services-to-technology ratio will also increase 25%
this year, due to increasingly complex system configurations and enterprise-wide solutions. IDC also
predicts the Internet of Things (IoT) market will attain a 30% CAGR from 2014 to 2019.
Source: IDC Predictions 2015: Accelerating Innovation and Growth on the 3rd Platform by
Frank Gens & IDC Predictions 2015 Team (free, no opt-in courtesy of SAP).
Gartner does not endorse any vendor, product or service depicted in its research publications, and
does not advise technology users to select only those vendors with the highest ratings. Gartner
research publications consist of the opinions of Gartner's research organization and should not be
construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect
to this research, including any warranties of merchantability or fitness for a particular purpose.