
WHITE PAPER 

Considerations for Getting Started with AI 
 
Published 2Q 2018 

COMMISSIONED BY:

KEITH KIRKPATRICK
Principal Analyst

CLINT WHEELOCK
Managing Director
 

SECTION 1 
INTRODUCTION 
Artificial intelligence (AI) is an umbrella term for multiple technologies that are designed to
provide computers with human-like abilities of hearing, seeing, reasoning, and learning.
These techniques, which include machine learning (ML), deep learning (DL), computer vision
(CV), and natural language processing (NLP), unmask hidden patterns in large data sets,
and then, using complex algorithms, can correlate findings between seemingly unrelated
variables.

As AI gains traction, organizations are realizing that only larger-scale, enterprise-wide
deployments are likely to provide full access to the operational and economic benefits of
these new technologies. But enabling AI is not a plug-and-play proposition. Significant time,
resources, and capital must be deployed, and in most cases, internal company teams are
not experienced enough with AI, nor do they have the cutting-edge data science skills,
software development expertise, or experience in selecting the right software platforms,
hardware, and infrastructure to adequately embark upon a truly transformational AI
implementation journey. Nevertheless, organizations can still take advantage of AI by
tapping into internal operational knowledge and external expertise to bring AI solutions to
market in a matter of months.

1.1 WHAT IS AI? 

Figure 1.1 Artificial Intelligence Techniques

Artificial Intelligence: area of computer science that emphasizes the creation of intelligent machines that work and react like humans.
Machine Learning: predict future outcomes based on past observations from big data.
Deep Learning: train and use (infer) models that mimic the biology of the human brain to interpret speech, images, and text.
Big Data and Analytics: search for "what, when, where and why" in data.

(Source: Cray, Inc.)


Before deploying an AI solution, organizations need to have a clear understanding of the underlying technologies and processes used to enable AI, as well as the benefits and limitations of potential solutions.

 Machine Learning is a type of AI that involves using computerized mathematical algorithms that can learn from data and depart from strictly following rule-based, pre-programmed logic. ML algorithms typically build a probabilistic model and then use it to make assumptions and predictions about similar sets of data (a minimal code sketch follows this list).

 Deep Learning is a form of ML that uses the model of human neural nets to make
predictions about new data sets. Tractica believes this is currently the most
promising of all AI technologies and is advancing other branches of the science,
including cognitive computing, image recognition, and NLP.

 Natural Language Processing enables computers to understand human language as it is spoken and written, and to produce human-like speech and writing. Machine translation of one human language into another language is also a form of NLP.

 Computer Vision attempts to identify images of objects that can be seen. It can
also include attempts to use the same technology to identify patterns in data, such
as seismographic readings, which humans cannot see.
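
The following minimal sketch illustrates the machine learning bullet above: a model learns a decision boundary from labeled examples rather than following pre-programmed rules. The library (scikit-learn) and the synthetic data are illustrative choices of ours, not tools discussed in this paper.

# Minimal machine learning sketch: fit a probabilistic model to past
# observations, then predict outcomes for new, similar data.
# Assumes scikit-learn is installed; the data set is synthetic.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)   # learn from data
print("Predicted class probabilities:", model.predict_proba(X_test[:3]))
print("Held-out accuracy:", model.score(X_test, y_test))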

With each of these AI techniques, the ability of the solution to provide “intelligence” depends on the data’s quantity (more is better), granularity (greater segmentation is preferred), and quality (taken from reputable sources). Further, the
complexity and structure of the algorithm itself will impact the results of the AI system, with
the best solutions and algorithms generally being designed or tuned to identify hidden
patterns, connections, and correlations in the data.

Each of these methods can be considered a step on a continuum that moves from simply
analyzing data to intelligently acting on data without human intervention. This is illustrated in
Figure 1.2.


Figure 1.2 The Artificial Intelligence Continuum

(Source: Cray Inc.)

1.2 POTENTIAL BENEFITS OF USING MACHINE LEARNING AND DEEP LEARNING 
ML and DL are likely to be the primary drivers of AI adoption, given that other AI technologies
rely on ML and DL to make sense of the data captured. The key features of ML and DL are
based on the ability to identify patterns in data, connect discrete data elements, and provide
faster and more powerful analysis than humans or static analytics programs. As a result,
enterprises are able to handle routine tasks more quickly and accurately, thereby increasing
productivity and efficiency.

Furthermore, ML and DL can enable the development of systems that permit more intuitive, human-like processing of information, making it simpler and more natural for humans to interact with machines and technology.

The end benefit for enterprises is the ability to augment or replace functions that are time or
resource intensive with automated, intelligent technology. This can lead to increased
productivity and increased efficiency, and often can open up new technological, product, or
service offerings that can directly improve a company’s bottom line.


SECTION 2 
IMPLEMENTATION CONSIDERATIONS 

2.1 UNDERSTANDING THE JOURNEY 
Enterprises need to understand AI in the context of their own business, industry, application
area, and technology choices. What is AI? Am I already using AI within my organization in
the form of business and data analytics? Once these fundamental questions are addressed,
enterprises need to understand that this is not a one-time implementation but rather a
continuous process that will need constant adjustment. Implementing an AI solution or
strategy is perhaps best considered as a continuum or journey, rather than a finite project
with a beginning, middle, and end. While basic timelines, such as project start dates, can be
fixed and adhered to, maximizing the investment in AI requires a larger investment of time
and resources to allow the solution to run, and then time to apply various tweaks or
adjustments to the algorithm or output to ensure it meets the changing needs of a business
or process.

In particular, ML and DL use a constant feedback or learning loop to refine the algorithm and
its output, and therefore, cannot be considered to be a finalized product or solution. Support
and maintenance of the algorithm is crucial to ensuring that the results remain valid and
optimized, particularly if changes to the business or process are being made based on the
ongoing output of the algorithm. This is especially true when using AI to discover new and
more efficient processes, or when using AI to handle real-time or near real-time analysis of
a specific task or process.
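
A minimal sketch of such a feedback loop appears below; the data source, update cadence, and acceptance check are hypothetical placeholders rather than recommendations from this paper.

# Illustrative learning loop: the model is periodically refit on newly
# collected observations instead of being treated as a finished product.
# All names and thresholds here are hypothetical.
import numpy as np
from sklearn.linear_model import SGDClassifier

def fetch_new_labeled_data():
    """Stand-in for pulling the latest labeled observations from production."""
    X = np.random.rand(200, 10)
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)
    return X, y

model = SGDClassifier()
for cycle in range(5):                        # in practice, a scheduled job
    X_new, y_new = fetch_new_labeled_data()
    model.partial_fit(X_new, y_new, classes=np.array([0, 1]))  # incremental update
    print(f"cycle {cycle}: accuracy on new data {model.score(X_new, y_new):.2f}")
    # A real pipeline would validate on a held-out set and roll back the
    # update if the score falls below an agreed threshold.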

2.2 KEY BUSINESS DRIVERS TO JUSTIFY AI INVESTMENT 
The use of AI is commonly tied to a desire to improve the efficiency of a process, product,
or service; reduce the cost of performing a task or process; or to generate additional revenue
or profit based on the improved efficiency of a task or product. These improvements stem largely from AI's ability to uncover new and more efficient ways of handling a task; to find new patterns or correlations that connect disparate data points; or simply to improve on human or pre-programmed processes by allowing the algorithm to learn from past actions, making adjustments far faster and in a more data-driven way than humans can.

2.3 COMPETITIVE DIFFERENTIATION 
Perhaps the biggest reason AI is projected to gain favor among enterprises is the ability to
differentiate a product or service offering. Using ML or DL, it is possible to tweak or modify
an algorithm to stress or weight a specific factor or factors, thereby changing or biasing the
result. For example, an auto insurance company that wishes to target younger drivers may
wish to tweak its underwriting algorithm to de-emphasize “years licensed” when it considers
an underwriting decision.

AI could also be used to spawn new business ideas or business models, creating new lines
of revenue and spearheading competitive and product differentiation in the marketplace. AI
should be viewed as a cognitive engine rather than just an analytics engine, allowing organizations to start thinking along the lines of a living and breathing entity that can adapt to new competitors, prices, customer demand, business models, and supply chain disruptions.
AI can help organizations acquire language and vision capabilities, parsing and
understanding documents, images or video, giving them the ability to react and adapt to shifts in the business environment like never before. Organizations that are willing to look
beyond cost efficiencies should follow a parallel path of treating AI as a strategic business
tool.

2.4 PROJECT METRICS AND GUIDELINES 
From an organizational and revenue-generating standpoint, AI is still in its infancy. Thus,
there is a wide range of fee and billing structures used by services firms, platform vendors,
and other market participants. However, generally speaking, when an organization sells a
platform or software package and then adds on a services component, there is usually a
software license or service and support model put in place, in order to cover the development
of the software or tools. Some integration and customization service work may be covered
under these fees, but as integrations have become more complex, there is usually an
additional services component that is charged to address the significant labor costs required
to ensure a smooth integration.

While the exact structure and terms of an AI engagement will vary based on the provider,
the client, and the use case, most AI engagements will follow a fairly standard structure.
Generally, projects are structured in a staged approach (see Figure 2.1), starting with (1) the research and selection of a use case. This stage of the process is critical, as a clear business case, along with stated goals and metrics, must be decided upon to ensure the project remains on track.

Then, a proof-of-concept (POC) program is initiated, which requires (2) selecting data
sources to feed into an algorithm, (3) building or customizing a pre-built algorithm to the
customer’s requirements, and then (4) running and testing the solution to ensure it yields
the expected benefits.

Once the POC has been completed, which can take anywhere from about 3 to 4 months,
the client and any services providers will (5) evaluate the program, and decide whether to
expand the program or to try a new use case.

Figure 2.1 Proof-of-Concept Cycle

(Source: Tractica)


2.5 OWNERSHIP OF AI 
The ownership of AI within an organization is key to how AI projects get implemented and
managed on a day-to-day basis. Centralized management structures for AI require a Chief
AI Officer in charge of AI projects, spearheading management and implementation of AI
within an organization. Centralized AI management allows for a “paintbrush” approach to AI
where AI can be applied to multiple silos within an organization, but risks central
management not clearly understanding the key business metrics or needs within each of the
silos. Centralized management is, however, a good approach to attracting the best talent, with AI engineers looking to join a company where the “Head of AI” is someone who has major standing in the AI community.

For the most part, decentralized AI is a better approach to follow, as most companies do not have a clear sense of how or where AI can help. Organizations need to understand their enterprise-wide and individual departmental goals first. A decentralized approach gives departments the freedom to define and drive their own AI strategy rather than have central management dictate terms. In the long run, as AI becomes the default way of running and managing a business, the CEO will be the de facto Chief AI Officer or Head of AI; creating a separate, centralized AI function outside the CEO's office can therefore turn out to be short-sighted.

2.6 LEVEL OF INVESTMENT 
While there are no hard-and-fast metrics on how much an initial AI deployment will cost, a pilot program can generally be started with an investment in the low- to mid-six-figure range. The overall total investment will be affected by the length of the pilot program, the complexity of the project, and the number of people working on the engagement.

It is important to differentiate between the cost of investing in AI and the efficiency gain being targeted. While it is advisable to start with a reasonable investment, one should target a sizable business function with hundreds of millions or even billions of dollars in cost, where even a few percentage points of efficiency gain can lead to a meaningful return on investment.
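
The arithmetic behind this guidance can be illustrated with a hypothetical example (the figures below are placeholders, not Tractica estimates):

# Back-of-the-envelope ROI sketch with hypothetical figures.
pilot_cost = 300_000            # low-to-mid six-figure pilot investment
annual_cost_base = 500_000_000  # annual cost of the targeted business function
efficiency_gain = 0.02          # a 2% efficiency improvement

annual_savings = annual_cost_base * efficiency_gain
print(f"Annual savings: ${annual_savings:,.0f}")                    # $10,000,000
print(f"First-year return multiple: {annual_savings / pilot_cost:.0f}x")  # 33x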

Still, to take the successes of a pilot program and expand them across the organization to
attain tangible benefits, additional investments in hardware, software, and support and
maintenance will be required.

Increasingly, though, some vendors are tying service fees to specific milestones related to
outcomes, which can include revenue increases, operational efficiency improvements, or
other metrics that clearly demonstrate success. This model is particularly useful when a per-
device licensing model would simply be too expensive for most organizations to handle.

The need for high-powered hardware that can handle the demands of DL is another key cost
consideration for enterprises. Hardware vendors, of course, will price their offerings based
on the power, size, and reliability of their solution, though the development of new AI
computing hardware likely will help reduce the cost of hardware over time.

2.7 INFRASTRUCTURE CONSIDERATIONS 
For some applications, AI solutions can be handled in the cloud, with data from an
organization being transferred to a remote third party that handles the development,
processing, and output for ML and DL processes. But for many applications and use cases
where security and privacy are paramount, the organization will need to invest in its own AI
infrastructure, which includes an AI platform, software, and hardware.


Moreover, training an individual AI model or a portfolio of models is extremely time-intensive. Cloud-based solutions generally charge enterprises by the compute hour, and paying for the amount of compute time required to train complex algorithms may wind up costing significantly more than simply purchasing the hardware and software outright.
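
As a hedged illustration of this trade-off, the sketch below compares a pay-per-compute-hour model with an outright purchase; every price and workload figure is an assumption made for illustration only.

# Hypothetical break-even comparison: renting GPU compute by the hour
# versus purchasing a training system outright. All figures are assumptions.
cloud_rate_per_hour = 3.00         # assumed price of one GPU compute hour
training_hours_per_month = 2_000   # assumed ongoing training workload
system_purchase_cost = 150_000     # assumed price of an on-premise system

monthly_cloud_cost = cloud_rate_per_hour * training_hours_per_month
break_even_months = system_purchase_cost / monthly_cloud_cost
print(f"Monthly cloud spend: ${monthly_cloud_cost:,.0f}")              # $6,000
print(f"Break-even versus purchase: {break_even_months:.0f} months")   # 25 months
# Power, cooling, staffing, and depreciation are ignored here but matter in practice.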

Organizations that want to own and manage their own AI infrastructure will need to select an AI platform on which to build or install AI solutions. These platforms range from well-known, large frameworks, such as TensorFlow from Google, to more specialized platforms, such as Ayasdi, which is used to support the development of predictive models.
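
As a minimal sketch of building on such a framework, the snippet below defines and trains a small model with TensorFlow's Keras API; the architecture and synthetic data are illustrative only and do not reflect any vendor's reference design.

# Minimal TensorFlow/Keras sketch: define, train, and evaluate a small model.
import numpy as np
import tensorflow as tf

X = np.random.rand(1000, 20).astype("float32")   # synthetic feature data
y = (X.sum(axis=1) > 10.0).astype("float32")     # synthetic labels

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
loss, accuracy = model.evaluate(X, y, verbose=0)
print(f"training-set accuracy: {accuracy:.2f}")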

Then, the organization will need to acquire supercomputers with powerful, scalable processors and graphics processing unit (GPU) accelerators to handle the large data sets required for training DL models, as well as the need for continuous, high-speed algorithmic processing. Speed is paramount, as training a complex algorithm can take days or weeks, consuming a significant amount of power. Moreover, time spent training an algorithm delays its use for inference, the AI parlance for an algorithm that is actually being put to work. Powerful GPUs offer significant benefits here as well, as they are required for running the billions of computations needed to apply the trained network to identify known patterns or objects.

A key platform consideration needs to be ease of development and explainability of AI models. Most enterprise AI platforms today are moving away from code-based development toward drag-and-drop modules in graphical user interface (GUI) environments. Improved visualization helps to bring down development time and opens AI development to a wider talent pool, rather than just experienced data scientists. However, with drag-and-drop modules, there is a risk that interpretability and explainability of AI models get sidelined. As AI models start to take over key business decisions and dictate how customers get serviced or targeted, companies need to pay attention to the problem of “black box” models. AI platforms need to provide interpretability functions in their models, such as feature selection impact, which would help regulators or investors parse the AI decision-making process. The legal and regulatory requirements around AI model interpretability are expected to get tougher, and therefore AI platforms without model interpretability functions pose a risk.
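
One widely available interpretability function of this kind is permutation feature importance; the sketch below uses scikit-learn with synthetic data as a stand-in for a platform's built-in tooling.

# Permutation importance sketch: measure how much held-out performance
# drops when each input feature is shuffled, which gives a model-agnostic
# view of which features drive a "black box" model's decisions.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

for i, mean_drop in enumerate(result.importances_mean):
    print(f"feature_{i}: importance {mean_drop:.3f}")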

Most importantly, the AI hardware solution needs to be robust enough to handle today’s
applications, but also customizable, adaptable, and expandable to address future AI
processing needs. Generally, organizations should select the latest generation of multi-core
processors and GPUs in order to support the use of complex algorithms that may be
integrating hundreds of discrete data inputs at once and ensure that any tweaks or
modifications to the algorithm can be supported with adequate processing power.

As discussed previously, the performance of many algorithms will improve over time, as long
as there is new data being fed into the model, which requires a large reserve of data storage
and processing power. Most vendors offer a variety of storage options, but it is wise to select
systems that can balance performance levels, scalability, and availability within the project’s
budget level.


SECTION 3 
CASE STUDY EXAMPLES 
While AI utilization is far from ubiquitous, there is significant activity occurring within
organizations at scale. By leveraging lessons learned during pilot programs, incorporating
internal domain and process knowledge, and then integrating external hardware, software,
and AI resources, organizations have been able to deploy commercially viable AI solutions.
The following examples illustrate how ML and DL have been deployed successfully,
providing organizations with tangible, real-world results.

3.1.1 OIL & GAS: LOCATING NEW RESERVES 
The upstream oil & gas industry is focused on locating and producing crude oil and natural gas, and is often also referred to as the exploration and production (E&P) sector. Oil & gas companies can leverage AI in a number of ways; in particular, planning and forecasting can be improved by using DL to incorporate macroeconomic trends, along with economic, production, and weather patterns, into exploration and production investment decisions.

Applying AI in the operational planning and execution stages can significantly improve well
planning, real-time drilling optimization, frictional drag estimation, and well cleaning
predictions. AI techniques can also be applied in other activities, such as reservoir
characterization, modeling, and field surveillance, to accurately characterize reservoirs in
order to attain optimum production levels.

AI systems hold the key to pinpointing new drilling sites containing valuable crude oil, as ML and DL can transform basic data into valuable insights that can be used throughout the exploration and production process, including seismic, geology, drilling, petrophysics, reservoir, and production data.

Geophysical feature detection is a critical part of the workflow in the oil & gas industry.
Seismic surveys are carried out in the exploratory phase and during various other phases,
from planning to field characterization before and during oil production. Once the data is
gathered, the seismic traces are then processed and analyzed by human experts. Typically,
this process can take several months.

Recently, Shell and the Massachusetts Institute of Technology (MIT) partnered to use AI
techniques to automate this process and improve workflow efficiencies. Using DL, the raw
seismic traces were analyzed to discover and locate subsurface faults in the underground
structure, which are likely to contain hydrocarbons, before running migration and
interpretation models. While there are still challenges in training and computational
requirements, the study proved that geophysical feature detection could be automated.

Oil exploration capital expenditure is estimated to be around $100 billion per year, so any approach that delivers savings and efficiencies in geophysical analysis is expected to be adopted widely across the oil & gas industry.

3.1.2 AUTOMOTIVE: OBJECT DETECTION AND RECOGNITION 
Perhaps no other industry is as closely aligned with AI in the minds of consumers as the
automotive industry, thanks to the media’s focus on self-driving cars. But AI is being used in
other use cases that can support human or autonomous driving and address maintenance issues, personalization services, and ride-sharing services.

The most widely publicized use of AI in vehicles is object detection and classification, which takes sensor data, often from cameras, and then uses complex algorithms to detect and classify objects so that the AI system can “learn” their characteristics and recognize them in real time. The challenge is not in capturing images, as today’s high-definition (HD)
cameras can present images in stunningly clear detail. However, in a moving environment,
objects can appear to change size as a vehicle or camera approaches. The angle at which
an object is viewed can also skew its appearance, and the presence of other factors (rain,
bright sunlight, low lighting, glare, dirt, snow, or any other number of obstructions) can alter
the appearance of an object, making it hard to accurately and consistently identify the object.

This is an area where machine vision and ML can provide invaluable support. By capturing a wide range of images of objects from a variety of vantage points and angles, and under different conditions, a repository of definitively classified images can be created and used to “train” an ML system to identify and classify objects that resemble those in the repository. By then assigning other attributes to each object, such as whether the object is informational (like a traffic sign), whether it is permanent or temporary (like a road barrier), or whether it is capable of motion and how it typically moves, the system can begin to develop logical rules for handling each type of object.
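
One common way to broaden such a repository is data augmentation, which synthetically varies viewpoint, scale, and lighting during training. The sketch below uses TensorFlow's Keras preprocessing layers with illustrative parameters; it shows the generic technique, not any specific automaker's pipeline.

# Data augmentation sketch: each training image is randomly flipped,
# rotated, zoomed, and brightness-shifted so the model learns to recognize
# objects under varying viewpoints and conditions. Parameters are illustrative.
import tensorflow as tf

augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),     # rotate up to ~36 degrees
    tf.keras.layers.RandomZoom(0.2),         # simulate objects changing apparent size
    tf.keras.layers.RandomBrightness(0.3),   # simulate lighting changes
])

frames = tf.random.uniform((8, 224, 224, 3))  # stand-in for a batch of camera frames
augmented = augment(frames, training=True)
print(augmented.shape)                        # (8, 224, 224, 3)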

Luxury automaker Mercedes-Benz has introduced AI technology into its 2018 S-class sedan,
via its Drive Pilot driver-assistance features. These semi-autonomous driver-assist
technologies include adaptive cruise control, a lane-keeping setup that can handle limited
steering duties and autonomous lane changes, and a 360° array of radar and ultrasonic
sensors for keeping track of lane markings, other cars, and road signs.

DL technology is used to train the system to distinguish between the various elements that may be in the car’s operational envelope (such as lane markings, signs, roadside barriers, other vehicles, pedestrians, and animals) and provide input into the system. After ingesting images of these objects during training, the DL technology can then extrapolate and learn what they may look like from different angles, when lighting conditions change, or when the car is moving at different speeds.

Thanks to DL, driver-assistance systems can be brought to market more quickly than if an
object-detection system needed to be fed every object that could be encountered, from all
angles and in all lighting conditions.

3.1.3 LIFE SCIENCES: DRUG COMPOUND DISCOVERY 
In healthcare, AI is largely being implemented as a tool to more efficiently and accurately
review data, and uncover patterns in the data that can be used to improve analyses, uncover
inefficiencies, and streamline care, from both a clinical and an operational perspective. The
overarching driver is to provide better care for humans, while reducing costs and
administrative headaches and bottlenecks. Some of the use cases that have shown promise
and results include the use of ML to analyze huge clinical and genomic databases and
identify relevant predictive biomarkers for specific types of diseases.

Researchers are using a range of data sets (e.g., genomic data, gene expressions,
proteomics, clinical data, etc.), and integrating signals and timeframes from this data to
develop molecular profiles. In some cases, humans provide canonical disease or drug maps
to cover various therapeutic areas and disease types. Hundreds of canonical pathways are
analyzed and enriched to infer a disease's or drug's mechanisms of action (MOA). From here, molecular profiling data is fed into algorithms that use ML to identify biomarkers and/or
drug sensitivity to specific biomarkers.

AI can also be used to increase the speed and efficiency of drug discovery and testing. AI
offers new ways for researchers to leverage existing databases, develop new databases
involving bigger and more diverse data, and to predict how molecules will behave and how
likely they are to make a useful drug, thereby saving time and money on unnecessary tests.
DL could help with drug development by finding patterns in sparse pathology data combined
with large genomic data sets.

The ability of AI to find, identify, and analyze patterns is also yielding benefits in analyzing
medical images. Historically, analyzing medical images has been difficult, highly prone to
human error or oversight, and time-consuming and costly. Medical images like magnetic
resonance imaging (MRIs), X-rays, computed tomography (CT) scans, and other diagnostic
images are essential to better understanding and diagnosing a wide range of conditions.
When it comes to diagnosing critical conditions, including cancer, neurodegeneration, and heart disease, the greater the speed, precision, and predictive capability, the better. Analyzing images is a strong application for DL and CV within the realm of patient
data processing. In particular, DL is now being applied to automate the analysis and increase
accuracy, precision, and understanding of images down to the pixel.

Drug discovery, the process by which new medications are identified, has historically centered on isolating the active ingredient in traditional remedies or on simple serendipity. Since the sequencing of the human genome (which enabled rapid cloning and synthesis of large quantities of purified proteins), it has become common to use high-throughput screening of large compound libraries against isolated biological targets. Developing a new drug still costs about $2.6 billion and can take as long as 14 years, and less than 10% of potential medications make it to market, according to research from Tufts University and the U.S. Food and Drug Administration (FDA).

Many large pharmaceutical companies are partnering with AI drug discovery startups in a bid to reduce costs and time to market.

GlaxoSmithKline (GSK) recently announced a $43 million partnership with Exscientia to search for drug candidates for up to 10 disease-related targets. Atomwise recently partnered
with drug giant Merck and published first findings on Ebola treatment drugs last year.
BenevolentAI is a British company focused on developing better drugs to target diseases of
inflammation and neurodegeneration, and rare cancers. The idea is to use much of the dark
data within pharma research and development (R&D) organizations and apply vast data sets
available on human health and biological systems to DL systems that learn and reason from
interaction between human judgement and data. Numerous other companies are emerging
in this space, such as Calico, Numerate, Globavir, NuMedii, twoXAR, and Cloud
Pharmaceuticals.

3.1.4 FINANCIAL SERVICES: INSURANCE  
The financial services market is awash in data: transaction data used for inter-institution and market-making activities; product sales and customer data; and operational data used for managing the day-to-day operations of an institution, security, and marketing. The ability to harness this data, identify patterns, and create new efficiencies is a prime driver of AI technology in the financial services industry. While it is a highly regulated industry, financial services firms have made the case that efficiency and safety can, in many cases, be better handled by machines than by humans, because machines cannot be swayed by emotion, stress, or other outside factors.

For example, credit rating agencies are now beginning to explore AI, ML, and DL to aid in
credit scoring, primarily to assess creditworthiness more precisely through more nuanced
evaluations of data. Instead of looking at one or a few separate variables, AI engines help
consider mitigating interactions between multiple variables. For instance, even if a consumer
skipped payments on two debts within 24 months, but paid consistently for 12 months
straight, and obtained new lines of credit, that may be weighted to mitigate the risk of the
past missed payments. The other potential benefit is to consider people who might not have
been able to get a score in the past, via traditional logistic regression-based scoring (which
looks at credit history).

In some cases, companies are combining NLP, ML, and DL to automatically generate reports, handling the identification and extraction of data from relevant internal and external data sources. Using ML, the system can then apply predictive modeling and data enrichment, create hundreds of “what if” scenarios, and perform trend analysis.

An insurance company’s primary objective is to use and process customer data to model
risk factors, improve insurance products, prevent losses and fraud, and reduce the amount
of money it pays out. In the near term, insurance companies are seeing potential for reducing
fraud by detecting anomalies or patterns associated with fraudulent activity. Algorithms can
also help speed up claims processing by automatically assessing the severity of a claim and
predicting costs from historical data, sensors, images, or other data sources.
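
A minimal sketch of anomaly-based claims screening appears below, using an Isolation Forest on synthetic claim features; the feature set, values, and flagging rate are illustrative assumptions rather than any insurer's actual model.

# Anomaly detection sketch: an Isolation Forest learns what routine claims
# look like and flags claims whose feature combinations look unusual.
# Features and figures are synthetic and purely illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Columns: claim amount ($), days since policy start, number of prior claims.
historical_claims = rng.normal(loc=[2000, 400, 1], scale=[800, 150, 1], size=(5000, 3))
detector = IsolationForest(contamination=0.01, random_state=0).fit(historical_claims)

new_claims = np.array([
    [2500, 350, 0],      # looks routine
    [45000, 3, 9],       # unusually large claim on a very new policy
])
print(detector.predict(new_claims))  # 1 = looks normal, -1 = flagged for review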

FitSense is a company focused on personalizing insurance products by using app and device data. It has built a data aggregation platform that integrates, processes, and securely
stores data across various channels (e.g., wearables, biometrics, health apps, demographic
data, etc.). It uses ML and NLP to model and interpret raw data into specific customer and
risk profiles, and then leverages that data to help insurance companies design and
substantiate new insurance products and services.

In particular, the app enables insurance companies to offer their own white-labeled self-
quantification, health management, and incentive programs. Other companies developing in
this space include DreamQuark, Big Cloud Analytics, and CogniCor, which offers a chatbot
assistant for complaints and claims resolution and then uses interactions to improve
insurance products and services.


3.1.5 CONSUMER: CUSTOMER RECOMMENDATIONS 
AI technology is also being used with consumer products and in the retail environment to
make life more convenient or to improve the efficiency of operations. By harnessing the
power of ML or DL, products, attributes, and shopping-related advertisements and marketing
information can be tailored to granular actions and triggers, creating a truly personalized
product experience. Furthermore, because DL algorithms can be used to uncover new
connections between data points, seemingly unrelated consumer behavior or data can be
more efficiently tracked and harnessed by retailers, marketers, and product manufacturers
on an ongoing basis.

Sentiment analysis, which involves understanding the emotional context of buyers, can be very useful for gaining an overview of public opinion, ideation, or feedback on a given topic. Common approaches for measuring brand sentiment include the net promoter score (NPS), up/down votes, emojis, basic Likert scales, or other measures.

However, AI and NLP are now enhancing sentiment analysis by capturing and understanding
the unstructured, more nuanced, and qualitative feedback, not just the best fitting response
in a multiple-choice scenario. This data is combined with structured data sets for advanced
analytics to surface trends. For example, retailers can track social media sentiment analysis,
and then use NLP to dig deeply into the rich nuances of comments and feedback.
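
As a hedged sketch of NLP-based sentiment analysis, the snippet below scores free-text comments with a pretrained model from the Hugging Face transformers library; the library, model, and comments are illustrative choices of ours, not tools named in this paper.

# Sentiment analysis sketch: score unstructured comments with a pretrained
# NLP model rather than forcing them into a fixed multiple-choice scale.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")   # downloads a default pretrained model

comments = [
    "Checkout was painless and the staff went out of their way to help.",
    "Third delayed shipment this month - I'm done with this brand.",
]
for comment, result in zip(comments, classifier(comments)):
    print(f"{result['label']} ({result['score']:.2f}): {comment}")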

The ability to see beyond simple happy-neutral-angry or like-dislike then allows retailers to
plan and act according to far more nuanced categories, personas, product lines, or
campaigns. The majority of data available to most organizations is “dark,” unstructured, and
unused, but potentially full of valuable insights, so AI can be used as a tool to shed light on
sentiments found in call logs, emails, transcripts, videos, rating applications, and audio data.

A large pharmaceutical company interested in optimizing C Space, its online community of caregivers for people with schizophrenia, recently partnered with AI software company
Luminoso to better understand the major issues these caregivers face and how to provide
them with better resources and communications. Together, they used NLP and deep
analytics on vast amounts of rich, but disparate and unstructured data, pulling together
content from online communities, online discussion boards, multiple research projects, photo
collages, and open-ended responses from surveys.

Luminoso’s software then vectorized the data, meaning it effectively turned the text into
mathematical vectors, and then mapped unstructured data based on relationships between
topics and ideas. It uncovered key themes and associations about the emotional composition
of caretakers, their struggles, concerns, resource needs, and how they change over time.
The pharmaceutical company also used the findings to improve community management,
messaging, and support services.
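
A generic sketch of the vectorization idea appears below, using TF-IDF vectors and cosine similarity from scikit-learn; this is not Luminoso's proprietary method, only an illustration of turning unstructured text into vectors and comparing topics numerically.

# Text vectorization sketch: turn unstructured posts into numeric vectors
# and measure how closely related their themes are.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

posts = [
    "I feel exhausted keeping track of medication schedules every week.",
    "Keeping track of my brother's medication leaves me completely drained.",
    "Where can I find local support groups for caregivers?",
]
vectors = TfidfVectorizer(stop_words="english").fit_transform(posts)
print(cosine_similarity(vectors).round(2))
# Posts about the same theme (here, medication management) score higher
# against each other than against unrelated posts.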


3.2 CONCLUSIONS AND RECOMMENDATIONS 
As organizations face an expanding number of AI projects, the demands on systems supporting ML and DL are growing in lockstep. Therefore, it is important to select platforms, software, hardware, and service providers that can scale with the growth of an AI deployment.

Partnering with vendors that can supply their particular platform or product and also provide honest, technology- and vendor-agnostic consulting, support, and maintenance services is a recipe for success. Assuming the vendor has been involved in the AI space, it likely has significant data science expertise, as well as real-world experience handling deployments, including how to overcome the inevitable hurdles and challenges.


SECTION 4 
ACRONYM AND ABBREVIATION LIST 
Computed Tomography ............................................................................................................................. CT

Computer Vision ......................................................................................................................................... CV

Deep Learning .............................................................................................................................................DL

Exploration & Production.......................................................................................................................... E&P

Food and Drug Administration (U.S.) ....................................................................................................... FDA

Graphical User Interface ........................................................................................................................... GUI

Graphics Processing Unit........................................................................................................................ GPU

High Definition ............................................................................................................................................ HD

Machine Learning....................................................................................................................................... ML

Machine Reasoning ...................................................................................................................................MR

Magnetic Resonance Imaging ..................................................................................................................MRI

Massachusetts Institute of Technology ..................................................................................................... MIT

Mechanisms of Action ............................................................................................................................... MOA

Natural Language Processing .................................................................................................................. NLP

Net Promoter Score ................................................................................................................................. NPS

Proof-of-Concept ..................................................................................................................................... POC

Research and Development.....................................................................................................................R&D

Software-as-a-Service............................................................................................................................ SaaS


SECTION 5 
TABLE OF CONTENTS 
SECTION 1
Introduction
1.1  What Is AI?
1.2  Potential Benefits of Using Machine Learning and Deep Learning
SECTION 2
Implementation Considerations
2.1  Understanding the Journey
2.2  Key Business Drivers to Justify AI Investment
2.3  Competitive Differentiation
2.4  Project Metrics and Guidelines
2.5  Ownership of AI
2.6  Level of Investment
2.7  Infrastructure Considerations
SECTION 3
Case Study Examples
3.1.1  Oil & Gas: Locating New Reserves
3.1.2  Automotive: Object Detection and Recognition
3.1.3  Life Sciences: Drug Compound Discovery
3.1.4  Financial Services: Insurance
3.1.5  Consumer: Customer Recommendations
3.2  Conclusions and Recommendations
SECTION 4
Acronym and Abbreviation List
SECTION 5
Table of Contents
SECTION 6
Table of Charts and Figures
SECTION 7
Scope of Study
Sources and Methodology
Notes


SECTION 6 
TABLE OF CHARTS AND FIGURES 
Chart 7.1  Tractica Research Methodology

Figure 1.1  Artificial Intelligence Techniques
Figure 1.2  The Artificial Intelligence Continuum
Figure 2.1  Proof-of-Concept Cycle


SECTION 7 
SCOPE OF STUDY 
This white paper examines the hardware, software, and services used to deploy AI technology within
commercial enterprises, government entities, and other large organizations. The technologies covered
include ML, DL, NLP, CV, and machine reasoning (MR). The paper uses insights from relevant Tractica reports, including
Artificial Intelligence for Enterprise Applications and Artificial Intelligence Services, while the case examples
discuss real-world implementations of AI technology.

SOURCES AND METHODOLOGY 
Tractica is an independent market research firm that provides industry participants and stakeholders with
an objective, unbiased view of market dynamics and business opportunities within its coverage areas. The
firm’s industry analysts are dedicated to presenting clear and actionable analysis to support business
planning initiatives and go-to-market strategies, utilizing rigorous market research methodologies and
without regard for technology hype or special interests including Tractica’s own client relationships. Within
its market analysis, Tractica strives to offer conclusions and recommendations that reflect the most likely
path of industry development, even when those views may be contrarian.

The basis of Tractica’s analysis is primary research collected from a variety of sources including industry
interviews, vendor briefings, product demonstrations, and quantitative and qualitative market research
focused on consumer and business end-users. Industry analysts conduct interviews with representative
groups of executives, technology practitioners, sales and marketing professionals, industry association
personnel, government representatives, investors, consultants, and other industry stakeholders. Analysts
are diligent in pursuing interviews with representatives from every part of the value chain in an effort to gain
a comprehensive view of current market activity and future plans. Within the firm’s surveys and focus
groups, respondent samples are carefully selected to ensure that they provide the most accurate possible
view of demand dynamics within consumer and business markets, utilizing balanced and representative
samples where appropriate and careful screening and qualification criteria in cases where the research
topic requires a more targeted group of respondents.

Tractica’s primary research is supplemented by the review and analysis of all secondary information
available on the topic being studied, including company news and financial information, technology
specifications, product attributes, government and economic data, industry reports and databases from
third-party sources, case studies, and reference customers. As applicable, all secondary research sources
are appropriately cited within the firm’s publications.

All of Tractica’s research reports and other publications are carefully reviewed and scrutinized by the firm’s
senior management team in an effort to ensure that research methodology is sound, all information provided
is accurate, analyst assumptions are carefully documented, and conclusions are well-supported by facts.
Tractica is highly responsive to feedback from industry participants and, in the event errors in the firm’s
research are identified and verified, such errors are corrected promptly.


Chart 7.1 Tractica Research Methodology

Market Research
  Primary Research: Industry Interviews, Vendor Briefings, and Product Evaluations (supply side); End-User Surveys and End-User Focus Groups (demand side)
  Secondary Research: Company News & Financials, Technology & Product Specs, Government & Economic Data, Case Studies, Reference Customers

Market Analysis
  Qualitative Analysis: Company Analysis, Business Models, Competitive Landscape, Technology Assessment, Applications & Use Cases
  Quantitative Analysis: Market Sizing, Market Segmentation, Market Forecasts, Market Share Analysis, Scenario Analysis

(Source: Tractica)

NOTES 
CAGR refers to compound average annual growth rate, using the formula:

CAGR = (End Year Value ÷ Start Year Value)^(1/steps) – 1.
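
For example (illustrative figures only, not Tractica estimates), a value that grows from $100 million to $200 million over five annual steps has a CAGR of (200 ÷ 100)^(1/5) – 1 ≈ 0.149, or about 14.9% per year.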

CAGRs presented in the tables are for the entire timeframe in the title. Where data for fewer years are
given, the CAGR is for the range presented. Where relevant, CAGRs for shorter timeframes may be given
as well.

Figures are based on the best estimates available at the time of calculation. Annual revenues, shipments,
and sales are based on end-of-year figures unless otherwise noted. All values are expressed in year 2018
U.S. dollars unless otherwise noted. Percentages may not add up to 100 due to rounding.


Published 2Q 2018 

© 2018 Tractica LLC


1650 38th Street, Suite 101E 
Boulder, CO 80301 
Tel: +1.303.248.3000 
Email: info@tractica.com 
www.tractica.com 

This publication is provided by Tractica LLC (“Tractica”). This publication may be used only as expressly
permitted by license from Tractica and may not otherwise be reproduced, recorded, photocopied,
distributed, displayed, modified, extracted, accessed or used without the express written permission of
Tractica. Notwithstanding the foregoing, Tractica makes no claim to any Government data and other data
obtained from public sources found in this publication (whether or not the owners of such data are noted in
this publication). If you do not have a license from Tractica covering this publication, please refrain from
accessing or using this publication. Please contact Tractica to obtain a license to this publication.
