
Big Data and Analytics Applied to Oil and Gas

How Agile Companies Thrive in an Internet-Connected World

Shawn Cutter
Quorum Business Solutions
Table of Contents
Executive Summary
Introduction
Technology Components
    Cloud and IoT
    Integration and Storage
    Database
    Data Models
    Machine Learning
The Connected Energy Value Chain
    Land Acquisition and Management
    Drilling and Completions
    Production Operations
    Back Office
    Midstream
Conclusion
References
About Quorum

Executive Summary
Information is the most valuable commodity throughout the oil and gas industry, from
upstream operators (unconventional, conventional, and EOR) to midstream gas and liquids
processors to the pipelines that deliver products to world markets. New embedded devices and
sensors make it easier to collect data at an ever-increasing rate, while cloud storage
technologies make it simpler to store that data; however, real value is created only when
informational context and associations are established both inside and outside an organization.

Current market volatility provides proof of the need for optimization across the oil and gas
value chain. Companies that use data to optimize business processes will not only survive, but
thrive during market downturns and dominate the industry on the next upswing.

Introduction
Energy producers live and die by the decisions they make; the concept is well known. E&Ps take
incredible risks to find, drill, complete, and produce oil and gas. Midstream companies invest
billions to gather, process, and send that energy to market.

Specialized applications are available to solve specific and complex problems in the energy
industry. However, good decision making in select areas of the business is insufficient to thrive
in today's energy market. Companies must execute in all areas of the business, on top-
performing projects, with impeccable timing, ready to shift resources as market conditions
change, all while continuing to operate their existing assets.

In the current market environment, companies are cutting budgets for capital projects. This
could have a material impact on future production and accumulating reserves to enable
accelerated growth when prices recover. Agile companies that are making these decisions are
not developing a single model; they are generating numerous models that need to factor in
many complex datasets, which are often interdependent.

Additionally, models are not created and then filed away; they are continually tested and
retested to validate business decisions. Predictive analytics allows companies to minimize the
impact of bad decisions through early detection and ultimately decrease these occurrences
altogether.

In the real world, an E&P must consider many factors that are all dependent on moving targets,
especially given the volatility of the markets today. First, companies must focus on what they
have control over, and then move beyond to areas where the only guarantee is change.

Examples of real-world variables include:

Land: drilling obligations and lease expirations
Capital Outlay: drilling and completion cycle times
Production and Expenses: gas pipeline capacity, field optimization, and complete water management
Risk Management: commodity pricing, net revenue, and existing and future hedges
Analysis: completion design, well spacing, initial production rates, production decline curves, and equipment failures
Financial: cash flow, reserves, and credit terms

Technology Components
Quorum refers to the ability of oil and gas companies to identify relationships, understand
context, and analyze data across the energy value chain as connected energy intelligence.
Leveraging data from all areas of the business with the Internet of Things (IoT) and the
cloud, and bringing it together in new ways with analytics, big data, and machine learning,
are the basic components of implementing connected energy intelligence.

Any big data system delivers one or more of the five Vs [iii]:

Volume: the amount of data received
Velocity: the rate at which data arrives
Variety: the different types of data
Veracity: the quality of the data
Value: the analytics performed on the data

Cloud and IoT


The cloud and IoT go hand in hand since the former enables the latter. The cloud delivers
scalability on demand, without the fixed overhead of computing, storage, and communication
capacity. The result is ubiquitous computing: a world where computing cannot be distinguished
from one thing to the next, one decision to the next. The cloud is a necessary ingredient
of connected energy intelligence because of one thing: scale.

In 2015, nearly half of the drilling rigs in North America were idled in less than 12 months.
Decisions were made during this time that shifted hundreds of billions of dollars as companies
went from drill mode to survival. When the market shifts upwards, how quickly will the industry
be able to pivot? Cloud computing and Internet-connected devices deliver the business agility
and insight required by the oil and gas industry to quickly adapt to changing conditions.

The landscape for cloud and big data storage technology continues to change rapidly with new
or improved technologies that leverage related software, many of which are open source. Open
source vendors deliver value-added software and services as commercial add-ons to open
source software. This makes the adoption of new big data technology economically feasible for
industries like oil and gas that historically are slow to innovate in areas that are not directly tied
to commercial operations. Moreover, open source solutions are typically available on a
subscription basis, which allows for predictable total cost of ownership (TCO) and makes it
possible to transition IT from a capital expense to an operating expense. However, IoT involves
more than just deploying data collectors and collecting information in the cloud; therefore,
companies must also consider how all data streams will be ingested and stored for later
consumption.

Integration and Storage

Considerations for data ingestion and integration are akin to the midstream sector of oil
and gas: large, volatile flows of hydrocarbons must be conditioned and balanced into
steady flows of product to downstream markets. The technology strategy must address
communication protocols, telecommunication limitations, stream analytics, and system
integration.
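
As a minimal illustration of this conditioning step, the sketch below (plain Python; the tag names, quality threshold, and batch size are illustrative assumptions) buffers a bursty stream of sensor readings, discards readings that fail a basic veracity check, and releases steady, fixed-size batches downstream:

    from collections import deque

    def condition_stream(readings, batch_size=100, min_quality=0.8):
        """Buffer a bursty sensor stream and release steady batches."""
        buffer = deque()
        for reading in readings:
            # Veracity check: discard readings below a quality score
            if reading.get("quality", 0.0) < min_quality:
                continue
            buffer.append(reading)
            # Release a fixed-size batch once enough readings accumulate
            if len(buffer) >= batch_size:
                yield [buffer.popleft() for _ in range(batch_size)]

    # Hypothetical usage: each reading is a dict from a field device
    stream = ({"tag": "PAD-07/PT-101", "value": 412.5, "quality": 0.99}
              for _ in range(1000))
    for batch in condition_stream(stream):
        pass  # hand each batch to the integration platform or message queue

In practice the buffering and batching would be handled by the integration platform itself; the point is that conditioning logic, like a midstream plant, sits between volatile supply and steady delivery.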

To satisfy these requirements, cloud platform offerings provide a toolbox of
components that, when assembled together, offer a robust set of integration, protocol,
and storage options. Example offerings include:

IBM Bluemix: (www.ibm.com) a cloud platform as a service (PaaS) that supports several programming languages and services
Amazon Web Services: (aws.amazon.com) a collection of cloud computing services that make up the on-demand computing platform
Microsoft Azure: (azure.microsoft.com) a cloud platform and infrastructure for building, deploying, and managing applications and services
Google Cloud Platform: (cloud.google.com) a set of modular cloud-based services with a host of development tools
Dell Boomi: (www.boomi.com) an integration platform for connecting cloud and on-premises applications and data
MuleSoft: (www.mulesoft.com) an integration platform as a service (iPaaS) for connecting applications, data sources, and APIs

Database
Integration platforms give companies the flexibility to change the format and
destination of any piece of data at any time. Many companies struggle initially with
uncertainty about where all the data will be stored, and this hesitation results in the
failure to collect enough data to make analysis meaningful. Analytics platforms address
this problem by enabling connectivity between disparate data sources, making it
unnecessary to choose one and only one database technology or to build a single
enterprise data warehouse; that approach is simply no longer required.

Additionally, non-relational (NoSQL) databases provide a mechanism for storage and
retrieval of data at scale. Non-relational databases are required for big data initiatives
because traditional relational database management systems (RDBMS) are difficult to
scale horizontally and are by definition much more rigid. It is important to note that
there is no single database system, unstructured or structured, that fits every scenario
(a representative usage sketch follows the list below).

A few of the many options available include:

MongoDB: (www.mongodb.org) a cross-platform, document-oriented NoSQL database
Apache Cassandra: (cassandra.apache.org) an open source distributed database management system designed to handle large amounts of data across many commodity servers, providing high availability with no single point of failure
Apache HBase: (hbase.apache.org) an open source, non-relational, distributed database modeled after Google's BigTable
Redis: (redis.io) an open source, networked, in-memory data structure server that stores keys with optional durability
Basho Riak: (www.basho.com) a distributed NoSQL key-value data store that offers high availability, fault tolerance, operational simplicity, and scalability
Apache CouchDB: (couchdb.apache.org) an open source, document-oriented NoSQL database implemented in the concurrency-oriented language Erlang
Amazon DynamoDB: (aws.amazon.com) a fully managed proprietary NoSQL database service
Azure DocumentDB: (azure.microsoft.com) a fully managed, multi-tenant distributed database service for managing JSON documents at Internet scale
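
As a minimal sketch of the document-oriented approach, the example below stores variably shaped well readings in MongoDB through the standard pymongo driver; the database, collection, and field names are hypothetical, and several of the stores above could play the same role:

    from datetime import datetime, timezone
    from pymongo import MongoClient  # standard MongoDB driver

    client = MongoClient("mongodb://localhost:27017")
    readings = client["energy"]["well_readings"]

    # Documents need not share a rigid schema; each device can report
    # whatever fields it actually has (hypothetical field names)
    readings.insert_one({
        "well_id": "PAD-07-W3",
        "timestamp": datetime.now(timezone.utc),
        "tubing_pressure_psi": 412.5,
        "casing_pressure_psi": 655.0,
    })

    # Query recent readings for one well, newest first
    for doc in readings.find({"well_id": "PAD-07-W3"}).sort("timestamp", -1).limit(10):
        print(doc["tubing_pressure_psi"])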

Data Models
Data acquisition, storage, and integration are links along the energy information value
chain. Data models define the relationships between many structured and/or
unstructured data elements. It is important to understand that data models can and
should change as new discoveries are made, but they do have to start with a project
and a reason to justify their creation. Adequate documentation of the model and
the metadata around each data element should be at the core of any new project.

Models should be built around each area of the business through a process of:
1. Defining the project
2. Collecting the data
3. Analyzing the data
4. Building a model
5. Deploying and evaluating the model
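
A compressed sketch of that loop, assuming a hypothetical dataset of completion parameters versus initial production and using scikit-learn, might look like the following; every column name and number is illustrative:

    # Steps 2-5 of the modeling loop in miniature (step 1, defining the
    # project, is the business decision that precedes any code)
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    # 2. Collect the data (hypothetical: lateral length, frac stages -> IP, boe/d)
    X = np.array([[4500, 18], [7200, 30], [9800, 42], [6100, 25], [8400, 35]])
    y = np.array([520.0, 810.0, 1150.0, 690.0, 960.0])

    # 3. Analyze the data: a quick sanity check on correlation
    print(np.corrcoef(X[:, 0], y)[0, 1])

    # 4. Build a model on a training split
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.4, random_state=0)
    model = LinearRegression().fit(X_train, y_train)

    # 5. Deploy and evaluate: score held-out wells, then retest as data arrives
    print(model.score(X_test, y_test))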

Machine Learning

Today, most companies would gain tremendous benefits simply from connecting data silos
together, providing visibility into the basic relationships that are already known to exist, such
as vendor and cost. It is not surprising that the technology leaders of the last decade dominate
in machine learning and artificial intelligence platforms. IoT, and the potential value created by
the petabytes of data streaming from billions of connected devices, is why these leaders are
advancing platforms and tools that eliminate the barriers to entry into this cutting-edge field. A
few of the platforms and toolkits to consider (a small usage sketch follows the list):

IBM Watson: (www.ibm.com/Outthink) a technology platform that uses natural language processing and machine learning to reveal insights from large amounts of unstructured data
Azure Machine Learning: (azure.microsoft.com) a cloud platform and toolset that includes complex algorithms and technology to apply against custom data models
Amazon Machine Learning: (aws.amazon.com) a cloud platform based on the same proven, highly scalable ML technology used by Amazon's internal data scientist community
TensorFlow: (www.tensorflow.org) an open source software library for numerical computation using data flow graphs, developed by Google
DMTK: (www.dmtk.io) a framework for training models on multiple servers, a topic modeling algorithm, and a word-embedding algorithm for natural language processing
Torch: (torch.ch) a scientific computing framework with wide support for machine learning algorithms that puts GPUs first, with major components open sourced by Facebook
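
To ground the list, here is a minimal sketch using TensorFlow's Keras API (as bundled with current releases); the equipment-failure framing and all data are hypothetical assumptions, not a vendor recipe:

    import numpy as np
    import tensorflow as tf

    # Hypothetical training data: four sensor features -> failure indicator
    X = np.random.rand(256, 4).astype("float32")
    y = (X[:, 0] + X[:, 1] > 1.0).astype("float32")

    # A small binary classifier for "likely to fail soon"
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.fit(X, y, epochs=5, batch_size=32, verbose=0)

    # Predicted probability of failure for the three most recent readings
    print(model.predict(X[:3]))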

The Connected Energy Value Chain

Land Acquisition and Management

An E&P's reserves and the land they sit under, or a transporter's capacity and rights-of-way, are
the largest assets held by energy companies. From the beginning of the land acquisition phase
in a new basin, companies must efficiently document, catalog, prioritize, and value their
leaseholds and mineral rights. As market conditions change, exploratory wells are tested, and
field development begins, data silos are created that prove difficult to overcome in many
corporate software ecosystems.

With disparate and disconnected systems, market volatility masks the problem when prices
increase and leaves executives scrambling for answers when prices rapidly decline. A number
of analytics and data services available today combine public and internal data sources to
provide visibility into land availability and drilling activity.

Once the assets have been acquired and field development begins, a real-time field
development engine can continually evaluate and refine reserves, re-value assets, and even
alter development plans as conditions are met and milestones are achieved. In the world of IoT,
data streams are already available that provide each department with the real-time data it
needs but rarely provide value to any other organizational unit.

During the land acquisition phase in a newly discovered basin, landmen descend and acquire as
much land as possible for the lowest cost. Each mineral owner and each lease can be different.
In this fog, there is an emotional component to land deals, and time is typically the only means
of placing true value on each deal.

By using artificial intelligence technologies such as cognitive computing and natural language
interpretation, emotion can be removed from each deal by applying an objective score. This
can be done by gathering inputs, scanning the lease with a mobile device, and processing the
content of each page to gather the inputs listed below (a scoring sketch follows the list). Some
of these inputs will continue to change over time as the field is developed and reserves are
proven.

Lease terms, including signing bonus and royalties
Location of each lease
Lease type and total included acreage
Proximity to the nearest producing well
Expected production rating
Relation to other leases and competitive risk
JV opportunities
Expected development cost
Production output capacity
Mineral owner demographics
Service provider performance
Resource costs
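
A hedged sketch of that objective scoring follows: weight a handful of the inputs above into a single comparable score per lease. The weights, attribute names, and normalized values are all illustrative assumptions, not a proven scoring model:

    # Positive weights raise a lease's score; negative weights lower it
    LEASE_WEIGHTS = {
        "royalty_burden": -0.30,
        "acreage": 0.20,
        "distance_to_producer": -0.25,
        "expected_development_cost": -0.15,
        "jv_opportunity": 0.10,
    }

    def score_lease(lease):
        """Combine normalized lease attributes into one objective score."""
        return sum(weight * lease.get(attr, 0.0)
                   for attr, weight in LEASE_WEIGHTS.items())

    leases = [
        {"id": "TX-0417", "royalty_burden": 0.9, "acreage": 0.8,
         "distance_to_producer": 0.1, "expected_development_cost": 0.6,
         "jv_opportunity": 1.0},
        {"id": "TX-0533", "royalty_burden": 0.5, "acreage": 0.4,
         "distance_to_producer": 0.7, "expected_development_cost": 0.3,
         "jv_opportunity": 0.0},
    ]
    # Rank leases so the lowest-scoring assets surface first for divestiture
    for lease in sorted(leases, key=score_lease):
        print(lease["id"], round(score_lease(lease), 3))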

When commodity prices drop rapidly, as they have done twice in the last decade, companies
that make better and faster decisions, especially when capital budgets get cut, will be better
positioned to extend the gap between themselves and the rest of the market. With the system
described above already in place, connecting the dots is straightforward: the assets with the
lowest objective scores within any defined vicinity should be the first sent to the data room.
Objectivity also benefits executives who face the tough decision of selling assets that were
expensive to acquire.

Connected energy intelligence answers the questions below in real time, all the time:

What assets should be divested?
What assets have the highest value?
What assets have the lowest risk?
What divestiture will have the least impact on future shareholder value?

Most of these questions could be answered by one division, but they become more difficult to
answer when considering:

Internal factors
o Field development backlog
o Expected decline curves versus actual
o Delay rentals and lease expirations
o Seismic data and existing analysis
Downstream activities
o Processing facility construction
o Gathering systems
o Pipeline capacity
External factors
o Service provider performance and cost
o Resource and equipment costs
o Hedging and risk

Companies with solid balance sheets look to distressed companies for acquisition targets that
can add long-lasting shareholder value. Similar to a land acquisition where each lease is
objectively scored, the entire data set for a potential deal must be processed, analyzed, and
scored. Data produces information that can be used to provide the acquisition and
development roadmap given current and expected market conditions.

Drilling and Completions

Finding, drilling, and completing oil and gas wells is extremely capital intensive. Therefore, as
companies strive to improve production output and extend the life of their assets, it is not
surprising that big data and analytics are already leveraged for these activities. Unfortunately,
when oil is $100 per barrel everyone is an expert, and bad decisions tend to result from gut
feelings rather than data. Improvements are made, some through trial and error and others via
brute-force analysis. Industry-leading producers have been employing analytics for years.

Data models use hundreds, sometimes thousands, of controllable inputs: drill location, depth,
frac stages, lateral length, proppant type and amount, pump pressures and rates, etc. Outside
economic factors can also play a significant role from one basin to another and even from one
pad to another. At these prices, the frac water source and the location of nearby flowback sites
for blending reused water will drive whether or not a well gets drilled at all.
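
The sketch below is not Ayata's software (discussed later in this section) but a generic illustration of the underlying idea: fit a model on historical wells, then search the controllable design space for the combination the model predicts will perform best. The inputs, ranges, and data are hypothetical:

    import itertools
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    # Historical wells: [lateral ft, frac stages, proppant lb/ft] -> IP (boe/d)
    X_hist = np.array([[4500, 18, 1200], [7200, 30, 1800], [9800, 42, 2400],
                       [6100, 25, 1500], [8400, 35, 2000], [5200, 20, 1300]])
    y_hist = np.array([510.0, 820.0, 1190.0, 700.0, 970.0, 575.0])

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_hist, y_hist)

    # Enumerate candidate designs over the controllable inputs
    laterals, stages, proppant = [5000, 7500, 10000], [20, 30, 40], [1400, 1900, 2400]
    candidates = np.array(list(itertools.product(laterals, stages, proppant)))
    predictions = model.predict(candidates)

    best = candidates[np.argmax(predictions)]
    print("Recommended design:", best, "predicted IP:", predictions.max())

As more wells are drilled and the training set grows, the same loop yields better recommendations.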

The drilling process no longer consists of a toolpusher capturing drilling depth on a paper
sheet. Data streaming is the new normal, where fiber optics are used to sample hundreds of
sensors every few seconds, gathering rates, pressures, weight, torque, the molecular
composition of gas captured, and more. It is now common for a drilling engineer in Houston to
monitor drilling activity in Pennsylvania remotely. However, these systems are still very much
point to point and lack enterprise integration.

An advanced form of artificial intelligence being used by leading unconventional E&Ps is called
Prescriptive Analytics, a term coined by Atanu Basu, the founder and CEO of big data analytics
software company Ayata (www.ayata.com) [i]. According to Basu, "It [Prescriptive Analytics]
uses any kind of data to predict, prescribe, and automatically adapt. The more it sees, the
smarter it gets."

Prescriptive Analytics compresses learning curves to help operators arrive at better answers,
faster, with far less risk and financial exposure compared to current practices. The figure below
shows Prescriptive Analytics in its simplest form.

Source: Ayata (www.ayata.com)

Companies that continue to push for improvements are building their field development plans
and designing their drill sites and completions long in advance. If the wrong decision is made
while the bit is in the ground, additional costs and delayed revenue are the best that
companies can hope for out of a well. The worst case is complete loss. And sadly, companies
typically have more than enough data to drive better decisions. Companies with a culture of
continuous improvement through analytics find that integrating 3D seismic data with their
completion design improves the long-term economics of every well drilled by increasing
production and reducing the risk associated with well design.

The figure below illustrates how Ayata Prescriptive Analytics software enables completion
designers to get the recipe for an optimized well design. As improvements are made, more data
is collected, and the system gets smarter, yielding additional improvements.

Source: Ayata (www.ayata.com)

Production Operations

For any company that expects to make it through this or any future downturn, the days when
data was not used on a day-to-day, hour-by-hour basis for oil and gas production are long gone.
Technology and information have moved in lockstep with capital expenditures for decades,
while production operations and the corresponding operating expenses were largely ignored.
However, companies seek efficiency when commodity prices decline, and efficiency is not a
luxury; it is mandatory for survival.

Collecting, processing, and analyzing production and operational data has never been easier
for companies. Oil and gas producers are able to make incremental improvements that
compound over the entire productive life of a well. The omission of real continuous
improvement strategies during production is why most leading experts and technology
innovators are particularly focused on production optimization that uses process control
automation and preventative equipment maintenance.

Production optimization is not solely about producing more oil and gas all the time; to
truly optimize production, companies must produce smarter with less overhead. Efficient E&Ps
have already introduced mobility into every aspect of operations. Accuracy is a side effect of
manual data collection at the source using smart devices such as tablets. However, capturing a
set of important data points once per day by sending a field technician is incredibly
inefficient.

Utilizing devices and sensors to capture pressures, rates, and equipment control data at high
frequencies yields far more accurate data sets, improving an engineer's ability to make
production decisions. The scarcity of engineering knowledge increases the time between
receiving event information and a decision's positive outcome or negative impact. Intelligent
systems must also deliver information back to the field technician, providing actionable
information rather than just a better means of collecting data.

A study conducted by McKinsey & Company [ii] found that automation and optimization will
yield the most substantial results for any upstream company.

Source: McKinsey & Company (www.mckinsey.com)

Process control automation of oil and gas assets has been a part of localized production
operations for decades at the level of a single well, facility, or platform, driven by safety
and by maximizing output. With the limiting factors of computing and storage infrastructure
removed, the industry can move beyond observational analysis to continuous improvement via
data models with many input variables associated with data points sourced outside of the
production system. Warm bodies in front of a monitor and keyboard do not move the needle far
enough to enable companies to thrive in extremely low commodity markets.

Artificial lift was an early form of automation for production operations and, in nearly all cases,
is limited by control automation at a very narrowly focused level. In a connected analytical
system that leverages sensor, business, and third-party data, artificial lift should be controlled
by systems that produce the most for the company, measured not in volumes but in revenue.

Consider an unconventional pad with eight natural gas wells and the effect of equipment
failures on one or two of those wells. In this situation, a control system can manage total
productive output by enabling the other wells to produce longer or with more frequent cycles
while the failed wells are unavailable. It can even give wells coming back into production
more pipeline and production availability to make up for lost production. A toy version of this
control logic appears below.
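
The sketch assumes a simple proportional policy and hypothetical well names and rates: when wells go down, the pad's takeaway capacity is redistributed to the healthy wells:

    def reallocate_pad_capacity(wells, pipeline_capacity_mcfd):
        """Distribute pad takeaway capacity across available wells in
        proportion to each well's potential rate (an assumed policy)."""
        online = {name: rate for name, (rate, up) in wells.items() if up}
        potential = sum(online.values())
        if potential <= pipeline_capacity_mcfd:
            return online  # unconstrained: every online well produces fully
        scale = pipeline_capacity_mcfd / potential
        return {name: rate * scale for name, rate in online.items()}

    # Eight-well pad, 900 Mcf/d potential each; wells 3 and 6 are down
    wells = {f"W{i}": (900.0, i not in (3, 6)) for i in range(1, 9)}
    print(reallocate_pad_capacity(wells, pipeline_capacity_mcfd=6000.0))

With all eight wells up, 7,200 Mcf/d of potential would be throttled to the 6,000 Mcf/d of capacity; with two wells down, the remaining six produce unthrottled, which is the "produce longer while others are unavailable" behavior described above.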

Companies wielding big data across the enterprise, combining back-office ERP data alongside
sensor data collected throughout the entire natural gas field, can use analytics to optimize
artificial lift of wells across multiple well pads. Connected intelligent platforms go beyond the
control panel to optimize artificial lift across the entire field, making changes to output across
multiple wells while weighing:
Downstream pipeline capacity
Expected rate of volume gain
Contractual obligations to trading partners and royalty owners
Impact to future production declines
Wear on production equipment

The biggest source of lost revenue throughout oil and gas is unplanned downtime, most of
which is preventable through proper maintenance programs. An intelligent optimization
platform goes beyond the current expectation of using human capital to schedule and deploy
human resources. The manufacturers of large and expensive production equipment are leading
the charge in this space, and rightly so. It is no longer good enough to sell a good piece of
equipment that works on day one. The expectation is added value through instruction, training,
and outright packaged solutions that leverage the sensor and control data from each
component to determine the likelihood of failure by more than just elapsed time.

Preventative maintenance is an immediate driver for the Industrial Internet of Things (IIoT),
largely because manufacturers of large capital equipment understand the value of data and
how much control they have over their products after they go into production. A natural gas
compressor can have thousands of moving or controlled parts, each with its own useful life.

Data models help determine and extend the life of each part. Some of these data
models analyze actual life based on real recorded conditions in order to arrive at more
accurate representations of useful life and of which maintenance programs extend it; this also
reduces downtime. A minimal sketch of this kind of derating model follows.
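
The sketch assumes hypothetical nameplate figures and condition factors: recorded operating conditions accelerate or slow the consumption of a component's rated life:

    def remaining_useful_life(nameplate_hours, run_hours, condition_factors):
        """Estimate remaining component life by derating nameplate life
        with recorded operating conditions (illustrative model only)."""
        # Each factor > 1.0 means the part aged faster than nameplate assumed
        effective_wear = run_hours
        for factor in condition_factors:
            effective_wear *= factor
        return max(nameplate_hours - effective_wear, 0.0)

    # Hypothetical compressor valve that ran hot and above rated load
    print(remaining_useful_life(
        nameplate_hours=20000,
        run_hours=9000,
        condition_factors=[1.10, 1.05]))  # temperature and load derating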

The gains realized by keeping large, critical equipment in service longer without unexpected
failures are obvious. When sensor and control data is pushed from individual control panels
into data models that leverage data from the rest of the enterprise, the result is operational
excellence. Excellence is never achieved; it is the target. There is always some incremental
improvement to make, another competitive advantage to gain.

Improvement can come in the form of intelligently deploying the scarce set of knowledgeable
field operators. Some would describe this as route optimization. However, route optimization is
often used to describe reactionary mechanisms that address unplanned downtime and lost
production.

Route optimization goes beyond downtime and includes:

Geology and geoscience characteristics
Financial models, including contractual obligations and the weighted impact of each decision on a company's bottom line
Location, location, location
o Of the oil and gas well
o Of the personnel in the field
o Of the personnel in the field in relation to other potential issues and impactful changes
Availability of resources, coupled with the success rate and average execution time for task completion
Preventative maintenance data models and the prescribed actions they direct
Conditional scheduling to maximize every mile of ground covered by remote personnel
Well sites that have not been visited in N days or devices that have failed communication in X hours

Connected energy intelligence allows all of the above to be leveraged to build and
distribute optimized routes throughout the day, as in the sketch below. Improving logistics for
proper resource allocation in a 24/7, commodity-based industry will lead to compounded
future benefits.
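
This simple sketch of the scheduling logic uses hypothetical sites, weights, and inputs: score each site from days since the last visit, modeled failure risk, and drive time, then send technicians to the highest-priority sites first:

    def route_priority(site, weights=(2.0, 50.0, -1.0)):
        """Score a site from days since last visit, modeled failure risk,
        and drive time from the technician (weights are illustrative)."""
        w_days, w_risk, w_drive = weights
        return (w_days * site["days_since_visit"]
                + w_risk * site["failure_risk"]
                + w_drive * site["drive_minutes"])

    sites = [
        {"name": "Pad A", "days_since_visit": 2, "failure_risk": 0.05, "drive_minutes": 15},
        {"name": "Pad B", "days_since_visit": 9, "failure_risk": 0.30, "drive_minutes": 40},
        {"name": "Pad C", "days_since_visit": 5, "failure_risk": 0.60, "drive_minutes": 25},
    ]
    # Rebuild and redistribute the ranking throughout the day as data changes
    for site in sorted(sites, key=route_priority, reverse=True):
        print(site["name"], round(route_priority(site), 1))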

Back Office

Back-office cost centers of oil and gas companies will become the controllers of innovation in
the future. Why? Because they sit in the middle of big data for the oil and gas industry and they
put a dollar value on all of it.

It is often surprising how much effort goes into keeping the data moving between data silos.
Data models exist today in other industries such as retail and manufacturing that are very
similar to what energy companies require. Energy producers need to manage inputs through
vendor performance and optimize output through hedging and risk management, the same way
an automobile manufacturer has to manage its suppliers and the demands that are influenced
by commodity markets.

Hundreds of vendors are used to locate, drill, complete, and produce oil and gas, much as
hundreds of suppliers feed a manufacturer. The energy industry is built on relationships and an
assumption of value. Commodity downturns test the business relationships between E&Ps and
service providers as every invoice is scrutinized at a granular level. Decisions made on
relationships alone are why E&Ps were able to achieve immediate efficiency gains through
service provider cost reductions. As in manufacturing, a big data solution can identify
performance and quality trends for every vendor, allowing decisions on the merits of quality
and reliability rather than cost or location alone; the sketch below illustrates the idea.
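
The sketch assumes hypothetical invoice records joined from the ERP and illustrative weights:

    from collections import defaultdict
    from statistics import mean

    # Hypothetical job records: vendor, on-time flag, rework flag, unit cost
    jobs = [
        {"vendor": "Acme Wireline", "on_time": True, "rework": False, "unit_cost": 41.0},
        {"vendor": "Acme Wireline", "on_time": True, "rework": True, "unit_cost": 43.5},
        {"vendor": "Basin Services", "on_time": False, "rework": False, "unit_cost": 36.0},
        {"vendor": "Basin Services", "on_time": True, "rework": False, "unit_cost": 35.5},
    ]

    by_vendor = defaultdict(list)
    for job in jobs:
        by_vendor[job["vendor"]].append(job)

    # Score reliability and quality alongside cost so decisions rest on
    # merit rather than relationships (weights are illustrative)
    for vendor, records in by_vendor.items():
        on_time_rate = mean(1.0 if r["on_time"] else 0.0 for r in records)
        rework_rate = mean(1.0 if r["rework"] else 0.0 for r in records)
        avg_cost = mean(r["unit_cost"] for r in records)
        score = 0.5 * on_time_rate + 0.4 * (1 - rework_rate) - 0.1 * (avg_cost / 50.0)
        print(vendor, round(score, 3))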

All commodity-based companies should inherently see themselves as big data companies that
must effectively navigate risk. It is no longer sufficient to look out over months at macro-level
trends alone. The shift that is underway, allowing companies to thrive in the most volatile of
markets, moves beyond transactional systems that can take weeks or months to identify
macro trends in the market. A predictive model connected to a company's ERP system,
real-time production data, capital projects, and downstream partners provides increased
visibility into opportunities that can be exploited. The data model and analytical results
should be open and available within an organization, and possibly even shared externally with
trading partners. The need for coordination between a commodity trading platform and
production operations is easy to see in the following example.

Production operations exceeds expectations with efficiency improvements and a reduction in
reservoir decline rates, resulting in more product to deliver to the market. However, limited
pipeline capacity in a particular area of the field results in penalties and associated fees. Models
focused solely on reservoir optimization in terms of net revenue to the company would miss
marginal availability on a particular gathering system. Predictive models incorporating pipeline
capacity and delivery costs, by contrast, would indicate a greater need for operations to focus
on legacy areas of a developed field with plenty of spare capacity; a simplified comparison
follows the list below. Even with a stellar hedging program, the spot market provides
companies plenty of opportunity to increase revenue by tapping into secondary
macroeconomic forces such as social media and weather. Ultimately, the ability to make up
pennies in tight markets is the difference between profitability and viability. Inputs to such
models include:

Macro market trends: pricing and demand
Secondary macro driving forces: weather and social media
Production operations and real-time rates
Trading partner and downstream availability and pricing
Regulatory compliance visibility
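
A deliberately simplified comparison of the two plans, with hypothetical capacities, prices, and penalties, shows why the pipeline-aware model wins:

    def net_revenue(volumes_mcfd, price, capacities, penalty_per_mcf):
        """Net daily revenue across field areas after capacity penalties
        (a simplified illustrative model)."""
        total = 0.0
        for area, volume in volumes_mcfd.items():
            overage = max(volume - capacities[area], 0.0)
            total += volume * price - overage * penalty_per_mcf
        return total

    capacities = {"new_dev": 10000.0, "legacy": 8000.0}

    # Reservoir-only optimization pushes the constrained new area too hard;
    # the pipeline-aware plan shifts volume to legacy spare capacity
    reservoir_plan = {"new_dev": 12000.0, "legacy": 4000.0}
    connected_plan = {"new_dev": 10000.0, "legacy": 6000.0}

    for name, plan in [("reservoir-only", reservoir_plan),
                       ("pipeline-aware", connected_plan)]:
        print(name, net_revenue(plan, price=2.50, capacities=capacities,
                                penalty_per_mcf=1.00))

Both plans deliver the same total volume, but the reservoir-only plan pays capacity penalties that the connected plan avoids.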

Midstream

As hydrocarbons flow from upstream production sites to their downstream destinations,
midstream processes and conditions the products for market. Oil and gas processors and
transporters suffer from many of the same operational issues, such as downtime caused by
equipment failures and field logistics inadequacies. Unconventional production growth has
added volatility to one of the more stable segments of the energy industry as producers
expand into new basins not equipped for massive increases in natural gas and liquids
production.

Consider a natural gas gathering and processing company with operations in a rapidly
expanding basin. New plants are being constructed, compressor stations are brought online,
sales meters are installed at each well pad, and everything is connected through SCADA. The
device and sensor data is also streamed to a big data analytics platform that is connected to
the back-office ERP, measurement, contract, and financial systems, with commodity prices fed
in from the exchanges. The company has also worked out terms with its trading partners on
data enrichment agreements that allow information to flow between organizations.

Since companies in this sector make the most profit by maintaining a consistent flow of
product at a target capacity, big data solutions have clear application value. This is an example
where it is simply not possible for a single human being to process, analyze, interpret, and
interpolate where and how to make adjustments to flow rates, and to send those notifications
and control alerts to upstream customers every hour of the day. By tapping into real-time data
streams from remote devices and sensors and applying machine learning, opportunities can be
identified that improve the flow of gas through the system, resulting in more favorable terms
for customers. Acting on this sort of intelligence, once identified, may simply be a matter of
distributing actionable notifications. To increase the likelihood of acceptance, real-time
analytics can apply financial terms and provide upstream customers not only with the desired
rate but also the anticipated revenue value of the change.

The interesting aspect of this scenario is that it is continuous and repeatedly testable by
incentivizing with market forces. Offering customers a five percent reduction in fees to
maintain a consistent 15 percent increase in total volume speaks for itself; the arithmetic
below shows why.
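
With hypothetical fee and volume figures, a five percent fee cut paired with a 15 percent volume increase still grows the processor's revenue by roughly 9.25 percent:

    fee = 0.35          # hypothetical gathering and processing fee, $/MMBtu
    volume = 100000.0   # hypothetical throughput, MMBtu/day

    base_revenue = fee * volume
    offer_revenue = (fee * 0.95) * (volume * 1.15)

    # 0.95 * 1.15 = 1.0925, about a 9.25% revenue gain for the midstream company
    print(base_revenue, offer_revenue, offer_revenue / base_revenue - 1)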

The same data streams can feed other neural networks, enhancing other systems in order to:
Identify imbalances in the system and feed back the opportunity costs and offers for adjustments
Push gas conditioning control parameters back to process control systems so that all processing equipment is performing optimally
Estimate line pack and expected deliverability

Conclusion
As connected systems for oil and gas expand through the cloud, with data flowing seamlessly
from drill bit to burner tip, upstream production activity can account for changes in market
conditions and energy demand without any human intervention. Consider, for a moment, the
global stock markets, powered by sophisticated computing systems. Trading systems execute
thousands of transactions per second, and a gain of fractions of a second is a competitive
advantage over other systems. Why should the energy industry neglect to capitalize on similar
efficiencies, even the most nominal?

Additional opportunities for improvement are available through the application of cloud
computing, machine learning, and predictive analytics against a continual ingestion of data
that affects the results of each model. Cloud computing encourages the efficient use of capital
while expanding computing and storage capacity. And with today's capabilities in machine
learning, companies can expand beyond the one scenario that engineers and management
conceptualize, and instead run through many potential scenarios to produce the best path
forward.

Companies will continue to have better access to data from their trading partners and service
providers: devices in the field that stream real-time production data, real-time market and
commodity pricing, and risk data feeding predictive models. Results from each of these
business areas play an important role in the recommendations that executives will use to direct
their organizations through the ups and downs.

References
i. Atanu Basu and Daniel Mohan. "Prescriptive Analytics." Ayata, March 2016. http://ayata.com/prescriptive-analytics/
ii. McKinsey & Company. "Digitizing Oil and Gas Production." August 2014. http://www.mckinsey.com/industries/oil-and-gas/our-insights/digitizing-oil-and-gas-production
iii. Bernard Marr. Big Data: Using SMART Big Data, Analytics and Metrics to Make Better Decisions and Improve Performance. 2015.

About Quorum
Quorum makes innovative software for hydrocarbon and energy business management. Our
platform of integrated solutions is designed with deep industry expertise using next-generation
technology. It delivers advanced functionality, improved efficiency, and enhanced regulatory
compliance. And it is proven to maximize profit throughout the energy value chain to drive
customer success.

For additional information about Quorum or to request a demo, please contact your Quorum
Sales consultant or visit qbsol.com.

qbsol.com | 2016 Quorum Business Solutions, Inc. All Rights Reserved. All other trademarks and copyrights are
properties of their respective owners.
