
F O C U S

Henri-Georges Doll: 1902-1991

Henri-Georges Doll, for decades Schlumberger's technical guiding light, died after a long illness near Paris, France on July 25, 1991 shortly before his 90th birthday. He is buried, as he wished, next to his mentor Conrad Schlumberger in a small cemetery at the Schlumberger family estate in Normandy, France.

nA page from Henri Doll's notebooks, featuring the development of the laterolog technique. This permitted logging formation resistivity with less borehole effect and much improved vertical resolution.

In a career spanning more than 40 years, he invented, designed, built and field tested geophysical instruments,
developed interpretation methods, created research and
engineering centers, and guided young scientists and
engineers who could be trusted to carry on. He planted
fertile seeds outside geophysics as well. During World
War II he formed a company for developing a detector for
metallic land mines. This venture later produced automat-
ic guidance and telemetry systems, industrial instrumen-
tation, photomultiplier tubes and sealed-tube neutron
generators. Even after retirement from Schlumberger, his
restless technical spirit couldn’t desist. For more than a
decade, Doll worked at his own expense developing med-
ical instruments for the in-situ measurement of blood-flow
rate. In all these efforts he was guided by faith that practi-
cal technical success rests on scientific research and
engineering of the highest quality.
Henri-Georges Doll possessed a gentlemanly bearing
that seemed to come from another age:
“The first thing that impresses one about Henri-Georges
Doll is his natural elegance... Surprise him in the field
dressed in knickers, or at his work table in a tweed jacket,
and what will strike you first is the ease with which he fits
into his surroundings. One cannot imagine that a potentiometer would fail him, or that the lead would dare to break at the tip of his pencil.... He seems to be one of those men who could walk across the Gobi Desert without getting dust on his shoes; the passage of years would only accentuate his neatness and complicity with matter."

4 Oilfield Review
January 1992 5

So wrote his former wife, Annette Gruner Schlumberger, in her 1977 book The Schlumberger Adventure. In Doll, this complicity with nature joined forces with a roving mind, long-range vision, tenacity and mathematical ability to produce an engineer and inventor par excellence.

His work, chronicled by authorship of more than 70 patents and over 30 publications, drew the highest esteem from industry and government. During his lifetime, Doll received the Lucas Gold Medal of the American Institute of Mining, Metallurgical and Petroleum Engineers, the first Gold Medal Award of the Society of Professional Well Log Analysts, a Certificate of Appreciation from the US government and elevation by the French government to "Officier de la Légion d'Honneur."

Henri-Georges Doll was born in Paris, France, August 11, 1902, to a Swiss father and French mother. During World War I, while his father was serving as an officer in the French Army, young Henri attended a lycée in Lyons, where he and his three brothers lived with their mother, who was working in a hospital. A brilliant student, he was a natural candidate for the Ecole Polytechnique in Paris, from which he later graduated. He then went on to higher studies at the Ecole des Mines, also in Paris.

While still a student there, the 22-year-old engineer married Annette Schlumberger, Conrad's daughter, and in late 1925 joined Conrad and Marcel Schlumberger's small electrical prospecting group part-time. The following year, after graduation, he participated in electrical surface surveys of the Alsace plain. By 1927 he was to design and test equipment for Conrad's newly conceived carrotage électrique, or "electric coring." Using this equipment, he and two assistants a short time later ran the now legendary, hand-plotted first electric log in Péchelbronn, France.

A creative flowering filled the following decade. He started serious mathematical analysis of electrical logging and conceived a differential magnetometer. He was first to recognize the origins of small voltages called spontaneous potentials that appeared on the measurement electrodes of electrical logging tools even when no survey current was emitted—and to note that these differentiated shales from permeable conglomerates. He brought to fruition borehole measurements of temperature, dip and inclination, the last of these employing the first sonde containing complex downhole instruments—not merely wires and electrodes. He even developed a safe and simple method for such a mundane but ubiquitous operating problem as locating cable leaks.

Shortly after the outbreak of World War II in Europe, Doll was dispatched to Houston, Texas, USA to establish Schlumberger's first research and development activity outside France. When the United States entered the war, he offered his services to the US Army through a small nonprofit company that he had founded, Electro-Mechanical Research. Here Doll led the development of the jeep-borne mine detector for which he later received a Certificate of Appreciation from the US government.

After the war, Doll spearheaded the separation of research and engineering in Schlumberger by founding in 1948 what was then the Schlumberger Well Surveying Corporation's research laboratory in Ridgefield, Connecticut,
USA. Upon his retirement in 1967, the board of directors
named it the Schlumberger-Doll Research Center in his
honor. No matter what his other responsibilities, the urge
for technical creation came to the fore.
At Ridgefield, Doll not only led electrical tool develop-
ment, but expanded the staff to include nuclear, sonic,
mathematics/computer and log interpretation groups,
among others. Although he was never as technically com-
fortable with nuclear and sonic logging as with electrical,
his support and encouragement sustained these groups
through their early, lean years. This foresight informed all
his thinking. Nearly 40 years ago, Doll predicted the eventual demise of the log analyst's slide rule and colored pencil, and their replacement by truck-borne computers doing on-line interpretation.

Roughly coincident with Doll's tenure as Schlumberger's manager of technique in the decade following World War II, a second creative flowering took place. Two papers, "The SP Log: Theoretical Analysis and Principles of Interpretation" and "The SP Log in Shaly Sands," now classics, described the current and voltage distributions of the spontaneous potential in and around the borehole. These clarified formerly puzzling aspects of the subject. Perhaps the most widely referenced of all papers on induction logging is his 1949 "Introduction to Induction Logging and Application to Logging of Wells Drilled with Oil-Base Mud." These were followed by important publications on the microlog, the laterolog and the microlaterolog.

Louis Allaud and Maurice Martin, in their book Schlumberger, The History of a Technique, give Doll full credit for the basic theory of resistivity determination in a permeable formation, which took into account all the parameters—borehole, depth of invasion and shoulders. The thin-bed problem challenged him also. From the beginning of logging, bed resolution had been a preoccupation of the Schlumberger brothers and Doll. As early as 1927, Conrad proposed a sonde employing long guard electrodes to force the survey current into a beam that would penetrate laterally into the formation, even in the presence of salty mud. Many years later he conceived a "point electrode" system aimed at the same goal. But it wasn't until Doll actually designed a workable system using the then barely adequate technology that the first laterolog was run in 1949.

Taking a cue from the success of the microlog, one of his earlier inventions that located permeable zones, he extended the laterolog concept to a pad tool. The microlaterolog was born, giving for the first time a reasonable estimate of invaded-zone resistivity (Rxo) and residual oil saturation. Then in 1958, Doll came up with the proximity log, which gave an even better determination of Rxo, one less affected by mudcake. He had already in 1946 successfully field tested a resistivity dipmeter sonde. This essentially employed three rudimentary laterologs on pads spaced around the borehole, with a common survey current electrode several feet below.

There is little doubt, however, that Doll's crowning technical achievement was inventing the induction log. The impetus for this tool was the felicitous joining of a problem, the increasing use of nonconductive oil-base mud, and a technique, electromagnetic induction, which Doll had already used successfully in mine detection. Schlumberger colleagues strongly opposed his proposal, citing problems posed by very small signal strength, huge direct mutual coupling interference and lack of adequate supporting technology. Nevertheless, Doll persisted in this complicated and difficult task, leading a small team that often developed order-of-magnitude improvements in technology as needed. History validated his vision, perseverance and faith in the eventual success of well grounded research. Induction logging became one of the most widely used logging methods in the world. It not only solved the oil-base mud problem, but also surmounted the obstacle presented to electrode methods by high-resistivity invaded zones.

Throughout his professional life, Doll was self-confident but reserved in manner, soft-spoken but precise and authoritative. Above all, he was a kind and considerate man. His legacy to geophysics and hydrocarbon exploration is a thriving technology and a clear image of the fruit that can grow from seeds of vision, knowledge, perseverance and integrity.

Doll is survived by his first wife, Annette Gruner Schlumberger, his daughters, Mrs. Frank Davidson, Mrs. Jean Lebel, and Mrs. Arnaud de Vitry, 10 grandchildren and 23 great-grandchildren.—JT

Acknowledgements and Further Reading

Schlumberger AG: The Schlumberger Adventure. New York, USA: ARCO Publishing, Inc., 1982.
Allaud L and Martin MH: Schlumberger: The History of a Technique. New York, USA: John Wiley and Sons, Inc., 1977.

Trends in Reservoir Management

As oil becomes harder to find and known reserves must be more carefully exploited, reservoir management is entering a tougher, more challenging phase than the industry has ever experienced.

Twenty-five years ago, oil companies could hope for continuing discoveries of giant fields and highly profitable exploitation once oil was found. Today, the tables have turned. More than three-quarters of current additions to the world's oil reserves come from better management of existing reservoirs. Less than one quarter comes from discovery of new oil. Profitability in today's harsher economic climate depends on increasing recovery from producing fields. If recovery is being optimized better now than in years past, at least four factors could claim credit:

First is the oil companies' clear vision that a cross-disciplinary approach must prevail. If producing oil requires the expertise of geologists, geophysicists, petrophysicists, reservoir engineers and numerous other specialists, it is best that the experts work together rather than separately. It sounds simple, but entrenched management structures can make this difficult to put into practice.1

Second is dramatic innovation in interwell measurements, notably three-dimensional (3D) seismics, and perhaps, in the future, interwell seismic tomography. Developed initially to aid exploration and improve understanding of complex structure, 3D seismics is being used increasingly to enhance reservoir description.2 Still lacking the resolution to systematically pick out fine sedimentological structure, it has nevertheless revolutionized the mapping of faults and illuminated well-to-well correlation in complex environments where logs and cores offer poor or negligible correlation. Three-dimensional seismic techniques have also aided fluid monitoring in certain enhanced oil recovery (EOR) schemes, but have yet to be proved capable of monitoring conventional water- and gasfloods.3

The third factor is increasing computer power, which has made possible the disciplines of geostatistics and reservoir simulation. We may tire of the computer industry's dramatic predictions of undreamed-of computer power, but must remember that so far the predictions have come true. Computer power has allowed geologists and reservoir engineers to create a range of probabilistic models that fill space between wells where measured data are lacking. The models are used as input to fluid-flow simulations, another activity that benefits directly from advances in computer technology.

The fourth factor is horizontal well technology, which changes all the rules about producing from both simple and complex structures. The industry has learned how to drill these wells and is mastering how to complete and stimulate them. Less clear is how horizontal wells can best benefit flooding strategies and enhanced oil recovery (EOR) schemes.4

Given the increasing complexity of developing many of today's discoveries, particularly those in deep formations and offshore, these innovations are giving reservoir management the edge it needs to make exploitation worthwhile. The basic principles of reservoir management, though, remain unchanged. Every reservoir progresses through the same phases in its producing life and similar decisions must be made at any given phase (next page).

The most critical occur during the earliest phases—discovery and appraisal—when the least data are available. With scanty geologic information, with the exploration seismic survey, and with cores, logs and tests from a handful of wells, oil companies have to rapidly assess reservoir size and producibility, decide well locations and drive mechanism, and then commission surface facilities that may cost billions.

As the field is drilled—the development phase—an abundance of data becomes available, allowing more detailed understanding of production.
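The probabilistic interwell models mentioned above can be illustrated with a toy sketch. Everything here (the well positions, porosity values, and the simple distance-weighted trend with distance-scaled noise) is an illustrative assumption, not the geostatistical machinery a real study would use:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical porosity measurements at three well locations (x in meters)
well_x = np.array([0.0, 800.0, 2000.0])
well_phi = np.array([0.21, 0.16, 0.24])

def idw(x, xw, vw, power=2.0):
    """Inverse-distance-weighted estimate at positions x from well values vw."""
    d = np.abs(x[:, None] - xw[None, :]) + 1e-9   # avoid divide-by-zero at wells
    w = 1.0 / d**power
    return (w * vw).sum(axis=1) / w.sum(axis=1)

x = np.linspace(0.0, 2000.0, 101)
trend = idw(x, well_x, well_phi)

# Crude stand-in for geostatistical uncertainty: noise grows with
# distance to the nearest control well and vanishes at the wells.
sigma = 0.02 * np.sqrt(np.abs(x[:, None] - well_x[None, :]).min(axis=1) / 1000.0)
realizations = trend + sigma * rng.standard_normal((25, x.size))

print(realizations.shape)               # 25 alternative interwell models
print(float(realizations[:, 0].std()))  # 0.0: no spread at a well location
```

Each row is one equally plausible interwell model; feeding a set of such realizations to a flow simulator is what turns geostatistics into a range of production forecasts rather than a single answer.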

Prepared with assistance from:
Peter Briggs, Manager of Reservoir Technology, BP Exploration, Uxbridge, England
Michael Fetkovich, Sr. Principal Reservoir Engineer, Phillips Petroleum Company, Bartlesville, Oklahoma, USA
Tien-when Lo, Project Scientist, Exploration and Production Technology Department, Texaco Inc., Houston, Texas, USA
Tony Corrigan, Partner, Corrigan Associates, Ditchling, England
Michel Gouilloud, Executive Vice-President, Schlumberger Ltd, Paris, France
Björn Paulsson, Senior Research Geophysicist, Chevron Oil Field Research Company, La Habra, California, USA


nPhases in a field's exploitation. [Chart: production versus time through the discovery, appraisal, development, plateau and decline phases.]

At the appraisal phase, the concerns are twofold: determining reservoir boundaries, including those of the driving mechanism, and determining large-scale heterogeneities that complicate the reservoir's internal structure. This means identifying the main depositional units and the boundaries between them, and identifying faults and fractures.

The extremities of the reservoir and its faults are exclusively provided by seismic data, although gross reservoir size and distances to certain boundaries can be estimated from well tests.5 A system of sealing or partially sealing faults can completely change the way a field should be produced.
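Distance-to-boundary estimates from well tests usually rest on radius-of-investigation reasoning: after flowing for time t, a test "sees" roughly a distance r into the formation. A minimal sketch using the common textbook oilfield-units form of the formula; the input values below are hypothetical, not from the article:

```python
import math

def radius_of_investigation_ft(k_md, t_hr, phi, mu_cp, ct_per_psi):
    """Classic radius-of-investigation estimate in oilfield units:
    r = sqrt(k t / (948 phi mu ct)), with r in ft, k in md, t in hours,
    mu in cp and ct in 1/psi."""
    return math.sqrt(k_md * t_hr / (948.0 * phi * mu_cp * ct_per_psi))

# Illustrative values: a 48-hour drawdown test in a 100-md sand
r = radius_of_investigation_ft(k_md=100.0, t_hr=48.0, phi=0.20,
                               mu_cp=1.0, ct_per_psi=1.5e-5)
print(round(r))  # roughly 1300 ft
```

Run in reverse, the same relation converts the time at which a boundary effect appears in the pressure response into an approximate distance to that boundary.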
Simulation that was previously based on a simple reservoir model because of limited data can now grow more sophisticated. As the field enters the so-called plateau phase, when production reaches planned levels, there is also an accumulation of production data that simulation can history match. The net result is more reliable prediction of future production. This helps prepare the way for the decline phase, when the oil company must decide how to extract the most from its dying asset. Issues range from whether to infill drill—and if so, where—to modifying surface facilities, and finally perhaps to planning EOR (see "A Niche for Enhanced Oil Recovery in the 1990s," page 55).

More or less paralleling this saga is a gradual narrowing of focus for the data gatherers—a continuing quest for finer detail. At every scale, their primary quest is to better understand reservoir heterogeneity. Heterogeneity governs not only connectivity—the degree to which the permeable zones are interconnected and connected to wells—but also horizontal and vertical sweep efficiency and residual oil saturation in swept zones. All are critical factors that determine recovery. The ongoing challenge facing operators is learning enough about the heterogeneity that may influence the next phase of the reservoir's life (next page).

In this article, Array-Sonic and Formation MicroScanner are marks of Schlumberger.
1. Thakur GC: "Reservoir Management: A Synergistic Approach," paper SPE 20138, presented at the 1990 Permian Basin Oil and Gas Recovery Conference, Midland, Texas, USA, March 8-9, 1990.
2. Robertson JD: "Reservoir Management Using 3D Seismic Data," Journal of Petroleum Technology 41 (July 1989): 663-667.
3. Greaves RJ and Fulp TJ: "Three-Dimensional Seismic Monitoring of an Enhanced Oil Recovery Process," Geophysics 52 (September 1987): 1175-1187.
4. Horizontal well technology is described in several articles in the July 1990 issue of Oilfield Review.
5. Hansen T, Kingston J, Kjellesvik S, Lane G, l'Anson K, Naylor R and Walker C: "3D Seismic Surveys," Oilfield Review 1, no. 3 (October 1989): 54-61.
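History matching in a full simulator is far richer than any toy, but the core idea, adjusting a model until it reproduces observed production and then trusting its forecast, can be sketched with an exponential decline fitted by least squares. The "observed" rates below are synthetic, generated for illustration only:

```python
import numpy as np

# Synthetic "observed" monthly oil rates (illustrative, not field data)
t = np.arange(36)                          # months on production
rng = np.random.default_rng(1)
q_obs = 5000.0 * np.exp(-0.04 * t) * (1 + 0.03 * rng.standard_normal(t.size))

# History match an exponential decline q = qi * exp(-D t) by linear
# least squares on log(q); the matched model then forecasts the future.
A = np.vstack([np.ones_like(t, dtype=float), -t.astype(float)]).T
ln_qi, D = np.linalg.lstsq(A, np.log(q_obs), rcond=None)[0]
qi = np.exp(ln_qi)

q_forecast_5yr = qi * np.exp(-D * 60)      # predicted rate at month 60
print(round(qi), round(D, 3), round(q_forecast_5yr))
```

A reservoir simulator does the same thing with pressures, saturations and geology as the adjustable unknowns instead of two decline parameters, which is why the quality of the underlying reservoir description matters so much.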

Nansen Saleri, Manager of Reservoir Engineering, Chevron Exploration and Production Services Co., Houston, Texas, USA
Koenraad Weber, Professor of Production Geology, Delft Technical University, The Netherlands, and Consultant in Reservoir Geology, Shell Internationale Petroleum Maatschappij BV, The Hague, The Netherlands
John Warrender, Staff Geologist, Reservoir Development, Conoco (U.K.) Limited, Aberdeen, Scotland

nReservoir heterogeneities versus scale and the data required to characterize them. At the beginning of a field's life, attention focuses on large-scale structure. As a field matures, attention shifts to finer detail. (From Weber and van Geuns, reference 22.)

[Table, partially recoverable: heterogeneity types by scale—giga (>1000 ft): sealing, semi-sealing and nonsealing faults; mega (100-1000 ft): tight and open fracturing, boundaries of genetic units; macro (in.-ft): permeability zonation and baffles within genetic units, lamination and cross-bedding; micro (microns): microscopic heterogeneity, textural types, mineralogy—cross-tabulated against recovery factors (horizontal and vertical sweep efficiency, reservoir pressure distribution, rock/fluid interaction, ROS in swept zones, horizontal and vertical reservoir continuity) and data sources (production history, tracer tests, pulse tests, well-to-well production interaction, 3D seismic, standard and special logs, ROS logs, cores, sidewall cores, outcrop/cuttings), with strong and moderate effects indicated.]

Knowing the fault system, then determining if faults are sealing may be the most important information an operator should acquire as it contemplates development. This is illustrated by yearly estimates of recoverable reserves required of operators by the UK government. These estimates show trends that correlate with the field's tectonic history and its faulting system in particular.6

In tectonically quiescent areas with little faulting, operators tend to increase reserves estimates (below). Many of these North Sea fields were deposited as submarine fans, and the sands have a more or less uniform morphology. They have limited faulting and produce with waterdrive that leads to more efficient drainage resulting in a continual upgrading of reserves. In tectonically active areas, reserves estimates tend to decline. Operators found these fields to be more faulted than expected, and in some cases saddled with high permeability contrasts.

The industry's solution is 3D seismics, lots of it. Take Shell Expro's Cormorant field that lies in the highly faulted Brent province in the northern North Sea in the UK sector. The southern part of the field was developed in the 1970s before 3D seismics became commercial, and production started in 1980. Reserves estimates dropped steadily during the buildup to production as reservoir engineers gradually perceived the faulted nature of their asset. By 1981, when Block IV in the north of the field was drilled, reservoir managers had the benefit of a reliable 3D survey (next page, top). This reaffirmed the complex faulting and enabled reservoir managers to ensure that injector-producer well pairs were at least in the same unit.7

Another 3D seismic survey was commissioned in 1984 to provide yet more detail, the previous survey having suggested that the average distance between faults was about the same as or even less than the survey line spacing. The more closely spaced survey showed the same big picture, but enough detail was different to merit

nNorth Sea operators' changing estimates of their fields' recoverable reserves—the spots indicate beginning of production. In tectonically quiet, relatively nonfaulted fields (red), estimates tend to go up because original forecasts were conservative. In highly faulted fields (greens), estimates go down as operators come to grips with reservoir complexity. These trends show how important it is for operators to understand complex structure as early as possible in the life of a field. For typical North Sea crudes, 1 tonne is equivalent to 7.53 barrels. (From Corrigan, reference 6.)

[Chart: estimated reserves, million tonnes, versus year, 1975-1989, for the Forties, Piper, Magnus, Dunlin, N W Hutton, Claymore, Thistle, Hutton, Maureen and Tartan fields.]

6. Corrigan T: "Factors Controlling Successful Reserve Prediction: A Cautionary Tale from the UK North Sea," presented at the 2nd Conference on Reservoir Management in Field Development and Production, Norwegian Petroleum Society, Stavanger, Norway, November 14-15, 1988.
7. Gaarenstroom L: "The Value of 3D Seismic in Field Development," paper SPE 13049, presented at the 59th SPE Annual Technical Conference and Exhibition, Houston, Texas, USA, September 16-19, 1984.
Ruijtenberg PA, Buchanan R and Marke P: "Three-Dimensional Data Improve Reservoir Mapping," Journal of Petroleum Technology 42 (January 1990): 22-61.
Grant I, Marshall JD, Dietvorst P and Hordijk: "Improved Reservoir Management by Integrated Study: Cormorant Field, Block 1," paper SPE 20891, presented at Europec 90, The Hague, The Netherlands, October 22-24, 1990.
nAn increasingly complex view of the fault system in Block IV of Shell Expro's North Sea Cormorant field. Two 3D seismic surveys, in 1981 and 1984 respectively, progressively clarified the picture. By 1989, data from newly drilled wells added more detail. The most recent simulation (far right) used a curvilinear grid to follow the faulting. [Maps: the Cormorant field and North Cormorant platform in the East Shetland Basin; Block IV fault patterns from 1974-1975 2D seismic, 1981 3D seismic, 1983-1984 3D seismic and 1989 well data, with oil, water, void blocks and faults marked.] (From Ruijtenberg et al, reference 7.)

nEstimates of recoverable reserves in the Cormorant field. Initially, only the southern part of the field was developed. Its complex structure caused estimates of reserves to drop. Cormorant Block IV was developed three years later and benefited from 3D seismic surveys mapping its complex structure. Reserves increased after the addition of the new field and remained steady with time because the operator better understood its structure. [Chart: estimated reserves, million tonnes, versus year, 1975-1987, for the southern part of the field and Block IV.]

8. "CAT-Scanning the Subsurface," Oilfield Review 2, no. 2 (April 1990): 4-6.
9. Paulsson BNP, Fairborn JW, Cogley AL, Howlett DL, Melton DR and Livingston N: "McKittrick Cross-Well Seismology Project: Part I. Data Acquisition and Tomographic Imaging," Expanded Abstracts, 60th Annual International Meeting and Exposition, Society of Exploration Geophysicists, San Francisco, California, USA, September 23-27, 1990: 26-29.
Lo T-w, Inderwiesen PL, Howlett DL, Melton DR, Livingston DN, Paulsson BNP and Fairborn JW: "McKittrick Cross-Well Seismology Project: Part II. Tomographic Processing and Interpretation," Expanded Abstracts, 60th Annual International Meeting and Exposition, Society of Exploration Geophysicists, San Francisco, California, USA, September 23-27, 1990: 30-33.
10. Lasseter T, Karakas M and Schweitzer J: "Interpreting an RFT*-Measured Pulse Test with a Three-Dimensional Simulator," SPE Formation Evaluation 3 (March 1988): 139-146.
11. Slentz LW: "Geochemistry of Reservoir Fluids as a Unique Approach to Optimum Reservoir Management," paper SPE 9582, presented at the SPE Middle East Oil Technical Conference, Manama, Bahrain, March 9-12, 1981.
Gibbons K: "Use of Variations in Strontium Isotope Ratios for Mapping Barriers: an Example from the Troll Field, Norwegian Continental Shelf," presented at the 6th European Symposium on Improved Oil Recovery, Stavanger, Norway, May 21-23, 1991.
12. Lachance DP and Rezk AS: "Resolution of Fault Block Communication Leading to an Optimal Plan of Depletion, Abu Gharadig Gas Field, Egypt," paper SPE 17993, presented at the SPE Middle East Oil Technical Conference and Exhibition, Manama, Bahrain, March 11-14, 1989.
13. Weber KJ, Mandl G, Pilaar WF, Lehner F and Precious RG: "The Role of Faults in Hydrocarbon Migration and Trapping in Nigerian Growth Fault Structures," paper OTC 3356, presented at the 10th Annual Offshore Technology Conference, Houston, Texas, USA, May 8-11, 1978.

fine-tuning the locations of wells to be drilled. Shell Expro's estimates of the Cormorant field recoverables have rebounded and now remain relatively constant (below). The newcomer in detecting large-scale structure is well-to-well tomography, a technique that measures the acoustic signal transmitted from a source located in one well to a receiver located in a neighboring well.8 Measurements are made for multiple combinations of receiver and source depths, creating a large data set from which computer processing recreates an acoustic velocity map of the interwell terrain.
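The processing step described above, recovering a velocity (or slowness) map from many source-to-receiver travel times, is at heart a large linear inversion. A toy sketch with straight rays and a 4x4 grid follows; real tomographic processing uses curved rays, regularization and far larger grids, so this shows only the linear core of the idea:

```python
import numpy as np

# Toy crosswell tomography: straight rays cross a 4x4 grid of cells;
# each travel time = sum over cells of (path length x slowness).
# Solving G s = t for the slowness field s is the inversion step.
rng = np.random.default_rng(2)
n = 4
s_true = np.full(n * n, 1.0 / 3000.0)       # background slowness, s/m
s_true[5] = 1.0 / 2500.0                    # one slow (low-velocity) cell

# Random straight-ray geometry: path length of each of 40 rays in each cell
G = rng.uniform(0.0, 25.0, size=(40, n * n))
t_obs = G @ s_true                          # synthetic travel times

# Kaczmarz (ART) iterations: project the estimate onto each ray equation
s = np.full(n * n, 1.0 / 3000.0)
for _ in range(200):
    for g, ti in zip(G, t_obs):
        s += g * (ti - g @ s) / (g @ g)

print(int(np.argmax(s)))   # 5: the low-velocity cell is recovered
```

With enough independent ray paths the system is overdetermined and the anomaly is pinned down; sparse or one-sided ray coverage is one reason real tomograms need careful quality control.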
The technique was conceived in the 1960s, but implementation is still hampered by technical difficulties, notably the engineering of a sufficiently powerful downhole source. In current experiments, acoustic frequency lies in the hundreds of hertz (Hz), between the low frequencies of surface and borehole seismics (10 to 100 Hz) and the high frequencies of acoustic logging (20,000 Hz). This is designed to provide both formation penetration and spatial resolution. Most experiments so far have been in shallow formations. Texaco Inc. and Chevron Oil Field Research recently performed a tomographic survey to clarify reservoir structure in the shallow McKittrick oil field in California, USA (right).9 The image created by two abutting tomograms maintains a resolution of 40 ft [12 m], reveals details of overthrust faulting and provides a substantially improved picture of the field's heterogeneity. However, much work remains to ensure the reliability of tomographic processing.

nTwo abutting tomograms revealing possible fault structure between wells in the shallow McKittrick oil field in California, USA. Tomograms are obtained by transmitting acoustic signals between the wells (see inset below)—the technique is experimental. The data were obtained jointly by Texaco and Chevron; the tomogram was processed and interpreted by Texaco Inc. (From Lo et al, reference 9, courtesy of Texaco Inc. and Chevron Oil Field Research.)

Locating faults is just half the story, though. Also crucial is determining whether there is fluid communication across a fault when sand abuts sand. Several methods help clarify the sealing question. In interwell testing, a flow disturbance is created in a well on one side of the fault and monitored using pressure gauges in a well on the other side, either at surface or downhole. Lack of reaction to the disturbance may indicate sealing.10 In tracer tests, radioactive or chemical tracers injected in one well are monitored in neighboring producers. No tracer appearing in the producer may indicate sealing. Chemical analyses of formation water and hydrocarbons on both sides of a fault may also give insight into connectivity across it, the analyses matching if the fault is nonsealing.11 Material balance calculations that indicate the volume of the connected reservoir offer another diagnostic—the reservoir will appear larger if a fault is nonsealing.12 In a reservoir that has produced for some time, a common method is to make pressure measurements using an RFT (Repeat Formation Tester) tool in wells either side of the fault. Differing pressure declines across the fault indicate sealing.

nFault sealing through clay smearing, a phenomenon in which clay smears plastically in a growing fault creating a barrier to fluid flow. The likelihood of a sealing fault due to clay smearing can be estimated from the thickness of nearby clay zones and the throw of the fault. (From Weber et al, reference 13.)
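The material-balance diagnostic works because, in a closed compartment, withdrawal and pressure drop together fix the connected volume. A sketch for the simplest case, an undersaturated oil compartment with no aquifer support; the function and all numbers are illustrative assumptions, not a field calculation:

```python
def apparent_connected_volume_bbl(np_stb, bo_rb_per_stb, ct_per_psi, dp_psi):
    """Apparent connected pore volume from a simple compressibility
    material balance: withdrawal = V * ct * delta-p (undersaturated oil,
    closed compartment, no aquifer). Result in reservoir barrels."""
    return np_stb * bo_rb_per_stb / (ct_per_psi * dp_psi)

# Hypothetical compartment: mapped pore volume 40 million reservoir bbl
mapped_pv = 40e6
v_app = apparent_connected_volume_bbl(np_stb=2.0e6, bo_rb_per_stb=1.3,
                                      ct_per_psi=1.0e-5, dp_psi=2000.0)
print(v_app / mapped_pv)  # well above 1: pressure support from beyond the block
```

An apparent volume much larger than the volume mapped between the faults is the "reservoir appears larger" signature of a nonsealing fault; an apparent volume close to the mapped compartment favors sealing.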
A nondirect method is based on the
hypothesis that faults seal because clay beds
Clay
cut by fault displacement smear plastically
into the fault, filling it and preventing communication (right). Clay smearing has been reproduced in laboratory experiments and also observed in situ down mines.13 Clay smearing is more likely to seal a fault if there is plenty of clay available—this can be determined from logs. But it is less likely the more the fault has been displaced—displacement can be estimated from seismic data and in some cases from logs in neighboring wells. Both factors can be worked into a quantitative prediction of fault communication and integrated into reservoir simulation, a feat recently performed for the Cormorant Block IV field.14

A yet more difficult task awaits the operator when evaluating a reservoir’s natural fracture system. That fractures influence gross field behavior is demonstrated in a recent report surveying 80 fields from producing areas worldwide and covering every production method from waterdrive to gasdrive to surfactant flooding.15 The survey shows that preferential flooding direction closely parallels the direction of maximum horizontal stress, the same direction a natural fracture system takes (right). The survey rules out the possibility that induced fractures may be causing flooding directionality—the correlation improves with greater spacing between wells, unlikely unless the fractures are natural.

[Figure: rose diagram plotting preferential flooding direction, general channel trends and the range of major horizontal stress against compass direction.]
nCorrelation between preferential flooding orientation and direction of maximum horizontal stress determined from wellbore breakouts, for 80 fields worldwide with a variety of flooding mechanisms. This result reinforces the importance of understanding a reservoir’s natural fracture system and its role in determining waterflood efficiency. (From Heffer and Lean, reference 15.)

Detecting a natural fracture system and estimating its directionality is therefore mandatory, and needed as early as possible in the development of a field. The question is how? Apart from regional stress studies, little can be done before wells are drilled and logs and cores become available. Conventional 3D seismics fails to see natural fractures. There is hope, however, that shear seismic surveys, both from the surface and the borehole, may help. Shear waves vibrate transversely to the wave direction and can be split into two parts by fractures. Shear surface seismic experiments made by Amoco Production Co. have demonstrated this splitting, as have shear borehole seismic surveys in the Paris basin, France, and Silo field, Wyoming, USA.16 Shear seismics is in its infancy, though, and one stumbling block may prevent the technique ever maturing: it is practically impossible to generate shear wave energy in marine sediments because the ocean separating source from formation does not support shear propagation. All experiments so far have been on land.

nConfirmation of natural fractures from cores and logs, in a Texas carbonate. Fractures are visible on the borehole televiewer and
Formation MicroScanner logs (black streaks) at 2234 and 2238 ft. They also appear as two peaks on the Stoneley fracture width log
which is derived from the Array-Sonic log (far right). (From Adams et al, reference 18.)

Concrete proof of natural fractures finally
emerges when wells are drilled and frac-
tures are recognized in cores and on logs
(previous page, bottom). Cores and logs also
herald the beginning of serious depositional
study. With the help of a standard petro-
physical interpretation of conventional logs,
use of dipmeter logs17 and more recently
Formation MicroScanner logs18 can reveal
important clues about the depositional pro-
cess (right ). Skillfully interpreted, all these
logs together with core data allow a picture
of the reservoir’s internal structure to gradu-
ally emerge. Particularly crucial are perme-
ability barriers between units and high-per-
meability streaks that allow production to
bypass large regions of the reservoir. Barriers
may be suggested by depositional analysis,
but must be corroborated with pressure
measurements and well testing.
Of increasing use in clarifying deposi-
tional environment is, once again, the 3D
seismic survey. Sophisticated processing on
the amplitude data that reveals dip and
azimuth, shaliness, and even net pay is

14. Bentley MR and Barry JJ: “Representation of Fault


Sealing in a Reservoir Simulation: Cormorant Block
IV, UK North Sea,” paper SPE 22667, presented at
the 66th SPE Annual Technical Conference and
Exhibition, Dallas, Texas, USA, October 6-9, 1991.
15. Heffer KJ and Lean JC: “Earth Stress Orientation—a
Control on, and Guide to, Flooding Directionality in
a Majority of Reservoirs,” presented at the 3rd Inter-
national Reservoir Characterization Technical Con-
ference, Tulsa, Oklahoma, USA, November 3-5,
1991.
16. For a review of shear seismics and its application to
identifying fractures:
“Formation Anisotropy: Reckoning with its Effects,”
Oilfield Review 2, no. 1 (January 1990): 16-23.
For surface shear seismics:
Lynn HB and Thomsen LA: “Reflection Shear-Wave
Data Along the Principal Axes of Azimuthal
Anisotropy,” Expanded Abstracts, 56th Annual Inter-
national Meeting and Exhibition, Society of Explo-
ration Geophysicists, Houston, Texas, USA (1986):
473-476.
For borehole shear seismics:
Crampin S, Lynn HB and Booth DC: “Shear-Wave
VSP’s: A Powerful New Tool for Fracture and Reser-
voir Description,” Journal of Petroleum Technology
41 (March 1989): 282-288.
17. Gilreath JA: “Strategies for Dipmeter Interpretation (Part 1),” The Technical Review 35, no. 3 (July 1987): 28-41.
Adams J, Bourke L and Frisinger R: “Strategies for Dipmeter Interpretation (Part 2),” The Technical Review 35, no. 4 (October 1987): 20-31.
18. Adams J, Bourke L and Buck S: “Integrating Formation MicroScanner Images and Cores,” Oilfield Review 2, no. 1 (January 1990): 52-65.
Darling H, Patten D, Young RA and Schwarze L: “Single-Well Data Integration,” Oilfield Review 3, no. 3 (July 1991): 29-35.

nDepositional interpretation from well data facilitated by the integration of petrophysical logs, Formation MicroScanner logs and computer-aided interpretations. In constructing this composite display, a geologist has reviewed the Formation MicroScanner log—left track in “Sedimentology/Structure”—on an interactive workstation and symbolically coded an interpretation that appears to the right. The “Petrophysics” and “Rock Classification” tracks are computer-interpreted from conventional logs. (From Darling et al, reference 18.)

proving that minute changes in the seismic signal may reflect real geologic events and have an important story to tell.19 A time slice from a survey in the Matagorda area of the Gulf of Mexico reveals a meandering stream channel (next page).20 Amplitude maps from the top and base of another channel sand from one of Shell’s offshore fields in Sarawak, Malaysia, obtained by automatic tracking on the 3D seismic amplitude data of the relevant horizons, display a distributary channel and crevasse splay.21 Both examples are from less than 1000 m [3200 ft], shallow enough to preserve the high seismic frequencies required to see this kind of detail. Obtaining the same resolution at greater depths that currently attenuate high frequencies will depend on technical improvements to the acquisition process.

The general problem facing the entire reservoir management team, as eyes focus on smaller-scale heterogeneity, is the blank area between wells. 3D seismic data may reliably outline the overall reservoir architecture and the fault system with an occasional clue to deposition, and well data may reveal the detailed depositional environment close to wells, but the huge space between wells is mostly unknown. True, in depositionally simple areas, features on logs may correlate from one well to the next, and the reservoir structure emerges quite readily. But in complex areas, correlation may be sparse or nonexistent. The only recourse is for the geologist to dig into a repertoire of typical sizes, shapes and juxtapositions for depositional units and fill in the blank space with as plausible a picture as possible.

For clastic reservoirs, this art has been formalized by Weber and van Geuns of Shell (below).22 Reservoir architecture is divided into three types—layer cakes, jigsaw puzzles and labyrinths. Layer cakes describe reservoirs deposited by a single depositional mechanism that show excellent correlation between wells. In jigsaw puzzle reservoirs, different sand bodies fit together without major gaps but with occasional intervening low-permeability zones. Labyrinths represent more or less random arrangements of sands, usually discontinuous.

The first step is to recognize which type of architecture is present. The next step—which becomes harder with more complex architecture—is to draw on knowledge of how particular sand bodies are shaped and distributed and then create plausible scenarios for the reservoir geometry. This has spawned two disciplines. One involves patient hours in the field studying outcrops and compiling statistics on the geometry and occurrence of depositional units and their heterogeneity, a labor often subcontracted by operators to university departments. The second takes place in front of a terminal connected to a powerful computer. Here, a geostatistician uses the outcrop statistics to help create probabilistic models of the reservoir (for detail on outcrop studies and reservoir model building, see “Reservoir Characterization Using Expert Knowledge, Data and Statistics,” page 25).23

A probabilistic reservoir model first honors known, or deterministic, data from wells and then fills the empty space with sand bodies both shaped and placed in space randomly. The result is called a realization, a term coined by statisticians to describe one outcome of a random process. Many realizations are required to judge variations in architecture caused by the random element of the model building. In simple layer cake architectures, most of the model may be deterministic with little or no probabilistic content. Labyrinth architectures, on the other hand, have little deterministic content and are mostly probabilistic.

Probabilistic model building comes in a variety of guises, and models can be constructed at all scales. Some of the finest scale clastic models are being built by Shell using their proprietary software package,

[Figure: block diagrams, roughly 600 m across by 50 m thick, arranged from deterministic (layer cake) to probabilistic (labyrinth).]
nClassification of large-scale sand geometries into layer cakes, jigsaw puzzles and labyrinths. Layer cakes are deposited in one environment, have excellent well-to-well correlations and can be modeled deterministically. Jigsaw puzzles represent sand bodies that fit together without major gaps, but with occasional low permeability barriers. Correlation may be difficult and modeling must have a probabilistic content. Labyrinths represent discontinuous sands with poor well-to-well correlation. Modeling requires a strong probabilistic content. (From Weber and van Geuns, reference 22.)
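The realization idea lends itself to a few lines of code: a toy cross-section whose well columns are fixed (deterministic) while the space between is filled with randomly placed sand bodies drawn from assumed outcrop statistics. Everything here—grid size, body dimensions, well logs—is hypothetical, not any operator's actual model.

```python
import numpy as np

def realization(rng, nz=40, nx=100, n_bodies=60):
    """One probabilistic realization of a labyrinth-type cross section.

    Sand bodies (value 1) are rectangles whose thickness and width are
    drawn from assumed outcrop statistics; everything else is shale (0).
    """
    grid = np.zeros((nz, nx), dtype=int)
    for _ in range(n_bodies):
        thick = rng.integers(1, 4)            # cells; assumed statistics
        width = rng.integers(5, 25)
        z = rng.integers(0, nz - thick)
        x = rng.integers(0, nx - width)
        grid[z:z + thick, x:x + width] = 1
    return grid

def honor_wells(grid, well_logs):
    """Overwrite the columns at well locations with deterministic log data."""
    for x, column in well_logs.items():
        grid[:, x] = column
    return grid

# Hypothetical sand/shale logs from two wells (the deterministic data).
wells = {20: (np.arange(40) % 8 < 3).astype(int),
         80: (np.arange(40) % 10 < 4).astype(int)}

# Many realizations share the well data but differ between the wells.
models = [honor_wells(realization(np.random.default_rng(s)), wells)
          for s in range(10)]
frac = [m.mean() for m in models]
print(f"net-to-gross across 10 realizations: {min(frac):.2f}-{max(frac):.2f}")
```

Comparing several such realizations gives a feel for how much of the apparent architecture is data and how much is the random element, which is exactly why many realizations are needed.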

[Figure annotations: Base sand; Top sand; Fault; Channel; Distributary Channel; Crevasse Splay; scale bar 0-2 km.]
nTwo examples of channel sands identified with 3D seismic data:
Left: amplitude maps of two horizons picked from a 3D seismic data set obtained in one of Shell’s offshore fields in Sarawak, Malaysia. The horizons show the top and bottom of a distributary channel sand. The top horizon clearly shows a crevasse splay, later confirmed by logs. (From Rijks and Jauffred, reference 21, courtesy of Shell Internationale Petroleum Maatschappij B.V. and Geophysics: The Leading Edge of Exploration.)
Right: a time slice from a survey conducted by ARCO in the Matagorda area of the US Gulf Coast reveals a meandering stream channel, probably gas filled. The meander of the stream suggests a channel width of 15 ft [4.5 m], while seismic resolution was estimated at only 40 ft [12 m]. The better-than-expected accuracy may be caused by interference or by the presence of gas. (From Riese and Winkleman, reference 20, courtesy of ARCO Oil and Gas Company and the American Association of Petroleum Geologists.)
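The attribute extraction behind such amplitude maps amounts to sampling each trace at the picked horizon time. A minimal sketch, using a toy amplitude cube with a synthetic bright channel (none of this reflects the processing used in the cited studies):

```python
import numpy as np

def horizon_amplitude(cube, horizon_t, dt):
    """Extract seismic amplitude along a picked horizon.

    cube:      amplitudes indexed [inline, crossline, time-sample]
    horizon_t: two-way time of the horizon at each trace, seconds
    dt:        sample interval, seconds
    """
    ni, nx, nt = cube.shape
    amp = np.empty((ni, nx))
    for i in range(ni):
        for x in range(nx):
            s = horizon_t[i, x] / dt            # fractional sample index
            s0 = int(np.clip(np.floor(s), 0, nt - 2))
            w = s - s0                           # linear interpolation weight
            amp[i, x] = (1 - w) * cube[i, x, s0] + w * cube[i, x, s0 + 1]
    return amp

# Toy cube: a bright (high-amplitude) channel meandering across the map.
ni, nx, nt, dt = 50, 50, 200, 0.004
cube = np.zeros((ni, nx, nt))
horizon = np.full((ni, nx), 0.4)                 # flat horizon at 0.4 s
for i in range(ni):
    ch = int(25 + 10 * np.sin(i / 8.0))          # channel crossline position
    cube[i, ch - 1:ch + 2, 100] = 1.0            # sample 100 lies at 0.4 s
amp_map = horizon_amplitude(cube, horizon, dt)
print(f"bright channel cells on amplitude map: {int((amp_map > 0.5).sum())}")
```

Automatic horizon tracking supplies `horizon_t` in practice; the map of `amp_map` is what reveals channels and splays in plan view.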

19. Sonnelund L, Barkved O and Hagenes O: “Reservoir Characterization by Seismic Classification Maps,” paper SPE 20544, presented at the 65th SPE Annual Technical Conference and Exhibition, New Orleans, Louisiana, USA, September 23-26, 1990.
20. Riese WC and Winkleman BE: “Shallow Overlooked Channels, Offshore Gulf of Mexico: Application of 3D Seismic Analysis to Stratigraphic Interpretation,” in Bally AW (ed): Atlas of Seismic Stratigraphy, vol. 3. Tulsa, Oklahoma, USA: American Association of Petroleum Geologists (1989): 59-65.
21. Rijks EJH and Jauffred JCEM: “Attribute Extraction: an Important Application in any Detailed 3D Interpretation Study,” Geophysics: The Leading Edge of Exploration (September 1991): 11-19.
22. Weber KJ: “How Heterogeneity Affects Oil Recovery,” in Lake LW and Carroll HB Jr (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 487-544.
Weber KJ and van Geuns LC: “Framework for Constructing Clastic Reservoir Simulation Models,” Journal of Petroleum Technology 42 (October 1990): 1248-1253, 1296-1297.
23. Haldorsen HH and Damsleth E: “Stochastic Modeling,” Journal of Petroleum Technology 42 (April 1990): 404-412.

[Figure panels: “Early in life of reservoir” and “Ten years later.” Legend: water, sand, all types; oil in channel-fill, sheet and coarsening-upward sands; gas, sand, all types; coal. Foreground scale 10 ft; 100 m.]
nTracking fluid fronts using MONARCH, Shell’s probabilistic reservoir modeling package. MONARCH divides space into millions of voxels, small volume elements measuring typically 50 m × 50 m × 0.6 m thick. The models honor data measured at wells (grey vertical columns), but fill the rest of space probabilistically. MONARCH models permit analysis of formation connectivity and also of fluid distributions throughout the reservoir. This example shows part of a North Sea reservoir early in the reservoir’s life and after ten years of production. The models are viewed from the south and do not show an overall 8° westerly dip. Given the differing vertical and horizontal scales, a true visualization is obtained by tilting up the right-hand edge of the figure by about 25°. There are three types of sand, identifiable by color if oil saturated. All three types are uniformly colored blue if water saturated and green if gas saturated. Shales are transparent. The rightward progress of the waterfront is clearly evident. Ten years on, several pockets of oil-saturated sand remain unswept. (From Budding et al, reference 24, courtesy of Koninklijke/Shell en Produktie Laboratorium.)

MONARCH, which divides space into voxels, the three-dimensional version of a pixel, each measuring about 50 m × 50 m × 0.6 m thick [164 ft × 164 ft × 2 ft]—the exact dimensions can vary.24 The model honors well data from logs, down to the 0.6-m vertical resolution, uses structural information provided by seismic and dipmeter data. It ensures a given sand-shale ratio and distinguishes between at least three different sand types—for example, in a fluvial environment between channels, mouth bars, and crevasse splays. Finally, it obeys the statistics on depositional unit width, thickness and length as derived from outcrop or other studies.

The result is a three-dimensional matrix of millions of voxels. Although this must be reduced in size—scaled up, in simulation parlance—by almost two orders of magnitude for simulating fluid flow, the detailed model provides insight into gross reservoir connectivity by tracking flow paths through the voxels. Applied to two reservoir units in the Cano Limon field in Colombia, one unit producing and the other overlying unit not yet completed, a MONARCH model showed that 24 existing wells tapping the producing unit would also tap 60% of the overlying unit if it was completed. This is invaluable information for reservoir managers needing to decide if extra wells are needed to tap the overlying unit.

Programmed to track different fluids in the sand bodies, MONARCH-type models can also integrate production logs and pulsed neutron logs that measure saturation behind casing. This is a major advance. Currently, the commonest data used to monitor reservoir performance are well pressures and production, usually from individual wells but sometimes from groups of wells. These data track gross well performance at best, masking the relative contributions of different producing layers. MONARCH models can track flood advances in detail and identify unswept parts of the reservoir (above).

Tracking fluids through voxels at one particular moment in time, though, is a far cry from reservoir simulation which builds a flow model of the reservoir from past production data—the history matching phase—and then predicts future production.25 In flow simulation, the reservoir is divided into a large number of interconnecting homogeneous tanks called grid blocks, and then multiphase fluid flow through them is solved as a function of time. The computation requires the largest computers, and maximum model size is currently about 50,000 grid blocks, each block being more than 200 times the volume of a voxel. Most simulations use fewer and even larger blocks, particularly during the appraisal phase when data are scarce.
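The connectivity question answered for the Cano Limon units—what fraction of one unit's sand communicates with existing wells—amounts to a flood fill through face-adjacent sand voxels. A minimal sketch on a toy voxel grid follows; the geometry and the barrier-with-chimney layout are invented for illustration and this is not the MONARCH algorithm itself.

```python
import numpy as np
from collections import deque

def connected_fraction(sand, well_cells):
    """Fraction of sand voxels reachable from the well cells through
    face-adjacent sand voxels (6-connectivity flood fill)."""
    nz, ny, nx = sand.shape
    seen = np.zeros_like(sand, dtype=bool)
    queue = deque(c for c in well_cells if sand[c])
    for c in queue:
        seen[c] = True
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            n = (z + dz, y + dy, x + dx)
            if (0 <= n[0] < nz and 0 <= n[1] < ny and 0 <= n[2] < nx
                    and sand[n] and not seen[n]):
                seen[n] = True
                queue.append(n)
    return seen.sum() / max(sand.sum(), 1)

# Toy model: two sandy units separated by shale, with one sand "chimney".
rng = np.random.default_rng(2)
sand = rng.random((12, 20, 20)) < 0.7
sand[5:7] = False                     # shale barrier between the units
sand[5:7, 10, 10] = True              # a connecting sand path
wells = [(11, 5, 5), (11, 15, 15)]    # wells completed in the lower unit
for w in wells:
    sand[w] = True
frac = connected_fraction(sand, wells)
print(f"fraction of all sand connected to the wells: {frac:.0%}")
```

On a real voxel model the same walk, weighted by voxel volume and net-to-gross, gives figures like the 60% quoted for the overlying Cano Limon unit.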

Opinions on the value of simulation vary—from being an unproductive chore, required for unitization or by government regulation, to being the key tool in reservoir management. Nevertheless, most fields undergo some form of simulation as soon as the discovery well hits pay. The pessimist’s view stems in part from unreasonable expectations about what simulation can bring, which in turn depends on when simulation is used in the life of the reservoir.26

During discovery and appraisal phases, simple simulations with few grid blocks representing little more than an extension of classical reservoir engineering techniques can assist major decisions on production facilities and how the reservoir should be produced. With little data to go on, such a simulation cannot be expected to accurately predict production. Rather, its use is to test production sensitivity to different drive mechanisms and to variations in large-scale structure, such as external boundaries, stratification and whether faults are sealing, partially sealing or open.

As the field matures and more data become available, the simulation grows more complex with an increasing number of grid blocks. Prediction becomes more reliable and helps guide all manner of reservoir management decisions—about infill drilling, perhaps the drilling of horizontal wells, evolving surface facilities to cope with changing production, modifying the driving mechanism, or justifying workover (see “Drilling for Maximum Recovery,” page 21). Yet, simulation remains essentially a qualitative tool. Like weather forecasting, it strives to predict a potentially chaotic situation from a huge number of initial conditions—pressure everywhere in space for weather forecasting and pressure and saturation in every grid block for simulation.27

Both systems are what mathematicians call underdetermined. In simulation, the number of model parameters that can be adjusted to history match past production data greatly exceeds the quantity of production data being matched. This permits many possible matches for the same data. To make the history match, reservoir engineers therefore adjust certain parameters and ignore the rest. Traditional favorites for adjustment are grid-block properties such as permeability and relative permeability. Much less popular, because they are far more trouble to adjust but maybe just as significant, are geologically dependent parameters such as the size and placement of the grid blocks themselves.

Simulations do not necessarily have to reproduce a reservoir in its full three-dimensionality. Early in the life of a field, in particular, much simpler simulations can yield valuable insight into production (below). Single-well simulators, using a radial grid system centered on the well, are invaluable for studying coning, the economics of a partial completion, or for interpreting a well test. Sectional simulators that model flow in a vertical plane are often used to study displacement behavior between an injector and a producer. Areal simulators that model flow in a horizontal plane may be an operator’s first choice for analyzing production in a single-layered homogeneous reservoir. The results and analyses of any of these simulations can always serve as input to a more sophisticated simulation.

Areal simulators were the first choice for Saudi Aramco when planning water injec-

[Figure panels: Single Well; Areal; Sectional (producer and injector marked); Full Field.]
nVarieties of reservoir simulations. Reservoir management may begin with a simple simulation model and then evolve to more complex representations as more data become available.
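The sectional displacement problem these simulators solve can be reduced, for illustration, to one dimension: incompressible two-phase flow through a single row of grid blocks, solved with an explicit upwind scheme. The relative permeability curves, viscosities and grid numbers below are illustrative only, not any field's data.

```python
import numpy as np

def fractional_flow(sw, mu_w=1.0, mu_o=4.0):
    """Water fractional flow from simple quadratic relative permeabilities."""
    krw, kro = sw ** 2, (1.0 - sw) ** 2
    return (krw / mu_w) / (krw / mu_w + kro / mu_o)

# One row of grid blocks: water injected at the left face, producer right.
nx, nsteps, dt_over_dx = 100, 120, 0.3   # CFL-stable for this flow curve
sw = np.zeros(nx)                        # initial water saturation
for _ in range(nsteps):
    fw = fractional_flow(sw)
    flux_in = np.concatenate(([1.0], fw[:-1]))  # pure water enters block 0
    sw += dt_over_dx * (flux_in - fw)           # explicit upwind update
    sw = np.clip(sw, 0.0, 1.0)

front = int(np.argmax(sw < 0.05))        # first block the flood hasn't reached
print(f"water front after {nsteps} steps is near block {front} of {nx}")
```

Even this toy produces the characteristic Buckley-Leverett shock front and the saturation ramp behind it; a commercial simulator adds pressure solution, gravity, compressibility and two more dimensions, but the grid-block bookkeeping is recognizably the same.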

24. Budding MC, Flint SS, Paardekam AHM and Dubrule ORF: “Three-Dimensional Reservoir Geological Modeling of the Cano Limon Field, Colombia,” AAPG Bulletin 74, no. 5 (May 1990): 621. Abstract only. Full text to appear in the AAPG Bulletin.
Budding MC, Paardekam AHM and van Rossem SJ: “3-D Connectivity and Architecture in Sandstone Reservoirs,” paper SPE 22342, to be presented at the SPE International Meeting on Petroleum Engineering, Beijing, China, March 23-28, 1992.
25. For brief reviews of reservoir simulation:
Mattax CC and Dalton RL: “Reservoir Simulation,” Journal of Petroleum Technology 42 (June 1990): 692-695.
Breitenbach EA: “Reservoir Simulation: State of the Art,” Journal of Petroleum Technology 43 (September 1991): 1033-1036.
26. Thomas GW: “The Role of Reservoir Simulation in Optimal Reservoir Management,” paper SPE 14129, presented at the 1986 SPE International Meeting on Petroleum Engineering, Beijing, China, March 17-20, 1986.
Wood ARO and Young MS: “The Role of Reservoir Simulation in the Development of Some Major North Sea Fields,” paper SPE 17613, presented at the 1988 SPE International Meeting on Petroleum Engineering, Tianjin, China, November 1-4, 1988.
Wittmann M, Al-Rabah AK, Bansal PP, Breitenbach EA, Hallenbeck LD, Meehan DN and Saleri NG: “Exploring the Role of Reservoir Simulation,” Oilfield Review 2, no. 2 (April 1990): 18-30.
27. Saleri NG and Toronyi RM: “Engineering Control in Reservoir Simulation: Part I,” paper SPE 18305, presented at the 63rd SPE Annual Technical Conference and Exhibition, Houston, Texas, USA, October 2-5, 1988.
Saleri NG, Toronyi RM and Snyder DE: “Data and Data Hierarchy,” paper SPE 21369, presented at the SPE Middle East Oil Show, Manama, Bahrain, November 16-19, 1991.

[Figure panels — Reservoir I: 57 × 32 areal grid (1824 grid blocks); 60 × 42 areal grid (2520 grid blocks); 60 × 42 × 13 grid (32,760 grid blocks). Reservoirs I and II: four Reservoir II layers beneath the Reservoir I grid, separated by dense limestone. Aquifer and Reservoir I: Reservoir I occupying 19 × 8 blocks of a coarse areal aquifer grid (42,000 grid blocks).]
nEvolving simulation complexity for one of Saudi Aramco’s largest carbonate reservoirs. The development began with areal simulations of Reservoir I, then a full-field simulation, next an investigation of the underlying Reservoir II, and finally of the surrounding aquifer. The number of grid blocks starts at 1824 and finishes at 42,000.

tion in their largest, carbonate field (previous page).28 This first simulation was designed in 1970, 20 years after parts of the field started producing and used a 57 × 32 grid, giving 1824 grid blocks. It was enlarged in 1978 to 60 × 42, giving 2520 grid blocks. These simulations were used to plan individual well productions to sustain an overall production rate, to forecast the length of the production plateau and to estimate final recovery.

By 1980, water encroachment had become serious enough to consider giving the areal simulation a third, vertical dimension—two horizontal dimensions rule out the modeling of coning. Geologists identified 13 layers within three major producing zones and superimposed them on the previous areal grid, giving a simulation comprising 60 × 42 × 13 = 32,760 grid blocks. Concurrently, reservoir engineers embarked on a major program of reservoir-monitor logging—flowmeters and pulsed neutron measurements mainly—to provide sufficient information to history match the field’s production. By 1985, the simulator was history matched and available for prediction studies. These have included determination of the original oil/water contact, the problem of varying depletion from different zones, the possibility of also producing the reservoir’s gas cap and evaluation of gas lift. In short, the 3D simulator has proved a key tool for most reservoir management decisions in the field.

About 500 ft [150 m] below this huge reservoir lies a smaller reservoir of inferior quality with about one-tenth the individual well productivity. By the time the first well in the smaller reservoir (Reservoir II) was put on production in 1954—eight years after the huge reservoir (Reservoir I) had started producing—Reservoir II’s pressure had dropped by about 900 psi. This indicated communication between the two reservoirs, which geologists attributed to a fracture system permeating the intervening 500 feet of dense limestone. The urgent question was how to contain the communication, since it could cause both oil and water to migrate downward from Reservoir I to Reservoir II as Reservoir II’s pressure declined. Juggling the options required simulating both reservoirs simultaneously. The first attempt divided Reservoir II into four layers, which were placed below a single areal grid simulating Reservoir I. Later, the four layers were combined with the full 13-layer model previously established for Reservoir I, giving a total of 37,800 grid blocks. This is currently being used to assess how to produce Reservoir II.

Yet more simulation has been required to study the extensive aquifer that drives Reservoir I’s production and also that of other nearby fields. The aquifer was modeled with an areal grid in the middle of which Reservoir I occupies 19 × 8 grid blocks, much coarser than the reservoir’s original 60 × 42 areal simulation. During history matching, the aquifer and Reservoir I simulators were run iteratively, with results from the finer gridded reservoir simulation being scaled up to the coarser grid blocks of the aquifer simulation. A final step in simulating this field has been the use of locally refined gridding that allows the integration of the fine 3D reservoir model directly into the coarse, areal aquifer model, obviating the iterative process. This raises the number of grid blocks to 42,000.

Saudi Aramco’s field was developed long before the age of computers, so simulation could be applied only years after production commenced. In more recent fields, particularly those in geologically complex and economically harsh areas, simulation is often used from the moment of discovery. The Brent field in the North Sea, jointly owned by Shell and Esso, provides an example. The first simulation took place during the appraisal phase using just 800 grid blocks.29 It successfully guided basic decisions on field development and facilities planning. The next simulation built during the development phase used 11 layers and an areal grid of 19 × 37, giving a total of 7733 grid blocks.30 This proved adequate to confirm the water injection program, investigate further facilities requirements and estimate the length of the plateau. As the field entered decline phase, a third simulation was conducted with an even finer grid to take advantage of the abundant acquired data and better track the water front. This time, 21 layers and an areal grid 30 × 54 produced 34,020 grid blocks. Reservoir engineers used the simulation to pick new well locations and optimize a gas injection program.31 A comparison of production predic-

Drilling for Maximum Recovery

Whatever drive mechanism is chosen, ultimate recovery depends on ensuring maximum connectivity between wells and the producing formation. This requires understanding the nature of the formation heterogeneity and then guiding the drill bit to reach as much connected oil as possible.

In reservoirs where faults subdivide the reservoir into isolated pockets, 3D seismics with its ability to map large-scale structure offers the best chance to ensure that each pocket gets tapped by a well. An example is Chevron’s giant Bay Marchand field, offshore Gulf of Mexico.1 This field, discovered in 1949, is located around a huge salt dome and produces from highly faulted sands. Oil production peaked at 75,000 barrels of oil per day (BOPD) in the late 1960s and declined to just 18,000 BOPD by the mid 1980s. It looked like the end, until Chevron commissioned a 3D seismic survey of the entire field.

With over 60 platforms and 50 multiwell surface structures scattered over the offshore site, the logistics were daunting. Once completed, however, the survey clarified the salt-sediment interface, gave improved definition to the faulting and provided a better understanding of the depositional environment. The result has been several new producers in both mature and undeveloped parts of the field, a realization that certain wells were not worth maintaining, and improved waterflood schemes. Field production now stands at 32,500 BOPD.

A different strategy is required in highly stratified reservoirs where poor connectivity is caused by discontinuous stringers, such as in Exxon’s Fullerton field in Andrews County, Texas, USA. Discovered in 1942, this field was rapidly exploited with over 550 wells. By the 1960s, the field was in decline and substantial water injection was required to maintain oil production at around 10,000 BOPD. Although the producing

1. Abriel WL, Neale PS, Tissue JS and Wright RM: “Modern Technology in an Old Area: Bay Marchand Revisited,” Geophysics: The Leading Edge of Exploration (June 1991): 21-35.

28. Al-Dawas SM and Krishnamoorthy AM: “Evolution of Reservoir Simulation for a Large Carbonate Reservoir,” paper SPE 17938, presented at the SPE Middle East Technical Conference and Exhibition, Manama, Bahrain, March 11-14, 1989.
29. Johnson HA: “North Sea Reservoir Simulation: Practical Considerations,” paper SPE 19039, presented at the SPE Middle East Technical Conference and Exhibition, Manama, Bahrain, March 11-14, 1989.
30. Tollas JM and Sayers JR: “Brent Field 3-D Reservoir Simulation,” paper SPE 12160, presented at the 58th SPE Annual Technical Conference and Exhibition, San Francisco, California, USA, October 5-8, 1983.
[Figure: well map showing 40-acre spacing, phase 1 and phase 2 infill wells; legend: producer, injector.]
nTwo-phase sequence of infill drilling in Exxon’s Fullerton, Texas reservoir. In phase 1, new producers were drilled in the center line of the previous three-line drive, and other producers were converted to injection. In phase 2, another producer was drilled in each 80-acre rectangle. The infill program tapped reserves unconnected to previous wells and prolonged the field’s life by at least ten years. (From Barber et al, reference 3.)

[Figure: continuity, % (0-100) versus distance between wells, ft (0-3000); curves labeled “From 40-acre spacing,” “From 20-acre spacing” and “From 10-acre spacing.”]
nPercentage of reservoir formation hydraulically connecting two wells versus distance between them, for Exxon’s Fullerton, Texas field. The data were obtained from wells drilled with 40-, 20-, and 10-acre spacing. As distance between wells increases, continuity worsens. And as more data become available, previous estimates on continuity appear overly optimistic. (From Barber et al, reference 3.)

[Figure: production, BOPD × 1000 (0-10) versus year (1965-1995); bands labeled “40-acre wells,” “Interference,” “Phase 1,” “Phase 2” and “Other Infills.”]
nIncrease in production from infill drilling in Exxon’s Fullerton, Texas reservoir. The interference production is that lost from the original wells to the new wells. (From Barber et al, reference 3.)
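Exxon's continuity measure lends itself to a small numerical sketch: pay is carried by discrete stringers, and a stringer counts toward continuity only if it persists to the offset well, with the chance of persistence decaying with distance. The exponential persistence model, the correlation length and all thicknesses below are assumed for illustration, not taken from the Fullerton data.

```python
import numpy as np

def continuity(pay, extends):
    """Continuity between two wells: percentage of pay thickness at one
    well that hydraulically connects to the offset well."""
    pay = np.asarray(pay, dtype=float)
    return 100.0 * (pay * np.asarray(extends)).sum() / pay.sum()

rng = np.random.default_rng(3)
n_layers = 30
pay = rng.uniform(1.0, 8.0, n_layers)        # stringer thickness, ft (assumed)

results = []
for spacing in (660, 1320, 2640):            # distance between wells, ft
    # Assumed persistence model: chance a stringer reaches the offset
    # well decays with distance (hypothetical 1500-ft correlation length).
    extends = rng.random(n_layers) < np.exp(-spacing / 1500.0)
    results.append(continuity(pay, extends))
    print(f"{spacing:5d} ft spacing: continuity about {results[-1]:.0f}%")
```

Running this reproduces the qualitative shape of the continuity plot: the closer the wells, the larger the fraction of pay that connects, and any continuity estimate made from widely spaced wells alone is biased high.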

2. Stiles LH: “Optimizing Waterflood Recovery in a Mature Waterflood, the Fullerton Clearfork Unit,” paper SPE 6198, presented at the 51st SPE Annual Technical Conference and Exhibition, New Orleans, Louisiana, USA, October 3-6, 1976.
3. Barber AH Jr, George CJ, Stiles LH and Thompson BB: “Infill Drilling to Increase Reserves—Actual Experience in Nine Fields in Texas, Oklahoma, and Illinois,” Journal of Petroleum Technology 35 (August 1983): 1530-1538.
4. McCoy TF, Fetkovich MJ, Needham RB and Reese DE: “Analysis of Kansas Hugoton Infill Drilling: Part I—Total Field Results,” paper SPE 20756, presented at the 65th SPE Annual Technical Conference and Exhibition, New Orleans, Louisiana, USA, September 23-26, 1990.
Fetkovich MJ, Needham RB and McCoy TF: “Analysis of Kansas Hugoton Infill Drilling: Part II—12-Year Performance History of Five Replacement Wells,” paper SPE 20779, presented at the 65th SPE Annual Technical Conference and Exhibition, New Orleans, Louisiana, USA, September 23-26, 1990.
Fetkovich MJ, Ebbs DJ Jr and Voelker JJ: “Development of a Multiwell, Multilayer Model to Evaluate Infill Drilling Potential in the Oklahoma Hugoton Field,” paper SPE 20778, presented at the 65th SPE Annual Technical Conference and Exhibition, New Orleans, Louisiana, USA, September 23-26, 1990.
5. Freyss HP and Burgess K: “Overcoming Lateral Reservoir Heterogeneities Via Horizontal Wells,” presented at the 6th European Improved Oil Recovery Symposium, Stavanger, Norway, May 21-23, 1991.

zone is several hundred feet thick, the pay is in carbonate stringers that correlate very poorly between wells.2 Exxon found during infill drilling that the correlation seemed to get worse the closer you looked. A continuity measure was defined as the percentage of producing carbonate that hydraulically connects between two given wells—that is, the percentage that would theoretically get produced if one well was an injector and the other a producer. Exxon then investigated how this continuity depended on well spacing (above, far left). Initially, the information came from the original wells, which were on a 40-acre spacing.

It came as no surprise that continuity worsened as well spacing increased. But as the field was

22 Oilfield Review
drilled with denser spacings, data showed that earlier estimates of continuity were grossly optimistic.3 The only means of increasing recovery was to drill progressively denser well spacings (previous page, top). This was carried out in two phases. The first took well spacing to 20 acres, the second to 10. When the second phase was completed, the new wells accounted for 71% of the field's production, and each well added an estimated 97,000 barrels to recoverable reserves (previous page, middle).

However tempting it may look, though, infill drilling is not a panacea. This is the view of Fetkovich and colleagues at Phillips Petroleum Co. with respect to the Hugoton gas field, the largest in the lower 48 US states, spanning parts of Kansas, Oklahoma and Texas. The Hugoton comprises interspersed carbonate and layered shaly units—continuity is excellent, the log signature being consistent from one well to the next. When the state authorities authorized infill drilling in the late 1980s, the Phillips team proved through studying production histories and conducting simulations that the reservoir was essentially layered with no vertical crossflow. They concluded that while infill drilling might increase productivity, the new wells would add nothing to the field's reserves.4 Thanks to the formation continuity, all the gas was already connected to the existing wells.

Production from natural fractures, such as in the Austin chalk formations in south Texas, poses yet another type of heterogeneity that until recently sorely tested operators' attempts to secure economic recovery. The Pearsall field exploited by Oryx is typical. The oil is contained in successive clusters of roughly parallel subvertical fractures. Each cluster acts as a separate oil accumulation and lacks communication with its neighbors. Produced through vertical wells, these formations provide unsubstantial economic return. But connect several together with a horizontal well, and the economics improve immediately.5 Overcoming reservoir heterogeneity was a key impetus behind the horizontal well revolution.

nProduction forecasts from second and third simulations for Shell and Esso's North Sea Brent field. The second simulation, made in 1982 during field development and intended to investigate plateau-phase production, gave a reasonable forecast for five years. The third simulation, made in 1988 during the decline phase, gives a more accurate picture of the field's later life and allows planning of remedial programs. (The first simulation, not shown, was made at very coarse scale during the appraisal phase of the field.) (From Johnson, reference 29.)

31. Tollas JM and McKinney A: "Brent Field 3-D Reservoir Simulation," paper SPE 18306, presented at the 63rd SPE Annual Technical Conference and Exhibition, Houston, Texas, USA, October 2-5, 1988.
32. Keijzer JH and Kortekaas TFM: "Comparison of Deterministic and Probabilistic Simulation Models of Channel Sands in the Statfjord Reservoir, Brent Field," paper SPE 20947, presented at Europec 90, The Hague, The Netherlands, October 22-24, 1990.

tions from the second and third simulations shows how the finer simulation provides a more optimistic, and presumed truer, scenario for the field's decline phase (above).

What is the trend in simulation? Advances in computer power will undoubtedly spur simulations with more grid blocks, representing finer reservoir detail. It is conceivable, for example, that simulating grid blocks the size of the MONARCH system's voxels may eventually be possible. If this happens, reservoir management will become reliant on probabilistic modeling, since no field technique, even the most experimental, appears capable of measuring to that degree of detail between wells—all the simulations described in this article and almost all described in the literature are deterministic. The future may lie in constructing a geologically probabilistic model from the beginning and then firming it up deterministically as data become available. The latest simulation work in the Brent field represents one of the first applications of probabilistic modeling.32

The Brent field actually comprises two major reservoirs, the middle Jurassic Brent reservoir and the lower Jurassic Statfjord reservoir, that are separated by 250 m [820 ft] of shale. Some of the most complex reservoir sediments, and therefore most troublesome during history matching, are channel sands in the middle part of the Statfjord unit. It was therefore decided to construct a detailed probabilistic model of these sands for a 3- by 2.5-kilometer [1.9- by 1.6-mile] sector in the southern part of the reservoir (next page, top). The sector was chosen because it had abundant well data—six oil producers, three water injectors and one gas injector. The probabilistic model was then sandwiched between deterministic models of the upper and lower parts of the Statfjord reservoir, and finally the whole thing was run through a simulator. The results were compared with a simulation of the Statfjord reservoir constructed entirely deterministically.

Using the MONARCH package, three realizations of the channel sands were constructed using 172,000 voxels, each measuring 75 m × 25 m × 1.2 m thick [250 ft × 80 ft × 4 ft]. Each realization had to be reduced to about 10,000 grid blocks before sandwiching and simulation. The comparisons showed that simulations using the probabilistic realizations predicted water breakthrough far more accurately than an entirely deterministic simulation. Credited with the improvement was MONARCH's ability to create a more realistic picture of shale barriers within the channel sands. These appear to influence not only water breakthrough in the channels, but also breakthrough in the overlying and underlying units that were modeled deterministically.

Simulation requires intense collaboration between geologists and reservoir engineers, a cross-disciplinary challenge that operators face in virtually every area of reservoir management.
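The reduction of each 172,000-voxel realization to roughly 10,000 grid blocks is an upscaling step. The article does not say how the coarsening was done for the Brent study; the sketch below shows one simple, commonly taught possibility, block-averaging permeability with a geometric mean. The function name, grid sizes and statistics are illustrative assumptions, not details from the study.

```python
import numpy as np

def upscale_geometric(fine, factor):
    """Coarsen a 3D permeability grid by block-averaging.

    Combines each (fx, fy, fz) block of fine cells into one coarse
    cell using the geometric mean, a simplistic but common averaging
    rule for permeability. Grid dimensions must be exact multiples
    of the coarsening factors.
    """
    fx, fy, fz = factor
    nx, ny, nz = fine.shape
    blocks = fine.reshape(nx // fx, fx, ny // fy, fy, nz // fz, fz)
    # Geometric mean = exp(mean(log k)) over each block
    return np.exp(np.log(blocks).mean(axis=(1, 3, 5)))

# A synthetic 40 x 40 x 10 voxel realization (16,000 cells) with
# log-normally distributed permeability in millidarcies
rng = np.random.default_rng(0)
fine = np.exp(rng.normal(np.log(100.0), 1.0, size=(40, 40, 10)))

coarse = upscale_geometric(fine, (4, 4, 2))  # 10 x 10 x 5 = 500 cells
```

In practice, flow-based upscaling (solving a local flow problem on each block) preserves connectivity such as shale barriers far better than any simple average; the geometric mean is only a first approximation.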
Other options and decisions in reservoir development involve expertise from a plethora of disciplines—drilling engineering, environmental engineering, facilities engineering to name a few. Integrating them can be hard work.

Conoco (U.K.) Limited has identified several factors that help.33 On a team of different experts, there may have to be a boss, but no single expertise should be allowed to dominate. As far as management is willing to devolve responsibility, the team should set its own goals and audit its achievements. The team should be as independent as possible from the parent structure, and thus have the freedom to react and make decisions rapidly—in other words, act like entrepreneurs (below, left). Last, there must be persistent cross-disciplinary communication and education, not only between experts, but also a reaching out to financial and business colleagues in the parent group.

Conoco put this theory into practice while planning a highly deviated well in their geologically complex North Sea Murchison field. The well was intended to tap multiple targets in the main field and south flank. Geoscientists made a detailed interpretation of the south flank so the planned well hit the targets; drilling engineers decided if the more than 80-degree well was drillable and designed a casing program that could handle the varying depletion pressures in the different units; and reservoir engineers investigated whether the casing program could be simplified by shutting in existing producers or injectors to adjust these pressure differentials. The design process was iterative and had to involve partners and outside specialists. Yet, it was done speedily because all experts worked together and were in constant communication.

In tomorrow's world, every aspect of reservoir management will be developed and managed through synergy—by integrating data, by coupling compatible methodologies and by building teams. Add better measurements for seeing between wells and almost unimaginable increases in computer power, and we may see hydrocarbon exploitation becoming as much a science as an art. —HE

nIncreasing complexity in modeling the producing layers in the Brent and Statfjord reservoirs of Shell and Esso's North Sea Brent field. In the most recent simulation, several layers in the Statfjord reservoir were modeled probabilistically and sandwiched between deterministic layers. (The first simulation, not shown, was made at very coarse scale during the appraisal phase of the field.)

nOrganizational changes that Conoco (U.K.) Ltd implemented to ensure a cross-disciplinary approach when planning a high-angle well in their North Sea Murchison field. After restructuring, the key disciplines (boxed)—drilling engineers, geologists, reservoir engineers and geophysicists—all worked in one team in the same office. (From King and Warrender, reference 33.)

33. King W and Warrender J: "The Murchison Field: A Multidisciplinary Approach to Reservoir Management," presented at Advances in Reservoir Technology, Edinburgh, Scotland, February 21-22, 1991.
Reservoir Characterization
Using Expert Knowledge, Data and Statistics

Prepared with assistance from:

Peter Corvi, Head of Effective Properties
Kes Heffer, Head of Reservoir Description
Peter King, Senior Reservoir Engineer, Effective Properties
Stephen Tyson, Computing Consultant
Georges Verly, Senior Geostatistician, Effective Properties
BP Research, Sunbury-on-Thames, England

Christine Ehlig-Economides, Reservoir Dynamics Program Leader
Isabelle Le Nir, Geologist
Shuki Ronen, Geophysics Integration Program Leader
Phil Schultz, Department Head
Etudes et Productions Schlumberger, Reservoir Characterization Department, Clamart, France

Patrick Corbett, Research Associate
Jonathan Lewis, Conoco Lecturer in Geology
Gillian Pickup, Research Associate
Phillip Ringrose, Senior Research Associate
Heriot-Watt University, Department of Petroleum Engineering, Edinburgh, Scotland

Dominique Guérillot, Sr. Research Engineer, Reservoir Engineering Division
Lucien Montadert, Director of Exploration and Production
Christian Ravenne, Principal Research Associate, Geology and Chemistry Division
Institut Français du Pétrole, Rueil-Malmaison, France

Helge Haldorsen, Assistant Director, International Field Developments
Norsk Hydro A.S., Oslo, Norway

Thomas Hewett, Professor of Petroleum Engineering
Stanford University, California, USA

Accurately simulating field performance requires knowing properties like porosity and permeability throughout the reservoir. Yet wells providing the majority of data may occupy only a billionth of the total reservoir volume. Transforming this paucity of data into a geologic model for simulating fluid flow remains an industry priority.

The simplest representation of a reservoir is a homogeneous tank with uniform properties throughout. Characterizing such a simplified model requires merely enough data to define its volume and basic petrophysical properties. This information can then be used to help forecast reservoir behavior and compare possible production scenarios.

But efficient exploitation of reserves requires a more sophisticated approach. Reservoirs have been created by complex sedimentary and diagenetic processes, and modified by a history of tectonic change. Rather than resembling simple tanks, reservoirs are heterogeneous structures at every scale. The need for accurate characterization is vital for oil company economic planning (next page, top).

In the past, geologists have used expert knowledge to interpolate between wells and produce basic reservoir models. However, during the last ten years, this effort has been transformed by statistical modeling offering insight into the effects of heterogeneity.

This article details how reservoir descriptions are developed and is divided into three stages (page 27):
•Defining the reservoir's large-scale structure using deterministic data
•Defining the small-scale structure using statistical techniques—geostatistics
•Rescaling the detailed geologic model to be suitable input for a fluid-flow simulator.

Defining the Large-Scale Structure

The first stage aims to construct as detailed a geologic description of the reservoir as measurements will allow. The available information includes a structural interpretation from seismic data, static data from cores and logs and dynamic production data from testing. To these, the geologist adds expert knowledge on geologic structures, obtained from field experience and outcrop studies (see "Gathering Outcrop Data," page 28).

The first step is to recognize depositional units and then, if possible, correlate them between wells. It helps to know the general shape and order of deposition of the units, and ideally the relationships between width, thickness and length for each unit.1 Well-to-well correlation is made using traditional hard-copy maps and logs or, today, interactive computer workstations. Workstations allowing simultaneous visualization of seismic, log and other data enable easier recognition of subtle correlations.

In most cases, this large-scale characterization results in a conventional layer-cake model—a stack of subhomogeneous layers. There is no doubt that many reservoirs should be characterized by more complex arrangements of depositional units—such as the "jigsaw" and "labyrinth" models (next page, bottom).2 The ability to use these more complex models routinely remains an industry goal (see "Trends in Reservoir Management," page 8).

Cray is a mark of Cray Research Inc. MicroVAX is a mark of Digital Equipment Corp.
1. Weber KJ: "How Heterogeneity Affects Oil Recovery," in Lake LW and Carroll HB Jr (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 487-544.
2. Weber KJ and van Geuns LC: "Framework For Constructing Clastic Reservoir Simulation Models," Journal of Petroleum Technology 42 (October 1990): 1248-1253, 1296-1297.
nHow heterogeneity affects productivity. In this example developed at BP Research, Sunbury-on-Thames, England, two different models have been interpolated between a pair of hypothetical wells. The lithologies are identical: only their spatial correlation has been altered. The graph shows how the pore volume recovered versus pore volume injected varies for the two models. After one pore volume has been injected, the difference in recovery between the two models is about 0.1 of a pore volume.

nDefining reservoir type. Current large-scale characterization tends to result in a simple layer-cake model. However, efforts are being made to use more complex models like the jigsaw and labyrinth models shown.

Defining the Small-Scale Structure Using Geostatistical Modeling

Once the shape of the reservoir and its large-scale structure have been described, the next stage focuses on defining heterogeneity within each depositional unit. This requires more detailed interwell measurements than are currently available. Borehole seismic data typically cannot span wells. Although surface and cross-well seismic data do reach everywhere, there is insufficient knowledge relating seismic data to physical rock properties, and exploration seismic data have insufficient resolution. In addition, well testing data often lack directional information. Therefore nondeterministic, or geostatistical, methods are required.3

Large-scale reservoir models typically comprise a few thousand grid blocks, each measuring about 100 m square and more than 1 m thick [about 1000 ft2 by 3 ft]. But models that take into account small-scale heterogeneity use smaller grid blocks—1 million or more are often needed. The sheer quantity of input data required to fill so many blocks also favors geostatistics.

Geostatistical techniques designed to provide missing data can be classified in two ways: by the allowable variation of the property at a given point and by the way data are organized. There are two classes of property variation, continuous and discrete. Continuous models are suited to properties like permeability, porosity, residual saturation and seismic velocity that can take any value. Discrete models are suited to geologic properties, like lithology, that can be represented by one of a few possibilities.4

Data organization has developed along two paths depending on how the models are built up. One is grid based. All the properties are represented as numbers on a grid, which is then used for fluid-flow simulation. The second is object based. Reservoir features such as shales or sands are generated in space and a grid then superimposed on them.

Grid-Based Modeling

One of the key geostatistical tools used in grid-based modeling is kriging—named after a pioneer of rock-property estimation, D. G. Krige, who made a series of empirical studies in the South African goldfields.5 Kriging is designed to honor measured data and produce spatially correlated heterogeneity. The heart of the technique is a two-point statistical function called a variogram that describes the increasing difference (or decreasing correlation) between sample values as separation between them increases.
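A variogram of this kind can be estimated directly from regularly spaced samples. The sketch below uses the classical estimator, half the mean squared difference between all pairs at each lag; the function name and the synthetic porosity profile are illustrative, not from the article.

```python
import numpy as np

def experimental_variogram(z, spacing, max_lag):
    """Experimental semivariogram of regularly spaced samples.

    For each lag h, gamma(h) is half the mean squared difference
    between all sample pairs separated by h.
    """
    lags, gammas = [], []
    for k in range(1, max_lag + 1):
        diffs = z[k:] - z[:-k]          # all pairs k samples apart
        lags.append(k * spacing)
        gammas.append(0.5 * np.mean(diffs ** 2))
    return np.array(lags), np.array(gammas)

# Synthetic porosity samples every 0.5 m along a well: a smooth
# variation plus noise, so nearby samples correlate and distant
# samples do not
rng = np.random.default_rng(1)
depth = np.arange(0, 100, 0.5)
porosity = 0.20 + 0.03 * np.sin(depth / 8.0) + rng.normal(0, 0.005, depth.size)

lags, gammas = experimental_variogram(porosity, 0.5, 40)
# gamma typically rises with lag toward a plateau (the sill); the lag
# at which it flattens (the range) marks where correlation dies out
```

A parametric model (spherical, exponential, Gaussian) is then fitted to these experimental points before kriging.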
Kriging minimizes the error between the interpolated value and the actual (but unknown) value of the property.6

For a given property, a variogram is constructed from pairs of data generally measured in wells. In the oilfield, criticism of kriging centers on the low density of well data when compared to mining. To overcome this problem, variograms can alternatively be constructed from data measured on outcrops or in mature fields where better sampling is available.

Although variograms are commonly used to infer the spatial continuity of a single variable, the same approach can be used to study the cross-continuity of several different variables—for example, the porosity at one location can be compared to seismic transit time. Once constructed, this cross-correlation can be used in a multivariate regression known as cokriging. In this way, a fieldwide map of porosity can be computed using not only porosity data, but also the more abundant seismic data.7

Kriging and cokriging deal with quantitative, or hard, data. A third method called soft kriging combines expert information, also called soft data, with the quantitative data. The expert data are encoded in the form of inequalities or probability distributions. For example, in mapping a gas/oil contact (GOC), if the contact is not reached in a certain well, soft kriging uses the inequality: GOC is greater than well total depth. At any given point where there is no well, an expert may define the probability of finding the GOC within a certain depth interval. Soft kriging will use this probability.8

(continued on page 30)
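The error minimization above reduces to solving a small linear system whose coefficients are variogram values. Below is a one-dimensional sketch of ordinary kriging; the spherical variogram model is a common textbook choice, and the well positions, porosities and parameter values are invented for illustration.

```python
import numpy as np

def spherical(h, sill=1.0, vrange=10.0):
    """Spherical variogram model: rises from 0 to the sill at the range."""
    h = np.minimum(h, vrange)
    return sill * (1.5 * h / vrange - 0.5 * (h / vrange) ** 3)

def ordinary_krige(xs, zs, x0, vmodel):
    """Ordinary kriging estimate at x0 from samples (xs, zs).

    Builds the kriging system from variogram values, with an extra
    row enforcing unbiasedness (weights sum to 1) via a Lagrange
    multiplier. Returns the estimate and the kriging variance.
    """
    n = len(xs)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = vmodel(np.abs(xs[:, None] - xs[None, :]))
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = vmodel(np.abs(xs - x0))
    w = np.linalg.solve(A, b)
    estimate = w[:n] @ zs     # weighted combination of the data
    variance = w @ b          # kriging variance at x0
    return estimate, variance

# Porosity measured in three wells along a line (positions in km)
xs = np.array([0.0, 1.0, 3.0])
zs = np.array([0.18, 0.22, 0.15])
est, var = ordinary_krige(xs, zs, 1.5, spherical)
```

Note two defining properties: at a sampled location the estimate reproduces the measurement exactly (with zero kriging variance), and the variance grows with distance from the data, quantifying the interpolation's uncertainty.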

nBuilding a reservoir model in three stages with measured data, expert knowledge and statistics. (The figure's inputs include core plugs, whole core, well logs, well testing, borehole geophysics, outcrop studies, surface seismics and the geologist's expert knowledge; its panels are labeled Stage 1: defining large-scale structure, Stage 2: defining small-scale structure, and Stage 3: scaling up for fluid-flow simulation.)

3. Haldorsen HH and Damsleth E: "Stochastic Modeling," Journal of Petroleum Technology 42 (April 1990): 404-412.
Haldorsen HH and Damsleth E: "Challenges in Reservoir Characterization Research," presented at Advances in Reservoir Technology, Characterization, Modelling & Management, organized by the Petroleum Science and Technology Institute, Edinburgh, Scotland, February 21-22, 1991.
4. Journel AG and Alabert GF: "New Method For Reservoir Mapping," Journal of Petroleum Technology 42 (February 1990): 212-218.
5. Krige DG: "A Statistical Analysis of Some of the Borehole Values of the Orange Free State Goldfield," Journal of the Chemical, Metallurgical and Mining Society of South Africa 53 (1952): 47-70.
6. Journel AG: "Geostatistics for Reservoir Characterization," paper SPE 20750, presented at the 65th SPE Annual Technical Conference and Exhibition, New Orleans, Louisiana, USA, September 23-26, 1990.
7. Doyen PM: "Porosity From Seismic Data: A Geostatistical Approach," Geophysics 53 (October 1988): 1263-1275.
8. Kostov C and Journel AG: "Coding and Extrapolating Expert Information for Reservoir Description," in Lake LW and Carroll HB (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 249-264.
Gathering Outcrop Data

Every reservoir is unique. Yet recurring depositional conditions create families of reservoirs that are broadly similar. If reservoir rock outcrops at the surface, it presents a golden opportunity to gather data which can be used to help characterize related subsurface formations.1

Outcrop studies allow sampling at scales that match interwell spacing. Small-scale permeability can also be measured and compared with depositional characteristics. Then the statistical and depositional information measured at the outcrop can be incorporated into the stochastic modeling of the subsurface analog.2

Generally, there are two kinds of outcrop study.3 Bed or bed-set scale genetic models (GEMs) concentrate on reservoir heterogeneity as a function of the sedimentary process. To date, most outcrop studies have been of this type. The second type, a field analog model (FAM), looks at heterogeneity caused not only by depositional processes but also by diagenesis. FAMs present two difficulties. First, it is hard, given the paucity of reservoir data, to decide which diagenetic processes are relevant. Second, finding outcrops that have undergone specific diagenetic processes can be difficult.

Although the study of outcrops is a traditional skill learned by most geologists, in the past they have concentrated on such goals as describing the sedimentological environment and the geometries of the major depositional units. For reservoir characterization, a much more detailed and quantitative picture of the heterogeneity within depositional sequences is required. Therefore, in addition to the geologist's traditional mapping skills, new analytical techniques have been devised. Analyzing photographs of the outcrop helps clarify the lithofacies distribution.4 Clearly, more than one photograph is needed, and this technique can succeed only if all the photographs are at the same scale and taken at the same distance from the rock. A helicopter-mounted camera with a laser range finder is one way of achieving this. Successive photographs are taken with a 30 to 40% overlap and then combined to construct a photomosaic of the whole outcrop (above). The mosaic can be digitized to quantify lithofacies patterns. It can also be used to plan and execute small-scale petrophysical measurements on the outcrop.

Gamma ray logging of outcrops is used to help describe the composition of the lithofacies. A standard truck-mounted sonde can be lowered down the face of an outcrop and gamma ray measurements continuously recorded. Although washouts are known to affect borehole gamma ray logging, tests show that the sonde can be as much as 0.6 m [2 ft] away from the rock before the reading is compromised.5

An alternative technique employs a lightweight, portable gamma ray spectrometer to measure either total radiation or individual radioelement concentrations. Total radiation measurements are typically made at 0.65-m [2.1-ft] intervals. Selected intervals may be logged in greater detail if required. Because the tool measures an area of the rock face with a diameter of about 0.3 m [1 ft], readings taken for thin beds will be slightly influenced by adjacent strata.

One of the principal pitfalls in reservoir characterization stems from the use of unrepresentative permeability data measured from cores and core plugs taken from only a few wells. Densely sampled outcrop permeability data are of key importance. These can be gathered by performing laboratory-based flow tests on samples gathered in the field. However, a comparatively recent development has seen the introduction of portable electronic minipermeameters—also known as probe permeameters (next page, below).6 These evaluate permeability—from 0.5 millidarcies (md) to 15 darcies—by injecting nitrogen into a prepared location on the outcrop. Thousands of nondestructive measurements can be made in situ and the results compared to traditional laboratory tests for quality control. This permits examination of permeability changes on a very small scale—on the order of millimeters. Through such measurements, permeability has been shown to vary dramatically within a depositional unit.7

A typical study, carried out by the University of Texas, Bureau of Economic Geology, Austin, Texas, USA, centers on a Ferron Sandstone outcrop in central Utah, USA. Ferron is a wave-modified, deltaic sandstone with permeability distribution strongly related to lithofacies type and grain size. It is believed to be an analog for Gulf Coast deltaic reservoirs, which account for 64% of Texas Gulf Coast gas production.8

First, the depositional framework of the outcrop was established using color photomosaics to map the distribution and interrelations of the sandstone components. This involves determining lithofacies architecture, delineating permeable zones and their continuity, identifying flow barriers and baffles and establishing permeability trends.

On the basis of this framework, more than 4000 minipermeameter readings were made, all on vertical exposures of rock to minimize the effects of weathering. Two sampling strategies were employed. First, more than 100 vertical strips were mapped every 30 to 60 m [100 to 200 ft] on three channel complexes. Permeability measurements were taken at 0.3-m spacing on each strip. Second, two sampling grids were constructed to examine the small-scale spatial variability of permeability within lithofacies. One was 12-m [40-ft] square with 0.6-m subdivisions. The second, inside the first, measured 1.8-m [6-ft] square with 0.1-m [0.3-ft] subdivisions.

The study successfully identified patterns of heterogeneity. Delta-front sandstones were found to be vertically heterogeneous but laterally continuous over a typical well spacing. However, the sand belts of the distribution system—composed of amalgamated, lateral accretion point-bar sandstones—were found to have both lateral and vertical heterogeneity.

Variogram analysis confirmed that the heterogeneity of the permeability was structured in a way that depended on the rock type sampled and on the sampling scale. For instance, at sampling intervals of 0.6 m, the range over which permeability shows correlation was 5.5 m [18 ft] for trough-cross-bedded sandstone and 1.2 m [4 ft] for contorted beds. Planar cross-bedded strata exhibited a range of 3.6 m [12 ft] for a 0.6-m sampling interval and 0.6 m for the 0.1-m interval. According to the University of Texas researchers, this suggests that the permeability structure may be fractal.

This is just one example of work underway. Over the past five years, there has been an explosion of outcrop studies—a hundred or more literature references (above, right). And many projects are afoot to bring these studies together and create outcrop data bases. Key information needed to condition geostatistical modeling will become readily available.

nPhotomosaic of the Tensleep Sandstone in the Big Horn basin, Wyoming, USA. This Pennsylvanian-age sandstone is being studied because of its similarity to the Rotliegendes Sandstone, an important gas reservoir unit found in the Southern Basin of the North Sea. Like most outcrops, it is irregularly shaped. Therefore, to obtain the mosaic with constant scale and no distortion, overlapping photographs were taken from points in a plane parallel to a major depositional surface on the outcrop. With a helicopter positioned in this plane, laser range finders were used to maintain a fixed distance from the outcrop to ensure constant scale. This was double-checked by positioning scale bars at regular intervals on the outcrop. The mosaic was prepared during studies by Jon Lewis and Kjell Rosvoll at Imperial College, University of London, England and Heriot-Watt University, Edinburgh, Scotland with the support of Conoco (UK) Ltd.

nGoing underground—outcrops are not always on the surface. A three-dimensional (3D) study of permeability variation was undertaken in a subsurface silica sand mine in Morvern, Scotland. Some 8000 permeability measurements have been collected over a 1-km [0.6-mile] square section of the mine. The work was carried out by Jon Lewis and Ben Lowden at Imperial College, University of London, England, and supported by Den Norske Stats Oljeselskap A.S. (Statoil).

nMinipermeameter in detail. The probe is applied to the cleaned surface of the rock with enough pressure to ensure a good seal. Nitrogen is injected into the rock and the pressure and flow rate measured. These values, along with the area of the probe, are then used to calculate permeability. To measure the range of permeabilities found at the outcrop, four different flow elements are required.

1. Weber KJ and van Geuns LC: "Framework for Constructing Clastic Reservoir Simulation Models," Journal of Petroleum Technology (October 1990): 1248-1253, 1296-1297.
2. North CP: "Inter-Well Geological Modelling of Continental Sediments: Lessons From the Outcrop," presented at Advances in Reservoir Technology, Characterisation, Modelling & Management, organized by the Petroleum Science and Technology Institute, Edinburgh, Scotland, February 21-22, 1991.
3. Lewis JJM: "A Methodology for the Development of Spatial Reservoir Parameter Databases at Outcrop," presented at Minipermeametry in Reservoir Studies, organized by the Petroleum Science and Technology Institute, Edinburgh, Scotland, June 27, 1991.
4. A lithofacies is a mappable subdivision of a stratigraphic unit distinguished by its lithology.
5. Jordan DW, Slatt RM, D'Agostino A and Gillespie RH: "Outcrop Gamma Ray Logging: Truck-Mounted and Hand-Held Scintillometer Methods Are Useful for Exploration, Development, and Training Purposes," paper SPE 22747, presented at the 66th SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, October 6-9, 1991.
6. Lewis JJM: "Outcrop-Derived Quantitative Models of Permeability Heterogeneity for Genetically Different Sand Bodies," paper SPE 18153, presented at the 63rd SPE Annual Technical Conference and Exhibition, Houston, Texas, USA, October 2-5, 1988.
Jensen JL and Corbett PWM: "A Stochastic Model for Comparing Probe Permeameter and Core Plug Measurements," paper 3RC-24, presented at the 3rd International Reservoir Characterization Technical Conference of the National Institute for Petroleum and Energy Research and the US Department of Energy, Tulsa, Oklahoma, USA, November 3-5, 1991.
7. Kittridge MG, Lake LW, Lucia FJ and Fogg GE: "Outcrop/Subsurface Comparisons of Heterogeneity in the San Andres Formation," SPE Formation Evaluation 5 (September 1990): 233-240.
8. Tyler N, Barton MD and Finley RJ: "Outcrop Characterization of Flow Unit and Seal Properties and Geometries, Ferron Sandstone, Utah," paper SPE 22670, presented at the 66th SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, October 6-9, 1991.
These kriging methods yield smooth interpolations, but do not describe small-scale heterogeneity. In a process called stochastic modeling that superimposes correlated noise onto smooth interpolations, more realistic pictures emerge. A probability distribution determines how this noise is generated. A number of pictures, or realizations, will usually be created, each with different noise sampled from the same distribution (top, right). By analyzing many realizations, the extent to which geologic uncertainty affects reservoir performance can be studied (middle, right).

nImitating reality's variability. The kriging technique yields a smooth interpolation, ignoring nature's true small-scale heterogeneity. By adding noise to a kriged interpolation, stochastic modeling produces more lifelike realizations.
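A minimal numerical sketch of this idea, using invented well data, a plain linear interpolation as a stand-in for true kriging, and correlated noise produced by smoothing white noise; each call yields one realization that still honors the measured data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sparse "measured" well data along a 1D transect (positions, values).
x_wells = np.array([0, 25, 60, 99])
v_wells = np.array([3.0, 1.5, 2.8, 2.0])

x = np.arange(100)
smooth = np.interp(x, x_wells, v_wells)   # stand-in for a kriged interpolation

def realization(scale=0.4, corr=5):
    # Correlated noise: white noise smoothed by a short moving average.
    white = rng.standard_normal(x.size + corr)
    noise = scale * np.convolve(white, np.ones(corr) / corr, mode="valid")[: x.size]
    noise[x_wells] = 0.0                  # honor the measured data exactly
    return smooth + noise

# Many realizations, each with different noise from the same distribution.
reals = [realization() for _ in range(10)]
```

Running all ten realizations through a flow simulator, rather than just the smooth interpolation, is what allows the effect of geologic uncertainty on predicted performance to be quantified.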
An example is sequential Gaussian simulation (SGS). The data are constructed grid block by grid block. At the first selected block, the smooth interpolated value is calculated by kriging using the available measured data. The interpolated value and its variance, also calculated by kriging, define a Gaussian distribution function from which a noise value is randomly drawn and added to the interpolated value. For the next point, selected at random, the process is repeated, using as a base this newly derived data point together with the measured data. As the grid blocks are filled, all previously calculated values contribute to computing the next in the sequence.

The technique depends on the reservoir property being normally distributed, so a property like permeability, which has a skewed distribution, requires transformation to normality. The reverse process has to be performed once all the grid blocks are filled.

nTwo stochastic realizations of reservoir permeability showing how different noise added to the same kriged interpolation gives quite different realizations. In both the horizontal and vertical directions, realization A has relatively high continuity whereas realization B has relatively low continuity. Continuity is gauged by measuring the size of groups of contiguous blocks with permeability of 100 millidarcies (md) or more. After Fogg GE, Lucia FJ and Senger RK: "Stochastic Simulation of Interwell-Scale Heterogeneity for Improved Prediction of Sweep Efficiency in a Carbonate Reservoir," in Lake LW, Carroll HB Jr and Wesson TC (eds): Reservoir Characterization II. San Diego, California, USA: Academic Press Inc. (1991): 355-381.
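The sequence described above can be reduced to a short routine. This toy version works on a 1D grid of normal scores, assumes an exponential covariance model, and uses simple kriging with zero mean; all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

def cov(h, sill=1.0, a=10.0):
    # Assumed exponential covariance model with range parameter a.
    return sill * np.exp(-np.abs(h) / a)

def sgs_1d(n, data):
    """Sequential Gaussian simulation on n blocks; data maps index -> normal score."""
    known_x = [float(i) for i in data]
    known_v = [data[i] for i in data]
    # Visit the unsampled blocks along a random path.
    for x in rng.permutation([i for i in range(n) if i not in data]):
        X = np.array(known_x)
        C = cov(X[:, None] - X[None, :]) + 1e-9 * np.eye(X.size)
        c = cov(X - x)
        w = np.linalg.solve(C, c)            # simple kriging weights
        mean = w @ np.array(known_v)         # kriged value from data + earlier picks
        var = max(cov(0.0) - w @ c, 1e-9)    # kriging variance
        known_x.append(float(x))
        known_v.append(mean + np.sqrt(var) * rng.standard_normal())
    out = np.empty(n)
    out[np.array(known_x, dtype=int)] = known_v
    return out

# A skewed property such as permeability would first be transformed to
# normal scores; the realization below would then be back-transformed.
sim = sgs_1d(60, {0: -0.5, 30: 1.2, 59: 0.0})
```

Because every previously simulated block joins the conditioning set, early draws shape everything that follows, which is exactly why different random paths and draws give the distinct realizations shown in the figure.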
An increasingly popular method for generating noise for stochastic modeling uses the concept of fractals—a statistical technique that produces remarkably realistic imitations of nature. Fractal objects exhibit similar variations at all scales of observation. Every attempt to divide fractal objects into smaller regions results in ever more similarly-structured detail (right).9 This simplifies stochastic modeling. The variogram is defined from a single number—the fractal dimension, calculated from measured data in the reservoir or outcrops. And because fractals are self-similar, the variance of the noise need be determined only at a single scale. Fractal modeling has been used to predict the production performance of several large fields. Recent studies have also applied fractal models to investigate the performance of miscible gas floods.10

nThe Sierpinski Gasket. This self-similar fractal structure was devised by the Polish mathematician W. Sierpinski about 90 years ago. It is formed by dividing the largest triangle into smaller triangles with sides half as long as the original. In three of the resulting four triangles the process is repeated, at progressively finer scales. At every scale the same patterns can be found. Random fractals generated by a similar process, but with added stochastic variations, produce remarkably realistic simulations of geologic variability.

Discrete data require a special interpolation technique called indicator kriging. This can be combined with stochastic modeling to form a process called sequential indicator simulation (SIS). Take a model where either sand or shale must be chosen—there is no middle ground. At the same time, a random element must be introduced to obtain multiple realizations. Consider building a model where sand is 1 and shale 0. Indicator kriging will assign every grid block some value between 0 and 1—giving an indication of the likely lithology. For example, a value of 0.7 indicates a 70% chance of sand. In the stochastic stage, this percentage is then used to weight the random choice of sand or shale for the grid block.11

Object-Based Modeling

In object-based modeling, a picture of the reservoir model is built from generic units—for example, sand bodies or shale barriers, each unit having uniform internal properties. These models are built by starting with a uniform background matrix and adding units with a contrasting property: either a high permeability background to which shale barriers are added, or a low permeability background to which sand bodies are added.

When starting with a high-permeability background, the vertical distribution of shales may be inferred from cores and logs, particularly gamma ray logs. But unless the well spacing is extremely dense, nothing is revealed about the shales' lateral dimensions. Some shales correlate from well to well, but most do not, and their lateral extent must be generated statistically.

These stochastic shales are drawn at random from a size distribution and placed randomly until the precalculated shale density has been achieved. Shale density is estimated from the cumulative feet of shale measured in wells compared with the gross pay, and then assumed to represent the reservoir volume under study. To ensure that the realizations honor the deterministic data, shales observed in wells are placed in the model first, their lateral extent still determined randomly. Subsequent, randomly placed shales that intersect a well are rejected (below).12

To generate these models, the key statistics specified by the geologist are shale length and width, together with some guidance as to their interdependence. Traditionally, the geologist describes shales qualitatively as wide, extensive or lenticular—too vague for building realistic stochastic models. Instead, ranges of length and width are required, along with gross pay thickness and average shale density. The depositional environment has great bearing on these values and the primary sources of this information are outcrop studies.13

nStochastic shales in object-based modeling. The location of each stochastic shale is random and independent of other shales. The units are drawn randomly from a size distribution function and placed at random locations until the required shale density has been achieved.

9. Hewett TA: "Fractal Distributions of Reservoir Heterogeneity and Their Influence on Fluid Transport," paper SPE 15386, presented at the 61st SPE Annual Technical Conference and Exhibition, New Orleans, Louisiana, USA, October 5-8, 1986.
Hewett TA and Behrens RA: "Conditional Simulation of Reservoir Heterogeneity With Fractals," paper SPE 18326, presented at the 63rd SPE Annual Technical Conference and Exhibition, Houston, Texas, USA, October 2-5, 1988.
"Fractals and Rocks," The Technical Review 36, no. 1 (January 1988): 32-36.
Crane SD and Tubman KM: "Reservoir Variability and Modeling With Fractals," paper SPE 20606, presented at the 65th SPE Annual Technical Conference and Exhibition, New Orleans, Louisiana, USA, September 23-26, 1990.
10. Perez G and Chopra AK: "Evaluation of Fractal Models to Describe Reservoir Heterogeneity and Performance," paper SPE 22694, presented at the 66th SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, October 6-9, 1991.
Payne DV, Edwards KA and Emanuel AS: "Examples of Reservoir Simulation Studies Utilizing Geostatistical Models of Reservoir Heterogeneity," in Lake LW, Carroll HB Jr and Wesson TC (eds): Reservoir Characterization II. San Diego, California, USA: Academic Press Inc. (1991): 497-523.
11. Journel AG and Alabert FG: "Focusing on Spatial Connectivity of Extreme-Valued Attributes: Stochastic Indicator Models of Reservoir Heterogeneities," paper SPE 18324, presented at the 63rd SPE Annual Technical Conference and Exhibition, Houston, Texas, USA, October 2-5, 1988.
12. Haldorsen HH and Chang DM: "Notes on Stochastic Shales; From Outcrop to Simulation Model," in Lake LW and Carroll HB Jr (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 445-486.
13. Geehan GW, Lawton TF, Sakurai S, Klob H, Clifton TR, Inman KF and Nitzberg KE: "Geologic Prediction of Shale Continuity, Prudhoe Bay Field," in Lake LW and Carroll HB Jr (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 63-82.

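The shale-placement recipe described above, drawing dimensions from ranges supplied by the geologist, seeding the model with well-observed shales, then adding random shales and rejecting any that intersect a well until the target density is reached, can be sketched as follows (map-view rectangles, invented dimensions and well positions, overlaps between shales ignored):

```python
import random

random.seed(3)

AREA = 1000.0 * 1000.0                     # map-view area of the modeled region
TARGET_DENSITY = 0.15                      # precalculated shale density (fraction)
wells = [(200.0, 300.0), (700.0, 600.0)]   # wells where shale was observed

def draw_dimensions():
    # Length and width drawn from ranges supplied by the geologist.
    return random.uniform(50.0, 400.0), random.uniform(20.0, 150.0)

def covers_well(shale, well):
    x, y, length, width = shale
    wx, wy = well
    return abs(wx - x) <= length / 2 and abs(wy - y) <= width / 2

shales = []

# 1. Shales observed in wells go in first; lateral extent is still random,
#    but the center is constrained so the shale passes through the well.
for wx, wy in wells:
    length, width = draw_dimensions()
    x = random.uniform(wx - length / 2, wx + length / 2)
    y = random.uniform(wy - width / 2, wy + width / 2)
    shales.append((x, y, length, width))

def shale_density(shale_list):
    return sum(length * width for _, _, length, width in shale_list) / AREA

# 2. Add random shales, rejecting any that intersect a well where no shale
#    was observed, until the precalculated density is achieved.
while shale_density(shales) < TARGET_DENSITY:
    length, width = draw_dimensions()
    shale = (random.uniform(0, 1000), random.uniform(0, 1000), length, width)
    if not any(covers_well(shale, well) for well in wells):
        shales.append(shale)
```

The rejection step is what makes each realization honor the deterministic well data while everything between the wells remains statistical.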
In theory, a similar method could be
applied to generate sandstone bodies in a
low-permeability background. However, BP
Research, Sunbury-on-Thames, England, has
developed SIRCH—system for integrated
reservoir characterization—to generate real-
istic reservoir heterogeneities using a library
of depositional shapes. Each shape has a set
of attributes defined by sedimentology and
based on observations of depositional sys-
tems. These attributes determine the size
and position in the reservoir model of each
shape and its relations with other shapes.
For example, channel belts are con-
structed as continuous links of a specified
shape.14 As a belt is constructed, input val-
ues for the attributes are repeatedly sampled
from the library and the generated belts may
be conditioned to honor well data. The
model can also be faulted, structured to
account for regional dip and clipped back
using the mapped top and bottom of the
reservoir to give it a volume indicated by
structural maps (left ).
SIRCH is still stochastic, so several real-
izations are required to estimate reservoir
characteristics. The realizations can be used
to estimate the volume of sandstone within
a specified interval, reservoir connectivity
and hydrocarbons in place for a given
radius around a well. This process has
proved valuable in indicating sensitivity of
results to the input parameters. Influential
inputs can then be targeted for extra study.
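The attribute-sampling mechanism behind such shape libraries can be sketched in a few lines. The attribute names and ranges below are invented for illustration; the real system holds about 50 properties per channel belt and also conditions each belt to well data:

```python
import random

random.seed(11)

# Hypothetical input distributions: one (low, high) range per attribute.
library = {
    "width_m": (150.0, 900.0),
    "thickness_m": (3.0, 12.0),
    "reach_length_m": (500.0, 2000.0),
    "azimuth_deg": (60.0, 120.0),
}

def sample_link():
    # Some attributes are sampled at random from the input distributions...
    link = {name: random.uniform(lo, hi) for name, (lo, hi) in library.items()}
    # ...and some are calculated from other properties.
    link["width_to_thickness"] = link["width_m"] / link["thickness_m"]
    return link

# A channel belt is constructed as a continuous chain of sampled links,
# with the library resampled for each new link.
belt = [sample_link() for _ in range(6)]
```

Because every link is a fresh draw, repeated runs give the multiple realizations needed to test how sensitive reservoir estimates are to each input distribution.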
Individually, all these techniques paint a portion of the total picture. In most cases, gaining a full view of the reservoir requires a hybrid (see "The HERESIM Approach to Reservoir Characterization," next page). Object-based, discrete modeling may be used to describe the large-scale heterogeneities in the reservoir—the sedimentological units—while different continuous models may describe the spatial variations of properties within each unit. For example, for fluvial environments, BP Research is planning to use SIRCH to generate channels, SIS for the facies within them and SGS for permeability within the facies. (continued on page 36)

nBuilding an object-based model using SIRCH, a software package developed at BP Exploration. In this model, two different types of fluvial channel belt have been generated in a three-stage process. First, one type of channel belt is generated (yellow). Then a second set (purple) is added to the first. This second type has associated overbank deposits (red)—which are thinner, poorer quality sands that nevertheless improve connectivity. Finally, the combined channel belts are faulted. Each channel belt is created using about 50 properties. Some are constants, some are picked at random from input distributions and some are calculated from other properties. Properties that determine channel shape include width, thickness, reach length and angle, azimuth and depth. Flow properties, like permeability and porosity, are assigned either by lithology or by using a random sample. All well data are honored. Net-to-gross ratio is used to control the quantity of sand generated.

14. Channels are the depositing rivers; channel belts are the resulting sandstone bodies.

The HERESIM Approach to Reservoir Characterization

An example of a software package that combines traditional geologic analysis with geostatistical analysis techniques is HERESIM, developed jointly by the Institut Français du Pétrole, Rueil-Malmaison, France, and the Centre de Géostatistiques, Ecole Nationale Supérieure des Mines de Paris, Fontainebleau, France. It allows sedimentological and geostatistical structural analysis to be carried out on interactive workstations, followed by geostatistical conditional simulations using small-scale grid blocks. Each simulation creates an image of the reservoir geology, consistent with well data. Petrophysical data, like permeability and porosity, are then mapped between wells using deterministic or stochastic algorithms. Finally, the petrophysical information is scaled up for fluid-flow simulation (right).1

nOrganization and use of HERESIM, a geostatistical reservoir characterization package.

First, a sedimentological study is carried out to subdivide the area under investigation, which can include formations bordering the reservoir as well as the reservoir itself. The largest divisions are depositional units that include sets of genetically linked strata, bounded by major sedimentary discontinuities or sudden variations in the sedimentological environment. Distinction of units is usually achieved by correlating features between wells. The next step is to identify lithofacies within each depositional unit, collect statistical information about them at the wells and then generate stochastic realizations of lithofacies between wells.

Statistical information obtained from well data, seismic interpretation and geologic knowledge are:
• Proportion curves—the percentage occurrence of each lithofacies in a depositional unit (right)
• Experimental variograms—to quantify the spatial continuity of each lithofacies in the reservoir (far right).2

nHorizontal proportion curve. Proportion curves provide information on the relative frequency of each lithofacies in a depositional unit. Here, four lithofacies have been proportioned using data from 20 wells.

nVariograms quantify the spatial continuity of each lithofacies in the reservoir. The sill equals the variance of the data, while the range indicates the separation beyond which two points are uncorrelated.

With this information and a form of indicator kriging employing a Gaussian random function, numerous realizations of lithofacies distribution can be readily obtained for each unit. However, the algorithm works only in rectangular formats, so it is necessary to first geometrically transform each unit into a rectangle. After stochastic realizations have been generated, the real shape of the unit is restored. HERESIM offers two types of transform, depending on whether the prevailing environment since deposition has been dominated by erosion or differential subsidence—

1. Eschard R, Doligez B, Rahon D, Ravenne C and Leloch G: "A New Approach for Reservoirs Description and Simulation Using Geostatistical Methods," presented at Advances in Reservoir Technology, Characterization, Modelling & Management, organized by the Petroleum Science and Technology Institute, Edinburgh, Scotland, February 21-22, 1991.
2. Matheron G, Beucher H, de Fouquet C, Galli A, Guérillot D and Ravenne C: "Conditional Simulation of the Geometry of Fluvio-Deltaic Reservoirs," paper SPE 16753, presented at the 62nd SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, September 27-30, 1987.

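An experimental variogram of this kind can be computed directly from regularly sampled data; a minimal sketch on a synthetic, correlated profile (the moving-average construction below simply manufactures data with a known correlation length of about ten samples):

```python
import numpy as np

def experimental_variogram(z, max_lag):
    # gamma(h) = half the mean squared difference between points h apart.
    lags = np.arange(1, max_lag + 1)
    gamma = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])
    return lags, gamma

rng = np.random.default_rng(5)
# Synthetic, correlated property profile along a well.
profile = np.convolve(rng.standard_normal(500), np.ones(10) / 10, mode="same")

lags, gamma = experimental_variogram(profile, 40)
sill = profile.var()   # the sill approximates the variance of the data
# gamma rises over short lags and levels off near the sill; the lag at which
# it levels off is the range, beyond which two points are uncorrelated.
```

Fitting a model curve to such an experimental variogram is what supplies the sill and range that condition the stochastic lithofacies realizations.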
nA step in HERESIM processing: creating rectangles to allow generation of stochastic realizations. Before the stochastic realization can be generated, the depositional units have to be transformed into rectangles. HERESIM has two ways of doing this. One assumes that the unit has been eroded and that all the correlation lines within the unit are parallel. To create a rectangle, these correlations are extrapolated across the eroded sections of the unit. Then the rest of the structure can be stochastically generated and the resulting realization re-eroded to recreate the original structure. The alternative technique assumes that differential subsidence has occurred, causing the correlation lines to diverge. To create a rectangle, the effects of the subsidence are reversed. Then the stochastic realization is created and resubsided.

although some forms of fracturing can also be accommodated (previous page).

nStochastic realization generated using HERESIM.

Having selected the most likely realizations of lithofacies, geologists and reservoir engineers assign porosity and permeability values. HERESIM assumes that petrophysical data are strongly related to lithofacies. For each lithofacies, either constant data values can be assigned or a Monte Carlo-type distribution employed.

The geologic model describes a reservoir at a decametric scale (on average 50-m [165 ft] square or smaller with a thickness of 0.5 m [1.5 ft]). As a result, simulated grids are huge (millions of cells), too big for fluid-flow simulation (above). Scaling up in HERESIM is performed using a fast algorithm that calculates absolute permeabilities.3 First, average permeability is estimated from the root of the product of the arithmetic mean and harmonic mean of the grid-block permeabilities. The quality of this value can be judged by computing the log of the difference between the two means. The larger the difference, the less satisfactory the simple average. Grid blocks that fail this test are subjected to a more accurate and time-consuming fluid-flow simulation.

3. Guérillot D, Rudkiewicz JL, Ravenne C, Renard G and Galli A: "An Integrated Model for Computer Aided Reservoir Description: From Outcrop Study to Fluid Flow Simulations," presented at the 5th European Symposium on Improved Oil Recovery, Budapest, Hungary, April 25-27, 1989. Reprinted in Revue de l'Institut Français du Pétrole 45, no. 1 (January-February 1990): 71-77.

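That estimate and its quality check fit in a few lines. The acceptance threshold below is an invented illustration, not HERESIM's published criterion, and the check is written as the difference of the logs of the two means:

```python
import numpy as np

def quick_upscale(perms, tol=0.5):
    """Estimate a coarse-block permeability and flag whether it is trustworthy."""
    k = np.asarray(perms, dtype=float)
    arith = k.mean()                    # upper bound: flow along layers
    harm = 1.0 / (1.0 / k).mean()       # lower bound: flow across layers
    k_eff = np.sqrt(arith * harm)       # root of the product of the two means
    # The further apart the two means are in log units, the less satisfactory
    # the simple average; blocks that fail go to full fluid-flow simulation.
    acceptable = (np.log10(arith) - np.log10(harm)) <= tol
    return k_eff, acceptable

k1, ok1 = quick_upscale([100.0, 120.0, 95.0, 110.0])   # fairly uniform block
k2, ok2 = quick_upscale([1000.0, 0.1, 800.0, 0.05])    # strongly contrasted block
# ok1 is True; ok2 is False, flagging the block for detailed simulation
```

For the uniform block the two means nearly coincide, so the simple average is accepted; the contrasted block is exactly the kind that must be routed to a small fine-grid flow simulation instead.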
Rescaling the Geologic Model For Fluid-Flow Simulation

Geologic modeling can describe a reservoir with millions of grid blocks. For the computer, this mainly presents problems of storage. But the next major operation is fluid-flow simulation, which involves complex numerical calculation—a monumentally large task given a million grid blocks (see "Simulating Fluid Flow," page 38).

Today, fluid-flow simulators can cope with up to about 50,000 grid blocks—the greater the number of grid blocks, the more expensive the fluid-flow simulation is in computation. So before the simulation is run, the geologic model, with its high-resolution petrophysical data, must be enshrined in a smaller number of larger grid blocks—a process called scaling up. Reorganizing so much data may introduce error, masking heterogeneities in the geologic model that directly affect simulated production.

The key to achieving scale up is deriving a single set of reservoir properties—called pseudos—for the larger blocks so that the fluid-flow simulation after scale up is as close as possible to what it would have been with the small-scale data.

Over the years, a variety of methods has been developed to do this. These include calculations based on simple averaging procedures like the oft-quoted Kyte and Berry method and more complex simulations of parts of the reservoir using small-scale grid blocks, measuring about 1 m square and 0.5 m thick.15

The impact of small-scale effects on two-phase flow performance is revealed in a study of the North Sea Rannoch formation, carried out at Heriot-Watt University, Edinburgh, Scotland. Rannoch subfacies are strongly laminated and rippled. The study utilized newly acquired minipermeameter data at a millimetric scale, much denser than that conventionally obtained from core analysis.

These data were combined with rock capillary pressure curves to define pseudos for each lamina subfacies. A geologic model for hummocky cross-stratification combined the subfacies at the bedform scale (left).16 The simulated flow performance of this model was then compared with simpler flow simulations calculated using arithmetic and harmonic average permeabilities for horizontal and vertical directions, respectively.

The flow behavior of the detailed model was anisotropic, whereas the simpler approach yielded more isotropic behavior and significantly higher recoveries for vertical displacement (right). So, small-scale heterogeneity affects reservoir producibility, and in many cases, variations in capillary pressure with rock type and small-scale sedimentary structure cannot be ignored.

A predominant theme in scaling up is the identification of a number of discrete scales of heterogeneity. To preserve the effects of heterogeneity at all scales, a series of scale-up operations can be performed, each dealing with heterogeneity larger than the previous operation. At each stage in the process, the size of the scale up should be as large as possible without introducing the next level of heterogeneity. If a reservoir is divided into depositional units, lithofacies, beds and small-scale heterogeneity within a bed, the scale-up process will have four stages. For example, lithofacies become the basic building blocks at the depositional unit scale.17

nModeling the North Sea Rannoch formation. This reservoir sandstone is characterized by hummocky cross-stratified bedforms. The Department of Petroleum Engineering at Heriot-Watt University determined the flow performance of this depositional structure using the two deterministic simulations shown here. These two images represent a slice of rock measuring 2.4 m [7.9 ft] and 14.4 m [47.2 ft]. In the first (top), saturation is at initial conditions. Low permeability, rippled crests have the lowest oil saturations (green). The second (bottom) shows saturation distribution after 0.2 pore volumes of water have been injected at a low advance rate of 0.25 m/d. Saturation is reduced most rapidly in the low permeability, rippled crests (dark blue).

nRannoch formation's flooding characteristics. Using densely-sampled permeability data, capillary pressures measured from appropriate rock types and knowledge of the depositional structure to construct a detailed geologic model, the Department of Petroleum Engineering at Heriot-Watt University simulated both horizontal and vertical flooding. The results using the detailed geologic model are significantly different from those derived using traditional techniques employing average permeabilities and rock curves—particularly for the vertical flood.

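The arithmetic and harmonic averages used in that comparison behave very differently for laminated rock; a small illustration with invented lamina permeabilities:

```python
import numpy as np

# Permeabilities of a stack of thin laminae (md), alternating high and low,
# loosely in the spirit of the laminated Rannoch subfacies.
laminae = np.array([500.0, 20.0, 350.0, 8.0, 420.0, 15.0])

# Flow parallel to the laminae sees them side by side: arithmetic average.
k_horizontal = laminae.mean()

# Flow perpendicular to the laminae crosses each in series: harmonic average.
k_vertical = laminae.size / np.sum(1.0 / laminae)

anisotropy = k_horizontal / k_vertical   # much greater than 1 for laminated rock
```

The harmonic average is dominated by the tightest laminae, so a single isotropic average permeability would badly overstate vertical sweep, which is exactly the discrepancy the detailed Rannoch model exposed.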
Heterogeneity can be assessed by measuring the average value of a reservoir property within a growing volume. When the volume is very small, the average value will fluctuate because of microscopic heterogeneities. But as the averaging volume increases, more of these heterogeneities will be encompassed and the average value will stabilize. Further growth may result in capturing larger-scale heterogeneity and the value will again fluctuate (right).18 Since this behavior varies among properties, the size of the smallest grid block should be selected so that all key properties are stable.19

nHow sample volume affects property averages. At small sample volumes, property averages fluctuate because of microscopic heterogeneity. As sample volume is increased, heterogeneities will average out and properties stabilize—this gives the representative elementary volume (REV). This stability continues until larger-scale heterogeneity begins to take effect. Different properties—fluid density (ρ), porosity (φ) and permeability (k)—have different volumes where heterogeneity is stable. The ideal sample volume should fall within ranges of stability for all key properties.

No matter how many scales are used, existing methods of calculating pseudos have their drawbacks. The elementary methods, using averaged properties, are quick but not accurate enough. The simulation methods are computationally expensive. And if sensitivity studies are required using a number of stochastic realizations, the time and cost of scaling up can be prohibitive. Recently, BP Research developed a new algorithm—real-space renormalization—that promises speedy yet accurate calculation of pseudos.

The method starts with a group of eight grid blocks (2×2×2) with permeabilities distributed to represent the original data. A new relative permeability is calculated and adopted for a new single block the size of the eight original blocks—it becomes the starting point for the next renormalization when the process is repeated with seven other similarly generated blocks. The new block's effective relative permeability is calculated using a semianalytical technique that depends on an analogy between Darcy's law and Ohm's law—block permeabilities are modeled by an equivalent resistor network. Through a series of transformations, the resistor network is reduced to a single resis-

15. Kyte JR and Berry DW: "New Pseudo Functions to Control Numerical Dispersion," Society of Petroleum Engineers Journal 15 (August 1975): 269-276.
Tompang R and Kelkar BG: "Prediction of Waterflood Performance in Stratified Reservoirs," paper SPE 17289, presented at the SPE Permian Basin Oil and Gas Recovery Conference, Midland, Texas, USA, March 10-11, 1988.
16. Corbett PWM and Jensen JL: "An Application of Small Scale Permeability Measurements—Prediction of Flow Performance in a Rannoch Facies, Lower Brent Group, North Sea," presented at Minipermeametry in Reservoir Studies, organized by the Petroleum Science and Technology Institute, Edinburgh, Scotland, June 27, 1991.
17. Lasseter TJ, Waggoner JR and Lake LW: "Reservoir Heterogeneities and Their Influence on Ultimate Recovery," in Lake LW and Carroll HB Jr (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 545-559.
18. Any volume for which the property is stable is called a representative elementary volume (REV).
19. Haldorsen HH: "Simulator Parameter Assignment and the Problem of Scale in Reservoir Engineering," in Lake LW and Carroll HB Jr (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 293-340.
Kossack CA, Aasen JO and Opdal ST: "Scaling Up Heterogeneities With Pseudofunctions," SPE Formation Evaluation 5 (September 1990): 226-232.
Norris RJ and Lewis JJM: "The Geological Modeling of Effective Permeability in Complex Heterolithic Facies," paper SPE 22692, presented at the 66th SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, October 6-9, 1991.

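A crude two-dimensional sketch conveys the spirit of renormalization: each coarsening step merges 2x2 groups of blocks with a simple series-parallel resistor reduction. This stands in for, and is much simpler than, the semianalytical network transformation described above, and it works on absolute rather than relative permeability:

```python
import numpy as np

rng = np.random.default_rng(9)

def renorm_step(k):
    """Coarsen a 2D permeability grid by merging each 2x2 group of blocks.

    Resistor analogy for flow in x: the two blocks of a row act in series
    (harmonic mean) and the two rows act in parallel (arithmetic mean).
    """
    a, b = k[0::2, 0::2], k[0::2, 1::2]
    c, d = k[1::2, 0::2], k[1::2, 1::2]
    top = 2.0 / (1.0 / a + 1.0 / b)      # series combination, top row
    bottom = 2.0 / (1.0 / c + 1.0 / d)   # series combination, bottom row
    return 0.5 * (top + bottom)          # parallel combination of the rows

# Log-normal fine-grid permeabilities, coarsened 8x8 -> 4x4 -> 2x2 -> 1x1.
k0 = np.exp(rng.standard_normal((8, 8)))
k = k0
while k.shape[0] > 1:
    k = renorm_step(k)
k_eff = float(k[0, 0])
```

Because each step reuses only the previous level's coarse values, the cost grows with the number of blocks rather than with an ever-larger flow simulation, which is the appeal of the approach.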
Simulating Fluid Flow

Today, most simulators solve fluid-flow problems by segmenting a portion of the reservoir into a series of grid blocks—either two- or three-dimensional. The fluid phases in each block are modeled with finite difference equations similar to the conventional volumetric material balance equations. Darcy's law is then used to describe fluid flow between grid blocks.

Early efforts in reservoir simulation, in the late 1940s, began with simple material balance calculations. Throughout the 1960s, black oil simulations dominated the scene. These describe nonvolatile oil as a single component containing gas in solution as a second component. There are three compressible, immiscible phases: gas, water and oil. Recovery methods simulated included pressure depletion and some forms of pressure maintenance.1

The advent of enhanced recovery techniques such as chemical flooding, steamflooding and in-situ combustion ushered in more sophisticated simulators. Most describe the hydrocarbon in terms of a number of components and solve thermodynamic equilibrium equations to determine the distribution of components between the liquid and vapor phases in the reservoir.2

A common strategy for fluid-flow simulation is to use history matching to adjust the geologic model. Adjustments should be consistent with measured data and improve upon postulated variations in reservoir properties that cannot be or have not been measured. The results of the simulation can be extremely sensitive to variations in the reservoir description. Models for reservoir management are usually routinely updated as new production data or reservoir information are acquired. This allows for reexamination of current and future production scenarios. But history matching always results in nonunique combinations of variables. And it could be said that the only fully simulated reservoir is a fully depleted one.

Simulator technology is by no means mature. Some current developments center on improving the ability to cope with different production scenarios, for example, naturally fractured formations and horizontal wells.

During production by gravity drainage in naturally fractured reservoirs, oil drains from matrix blocks into the fracture system. From there it can pass either into other blocks below or along fractures. A fractured reservoir is modeled using two interacting continua, one for the matrix and one for the fractures—the so-called dual porosity concept. The matrix contains most of the oil and the fractures most of the conductivity. So transmissibility between matrix blocks is usually ignored. But to correctly model ultimate recovery, the capillary contacts between matrix blocks should be considered.

Work at Koninklijke/Shell Exploratie en Produktie Laboratorium, Rijswijk, The Netherlands, has shown that a vertical stack of oil-saturated matrix

initial production rates. To combat this, Shell has devised a simulator that allows flow in all directions between neighboring blocks.3

Fluid-flow simulation for horizontal wells is currently being approached in at least two ways. The productivity of a horizontal well can be approximated, using numerical solutions of simplified equations, and the approximation used in a conventional reservoir simulator. Alternatively, for single wells in which short-term performance is being studied, fine-scale, black oil simulations only have been used. In this approach, pseudos can be derived for a coarser, field-sized simulation.

1. Mattax CC and Dalton RL: "Reservoir Simulation," Journal of Petroleum Technology 42 (June 1990): 692-695.
Breitenbach EA: "Reservoir Simulation: State of the Art," Journal of Petroleum Technology 43 (September 1991): 1033-1036.
2. Cheshire IM and Pollard RK: "Advanced Numerical Techniques for Reservoir Simulation and Their Use on Vector and Parallel Processors," in Edwards SF and King PR (eds): Mathematics in Oil Production. Oxford Science Publications, Oxford, England: Clarendon Press (1988): 253-268.
3. Por GJ, Boerrigter P, Maas JG and de Vries A: "A Fractured Reservoir Simulator Capable of Modeling Block-Block Interaction," paper SPE 19807, presented at the 64th SPE Annual Technical Conference and Exhibition, San Antonio, Texas, USA, October 8-11, 1990.
blocks surrounded by gas does not drain indepen-
dently.3 Oil from one block passing into the frac-
ture system will be absorbed by the underlying
block, slowing production. Therefore, a flow sim-
ulation that ignores this effect will overestimate

38 Oilfield Review
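The grid-block bookkeeping described in the simulation article, with a material balance per block and Darcy's law between blocks, can be illustrated by a toy sketch. Everything here (single-phase flow, the constant transmissibility and storage values, the pressures) is invented for illustration and far simpler than a real black oil simulator:

```python
# Toy single-phase sketch of the grid-block scheme (illustrative only:
# one dimension, constant transmissibility and storage, explicit in time).
def simulate(pressures, transmissibility, storage, dt, steps):
    """Explicit material balance: each block's pressure change equals
    net Darcy inflow divided by the block's storage."""
    p = list(pressures)
    n = len(p)
    for _ in range(steps):
        # Darcy's law between neighbors: flux[i] flows from block i to i+1
        flux = [transmissibility * (p[i] - p[i + 1]) for i in range(n - 1)]
        p = [p[i] + dt * ((flux[i - 1] if i > 0 else 0.0)
                          - (flux[i] if i < n - 1 else 0.0)) / storage
             for i in range(n)]
    return p

# A depleted block between two charged ones refills toward the average.
print(simulate([3000.0, 1000.0, 3000.0], transmissibility=1.0,
               storage=50.0, dt=1.0, steps=2000))
```

Implicit time-stepping, multiple phases and spatially varying rock properties are what separate this toy from the simulators described in the text.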
tance, equivalent to the new block’s effective
relative permeability.
The effect of each stage of renormaliza-
tion is to produce larger blocks whose per-
meability approaches that of the whole.
Results of renormalization have been com-
pared with direct numerical simulation and
the maximum error was found to be only
7%.20 The main difference was in computa-
tional effort. Renormalization speeds are up
to two orders of magnitude faster than those
of direct numerical simulation. The largest
problem yet tackled totalled 540 million
grid blocks.
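King's method derives each coarse-block value from a resistor-network analogue (reference 20). As a hedged illustration only, the sketch below substitutes a simpler averaging rule, harmonic along the flow direction and arithmetic across it, to show the mechanics of collapsing a fine grid into a single effective permeability:

```python
# Illustrative renormalization sketch (NOT King's exact formula):
# each pass replaces every 2x2 group of cells with one coarse cell.
# Flow is taken left-to-right: harmonic mean in series (along flow),
# arithmetic mean in parallel (across flow).

def harmonic(a, b):
    return 2.0 / (1.0 / a + 1.0 / b)

def renormalize(grid):
    """grid: square 2^n x 2^n list of lists of cell permeabilities."""
    while len(grid) > 1:
        coarse = []
        for i in range(0, len(grid), 2):
            row = []
            for j in range(0, len(grid[0]), 2):
                # series within each row of the 2x2 group, then parallel
                top = harmonic(grid[i][j], grid[i][j + 1])
                bottom = harmonic(grid[i + 1][j], grid[i + 1][j + 1])
                row.append(0.5 * (top + bottom))
            coarse.append(row)
        grid = coarse
    return grid[0][0]

# 4x4 grid with a low-permeability streak (values in millidarcies):
grid = [[100, 100, 100, 100],
        [100,   1, 100, 100],
        [100, 100,   1, 100],
        [100, 100, 100, 100]]
print(renormalize(grid))  # effective permeability, between 1 and 100
```

Each pass quarters the cell count, which is the source of the speed advantage quoted in the text.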
Renormalization has drawbacks: it is diffi-
cult to represent contorted flow paths with
small cells and the estimation of effective
permeability suffers. This is seen in very
shaly reservoirs, where there are high con-
trasts between neighboring permeabilities.
This problem may be resolved by another
method that short-circuits the need for scale
up and requires a different kind of reservoir characterization. This is the streamtube approach, championed by Chevron Oilfield Research Co., La Habra, California, USA, among others.21 The plan view of the reservoir is simulated as a network of tubes that carry the flow between injection and production wells (right). The technique takes into account well placement, areal heterogeneity and the relative flow rates of wells. The key to the technique's success is a preliminary flow simulation on two-dimensional vertical slices between wells. The observed scaling behavior is then mapped onto individual streamtubes and their contributions summed to get a three-dimensional prediction of reservoir performance.

Because of the relatively greater arithmetic complexity of fluid-flow simulation compared with model building, computers may always be able to generate more complex geologic models than can be simulated. Scaling up will therefore remain a key element in reservoir characterization.

The fluid-flow simulation bottleneck will have to be addressed, either by developing new, more rapid processing hardware like parallel processing22 or by smarter algorithms, or probably both. Then, the capability for cost-effective and repeated stochastic simulation will be realized. The impact of this capability on reservoir management, and ultimately on improved recovery, remains to be felt. —CF

■Cutting simulation time using streamtubes, a network of conduits conveying fluid from injection to production wells. This field-scale model incorporates five solvent injection wells (pink) and seven producers (yellow). Using streamtubes, simulating the injection of 2.5 pore volumes of solvent took just 30 seconds on a MicroVAX 3200 workstation. Using conventional fluid-flow simulation for a coarse grid and a Cray XMP-48, the calculation required 630 seconds. The two calculations predicted similar solvent breakthrough. (After Hewett and Behrens, reference 21.)

20. King PR: "The Use of Renormalization for Calculating Effective Permeability," Transport in Porous Media 4 (July 1989): 37-58.
King PR, Muggeridge AH and Price WG: "Renormalization Calculations of Immiscible Flow," submitted to Transport in Porous Media, August 1991.
21. Emanuel AS, Alameda GK, Behrens RA and Hewett TA: "Reservoir Performance Prediction Methods Based on Fractal Geostatistics," SPE Reservoir Engineering 4 (August 1989): 311-318.
Hewett TA and Behrens RA: "Scaling Laws in Reservoir Simulation and Their Use in a Hybrid Finite Difference/Streamtube Approach to Simulating the Effects of Permeability Heterogeneity," in Lake LW, Carroll HB Jr and Wesson TC (eds): Reservoir Characterization II. San Diego, California, USA: Academic Press Inc. (1991): 402-441.
22. Mayer DF: "Application of Reservoir Simulation Models to a New Parallel Computing System," paper SPE 19121, presented at the SPE Petroleum Computer Conference, San Antonio, Texas, USA, June 26-28, 1989.
January 1992 39
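As a toy of the streamtube summation (tube volumes, rates and the piston-displacement assumption are all invented), each tube below is an independent one-dimensional conduit and the field response is simply the sum of the tube responses, which is why the approach is so much faster than a full grid calculation:

```python
# Toy streamtube summation (illustrative only).
# Each tube: (pore_volume, flow_rate). Piston-like displacement is
# assumed, so a tube breaks through at t = pore_volume / flow_rate;
# after that it delivers injected fluid at its own flow rate.

def injected_fluid_rate(tubes, t):
    """Total rate of injected fluid arriving at producers at time t."""
    return sum(rate for pv, rate in tubes if t >= pv / rate)

tubes = [(100.0, 2.0), (150.0, 1.5), (80.0, 4.0), (200.0, 1.0)]

for t in (10, 30, 60, 120, 250):
    print(t, injected_fluid_rate(tubes, t))
```

A real implementation maps full one-dimensional displacement solutions onto the tubes rather than piston fronts, but the summation structure is the same.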
RESERVOIR OPTIMIZATION

Taming the Geoscience Data Dragon

Cross-disciplinary uses of oilfield information are changing the face of petroleum data management. Here is a look at exploration and production data management—where it stands today and where it needs to go.
Steve Darden
John Gillespie
FINDER Graphics Systems, Inc.
Corte Madera, California, USA

LaRay Geist
Geoffrey King
BHP Petroleum (America) Inc.
Houston, Texas, USA

Scott Guthery
Ken Landgren
Schlumberger Austin Systems Center
Austin, Texas, USA

John Pohlman
ExplorTech Computer Applications
Houston, Texas, USA

Samuel Pool
ARCO
Plano, Texas, USA

Dave Simonson
Consultant
Oakland, California, USA

Paul Tarantolo, Jr.
Exxon Production Research Company
Houston, Texas, USA

Dan Turner
Petrotechnical Open Software Corporation
Houston, Texas, USA

The center of attention for the exploration and production geoscientist is the base map—a map of geologic or geophysical features, such as structure, fluid type or bedding tops. This map, built from cross sections based on log and seismic data, is used to locate new prospects and plan development of existing ones.

To produce this map in the early 1980s, the E&P geoscientist spent 70 to 80% of the time locating, sorting and reprocessing data and making names and scales of data consistent.1 At the end of the day, a map was produced by batch processing on a mainframe computer. Hours later, or the next morning, the geoscientist draped the map over a drafting table, beside previous maps. To do the analysis, the scientist looked at everything—compared maps, referred to seismic sections and logs that covered the office wall, and hunted through production and core reports. If the scientist could keep track of the data, he or she could assemble a clear mental picture of the reservoir and therefore generate the best possible map. Ultimately, the geoscientist would meet with specialists from other oilfield disciplines and, through a series of meetings, contribute to the integration of different map versions. For example, the geologist's map, built from well log correlation, would be integrated with the geophysicist's map, built from seismic data.

To produce a map in the 1990s, the E&P geoscientist sits before a workstation. Only the remnants of clutter remain. The drafting table is little used. Information from stacks of paper and film has been loaded into the computer. Porosity data are three keystrokes away rather than buried in a pile (next page). Seismic and log data pop up in separate windows on the same screen. The overnight batch job is replaced by an interactive map revision...let's see what happens if we assume the Dunlin formation here is 200 feet thick instead of 170 feet...The geoscientist no longer tracks data mentally; that is done by the workstation. He or she can devote more energy to data analysis and more to data management. Now data are shared across disciplines, and interdisciplinary integration is no longer a separate step. It is contained in the interpretation.

The workstation is the most visible product of a data revolution that, in just ten years, has affected every level of the E&P business.2 In many oil companies, centralized data processing is giving way, in whole or part, to smaller units of "project data" downloaded from mainframe computers to workstations. The user's demand for easier communication between application programs is compelling software vendors and oil companies to set aside rivalries and join forces to write industry standards for data formats, data exchange and computer graphics. Data organization systems are springing up that allow previously isolated data to be accessed from one workstation. Ten years ago, all this was mostly infeasible.
■The old world of geoscience interpretation meets the new. Remnants of clutter remain around a demonstration workstation at FINDER Graphics in Corte Madera, California. Nearly all the data surrounding Arlene Fox and Eric Erickson have been loaded into the data base accessed by the workstation. Note on the far wall the "correlation" established between logs, using pushpins and rubberbands. The same sort of correlation has been migrated into the computer, using graphic devices.

Gary Wagner photo
As a result of increasing computer power and versatility, the exploration and production data dragon has been caged but not tamed (see "Evolution of Petroleum Data Management," next page). In 1992, exploration and production data management stands at an uncertain and exciting threshold, marked by rapid change in three areas: computer hardware, computing standards and database development.

Hardware

Reservoir data today are stored and analyzed on four kinds of hardware systems: mainframe computers (including minicomputers and servers), workstations, a combination of mainframes and workstations, and on small computers, usually workstations or personal computers (PCs) dedicated to specific tasks, such as seismic or log analysis. Many see this as an unstable transition state, and predictions of the outcome vary.

Makers of mainframes note the petroleum industry's long-standing commitment to the mainframe, citing its many advantages: computing power, capacity for data security, ease and reliability of data access (usually a single interface for all data), and ease of tracking data history. Workstation vendors see their tool as the mouse—limited in power, but flexible and inexpensive—that (continued on page 44)
For help in preparation of this article, thanks to Jeffrey A. Brown and J. William Bradford, GeoQuest Systems, Inc., Houston, Texas, USA; Ron Dietsch and Lisa Stennes, Petroleum Information, Denver, Colorado, USA; Eric Erickson and Arlene Fox, FINDER Graphics Systems, Inc., Corte Madera, California, USA; Jerry House, Petroleum Information, Houston, Texas, USA; Mel Huszti, Gulf Canada Resources Limited, Calgary, Alberta, Canada; Stephen Kenny, Digitech Information Services Ltd., Calgary, Alberta, Canada; Pam Koscinski, Dwight's Energydata, Inc., Oklahoma City, Oklahoma, USA; Marc Lador, Petroconsultants, Geneva, Switzerland; Gary Meyers, ARCO Oil and Gas Company, Plano, Texas, USA; Phil Freeman, Eric J. Milton, Dwight V. Smith and Terry J. Sheehy, FINDER Graphics Systems, Inc., Houston, Texas, USA; Av Munger, Munger Oil Information Services, Los Angeles, California, USA; Bill Quinlivan, Schlumberger Austin Systems Center, Austin, Texas, USA; Ron Samuels, Dwight's Energydata, Inc., Dallas, Texas, USA; Bill Schork and Lisa Stewart, Schlumberger-Doll Research, Ridgefield, Connecticut, USA; Ron Uchida, FINDER Graphics Systems, Inc., Lakewood, Colorado, USA; Laramie M. Winczewski, Simon-Horizon Inc., Houston, Texas, USA.

The following marks and trademarks appear in this article: Cray (Cray Research Inc.), FINDER (Schlumberger), GeoShare (joint mark of Schlumberger and GeoQuest Systems, Inc.), Intel (Intel Corp.), Macintosh (Apple Computer, Inc.), MOTIF (Open Software Foundation, Inc.), ORACLE (ORACLE Corporation), UNIX (AT&T Bell Laboratories), VAXstation (Digital Equipment Corp.), VMS (Digital Equipment Corp.) and X Window System (Massachusetts Institute of Technology).

A list of acronyms used in this article appears on page 54.

1. Citerne A and Yu K: "Knowledge-Based Well Data Management and Graphical Interface," paper SPE 17611, presented at the SPE International Meeting on Petroleum Engineering, Tianjin, China, November 1-4, 1988.
2. About 50 companies offer E&P hardware or software systems for the exploration and production environment. See Oil & Gas Journal 88, no. 11 (March 12, 1990): Special Supplement, 32-3–32-17.
Evolution of Petroleum Data Management

Timeline, 1920 to 1960:

• 1919 Munger Oil Information Service first to offer drilling & completion reports (only in Southern California, USA).
• 1924 Everett De Golyer discovers first oil field using single-fold seismic data, Nash salt dome, Brazoria County, Texas, USA.
• 1928 Petroleum Information Corp. (PI) started in Denver, Colorado, USA as weekly drilling report for Rockies only.
• 1934 Production reports1 offered (Texas Panhandle) by company that became Dwight's Energydata.
• 1945 Electronic Numerical Integrator and Computer (ENIAC) invented at University of Pennsylvania, USA, first fully functional electronic calculator.
• 1951 Ferranti Mark I, first commercially manufactured computer, is installed at Manchester University (UK).
• 1953 First commercial digital computer (IBM).
• 1955 Production information available in digital form (Lockwood).
• 1956 Petroconsultants S.A. starts monthly scouting report service in Cuba; expanded to world outside North America in 1962.

Computing eras: vacuum tubes, then transistors. Storage media of the period: paper and film; punch cards and paper tape; magnetic tape; drum memory; core memory; hard disk storage; solid state memory.

1. Production reports typically include data on monthly and cumulative production, lease data, total depth, completion dates and number of wells on the lease.
2. Scout reports, or "tickets," typically include all drilling-related data from the time the well is permitted through to completion.
Computing for Oil—A Short History of Data in the Petroleum Industry
Over the past 40 years, the life of the geoscientist has been changed largely by advances in information science—how data about the earth are acquired, processed, stored, retrieved and analyzed. Before computers, prospects were found mainly by surface reconnaissance and crude seismic surveys. This required a tremendous number of people, to both acquire and sort through data. Armchair exploration using commercial data services—scout tickets and other well data—became possible in the 1930s, but not practical until the 1950s, and was labor-intensive until the 1970s.

The first use of computers in the petroleum industry was far removed from exploration. In the late 1950s, computers proved their mettle in helping solve differential equations describing the flow of oil through the reservoir. An early application for exploration was analysis of a small number of grids for 2D reservoir modeling. Computational power jumped a level in the early 1960s, when discrete transistor technology arrived. For reservoir modeling, this allowed adding a third dimension and increasing the number of grids, which improved resolution. But, more significantly, for the first time earth properties with nonlinear characteristics, like relative permeability and capillary pressure, could be modeled.

Computers moved from novelty to commercial popularity with the introduction of the IBM 360 series in 1964, and geoscientists began moving data from bookshelves and file cabinets into computers. They built well files that had mainly raw log data, but also some interpreted data. To evaluate a prospect, they would take a seismic section and correlate it by hand to the sonic log. There was no automated correlation, and the computer well files mimicked the paper well files from which they descended. Later, computer filing grew more sophisticated and data bases evolved within each discipline: geology, geophysics, drilling, petrophysics and reservoir engineering. By the 1970s, it was standard to manually tap information from these separate computer data bases. Now, exploration didn't rely mainly on expensive field reconnaissance. It
Evolution of Petroleum Data Management, continued: 1960 to 1990.

• 1962 First digital scout2 reports (PI).
• 1971 Intel invents microprocessor.
• 1972 Scout reports available nationwide [USA] (PI).
• 1973 First commercial satellite telecommunications of log data. International well and concession data supplied on computer tape (Petroconsultants).
• 1976 Delivery of first Cray-1 super computer.
• 1978 National Production System (NPS), a common format for production data and histories, established by 13 major oil companies. First on-line system for conveying production data worldwide via phone lines (Dwight's).
• 1987 RISC technology becomes commercial.
• 1989 Development of International Relational Information System (IRIS 21), relational data base for E&P data (Petroconsultants). Raster images of entire logs on diskette, optical disk or magneto-optical disk (PI).
• 1990 Scout information on CD ROM (Dwight's).

• Early 1960s: Emergence of 2D multifold seismic data.
• Mid- to late-1960s: Wireline logs and seismic data on paper, computer tape or cards. First capability to store and print multiple copies of logs.
• Early 1970s: Intergraph first company to adapt interactive computer display technology for oil industry application.
• Late 1970s: Gulf Oil's ISIS (Interactive Seismic Interpretation System), first on-screen integration of two disciplines: seismics and wireline logs.
• Late 1980s: Interactive color graphics capability available for ~$20,000, making it commercial for the first time.

Computing eras: integrated circuits, then microcircuitry. Storage media of the period: magnetic tape; drum memory; hard disk storage; solid state memory; video; floppy disk storage; mass storage unit (IBM, 1976); optical storage.
could also be performed in the computer room with purchased data.

These first discipline-specific data bases were hierarchical and so only relationships between data sets anticipated by the author of the hierarchy could be considered. The production data base, for example, might have been organized by lease block only, not by hydrocarbon type, production volume or completion date. If you wanted to map productivity from the Wilcox formation, you would have to manually identify the Wilcox in each well in the petrophysics data base, then compare that manually with each well in the production data base—and the two data bases might have been in different buildings, on separate computers and using different software.

These data bases were part of the established isolation of disciplines. By and large, the production engineer never talked with the exploration engineer. They worked for the same company, but belonged to separate organizations, with separate cost control points, separate budgeting issues and often different management points that converged only at the level of chairman. Today, this separation has largely disappeared, due in part to a force from outside the industry.

In the 1970s in California's Silicon Valley, the first papers were published on relational data base technology, and by the early 1980s it was becoming the rage in the computer industry.1 Although searches on relational data bases were initially slower than on hierarchical ones, the idea of searching by "relations" fostered a new way of thinking about database design—you don't need to anticipate all the relations between data when designing the data base. Through the 1980s, this technology trickled into the oil industry, which had become thoroughly computerized. Relational technology made possible data integration across disciplines—core and log porosities, for instance, were combined in the same data base. Slowly,

1. The seminal paper that started relational technology was by Edgar Codd of IBM's research laboratory in San Jose, California, USA: Codd EF: "A Relational Model of Data for Large Shared Data Banks," Communications of the ACM 13, no. 6 (June 1970): 377-387. An update on this work is Codd EF: The Relational Model for Database Management. Reading, Massachusetts, USA: Addison-Wesley Publishing Company, 1990.
replaced the brontosaurus mainframe—powerful, but cumbersome and expensive. Users are taking several roads. A handful emphasize mainframes, some are going completely to workstations, and the majority are trying to mix workstations and mainframes. The consensus seems to be that mainframes will not disappear, but their role may change from data analysis to data archiving and control. The main exceptions are reservoir simulation and seismic data processing, which benefit from being performed with conventional microcomputers linked in parallel or with supercomputers.

Urgently needed is a reliable link between mainframes and workstations, since most oil companies have both technologies and need a way to get the most from each. Many companies have assembled in-house links that operate with varying degrees of efficiency and reliability (right). Often, however, mainframes and workstations cannot communicate easily, if at all.

Much work remains before the two technologies can be married in a standard, nonproprietary way. Meanwhile, a leading effort for a proprietary linkage is a project organized by International Business Machines (IBM) and consisting of IBM and a consortium of oil companies. The project is an attempt to develop a large IBM mainframe-based relational data management system with a mechanism for workstations to upload and download data (see "Database Development," page 49, for discussion on data bases).

The data base is unique in that it will be several times larger than any existing oilfield data base, able to hold information from several fields. Because this system is the first Boeing 747 in an industry that has so far evolved only to the Piper Cub, IBM faces several new challenges. A technical challenge, given the size of the data base and nature of its contents, will be a design that permits establishing meaningful relations between data. How the mainframe will communicate with workstations is also uncertain. Many oil companies and workstation vendors want IBM to design the system to be compatible with their own systems. This would simplify, for example, networking, checking of data in and out of the mainframe and data representation.

The shift to workstations also affects people doing E&P work. As oil companies try to maximize their resources, managers welcome workstations for their economic as well as technical benefits. Buying and maintaining workstations is significantly less expensive than mainframe computers. Additionally, workstations let geoscientists take greater responsibility for data organization and quality, formerly the work of data managers or geotechnicians.

The Battle for Computing Standards

Linguists estimate that when Christopher Columbus landed in the New World 500 years ago, Native Americans spoke 1000 separate, mutually unintelligible languages. Today's E&P geoscientist, casting around for a data system, may feel like Columbus (see "Misconceptions about Data Management," next page and "Matching Needs and Capabilities," page 47). Among the numerous standards, only a few are recognized conventions—and sometimes within these are several incompatible versions.

The current generation of exploration software consists mostly of workstation- and PC-based, stand-alone applications and

■Transfer of data by links between pairs of computers (diagram labels: Mainframe; Log Analysis; Mapping; Seismic Interpretation; Application programs; Application/data interface; Data files/data base). The advantages of this approach are that it does not hinder creation of new application programs; it isolates the impact of change; and it allows tight integration of programs within a discipline. The disadvantages are that it does not provide integrated access to data, requires transfer software for each link pair, and makes tracking of data history and consistency difficult. The names of the programs are taken as examples; they could be any type. (From Guthery et al, reference 3.)
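The combinatorics behind both complaints, the caption's transfer software for each link pair and the Babel of incompatible standards, can be made concrete: with n applications, point-to-point links need a converter for every pair, growing as n(n-1)/2, while a shared exchange format needs only one reader and one writer per application. Application names below are invented:

```python
from itertools import combinations

apps = ["mapping", "log_analysis", "seismic_interpretation",
        "reservoir_simulation", "petrophysics"]

# Point-to-point: one converter per pair of applications.
pair_links = list(combinations(apps, 2))
print(len(pair_links))        # 10 converters for 5 applications

# Shared exchange format: one import and one export per application.
hub_links = 2 * len(apps)
print(hub_links)              # also 10 here, but grows linearly, not quadratically

# At 50 vendors (the article's count of E&P suppliers), the gap is stark:
print(50 * 49 // 2, 2 * 50)   # 1225 pairwise converters versus 100
```

This is the arithmetic argument for the industry-standard exchange formats the article describes.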
holes appeared in the walls separating types of data and separating E&P organizations.

This incremental change was given a boost, ending disciplinary isolation, by the oil price collapse of 1986. Suddenly, fewer resources were available to solve a growing number of E&P problems. Management consequently had to revise business strategies, and one practice that emerged was the interdisciplinary team. Today, the idea of the geophysical team solving a problem on its own, independent of the geologist and reservoir engineer, is almost nonexistent. But before the interdisciplinary teams can work efficiently, a final challenge remains—finding a way to join data from different disciplines.

Again, the roots of the solution came from outside the industry. From the ashes of the price collapse rose a few companies offering relational database technology for the petroleum industry, among them, Ingres Corporation (purchased by Ask Computer Systems Inc.), Oracle Systems Corp. and Software AG. These data bases enabled geoscientists to relate data that previously had not been recognized as related, and allowed users to define the relationship between the data. For example, it allowed log values to constrain a seismic inversion.

Along with the introduction of this new capability, in the late 1980s, was the declining cost of storage media, which increased the amount of on-line information. And the more information at the geoscientist's fingertips, the more obvious the need for a way to organize it and make it accessible between different hardware and software configurations.

This brings us to the present, and the efforts of standards groups to define an open systems architecture that is acceptable to both vendors and clients (see pages 46-48 of main text). The goal now is to make all forms of data available to any geoscientist working on any workstation, seamlessly and effortlessly.

For further reading:
Augarten S: Bit by Bit: An Illustrated History of Computers. New York, New York, USA: Ticknor & Fields, 1984.
Computer Basics: Understanding Computers. Alexandria, Virginia, USA: Time-Life Books, 1989.
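The Wilcox example in the sidebar, manually matching wells between a petrophysics data base and a production data base, collapses to a single join once both tables live in one relational store. A minimal sketch with invented table and column names, using SQLite as a stand-in for the relational engines of the day:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE petrophysics (well_id TEXT, formation TEXT, porosity REAL);
CREATE TABLE production   (well_id TEXT, monthly_oil_bbl REAL);
INSERT INTO petrophysics VALUES
  ('W-1', 'Wilcox', 0.22), ('W-2', 'Frio', 0.18), ('W-3', 'Wilcox', 0.25);
INSERT INTO production VALUES ('W-1', 4200.0), ('W-2', 1500.0), ('W-3', 6100.0);
""")

# Relate the two disciplines without any pre-built hierarchy:
rows = db.execute("""
    SELECT p.well_id, p.porosity, q.monthly_oil_bbl
    FROM petrophysics p JOIN production q ON p.well_id = q.well_id
    WHERE p.formation = 'Wilcox'
    ORDER BY p.well_id
""").fetchall()
print(rows)  # [('W-1', 0.22, 4200.0), ('W-3', 0.25, 6100.0)]
```

The point is not SQLite itself but that the relation is expressed at query time, not designed into a hierarchy beforehand.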
their proprietary data bases.3 While these tools are powerful within their own realm, they can neither talk to each other nor easily share data. Routine analysis usually requires several software applications, often from different vendors, and the user must manually reformat the output from one application before loading it into the next. To make matters worse, applications often reside on different computer systems, so data must be moved from system to system, and in many cases must be rekeyed when data conversion is not performed automatically. Even if two systems can be linked, data sharing is often slow because the underlying proprietary data base is structured to run only its specific application as fast as possible.

This lack of communication emerged because software developers seek speed through proprietary solutions. Furthermore, until recently, both oil companies and vendors tended not to share solutions because of the perceived risk of revealing proprietary algorithms.4 The direct effect is a higher cost of software engineering—each software vendor must design user interfaces, graphics standards and database architecture. And this is often at the expense of analytical capability. The indirect effects for the user are less efficient use of software programs, higher training and support costs and time-consuming data management.

Geoscience computing entered the 1990s with a lack of standards—such as for user interface, data exchange and graphics. Fortunately, by the mid-1980s, the computer industry at large had recognized the futility of a proliferation of standards, and that without a cooperative effort, everyone would lose their competitive advantage. Standards groups sprang up for operating systems,5 quarter-inch tape manufacturers and

Misconceptions about Data Management

• Buying a data base or data management system is enough. Buying a system is the beginning, not the end, of problem solving. Once a system is acquired, there are other issues to confront: How are data converted and loaded? Who will perform quality checks before data are loaded? What will be required to ensure that the new system interfaces with existing systems, or other systems that might be acquired in the near future? What kind of training is available? What are the maintenance agreements?
Loading a large data base is a major task. In the early 1980s, Exxon Production Research built a seismic interpretation data base. Locating, converting and loading data for a relatively small project—100 seismic lines—took months. During database construction, a geoscientist working by hand could finish an interpretation before another geoscientist awaiting completion of the data base. But after the data base was complete, mapping took half as long.

• All types of information can be dealt with the same way. Certain systems may improve productivity only for certain tasks. Most information systems are good at only one or two tasks. One oil company bought a mapping system that was designed for cultural data used by city planners and highway engineers. They found, however, that well data were so dense that data "overposting" produced an uninterpretable image on the screen. The vendor was not responsible for the problem—its system was working as intended. The oil company had to donate to the vendor millions of dollars in software engineering to solve the problem.

• If data are in the data base, they must be right. Many geoscientists can tolerate working with paper copies of data that contain errors. They perform quality assurance on the fly, or compensate in their interpretation. But once data become digitized from paper to an electronic file, the perception is that the data must be correct. Examples of errors found in vendor data include February having 30 days, an offshore platform with the Kelly bushing at 2000 feet, and a 1000-foot well with TD at 5000 feet. In the future, much of checking for plausibility will be automatic.

• Just having a geographic information system (GIS) or data base will increase productivity. The learning curve for workstations can be on the order of months to years. One seismic workstation vendor's own survey found that less than 1% of the users of its system ranked themselves as "experienced," even after eight years. One reason is that understanding the nuances of workstation operation takes time. Another reason is that most users are occasional users, who tend to learn the machine well enough for their job but not to the point of mastery. The learning curve is shrinking, as new machines become easier to use, but they still require some computer background. An average user of the FINDER system, for example, can get the basics in about one week—given a background in UNIX and SQL and some experience with ORACLE data bases.
A second part of this misconception is that the only advantage of a GIS system is the ability to work faster. True, the geoscientist can spend less time looking for data and therefore produce an interpretation faster. But an often overlooked advantage is that a more thorough interpretation can be performed in the same amount of time as when done on a mainframe.

• Information management is somebody else's job. Anyone using a computer that is not a stand-alone system is managing shared information, and information management is part of his or her job. This attitude helps prevent development of an adversarial relationship between people using an information system and people managing it. Every explorationist is responsible for the quality of data that go into the interpretation. Data loading and quality assurance may be

3. Guthery S, Landgren K and Waagbo K: "Integrated Data Access for Geoscience Interpretation Systems," GEOBYTE 5, no. 5 (October/November 1990): 38-41.
4. Schwager RE: "Petroleum Computing in the 1990s: The Case for Industry Standards," GEOBYTE 6, no. 1 (February/March 1991): 9-13.
5. An operating system is a software family that lets information flow through the computer as fast as possible but in an orderly fashion. It also serves as a translator or interpreter for the user. A user doesn't talk to a computer, but to an operating system, which commands the hardware to fulfill the user's request. Tasks
rect—“otherwise, somebody wouldn’t have put subcontracted, but the explorationist is respon-
of an operating system include: it in the data base.” It is valuable to recognize sible for correctly placing the wells, scaling the
•Controlling movement of data into and out of com-
puter memory
that data are data, no matter the form. A valu- logs and picking the formation tops. It is in the
•Supervising operation of the central processing unit able component of the data base is a record of
•Sending data to peripheral devices, such as monitors who loaded and checked the data.
and printers
•Keeping data and programs uniquely identified. Checking of data requires not only attention to
In oilfield computers, common operating systems are detail but a wide-ranging imagination. Exam-
UNIX, Disk Operating System (DOS), and Virtual
Memory System (VMS).

45
best interest of the company if the explorationist's needs and understanding of the data are clearly communicated to people managing the data base.

• A data base is an application. Workstation application products, such as for log interpretation or mapping, have an underlying internal data base that is largely invisible to the user. Consequently, the concept of a data base is often not appreciated. The data base itself is not an application but a reservoir of information accessed through an application.

• A central (or a local) data base is preferred. The conflict in the choice of central vs. local is between the need to freely make changes in the data and the need to prevent changes that destroy the accuracy of data. Central data bases have higher security, but can inhibit interactive interpretation. One solution to these conflicting needs is the use of a database management system that facilitates the automatic creation of local data bases, devoted to the study area. This allows the geoscientist to manipulate a subgroup of data, without altering the integrity of the central data base.

• Graphics capabilities are of foremost importance. Good graphics capability in a workstation is only one factor in its overall performance. Often overlooked in judging the day-to-day practicality of a workstation is ease of data loading—not something vendors rush to demonstrate on the convention floor. Nevertheless, finding and loading data take a large chunk of the geoscientist's time, and the easier the loading, the sooner data analysis can begin.

• Speed is of the essence. Speed is important in an application, but less so for a data base. Database software and hardware can be tuned to a specific application, increasing database speed 20 to 200 times over that of an off-the-shelf product. Also, workstation hardware is likely to be upgraded before workstation software, so that in two years, there is a good chance the user will be running the same software—or an updated version—on hardware that is twice as fast.

[Diagram: the Subsurface Model at the center, fed by four surrounding disciplines—Geopolitical Data; Well Data (logs, cores, cuttings); Seismics (surveys, cross sections); and Production Data (barrels of oil/day by year, 1982-1990).] A central part of an E&P data model is the subsurface model, which is used to guide the search for, and development of, petroleum reserves. The subsurface model is developed with information from the four surrounding disciplines. (Adapted from Berkhout et al, reference 6.)

database programs. By the time the problem of multiple standards surfaced in petroleum computing, the broader computer industry's efforts could be used as a guide.

As geoscientists mix interpretation from the four major disciplines—geology, geophysics, petrophysics and reservoir engineering—a foremost need is for a standard exploration and production data model: a system for organizing and defining geoscience information and interrelationships between data to model the reservoir in a way that fulfills a need (above).6

A typical question in assembling a data model is the definition of a so-called major element. For example, is (A) a seismic line part of a seismic survey or is (B) a seismic survey part of a seismic line? A computer system using a model that organizes data as (A) cannot easily exchange information with a computer system using a model that organizes data as (B).

Flaws in data models often become apparent when a type of data does not fit the model or when relations between data cannot be accommodated by the structure of the model—for example, horizontal wells may require new ways of computing log data, such as varying the measured value with horizontal distance rather than vertical depth. A data model may have nothing to do with computers. Computers provide only a representation of that model, such as a relational or hierarchical database structure. Because there are so many ways to define oilfield data and relations between them, oilfield data models are complex, attempting to cover as many phenomena and data permutations as possible. Through the 1980s, the move toward integrating data across disciplines made it clear that a cooperative effort involving vendors and users was necessary to unify the field, or at least reduce the number of geoscience data models.
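The line-versus-survey question is easier to see in miniature. Below is a sketch in Python with sqlite3 — every table, column and identifier is invented for illustration, not taken from any actual E&P data model:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Model (A): the survey is the major element; each line points to its survey.
con.execute("CREATE TABLE survey_a (survey_id TEXT PRIMARY KEY)")
con.execute("CREATE TABLE line_a (line_id TEXT, survey_id TEXT)")
con.execute("INSERT INTO survey_a VALUES ('S-90-1')")
con.executemany("INSERT INTO line_a VALUES (?, ?)",
                [("L-1", "S-90-1"), ("L-2", "S-90-1")])

# Model (B): the line is the major element; each survey record points to a
# line, so loading the same facts repeats the survey once per line.
con.execute("CREATE TABLE survey_b (survey_id TEXT, line_id TEXT)")
con.executemany("INSERT INTO survey_b VALUES (?, ?)",
                [("S-90-1", "L-1"), ("S-90-1", "L-2")])

# The same question needs a different query against each model.
lines_a = [r[0] for r in con.execute(
    "SELECT line_id FROM line_a WHERE survey_id = 'S-90-1' ORDER BY line_id")]
lines_b = [r[0] for r in con.execute(
    "SELECT line_id FROM survey_b WHERE survey_id = 'S-90-1' ORDER BY line_id")]
print(lines_a == lines_b)  # True
```

Both models can answer the same question, but a system built around one cannot simply copy rows from the other: the relationship has to be inverted during exchange, which is why mismatched major elements make data exchange hard.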
Matching Needs and Capabilities
Questions to Ask When Selecting an E&P Information System

A chief cause of dissatisfaction with information systems—from data bases to integrated interpretation packages—is that the buyer had neither the time nor expertise to fully evaluate the capabilities and limitations of the system. Answering several questions while shopping can help the buyer clarify his needs and find the product that best satisfies them.

• What is going to be the present and future use of the system? Who are the users?
• What do I want to store and why do I want to store it that way?
• Does the data base have facilities that permit easy transfer of data between different data bases—between the corporate, basin and project data bases?
• Can the data base exchange information with software applications we use?
• Can the data base connect with any Structured Query Language (SQL) data base or product, such as spread sheets or geoscience applications?
• What is the size of the business unit I will be working within—all the wells in the Adriatic? The Niger delta? The Eastern Hemisphere?
• What is the form of data to be used—a monthly report? A weekly report?
• How is information accessed? Are privileges limited or unlimited? Do users need simultaneous access to the same data? Will I need a systems manager?
• Where will information be stored—centrally, locally or regionally? Will I need one big disk drive?
• Will workstations be networked together? If so, is network software available and how well does it perform?
• What platform(s) will the vendor support, and when?
• What data input formats does the vendor support? Are they compatible with those that I have or am likely to get?
• How will I load the various types of data that I expect to put on the system? How does the system handle paper data?
• Do I need satellite or phone links?
• Will it interface with existing internal systems (for lease administration, production accounting, asset accounting, etc.) or systems we plan to acquire? On what standards is the system based?
• What is the upgrade path for this system? In two years will I feel stuck with this system, or will it be able to grow to meet future needs?
• Can the data base be easily customized to fit the needs of different operating groups?
• How thorough and helpful is the vendor's user support?

6. For an example of an E&P data model, see Berkhout AJ, Smeets G and Ritsema I: "A Strategy for Developing a Standard E&P Data Model," Geophysics: The Leading Edge of Exploration (September 1991): 33-37.
7. Turner D: "Petrotechnical Open Software Corp.—A Way Forward," GEOBYTE 5, no. 5 (October/November 1990): 36-37.
8. Wilson DC and King JW: "An Integrated Geoscience and Reservoir Engineering Software System," Advances in Reservoir Technology: Characterization, Modeling & Management, Royal College of Physicians, Edinburgh, UK, February 21-22, 1991.

The IBM effort at linking mainframes and workstations depends ultimately on developing a satisfactory data model. Two more sweeping efforts have been initiated by the Petrotechnical Open Software Corporation (POSC), based in Houston, Texas, USA, and the Public Petroleum Data Model (PPDM), based in Calgary, Alberta, Canada. Both groups are not-for-profit organizations.

The mission of POSC is to "...define, develop and deliver, through an open process, an industry standard, open systems software integration platform for petroleum upstream technical computing applications." The PPDM seeks to "create an industry standard means of assembling, remembering and communicating data related to the petroleum industry."

POSC, founded in October 1990, is a corporation funded by 35 companies from Europe and North America representing the E&P industry, oilfield services, software and computer manufacturers and government research establishments. POSC acts as a kind of United Nations for geodata standards, soliciting computer-based technology from across the industry to be adopted in whole or part in a POSC "software integration platform." To date, 20 companies have submitted 34 models for consideration by POSC.

The POSC platform constitutes an E&P data model, a user interface with a "common-look-and-feel," software offerings and a set of specifications and reference codes that will allow vendor products to be compared with selected industry standards.7 Its first release—a model for data exchange and user interface—is scheduled for the third quarter of 1992. The group's $6.5 million budget funds a full-time staff of ultimately 44, and is supported by member companies expected to number 40. Although the group currently operates solely from Houston, its board of directors has authorized the opening of a second office in the UK.

The success of POSC is linked to the emergence of the so-called open systems environment: software and hardware design that permits any application to run on any hardware.8 Although "open systems" is on the lips of many software engineers, the oil industry has reason for some resistance against it—a large investment in proprietary software and hardware. POSC sees part of its job as showing how these existing systems can be adapted to talk with each other and with new systems.

POSC cites several advantages to the open systems environment. It is in line with the oil company trend of buying and modifying off-the-shelf software, rather than developing it in-house, which has become too expensive. It also prevents "box lock"—when the time comes to retire a computer, the replacement won't necessarily tie the user to a single hardware supplier. POSC expects that these advantages will outweigh one drawback that may be associated with open-system architecture, compromised speed. Proponents of open systems maintain that this problem will be minimized by hardware advances—an often quoted figure is doubling of computational speed every 12 to 18 months.

The PPDM group, founded in 1989, is a volunteer organization currently of 20 pay-
ing subscribers comprising oil companies and vendors of applications and hardware.9 Membership is $900, which is used by the group for printing and other operational expenses. There are no paid employees.

The PPDM is a standardized, nonproprietary database format that serves as a framework on which to build petroleum computer application programs. Like POSC's model, the PPDM is released in stages. The first database format, called a schema, was released in April 1990. Subsequent versions are revised, based on user feedback, and released periodically. The model currently covers exploration and production data, including production and core analysis, openhole pressure and drillstem test data. Plans call for pipeline, reserve and economic data, lease information (expiration dates, price paid) and seismic and log data.

Since its first release, the PPDM has become the de facto standard in Calgary. Only a small percentage of purchasers of the model are from outside Canada. In hopes of expanding the model's appeal, parts of it have been submitted to POSC.

The proliferation of workstations has stimulated demand for other kinds of computer standards for petroleum applications. Here are highlights of a few.10

• User interface. This is perhaps most in need of a standard. The only standards that exist today govern the structure of what things look like on the screen. An important standard for the oil industry in this respect is OSF/MOTIF, which gives a common look to the screen. Each window has a similar configuration and border; it is moved around the screen and closed in the same way; it has buttons that the user clicks on to operate certain functions. Programmers have called for an application style guide, defining standards for screen layout, menu bars, dialogue boxes and data input fields. Standards are also needed to allow the user to select a project and automatically start required applications and access data bases.11

• Data conversion. The first generation of workstations was cumbersome because data often had to be converted to a format acceptable to the program, which was often specific to the workstation. This extra step is removed in the emerging generation of workstations, which are capable, for example, of reading standard log tape formats.
  A subset of the data conversion problem is data compression, particularly of seismic data. Some precision is lost in going from 32-bit to 8-bit seismic data. In some workstations, compression changes seismic amplitudes, resulting in anomalies in amplitude mapping.

• Data definition. Trouble is in the making when a user tries to integrate data from two systems that do not use coincidental definitions of fundamental concepts. For example, is bottomhole temperature static or circulating? Is porosity log- or core-derived? An effort to address this long-standing problem is the Petroleum Industry Data Dictionary (PIDD). The PIDD has been assembled over the past few years by the American Association of Petroleum Geologists (AAPG) working with representatives of 17 companies and 2 United States government agencies. Although the effort is entirely US-based, the dictionary is intended to standardize industry terms—what things are called, and how they are named in the data base. The 2448-term glossary is being serialized in the petroleum computing magazine of the AAPG, GEOBYTE, and is available for $10 on diskette.12 The next step for the PIDD is creation of a standard number of bytes—physical computer characters—for data referenced by the glossary terms.

• Data sharing. As oil companies search for the best workstation, they often end up with four, five or more mutually exclusive platforms. Proprietary programs are written to link these resources with each other or with the mainframe. These links are often highly specific and expensive to develop and maintain (page 44). An alternative approach to connecting applications has been developed, called the GeoShare data exchange standard. It is not a pair-wise link, but a bus to which any number of applications can be connected. In order for an application to make a connection through the GeoShare standard, the application developer must write a link between the application's proprietary data base and the GeoShare standard. The standard is based on the American Petroleum Institute (API) standard for data exchange known as Recommended Practice 66 (RP 66) or Digital Log Interchange Standard (DLIS).13

The first release of the GeoShare standard, in spring of 1991, is a program and blueprint for a two-way bridge connecting applications with each other, with a data base or both, regardless of platform (next page, top). The specification is being written by GeoQuest Systems, Inc. and Schlumberger.

The GeoShare system is analogous to the clipboard in the Macintosh environment. To move text from one program to another, you can move it temporarily into a clipboard. The GeoShare standard largely defines the format of data that can go in the clipboard. But it is more complex. The Mac simply has to define each letter in each word. In the GeoShare standard, however, the sender and receiver have to agree not only on the definition of a letter or number, but on the definition of a fault, horizon or a log. It therefore conducts not only data but a dictionary of words to define the data in a singular way. Later, the GeoShare standard is expected to include graphics interface standards that will permit shipping images between different workstations.

Several major oil companies believe they can utilize the GeoShare standard to achieve data integration approximating the "tight" integration obtained with a common data base (see "Some Data Management Strategies," page 51). By configuring their systems as shown on the next page, they can move study data from the common data base to the selected workstation as needed and therefore keep only one master copy with the most current interpretation. Upon completion of analysis on the workstation, the results are then moved from the workstation into the common data base. The sooner interpreted data are moved back into the common data base, the less chance for confusion over the location of the correct version of the interpretation.
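The clipboard-plus-dictionary idea can be sketched as a self-describing message. This is only an analogy — the field names and validation below are invented and bear no relation to the actual RP 66/DLIS encoding:

```python
# A hypothetical exchange message: the sender ships a dictionary that
# defines each record kind together with the records themselves, so
# sender and receiver agree on what a "horizon" (or a log, or a fault) is.
message = {
    "dictionary": {
        "HORIZON": {"name": "str", "depth_m": "float"},
    },
    "records": [
        ("HORIZON", {"name": "Top Pierre", "depth_m": 2310.5}),
    ],
}

def receive(msg):
    """Validate each record against the dictionary shipped with it."""
    out = []
    for kind, fields in msg["records"]:
        spec = msg["dictionary"][kind]        # an unknown kind raises KeyError
        if set(fields) != set(spec):
            raise ValueError(f"fields do not match definition of {kind}")
        out.append((kind, fields))
    return out

print(len(receive(message)))  # 1
```

Because the definitions travel with the data, sender and receiver need agree only on the envelope format, not on each other's internal data bases — the "bus" idea described above, rather than a pair-wise link.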
Data Integration

[Diagram: Geologic Modeling, Log Analysis, Seismic Interpretation, Seismic Modeling, Lease/Contract Administration and Production/Accounting applications joined by read-and-write half-links to the GeoShare Data Exchange Standard, surrounding a Shared Data Base; Mapping and Cross Sections connect directly to the data base.] Data integration today, before industry-wide adoption of a common data model. Applications can talk with each other or a central, shared data base through the GeoShare data exchange standard. This is a form of loose integration. Note that applications can talk with each other without having to pass through the shared data base, but at the end of an interpretation cycle, the interpretation can be downloaded to the shared data base. In this example, mapping and cross section applications bypass the GeoShare link and are tightly integrated with the data base.

A Data System Comparison

                         Public Library                    Oilfield Data System
  Data base              Books, magazines,                 Seismic, log, production
                         newspapers, microforms            and geopolitical data, etc.
  DB Manager             Librarian                         DB Manager or committee
  Quality control        Staff evaluates condition of      Geotechnicians, users,
                         holdings, arranging for repair    data base administrators,
                         of damaged items; correctly       or all three. Data are checked
                         reshelves holdings                for plausibility, correct scaling
                                                           of logs, placement of wells,
                                                           formation top picks, etc.
  Database management    Dewey decimal system              Relational tables
  system                 Card/computer catalog             Structured Query Language
                         Holdings checkout                 Central processing unit
  Hard copy device       Photocopying machine,             Printer, plotter
                         printer

9. Rhynes PJ: "PRISM: Petroleum and Resource Industry Information Storage and Management," GEOBYTE 5, no. 5 (October/November 1990): 31-35. The PPDM was founded by Applied Terravision Systems, Digitech Information Services Ltd., FINDER Graphics Systems, Inc. and Gulf Canada Resources Limited.
10. Nation L: "Data Standards Work Progresses," AAPG Explorer (August 1991): 9.
11. Schwager, reference 4.
12. "Petroleum Industry Data Dictionary—Part 1," GEOBYTE 6, no. 3 (February/March 1991): 17-32. The rest of the dictionary will be published in three subsequent issues. For diskettes of the dictionary, contact Pam Koscinski, Dwight's Energydata, Inc., P.O. Box 270295, Oklahoma City, OK 73137. Phone: (1) 405-948-7008; fax: (1) 405-948-6053.
13. Although the name DLIS suggests that the format is good only for logs, it is in fact appropriate for exchange of any kind of scientific data.
14. Gillespie JG: "Database Strategy for an Exploration Workstation," presented at the ASEG/SEG International Geophysical Conference and Exhibition, Adelaide, Australia, February 17, 1988.

Another approach to data sharing is the OpenWorks program, being developed by Landmark Graphics Corp. There are several differences between OpenWorks and the GeoShare standard. Whereas the GeoShare standard can be viewed basically as a bridge, OpenWorks is a link plus an attempt to establish a unified user environment, a database management structure and interprocess communications. Although OpenWorks has been made available to third-party software developers, its architecture optimizes Landmark applications.

Database Development

A data base is a collection of information, and a database management system is an organizing system for storing, accessing and revising information that reflects the needs of the user (see "A Data System Comparison," above, right). The key idea is that the database organization satisfies a basic need; otherwise it's just an unsorted pile of information. Structure of the data base must also accommodate the data model.

The first computer data bases were simply electronic means of storing what was in a filing cabinet. The main advantage was that the computer could locate data faster. Like the first generation of application software, oil companies mainly generated them in-house. Database technology has evolved to the point where the state of the art is multiple and networked systems, supplied by a vendor, capable of understanding the organization of different data bases on different machines and keeping a master list of everything stored.14 (See "Bringing Data Management into the 90s: The BHP Case Study," page 52.)
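That "master list of everything stored" can be pictured as a catalog recording where each data set lives, without holding the data itself. The machine and data base names below are made up for illustration:

```python
# A minimal master catalog for networked data bases: it maps each data
# set to the machine and data base that hold it.
catalog = {}

def register(dataset, machine, database):
    """Record where a data set is stored."""
    catalog[dataset] = (machine, database)

def locate(dataset):
    """Return (machine, database) for a data set, wherever it lives."""
    return catalog[dataset]

register("North Field logs", "vax01", "well_db")
register("North Field seismic", "sun07", "seis_db")

print(locate("North Field seismic"))  # ('sun07', 'seis_db')
```

An application can then ask the catalog for a location instead of knowing in advance which machine holds which data base — the essence of the networked, vendor-supplied systems described above.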
There are two main types of data bases, classified by the structure they use to organize information: hierarchical and relational (right). A rapidly evolving variation on these two themes, called object-oriented methodology, is just becoming commercial.15 Today, it operates mainly as an overlay on top of a relational foundation, but increasingly it may function as a separate approach to which relational, hierarchical or flat-file structures are subsets.

A hierarchical data base organizes data like the branches of a tree. To reach a certain leaf, one must always ascend the trunk and follow the same set of branches. Or in computer vernacular, a hierarchical system is a list from which an item is selected. This item refers to another list, and another list, and so on until the end point is reached. This system is well suited to data that are always used in a fixed structure. Information contained in well log headers, which is always presented in the same order and format, is often represented hierarchically. The advantage of this structure is its speed in reaching a particular item. A disadvantage is the absence of a shortcut to the end point. In a strict hierarchical system, one must always ascend and descend the same path.

A relational data base is more like a series of file drawers than a tree. Data items are loaded one at a time into lists and sublists, in a way that allows each item to be located independently. For example, a list of wells might be made up of one actual list of records, but indexed in several ways: by operator, by hydrocarbon type and by completion date. The relation between items is built with the query, which tells the computer how to order data coming out. Most commercial data bases are relational, with some sections arranged hierarchically and some in other formats. Two advantages of relational technology are the simplicity of its internal structure, and the ease with which this structure can be modified, with little or no effect on supporting software.

Relational database technology became practical in the mid-1970s, when IBM introduced Structured Query Language (SQL), which has become the de facto industry standard.16 This language enabled communication between data bases, or between applications and data bases, that adhere to the SQL standard.

Comparison of hierarchical and relational data base structures for the same information used in a drilling operation. To find wells intersecting a gas zone completed by XYZ company in November 1990, the programming interfaces would be as follows. (Adapted from Date CJ: An Introduction to Database Systems. Reading, Massachusetts, USA: Addison-Wesley Publishing Co., 1977.)

Hierarchical Data Base

  Field:        FIELD_NAME, FIELD_TYPE, REGION, well_ptr, nxt_field
  Well:         WELL_ID, OPERATOR, COMPL_DATE, zone_ptr, nxt_well
  Porous_zone:  ZONE_NAME, TOP, BOT, POROSITY, FLUID, nxt_zone

For a hierarchical data base:

  next_field:
      get first/next FIELD
      if not found then goto exit
  next_well:
      get first/next WELL where WELL.OPERATOR = "XYZ" and
          WELL.COMPL_DATE = "NOV 1990"
      if not found then goto next_field
      get first/next POROUS_ZONE where POROUS_ZONE.FLUID = "GAS"
      if found then print WELL.WELL_ID
      goto next_well
  exit:

Relational Data Base

  Field:        FIELD_NAME, FIELD_TYPE, REGION
  Well:         WELL_ID, OPERATOR, COMPL_DATE, FIELD_NAME
  Porous_zone:  ZONE_NAME, TOP, BOT, POROSITY, FLUID, WELL_ID

For a relational data base, a query interface might be: "Print WELL_IDs for wells intersecting gas zones completed by XYZ Company in November 1990."

  select unique Well.WELL_ID from Well, Porous_Zone where
      Well.OPERATOR = "XYZ" and
      Well.COMPL_DATE = "Nov 1990" and
      Porous_Zone.FLUID = "GAS" and
      Well.WELL_ID = Porous_Zone.WELL_ID
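The two programming interfaces above can be imitated in a few lines of Python: nested loops play the role of the hierarchical get first/next calls, and sqlite3 runs essentially the relational query (with the modern DISTINCT keyword in place of the older UNIQUE). The well and zone values are toy data:

```python
import sqlite3

# Hierarchical view: each field owns its wells, each well its porous zones.
fields = [
    {"wells": [
        {"WELL_ID": "W-1", "OPERATOR": "XYZ", "COMPL_DATE": "NOV 1990",
         "zones": [{"ZONE_NAME": "Z1", "FLUID": "GAS"}]},
        {"WELL_ID": "W-2", "OPERATOR": "XYZ", "COMPL_DATE": "NOV 1990",
         "zones": [{"ZONE_NAME": "Z2", "FLUID": "OIL"}]},
    ]},
]

# Descend the tree the same way every time: field -> well -> zone.
hits = []
for field in fields:
    for well in field["wells"]:
        if well["OPERATOR"] == "XYZ" and well["COMPL_DATE"] == "NOV 1990":
            if any(z["FLUID"] == "GAS" for z in well["zones"]):
                hits.append(well["WELL_ID"])
print(hits)  # ['W-1']

# Relational view: flat tables joined on WELL_ID at query time.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Well (WELL_ID TEXT, OPERATOR TEXT, COMPL_DATE TEXT)")
con.execute("CREATE TABLE Porous_Zone (ZONE_NAME TEXT, FLUID TEXT, WELL_ID TEXT)")
con.executemany("INSERT INTO Well VALUES (?, ?, ?)",
                [("W-1", "XYZ", "NOV 1990"), ("W-2", "XYZ", "NOV 1990")])
con.executemany("INSERT INTO Porous_Zone VALUES (?, ?, ?)",
                [("Z1", "GAS", "W-1"), ("Z2", "OIL", "W-2")])
rows = con.execute("""
    SELECT DISTINCT Well.WELL_ID FROM Well, Porous_Zone
    WHERE Well.OPERATOR = 'XYZ' AND Well.COMPL_DATE = 'NOV 1990'
      AND Porous_Zone.FLUID = 'GAS' AND Well.WELL_ID = Porous_Zone.WELL_ID
""").fetchall()
print([r[0] for r in rows])  # ['W-1']
```

Both paths find the same gas well; the difference is that the hierarchical code hard-wires the traversal order, while the relational query leaves the ordering to the database.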
Relational database technology still leads the way into the 1990s, but another approach, called object-oriented technology, is drawing attention. This is considered by some to be a new way to access data, but by others to be a layer that fits on top of a relational structure, allowing a new way to group data.

The definition of an object-oriented data base is still uncertain, but it is generally agreed that the basic unit in an object-oriented approach is an "object," which is managed as a single entity. An example is a seismic section. In a conventional relational data base, to call up a seismic section on the screen, the user locates and requests the various traces before the section can appear. In an object-oriented system, the user asks for the section by name or location, and the computer locates the data and applications needed to draw the section. Another example of an object would be all the well logs from a traverse across a reservoir, named, for instance, A-A'. By simply marking A-A' on the screen, the user can call up all the logs without knowing anything about the underpinnings of data in the data base. The logs are treated as objects. Some object-oriented products are coming out now, but the technology isn't expected to become significant until the mid-1990s.

Data bases have recently been challenged by growth in the amount and complexity of data and by demand for new connections between data types. Two notable developments to meet these challenges are graphics interaction and mixed-media storage.

In the last three years, advances in graphics interaction have enabled computers to emulate the way geoscientists think: in maps. About 50 companies today are making and selling mapping systems under the generic name geographic information system (GIS). Most of these first-generation systems are designed to handle 2D representations of the earth's surface for use, for example, in city planning and forestry projects. These GISs are limited, however, in that they cannot gracefully satisfy the demands of the E&P industry, which needs to apply GIS principles to model the subsurface.

The new generation GISs can model the subsurface in 3D and have a complete comprehension of E&P entities. They understand the data model used to store data in the data base, and can translate this into lines, points and polygons that allow a map to be built to represent the subsurface data. Old GISs cannot make this translation between the data model and map. Users of the new generation can create searches of the data base from a map. The new GIS shows on the map all wells that satisfy a query—such as gas wells the XYZ company completed in April 1990. By clicking on the well, the user can bring up text annotations, such as who logged the well, what logs are available, total depth and bottomhole temperature.

The latest advance in database technology—although still in commercial infancy—is mixed-media capability, which has developed in response to the demand for different kinds of data. Until recently, data bases were limited mainly to digitized data, such as seismic maps, log curves and tabular data. With mixed-media storage, any kind of information can be accessed—text, tables, raster images of downhole data, such as logs and core photos, as well as video and audio information. Interpretations can be stored as well as raw data.

With mixed media loaded into the data base, the geoscientist can draw a polygon around a set of wells and type a query, "show core information." A text track appears with the list of wells cored, or cored wells appear in a new color or a new symbol. Then the user can query again: "show Pierre formation," and up comes a core image, with thin section data and the core analyst's interpretation and notes—paleontology, sedimentary structures, grain description, and so on.

Advances have also taken place in the relational database capabilities of workstations. Some newer data management systems, such as that used in the FINDER system, have the capability to withdraw two kinds of data in one swipe: bulk data (sometimes called vector data) and parametric data. Bulk data are large chunks of data, composed of many smaller data items that are usually retrieved

user must ask separately for a log curve and header. The FINDER system, because it stores some bulk data as if they were single data items, allows the user to access both types of data with one command. Storing some data in bulk, such as well depths, permits faster access because the computer searches one already ordered entity, rather than searching each depth and having to impose order on it.

Some Data Management Strategies

Where data management stands today might be described as "loose" integration between platforms, and where it wants to go as "tight" integration. While the distinction is often blurred, there are some clear differences. The main difference is in the number of occurrences of a given data item. In a tightly integrated system, there is one data store; in a loosely integrated system there may be many. In general, today there is often tight integration of application programs within an existing platform of a single vendor. The long-term desire is for tight integration between platforms while, in the short term, there is a need for loose integration between platforms.

To illustrate the difference between tight and loose integration, consider two seismic interpretation systems. One is used for seismic structural interpretation, the other for seismic inversion. It may be possible to move a copy of the seismic section from the interpretation to the inversion program, perform an inversion, then move it back to the interpretation program to interpret an inverted (continued on page 54)

15. Brewer KE and Pritchard RJ: "Integrated Technical Computing of the 1990's," paper SPE 20359, presented at the 5th SPE Petroleum Computer Conference, Denver, Colorado, USA, June 25-28, 1990.
Rumbaugh JE, Blaha MR, Premerlani WJ, Eddy F and Lorensen W: Object-Oriented Modeling and Design. New York, New York, USA: Prentice-Hall, 1991.
Fong E, Kent W, Moore K and Thompson C: 1991, X3/SPARC/DBSSG/OODBTG Final Report. National Institute of Standards and Technology Draft, September 17, 1991. (This is a review draft of a study by the object-oriented database technology group. This landmark report states that object-oriented technology and methodology is ready to use now by government agencies. For copies, contact Elizabeth Fong, NIST, Building 225, Room A226, Gaithers-
face in three dimensions. These systems also in a specific order—for example, a sequence berg, Maryland 20899, USA. Phone: (1) 301-975-
have no understanding of the E&P data of values such as a porosity curve or a seis- 3250.)
needs. Users have to write software to con- mic trace. Parametric data are single pieces 16. Although IBM invented the SQL principle, its first
commercial implementation was in 1979 by Oracle
nect data descriptions of leases, seismic sur- of data, such as those included in a log Corporation.
veys, well logs and so on to the GIS’s recog- header: mud density, bottomhole tempera-
nition of lines, points and polygons. ture, field name, and so on. In most data
bases, bulk and parametric data are stored
separately and retrieved separately—the

January 1992 51
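The bulk versus parametric distinction described above can be illustrated with a minimal in-memory sketch. The field names and storage layout here are invented for illustration; the FINDER system's actual schema is not shown in the article.

```python
# Hypothetical sketch of the two kinds of retrieval described in the text:
# parametric data are single header values; bulk data are ordered sequences
# such as a porosity curve or a seismic trace.

log_header = {            # parametric data: one value per named field
    "well": "A-1",
    "mud_density_lbm_gal": 10.2,
    "bottomhole_temp_degF": 212.0,
    "field": "Pierre",
}

porosity_curve = [0.18, 0.21, 0.19, 0.22, 0.20]   # bulk data: ordered samples

def get_parametric(header, field):
    """Retrieve a single header value by name."""
    return header[field]

def get_bulk(curve, start, stop):
    """Retrieve a depth-ordered slice of a curve in one request."""
    return curve[start:stop]

print(get_parametric(log_header, "bottomhole_temp_degF"))  # 212.0
print(get_bulk(porosity_curve, 1, 4))                      # [0.21, 0.19, 0.22]
```

The point of the split, as the text notes, is that the two kinds of data are usually stored and retrieved separately: header fields answer ad hoc queries, while curves are fetched as whole ordered chunks.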
Bringing Data Management into the 90s: The BHP Case Study

In 1987, the Australian energy company, Broken Hill Proprietary (BHP), recognized the need to replace its Mapping and Geophysical Interpretation Computer (MAGIC), an in-house system for exploration and production file management. MAGIC was a suite of applications programs and a data store that was mainly a repository for seismic data. Its main limitation was that it operated only in batch mode, not interactively.

To assess its needs, BHP surveyed users throughout the company on requirements for software applications, data processing and data management. A significant finding was that users wanted closer interaction with their data. They complained of a lack of control of data interpretation and processing functions, and wanted data to be more accessible and more easily modified.

At the same time, the company recognized the industry trend toward workstations and user access from the desktop, which fit well with the user request. BHP also had the desire to store, catalog and make available E&P data of all types, not just seismic data.

Once BHP identified the company-wide requirements, it looked for systems to meet these needs. The field was quickly narrowed to the Vortext system from Aangstrom Precision Corp. in Michigan, the Exploration and Production Office System/Geoscience Information System (EPOS/GSIS) from the TNO Institute of Applied Geoscience in The Netherlands, and the FINDER data management system from FINDER Graphics

[Figure: BHP's exploration technical systems overview. The 1980s workflow, built around the MAGIC data base, mixed manual tasks (digitizing horizon picks, well completion reports, hand-contoured maps, drafting) with computerized tasks (log and well data bases, log displays and cross sections, base maps, computer-contoured maps). The 1990s workflow centers on the FINDER exploration data base, linked to seismic interpretation (Landmark Seisworks module), geological interpretation (Cogniseis Development DLPS interpretation package and Landmark Stratworks module), mapping (Zycor), computer-contoured maps, log displays and cross sections, and drafting (minor alterations).]
Systems Inc. in California. The FINDER system was found to have the greatest potential to satisfy BHP's requirements, and initial acquisition was made in 1989. Since that purchase, BHP has installed the system in offices in Melbourne, Australia; Houston, Texas, USA; London, England and Calgary, Canada.

Currently, BHP stores data in the FINDER system as basin-wide data bases. In the Gulf of Mexico, for example, the data base has about 7800 seismic lines representing 141,700 line miles [228,000 km]; 202,600 well locations with 91,300 scout tickets, and cultural, lease and other types of data that allow easy generation of base maps. In the Melbourne office, the Timor Sea data set comprises 5400 seismic lines representing about 123,000 line kilometers [76,400 miles], 200 wells, cultural, permit and interpretative data.

The hardware environments at BHP's two main sites are based on two different distributed computing models. In Houston, the FINDER system runs on seven stand-alone VAXstations with access via a network from 19 X Window System terminals, which are for graphics display. In Melbourne, there is a central file server with seven dataless VAXstations and X Window System terminals attached to the server. Because of the experience gained in these test configurations, future implementations will probably be based on a central server for each business unit with dataless workstations and X Window System terminals.

The working environment is flexible. In some cases, subsets of data are extracted as project data bases within the FINDER system. At other times, users who want regional context maps work directly with the entire basin's data set. Within this framework, the FINDER system provides much of the routine functionality that BHP had with its MAGIC system, plus the ability to interact with data. For capabilities not part of the FINDER system, such as seismic interpretation or geologic modeling, BHP has purchased packages from other vendors, including Zycor Inc., Landmark Graphics Corp. and Sierra Geophysics, Inc. BHP has also written routines to work with the FINDER system, such as a program that digitizes horizon picks marked on paper displays of seismic sections.

Through the FINDER system, BHP intends to integrate different types of data and data from many sources, and, most importantly, make those data available to the user via workstations and X Window System terminals. It is hoped that this flexibility will permit integration of geology and geophysics in ways either not possible in the past, or too difficult and, hence, impractical.

There have been several keys to BHP's success with the FINDER system. Early in the process, BHP acquired the FINDER Software Development Facility (source code for modifying the FINDER system) and assigned dedicated in-house support staff to the FINDER system. This has allowed BHP to work closely with the FINDER development staff and has given BHP the ability to add specialized features and functionality to the system.

Each business unit at BHP Houston has a data administrator whose sole responsibilities are loading data, performing quality assurance, and correcting and updating the FINDER data bases. The data administrators work with the explorationists on quality assurance, but the ultimate responsibility for the integrity of data lies with the explorationists.

Data loading and verification is an essential function that BHP finds to be the most problematic. Moving data from old data bases or magnetic tape into the FINDER system often requires reformatting. Even new data can require considerable manipulation because the organization and definition of data may vary. For example, commercial scout tickets from various vendors often don't agree on the name of the well operator. Vendors may furnish the name of the operator at the time of drilling or during production. Both are valid. The explorationist usually wants the operator's at the time of drilling, because well logs will use this name, whereas the production engineer usually wants to know the name of the operator of the producing well. Each well contains dozens of data fields with similar possibilities for ambiguity. With about 47,000 wells in just the offshore portion of BHP's Gulf of Mexico data base, the size of this problem can be daunting.

Even when each data field is well defined, data often have to be manually reorganized to fit the format specifications of existing data loaders. If this task is too large or cumbersome, custom data loaders must be written to convert the vendor data format into the database format. These loaders are written either by FINDER Graphics or BHP. Throughout the process, BHP draws on the expertise of its data administrators, who have learned from experience which data from which vendors or areas need special attention.

BHP's long-term goal is to simplify the everyday work flow, allowing professionals to be more productive and more creative, by reducing the time spent searching, accumulating, requesting and moving data.
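The custom data loaders described above can be sketched in a few lines. The fixed-width record layout and field names below are invented for illustration; real scout-ticket formats vary by vendor, which is exactly why such loaders must be written case by case.

```python
# Hedged sketch of a vendor-specific data loader: parse one hypothetical
# fixed-width scout-ticket record into named fields for database loading.
# Column positions and field names are illustrative, not from the article.

def load_scout_ticket(record: str) -> dict:
    """Convert a hypothetical 34-character scout-ticket line into a dict."""
    return {
        "api_number": record[0:10].strip(),
        "operator_at_drilling": record[10:30].strip(),
        "spud_year": int(record[30:34]),
    }

line = "4217531234Broken Hill Pty     1987"
print(load_scout_ticket(line))
# {'api_number': '4217531234', 'operator_at_drilling': 'Broken Hill Pty', 'spud_year': 1987}
```

A production loader would add the validation the sidebar alludes to, for example flagging records whose operator name disagrees with the name already stored for that well.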
section. Since there are now two copies of the section, the two systems are said to be loosely integrated. In a tightly integrated system, processing and interpretation systems would obtain data from a common data store (possibly distributed across multiple computing sites or machines), with a common file format, and would not require reformatting of data. Data handling is faster and the border between computers and applications becomes invisible to the user.

There are good reasons why tight integration will be slow in coming. Tight integration of applications written for independent systems requires modification of existing applications to permit them to use the common data store. It also requires agreement on a universal model for data exchange—one of the long-term projects of POSC.

Loose integration between domains requires a mechanism for moving data from one to the other. The advantage of this approach is that it does not require rewriting programs. The GeoShare data exchange standard, which uses RP66/DLIS as the common data format, is one effort in this direction, requiring only the writing of a link to the GeoShare standard. Additionally, oil companies are interested in prolonging the value of their investment in proprietary systems. Therefore, the immediate efforts, such as Schlumberger's geoscience data bus implementation using the GeoShare standard, are for loose integration of independent systems, while a hopeful eye is kept on POSC's development of a universal system permitting tight integration.

The shift toward workstations also calls into question the means of data security and control, which are often deeply rooted in organization culture. Where should data be kept—centralized, decentralized or in regional data centers? Who controls data quality? How can consistent quality of all be assured? Who will determine what goes back in the data base and how is this determined? Today, just as no two oil companies have the same computer systems, no two companies approach these challenges the same way.

A leading debate in data management is coping with simultaneous alteration of the same data set. Imagine two geoscientists working concurrently on different aspects of the same project, using some of the same well data. They have downloaded data from the master data base, revised it, corrected it and made interpretations. When interpreted data are ready for storage on the master data base, how is it decided which version goes in?

There are various solutions. In BP USA, automatic data revision is anathema. When someone changes data in the master data base, the computer produces a report, stating there are two versions of the same thing, and listing the changes. It asks if those versions should be merged or supplanted. This determination is made by the database manager. The manager cannot be expert in petrophysics, sedimentology, geophysics and petrology in every province, but he or she is still responsible for the integrity of the master data base. The manager therefore sits down with the geoscientists and, donning the hats of diplomat, defender of data and geoscientist, negotiates which version of what to keep. BP finds that the successful data manager must be high enough in the organization to wield recognized authority, yet close enough to the field to know the area in question.

ARCO is installing a data management system in its Plano, Texas, USA, facility and expects to use a slightly different solution. The data management plan calls for downloading project data from the mainframe to the FINDER system. When an interpretation is complete, it is screened at three levels before being installed in the corporate data base. First, the project manager approves all formation top picks and performs a general quality check. The appropriate operations center then checks data pertaining to its area, such as well location and geopolitical data. Finally, with recommendations of the previous two checkers attached, the interpretation reaches the computing services group in Plano, which then performs quality checks as if it were vendor data, and installs the interpretation in the corporate data base. If the computing group questions the data, it negotiates a solution with the project manager, the operations group, or both.

Managing simultaneous alteration is one challenge in this early stage of technology transfer to workstations. The major challenge, in using workstations to reduce finding and development costs, is development and acceptance of a standard E&P data model. Once this hurdle is cleared, only minor obstacles in the path to seamless integration of geoscience data remain. —JMK

Acronyms
AAPG: American Association of Petroleum Geologists
API: American Petroleum Institute
CD ROM: Compact disk read-only memory
CPU: Central processing unit
DB: Data base
E&P: Exploration and production
GIS: Geographic information system
OSF: Open Software Foundation
PC: Personal computer
PIDD: Petroleum Industry Data Dictionary
POSC: Petrotechnical Open Software Corporation
PPDM: Public Petroleum Data Model
RISC: Reduced Instruction Set Computer
SQL: Structured Query Language
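The merge-or-supplant report described in the BP USA approach above can be sketched simply: instead of overwriting the master copy, the system lists every differing field so the database manager can decide. The record structure and field names here are hypothetical.

```python
# Hypothetical sketch of BP-style change reporting: when an edited copy of a
# record is returned to the master data base, report the differing fields
# rather than overwriting, so a database manager can merge or supplant.

def change_report(master: dict, revised: dict) -> list:
    """List (field, master_value, revised_value) for every field that differs."""
    fields = sorted(set(master) | set(revised))
    return [(f, master.get(f), revised.get(f))
            for f in fields if master.get(f) != revised.get(f)]

master = {"well": "A-1", "operator": "BHP", "top_pick_ft": 8450}
revised = {"well": "A-1", "operator": "BHP", "top_pick_ft": 8472}

for field, old, new in change_report(master, revised):
    # One line per conflict; the manager decides which version to keep.
    print(f"{field}: master={old} revised={new}")
```

Note that, as in the article's account, nothing here resolves the conflict automatically: the output is the starting point for a human negotiation.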
A Niche for Enhanced Oil Recovery in the 1990s

Aging fields and dwindling prospects for finding new, large reserves are turning attention to improving recovery from known oil fields. What is the status of enhanced oil recovery (EOR) and what role might it potentially play in the next few years?
Larry W. Lake
University of Texas
Austin, Texas, USA

Raymond L. Schmidt
Chevron Oil Field Research
La Habra, California, USA

Paul B. Venuto
Mobil Research and Development Corp.
Princeton, New Jersey, USA

Traditional primary and secondary production methods typically recover one third of oil in place, leaving two thirds behind. The reasons for this are not difficult to understand. During the life of a well, there is always a point at which the cost of producing an additional barrel of oil is higher than the price the market will pay for that barrel. Production then halts. Under normal circumstances, the well is abandoned, with 70% of the oil left in the ground.

Except for brief periods in which EOR was economical, or perceived to be so, there were good economic reasons not to nurse every drop of oil from a well. Oil was easy to find and another giant field was just around the corner—the cost of a newly found barrel of oil was far less than the cost of an EOR incremental barrel. This situation has begun to change, especially in North America. Reserves in the aging oil fields of the US and Canada are declining faster than new oil is being added by discoveries. In the US, for example, 70% of the approximately 500 billion barrels of oil discovered were found during the earliest 20% of drilling. About 130 billion barrels have been produced to date and up to another 170 billion barrels are considered a long-term target for advanced EOR technology. The situation is similar in Canada. Given the declining reserves and the low probability of locating significant new fields, producers sought additional oil in old reservoirs, making North America a proving ground for EOR techniques. Today, it is estimated that North America produces more than half the world's EOR production (below).

Research and development on many fronts indicate that the risks of EOR are
Estimated Annual Worldwide EOR Produced Oil, B/D (x1000)

Country            Thermal   Miscible   Chemical   EOR Total     %
USA                  454       191        11.9       656.9      42
Canada                 8       127        17.2       152.2      10
Europe                14         3         —          17.0       1
Venezuela            108        11         —         119.0       7
Other S. America       2        NA         NA         17.0       1
USSR                  20        90        50.0       160.0      10
Other (estimated)    171*      280**       1.5       452.5      29
Total                777       702        80.6      1574.6     100

*Mainly Duri field (Indonesia)  **Mainly Hassi-Messaoud (Algeria) and Intisar (Libya)

Estimated annual worldwide production of oil by EOR. (USSR data from Simandoux P and Valentin E: "Improved Recovery, Strategic Option or Not?" presented at the Offshore North Seas Conference, Stavanger, Norway, August 28-31, 1990; other data from Oil & Gas Journal 88, no. 17 (April 23, 1990): 62-67.)
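As a quick arithmetic check, not part of the original article, each country's EOR total in the table above can be re-derived by summing its thermal, miscible and chemical columns (the Other S. America row is skipped because of its NA entries):

```python
# Consistency check of the worldwide EOR production table (thousand B/D):
# each country's EOR total equals thermal + miscible + chemical.

rows = {                      # (thermal, miscible, chemical)
    "USA":       (454, 191, 11.9),
    "Canada":    (8, 127, 17.2),
    "Europe":    (14, 3, 0.0),
    "Venezuela": (108, 11, 0.0),
    "USSR":      (20, 90, 50.0),
    "Other":     (171, 280, 1.5),
}

totals = {country: round(sum(parts), 1) for country, parts in rows.items()}

# Add Other S. America's printed total of 17.0 to reach the world figure.
world = round(sum(totals.values()) + 17.0, 1)

print(totals)   # {'USA': 656.9, 'Canada': 152.2, ...}
print(world)    # 1574.6
```

The recomputed totals match the printed EOR Total column, and the world sum reproduces the printed 1574.6 thousand B/D.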
[Figure: A slow but steady increase in percent of world oil produced by EOR, plotting worldwide oil production against worldwide EOR oil production (billions of barrels/yr) from 1976 to 1990. Percent of oil produced by EOR has more than doubled since 1982, when EOR oil accounted for 0.9% of worldwide production. In 1990, EOR accounted for 450 million—about 2%—of the 22 billion barrels of oil produced. (From Oil & Gas Journal data.)]

being reduced and the potential for EOR profitability increased (above). Computerized characterization of the reservoir, which quantifies the physical characteristics and dynamic behavior of a field, is becoming one of the most important tools for improved oil recovery (see "Reservoir Characterization Using Expert Knowledge, Data and Statistics," page 25). Success of oil recovery depends on applying the energy of injected fluids in the right place, in the right amount and at the right time—a strategy that a well-constructed reservoir simulator can help develop.

EOR is an imprecise term that historically has been used to describe the third step (tertiary recovery) in oil and gas production. The term "improved oil recovery" (IOR) has come into use to describe all recovery methods other than natural (primary) production, reserving the designation EOR for those processes beyond simple waterflood and gasflood—basically, recovery by injection of anything not originally in the reservoir (next page, top). The three major EOR methods are thermal (application of heat), miscible (mixing of oil with a solvent) and chemical (flooding with chemicals).

Primary recovery, in long accepted practice, is defined as production by natural reservoir pressure, or pumping, until depletion. Until the early 1940s, economics dictated when a well was to be plugged and abandoned, usually after recovery of 10 to 25% of original oil in place (OOIP).

Secondary recovery methods are generally used to repressure the reservoir and drive out some of the remaining oil. Because water is usually readily available and inexpensive, the oldest secondary recovery method is waterflooding, pumping water through injection wells into the reservoir. The water is forced from injection wells through the rock pores, sweeping the oil ahead of it toward production wells. This is practical for light to medium crudes. Over time, the percentage of water in produced fluids—the water cut—steadily increases. Some wells remain economical with a water cut as high as 99%. But at some point, the cost of removing and disposing of water exceeds the income from oil production, and secondary recovery is then halted.

Extensive waterflooding, which began in the 1940s, within a few decades became the established method for secondary oil recovery, usually recovering about another 15% of OOIP. On average, about one-third of OOIP is recovered, leaving two-thirds, or twice as much oil as is produced, in the ground after secondary recovery.1

Another recognized secondary recovery technique is injection of a hydrocarbon-based gas into an existing gas cap or directly into the oil itself. Gas may be injected over a considerable period of time—up to a year—while producing wells are shut in, until reservoir pressure is restored and production resumed. Another method is injection of gas to sustain pressure during production. Gas injection requires a nearby source of inexpensive gas in sufficient volume.

While waterflooding is effective in nearly all reservoirs, no single EOR technique is a cure-all. Most reservoirs are complex, as are most EOR processes. Efficient reservoir management treats EOR as a high-cost, high-risk but critical component of a comprehensive plan that spans primary recovery through abandonment.2

Once preliminary reservoir information has been assembled and used to select EOR options, engineering project design usually follows several steps.
•Laboratory studies test the proposed EOR processes in corefloods with samples of reservoir rock and fluids. These small, one-dimensional flow tests in relatively homogeneous media do not always successfully scale up to reservoir dimensions. But if the process fails in the laboratory, it will more than likely fail in the field.
•Fluid-flow simulations, based on a geologic reservoir model, can start with assessment of primary and secondary recovery, matching the production history to determine residual oil and waterflood recovery. Then EOR process-variable sensitivities can be calculated, followed by predictions of EOR recovery, incremental production rate and payout economics. Reservoir geologic models are always constrained by sparse data, simplified concepts of reservoir structure and dynamics, inadequate data for history matching and increasing computational uncertainty as calculations are extrapolated into the future. Consequently, predictions that cover years of EOR performance may be seriously in error. In addition, small-scale heterogeneities, which are difficult to define, are critical to the success of EOR.
•Usually, a pilot test of the proposed EOR process is carried out to investigate a novel technique or to confirm expected performance before an expensive, full-scale implementation. Ideally, the pilot test is performed in an area that is geologically similar to the field and large enough to be statistically representative of overall heterogeneity. Monitoring and data acquisition throughout pilot testing provide information needed to plan a full-scale commercial operation.
•For commercial operations, important considerations are secure sources of water and other injectants, storage and transportation facilities (like pipelines), surface processing, separation, recycling and upgrading facilities, and environmental and safety requirements.3

The same principles of EOR engineering may not apply to offshore oil fields. Because offshore wells tend to be highly deviated or extended reach, the distance between them is often greater than between onshore wells. This extends the time between EOR initiation and meaningful results and flattens the recovery response. These effects complicate process control and limit the number of EOR techniques that may be applicable. Greater spacing between wells also increases the likelihood of undetectable heterogeneities between wells, impairing simulations of well behavior. Because the number of wells that can be drilled from a platform is fixed, infill drilling, often an important strategy for both secondary recovery and EOR, may not be possible. High costs and extended time before EOR production begins mean that offshore EOR projects must be planned and started early
[Figure: Oil recovery mechanisms. Conventional recovery comprises primary recovery (natural flow, artificial lift), secondary recovery (waterflood, pressure maintenance) and tertiary recovery (EOR). EOR divides into chemical (surfactant, polymer, caustic), thermal (steam stimulation or cyclic steam injection, steam or hot water, in-situ combustion, foam displacement), miscible (CO2, miscible solvent, inert gas) and other methods (microbial, electrical, chemical leaching, mechanical—vibrating, horizontal drilling). (Adapted from Venuto PB, reference 2 and Donaldson EC et al, reference 5.)]

enough so that production increases incrementally before primary and secondary production begins to decline. Otherwise, marginal costs may be too high to sustain profitability. However, this option must be balanced against other risks: insufficient reservoir description at early stages of field production and lack of time to acquire pilot test results to evaluate the EOR process.4

Various mechanisms thwart recovery of much of OOIP after secondary recovery.5 Reservoir geologic heterogeneities may cause a large volume of mobile oil to be bypassed and remain within a field. This is a result of poor sweep efficiency when injected displacement water moves preferentially through higher permeability zones toward the production well. Even in regions that have been swept by large quantities of water, residual, immobile oil is held in the pore spaces by capillary forces.

Many techniques have been tried in the laboratory and field in hopes of recovering this additional oil. All employ one or more of three basic mechanisms for improving on waterdrive alone:
•Increase the mobility of the displacement medium by increasing the viscosity of the water, decreasing the viscosity of the oil, or both.
•Extract the oil with a solvent.
•Reduce the interfacial tension between the oil and water.

The three major EOR processes—thermal, miscible and chemical—are each subdivided into several categories. Among the three, thermal processes dominate, having the greatest certainty of success and potential application in about 70% of enhanced oil recovery worldwide. Thermal methods also give the highest recoveries at the lowest costs (left).

[Figure: Cost-performance comparison of major EOR methods, plotting incremental oil cost ($/barrel, 0 to 60) against total recovery (% OOIP, 0 to 80) for surfactant, thermal, CO2 injection, polymer and waterflooding methods. (Adapted from Simandoux P, Champlon D and Valentin E: "Managing the Cost of Enhanced Oil Recovery," Revue de L'Institut Français du Pétrole 45, no. 1 (January-February 1990): 131-139.)]

The term miscible means the mixing of two fluids—for instance, oil and a solvent such as carbon dioxide [CO2]—into a single-phase fluid. It may also apply to a continuity between the oil and injected gas, due to a multiphase transition zone between the two. Use of miscible gasdrive has grown rapidly in recent years, and today the method accounts for about 18% of EOR applications worldwide. It has been successful at depths greater than 2000 ft [610 m] for CO2 and greater than 3000 ft [915 m] for other gases.

EOR chemical processes, such as surfactant (detergent) flooding, have tantalized the industry with promises of significantly improved recovery. As yet, cost and technical problems have precluded them from mainstream application. Waiting in the

1. Smith RV: "Enhanced Oil Recovery Update," Petroleum Engineer International 60, 61, four-part series (November 1988 to March 1989).
2. Venuto PB: "Tailoring EOR Processes to Geologic Environments," World Oil 209 (November 1989): 61-68.
3. Venuto PB, reference 2.
Schmidt RL: "Thermal Enhanced Oil Recovery—Current Status and Future Needs," Chemical Engineering Progress (January 1990): 47-59.
4. Simandoux P and Valentin E: "Improved Recovery, a Strategic Option or Not?" presented at the Offshore North Seas Conference, Stavanger, Norway, August 28-31, 1990.
5. Smith RV, reference 1.
Donaldson EC, Chilingarian GV and Yen TF (eds): Enhanced Oil Recovery, II—Processes and Operations, Developments in Petroleum Science. Amsterdam, The Netherlands: Elsevier Science Publishers, 1989.
Schmidt RL, reference 3.
Oil viscosity, centipoise, at reservoir conditions wings are processes like microbial EOR
EOR method
0.1 1.0 10 100 1000 10,000 100,000 1,000,000 (MEOR) and some novel and exotic propos-
als; these await confirmation by lab and
Hydrocarbon field experimentation and evaluation before
miscible
taking their place as accepted practice.
Nitrogen and Each EOR process is suited to a particular
flue gas
type of reservoir. Because unexpected or
CO2 miscible unknown reservoir characteristics cause
most EOR failures, EOR begins with thor-
Surfactant/ ough geologic study. Technical rule-of-
polymer
thumb screening criteria are available to aid
Polymer preliminary evaluation of a reservoir’s suit-
ability for EOR (left ). After these criteria are
applied to a prospect, stringent economic analysis follows, generally through repeated reservoir simulations6 (see "Trends in Reservoir Management," page 8).

Thermal
Thermal methods are the main means of recovering heavy oils, those with gravity less than 20° API, representing viscosities of 200 to 2000 centipoise (cp). Such heavy oils generally don't respond significantly to primary production or waterflooding so initial oil saturation is typically high at the start of a thermal recovery project. The principle of thermal recovery is simple: increasing the oil's temperature dramatically reduces its viscosity, improving the mobility ratio (next page, left). The two primary methods of heating reservoir oil are injection of fluid heated at the surface or production of heat directly within the reservoir by burning some of the oil in place.

Although the idea of heating reservoirs dates back more than 100 years, large-scale steamdrive projects began in heavy oil fields in the US in the early 1950s and were followed shortly by projects in The Netherlands and Venezuela. A relative of steamdrive is cyclic steam injection, also called steam soak or "huff and puff." It was discovered accidentally in 1960 during a Venezuelan recovery project. Cyclic steam recovery uses a single well for both injection and production. Steam is injected into a well for several days or weeks, then the well is shut in for several days to a month or more, the soak period. After this, the well is produced for up to six months, after which the process is repeated. The steam heats the rock and fluids surrounding the wellbore and also provides some drive pressure; by the time production resumes, the steam has condensed and oil and water are produced.

■Selection of EOR techniques by oil viscosity, permeability and depth. Charts rate each EOR method (hydrocarbon miscible; nitrogen and flue gas; CO2 miscible; surfactant/polymer; polymer; alkaline; fireflood; steamdrive) from "good" through "possible," "fair" and "difficult" to "not feasible" across permeability (10 to 10,000 md) and depth (2000 to 10,000 ft). Annotations include "not critical if uniform" (hydrocarbon miscible, nitrogen and flue gas permeability), "high enough for good injection rates" (CO2 miscible permeability), "deep enough for required pressure" or "deep enough for optimum pressure" (gas methods and fireflood depth), "limited by temperature" (surfactant/polymer and polymer depth), and steamdrive's normal range for crudes up to 20° API, with lighter crudes to 30° API. (Adapted from Venuto PB, reference 2. Taber JJ and Martin FD, reference 6.)

6. Taber JJ and Martin FD: "Technical Screening Guides for the Enhanced Recovery of Oil," paper SPE 12069, presented at the 58th SPE Annual Technical Conference and Exhibition, San Francisco, California, USA, October 5-8, 1983.
Venuto PB, reference 2.
7. Taber JJ and Martin FD, reference 6.
Smith RV, reference 1.
Donaldson et al, reference 5.
Schmidt RL, reference 3.
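The temperature effect behind thermal recovery can be put in rough numbers. Below is a minimal Python sketch, assuming an Andrade-type exponential viscosity model; the coefficients are hypothetical values tuned so a heavy crude falls from roughly 10,000 cp at 100°F to about 10 cp at 300°F, in line with the viscosity-temperature chart, and the relative permeabilities and water viscosity in the mobility ratio are illustrative, not figures from the article:

```python
import math

def crude_viscosity_cp(temp_f, a=-17.02, b=14681.0):
    """Andrade-type model ln(mu) = a + b/T, with T in degrees Rankine.
    Coefficients a and b are hypothetical, chosen to mimic a heavy crude."""
    return math.exp(a + b / (temp_f + 459.67))

def mobility_ratio(mu_oil_cp, mu_water_cp=0.5, k_rw=0.3, k_ro=0.8):
    """M = (k_rw/mu_w) / (k_ro/mu_o); values near or below 1 are favorable.
    Relative permeabilities and water viscosity are illustrative."""
    return (k_rw / mu_water_cp) / (k_ro / mu_oil_cp)

mu_cold = crude_viscosity_cp(100.0)  # heavy oil at reservoir temperature
mu_hot = crude_viscosity_cp(300.0)   # the same oil after steam heating
print(round(mu_cold), round(mu_hot))  # roughly 10,000 cp versus 10 cp
print(round(mobility_ratio(mu_cold)), round(mobility_ratio(mu_hot), 1))
```

Under these assumptions a 200°F temperature rise cuts viscosity, and hence the water-oil mobility ratio, by about three orders of magnitude, which is the whole leverage of steam injection.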
The advantage of "huff and puff" is the relatively short "huff" time so that the well is producing most of the time. The disadvantage is that only the reservoir near the wellbore is stimulated. This method therefore often has to be followed by continuous steam injection to drive oil toward a separate production well. Reservoir pressure, dictated by depth, imposes a limit on steamflooding: higher pressures require more fuel to generate higher temperatures on the surface to produce saturated steam needed for efficient steamflood. The higher temperature also leads to greater heat losses.

The recovery mechanism of steamflooding is complex. Besides viscosity reduction, the second most important recovery mechanism is steam distillation of lighter components of the oil which form a solvent bank ahead of the steam. Other factors are thermal expansion, solution gasdrive and miscible and emulsion drive. Steamflooding has been tested in light oil reservoirs where the viscosity mechanism plays a minor role and distillation to solvents predominates. Some field tests of light oil steamfloods have shown virtually 100% recovery of oil from zones reached by the steam.

All is not roses with steamflooding, as numerous technical problems may combine to make a recovery project uneconomical, inefficient or even dangerous. Heat losses in surface equipment, in the wellbore, in rocks around the reservoir, in connate water and in the gas cap may defeat the process. Surface equipment must be insulated, as well as downhole tubing below 1500 ft [460 m]. Thermal expansion may damage downhole equipment and cause cement failure. Steam is very reactive, causing pipe corrosion and scaling, mineralogical dissolution or reprecipitation, clay swelling and changes in permeabilities. Steam is more mobile than oil, overriding the oil and channeling through thief zones; no satisfactory methods have been found to improve sweep efficiency. Steam is usually generated by burning natural gas, but if lease crude is used—typically at about one barrel of crude for three or four barrels of oil recovered—the air pollution byproduct may be costly to control.

The other significant thermal recovery process, in-situ combustion or fireflooding, attempts to recover oil by igniting a portion of the in-place crude by injecting air or oxygen or by chemical or electrical means (below). Fireflooding appears attractive: it is thermally more efficient than steam, has no depth restriction and is well suited to relatively thin (less than 25 ft [8 m]) reservoir sands. But in practice it is not so simple. Capital costs are high, the process is extremely complicated and difficult to predict or control, and operational problems from the high temperatures include cement failures, sanding/erosion, corrosion at both injection and production wells because of oxygen and moisture, and high gas production rate. Despite some successes and more than 30 years of laboratory and field trials, in-situ combustion applications have not increased.7

■The empirical relationship of crude oil viscosity and temperature depends on oil composition. Chart plots viscosity (2 to 10,000 centipoise, log scale) against crude oil temperature (100°F to 300°F [38°C to 149°C]) for 10°, 20° and 30° API crudes. (Adapted from Lake LW, reference 10.)

■Changes in a 3D seismic section for "preburn," "midburn" and "postburn." The bright spot—an increase in envelope amplitude, marked by arrowheads—appeared at midburn time and expanded by postburn time. Below the bright spot is a dim spot, caused by a decrease in envelope amplitude. The bright spot is caused by increased gas saturation along the top of the reservoir boundary. The dim spot shows good correlation with the burn volume in distribution and direction, as determined by core analysis. From studies like these, seismic data can be used to map an estimated burn thickness. (Courtesy of the Society of Exploration Geophysicists. From Greaves RJ and Fulp TJ: "Three-dimensional Seismic Monitoring of an Enhanced Oil Recovery Process," Geophysics 52, no. 9 (September 1987): 1175-1187.)
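The lease-crude fuel figure quoted above, about one barrel burned per three or four barrels recovered, implies a sizable bite out of gross steamflood production. A back-of-envelope Python sketch of that arithmetic (only the recovery ratios come from the text; the function name is ours):

```python
def net_oil_fraction(barrels_recovered_per_barrel_burned):
    """Fraction of gross steamflood production remaining after fuel use,
    when the steam generator burns lease crude instead of natural gas."""
    burned_fraction = 1.0 / barrels_recovered_per_barrel_burned
    return 1.0 - burned_fraction

for ratio in (3.0, 4.0):  # the article's range: one barrel per 3 to 4 recovered
    print(ratio, round(net_oil_fraction(ratio), 3))
```

At the unfavorable end of the range, a quarter to a third of the recovered oil is consumed just raising steam, before heat losses are counted.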
Miscible
The fastest growing EOR process, miscible flooding, uses a solvent that mixes fully with residual oil to overcome capillary forces and increase oil mobility. Displacement efficiency nears 100% where the solvent contacts the oil and miscibility occurs. The numerous successful solvents include liquified petroleum gas (LPG), nitrogen, CO2, flue gas (mainly nitrogen and CO2) and alcohol.

A sufficient supply of a particular solvent and the value of injected hydrocarbons, like natural gas, have considerable impact on the choice and economics of miscible flood projects. In Canada, for example, the abundance of natural gas makes that the choice, while the US has huge reserves of CO2 in the western states that have been tapped for EOR in the Permian Basin fields of west Texas.

Miscible displacement EOR can be subdivided into three significant processes: miscible slug, enriched gas and high-pressure lean gas, including CO2. For each of these processes, there is a range of pressures, or depths, temperatures and oil gravities necessary to achieve and maintain miscibility. Solvents have much lower viscosity than oil, so reservoir stratification—vertical and horizontal permeability contrast—strongly affects sweep efficiency. Early breakthrough, bypassing substantial amounts of oil, and viscous fingering problems have plagued many field projects involving the miscible displacement processes:
•In the miscible slug process, a slug of liquid hydrocarbons, about half the reservoir pore volume (PV), is injected and mixes with the oil on contact. This is followed by water or chase gas to push the slug through the reservoir.
•For the enriched, or condensing, gas process, 10 to 20% PV of natural gas, enriched with intermediate molecular weight hydrocarbons, is injected, followed by lean gas and, in some cases, water. At the proper reservoir pressure, the C2–C6 components transfer from the enriched gas to the oil, forming a miscible solvent bank.
•Similarly, in vaporizing gasdrive, lean gas is injected at high pressure, 3000 to 6000 pounds per square inch gauge, and C2–C6 components from the oil are vaporized and mutually exchanged between the oil and gas during multiple contacts, eventually forming a miscible slug.

Carbon dioxide is a special case of high-pressure miscible recovery. This gas is highly soluble in crude oil, swelling the oil and reducing its viscosity, while simultaneously extracting lighter hydrocarbons by vaporization. The displacing gas front, enriched by vaporized hydrocarbons through multiple contacts, forms a miscible slug as long as minimum miscibility pressure (MMP) is maintained. Since CO2 can extract heavier components, it is miscible with crude oils having fewer C2–C6 components. Carbon dioxide has a lower MMP than natural gas, nitrogen or flue gas, and therefore can be applied in shallower (lower pressure) wells.

A major problem with miscible gasflood EOR is the adverse mobility ratio caused by the low viscosity of the typical injectant gas compared to oil, perhaps by one or two orders of magnitude. The result is an unstable front between the gas and oil which allows viscous fingers to form and propagate through the displaced fluid, leaving much of the hydrocarbon uncontacted (above). Today, the primary means of attacking this problem is the water-alternating-gas (WAG) technique. In this process, waterflood and gasflood are alternated, with the design parameters being timing and the ratio of water to gas. WAG claims the virtues of decreasing the mobility of the gas, maintaining pressure and saving operating costs by substituting inexpensive water for relatively expensive gas. Ideally, the gas provides miscibility and the water improves sweep efficiency. However, one study of WAG in 15 CO2 flood projects showed lower recoveries than for cases using a single injection of CO2 followed by a waterflood. Gravity segregation between water and gas is thought to compromise the effectiveness of the WAG process.8

Performance of many EOR techniques suffers from differences in mobility between the EOR product and the oil it is supposed to recover. One possible solution to this is foam, which is dispersed gas bubbles in a liquid. Foam can reduce reservoir gas phase permeability to less than 1% of its original value. Typical surfactant-based foams can last indefinitely. Such foams have been used for mobility control in miscible gas injection and steamfloods with mixed results. Problems include abnormally high injector-to-producer pressure differentials required for propagation, rapid changes in foam stability and quality as it migrates away from the injection well and foam breakdown in small pores.

■Phase behavior and flow dynamics in miscible flooding (a) for ideal performance, (b) during the influence of fluid density and (c) in the setting of viscosity contrasts, which produce fingering of the EOR gas into oil. Panels show injector-to-producer flow with CO2, a miscible zone, an oil bank and residual oil. (Adapted from Venuto PB, reference 2.)

8. Taber JJ and Martin FD, reference 6.
Smith RV, reference 1.
Donaldson et al, reference 5.
Stosur G, Singer M, Luhning R and Yurk W: "Enhanced Oil Recovery in North America: Status and Prospects," Energy Sources 12, no. 4 (1990): 429-437.
9. Taber JJ and Martin FD, reference 6.
Smith RV, reference 1.
10. Taber JJ and Martin FD, reference 6.
Lake LW: Enhanced Oil Recovery. Englewood Cliffs, New Jersey, USA: Prentice-Hall, Inc., 1989.
11. Bondor PL: "Dilute Surfactant Flooding for North Sea Applications—Technical and Economics Considerations," Proceedings of the 6th European Symposium on Improved Oil Recovery, Stavanger, Norway, May 21-23, 1991, volume 2: 749-758.
12. Taber JJ and Martin FD, reference 6.
Smith RV, reference 1.
Venuto PB, reference 2.
Lake LW, reference 10.
13. Donaldson et al, reference 5.
Simkin EM and Surguchev ML: "Advanced Vibroseismic Technique for Water Flooded Reservoir Stimulation, Mechanism and Field Tests Results," Proceedings of the 6th European Symposium on Improved Oil Recovery, Stavanger, Norway, May 21-23, 1991, volume 1: 233-241.
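The WAG design parameters named above, slug timing and the water-to-gas ratio, reduce to a simple alternating injection schedule. A hypothetical Python sketch (slug sizes, cycle count and the function itself are illustrative, not taken from any field design):

```python
def wag_schedule(total_gas_pv, wag_ratio, cycles):
    """Build an alternating list of (fluid, slug size in pore volumes) pairs.
    wag_ratio is the water:gas volume ratio; total_gas_pv is the overall
    gas slug to be split evenly across the injection cycles."""
    gas_slug = total_gas_pv / cycles
    water_slug = gas_slug * wag_ratio
    schedule = []
    for _ in range(cycles):
        schedule.append(("gas", gas_slug))     # miscible CO2 or solvent slug
        schedule.append(("water", water_slug)) # chase water for sweep control
    return schedule

# Example: 0.3 PV of CO2 injected in 4 cycles at a 1:1 WAG ratio
plan = wag_schedule(0.3, 1.0, 4)
total_water = sum(size for fluid, size in plan if fluid == "water")
print(plan)
print(round(total_water, 3))
```

The trade-off the text describes lives in `wag_ratio`: more water per cycle suppresses gas mobility and saves gas cost, but, as the 15-project CO2 study suggests, gravity segregation between the two fluids can undo the benefit.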
Chemical
Chemicals used in EOR include polymers, surfactants and alkalis. All are mixed with water and, occasionally, other chemicals before injection. Broadly speaking, targets for chemical recovery are crudes in the range between the heavy oils recovered by thermal processes and light oils recovered by miscible gas injection.

Polymer flooding is simply an accompaniment to waterflooding. It is the most commonly used chemical enhancement process since it is easy to apply and requires relatively small investment. Although polymer flooding increases recovery by a modest amount, on the order of 5%, it can yield solid profits under the right circumstances. Adding high molecular weight polymers increases the viscosity of water and, with some polymers, reduces the aqueous phase permeability without changing the relative permeability to oil. This can greatly improve waterflood volumetric sweep efficiency. Polymer concentrations are 100 to 1000 parts per million (ppm) and treatment may require injection of 15 to 25% PV over several years followed by a typical waterflood.9

Cross linking, or gelling, polymers in situ with metallic ions can augment performance in sweep profile control—helping to plug high conductivity zones or minor fractures that degrade sweep efficiency. The main drawbacks to the use of polymers are their high cost and the low injection rate caused by high viscosity (which impacts economic rate of return), degradation at higher temperatures, intolerance to high salinity, polymer deterioration from shear stress imparted by pumping, flow through tubulars and perforations, and long-term instability in the reservoir environment.

Surfactant flooding, also known as detergent, micellar-polymer or microemulsion flooding, uses low concentrations of surfactants in water to reduce the interfacial tension between oil and water. Although the idea has been around since the 1920s, serious research and field trials did not start until 20 years ago, and uniformly profitable field performance is still elusive. Of all EOR processes, surfactant floods may be riskiest, involving the most difficult design decisions, requiring large capital investments, and being strongly affected by reservoir heterogeneities. For these reasons, and as a result of marginal field performance, interest in surfactant flooding has declined.

A surfactant flood must be designed for a specific crude oil in a specific reservoir taking into account such factors as salinity, temperature, pressure and clay content.10 Generally, multiple slugs are used. Surfactant performance is optimal over a narrow salinity range and is subject to adsorption and retention through ionic exchange with reservoir rocks. To avoid these problems, the first slug may be a preflush water solution. However, it is often ineffective. The second slug, perhaps 10 to 30% PV, contains the surface active agent (5 to 10% by volume), hydrocarbons, electrolyte and cosolvent, usually alcohol. This is followed by a slug of polymer-thickened water for mobility control and, finally, typical waterflood (above).

■Surfactant-polymer flooding, showing the surfactant slug on its way toward the producer, pushed by polymer-thickened fresh water and floodwater from the injector well. Ahead of the slug lie the oil-water bank and the residual oil with resident brine. (Adapted from Donaldson EC et al, reference 5.)

Koninklijke-Shell Exploratie en Produktie Laboratorium in Rijswijk, The Netherlands, developed a surfactant capable of being injected in low concentrations (less than 1% by volume) in seawater in an offshore environment during late secondary recovery waterfloods.11 Shell's analysis of the economics of implementing this technique in a moderately sized North Sea field—100 million stock-tank barrels of OOIP—is revealing. Shell found that high front-end capital costs would include building a chemical plant to manufacture the product and additional dedicated floating facilities to handle injection and production. The surfactant cost alone was $18 per barrel of incremental oil recovered, undiscounted, at an estimated 9 kilograms (kg) [20 lb] of product injected for each barrel of oil. Projections showed that $750 million would be incurred in front-end capital costs and chemicals injected prior to the production of significant amounts of incremental oil. The analyzed technical cost came to $90 per barrel assuming a 15% discount rate (cost of capital).

The caustic or alkali flooding process relies on a chemical reaction between the caustic and organic acids in the crude oil to produce in-situ surfactants that lower interfacial tension between water and oil. Other mechanisms that may enhance recovery are changing rock from oil-wet to water-wet, which lowers interfacial tension, and emulsification, which lowers viscosity. Alkalies such as sodium or potassium hydroxide are used. Caustics can react strongly with minerals in the connate water and with the reservoir rocks to the detriment of the process. This complex process is poorly understood; it has had some technical successes but no great financial successes.12

Besides the mainstream techniques discussed, many interesting ideas have been proposed and tried. Today, most of these are not economical or have some technical flaw:
•Microbial EOR injects bacteria and nutrients into the reservoir where the bacteria multiply and biochemically manufacture polymers and surfactants; this technique is still unproved, although some successes have been reported.
•Thermal enhancements include downhole steam generation, mixing gas and solvents with steam, downhole radio frequency (induction) heating, nuclear steam generation and jet leaching by high-pressure hot water with additives.
•Other ideas: injection of direct current for electro-osmotic effect, chemical alteration of micellar configuration of in-situ petroleum, earthquake simulation using high-powered surface vibrators (successful in the USSR13).

EOR technology does not seem to be on the threshold of any dramatic technological breakthroughs. Instead progress will probably come through gradual evolution, stimulated by growing motivation to recover more oil from known fields. The major contribution to EOR is likely to be from the constantly improving art of reservoir characterization for predicting EOR response and from horizontal drilling. —SM
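The Shell North Sea surfactant figures lend themselves to a quick cross-check. A Python sketch using only numbers stated in the text; the implied unit price and per-barrel-of-OOIP exposure are derived estimates, not figures from the article:

```python
# Figures as quoted: 9 kg of surfactant per incremental barrel, at an
# undiscounted surfactant cost of $18 per incremental barrel recovered.
kg_per_incremental_bbl = 9.0
cost_per_incremental_bbl = 18.0

# Implied surfactant price: $18 / 9 kg = $2 per kilogram (derived)
implied_price_per_kg = cost_per_incremental_bbl / kg_per_incremental_bbl
print(implied_price_per_kg)  # -> 2.0

# Front-end exposure spread over the field's 100 million barrels of OOIP
front_end_capital = 750e6   # $ incurred before significant incremental oil
ooip_bbl = 100e6
print(front_end_capital / ooip_bbl)  # -> 7.5 ($ per barrel of OOIP)
```

Even before discounting, $7.50 of front-end money per barrel of oil in place, recovered or not, shows why the technical cost climbs to $90 per barrel at a 15% discount rate.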
