ABSTRACT
Numerous dipmeter and borehole image log data sets have been acquired over
the years and are being used to build subsurface models. Dealing with dipmeter and
image log data remains a niche skill within the petroleum industry, and because
these are not conventional log data sets, they tend to be neglected in the way data
are stored and quality controlled. A variety of wireline and logging-while-drilling
tools exist, and each logging run contains a variety of curves with tool-specific
mnemonics. For a particular data set, there may be several tens of curves from the
raw data set and hundreds from the processed and interpreted data sets. Data quality
control (QC) is an essential procedure that has to be conducted to ensure dipmeter and image log data integrity in subsurface models. Data QC should be performed iteratively during data acquisition, data management, processing, and
interpretation. This chapter presents standard and globally applicable corporate
guidelines for data management and data QC of dipmeter and image log data sets.
INTRODUCTION
García-Carballido et al.
Dealing with dipmeter and image log data, however, remains a niche skill. Even major operating companies might have very few, if any, borehole image (BHI) experts. Once dipmeter and image
log data have been acquired, it is commonly the
project geoscientist and/or petrophysicist who
decides what level of interpretation might be required either immediately after data acquisition or
several years later, i.e., during a field re-evaluation.
A large percentage of dipmeter and image log processing and interpretation is conducted by specialized service companies instead of petroleum company specialists.
As dipmeter and image log data are not conventional log data sets, and commonly require specialist software, they tend to be neglected with respect to data management. This is because of the lack of specialists and the sizable number of curves and variety of curve mnemonics, both of which are tool-type dependent, included in a given tool run. In addition, when each tool run is taken through data processing, which includes multiple steps, and data interpretation, a multitude of new curves are generated. For all of these reasons, dipmeter and image log data require a suitable database that can handle a variety of multisampled curves; store data in a range of formats, e.g., Log Information Standard, Digital Log Interchange Standard, and an actual image; and have a structure capable of organizing all the curve versions that correspond to raw, quality-controlled, spliced, processed, and interpreted curves. Furthermore, the database dictionary should be updated regularly as new tools and/or new curve mnemonics are developed.
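The versioning requirement described above can be sketched in code. This is a minimal illustration of one way to keep every curve version side by side, not Shell's actual database design; the stage names and the mnemonic used in the example are assumptions.

```python
from dataclasses import dataclass, field

# Ordered processing stages a curve version can belong to
# (stage names paraphrase the text; actual dictionaries will differ).
STAGES = ("raw", "quality_controlled", "spliced", "processed", "interpreted")

@dataclass
class CurveStore:
    """Toy store keeping every version of every curve, keyed by (mnemonic, stage)."""
    versions: dict = field(default_factory=dict)

    def add(self, mnemonic: str, stage: str, data) -> None:
        # Reject stages outside the agreed dictionary, mirroring the need
        # for a regularly updated database dictionary.
        if stage not in STAGES:
            raise ValueError(f"unknown stage: {stage}")
        self.versions[(mnemonic, stage)] = data

    def latest(self, mnemonic: str):
        # Walk stages from most to least advanced and return the first hit.
        for stage in reversed(STAGES):
            if (mnemonic, stage) in self.versions:
                return stage, self.versions[(mnemonic, stage)]
        raise KeyError(mnemonic)
```

Keeping raw and derived versions together in one structure is what allows a full processing track record to be preserved per curve.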
Data quality control (QC) is an essential procedure that has to be conducted to ensure dipmeter and image log data integrity in subsurface models. Quality control should be performed at all stages, including data acquisition, data management, processing, and interpretation.
It is in the interest of each organization storing such data to have suitable data management and data QC procedures to enable the prompt availability of quality-controlled dipmeter and image log data sets when these are required by the project geoscientist or petrophysicist. A set of such data management and data QC procedures (García-Carballido, 2002; Poppelreiter et al., 2002; Poppelreiter and García-Carballido, 2003; Tso, 2004), which are implemented across many regions of Shell, is discussed in detail in this chapter.
Database Inventory
The first step toward a corporate image log database is to make an inventory of the different legacy
data sets. An example is given in Figure 1, which shows a snapshot, at a point in time, of the data set from a Shell operating unit, revealing that
more than 700 wells had some kind of BHI and/or
dipmeter log data. Less than half of these data sets are
digitally stored in the company database, whereas
others are available as hard copies (field prints) or in
the tape archive.
Data Management and Quality Control of Dipmeter and Borehole Image-Log Data
Figure 1. The borehole image (BHI) and dipmeter database snapshot from a Shell operating unit (data up to 2001). AST = Acoustic Scanning Tool; CBIL = Circumferential Borehole Imaging Log (Baker Hughes/Baker Atlas); HDIP = Hexagonal Diplog (Baker Hughes/Baker Atlas); EMI = Electrical Micro Imaging (Halliburton); FMI = Fullbore Formation MicroImager (Schlumberger); FMS = Formation MicroScanner (Schlumberger); HALS = High-Resolution Azimuthal Laterolog Sonde (Schlumberger); HDT = High Resolution Dipmeter Tool (Schlumberger); MBD = Multibutton Dipmeter; OBDT = Oil-Base Dipmeter Tool (Schlumberger); PSD = Precision Strata Dipmeter; SHDT = Stratigraphic High Resolution Dipmeter Tool (Schlumberger); UBI = Ultrasonic Borehole Imager (Schlumberger).
The data set inventory needs to include the following for each BHI and dipmeter data set:
• well name
• latitude and longitude
• logged interval
• logging run
• tool setup
• reference distances
• inclinometry type
• offsets
• comments on logging run
• repeat or main log
• acquisition tape name
• processing applied
• list of curves and interval spacing
• list of associated logs with curve names
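As a sketch, the inventory fields listed above could be captured in a record such as the following; the field names and types are illustrative assumptions, not a mandated corporate schema.

```python
from dataclasses import dataclass

# One inventory record per BHI/dipmeter data set, mirroring the listed
# fields. All names are hypothetical illustrations.
@dataclass
class BhiInventoryRecord:
    well_name: str
    latitude: float
    longitude: float
    logged_interval: tuple        # (top, base) measured depth
    logging_run: str
    tool_setup: str
    reference_distances: str
    inclinometry_type: str
    offsets: str
    run_comments: str
    repeat_or_main: str           # "main" or "repeat"
    acquisition_tape: str
    processing_applied: str
    curves_and_spacing: dict      # curve mnemonic -> sample interval
    associated_logs: list         # associated logs with curve names
```

A flat record like this is easy to export to a spreadsheet or load into a corporate database once the inventory is complete.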
It should then be verified that the data sets have all the required curves; if this is not the case, data should be sent for repair to a specialist BHI contractor. To get an overview of the status of the database, a subset of the digitally stored data should be selected to perform a few QC checks.
This subset could be selected from areas and reservoirs where subsurface studies requiring BHI are planned, to maximize business impact.
Following the example shown in Figure 1, a subset
of 30 dipmeter and BHI logs from various vintages,
fields, and reservoirs was chosen. Out of the digitally
stored logs, 70% were of very good to medium quality (i.e., they met the quality requirements discussed
in this chapter); however, some data sets were incomplete or the data were partially damaged. We found that many data sets could easily be repaired and upgraded in a cost-efficient manner using data from original tapes, digitizing data from field prints, or splicing in data from repeat sections. The remaining 30% of the
subset was found to be unusable, mainly because some
essential curves such as orientation curves were missing
from the database and from the tape, and it was impossible to retrieve them from another data source.
Quality Control
During the data management routine, QC is first
applied to all newly acquired data sets as soon as they
arrive from the logging contractor and to all legacy
data before they are used in subsurface studies. The
QC procedure includes checking the presence of all
required curves, which will have tool-specific mnemonics, as well as conducting a QC plot and creating
a QC report. The procedures are described in more
detail below.
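The required-curve check can be automated along these lines. The tool names and mnemonic sets below are illustrative assumptions for two common tools, not authoritative tool specifications; each organization would populate them from its own curve dictionary.

```python
# Hypothetical required-mnemonic dictionary per tool type; real dictionaries
# are tool- and contractor-specific and must be maintained as tools evolve.
REQUIRED = {
    "FMI": {"P1AZ", "DEVI", "HAZI", "C1", "C2"},
    "UBI": {"DEVI", "HAZI", "AMP", "TT"},
}

def missing_curves(tool: str, present: set) -> set:
    """Return required mnemonics absent from the delivered curve set."""
    return REQUIRED.get(tool, set()) - present
```

A newly delivered data set would pass this first QC gate only when `missing_curves` returns an empty set for its tool type.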
This is probably the most crucial data management step, where data managers and end users, i.e., geoscientists and petrophysicists, need to find a technically viable solution to organize the database structure in the context of data acquisition, data management, QC, and data accessibility.
A data workflow applicable to dipmeter and BHI
data is illustrated in Figure 2. This workflow encompasses all stages, starting from designing the logging
program followed by data acquisition, data management and data QC, database structure, and data
processing and interpretation until data are exported
to the relevant subsurface applications to build geological and geomechanical models. This particular
workflow contains some assumptions regarding
whether data are processed in-house or externally, a
practice that may vary in different companies.
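As a rough sketch, the workflow stages of Figure 2 could be tracked per logging run as an ordered checklist. The stage names paraphrase the text and the helper function is hypothetical, not part of any actual workflow software.

```python
# Ordered workflow stages, paraphrased from the text and Figure 2.
WORKFLOW = [
    "design logging program",
    "data acquisition",
    "data management and QC",
    "load to database",
    "data processing",
    "data interpretation",
    "export to subsurface applications",
]

def next_stage(completed: list) -> str:
    """Return the first workflow stage not yet completed for a logging run."""
    for stage in WORKFLOW:
        if stage not in completed:
            return stage
    return "done"
```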
Figure 2. Database workflow applicable to dipmeter and borehole image (BHI) data. QC = quality control; Hdr = header;
R/W = Read/Write.
7) Data processing.
8) Data interpretation.
9) Once the study work has been finalized, it is important that the results are fed back to the CDB, ensuring that a track record exists for each curve showing all the processing steps applied to it.
10) Interpretation results are transferred to specific subsurface applications to build geological and geomechanical models.
Figure 3. Dipmeter and BHI Web page used to search and visualize the data sets. BHI = borehole image; QC = quality control; GEOCAP = Geological Computing Applications Portfolio (developed by Shell).
interpretation quality. A simple QC check methodology has been developed and includes the following
three steps:
• verify that data are on-depth with the well reference gamma ray or resistivity master log
• for on-pad devices, identify areas of tool sticking (look at the vertical accelerometer and tension curves) and irregular pad readings, curve character, or dead buttons
• verify the caliper reading inside casing
• verify data versus expected lithologies
• identify sections of excessive tool rotation (look at the relative bearing curve), i.e., tool rotation occurring within a 30-ft (9-m) interval
• verify data repeatability from different acquisition runs (main and repeat)
• for on-pad devices, assess whether pad pressure was satisfactory (look at the pad pressure curve)
• identify mud cake buildup and sections of poor hole conditions and assess the effect on data quality; mud cake buildup in excess of 0.5 in. (1.3 cm) is likely to affect the image and raw curve quality
• ensure that the magnetic and gravitational field magnitudes are reading correctly
• verify that data are oriented to true north (if not corrected during processing); data might also be oriented to the high side or low side of the tool frame
• QC tool orientation using an independent well deviation survey (look at the deviation and hole azimuth curves)
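Two of the checks above lend themselves to simple automated flags, sketched below. The bit-size input, the per-side mud cake convention, and the casing tolerance are assumptions; only the 0.5-in. threshold comes from the text.

```python
def mudcake_flag(bit_size_in: float, caliper_in: float,
                 threshold_in: float = 0.5) -> bool:
    """Flag mud cake buildup exceeding the threshold (per side, inches).

    Assumes buildup is half the difference between bit size and caliper;
    the 0.5-in. default follows the QC guideline in the text.
    """
    buildup = (bit_size_in - caliper_in) / 2.0
    return buildup > threshold_in

def casing_caliper_ok(caliper_in: float, casing_id_in: float,
                      tol_in: float = 0.25) -> bool:
    """Inside casing the caliper should read the casing inner diameter.

    The tolerance is an illustrative assumption, not a published spec.
    """
    return abs(caliper_in - casing_id_in) <= tol_in
```

Flags like these can be plotted alongside the image to direct the interpreter's attention to suspect intervals.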
Figure 5. The BHI and dipmeter data quality control (QC) applied to an Oil-Base MicroImager (OBMI, Schlumberger) data set; note that some of the tool mnemonics shown in this diagram will vary depending on the kind of tool. ANOR = acceleration computed norm; FNOR = magnetic field intensity computed norm; DEVI = deviation; HAZI = hole azimuth; HAZI_ORI = calculated orientation of the hole azimuth; GR = gamma ray; BHI = borehole image; OBDT = Oil-Base Dipmeter Tool (Schlumberger).
• Geological formation-related artifacts: For example, halo effects around a highly conductive pyrite nodule or fracture aureoles, mottling caused by the presence of gas, or even a strong image character change if data were acquired across the hydrocarbon-water contact.
• Acquisition artifacts: Some of these relate to drilling operations (e.g., stabilizer grooving or a sidetrack window), whereas others relate to the logging operations themselves (e.g., mud smear, tool sticking, or signal loss).
• Borehole wall artifacts: These are very common and result from physical irregularities in the borehole wall, such as rugosity, washouts, mud cake, spiral hole, and even breakouts.
• Processing artifacts: These are caused during processing, but unlike acquisition artifacts, which permanently impact data sets, these can sometimes be corrected by more detailed processing. The causes of these artifacts include choosing the wrong borehole diameter, from which incorrect dip values would have been calculated (unless the software uses caliper measurements directly), incorrect speed correction, mismatch between pads and flaps, or inappropriate normalization windows.
ACKNOWLEDGMENTS
The authors of this article are grateful for constructive
comments by Christine McKay (Maersk Oil North Sea UK
Ltd.), Stuart Buck (Task Geoscience), Heike Delius (Task
Geoscience), and Michael Poppelreiter (Shell).
REFERENCES CITED
García-Carballido, C., 2002, Borehole image and dipmeter tools: Procedures and guidelines: Shell Internal Library, Shell Internal Publication, Expro Report ER02005, p. 1–40.
Lofts, J. C., and L. T. Bourke, 1999, The recognition of artifacts from acoustic and resistivity borehole devices, in M. A. Lovell, G. Williamson, and P. K. Harvey, eds., Borehole imaging: Applications and case histories: Geological Society Special Publication 159, p. 59–76.