268–280; 12 FIGURES
Daniel E. Shier¹
ABSTRACT
Curve normalization identifies and removes systematic errors from well log data so that reliable results may be obtained for reservoir evaluation, solving difficult correlation and seismic modeling problems. It is especially critical for any work involving batch-mode computer processing.

The normalization equation is a function of four variables, two of which are defined for each well and two of which are related to regional lithologic patterns. Well-to-well comparisons are made using histograms, crossplots, depth plots, and statistical measurements.

Prior publications on normalization have been individual case studies. This paper describes methods that can be applied generally, and provides guidelines for their use in a variety of rock suites. Also discussed are the errors that are expected for the various curve types, and suggested methods for correcting them.

Factors to be considered in planning a normalization project include the rock types and compaction patterns in the study area, hole rugosity, curve types, and the stratigraphic level at which run changes take place. Guidelines are provided to avoid the introduction of additional inaccuracies. Even with these caveats, an irreducible random error will remain in the data.
Manuscript received by the Editor January 29, 1999, revised manuscript received January 18, 2004.
¹Consultant, Golden, CO, USA.
©2004 Society of Petrophysicists and Well Log Analysts. All rights reserved.
…ited above an unconformity may be 10 API units higher than in those below. Precise integration of the data from a 3D seismic survey and from scattered wells requires log data that are free of the effects of differences in mud programs and other drilling contractor procedures. A poor tie will erode confidence in the prospect.

Normalization may be performed whether or not the data have been borehole corrected. When compensated porosity curves are available, it is best to normalize the data without any additional borehole-correction procedures. This is especially true for neutron curves that usually require a major effort to back out the original corrections (if it can be determined which ones had been made) and then apply the corrections again. For porosity curves in general, the improvement in results from additional borehole-correction work is usually not worth the effort. In datasets with rugose holes, borehole correction of the gamma ray curve prior to normalization can often be done with little effort by assuming some average mud weight and batch-mode processing the data. This procedure is recommended even if the mix of centered and eccentered tools cannot always be clarified.

Only a small number of papers have been published on well log normalization. Previous workers have established the basic tools of well log normalization and published individual case studies. Neinast and Knox (1973) published the first study on well log normalization. This was followed by three other well-documented case studies: Patchett and Coalson (1979), Lang (1980), and Reimer (1985). These authors established histograms and crossplots as standard techniques for comparing curve responses between wells. Doveton and Bornemann (1981) introduced the use of trend surfaces for regional normalization.

The goal of this paper is to place well log normalization in a broader framework. This is an attempt to codify knowledge gained in interpreting approximately 200 different datasets from diverse rock suites from many different North American basins. It is not a case study, but rather a distillation of 24 years of experience with well log normalization.

Any well log curve may be considered as the sum of a signal (actual rock properties), random noise (including random variations in the count rates for the nuclear tools), and systematic errors. Well log normalization is the process of eliminating systematic errors from well log curves.

This paper outlines the general principles of well log normalization and defines the normalization equation. It also includes suggested approaches for normalization when the rocks studied are inconsistent or otherwise difficult to deal with. In addition, this work describes the types of corrections that are generally required for the different curves and the percentage of curves of each type that are likely to require correction in the normalization process. Finally, there are suggested new steps to add accuracy to porosity interpretations from the Gamma Ray Neutron Tool and similar devices (GNT-type neutron curves). These curves are all scaled in counts per second or equivalent units.
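The decomposition described above, a log curve as the sum of signal, random noise, and systematic error, can be illustrated with a small synthetic sketch. Everything here is invented for illustration (the sinusoidal "formation response," the noise level, and the gain/offset values are arbitrary assumptions, not calibration figures from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
depth = np.arange(0.0, 100.0, 0.5)          # depth samples, ft (hypothetical interval)

signal = 60 + 40 * np.sin(depth / 8.0)      # "true" formation response, API units
noise = rng.normal(0.0, 2.0, depth.size)    # random error, e.g. nuclear counting statistics

# A systematic error distorts every sample the same way; here a gain and
# offset miscalibration -- exactly the kind of error that linear
# normalization is designed to remove. Random noise, by contrast, cannot
# be removed by normalization.
gain, offset = 1.15, 8.0
logged = gain * signal + offset + noise     # the curve as actually recorded
```

The normalization task is to recover `gain` and `offset` (implicitly, via lithology picks) and undo them; the `noise` term is the irreducible random error the abstract refers to.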
Each of the common log curves has unique characteris-
tics that affect how to best handle it in the normalization
process. A curve-by-curve discussion of these characteris-
tics and suggested normalization guidelines are provided
for selected, commonly available logs. Also included are a
few broad principles to avoid the erroneous alteration or
removal of anomalous, but real, variations in log curve
responses, reflective of real differences in rocks and pore
fluids.
NORMALIZATION EQUATION
Figure 1 illustrates the normalization problem. Suppose that stratigraphic studies predict that both a clean limestone reservoir and a regionally distributed marine shale vary little from well to well in a field study area. Suppose also that it has been determined that one of the wells has a "correct" gamma ray response in both formations (for example, by using the techniques discussed in later sections of this paper). That well may be designated the type well. In Well A, the difference in the gamma ray response, measured in API units, between the limestone and shale is the same as in the type well. However, in Well A, all of the gamma ray values are too high by a constant amount. Well A can be normalized by simply shifting the entire gamma ray curve by some constant number of API units. Well B has the predicted response in the limestone but has a shale response that is higher than that in the type well. The gamma ray curve in Well B can be adjusted using a new scaling factor, in addition to shifting, so that it will match the type well.

FIG. 1 Gamma ray curve comparisons. The clean limestone and the marine shale are regional in extent and should not vary much in these three nearby wells. However, the three wells have significantly different gamma ray responses. Well A can be adjusted to the same pattern as the type well by shifting the entire gamma ray curve. Well B has a value of 12 API units in the regional clean limestone just like the type well, but it has a very different value in the regional marine shale. To adjust Well B to the pattern of the type well, the scaling factor for the curve must be changed.

The normalization equation makes linear adjustments to log curves according to standard parameters picked in the type well, shifting the curve across the scale and/or changing the scaling factor in one operation. While it is theoretically possible that nonlinear corrections are needed in some cases, as a practical matter, such errors tend to be so small as to be inseparable from random errors that cannot be addressed by the normalization process.

In each well, the equation requires measurement of the value of two specific lithologies using the uncorrected data. These are usually close to the maximum and minimum values of the curve in the interval. For the gamma ray curve shown in Figure 1, these are the clean limestone and the marine shale. The regional value for that same lithology is the best estimate of the correct value for that lithology at that location.

For a log curve whose unnormalized values are designated Vlog, the normalized values of the curve Vnorm are given by

Vnorm = Rmin + (Rmax - Rmin)(Vlog - Wmin) / (Wmax - Wmin).   (1)

Parameters Wmax and Wmin are measured using the uncorrected data. Parameters Rmin and Rmax are the regional best estimates of the correct value for the two lithologies at that location, whether they are constant values or are taken from trend surfaces (Doveton and Bornemann, 1981).

These parameters are illustrated by the curves in Figure 2. Choice of the regional maximum and minimum parameters is shown in the type well, while the picks for maximum and minimum values in Wells A and B are also shown. Figure 3 shows the effects of the normalization as related to the original scale.

More accurate normalization results are achieved if the rock types used for standardization are picked to maximize the difference between Rmax and Rmin. This ensures that values are interpolated accurately over the entire range of values in the wells to be normalized. A choice of Rmax and Rmin such that their difference is small, for example three porosity units, is unsuitable because small errors would be greatly exaggerated when extrapolated to higher and lower log values outside the range of Rmax and Rmin. Unfortunately, real rocks often prove uncooperative. Coal, for example, should be an excellent low-density normalization rock, but is highly vulnerable to hole enlargement and can also harbor significant impurities. Similarly, shale facies are prone to hole enlargement, borehole breakout, and other forms of damage, depending on the mud used.

FIG. 2 Normalization parameters for the same curves shown in Figure 1.

FIG. 3 Changes in the gamma ray scale after normalization for Wells A and B in Figures 1 and 2.
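Equation (1) is straightforward to apply in code. The sketch below is a minimal implementation; the numeric picks in the example are hypothetical, chosen to mimic Well A of Figure 1 (a pure 12-API shift, so the scaling factor is unchanged):

```python
def normalize(v_log, w_min, w_max, r_min, r_max):
    """Linear normalization of a log curve value (Equation 1).

    w_min, w_max -- picks for the two reference lithologies in this well,
                    read from the uncorrected curve.
    r_min, r_max -- regional best estimates of the correct values for the
                    same two lithologies at this location.
    """
    return r_min + (r_max - r_min) * (v_log - w_min) / (w_max - w_min)

# Hypothetical gamma ray picks: clean limestone (min) and marine shale (max).
# This well reads 12 API units too high everywhere, so both picks are
# offset by 12 but their difference (the scaling factor) is unchanged.
print(normalize(52.0, w_min=32.0, w_max=122.0, r_min=20.0, r_max=110.0))  # -> 40.0
```

Because the correction is linear in Vlog, it can be applied to an entire digitized curve sample by sample (or vectorized over a numpy array) once the four parameters are fixed for a given well and logging run.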
DATA DISPLAY TOOLS FOR COMPARING LOG RESPONSE

Histograms and crossplots for a specified formation are useful because they allow data from different wells to be compared efficiently using overlay methods. Depth plots need to be at a much more compressed scale than is available on blueline copies.

Frequency plots of the log values for an interval are the easiest display to use. Although good bimodal histograms (Figure 4a) are occasionally encountered, typical histograms vaguely resemble a statistical normal distribution (Figure 4b). In this case, the low-side shoulder may be used as the minimum and the high-side shoulder may be used as the maximum. In a polymodal case, several distinct modes are present (Figure 4c). If the lithology representing one of several modes is poorly represented in some of the wells, it may result in confusion and incorrect normalization when other display types are not used to check the results. If histograms like those in Figure 4b or 4c are the only means used to compare curves, good results require that the same lithologies be present in the same proportions.

FIG. 4 Some typical gamma ray histogram patterns. Pattern a is generated from beds of two distinct rock types. Min and Max values are best picked on the peaks. Pattern b shows an overlapping mix of rock types. Min and Max values are best picked on the shoulders. Pattern c, with a number of distinct peaks, is seldom seen in actual stratigraphic sequences.

Crossplots

A more or less linear trend on a crossplot typically represents various mixtures of two lithologic components. In a pure limestone, a linear trend on a neutron/density crossplot represents various mixtures of calcite and fluid-filled pore space. In shales, a linear trend on the neutron/gamma ray crossplot may represent a mixture of clay and silt.

If the type well includes all proportions of the lithologic components and a linear trend is established, then points from a well with only a limited lithologic mixture will fall on the trend when normalized. However, crossplot users need to be aware that incorrectly normalized data can be made to fit the same linear trend as the type well if offsetting errors are made on both of the crossplotted curves. If normalization is done on the basis of nonreservoir intervals, as a final check it is often best to make normalized crossplots of the reservoir interval itself to make sure that each well has reasonable porosities.

Depth plots at 200 ft per inch (2400:1) to 1000 ft per inch (12000:1) are useful in a number of situations. Consider, for example, a stratigraphic sequence in which a few somewhat inconsistent thin beds of tight limestone or other lithology provide the best maximum or minimum. Using a vertically compressed plot allows the simultaneous assessment of whether a given bed has the desired "typical" development, is thick enough to provide an accurate value, and has no hole enlargement issues. Without changing the display, other maximums and minimums for other formations can be recorded as well. The effects of data gaps that may make histograms and crossplots highly misleading can also be ignored. Compressed vertical plots can help determine exactly where a run change takes place for a given curve and record maximums and minimums for both runs. After completing normalization using histograms and crossplots, it is useful to quickly check the same data as vertically compressed plots on a video display to identify unrecognized run changes or other problems that were not evident on the histograms and crossplots.

ESTABLISHING CORRECT CURVE VALUES

Normalization requires that at any location, the "correct" values of the maximum and minimum lithologic end members be known. Defining these correct values is the most critical part of any study; it may also be the most time-consuming.

The type well for a particular curve is the standard to which the other wells must be adjusted. Generally, comparisons are made in one or more specific formations. A type well can have either local or regional application. Even when lateral stratigraphic change is not observed across an area, there will usually be some. Thus, it is good practice to select a type well near the geographic center of the study area.

Under certain conditions, stratigraphy may be only a minor source of regional variation as compared with sediment compaction. The Lower Miocene of the Gulf Coast is a good example. Rocks presently buried at 6000 ft [1830 m] have the same porosity log properties for long distances, while the very same lithologies at a burial depth of 10,000 ft
[3050 m] have very different properties. Instead of a type well, a type compaction curve is utilized for sediments with normal pore pressures. Figure 5 illustrates this normalization process. Damage to the shales during the drilling process and varied regimes of abnormal formation pressure add extra challenge to normalization in the Gulf Coast Tertiary sequences.

METHODS

Statistical normalization

All references cited in this paper assume that there is a "correct" value for each part of each curve and that the correct value is the same as the value from appropriate measurements from core samples or other data sources. This is different from normalization as defined by statisticians. As practiced in the petroleum industry, "statistical normalization" uses statistical measures of curve data in the normalization of well log values. For example, a log curve (e.g., the gamma ray) may be adjusted so it has the same average and standard deviation in a correlated interval (e.g., the Fort Union formation) in all wells. Another statistical method is to adjust the curve so that the 10th percentile and the 90th percentile are identical in each well in a correlated interval.

Statistical normalization is based on the assumption that the same rocks are present in the same percentages over the study area. Practical difficulties include real changes in rock compaction, incomplete penetration in some wells, and null intervals.

Tampering with the standard deviation of the data from a well ignores the fact that in some wells a greater spread of the data may be due simply to factors such as more noise, severe hole enlargement effects in that well, and different thin-bed response by different tools. Brute-force statistical warping of the data may result in entirely incorrect curve scaling.

…may be made to the dispersion of the data of individual wells using calculated standard deviations.

In reservoir studies for unitization, an equity holder may be especially attached to certain incorrect porosity values that provide a financial advantage. This method remains popular because it is the method least likely to change those values.

There are several flaws in this method. Nothing is done to test the implicit assumption that all stratigraphic variability is effectively random. If there is a genuine trend, it will be removed in the normalization process. The acceptable deviation of the mean for any one well from the all-well mean is selected arbitrarily.

Type-well method

A representative sample of the wells is examined and one of the wells is selected as the type well for that particular curve. As part of that process all other curves are examined. In general, a good type well has values that are the same as those in many other wells and has an in-gauge hole.

The data for each of the individual wells are then compared with the type well. If adequate software is available, the person making the comparison will be able to compare histogram patterns, crossplot patterns, and vertically compressed depth plots simultaneously. Variables for the basic normalization equation are adjusted until a good fit with the type well is obtained. Many wells may be simply checked but not adjusted at all.
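The two statistical adjustments described in this section, matching the mean and standard deviation of a correlated interval, or matching its 10th and 90th percentiles, can be sketched as follows. This is an illustration only (numpy-based, with the interval data assumed to be already extracted as arrays); the section's caveats about why such brute-force warping can go wrong all still apply:

```python
import numpy as np

def match_mean_std(curve, type_curve):
    """Rescale an interval so it has the type well's mean and standard deviation."""
    return type_curve.mean() + (curve - curve.mean()) * (type_curve.std() / curve.std())

def match_percentiles(curve, type_curve, lo=10, hi=90):
    """Rescale an interval so its lo/hi percentiles match the type well's."""
    w_min, w_max = np.percentile(curve, [lo, hi])   # picks supplied by statistics
    r_min, r_max = np.percentile(type_curve, [lo, hi])
    return r_min + (r_max - r_min) * (curve - w_min) / (w_max - w_min)
```

Note that the percentile variant is simply Equation (1) with its picks supplied by statistics rather than by geology, which is precisely why it fails when the wells do not contain the same rocks in the same proportions.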
This is an improvement over the big histogram method. Some petrophysical judgement has been used to explain and correct anomalous wells. The type-well method works well in settings where there is little distance or geological change between wells in the dataset. Some "noise" has been removed from the data but more remains than if the trial normalization method (below) had been used. Nothing is learned about data variability or any directional properties that they might have.

Neighbor comparisons

In this method, a geologist designates clusters of wells in the area. A cluster of wells might penetrate the formation much deeper and have different compaction than the other clusters. Within a cluster, the formation might have a rather different mix of rock types.

Normalization is accomplished by selecting a type well for each cluster and applying the type-well method to each cluster individually. Selection of a good type well for a porosity curve is usually straightforward, as density, compensated neutron, and sonic curves, as they appear on the bluelines, are usually accurate in more than 75% of the wells.

In the tear fault regime of coastal California or Cook Inlet, Alaska, each of the various fault blocks may have a very different compactional history. In such a situation, neighbor comparisons within each fault block are the most accurate normalization method available.
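Cluster-by-cluster application of the type-well method can be sketched as below. All well names, cluster names, and numeric picks here are hypothetical; the sketch assumes the per-well picks (Wmin, Wmax) and per-cluster regional values (Rmin, Rmax) have already been determined by the comparisons described above:

```python
# Hypothetical clusters, e.g. separate fault blocks with different compaction.
clusters = {
    "block_A": {"r_min": 20.0, "r_max": 110.0},   # regional values from type well W-101
    "block_B": {"r_min": 25.0, "r_max": 95.0},    # regional values from type well W-207
}
# Per-well picks read from the uncorrected curves (hypothetical values).
picks = {
    "W-101": (20.0, 110.0), "W-102": (32.0, 122.0),
    "W-207": (25.0, 95.0),  "W-208": (21.0, 99.0),
}
well_cluster = {"W-101": "block_A", "W-102": "block_A",
                "W-207": "block_B", "W-208": "block_B"}

def normalize_sample(well, v_log):
    """Equation (1), with R parameters taken from the well's own cluster."""
    w_min, w_max = picks[well]
    c = clusters[well_cluster[well]]
    return c["r_min"] + (c["r_max"] - c["r_min"]) * (v_log - w_min) / (w_max - w_min)
```

A type well maps onto itself unchanged, and every other well in its cluster is pulled toward that cluster's regional values rather than toward a single basin-wide standard.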
FIG. 7 Bubble map of residuals from a trend surface. Residuals appear to be randomly distributed, indicating that the trend surface is a valid indication of the regional pattern.

FIG. 8 Bubble map of residuals from a trend surface. Residuals are clumped. Clumps may reflect different vintages of data, structural patterns, or other factors.
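The residual-map diagnostics of Figures 7 and 8 can be sketched as follows: fit a first-order (planar) trend surface to the per-well picks as a function of map coordinates, then inspect the residuals for random versus clumped patterns. The well locations, trend coefficients, and noise level below are synthetic assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical well locations (map units) and per-well Wmax picks,
# e.g. shale gamma ray values that drift regionally.
x = rng.uniform(0.0, 10.0, 30)
y = rng.uniform(0.0, 10.0, 30)
picks = 100.0 + 1.5 * x - 0.8 * y + rng.normal(0.0, 2.0, 30)  # trend + scatter

# First-order trend surface: picks ~ a + b*x + c*y, fit by least squares.
A = np.column_stack([np.ones_like(x), x, y])
coef, *_ = np.linalg.lstsq(A, picks, rcond=None)
residuals = picks - A @ coef

# Randomly scattered residuals (as in Figure 7) support normalizing against
# the trend surface; clumped residuals (as in Figure 8) point to tool
# vintage, contractor, or structural effects needing separate treatment.
```

A second-order trend simply adds x², xy, and y² columns to the design matrix; posting `residuals` at the (x, y) well locations reproduces the bubble maps.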
…this brief exercise is then used to confirm the preliminary choice of the type well, or to select a different well from the set of 10 to 20 wells. The Wmax and Wmin picks with reference to the final type well are determined and mapped for all wells, either as contours or as a bubble map. Figure 6 shows a nonrandom pattern. Doveton and Bornemann (1981) dealt with this sort of pattern by constructing a trend surface that reflected regional geological variations and normalizing the data with respect to that trend surface. Figure 7 is a random pattern of trend-surface residuals, suggesting that the trend surface removes most of the systematic variation in the data. Figure 8 is a clumped pattern that may be the result of historical patterns of tool vintage or logging contractors.

The various maps will generally delineate the issues to be faced in normalization work. Can clumped values be explained by different tool vintages or logging contractors? Can clumped values or regional trends be explained by structure maps related to compaction? Are clumped values indicating that it would be better to use the type-well method? If a trend surface is to be used, is it better to use a first-order or a second-order trend? After these questions are answered, the trial normalization parameters are discarded and the final normalization parameters determined.

Results of the trial normalization method are more reliable than other methods because the additional dimensions of stratigraphy and instrument type have been considered. Because there are such large variations in gamma ray response with regard to tool vintage and contractor, and because the gamma ray is fairly insensitive to rock compaction, trial normalization is generally used for gamma ray normalization over large areas regardless of the methods used for the porosity curves.

POROSITY CURVES

Approximately 20% of the compensated neutron and density curves from the 1970s have errors of two porosity units or more. Uncompensated sonic logs from the 1950s have a similar error rate. Fewer than 5% of the compensated sonic logs have noticeable tool-related errors. Wells logged in the 1990s have significantly fewer inaccuracies than those logged in the 1970s.

In ideal reservoirs like those of the Gulf Coast Pleistocene (32% porosity and low connate water resistivity), the effect of a change of two porosity units may not significantly impact economic decisions. In a producing province with low porosities and high water resistivities, such as a typical Rocky Mountain gas reservoir, a difference of two porosity units in a shale-free sandstone may change the calculated water saturation by 10% and the hydrocarbon pore volume by about 20%. These errors affect not only reserve estimates but also completion decisions.

When picking a type well for a porosity curve, an additional constraint is required. The curves from the type well must have the same lithology response in the reservoir rock type as that built into the software routine that will be used to calculate crossplot porosities. In normalizing a dataset, the analyst may use a different lithology response for different vendors and tool models, or may simply select one of the responses for the final batch-mode work and "normalize out" the vendor and tool differences. This latter procedure requires significantly less effort and leaves far fewer opportunities for errors in keeping track of details. However, some instrument types have a dolomite porosity response that is significantly nonlinear with respect to limestone porosity; others do not. Therefore, in dolomitic sequences, it may be unwise to combine certain types of neutron curves.

Sonic curves

The best reference lithology for sonic curves is a regionally distributed rock that is resistant to hole damage or enlargement during drilling. In thick clastic sequences, this is usually a hard, silty shale. In carbonate sequences, it is a tight limestone, tight dolomite, or an anhydrite.

Apparently, even the earliest sonic tools had highly accurate clocks. Hence, well-to-well differences between Wmax and Wmin are rare. Such differences, when observed, are usually due to incorrect scaling of logs in the field or in the digitizing of blueline prints. As a result, normalization of sonic curves is a matter of shifting the curve without changing the scaling factor. As many as 40% of early uncompensated sonic curves require shifting by 2 to 5 μs/ft in the normalization process. Any errors in compensated sonic curves are usually small when compared to shale damage effects, compaction effects, and editing problems such as cycle skips and noise spikes. Frequently these effects masquerade as normalization problems. As a rule of thumb, if more than 5% of compensated sonic curves appear to need adjustment in the normalization process, the normalization is probably in error. These errors can be accounted for and eliminated by considering other sources such as compaction and shale damage.

In Gulf Coast Tertiary sequences, compensated sonic curves usually require no changes in the normalization process to obtain valid values for formation evaluation in the sandstones. However, shales are of equal importance in the creation of accurate synthetic seismic curves. In most wells, the sonic curve shows significant formation damage (sonic values that are too high) in the associated shales. The amount of formation damage is directly related to the silt/clay ratio in a shale bed, with formation damage
increasing with clay content. Using the procedures described here, it may be possible to define a Wmax and Wmin value for each logging run in each well that will change the scaling factor of the sonic curve and produce a fairly good result for making synthetic seismic traces. The normalized curve generated in this way should be designated by a special mnemonic since it may not be entirely suitable for formation evaluation in the shaly sandstones.

Neutron curves

In carbonate/shale sequences, the best minimum lithologies are anhydrites and tight carbonates. Shales that are not particularly subject to hole enlargement are the best maximum lithologies.

In purely clastic sequences, there are no ideal neutron maximum and minimum lithologies. Minimums are highly problematic. Only in exceptional circumstances are water-wet sandstones consistent enough to use for the maximum. Maximum porosity shales usually have varying degrees of hole enlargement. Purely statistical methods or methods that use only histogram displays will almost invariably produce poor results. Some methods for addressing these problems are given in the Guidelines section below.

The basic data for any porosity interpretation from a neutron log are counts/second. Porosity is more or less inversely proportional to the log of the counts/second, although in practice the logging contractor may use dual detectors and more complex mathematical variants of this relationship. Tool calibration relates a particular count rate to a particular porosity. As a consequence of the logarithmic relationship, a small error in assigning the count rate for the high porosities will produce a large error. On the other hand, a small or moderate error in assigning the count rate for the low porosities will result in only a small error in porosity. As a result, substantial inaccuracies in neutron porosities are more often found in rocks with higher porosities than in those with lower porosities.

In normalization work, there are two practical effects of these relationships. First, nearly all neutron normalization involves changing the scaling factor, since (Wmax – Wmin) varies a good deal in wells requiring normalization. Second, if there is an obvious problem requiring changes to the high-end porosities but there is no good low-end porosity rock available, reduction of the noise level is usually better served by assuming that zero porosity is correct. This produces better results than shifting the whole curve so the high-porosity end fits.

Neutron curves often have the same apparent porosity in a number of entirely different rock types. For this reason, using crossplots in the normalization process is more important than for any of the other curves. In formations that include shales, neutron/gamma ray and neutron/sonic crossplots are especially useful.

Neutron curves scaled in porosity units

Generally 15% to 25% of 1970s-vintage compensated neutron curves require adjustment in the normalization process. Stated another way, 75% to 85% of compensated neutron curves give reasonable values without any borehole-correction procedures beyond those done before the blueline log was created. If possible, sidewall neutron curves should be normalized separately from compensated neutron curves. These two main types of neutron curves should be handled separately in any batch-mode calculations.

Neutron curves scaled in counts/second

These GNT-type logs include standardized versions of counts/second such as API units, Environmental Units, MicroRoentgens, and Radium microgram equivalents/ton. Schlumberger (1969) provided charts for GNT tools that indicate that the logarithm of the counts/second is slightly nonlinear with respect to porosity. The exact function by which porosity departs from linearity depends on the spacing of the neutron device, hole size, temperature, water salinity, and dolomite content. In the 5% to 20% porosity range, departure from linearity varies from 0.5 to 2.0 porosity units. As a source of "noise," this is minor compared with other factors.

Typical sets of GNT-type neutron curves represent four to eight different logging companies, the majority of which are small companies for which no charts are readily available. Header and borehole-size information are frequently inadequate to use a chart, if one even exists. As a practical matter, the small departure from linearity of the logarithm of the counts/second with respect to porosity can be ignored, simplifying the whole process of removing much of the rest of the noise from the dataset. This is essentially the same procedure as using semilog paper recommended by Hilchie (1979) and others.

For datasets that are entirely digital, it is best to calculate a curve of the logarithm of the counts/second curve and use it for all further normalization and porosity interpretation work. Inspection and comparison of the logarithm curve between wells easily reveal casing shoes in wells completed open hole, changes of fluid in the hole, incorrect zero registry of the curve on the log copy, and other problems that have to be dealt with on a well-by-well basis. When the GNT-type tool passed from an environment that was poorly shielded from the borehole walls to a more shielded environment (i.e., from air to mud, or from uncased hole to cased hole), it resulted in a change in scaling factor but no change in the zero point. As a result, these changes can be
dealt with by simply shifting the portion of the logarithmic curve below the discontinuity so that a key lithology (such as a tight limestone) lines up with the same lithology above the discontinuity. Shier and Sponable (1997) point out that baselining these curves may also be required to allow for mudcake and other problems not seen in curves from more modern dual-detector tools.

The type-well method is used when shales are present. The gamma ray/neutron crossplot is the definitive data display. First, histograms and/or compressed depth plots are used to normalize the gamma ray curve for all wells. Rmin for the neutron curve is determined using modern logs nearby. Anhydrites are generally assigned a porosity of zero, and tight carbonates a porosity of 1% to 2%. The main task is to determine a neutron porosity for the Rmax that corresponds to an index shale that has a regionally consistent neutron response. Using modern compensated neutron curves from the same area does not work because the more modern neutron tools count only those gamma rays that have energies characteristic of neutron events. The GNT-type neutron tools count gamma rays of all energies. The effect of the gamma rays from the shales or other radioactive rocks on the GNT-type tools may be quite large. For example, in a West Texas study, a Grayburg feldspathic siltstone used for the Rmax had a value of 20% porosity on a compensated neutron curve. This same siltstone would have an apparent neutron porosity of only 8% if the gamma ray/neutron crossplot pattern of the GNT-type neutron curve had been assumed identical to a nearby compensated neutron curve.

If core porosities are available, they can be crossplotted versus the logarithm of the counts/second for that well and a linear trend line established as shown in Figure 9. The point on the trend line that corresponds to the normalized gamma ray for the index shale has a corresponding tentative value Rmax on the porosity scale. If there is another log that gives reliable porosities in the same well, it can be used instead of the core data. Generally, a tentative Rmax is calculated for most or all of the wells that have adequate supporting data, and these are used to establish the final Rmax for all wells.

In the absence of any other source of porosity data in the same wells, a trial Rmax is assigned, and 10 to 20 wells are normalized as if the trial Rmax were correct. The porosity …

…ied cementation, the same procedures may be followed if a reliable Wmin can be picked in each well. Often this is not the case and accurate results cannot be expected.

In dolomitic sections, comparisons between GNT-type neutron curves and modern neutron curves should use modern curves that are more or less linear with respect to porosity. These are the more modern compensated neutron curves and the sidewall neutron curves. Compensated neutron curves from the 1970s and 1980s have a nonlinear response to dolomite porosity changes and should be avoided.

Density curves

Typically 15% to 20% of the compensated density curves require adjustment during the normalization process. The best reference lithology for the density curve is a single pervasive rock type that suffers minimal hole enlargement or damage during drilling. In thick clastic sequences, this is usually a hard, silty shale. In carbonate sequences, it is a tight limestone, tight dolomite, or an anhydrite.

Like the neutron porosity, density logs are based on count-rate measurements and therefore are theoretically subject to the same scaling factor problems as neutron logs. Defining Wmax (low porosity) values is simple if anhydrite or a tight carbonate is present. However, defining a suitable Wmin (high porosity) rock type is usually difficult. The highest apparent density porosities often represent enlarged hole, not some consistent lithology. As a practical matter, it is usually best to simply shift density curves, without
histograms for these wells are compared with those of 10 or
more modern logs from the same area. Of course, there will
be a variety of histogram patterns, but the highest porosity
modern wells should be similar to the highest porosity older
neutron wells. If the patterns compare well, that trial Rmax is FIG. 9 Procedure for estimation of Rmax for GNT-type neutron
curves. Core data are used to establish the regression line for
accepted as the final Rmax determination. If not, the trial Rmax that well. The index shale has been selected for its regionally
is changed, and the results checked again until a good consistent neutron response. The log10 of the counts/second is
match is obtained. entered on the x-axis and projected to the regression line; that
In a section that consists of shale and sandstone with var- point is projected to the y-axis.
changing the scaling factor, making sure the prevailing rock types have the appropriate values in each well. If the scaling factor is to be changed, it must be based on crossplot analysis. Relying on histograms alone as illustrated in Figure 4b is not recommended.

In Gulf Coast Tertiary formations, it is common to encounter density curves that show the effects of shale damage, resulting in an apparent shale porosity that is too high as compared with a nearby type well with minimal shale damage. Wells drilled with oil-base mud have the least shale damage. The procedure shown in Figure 5 is recommended. It is generally best to check the sandstones or hard siltstones against the standard compaction curve and ignore the most clay-rich shales entirely.

Photoelectric effect (PEF) curves

Generally about the same proportion (15% to 20%) of PEF curves require adjustment as do density and neutron curves. The PEF curve is normalized by shifting, without any change of the scaling factor, for the same reasons as the density curve.

GAMMA RAY CURVES

Ideal rock types for gamma ray curve normalization are similar to those for the neutron curve. Anhydrites and clean carbonates provide the best Wmin values. In fluvial and delta plain sequences, it is usually possible to define a channel facies that makes a satisfactory Wmin. Shales are usually used for Wmax. The ideal shale should be sufficiently competent that hole enlargement is not a problem but should be rich enough in clay that ambient gamma ray values are fairly high. The recommendation is to avoid using uranium-enriched shales like the Woodford in the Permian and Anadarko basins for two reasons: First, many of these shales have more local variations than casual observation would suggest. Second, tool response in these beds will depend on which detector was used: the early low-efficiency Geiger-Mueller counter or the later high-efficiency scintillator. For more ideas on dealing with difficult clastic sequences, see the Guidelines section below.

Although there is an industry standard for the gamma ray curve, as a practical matter, it is not possible to establish exactly what this should be based on the usual data at hand. For the gamma ray curve, the "correct" response is one that is consistent with its neighbors and gives a reasonable volume of shale interpretation for the various types of rocks. In other words, the gamma ray scale is relative to the other wells in the project, not absolute. Most gamma ray normalizations require only a scale change. Zero API units is frequently correct, whether or not values in the shale range require adjustment. The increasing inaccuracy of gamma ray values with higher radioactivity is supported by data from the Dakota formation in the southern Powder River basin of Wyoming. Figure 10 shows the unnormalized gamma ray values for the Dakota channel facies in 638 wells. Figure 11 shows the unnormalized gamma ray values for the associated regional marine shales for the same wells.

Of all the instrument responses, the gamma ray curve is the most likely to be normalized using a trend surface because clay minerals and the clay/silt mix in shales vary according to their source rocks, distance from the sediment
source, depositional environment, compaction, and other regional patterns.

Even if some adjustments are minor, it is usually easier to normalize all gamma ray curves except those of the type well(s), than it is to adjust only those wells that need a large amount of adjustment.

Spontaneous potential (SP) curves

SP curves are normalized by conversion to a Volume of Shale curve. This involves editing "by hand" to remove "mechanical shifts," followed by definition of a shale baseline (a maximum) and definition of a clean sand line (a minimum). Care should be taken to allow for hydrocarbon suppression of the SP as much as possible, based on the gamma ray, resistivity, and other curves that reflect the volume of shale in the reservoir.

The raw SP curve (Figure 12) is not helpful. It shows the effects of both run changes and changes in formation water. In Figure 12, a shale baseline was defined for the SP curve, using the caliper and resistivity curves (not shown) as additional references. The straightened SP curve has the shale baseline set to zero. A clean sand baseline was defined for the straightened SP curve, again using the caliper and resistivity curves for reference. This was then used to create the Volume of Shale curve.

FIG. 12 Creation of a Volume of Shale curve from an SP curve.

Cook Inlet of Alaska is a place where SP normalization is vital; volcanic clasts frequently make the sandstones more radioactive than shales, rendering the gamma ray curve nearly useless for distinguishing sand from shale.

Resistivity curves

Resistivity curves are usually handled with standard borehole-correction procedures. They are not normalized at all unless there is some compelling reason to do so. If normalization is necessary, the zero is generally accepted, and only values of Wmax are needed. Unfortunately, a sufficiently consistent Wmax is almost impossible to define in most datasets.

Some of the early resistivity curves have negative values because of incorrect registry of the galvanometer to the film. The solution here is to shift the entire resistivity curve for that run so that the minimum value is compatible with nearby wells. However, this will produce its own inaccuracies if the galvanometer adjustment was not the same for the entire log run but was arbitrarily adjusted at some unknown depth.

GUIDELINES

The purpose of curve normalization is to remove systematic inaccuracies, not to remove genuine lithologic anomalies or to introduce artifacts. These pernicious tendencies can be reduced if certain guidelines are followed.

In many wells, different depth intervals were logged independently at different times. The various logging runs frequently involved different instruments, with different calibration histories, different logging crews, and different service companies. This lack of standardization dictates that each logging run should usually be normalized separately.

Shier (1997) showed that there is a tendency for the size of the normalization corrections to be smaller with modern instruments. This is somewhat balanced by a marked increase in the introduction of new tools with different response characteristics. When first introduced, new tools may experience initial reliability problems. Normalization is needed for both old and new datasets.

Hole rugosity affects log response. Histograms, crossplots, and other data displays that do not allow the user to simultaneously assess the associated caliper curve may be misleading and lead to inaccurate normalizations. An alternative method is to define a bad-hole curve, which may be based on curves in addition to the caliper. The bad-hole
curve can be used as a filter prior to selection of a type well or other normalization tasks.

Reducing the noise but not the signal

Avoid very small adjustments. If it cannot be clearly demonstrated that an adjustment will improve the accuracy of the data, it should not be made. Set a minimum level for adjustments to be made for each of the curve types based on the consistency of the lithologies that are available. An exception is made for the gamma ray curve, the values for which are not related to an absolute standard but are simply relative to the other wells in the study.

Look for map patterns. As an initial test, map the adjustments that appear to be needed. If the polarity and magnitude of the adjustments are more or less random, the adjustments are likely to be legitimate. If there are nonrandom areas, check possible causes. Are they real geologic anomalies? Are they the result of the consistent use of a particular logging contractor in a small area? Are they clumps of older vintage logs and younger vintage logs from successive exploitation episodes?

Check for lithologic trends. Detect regional gradients by making trend surface maps of the unadjusted log data.

Consider other information before finalizing normalization work. Don't pursue the normalization work in isolation from what is known about the geology and production tests. Methods and results need to be discussed with the project team. Any study that covers a large area and does not include the efforts of a geologist/stratigrapher usually contains unacceptable artifacts.

Do not overwork the data. There is little to be gained by attempting to wring the last tiny bit of accuracy from the data by slightly revising the normalization parameters time after time.

Some curves cannot be repaired. Most curves can be converted to reliable data. However, if normalization cannot give the curve the correct response characteristics, the curve should be discarded.

Improve the data if you can. If it is clear that a particular adjustment will reduce the noise level, make the adjustment even if it is also clear that an undesirable amount of noise will remain. Consider the case of a gamma ray curve penetrating mechanically competent 15 API unit limestones and caving 120 API unit shales. There is no caliper curve. Normalizing the data as if they had been borehole corrected will tend to correct for the hole enlargement in the shales in that well.

Cases without optimum lithologies

Study areas with nonideal lithologies are frequently encountered. In these cases, it may be necessary to pursue a "lesser of two evils" strategy. Consider the case where thin shaly, possibly arkosic sands are developed in a section that is 90% shale and there are no rocks in the study wells that have a gamma ray of less than 65 API units. There will be no problem using a gamma ray histogram to establish the maximum gamma ray normalizer from the pervasive shale facies. The problem lies in defining the minimum normalization rock type. Three less-than-perfect approaches are suggested below.

Wmin can be picked from the low-side shoulder of the histogram as shown in Figure 4b. It can be argued that as long as the Wmax and Wmin values that are used form an envelope around the data to be analyzed, there should be no concern about the effect of the normalization parameters on nonexistent values less than 60 API units. This approach is based on the assumptions that each gamma ray curve has the same thin-bed resolution characteristics and that the lowest volume of shale in typical sandstones in each well is the same. If sandstones are actually better developed in one part of the area than in another, use of this procedure by itself will erase the very geologic changes that are most important! This source of error will be reduced if the trial normalization method is used and regional trends are considered.

Zero can be assumed correct. The analyst adopts zero as Rmin and Wmin for each well. As indicated in Figures 10 and 11, the amount of noise in the uncorrected gamma ray values decreases markedly at lower gamma ray values.

A hybrid approach uses the zero assumption initially. Then histograms or crossplots are used to check the results of the preliminary normalization for an anomalous low-side data pattern. For example, check the log headers for potassium chloride muds, look at the local stratigraphy more carefully, and adjust Wmin in anomalous wells. Experience suggests that this hybrid approach is the most efficient way to proceed.

Another way to handle nonideal lithologies is to start with the most inherently reliable curves and work toward the least reliable. For example, in a thick clastic sequence that is only partly consolidated, it may be possible to normalize the gamma ray quite accurately but the neutron response varies greatly. The procedure in this case would be to first normalize the gamma ray and then crossplot the normalized gamma ray versus the unnormalized neutron curve. Using trial and error, a Wmax is found that produces the desired crossplot pattern for each well. Similarly, the compensated sonic curve (which rarely needs adjustment) might be crossplotted against the density curve in a carbonate play.

Software considerations

Effective normalization software has different impera-