
Technologies for the Analysis of Gases and Other Substances Dissolved

in Electrical Insulating Fluids


John Hinshaw, Donal Skelly, Thomas Waters, David Bidwell - Serveron Corporation, Beaverton, OR, USA

Abstract
Analysis of dissolved gases (DGA), water, and simple alcohols in electrical insulating fluids allows
electrical equipment operators to monitor for the presence or absence of a variety of faults as well as to
observe normal or accelerated asset ageing. DGA may be accomplished by direct wet measurement or
by analysis of gases extracted from the insulating fluid. Direct measurement of some target compounds
such as hydrogen or water can be performed by various immersed solid-state sensors. Gas extraction in
combination with chromatographic, spectroscopic, and solid-state sensor measurements in the gas
phase is widely applied for more detailed analyses of multiple gases. Standards promulgated by ASTM,
IEEE, and IEC present various methods of extraction and interpretation of substances dissolved in
insulating oil. This paper reviews the background and present state of DGA methods and technologies.

Introduction
Electrical transformers are well known for their robust construction with relatively few assembly
components, most of which are highly reliable. However, replacement costs upon failure are very high,
and loss of revenue from ensuing failures is also large. The time required to replace these assets can be
lengthy, and the collateral damage can be huge. At a time when the industry is losing its deep subject-matter expertise on transformers and related assets, automation of transformer condition assessment
has become an attractive option.
Even before the notion of condition assessment, it was well understood that transformers were key to
delivering reliable, safe and affordable electricity, and that proper monitoring and maintenance would
be required during their entire lifetime. Initially, the focus was more on operational issues as opposed to
understanding the condition of the asset, and most large transformers would carry a number of sensors
that measured fundamentals such as temperature, load, and voltage, as well as cooling status and so on.
One of the early tests carried out to ascertain transformer condition was dissolved gas analysis (DGA).
DGA was used initially in the late 1920s and was introduced into routine transformer assessment in the
late 1960s by pioneers such as Dr. James Morgan, who worked with Hydro-Québec researchers and with laboratories such as Doble Engineering. DGA has since become a de facto standard as it can provide a
wealth of information about the asset and its possibility of failure.
Today, analysis of insulating fluids is performed using a number of electrical, chemical and physical tests,
many of which are listed in Table 1.

Table 1. Common tests conducted on oil samples. List of applicable standards is not comprehensive.

Most common and routine tests:

Test Type   | Test                             | Standard
Electrical  | Dielectric breakdown voltage     | ASTM D811 or D1816, IEC 61156
Electrical  | Power factor                     | ASTM D924
Electrical  | Resistivity                      | ASTM D1179
Chemical    | Water content                    | ASTM D1533, IEC 61814
Chemical    | Oxidation inhibitor              | ASTM D2668, D4788
Chemical    | Acid content                     | ASTM D974
Chemical    | Polychlorinated biphenyls (PCB)  | ASTM D4059
Chemical    | DGA                              | ASTM D3612, IEC 60567
Physical    | Specific gravity (density)       | ASTM D1298
Physical    | Color                            | ASTM D2129
Physical    | Viscosity                        | ASTM D445, D7042
Physical    | Interfacial tension              | ASTM D971

Even in the United States it is difficult to pinpoint the exact number of large power transformers with
ratings of 115 kV / 100 MVA and higher, but estimates run at about 30,000 units today. Of these, 70% are older than 30 years, and the average age of the largest assets is more than 40 years, hence a
very large number are at or near the end of their design life. Operators of these assets face substantial
replacement costs as these transformers age out. Optimally predicting end-of-life and managing which
will be replaced or refurbished is a complex ongoing task that can be made more effective by the
application of appropriate diagnostic tests, including DGA.
On large transformers, manual oil samples are generally taken every 1-3 years, and more often when abnormalities are present, in some cases as often as weekly. In addition to DGA, other non-chemical
tests such as frequency response analysis (FRA), winding resistance, and core ground testing, for
example, provide additional diagnostic information. When this information is considered as a body and
in light of published interpretive guidelines, higher-quality actionable predictions of transformer health
and likelihood of failure can be obtained. This type of condition-based monitoring (CBM) holds
tremendous potential for better management of the ageing electrical asset population. A recent Cigré
bulletin (1) presents guidelines and experience with CBM.

Table 2. Principal transformer fault gases measured by DGA

Gas                   | Source
Carbon monoxide (CO)  | Thermal breakdown of oil and/or insulation
Carbon dioxide (CO2)  | Thermal breakdown of oil and/or insulation
Hydrogen (H2)         | Low-energy corona discharge
Ethane (C2H6)         | Low-temperature faults
Methane (CH4)         | Low to intermediate temperature faults; low-energy electrical discharges
Ethylene (C2H4)       | Intermediate temperature faults; low to intermediate energy electrical discharges
Acetylene (C2H2)      | High temperature faults and high energy electrical discharges / arcing
Oxygen (O2)           | Indicates condition of seals; decreasing levels may indicate internal oxidative faults

What is DGA?
Mineral oil is the most commonly used transformer dielectric fluid, although this is gradually changing.
Alternative fluids such as silicone and ester oils are increasingly deployed for smaller distribution
transformers as well as those with a higher potential for collateral damage upon failure. Gases are
formed in the oil and the structural members of the transformer by normal and abnormal thermal,
electrical, and chemical conditions. Analysis of the volume, type, proportions, and rates of gas
production yields a substantial amount of diagnostic information about the nature and future trends of
normal ageing and abnormal internal faults. Transformer insulation and oil will degrade faster under
incipient fault conditions, and will do so in ways that are characteristic of the type of fault or faults at
hand. For this reason it is important to test oil samples regularly to ensure that degradation has not
gone too far. The normal, non-faulting degradation rate depends on a number of factors such as
transformer age, water content, operating temperature, amounts and types of contaminants, and the
amount of oxygen in the oil. How regularly these tests should be conducted will depend on whether any
known fault condition exists in the asset, the critical nature and size of the transformer, and whether
any known problems exist with a particular family of transformer designs.
Table 2 lists the principal DGA transformer fault gases. Other gases and
chemicals are formed as well. Propane and propylene are sometimes measured by DGA and provide an
additional level of certainty for some fault conditions. Furans, in particular 2-furaldehyde determined by
liquid chromatography (LC), indicate the condition of cellulosic insulation, as does the presence of
methanol and ethanol, for which a new ASTM GC headspace mass-spectrometric laboratory method is

under development. Nitrogen and oxygen are always present, in amounts related to the type of oil
conservator system: sealed, nitrogen-blanketed, or air-breathing. The amounts of nitrogen and oxygen
can indicate the quality of the transformer seals, as well as provide some information about internal
oxidation processes.
A thorough discussion of the relationships between fault gases and fault types is beyond the scope of
this paper and is in fact still developing through the work of many individuals and organizations.
Several diagnostic tools that utilize gassing levels, ratios, and trends have been in use for many years.
Notably, Michel Duval (2) recently proposed a pentagonal graphic that depicts the relationships
between fault gas formation and type of fault for five key fault gases, as shown in Figure 1.

Figure 1. Duval Pentagon #1. Each plotted data point (red X) is the centroid of the irregular pentagon whose apices lie, along each gas's axis, at that gas's percentage of the sum of the five gas concentrations. In this case the gas concentrations were: hydrogen, 4.9 ppm; ethane, 66.0 ppm; methane, 44.2 ppm; ethylene, 109.6 ppm; acetylene, 0.0 ppm. Small blue pentagon: area of uncertainty for accurately known gas concentrations. Large orange pentagon: area of uncertainty for less accurately known gas concentrations. The red X represents the most recent result; additional earlier points can be discerned behind it.

The well-known original Duval Triangle utilizes ratios of three key gases (methane, ethylene, and acetylene); other related triangles include combinations with hydrogen and ethane. The pentagonal version incorporates all five gases in a single view that also nicely depicts the progression from formation of mostly hydrogen at low energies through ethane, methane, ethylene, and finally acetylene at the highest energy levels. Consideration of the carbon-oxide (CO, CO2) levels along with these graphical
presentations can reveal much about the state of a transformer.
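The plotting rule described in the Figure 1 caption can be made concrete with a short calculation. The sketch below is a minimal Python illustration, not a reference implementation of the method in (2): the apex angles assigned to the five gases and the use of the area-weighted polygon centroid are assumptions that should be checked against Duval's paper before any diagnostic use.

```python
import math

# Assumed apex angles (degrees) for the five gases of Duval Pentagon 1,
# listed counter-clockwise; verify against Duval (2) before relying on them.
APEX_DEG = {"H2": 90, "C2H6": 162, "CH4": 234, "C2H4": 306, "C2H2": 18}

def duval_pentagon_point(ppm):
    """Return the (x, y) centroid of the irregular pentagon whose vertices
    lie at each gas's relative percentage along that gas's axis."""
    total = sum(ppm[g] for g in APEX_DEG)
    if total == 0:
        raise ValueError("no fault gas present")
    verts = []
    for gas, deg in APEX_DEG.items():
        pct = 100.0 * ppm[gas] / total
        theta = math.radians(deg)
        verts.append((pct * math.cos(theta), pct * math.sin(theta)))
    # Area-weighted polygon centroid (shoelace formula); degenerate
    # near-zero-area polygons are not handled in this sketch.
    area = cx = cy = 0.0
    for (x0, y0), (x1, y1) in zip(verts, verts[1:] + verts[:1]):
        cross = x0 * y1 - x1 * y0
        area += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    area *= 0.5
    return cx / (6.0 * area), cy / (6.0 * area)

# Gas concentrations quoted in the Figure 1 caption (ppm).
print(duval_pentagon_point(
    {"H2": 4.9, "C2H6": 66.0, "CH4": 44.2, "C2H4": 109.6, "C2H2": 0.0}))
```

Running it on the concentrations quoted in the Figure 1 caption returns the (x, y) coordinates of the point plotted in the pentagon.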
The uncertainty in the location of a data point in the triangular or pentagonal DGA plots is a function of
the accuracy and repeatability of each gas concentration. Generally, concentrations greater than five
times the minimum detectable amounts, and that are known to be accurate to within ±15% or better after correction for known measurement bias, can provide meaningful interpretive results. Less precise or less accurate values can result in ambiguities in the fault-type zone (T1, T2, T3, and so on) that will make it
more difficult to positively determine appropriate action. Figure 1 shows the effects of accuracy and
precision with two pentagonal areas around the data points. The smaller blue area depicts a higher
degree of accuracy and precision while the larger orange area indicates the effects of comparatively less
accuracy and worse precision.
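As a rough illustration of the screening criteria just described (a concentration of at least five times the minimum detectable amount, known to be accurate to within about 15 percent after bias correction), a simple pre-check such as the following Python sketch can flag readings whose fault-zone placement should be treated with caution. The detection limits shown are placeholders, not values taken from any standard or instrument specification.

```python
def usable_for_diagnosis(ppm, detection_limit_ppm, accuracy_pct):
    """True if a reading meets the rough criteria discussed above:
    at least 5x the minimum detectable amount and an accuracy (after
    bias correction) of 15% or better."""
    return ppm >= 5.0 * detection_limit_ppm and accuracy_pct <= 15.0

# Placeholder detection limits (ppm) -- illustrative only, not from any standard.
limits = {"H2": 2.0, "CH4": 0.5, "C2H6": 0.5, "C2H4": 0.5, "C2H2": 0.5}
reading = {"H2": 4.9, "CH4": 44.2, "C2H6": 66.0, "C2H4": 109.6, "C2H2": 0.0}

for gas, value in reading.items():
    ok = usable_for_diagnosis(value, limits[gas], accuracy_pct=10.0)
    print(f"{gas}: {value} ppm -> {'usable' if ok else 'interpret with caution'}")
```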
The desired maximum amount of uncertainty in the diagnostic result drives requirements for oil
sampling and analysis. The analytical results upon which meaningful interpretation and diagnostic
conclusions can be made therefore must be of sufficient quality to support any proposed actions on that
basis. With frequent acquisition of new test results comes the ability to quickly react to trends that
otherwise would not be apparent with fewer DGA analyses acquired at longer intervals.

DGA in the Laboratory


ASTM International standard D3612 and International Electrotechnical Commission (IEC) standard 60567
specify methods for laboratory dissolved gas analysis in transformer oils. These and related standards
specify ways in which oil is to be sampled from a transformer, transported to the laboratory, and stored
until testing can take place. The proper execution of the sanctioned procedures requires skill, training,
and experience both on the part of the engineer collecting the sample and the laboratory technician
processing it when it reaches the lab, as well as expertise embedded in the laboratory equipment itself and sustained by periodic maintenance.
Standard laboratory DGA technique involves extracting or stripping the gases from the oil followed by
injecting them into a gas chromatograph (GC), which separates the gases in time as the analysis
proceeds. Gas detection involves a flame ionization detector (FID) and a thermal conductivity detector
(TCD). Most systems also employ a catalytic methanizer, which converts the carbon monoxide and
carbon dioxide into methane for more sensitive detection on the FID. Laboratory DGA systems are
calibrated with gaseous standards as well as with oil standards that contain known amounts of the
dissolved gases. Individual dissolved gases generally are detectable at low ppm concentrations.
Figure 2 shows the main steps involved in sampling and analysis for laboratory DGA. An oil sample is
withdrawn from the transformer in accordance with a standard such as ASTM D923 or IEC 60567 and
transported to the laboratory for storage and eventual transfer into a gas-extraction device. After
extraction the gases are transferred to a gas chromatograph for separation and measurement.

[Figure 2 schematic: oil is sampled from the transformer into a syringe or bottle, transported to the laboratory, and transferred from the syringe or bottle to an extraction device (vacuum (A), stripper column (B), or headspace (C)); the extracted gas is injected (manually for method A) through a 1.0 mL sample loop into a gas chromatograph with a carrier gas supply (8.2 psig), a Porapack column (channel A), a molecular sieves column (channel B), and a thermal conductivity detector.]

Figure 2. Steps in laboratory DGA. Manual injection is for method A only; others generally connect directly to the GC inlet.
Flame ionization detector and methanizer not shown. (A), (B), and (C) refer to the ASTM D3612 method designations. The GC
system is an example configuration; many others have been used in service.

Extracting the gas from the oil is one of the more difficult and critical portions of the procedure. Careless
manual transfer of the oil sample can result in the loss of volatile gases such as hydrogen and carbon
monoxide along with suffering the incursion of atmospheric gases such as nitrogen, oxygen, and carbon
dioxide. The original mercury extraction method is still in use, but it has been deprecated by the
standards bodies because of its use of mercury and expensive glassware, and because it is a complicated setup that requires expertise to obtain meaningful results. Instead, mercury-free vacuum extraction
methods and equipment are favored.
Another laboratory DGA standard method utilizes headspace sampling of the dissolved gases. This
technique involves sampling an exact volume of oil into a gas-purged and sealed headspace vial. The gas
in the oil migrates into the headspace above the oil until equilibrium is achieved and the gas
concentrations in the oil and headspace are no longer changing. After a predetermined sample
equilibration time, an autosampler removes a portion of the gas from the headspace vial and injects it

into the GC. The advantage of this method is that it is automated and reduces the risk of operator error
from excessive handling of the sample during preparation and injection.
In both extraction and headspace methods, the concentrations of the fault gases depend on the relative
gas and oil volumes in the vial as well as upon the temperature dependencies of the gas solubilities in
the oil. None of the extraction techniques completely removes all the gases from the oil due to the
differing solubilities of each gas in the oil, although multiple extraction steps can approach this state.
Careful calibration and measurement are required for the best results.
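The mass balance behind the headspace result can be written compactly. The following Python sketch assumes the usual relation in which an Ostwald-type partition coefficient K links the equilibrium oil-phase and gas-phase concentrations; the coefficients and volumes shown are illustrative placeholders rather than values from ASTM D3612 or IEC 60567.

```python
# Placeholder Ostwald-type partition coefficients (C_oil / C_gas at equilibrium).
# Real values depend on the oil type and temperature and must come from
# calibration or reference tables.
K_APPROX = {"H2": 0.05, "CO": 0.12, "CH4": 0.4, "CO2": 1.0, "C2H2": 1.2}

def original_oil_ppm(gas, c_gas_ppm, v_gas_ml, v_oil_ml):
    """Back-calculate the pre-extraction dissolved-gas level (ppm v/v)
    from the measured headspace concentration and the vial volumes:
    C_oil,0 = C_gas * (V_gas + K * V_oil) / V_oil."""
    k = K_APPROX[gas]
    return c_gas_ppm * (v_gas_ml + k * v_oil_ml) / v_oil_ml

# Example: 20 mL vial holding 10 mL of oil and 10 mL of headspace gas.
print(round(original_oil_ppm("H2", c_gas_ppm=50.0, v_gas_ml=10.0, v_oil_ml=10.0), 1))
```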
Laboratories must work with commercial suppliers to obtain gas and gas-in-oil standards or they must
prepare the standards themselves. Repeatability and accuracy are also of the utmost importance as
small changes, even several ppm in some cases, can mean the difference between diagnosing an active incipient fault condition that requires immediate attention and one that is stable and requires no attention.

On-line DGA Technologies


Online DGA sampling evolved out of the need to obtain samples on a more frequent basis. Analyzing oil
samples from large transformers on a 1-3 year basis may be statistically acceptable and relevant, but
failures can develop very quickly within a transformer, over a matter of months, weeks, or even days. A
transformer could often be saved from failure, catastrophic or otherwise, if DGA oil samples were tested
on a more frequent basis. Accordingly, the introduction of online hydrogen monitors in the late 1980s
and of multi-gas monitors in the early 2000s provided timely enhancements to manual DGA analysis.
Both types provided the ability to deliver analysis results as frequently as hourly.
Single-gas monitoring
Online monitoring systems have evolved from single-gas to multi-gas capabilities. Initially, hydrogen-measuring devices provided necessary alarms, but only in relation to hydrogen and some of the other combustible gases. In one hydrogen monitor design, the gases migrate through a
permeable polymer membrane and into a fuel cell to create an electric current proportional to the gas
concentration in oil. Another variant uses a bundle of tubes to extract hydrogen from the oil and then
measures the hydrogen concentration with a thermal conductivity detector. These devices function like
a smoke alarm: an alarm signals that an oil sample should be drawn and thoroughly tested in the lab. These
H2 sensors provide less than one hundred percent selectivity for hydrogen, and they respond to a lesser
degree to other combustible gases such as acetylene or carbon monoxide. Even so, they do provide a
valuable indication that there is some activity in a transformer that may need to be investigated.
More recently, in 2012, a new generation of hydrogen DGA devices emerged. These utilize an oil-immersed catalytic hydrogen sensor without membrane, fuel cell, or TCD, and they respond selectively to hydrogen only. The relative simplicity and more direct measurement mean they can deliver better
accuracy and reliability than first-generation hydrogen-monitoring devices.
Discrete sensors for carbon monoxide or dissolved water are also available, although they are usually found in combination with another sensor, most often hydrogen. There are a number of dual-gas devices
that combine a pair of sensors to provide additional information. For example, a carbon monoxide
channel alongside a hydrogen channel can help discern whether a developing fault is related to the
transformer insulation or more to internal parts exposed to oil alone, while a hydrogen and water
sensing device relays the presence of excess water, an indicator of insulation ageing.
Multi-gas monitoring
From around the year 2000 to the present, multi-gas online DGA monitors have brought laboratory
analysis capabilities directly to generation and substation locations. Multi-gas DGA systems measure
some or all of the principal DGA fault gases by combining gas-extraction technologies connected directly
to the transformer oil tank with laboratory-quality analytical measurement engines. DGA diagnostic
software provides an indication of the nature of detected fault conditions as well as a relatively data-dense view of gassing levels and trends over time. Figure 3 shows a typical gas concentration-versus-time graphical presentation from an online DGA monitor in which the development
of a serious fault is observed over a very short period.

Figure 3. Monitoring from fault to shutdown of a 3-phase, 1100 MVA, 345 kV GSU transformer. Time from first detection of
faulting to decision to shut down is slightly more than 48 hours.

The capability to obtain this detailed DGA data comes with a secondary requirement: an operator must
observe the data more frequently in order to react in a timely manner. For this to occur there must be
available computing equipment and data channels to bring the data to the decision-maker. Online
monitors make their data available in a number of ways, ranging from simple status and alarm-condition lights on the monitor itself to more sophisticated communication standards such as IEC 61850, DNP3, and Modbus, thus integrating well with existing data channels in many locations.

Relying upon simple status alarms from a DGA monitor or other attached measuring equipment reduces
the burden of observation but also increases the chance that a non-alarming, developing fault condition
may be missed. Multi-disciplinary condition-based monitoring systems combine information from
multiple sensing elements in the transformer and other assets to present a more comprehensive tactical
view that will better assist with the decision-making process.
Gas Extraction
Obtaining DGA data is a two-step process: (1) gas extraction followed by (2) gas measurement. In an
online system these two are combined into a single unit, while in the lab they are most often separated
into discrete functional devices. Two types of gas extraction system are commonly employed for online
monitoring: a headspace type of gas extraction, or a membrane-based extractor.
A headspace design, not shown here, captures a fixed amount of oil and then sparges the oil with a
mixture of outside air and transformer gas in a dynamic headspace sampling arrangement, without a
membrane. This type of system must then remove any added air before recirculating the measured oil
back to the transformer. Care must be taken in the design to prevent the entrainment of oil aerosols
into the gas-side circulation paths that lead to the gas analyzer.
A gas-permeable membrane prevents liquid oil, foam, and aerosols from entering the gas space that is
connected to the gas analyzer while allowing the DGA gases to permeate through. The membrane does
not affect the equilibrium concentrations that the gases attain, but it does influence the rate at which
gases permeate between the oil and gas spaces. Similarly to the laboratory headspace sampling method,
gases partition through the membrane or a bundle of smaller membranes until equilibrium is reached.
A simplified diagram of a GC-based online DGA monitor with membrane extraction is shown in Figure 4.
The oil side of the extractor connects the system directly to the transformer tank in a sealed loop, which
eliminates the variability that can be associated with manual oil sampling. The oil circulation system is
sealed, no fault gases can exit the system, and atmospheric gases cannot enter. A small amount of the
dissolved gases passes from the continuously circulating oil through a gas-permeable membrane for
measurement. The volume of gas removed from the oil is on the order of milliliters, an insignificant
fraction of the total volume of dissolved gas held in the transformer tank; this has no effect on the total
dissolved gas levels.
In either headspace or membrane extraction, the fault gases presented to the analytical measurement
system by the extraction system truly represent the gases dissolved in the oil. However, as is the case
with laboratory analysis, gas levels at the analytical measurement system after extraction are not equal
to their levels in oil. Each gas has a different solubility in oil, and in addition temperature and pressure differences along the sampling train from the extractor to the analytical engine must be accounted for with calibration.
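In other words, the reported in-oil value is obtained by applying a gas-specific, temperature-dependent solubility coefficient to the equilibrium gas-phase reading. A minimal Python sketch of that conversion, with an assumed reference coefficient and an assumed linear temperature adjustment purely for illustration:

```python
def gas_to_oil_ppm(c_gas_ppm, ostwald_k_ref, oil_temp_c,
                   temp_ref_c=25.0, k_temp_slope=0.0):
    """Convert an equilibrium gas-phase reading (ppm v/v) to an in-oil
    concentration (ppm v/v) via an Ostwald-type solubility coefficient.
    The linear temperature adjustment is a placeholder; real monitors use
    calibrated, gas-specific temperature (and pressure) corrections."""
    k = ostwald_k_ref + k_temp_slope * (oil_temp_c - temp_ref_c)
    return k * c_gas_ppm

# Example: hydrogen reads 120 ppm in the gas plenum with the oil at 40 C.
# The coefficient values are assumptions for illustration only.
print(round(gas_to_oil_ppm(120.0, ostwald_k_ref=0.05, oil_temp_c=40.0,
                           k_temp_slope=0.0002), 2))
```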

[Figure 4 schematic: transformer oil is pumped to and from the transformer through the extractor, where dissolved gases cross the membrane into the transformer-gas space; a gas-side pump circulates this gas through the GC sample loop, and carrier gas supplies A and B carry the sampled gas onto the Porapack column (channel A) and the molecular sieves column (channel B) to the detector channels.]

Figure 4. Online DGA GC system basic diagram. Dissolved gases pass through a gas-permeable membrane into a gas plenum
and the GC sample loop. When ready, the loop contents are transferred to the GC columns for separation and measurement.

According to a Cigré report (3), most online DGA monitors are calibrated in the factory under tightly controlled conditions, some to gas-in-oil standards and others to carefully obtained laboratory results. In the Cigré report, a tested population of online DGA monitors was compared to over 200 carefully obtained laboratory DGA results following recommended best practices, all of which fell within a range of about ±10% to ±20% between laboratory and on-line results. These results are consistent with the IEC recommendation (4) of accuracies within ±15% as acceptable performance for making operational decisions based on DGA data, at gassing levels of 5 times the limit of detection or greater.
Thus, the experience of a skilled group of operators under the best conditions has demonstrated that
online DGA monitors are capable of providing sufficiently accurate data for reliable decision making.
However, experiences across the larger population of globally installed monitors may not be congruent.
Some operators have observed larger discrepancies between online and laboratory DGA results and,
given the existing historical laboratory DGA records, tend to question the accuracy of the online
monitor. In light of the available well-controlled studies, when this situation arises it is appropriate to
review the entire DGA process for both lab and on-line analysis and attempt to identify specific root
causes for the data inconsistencies.

Figure 5. Chromatogram of DGA gases. This separation makes use of a porous polymer column to separate CO2 and the C2
hydrocarbons plus a molecular sieves column to separate H2, O2, N2, CO, and CH4. Helium carrier gas, TCD detection.

Gas Measurement
After gas extraction procures a gas sample for analysis, a gas measurement system determines the gas-phase concentrations of all of the fault gases of interest, which are then converted via calibration to gas-in-oil concentrations and reported for further diagnostic work-up. Three types of analytical
measurement systems have been deployed in online DGA monitors: gas chromatography, infrared
spectroscopy (IR), and solid-state sensors. From a historical point of view, GC is the DGA reference
technique stipulated in the ASTM and IEC laboratory standard procedures.
Gas chromatography
In a gas chromatography analysis, the compounds of interest (the DGA fault gases) are presented simultaneously into the beginning of one or more separation columns. Carrier gas flow (argon or helium in most cases) moves each substance through the column or columns at different rates, as a consequence of selective interaction of the column's internal stationary phase with the gases. The compounds emerge from the end of the column in sequence and, ideally, completely separated from each other. They are transferred to one or more detectors that respond in proportion to their concentrations: the detectors' responses are measured and converted to ppm values. Thus in a well-designed GC system, all of the DGA gases can be separated and measured independently of one
another. A typical GC separation of DGA gases is shown in Figure 5.
Gas chromatography systems, by virtue of their complete gas separation, can easily be calibrated and
verified with an in-field gas standard.
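A single-point external-standard calibration captures the essence of that field verification: the calibration gas establishes a response factor per gas, and sample peak areas are scaled by those factors. The Python sketch below uses hypothetical standard concentrations and peak areas; real systems typically add multi-level calibration and gas-in-oil correction on top of this.

```python
# Single-point external-standard calibration sketch (illustrative only).
# Response factor RF = known standard concentration / measured peak area.

def build_response_factors(std_areas, std_ppm):
    """Compute a per-gas response factor from a calibration run."""
    return {gas: std_ppm[gas] / std_areas[gas] for gas in std_ppm}

def quantify(sample_areas, rf):
    """Convert sample peak areas to gas-phase ppm using the factors."""
    return {gas: sample_areas[gas] * rf[gas] for gas in rf}

# Hypothetical calibration-standard composition and peak areas.
std_ppm   = {"H2": 100.0, "CH4": 100.0, "C2H4": 100.0, "C2H2": 100.0}
std_areas = {"H2": 2050.0, "CH4": 8800.0, "C2H4": 9100.0, "C2H2": 8700.0}
rf = build_response_factors(std_areas, std_ppm)

sample_areas = {"H2": 101.0, "CH4": 3890.0, "C2H4": 9975.0, "C2H2": 0.0}
print(quantify(sample_areas, rf))   # gas-phase ppm, before gas-in-oil correction
```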
Infrared spectroscopy
Infrared (IR) spectroscopy is an alternative technique to GC for the measurement of DGA gas
concentrations. Carbon dioxide, carbon monoxide, and the hydrocarbons methane, ethane, ethylene,
and acetylene, as well as propane, propylene and longer chain length molecules, absorb infrared light in
various bands in the mid-IR (MIR) wavelength range of 3 to 12 µm. Unlike GC, IR does not require a
carrier gas to achieve a separation. The gases are instead measured all together in a mixture, that is, all
of the extracted gases are present simultaneously in an IR detector. So, also unlike GC, where the gases
encounter a nonselective detector one at a time, in IR the detector is operated so that its selectivity can
be adjusted during the analysis, ideally to respond to each of the gases individually. This is accomplished
by observing a sequence of restricted wavelength ranges within which the various gases absorb varying
amounts of infrared light.
The light absorption characteristics (the spectra) of the DGA gases are shown in Figure 6. While
carbon dioxide, carbon monoxide, and acetylene have well-defined spectral regions with little
interference from other compounds likely to be present, methane and the other hydrocarbons present
overlapping spectra. This is in contrast to GC where all of the compounds are separated before arriving
at the detector. When these gases are present at the same time, any single optical band in their shared
absorption range contains varying responses from each of the gases. Calibration can establish the
responses at multiple wavelengths for each pure gas, but finding individual gas concentrations from the
responses to a mixture is more complex.
In many cases it is possible to pull the individual gas absorptions apart using high-resolution spectral
techniques such as Fourier-Transform Infrared (FTIR). However, deploying such a sophisticated
instrument at a transformer is neither practical nor cost-effective. Instead, IR DGA monitors rely on
nondispersive infrared (NDIR) measurements, placing a series of selective IR bandpass filters (optical filters that restrict the observed wavelengths) into the light beam.
Multivariate deconvolution techniques are employed to compute individual gas concentrations from
multiple measurements of the overlapping hydrocarbon absorptions. Calibration of an IR system for this
type of analysis requires careful measurement of the spectral responses to each gas individually, and
thus it is not particularly practical to fully recalibrate in the field as this would require presenting the
system with each pure gas at multiple concentrations.
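Conceptually, the deconvolution solves a small linear system: the absorbance measured in each filter band is modeled as a weighted sum of the individual gas contributions established during calibration. A minimal least-squares sketch with made-up absorptivity coefficients (three bands, three hydrocarbons) is shown below; production instruments use more bands, nonlinearity corrections, and temperature and pressure compensation.

```python
import numpy as np

# Rows: filter bands; columns: gases (CH4, C2H6, C2H4).
# Entries are per-gas absorbance per ppm in each band, as would be
# established during factory calibration (values here are made up).
A = np.array([
    [1.2e-4, 0.6e-4, 0.1e-4],   # band where CH4 absorbs most strongly
    [0.5e-4, 1.1e-4, 0.3e-4],   # band where C2H6 dominates
    [0.2e-4, 0.4e-4, 1.3e-4],   # band where C2H4 dominates
])

# Measured absorbances in the three bands for an unknown mixture.
b = np.array([0.0150, 0.0125, 0.0110])

# Solve A @ c = b for the concentration vector c (ppm), clipping
# any small negative solutions that arise from noise.
c, *_ = np.linalg.lstsq(A, b, rcond=None)
c = np.clip(c, 0.0, None)
print(dict(zip(["CH4", "C2H6", "C2H4"], np.round(c, 1))))
```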
During gas concentration measurement, the absorption of infrared light by DGA gases can be measured
either by the attenuation of a light beam in the presence of the DGA gases compared to the intensity of
a 100-percent transmitted beam in their absence, or with a photoacoustic spectroscopic (PAS) detector.

Figure 6. Infrared spectra of DGA gases. Note the overlapping spectra for the hydrocarbons, including propylene, both in the 7 to 12 µm range as well as at shorter wavelengths around 2 to 4 µm.

In PAS, the absorption of light energy causes the DGA gas sample to heat up slightly. When the incoming
IR light is pulsed, the resulting pressure pulsations caused by gas sample heating and cooling are picked
up by ultra-sensitive microphones. The pulsed signal is demodulated by treating the pulse train as a
carrier wave, to yield a measure of the intensity of light absorption. PAS can be sensitive into the sub-ppm range and is relatively stable. It is a direct absorption measurement method in which zero
absorption corresponds to zero signal.
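One common way to perform that demodulation is lock-in detection: the microphone signal is mixed with a reference at the modulation frequency and averaged, which rejects noise outside a narrow band around the carrier. The following numpy sketch illustrates the idea on a synthetic signal; it is a generic illustration, not a description of any particular monitor's signal chain.

```python
import numpy as np

fs, f_mod, duration = 50_000.0, 37.0, 2.0   # sample rate (Hz), chop frequency (Hz), seconds
t = np.arange(0.0, duration, 1.0 / fs)

# Synthetic microphone signal: a pressure oscillation at the chopping
# frequency whose amplitude stands in for the absorbed light, buried in noise.
true_amplitude = 0.002
signal = true_amplitude * np.sin(2 * np.pi * f_mod * t)
signal += 0.05 * np.random.default_rng(0).normal(size=t.size)

# Lock-in demodulation: mix with quadrature references and average.
i = np.mean(signal * np.sin(2 * np.pi * f_mod * t))
q = np.mean(signal * np.cos(2 * np.pi * f_mod * t))
recovered = 2.0 * np.hypot(i, q)

print(f"true amplitude {true_amplitude:.4f}, recovered {recovered:.4f}")
```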
Solid-state sensors
First to be deployed for on-line DGA, solid-state sensors are now found wherever other analytical
measuring technologies lack a response to one or more of the gases of interest. In particular, since
symmetrical molecules such as H2 and O2 are not detected by PAS, corresponding solid-state sensors are
found in PAS-based online monitors. Such devices may be deployed directly into the liquid oil or as gas-space sensors. The choice depends on how the rest of the system is designed. For example, if air
might be admitted into the gas-sensing area of a monitor, then an oxygen sensor should be placed in the
liquid oil and read out before any air is let in. Hydrogen sensors are available for in-oil or in-gas use as
well. Suitable compensation is again required for gas-in-oil solubility, pressure, and temperature. GC-based monitors can measure hydrogen and oxygen directly with a TCD, as part of the chromatogram,
and nitrogen is detected as well, and so solid-state sensors are not usually found in such systems.
Water, while usually not thought of as a gas, is another DGA component of interest. Measurement of
dissolved water in oil is carried out in the laboratory by automated Karl-Fischer titration. In on-line
systems, water is conveniently determined with a capacitive humidity sensor that directly measures the
relative saturation level of water (RH) in the oil. The RH value can be converted to a concentration using an equation that is a function of the oil temperature.
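A common form of that conversion multiplies the measured relative saturation by the oil's temperature-dependent water saturation limit, often modeled with an Arrhenius-type expression. The coefficients in the sketch below are illustrative placeholders; real values are oil-specific and come from the sensor or fluid supplier.

```python
def water_ppm_from_rh(rh_percent, oil_temp_c, a=7.42, b=1670.0):
    """Estimate dissolved water (ppm) from relative saturation (RH, %).
    Saturation limit modeled as Ws(T) = 10**(a - b / T_kelvin); a and b
    are illustrative placeholders, not constants for any specific oil."""
    t_kelvin = oil_temp_c + 273.15
    ws = 10.0 ** (a - b / t_kelvin)      # saturation solubility, ppm
    return (rh_percent / 100.0) * ws

# Example: sensor reads 12% relative saturation with the oil at 45 C.
print(round(water_ppm_from_rh(rh_percent=12.0, oil_temp_c=45.0), 1))
```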
Water does come across a membrane extractor, and there is nothing to stop it from being extracted in a
headspace system. In high concentrations, water can interfere with IR detection, as can be seen from
the broad water absorption spectrum in Figure 6. Likewise, in a GC system the presence of water must
be accounted for, and any water peak must not be allowed to co-elute into the detector along with
any DGA gases of interest.

Conclusion
The critical nature of transformers, and the recognition that they need continuous maintenance and a thorough understanding of multiple potential failure processes, has brought dielectric fluid analysis to the forefront. This has been driven in part by the need to obtain better and faster analyses and a better methodology for defining the health of the asset. We have seen that the laboratory provides an excellent picture of the transformer, but because laboratory samples and analyses cannot realistically be taken more frequently, the emergence of online DGA is helping to obtain a better understanding of developing incipient fault conditions as well as the ongoing health of the transformer.
Multiple technical approaches have been used to obtain online DGA data. Sufficiently sensitive,
accurate, and repeatable results can form a strong basis for interpretation and diagnostics. Current
efforts to develop multi-disciplinary condition-based monitoring systems will bring the science of
transformer diagnostics to new levels, accompanied by better management of these critical energy
system assets.

References

1. Cigré Working Group A2.44, Guide on Transformer Intelligent Condition Monitoring (TICM) Systems, Technical Brochure 630 (Cigré, Paris, 2015).
2. M. Duval, The Duval Pentagon, A New Complementary Tool for the Interpretation of Dissolved Gas Analysis in Transformers, IEEE Electrical Insulation Magazine, 30 (6), 9-12 (2014).
3. Cigré Working Group D1.01 (TF15), Report on Gas Monitors for Oil-Filled Electrical Equipment, Technical Brochure 409 (Cigré, Paris, 2010).
4. IEC 60599, Edition 3.0 (2015).

