
Proceedings of the 2012 9th International Pipeline Conference IPC2012 September 24-28, 2012, Calgary, Alberta, Canada

IPC2012-90372

CEPA DEVELOPMENT OF A METAL LOSS ILI ACCEPTANCE PROCEDURE


Guy Desjardins, Desjardins Integrity Ltd., Calgary, Alberta, Canada
Reena Sahney, Jiva Group Inc., Calgary, Alberta, Canada
Ryan Sporns, Enbridge Pipelines Inc., Edmonton, Alberta, Canada
Joe Yip, Desjardins Integrity Ltd., Calgary, Alberta, Canada

ABSTRACT
This paper presents guidelines to enable a pipeline operator to assess whether the results of a metal loss in-line inspection (ILI) should be deemed acceptable. Acceptance depends on passing two checks, Verification and Validation. Verification is the process which ensures that all planning, preparation, execution, and analysis of the inspection were conducted according to ILI vendor processes to meet specification and prescribed industry standards. It assumes that a successful run is a consequence of good planning, preparation, execution, and analysis. Validation is the process by which the results of an ILI run are compared to independent measurements. The Validation process depends on the results of the ILI and the available information. Passing both the Verification and Validation checks is required to accept the ILI run.

INTRODUCTION
The Canadian Energy Pipeline Association (CEPA) represents Canada's transmission pipeline companies, which operate more than 100,000 kilometres of pipeline in Canada and the United States. CEPA member companies move approximately 1.2 billion barrels of liquid petroleum products and 5.5 trillion cubic feet of natural gas each year. Its members transport 97 per cent of Canada's daily crude oil and natural gas from producing regions to markets throughout North America. The CEPA Pipeline Integrity Working Group identified an opportunity to develop further guidance on the acceptance of metal loss ILI data. The purpose of an ILI run is to address one or more threats to the integrity of a specific pipeline. Acceptance of a run means that the ILI vendor and operator believe that the ILI run can be used to adequately assess the threat(s).
If the inspection is rejected, then the threats to the pipeline (or portions of it) are not adequately addressed by the inspection. In some cases, rejection of an inspection may require re-running the inspection; in other cases, the threat can be addressed by other integrity management actions. In practice, inspection results usually enable the operator to assess the threat on most of a pipeline, but there are potentially localized areas where the data has been compromised in some way, and the uncertainty associated with the inspection data may require an alternative approach to managing the threat. The current practice in the industry for accepting an inspection is to conduct a validation procedure whereby the depths of metal-loss anomalies reported by the inspection are compared to non-destructive testing (NDT) assessments of metal-loss depth as measured in an excavation of the pipeline. However, in situations where no actionable anomalies are reported (i.e., no reported anomalies threaten the integrity of the pipeline), the value of excavating shallow anomalies to validate the ILI run is limited. At best, excavations provide a limited number of

Copyright 2012 by ASME

comparisons at a few isolated locations along the pipeline. The expectation that an ILI run can only be accepted through excavations of features does not properly recognize the effort involved in the design, construction, testing, validation, and quality assurance and control processes completed by the ILI vendor before an ILI tool is allowed into production. In addition, numerous pre-planning, post-analysis, and data quality assurance and control checks are done to ensure that the quality of the data will meet the stated specification and match the quality of previous testing and inspections. These steps reduce the value of excavating a small sample of non-injurious features in order to accept an ILI and increase the importance of the key quality parameters measured throughout the inspection and analysis. This paper proposes guidelines to promote consistent acceptance of metal loss ILI results and demonstrates that shifting expectations away from a purely statistical analysis to a more mechanistic verification of the technology, planning, preparation, execution, and analysis is a more appropriate path towards acceptance. These guidelines are worded to allow flexibility in how an owner and ILI vendor would apply them to metal loss ILI acceptance and are consistent with existing industry standards [1, 2].

NOMENCLATURE
Actionable Anomaly: an unexamined deviation from sound pipe material or weld which poses a risk to the pipeline.
Validation: the check that the data collected in the inspection is accurate, made by comparing the results to independent sources. (Adapted from ASME)
Verification: the check to ensure that all procedures, as defined by industry standards and recommended practices, have been followed faithfully in the planning, execution, and analysis of the ILI data.
(Adapted from ASME)

OVERALL PROCESS
The purpose of this project was to define guidelines for a more comprehensive acceptance of a metal loss ILI run, in particular when there are no actionable anomalies reported by the inspection. The process requires the definition of a holistic and comprehensive approach following the delivery of the final report. The acceptance of an ILI run depends on passing two checks, Verification and Validation. Figure 1 shows the overall procedure. The Process Verification procedure consists of an eleven-point checklist to ensure that run planning and

execution were conducted according to established standards and to identify conditions which may affect the quality of the data. The Data Validation stage compares the inspection results to a reference dataset to check that the inspection results are consistent with the specified tool performance. The Validation process varies depending on the inspection results and the available reference datasets.

Figure 1 Overall process for accepting a run includes Process Verification and Data Validation.

PROCESS VERIFICATION
Introduction
Process Verification is a systematic and consistent approach to ensure that all proper procedures were undertaken by the operator and ILI vendor prior to, during, and after the inspection. The fundamental premise of the methodology is that high-quality ILI data is a consequence of technology, planning, and execution. Process Verification consists of an eleven-point checklist regarding various aspects of the tool plus a Cumulative Assessment. The checks are:
1. Tool Selection,
2. Historical Performance of the Inspection System,
3. Planning,
4. Pre-run Function Check,
5. Pre-run Mechanical Check,
6. Procedure Execution (e.g., pigging procedure, tool speed, etc.),
7. Post-run Mechanical Check,
8. Post-run Function Check,
9. Field Data Check,
10. Data Analysis Process Quality Check, and
11. Cumulative Assessment.
For each of the first 10 parameters, the operator would follow the flowchart shown in Figure 2.


Guidance on each decision node is provided below. Each parameter is given a score of Pass, Conditional pass, or Fail. All checks must receive a Pass or Conditional pass for the inspection to be accepted. Once these ten parameters have been assessed, a cumulative assessment of all parameters (and potential deficiencies) is reviewed cohesively to ensure that the results are still deemed tolerable. The Cumulative Assessment is the final score for the Process Verification phase. The possible scores are Pass, Fail, and Ambiguous. If a score of Ambiguous is assigned, further analysis is required to manage the uncertainty in the ILI data. If the operator can be satisfied that all aspects of the inspection were conducted properly, then the operator can be assured that the ILI run is highly likely to meet its performance specification.

Figure 2 shows the flowchart used to score each of the first ten parameters of the checklist. Box A of the flowchart is simple to evaluate: if the inspection passes the guidelines, then the score for that parameter on the checklist is Pass. However, if there are any questions concerning the score, and it is not obvious that the inspection met the guidelines, then the operator proceeds to Box B. In Box B, if the operator determines that, even though the inspection did not outright pass the guidelines, the impact on the data is not significant, then the operator assigns the parameter a Conditional pass on the checklist. The operator must ensure that the uncertainty caused by the parameter is properly managed. If the effect on the data is significant, then the procedure moves to Box C. In Box C, the operator considers whether the problem is localized. If the issue is widespread, the operator proceeds to Box D; if the issue is localized to a limited number of short segments, the procedure moves to Box E. In Box D, if a re-run can fix the problem, then the operator assigns the parameter a Fail, which leads to rejecting the inspection. However, if a re-run will not remedy the situation, then the issue is not an ILI problem (Box F) and engineering judgment is required to determine how to proceed. In Box E, alternative methods to address localized issues are considered. If the operator decides that the issue can best be addressed by other means, then a Conditional pass or Pass may be assigned.
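The decision logic described above can be sketched in code. The function below is our illustration of the Figure 2 flowchart, not part of the CEPA guidelines; the parameter names are hypothetical, and it assumes the operator can answer each decision node as a yes/no question.

```python
def score_parameter(meets_guidelines: bool,
                    impact_significant: bool = False,
                    issue_localized: bool = False,
                    rerun_would_fix: bool = False,
                    managed_by_other_means: bool = False) -> str:
    """Score one Process Verification parameter per the Figure 2 flowchart.

    Returns 'Pass', 'Conditional pass', 'Fail', or 'Not an ILI problem'
    (the last requiring engineering judgment, per Box F).
    """
    # Box A: did the inspection clearly meet the guidelines?
    if meets_guidelines:
        return "Pass"
    # Box B: guidelines not clearly met -- is the impact on the data significant?
    if not impact_significant:
        return "Conditional pass"  # the residual uncertainty must still be managed
    # Box C: is the problem confined to a limited number of short segments?
    if issue_localized:
        # Box E: localized issues may be addressed by other integrity actions
        # (the paper allows either a Conditional pass or a Pass here)
        return "Conditional pass" if managed_by_other_means else "Fail"
    # Box D: widespread problem -- would a re-run fix it?
    if rerun_would_fix:
        return "Fail"  # reject the inspection
    # Box F: not an ILI problem; engineering judgment required
    return "Not an ILI problem"
```

For example, a widespread, significant issue that a re-run would fix scores a Fail, while the same issue confined to a few short segments that the operator can manage by other means scores a Conditional pass.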

Figure 2 Flowchart used to determine the score for the first 10 parameters from the Verification checklist.

Checklist Guidance
The first ten parameters of the Process Verification checklist can be divided into the categories of planning and preparation, pre-run activities, execution, post-run activities, and analysis. The planning and preparation checks include the Tool Selection (1), Inspection System (2), and Pre-run Planning (3) checks. The pre-run activity checks include the Pre-run Function (4) and Pre-run Mechanical (5) checks. These two pre-run checks ensure that the inspection tool is in working condition prior to the inspection. These items may seem to be moot if the inspection tool passes its post-run checks after the completion of the inspection; however, the documentation of these checks is required by API 1163 and is indicative of the ILI vendor's diligence in following established Standards and Recommended Practices. The Procedure Execution check (6) verifies that proper procedures were followed during the running of the tool. The post-run checks include the Post-run Mechanical (7), Post-run Function (8), and Post-run Field Data (9) checks. These three post-run parameters check the group of activities that an ILI vendor


carries out to ensure the functional and mechanical integrity of the tool and the integrity of the data upon receipt at the end of the run. The Data Analysis Process check (10) verifies that the data was properly handled and analyzed by the vendor in the production of the final report.

Tool Selection Check
Given the wide array of tools currently available from a number of different ILI vendors, the Tool Selection check ensures that an appropriate tool has been selected in light of the expected defect type(s) on the pipeline. The purpose of this check is to ensure that the inspection tool is capable of assessing the specific threat on the pipeline and to identify the tool's strengths and limitations based on the vendor specification. Its inclusion ensures that an inspection conducted to address one threat is not also used to address threats to which the tool is not well suited. A person not previously involved in the running of the inspection should verify that the tool is adequate for the job and that the threat management expectations are well understood between the owner and the ILI vendor. A Fail score is given if the dimensions of a potentially injurious defect are beyond the performance specification of the selected tool. A Conditional pass is assigned if the tool is capable of detecting the anomaly type(s) but has limited sizing or detection abilities for the expected anomaly type(s). A Pass score is assigned when the best available technology for detecting and sizing the expected anomaly type(s) is used.

Inspection System
The Inspection System check ensures that the inspection tool used in the inspection has a history of successful runs and that the inspection system is likely to perform successfully.
Whereas the emphasis of the Tool Selection check is to ensure that the technology is capable of detecting and sizing the anomalies, the motivation of this check is to ensure that the inspection system is able to deliver quality data, as demonstrated by its history of successful runs; it may include a review of the original acceptance testing done by the ILI vendor. A Fail is assigned if the tool used is experimental and there is no established history, or if the tool has been demonstrated to have deficiencies in addressing the threat. A Conditional pass is assigned if the operator has a history of successful runs assessing the threat with the same model of tool with minor differences. Alternatively, a Conditional pass is assigned if the specific model of tool has a history of successful runs assessing the threat for other operators, but the results of those runs are not available. A Pass is assigned if the operator has firsthand

knowledge of the performance capabilities of the tool and has several successful inspections using the tool.

Pre-run Planning
The Pre-run Planning check covers the group of activities prior to executing an in-line inspection. Operators are referred to NACE RP0102-2002 (Sections 4, 5 & 6) for details of the types of activities that are typically undertaken as part of pre-run planning. As part of the planning procedure, the ILI vendor and operator should work together to ensure a successful run. Planning includes, but is not limited to, completion of a pre-run questionnaire, pipeline cleaning, pipeline geometry assessment, development of an inspection procedure, inspection scheduling, and logistics, as well as ensuring appropriate product type, flows, and pressures. A Fail is given when key elements of the pipeline ILI compatibility assessment and inspection scheduling are not conducted. A Conditional pass is assigned if the majority of the elements of the pipeline ILI compatibility assessment and inspection scheduling were completed but undocumented. A Pass is given when all elements of the pipeline ILI compatibility assessment and inspection scheduling were completed and documented.

Pre-run Function Check
The Pre-run Function check verifies that the ILI vendor conducted the group of activities to ensure the functional integrity of the tool prior to loading the inspection tool into the launcher barrel. As such, function checks are expected to be specific to each vendor and technology used. The function checklist should be provided by the ILI vendor, and its items should be standardized and identified in advance of the inspection. These include, but are not limited to, appropriate initialization of all components, the adequacy and availability of the power supply, confirming sensors are operational, and confirming adequacy and availability of data storage. A Fail is given if a significant pre-run check did not pass.
A Conditional pass is given if significant pre-run checks passed but the checks are undocumented. A Pass is given if all pre-run checks are passed and documented.

Pre-run Mechanical Check
The Pre-run Mechanical check verifies that the ILI vendor conducted the group of activities to ensure the mechanical integrity of the tool prior to loading the inspection tool into the launcher barrel. As such, pre-run mechanical checks are expected to be largely visual and specific to each vendor and technology used. The pre-run mechanical checklist should be provided by the ILI vendor, standardized, and identified in advance of the inspection. These include,


but are not limited to, a general visual inspection, confirming good pressure seals around electronic components, ensuring adequate integrity of cups, barrel tests, and ensuring all wheels are intact and moving appropriately. A Fail is given if a significant pre-run check did not pass. A Conditional pass is given if significant pre-run checks passed but the checks are undocumented. A Pass is given if all pre-run checks are passed and documented.

Procedure Execution
The Procedure Execution check verifies that the ILI vendor and operator conducted the group of activities needed to execute a successful in-line inspection. These include, but are not limited to:
- checking that the tool run was executed as per the planned pigging procedure;
- checking that the line-condition parameters (fluid composition, flow rate, temperature, and pressure) were in accordance with the planned procedure;
- checking that the line conditions for tool launch were as expected and the launch proceeded as planned;
- checking that the line conditions for tool receipt were as expected and the receipt proceeded as planned;
- checking that the tool speed was within the planned range for the length of the run (if deviations did occur, were they planned or expected and assessed in advance?); and
- checking that the tracking of the tool was according to plan.
This check is designed to ensure that the actual inspection was conducted in such a way as to ensure high-quality inspection data. The documentation of this check is indicative of the ILI vendor's and operator's diligence in following established Standards and Recommended Practices. A Fail is given if the inspection was not carried out as per the inspection procedure, with the potential of material impact to data quality. A Conditional pass is given if the inspection was not carried out as per the inspection procedure but the deviations are not material to data quality. A Pass is given if the inspection was carried out as per the inspection procedure.
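The tool-speed element of the Procedure Execution check lends itself to a simple automated scan. The sketch below is our illustration only: the (chainage, speed) log format and the threshold values are assumptions, not a vendor's actual data format.

```python
def check_tool_speed(speed_log, planned_min, planned_max):
    """Flag speed-log samples that fall outside the planned speed range.

    speed_log: list of (chainage_km, speed_m_per_s) tuples -- a hypothetical
    record shape; real ILI vendors supply their own log formats.
    Returns the list of out-of-range samples for review.
    """
    return [(km, v) for km, v in speed_log
            if not planned_min <= v <= planned_max]

# Hypothetical 100 km run with a short over-speed at launch (km 0.00):
log = [(0.00, 4.8), (0.01, 3.9), (50.0, 2.5), (100.0, 2.4)]
excursions = check_tool_speed(log, planned_min=1.0, planned_max=4.0)
# excursions contains the single launch sample at km 0.00
```

Any flagged excursion would then feed the Figure 2 decision logic: was the deviation planned or expected, and is its impact on data quality significant?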
Post-run Mechanical Check
The Post-run Mechanical check verifies that the ILI vendor conducted the group of activities to ensure the mechanical integrity of the tool upon receipt at the end of the run. As such, post-run mechanical checks are expected to be largely visual and specific to each vendor and technology used. The checklist

should be provided by the ILI vendor, standardized, and identified in advance of the inspection. These include, but are not limited to, assessing the general state of the tool, pressure seals around electronic components, integrity of cups, tool cleanliness, and tool wear, as well as ensuring all parts are intact and moving appropriately. A Fail is given if significant tool wear, damage, or debris occurred with material impact to the data. A Conditional pass is given if tool wear, damage, or debris is observed but with no material impact to the data. Alternatively, a Conditional pass is given if the checks were not documented but the mechanical integrity of the tool can be verified by other means. A Pass is given if all mechanical checks passed and were documented.

Post-run Function Check
The Post-run Function check verifies that the ILI vendor conducted the group of activities to ensure the functional integrity of the tool upon receipt at the end of the run. As such, post-run function checks are expected to be specific to each vendor and technology used. The checklist should be provided by the ILI vendor, standardized, and identified in advance of the inspection. These checks include, but are not limited to, appropriate operation of all components, the adequacy and availability of the power supply, confirming sensors are operational, and confirming adequacy and availability of data storage. A Fail is given if a significant post-run function check did not pass. A Conditional pass is given if the checks were not documented but the proper functioning of the tool through the length of the run can be verified by other means. A Pass is given if all function checks passed and were documented.

Post-run Field Data Check
The Post-run Field Data check verifies that the ILI vendor checked the integrity of the data collected upon receipt at the end of the run. These checks are expected to be specific to each vendor and technology used.
The checklist should be provided by the ILI vendor, standardized, and identified in advance of the inspection. These checks include, but are not limited to, the amount of data collected and the circumferential and linear continuity of the data. The purpose of this item is to ensure that the inspection tool collected data for the full length of the line. The documentation of this check is indicative of the ILI vendor's diligence in following established Standards and Recommended Practices. A Fail is given if the tool was unable to meet stated specifications due to a significant lack of data integrity. A Conditional pass is given if the tool was unable to meet stated specifications but the situation is manageable through


further analysis. A Pass is given if the tool was able to meet stated specifications for the entire length of the run.

Data Analysis Process Check
The data analysis should be discussed and decided jointly by the operator and ILI vendor. The operator and ILI vendor should agree on items such as the sizing algorithms to use, the amount of manual intervention, the filtering of reported anomalies, clustering rules, the burst pressure procedure, etc. In addition, the operator should also discuss the data analysts' qualifications (Level 1, 2, or 3). The Data Analysis checklist should be based on the analysis procedure provided by the ILI vendor, standardized, and identified in advance of the analysis. These checks include, but are not limited to, the amount of data collected, continuity of data, appropriate sensor response(s), sizing algorithms, manual checking, clustering rules, the burst pressure procedure, and execution of data analysis procedures, as well as the use of appropriate input parameters (such as pipeline diameter, wall thickness, grade, etc.). The Data Analysis Process check ensures that the raw data from the inspection has been properly analyzed by the ILI vendor and that the final report will satisfy the requirements of the inspection. A Fail is given for any situation which cannot be corrected by reanalysis of the data. Such situations may indicate a problem in the selection, preparation, or running of the tool. A Conditional pass is given if the vendor has not supplied the completed checklist or if any deficiency is noted but corrected by a reanalysis of the data. If, following the reanalysis of the data, the number, distribution, or severity of the anomalies is not consistent with expectations, then an independent review (audit) of the analysis procedures may be required.
A Pass is given if the vendor's checklist is completed as previously agreed and the number, distribution, and severity of anomalies are consistent with the expectations of the operator, considering the age, coating, previous inspections, and history of the pipeline.

Cumulative Assessment
This final step of the Verification procedure provides a means of assessing all of the parameters taken as a whole to determine whether the tool performance was acceptable. That is, it is a mechanism to ensure that (potentially) sub-optimal performance across all parameters does not result in an unacceptable run being accepted even if all parameters taken individually are deemed acceptable (i.e., Conditional passes). Relevant considerations include, but are not limited to, the following:

- Can data gaps be mitigated effectively using alternate methods?
- Can any data gaps be addressed through re-running the tool, or are line conditions such that similar challenges will remain?
- Are any Conditional scores cumulative in nature?
- Do Conditional scores of different parameters affect the same locations?

Scoring of this final parameter is in fact the scoring of the entire Process Verification. The final score may be Fail, Pass, or Ambiguous, depending on whether the Conditional passes of the previous checks are cumulative or not. Degradation is cumulative if one issue magnifies pre-existing data degradation (such as a tool over-speed in a location where the tool performance is already compromised by debris-related sensor lift-off). Conversely, issues are not cumulative if their impacts on data quality are largely independent (such as a run where the tool over-speeds at launch and experiences debris issues at receipt). If two or more Conditional passes of the previous checks affect the same length of pipe, and the degradation is cumulative and materially impacts the results of the ILI, a Fail would be assigned. If the operator can clearly confirm that the impact of any Conditional scores is not cumulative and is manageable, a Pass may be assigned. However, should the results be ambiguous due to the specifics of the situation, an Ambiguous score is assigned and further analysis is required to manage the threat on the pipeline.

Process Verification Example
To illustrate the Process Verification stage, this section presents an example with the following details:
- 100 km NPS 36 run,
- 10 metre over-speed at launch,
- intermittent loss of one sensor (km 1.9 to km 2.0),
- no record of pre-run mechanical and function tests, and
- no actionable anomalies; no previous in-line inspection.
The completed scorecard is:
#   Parameter                     Score  Comment
1   Tool Selection                P
2   Inspection System             P
3   Planning and Preparation      P
4   Pre-run Function Check        C      No record; verbal confirmation obtained; post-run function check passed
5   Pre-run Mechanical Check      C      No record; verbal confirmation obtained; post-run mechanical check passed
6   Procedure Execution           C      Short over-speed (due to line conditions) at launch deemed tolerable since pipe is in station yard and above ground
7   Post-run Function Check       P
8   Post-run Mechanical Check     P
9   Post-run Field Data Check     P
10  Data Analysis Process Check   C      100 m of data missing (single sensor); deepest anomaly non-injurious based on revised tool performance specification
11  Cumulative Assessment         Pass   All Conditional scores mitigated; no material cumulative impacts identified

In this example, Conditional scores were given for the Pre-run Function check, the Pre-run Mechanical check, the Procedure Execution check, and the Data Analysis Process check. Although the pre-run mechanical and function checks were undocumented, the post-run checks were documented and the tool passed both. Although the pre-run checks should have been documented, the operator accepted that the condition of the tool was good prior to the run based on the condition of the tool after the run. The operator should issue a letter to the ILI vendor instructing it to supply documentation of the pre-run checks on all future runs. During the running of the tool, a short speed excursion occurred at the start of the run. The excursion was believed to affect the sizing capabilities of the tool but to have minimal effect on its detection capabilities. Since no metal-loss anomalies were detected in the area of the speed excursion, the effect on the data was deemed not significant. In the data analysis process, it was discovered that one sensor was lost for 100 metres. The loss of one sensor was believed to affect the detection and sizing capabilities of the tool only for metal-loss anomalies along that sensor's track. Only sparse, shallow anomalies were in the vicinity of the lost sensor, so the effect on the data was deemed not significant. The final Cumulative Assessment examined the Conditional scores. Since the area of the speed excursion and the sensor loss were along different parts of the pipeline, all aspects of the run were considered acceptable and none of the issues were cumulative; the final Process Verification score was Pass and the ILI run was accepted.

DATA VALIDATION

Figure 3 Flowchart for the Data Validation stage of accepting an ILI run.

Data Validation is the process that compares the data collected and reported by the ILI tool to an independent reference dataset to ensure the ILI tool meets its performance specification. Depending on the available data and the results of the inspection, the Validation procedure may consist of different comparisons. The Data Validation stage can include one, two, or all three of the following components:
1. Comparison of ILI results to known pipeline features,
2. Comparison of ILI metal-loss anomalies to excavation results, and
3. Comparison of ILI metal-loss anomalies to a previous inspection.

Figure 3 shows the Validation flowchart. Comparison of the ILI data to known non-injurious pipeline features (such welds, tees etc.,) is always required. If there are actionable anomalies, then they must be compared to the excavation results. Finally if there is a previous inspection of the pipeline, then the results of that inspection must be compared to the current inspection. Comparison to Known Pipeline Features The simplest Validation process is the comparison of non-injurious features on the pipeline to the ILI report. The features to be compared should include but are not limited to Girth welds, Wall thickness changes, Tees, Valves, Above ground markers, The Validation process using known pipeline features is listed below. The procedure is meant as a guideline rather than a rigorous set of instructions. Deviations from the procedure may be required in some circumstances. 1. Obtain the most recent and complete reference list of pipeline features: the listing may be the as-built, a metal-loss inspection report, a non-metal-loss inspection report, or another reliable source. 2. Match the girth weld locations from the reference list to the reported ILI girth welds. 3. Compile a list of the matched girth welds. 4. Compile a list of the girth welds reported in the reference list but not reported by the ILI. 5. Compile a list of the girth welds reported by the ILI but not included in the reference list. 6. Calculate the percentage of matched girth welds:
Number of matched girth welds 100 Total number of girth welds in reference listing

successfully identify and report all girth welds and all wall thickness changes, tees, valves, above ground markers. Any deviations need to be reconciled and documented. In addition all unmatched features must be reconciled. The reported location of all features must meet location-accuracy specifications to enable the excavation of any reported feature. If the ILI report initially fails to meet the above requirements, the operator must investigate the cause. To meet the acceptance criteria, the operator may choose to ignore the part of the inspection in the stations at launch and receive where there are many short pipe joints. Also, features on segments which have been replaced or rerouted can be ignored. Any unexplained features reported by the ILI should be investigated to determine their cause. Comparison to Previous Inspection The comparison with a previous ILI is likely the most comprehensive method for validating the results of an ILI inspection. Unlike pipeline excavations, a previous ILI enables the operator to systematically compare all anomalies of the current inspection to the previous reference inspection. The authors recognize that significant corrosion growth since the previous inspection may be an issue when attempting to validate the current inspection. However, the intention of this work is to allow the comparison with a previous ILI to enhance the Validation process. Any pipeline with sufficient corrosion growth to cause an issue with the validation and is likely to require extensive excavations to address integrity concerns. The Validation would then be conducted using the excavation results. Validation using a previous ILI consists of two parts: Detection and Accuracy. By using a previous inspection, the operator can confirm both the detection capabilities and the accuracy of the current inspection. The Validation procedure using previous ILI data is listed below. The procedure is meant as a guideline rather than a rigorous set of instructions. 
Deviations from the procedure may be required in some circumstances. Furthermore, a lengthy interval between ILI inspections or the use of very different technologies can make matching difficult if not impossible. If there is insufficient similarity between the inspections to make adequate matches, then the current inspection cannot be validated by the comparison. However, it would not necessarily lead to the rejection of the ILI run, since the cause of the discrepancy may be the previous ILI run. The ILI Validation criterion is based on the assumptions and calculations in the Appendix. The Validation parameters for a typical run are summarized in Table 1.

7. Identify all pipeline features from the reference list and match them to the corresponding ILI reported feature.
8. Identify all ILI features not listed in the reference list.
9. Thoroughly investigate the cause of all unmatched features.

To satisfy this Validation procedure, the ILI report must successfully meet detection standards and location-accuracy specifications.

Copyright 2012 by ASME

Table 1 – ILI Validation parameters for comparison with previous inspection

Tolerance, T, to be Validated in the current ILI: 10% if the specified accuracy of the current ILI is 10% NWT, 80% of the time.
Specified confidence level (or certainty) of the current ILI: 80% if the specified accuracy of the current ILI is 10% NWT, 80% of the time.
Specified tolerance of the previous ILI inspection: 10% if the accuracy of the previous ILI is 10% NWT, 80% of the time.
Specified confidence level (or certainty) of the previous ILI: 80% if the accuracy of the previous ILI is 10% NWT, 80% of the time.
Upper bound of acceptance for the ILI tolerance, Tupper: Desjardins recommends that this value be 1.1 x T; thus in most cases Tupper = 11%.
Confidence level for the tolerance Validation: 1 − α is commonly set at 95%, which makes α = 0.05.
Minimum matched sample size: based on the above parameters and the calculations in the Appendix, the minimum sample size is 513 matches.

Differences in the reporting of these small, shallow anomalies are not significant to the performance of the tool. However, the operator should thoroughly investigate differences between the inspections for any anomaly with a depth greater than 40% NWT. Any failure of the current inspection to detect a bona fide metal-loss anomaly with a depth of 40% NWT or greater may invalidate the run. The remedy for such a situation depends on the cause of the missed anomaly. In many cases, the deficiency can be corrected by a reanalysis of the data. If the anomaly was not detected because of a deficiency in the raw data, then a rerun should be considered.

Comparison to Excavation Results

Comparison with excavation data has been the standard method for Validating the results of an in-line inspection. Unlike the comparison with a previous ILI, excavation data provides only a limited number of ILI anomalies for comparison. However, in-the-ditch measurements can be much more accurate than a previous ILI. The ILI Validation procedure using excavation data is listed below. The procedure is meant as a guideline rather than a rigorous set of instructions. Deviations from the procedure may be required in some circumstances. The ILI Validation criterion is based on the assumptions and calculations in the Appendix. The Validation parameters for a typical run are summarized in Table 2.
Table 2 – ILI Validation parameters for comparison with excavation data

Tolerance, T, to be Validated in the current ILI: 10% if the specified accuracy of the current ILI is 10% NWT, 80% of the time.
Specified confidence level (or certainty) of the current ILI: 80% if the specified accuracy of the current ILI is 10% NWT, 80% of the time.
Specified tolerance of the in-the-ditch measurements: if the device used in the field is highly accurate, then the field measurements can be assumed to have no error.
Specified confidence level (or certainty) of the in-the-ditch measurements: assumed to be 95%.
Upper bound of acceptance for the ILI tolerance, Tupper: Desjardins recommends that this value be 1.1 x T; thus in most cases Tupper = 11%.
Confidence level for the tolerance Validation: 1 − α is commonly set at 95%, which makes α = 0.05.
Minimum comparison sample size: based on the above parameters and the calculations in the Appendix, the minimum sample size is 134 comparisons.
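The derived values in Tables 1 and 2 all follow from the normal-distribution relationship between an 80%-certainty tolerance and the underlying standard deviation. A minimal sketch in plain Python (variable names are illustrative, not from the paper):

```python
import math

# 80% of a normal distribution lies within +/-1.28 standard deviations,
# so a tolerance T at 80% certainty implies sigma = T / 1.28.
Z_80 = 1.28

def sigma_from_tolerance(tol_pct):
    """Standard deviation (%NWT) implied by an 80%-certainty tolerance."""
    return tol_pct / Z_80

T = 10.0                                  # specified tolerance, %NWT
T_upper = 1.1 * T                         # upper bound of acceptance, 11 %NWT
sigma_ili = sigma_from_tolerance(T)       # ~7.8 %NWT

# Case I (excavation data; field error assumed negligible):
s_max_field = sigma_from_tolerance(T_upper)                 # ~8.6 %NWT

# Case II (previous ILI with the same 10%/80% spec): independent
# errors add in quadrature.
s_max_prev = math.sqrt(sigma_from_tolerance(T_upper) ** 2
                       + sigma_ili ** 2)                    # ~11.6 %NWT
```

These reproduce the 8.6% and 11.6% acceptance limits quoted in the Validation criteria later in this section.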

For each matched pair of anomalies, calculate the apparent difference in depth:

    d_i = x_i - y_i

where x_i is the depth of the i-th anomaly in the current inspection; y_i is the depth of the i-th anomaly in the previous reference inspection; and d_i is the difference in depth of the i-th anomaly. Calculate the mean, \bar{d}, and standard deviation, s, of the differences in depth:

    \bar{d} = \frac{1}{n} \sum_{i=1}^{n} d_i

and

    s = \sqrt{\frac{1}{n-1} \sum_{i=1}^{n} (d_i - \bar{d})^2}

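The statistics above, and the Validation check that follows, can be sketched in a few lines of Python (the function names and depth lists are illustrative; the 513-match and 11.6% limits come from the criterion stated in this section):

```python
import math

def depth_difference_stats(current_depths, previous_depths):
    """Mean and sample standard deviation (%NWT) of the depth
    differences d_i for anomalies matched between two inspections."""
    n = len(current_depths)
    d = [c - p for c, p in zip(current_depths, previous_depths)]
    d_bar = sum(d) / n
    s = math.sqrt(sum((di - d_bar) ** 2 for di in d) / (n - 1))
    return n, d_bar, s

def validate_against_previous_ili(current_depths, previous_depths,
                                  min_matches=513, max_s=11.6):
    """Apply the previous-ILI Validation criterion from the text."""
    n, _, s = depth_difference_stats(current_depths, previous_depths)
    return n >= min_matches and s < max_s
```

With only a handful of matches, the sample-size requirement fails even if the depth agreement is good.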
The ILI Validation criterion is based on the assumptions and calculations in the Appendix. The ILI depth accuracy is Validated if:

1. The number of available matches is 513 or more; and
2. The calculated value, s, of the standard deviation of the difference in depth is less than 11.6%.

Due to differences in resolution and reporting criteria between the two inspections, the number of reported anomalies can differ greatly. In most cases, the difference in reported anomalies between the two inspections is due to small, shallow anomalies with depths less than 15% of nominal wall thickness (NWT).

For each comparison, calculate the apparent difference in depth:

    d_i = f_i - x_i

where x_i is the depth of the i-th anomaly in the current inspection; f_i is the corresponding in-the-ditch depth of the i-th anomaly; and d_i is the difference between the in-the-ditch measurement and the ILI-reported depth of the i-th anomaly. Calculate the mean, \bar{d}, and standard deviation, s, of the differences in depth:

    \bar{d} = \frac{1}{n} \sum_{i=1}^{n} d_i

and

    s = \sqrt{\frac{1}{n-1} \sum_{i=1}^{n} (d_i - \bar{d})^2}

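The excavation-based check works the same way, with field depths as the reference and the tighter limits (134 comparisons, s below 8.6% NWT) given in this section. A sketch, assuming the field measurements are treated as error-free (function name is illustrative):

```python
import math

def validate_against_excavations(ili_depths, field_depths,
                                 min_comparisons=134, max_s=8.6):
    """Excavation-based Validation criterion: at least 134 comparisons
    and a standard deviation of depth differences below 8.6 %NWT.
    Depths are in %NWT; d_i = field depth minus ILI-reported depth."""
    n = len(ili_depths)
    d = [f - x for x, f in zip(ili_depths, field_depths)]
    d_bar = sum(d) / n
    s = math.sqrt(sum((di - d_bar) ** 2 for di in d) / (n - 1))
    return n >= min_comparisons and s < max_s
```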
With the further development of industry standards such as API 1163, consistent Verification checklists may be developed across the industry and used by all owners and vendors.

SUMMARY

A novel set of guidelines for the acceptance of in-line inspection data has been presented. This approach shifts the emphasis of ILI acceptance from sole reliance on field validation to a two-stage procedure consisting of Process Verification and Data Validation. Process Verification assumes that a successful in-line inspection is the product of good planning, preparation, execution, and analysis performed in accordance with well-vetted, industry-recognized procedures. Further refinement of the process presented herein can be accomplished by standardizing ILI reporting procedures and incorporating ILI vendor feedback.

ACKNOWLEDGMENTS

The authors would like to acknowledge the efforts and contributions of the larger CEPA working group associated with this project.

REFERENCES

1. NACE RP0102-2002: Standard Recommended Practice, In-Line Inspection of Pipelines.
2. NACE International Publication 35100: In-Line Nondestructive Inspection of Pipelines.
3. API 1163: In-Line Inspection Systems Qualification Standard, June 2004.
4. Hahn, Gerald; Meeker, William: Statistical Intervals: A Guide for Practitioners.


The ILI Validation criterion is based on the assumptions and calculations in the Appendix. The ILI depth accuracy is validated if:

1. The number of available comparisons is 134 or more; and
2. The calculated value, s, of the standard deviation of the difference in depth is less than 8.6%.

FURTHER DEVELOPMENT

These guidelines, primarily for the Process Verification stage, are intended to be a starting point for owners to develop more comprehensive and consistent Verification protocols in working with the ILI vendors. Specifically, ILI reporting and the documentation of all checks need to be standardized. Reporting standardization will result in greater consistency of the Verification process. Proper documentation is indicative of the ILI vendor's diligence in following established standards and recommended practices.



APPENDIX – SAMPLE SIZE CALCULATION FOR VALIDATION

Validation is the process whereby the results of an inspection are compared to a set of reference measurements to determine whether the differences between the inspection results and the reference measurements are consistent with the specified performance accuracy of the ILI tool. In many cases, the reference measurements are collected during an excavation of the pipeline using NDT devices. In other cases, a previous ILI run may serve as the reference measurements. The sample size is the number of comparisons made between the ILI and the reference measurements. The minimum required sample size depends on two main factors: the accuracy of the reference measurements and the size of the confidence interval for the estimation of ILI accuracy. For this project, the inspection is considered validated if the calculated tolerance of the tool is better than 11% NWT, 80% of the time, with a 95% confidence level.

Case I, Validation using field measurements

This section provides the calculation of the minimum sample size required to validate an ILI using NDT measurements collected on the pipeline during an excavation. The assumption is that the NDT measurements are highly accurate compared to the accuracy of the ILI. If the field measurements have significant error, then calculations similar to those in Case II of this appendix would be required to determine the required sample size. In this comparison, the measurements made in the field are the reference measurements to which the inspection results are compared. The differences between the ILI-reported depths and the field-reported depths are analyzed by statistical methods to assess the size of the errors, that is, the accuracy of the ILI tool. The accuracy of ILI tools is often stated as a tolerance and certainty (or confidence level). For example, a typical accuracy of an ILI tool is 10% NWT, 80% of the time; the tolerance is 10% and the certainty is 80%.
The assessment of the accuracy of the ILI tool requires the estimation of the tolerance. From a limited sample size (number of comparisons), the tolerance can only be estimated to within a limited confidence interval. A large sample size yields a small confidence interval, and a small sample size yields a large confidence interval. Figure 4 shows the distribution of error for an idealized inspection tool. The distribution is normal with a mean, μ, of zero and a standard deviation, σ, of 7.8% NWT. The shape of the normal distribution is such that 80% of the area under the curve lies between −1.28σ and +1.28σ. For the distribution of errors in Figure 4, 80% of the errors are within 10% (1.28 × 7.8% NWT = 10% NWT) (red shaded area). Estimating the 80% error tolerance of the tool therefore requires calculating the standard deviation and multiplying it by 1.28.

[Figure: normal probability density of the ILI error, horizontal axis from −30% to +30%]

Figure 4 – Distribution of error for a typical ILI run.
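The 1.28 multiplier is the standard normal quantile that leaves 10% in each tail; it can be checked without any external library (a minimal sketch; `normal_quantile` is an illustrative helper, not a standard function):

```python
import math

def normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_quantile(p, lo=-10.0, hi=10.0):
    """Invert the standard normal CDF by bisection."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if normal_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# 80% of errors fall within +/-z, so z is the 90th percentile (~1.28).
z80 = normal_quantile(0.90)
# A 10 %NWT tolerance at 80% certainty then implies sigma ~ 7.8 %NWT.
sigma = 10.0 / z80
```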

The standard deviation of the ILI-tool error can only be estimated from a sample of comparisons of the ILI-reported depths to the reference depths. The difference between an ILI-reported depth, x_i, and a reference measurement, r_i, is

    d_i = x_i - r_i

The average difference, \bar{d}, of n comparisons between the ILI-reported depths and the reference depths is

    \bar{d} = \frac{1}{n} \sum_{i=1}^{n} d_i

The standard deviation, s, of the difference in depth for the sample is

    s = \sqrt{\frac{1}{n-1} \sum_{i=1}^{n} (d_i - \bar{d})^2}
Note that s is an estimate of σ. The true value of the standard deviation, σ, satisfies

    \sqrt{\frac{(n-1)s^2}{\chi^2_{\alpha/2,\,n-1}}} < \sigma < \sqrt{\frac{(n-1)s^2}{\chi^2_{1-\alpha/2,\,n-1}}}

or

    s\sqrt{\frac{n-1}{\chi^2_{\alpha/2,\,n-1}}} < \sigma < s\sqrt{\frac{n-1}{\chi^2_{1-\alpha/2,\,n-1}}}    (A1)

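Equation (A1) can be evaluated directly with a chi-squared distribution; a sketch using SciPy (assuming `scipy` is available; the `s` and `n` values below are illustrative inputs, not results from the paper):

```python
import math
from scipy.stats import chi2

def sigma_confidence_interval(s, n, alpha=0.05):
    """Two-sided (1 - alpha) confidence interval for sigma, given the
    sample standard deviation s of n comparisons (Equation A1).
    chi2.ppf(1 - alpha/2, dof) is the large, upper-tail alpha/2 point,
    so it appears in the lower confidence limit."""
    dof = n - 1
    lower = s * math.sqrt(dof / chi2.ppf(1.0 - alpha / 2.0, dof))
    upper = s * math.sqrt(dof / chi2.ppf(alpha / 2.0, dof))
    return lower, upper

# Example: s = 8.6 %NWT observed over 134 excavation comparisons.
lo, hi = sigma_confidence_interval(8.6, 134)
```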


In these expressions, σ is the standard deviation of the ILI-tool errors; 1 − α is the confidence level; and χ²_{p, n−1} is the chi-squared value for upper-tail probability p and n − 1 degrees of freedom.

How large a sample size is required to validate an in-line inspection depends on the precision to which the standard deviation, σ, needs to be estimated. That precision is largely a matter of judgment. For this project, we believe that, for the purposes of validation, the 80% confidence level tolerance should be determined to within 1% NWT. This criterion is satisfied if

    0.9091 < \sigma / s < 1.1111

In other words, if the assessed accuracy of the ILI is 11.0% NWT, 80% of the time, then we can state with 95% confidence that the true tolerance is between 10% and 12%. Since the specified performance accuracy is within the confidence interval (10/11 = 0.9091), the ILI would be validated by the comparisons. However, any assessed tolerance greater than 11% would not validate the run. A value of

    \sqrt{\frac{n-1}{\chi^2_{\alpha,\,n-1}}} \geq 0.91

with α = 0.05, requires that n ≥ 134.

Case II, Validation using a previous ILI

This section provides the calculation of the minimum sample size required to validate an ILI using a previous ILI. Whereas the calculation above assumes that the NDT measurements are highly accurate, the accuracy of a previous ILI is comparable to that of the current inspection. The accuracy of ILI tools is often stated as a tolerance and certainty (or confidence level). This section assumes that the accuracy of the previous inspection is 10% NWT, 80% of the time. This difference changes Equation A1 to

    s\sqrt{\frac{n-1}{\chi^2_{\alpha/2,\,n-1}}} < \sigma_c < s\sqrt{\frac{n-1}{\chi^2_{1-\alpha/2,\,n-1}}}    (A2)

where

    \sigma_c^2 = \sigma^2 + \sigma_r^2

σ_r is the standard deviation of the errors in the reference (previous ILI) measurements, and s is the standard deviation of the sample of differences.

If both inspections have an accuracy of 10% NWT, 80% of the time, then the standard deviation of error for both is 7.8% NWT. The expected value of s is given by

    s^2 = \sigma^2 + \sigma_r^2 = (7.8)^2 + (7.8)^2 = (11.03)^2

However, to validate the run, we require the assessed accuracy to be within 1% NWT of 10% NWT. Thus

    \frac{\sqrt{(7.02)^2 + (7.8)^2}}{11.03} < \frac{\sigma_c}{s} < \frac{\sqrt{(8.58)^2 + (7.8)^2}}{11.03}

that is,

    0.9514 < \frac{\sigma_c}{s} < 1.0513

Requiring

    \sqrt{\frac{n-1}{\chi^2_{\alpha,\,n-1}}} \geq 0.9514

implies n ≥ 513. In other words, to validate an ILI run using a previous inspection requires a sample size of 513 matches between the two inspections.

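The quoted minimum sample sizes can be reproduced numerically. The sketch below uses SciPy and a one-sided chi-squared bound, which matches the figures in this appendix under its assumptions; because the 0.9091 and 0.9514 thresholds are rounded, the exact minimum found may shift by a unit or two:

```python
import math
from scipy.stats import chi2

def min_sample_size(ratio, alpha=0.05, n_max=10000):
    """Smallest n such that sqrt((n - 1) / chi2.ppf(1 - alpha, n - 1))
    reaches `ratio`, i.e. the one-sided lower confidence factor on
    sigma is at least `ratio`."""
    for n in range(2, n_max):
        dof = n - 1
        if math.sqrt(dof / chi2.ppf(1.0 - alpha, dof)) >= ratio:
            return n
    raise ValueError("no sufficient n found")

# Case I: assessed tolerance within 1 %NWT of 10 %NWT -> ratio 10/11.
n_field = min_sample_size(10.0 / 11.0)     # ~134 comparisons

# Case II: combined-sigma ratio sqrt(7.02^2 + 7.8^2) / 11.03.
ratio_prev = math.sqrt(7.02 ** 2 + 7.8 ** 2) / math.sqrt(2.0 * 7.8 ** 2)
n_prev = min_sample_size(ratio_prev)       # ~513 matches
```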
