AL 04 dekalb AL0000509 Fort Payne Waterworks Board 00001T 0509001 SW 18735 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
AL 04 dekalb AL0000509 Fort Payne Waterworks Board 00001T 0509001 SW 18735 CWS Large Perchlorate ug/L 2002_04Apr 4 <MRL 0 0
AL 04 dekalb AL0000509 Fort Payne Waterworks Board 00001T 0509001 SW 18735 CWS Large Perchlorate ug/L 2002_10Oct 4 <MRL 0 0
AL 04 dekalb AL0000509 Fort Payne Waterworks Board 00001T 0509001 SW 18735 CWS Large Perchlorate ug/L 2002_01Jan 4 <MRL 0 0
AL 04 madison AL0000899 US Army Aviation & Missile Command 00001T 0899001 SW 21180 CWS Large Perchlorate ug/L 2001_12Dec 4 <MRL 0 0
AL 04 madison AL0000899 US Army Aviation & Missile Command 00002T 0899002 SW 21180 CWS Large Perchlorate ug/L 2001_12Dec 4 <MRL 0 0
AL 04 madison AL0000899 US Army Aviation & Missile Command 00001T 0899001 SW 21180 CWS Large Perchlorate ug/L 2001_09Sep 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00006T 1489005 GW 18000 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
CA CA3410700 McClellan Air Force Base - Main Base GW 17600 NTNC Large No Data
CA 09 santa barbara CA4210700 Vandenberg Air Force Base 00003 08N/34W-16G05 SWP 12000 CWS Large Perchlorate ug/L 2003_05May 4 <MRL 0 0
CA 09 santa barbara CA4210700 Vandenberg Air Force Base 00002 08N/34W-16C05 SWP 12000 CWS Large Perchlorate ug/L 2003_05May 4 <MRL 0 0
CA 09 santa barbara CA4210700 Vandenberg Air Force Base 08N/34W-16J02 SWP 12000 CWS Large Perchlorate ug/L 2003_05May 4 <MRL 0 0
CA 09 santa barbara CA4210700 Vandenberg Air Force Base 00003 08N/34W-16G04 SWP 12000 CWS Large Perchlorate ug/L 2003_05May 4 <MRL 0 0
CA 09 solano CA4810015 Travis Air Force Base - Vallejo 00003 4810015-003 SW 32000 CWS Large Perchlorate ug/L 2001_01Jan 4 <MRL 0 0
CA 09 solano CA4810015 Travis Air Force Base - Vallejo 00003 4810015-003 SW 32000 CWS Large Perchlorate ug/L 2001_07Jul 4 <MRL 0 0
CA 09 solano CA4810015 Travis Air Force Base - Vallejo 00003 4810015-003 SW 32000 CWS Large Perchlorate ug/L 2001_10Oct 4 <MRL 0 0
CA CA4810701 Travis Air Force Base GW 13837 CWS Large No Data
FL FL1170814 Corry Field - Naval Air Station GW 17192 CWS Large No Data
FL 04 duval FL2160734 Mayport Naval Station (Mainside) 08001 32228 GW 14642 CWS Large Perchlorate ug/L 2003_02Feb 4 <MRL 0 0
FL 04 duval FL2160734 Mayport Naval Station (Mainside) 08001 32228 GW 14642 CWS Large Perchlorate ug/L 2003_07Jul 4 <MRL 0 0
FL 04 duval FL2161212 Jacksonville Naval Air Station 08001 32212 GW 20000 CWS Large Perchlorate ug/L 2003_07Jul 4 <MRL 0 0
FL 04 duval FL2161212 Jacksonville Naval Air Station 08001 32212 GW 20000 CWS Large Perchlorate ug/L 2003_02Feb 4 <MRL 0 0
FL 04 duval FL2161212 Jacksonville Naval Air Station 08002 83868 GW 20000 CWS Large Perchlorate ug/L 2003_02Feb 4 <MRL 0 0
FL 04 duval FL2161212 Jacksonville Naval Air Station 08002 83868 GW 20000 CWS Large Perchlorate ug/L 2003_07Jul 4 <MRL 0 0
GA 04 liberty GA1790024 Fort Stewart - Main 03468 303 GW 21000 CWS Large Perchlorate ug/L 2002_01Jan 4 <MRL 0 0
GA 04 liberty GA1790024 Fort Stewart - Main 03687 304 GW 21000 CWS Large Perchlorate ug/L 2002_06Jun 4 <MRL 0 0
GA 04 liberty GA1790024 Fort Stewart - Main 02965 302 GW 21000 CWS Large Perchlorate ug/L 2002_06Jun 4 <MRL 0 0
GA GA2150002 Fort Benning SW 44000 CWS Large No Data
GA 04 richmond GA2450028 Fort Gordon 02054 301 SW 24000 CWS Large Perchlorate ug/L 2003_07Jul 4 <MRL 0 0
GA 04 richmond GA2450028 Fort Gordon 02054 301 SW 24000 CWS Large Perchlorate ug/L 2003_04Apr 4 <MRL 0 0
GA 04 richmond GA2450028 Fort Gordon 02054 301 SW 24000 CWS Large Perchlorate ug/L 2003_12Dec 4 <MRL 0 0
GA 04 richmond GA2450028 Fort Gordon 02054 301 SW 24000 CWS Large Perchlorate ug/L 2003_09Sep 4 <MRL 0 0
IA 07 lee IA5625062 Fort Madison Municipal Waterworks 04840 02 SW 11618 CWS Large Perchlorate ug/L 2001_03Mar 4 <MRL 0 0
IA 07 lee IA5625062 Fort Madison Municipal Waterworks 04840 02 SW 11618 CWS Large Perchlorate ug/L 2001_11Nov 4 <MRL 0 0
IA 07 lee IA5625062 Fort Madison Municipal Waterworks 04840 02 SW 11618 CWS Large Perchlorate ug/L 2001_12Dec 4 <MRL 0 0
IA 07 lee IA5625062 Fort Madison Municipal Waterworks 04840 02 SW 11618 CWS Large Perchlorate ug/L 2001_06Jun 4 <MRL 0 0
IA 07 webster IA9433050 Fort Dodge Water Supply 06471 02 GW 25894 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
IA 07 webster IA9433050 Fort Dodge Water Supply 06470 03 GW 25894 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
IA 07 webster IA9433050 Fort Dodge Water Supply 06472 01 GW 25894 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
IA 07 webster IA9433050 Fort Dodge Water Supply 06472 01 GW 25894 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
IA 07 webster IA9433050 Fort Dodge Water Supply 06470 03 GW 25894 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
IA 07 webster IA9433050 Fort Dodge Water Supply 06471 02 GW 25894 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
IL 05 lake IL0975227 GREAT LAKES NAVAL TRAINING STATION 16173 TAP_01 SW 31637 CWS Large Perchlorate ug/L 2003_02Feb 4 <MRL 0 0
IL 05 lake IL0975227 GREAT LAKES NAVAL TRAINING STATION 16173 TAP_01 SW 31637 CWS Large Perchlorate ug/L 2003_11Nov 4 <MRL 0 0
IL 05 lake IL0975227 GREAT LAKES NAVAL TRAINING STATION 16173 TAP_01 SW 31637 CWS Large Perchlorate ug/L 2003_05May 4 <MRL 0 0
IL 05 lake IL0975227 GREAT LAKES NAVAL TRAINING STATION 16173 TAP_01 SW 31637 CWS Large Perchlorate ug/L 2003_08Aug 4 <MRL 0 0
IN 05 gibson IN5226001 Fort Branch Water Department 00001T 901 GW 3591 CWS M Perchlorate ug/L 2002_11Nov 4 <MRL 0 0
KY KY0470990 FORT KNOX / ENGINEERING & HOUSING SW 42400 CWS Large No Data
MD 03 anne arundel MD0020012 FORT GEORGE G. MEADE 00001 0100000 SW 50001 CWS VL Perchlorate ug/L 2002_10Oct 4 <MRL 0 0
MD 03 anne arundel MD0020012 FORT GEORGE G. MEADE 00001 0100000 SW 50001 CWS VL Perchlorate ug/L 2003_04Apr 4 <MRL 0 0
MD 03 anne arundel MD0020012 FORT GEORGE G. MEADE 00001 0100000 SW 50001 CWS VL Perchlorate ug/L 2003_07Jul 4 <MRL 0 0
MD MD0120002 Aberdeen Proving Ground - Chapel Hill SW 12002 CWS Large No Data
MD MD0180022 Patuxent Naval Air Station (NAWCAD) GW 11722 CWS Large No Data
NC 04 cumberland NC0326344 FORT BRAGG 00004 EP1 SW 65000 CWS VL Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
NC 04 cumberland NC0326344 FORT BRAGG 00004 EP1 SW 65000 CWS VL Perchlorate ug/L 2003_01Jan 4 <MRL 0 0
NC 04 cumberland NC0326344 FORT BRAGG 00004 EP1 SW 65000 CWS VL Perchlorate ug/L 2002_10Oct 4 <MRL 0 0
NC NC0467041 USMC LEJEUNE - HADNOT POINT GW 35000 CWS Large No Data
NC NC0467042 USMC Lejeune - New River Air Station GW 11500 CWS Large No Data
OH 05 greene OH2903312 WRIGHT-PATTERSON AFB, 'B' 00013 EP1 GW 12045 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
OH 05 greene OH2903312 WRIGHT-PATTERSON AFB, 'B' 00013 EP1 GW 12045 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
OH 05 greene OH2903312 WRIGHT-PATTERSON AFB, 'B' 00014 EP2 GW 12045 CWS Large Perchlorate ug/L 2002_02Feb 4 17.2
OH 05 greene OH2903412 WRIGHT-PATTERSON AFB AREA A/C 00014 EP3 GW 15160 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
OH 05 greene OH2903412 WRIGHT-PATTERSON AFB AREA A/C 00013 EP1 GW 15160 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
OH 05 greene OH2903412 WRIGHT-PATTERSON AFB AREA A/C 00013 EP1 GW 15160 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
OH 05 greene OH2903412 WRIGHT-PATTERSON AFB AREA A/C 00016 EP2 GW 15160 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
OH 05 greene OH2903412 WRIGHT-PATTERSON AFB AREA A/C 00016 EP2 GW 15160 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
OH 05 greene OH2903412 WRIGHT-PATTERSON AFB AREA A/C 00014 EP3 GW 15160 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
OH 05 greene OH2903412 WRIGHT-PATTERSON AFB AREA A/C 00015 EP4 GW 15160 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
OH 05 greene OH2903412 WRIGHT-PATTERSON AFB AREA A/C 00015 EP4 GW 15160 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
TN TN0000820 FORT CAMPBELL WATER SYSTEM SW 40000 CWS Large No Data
TX TX0710020 FORT BLISS MAIN BASE AREA GW 21800 CWS Large No Data
TX TX2430007 SHEPPARD AIR FORCE BASE SWP 12810 CWS Large No Data
UT 08 davis UT4900513 HILL AIR FORCE BASE 00009T 0602409 SWP 22082 CWS Large Perchlorate ug/L 2001_10Oct 4 <MRL 0 0
UT 08 davis UT4900513 HILL AIR FORCE BASE 00002T 0602402 SWP 22082 CWS Large Perchlorate ug/L 2001_10Oct 4 <MRL 0 0
UT 08 davis UT4900513 HILL AIR FORCE BASE 00008T 0602408 SWP 22082 CWS Large Perchlorate ug/L 2001_10Oct 4 <MRL 0 0
UT 08 davis UT4900513 HILL AIR FORCE BASE 00002T 0602402 SWP 22082 CWS Large Perchlorate ug/L 2001_05May 4 <MRL 0 0
UT 08 davis UT4900513 HILL AIR FORCE BASE 00008T 0602408 SWP 22082 CWS Large Perchlorate ug/L 2001_05May 4 <MRL 0 0
UT 08 davis UT4900513 HILL AIR FORCE BASE 00009T 0602409 SWP 22082 CWS Large Perchlorate ug/L 2001_05May 4 <MRL 0 0
VA VA6153675 QUANTICO MARINE BASE - MAINSIDE SW 13012 CWS Large No Data
WA 10 kitsap WA5302714 SUBASE BANGOR 70090 7009C GW 15843 CWS Large Perchlorate ug/L 2003_05May 4 <MRL 0 0
WA 10 kitsap WA5302714 SUBASE BANGOR 70510 7051CL GW 15843 CWS Large Perchlorate ug/L 2003_05May 4 <MRL 0 0
WA 10 kitsap WA5302714 SUBASE BANGOR 70510 7051CL GW 15843 CWS Large Perchlorate ug/L 2003_12Dec 4 <MRL 0 0
WA 10 kitsap WA5302714 SUBASE BANGOR 70090 7009C GW 15843 CWS Large Perchlorate ug/L 2003_12Dec 4 <MRL 0 0
WA WA5324350 FAIRCHILD AIR FORCE BASE GW 11227 CWS Large No Data
PWS count: 61   PWS w/o data count: 40
DoD UCMR DATA FROM DANIEL HAUTMAN, UCMR IMPLEMENTATION TEAM CO-LEADER, OFFICE OF GROUNDWATER AND DRINKING WATER, EPA
Air Force UCMR Data From EPA UCMR POC
StateID | EPA_Region | PCounty | PWSID | PWSName | Facility ID | Sample point ID | Source Water | Population | PWSType | Size | Contaminant | Unit measure | Sample collection year & month | MRL (ug/L) | Result | Detect count | PWS data count | PWS w/o data count
Page 1 of 8
CA 09 solano CA4810015 Travis Air Force Base - Vallejo 00003 4810015-003 SW 32000 CWS Large Perchlorate ug/L 2001_01Jan 4 <MRL 0 0
CA 09 solano CA4810015 Travis Air Force Base - Vallejo 00003 4810015-003 SW 32000 CWS Large Perchlorate ug/L 2001_07Jul 4 <MRL 0 0
CA 09 solano CA4810015 Travis Air Force Base - Vallejo 00003 4810015-003 SW 32000 CWS Large Perchlorate ug/L 2001_10Oct 4 <MRL 0 0
CA CA4810701 Travis Air Force Base GW 13837 CWS Large No Data
UT 08 davis UT4900513 HILL AIR FORCE BASE 00009T 0602409 SWP 22082 CWS Large Perchlorate ug/L 2001_10Oct 4 <MRL 0 0
UT 08 davis UT4900513 HILL AIR FORCE BASE 00002T 0602402 SWP 22082 CWS Large Perchlorate ug/L 2001_10Oct 4 <MRL 0 0
UT 08 davis UT4900513 HILL AIR FORCE BASE 00008T 0602408 SWP 22082 CWS Large Perchlorate ug/L 2001_10Oct 4 <MRL 0 0
UT 08 davis UT4900513 HILL AIR FORCE BASE 00002T 0602402 SWP 22082 CWS Large Perchlorate ug/L 2001_05May 4 <MRL 0 0
UT 08 davis UT4900513 HILL AIR FORCE BASE 00008T 0602408 SWP 22082 CWS Large Perchlorate ug/L 2001_05May 4 <MRL 0 0
UT 08 davis UT4900513 HILL AIR FORCE BASE 00009T 0602409 SWP 22082 CWS Large Perchlorate ug/L 2001_05May 4 <MRL #REF! #REF!
Page 3 of 8
Navy/Marine Corps UCMR Data From EPA UCMR POC
FL 04 duval FL2160734 Mayport Naval Station (Mainside) 08001 32228 GW 14642 CWS Large Perchlorate ug/L 2003_07Jul 4 <MRL 1 0
FL 04 duval FL2161212 Jacksonville Naval Air Station 08001 32212 GW 20000 CWS Large Perchlorate ug/L 2003_07Jul 4 <MRL 0 0
FL 04 duval FL2161212 Jacksonville Naval Air Station 08001 32212 GW 20000 CWS Large Perchlorate ug/L 2003_02Feb 4 <MRL 0 0
FL 04 duval FL2161212 Jacksonville Naval Air Station 08002 83868 GW 20000 CWS Large Perchlorate ug/L 2003_02Feb 4 <MRL 0 0
FL 04 duval FL2161212 Jacksonville Naval Air Station 08002 83868 GW 20000 CWS Large Perchlorate ug/L 2003_07Jul 4 <MRL 1 0
GU GU0000010 U.S. Navy Water System SW 14300 CWS Large No Data
NC NC0467041 USMC LEJEUNE - HADNOT POINT GW 35000 CWS Large No Data
NC NC0467043 USMC LEJEUNE - HOLCOMB BLVD. GW 17000 CWS Large No Data
Page 4 of 8
VA VA6153675 QUANTICO MARINE BASE - MAINSIDE SW 13012 CWS Large No Data
WA 10 kitsap WA5302714 SUBASE BANGOR 70090 7009C GW 15843 CWS Large Perchlorate ug/L 2003_05May 4 <MRL 0 0
WA 10 kitsap WA5302714 SUBASE BANGOR 70510 7051CL GW 15843 CWS Large Perchlorate ug/L 2003_05May 4 <MRL 0 0
WA 10 kitsap WA5302714 SUBASE BANGOR 70510 7051CL GW 15843 CWS Large Perchlorate ug/L 2003_12Dec 4 <MRL 0 0
WA 10 kitsap WA5302714 SUBASE BANGOR 70090 7009C GW 15843 CWS Large Perchlorate ug/L 2003_12Dec 4 <MRL 1 0
Page 5 of 8
Army UCMR Data From EPA UCMR POC
AL 04 madison AL0000899 US Army Aviation & Missile Command 00002T 0899002 SW 21180 CWS Large Perchlorate ug/L 2002_03Mar 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00005T 1489004 GW 18000 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00003T 1489002 GW 18000 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00004T 1489003 GW 18000 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00007T 1489006 GW 18000 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00008T 1489007 GW 18000 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00006T 1489005 GW 18000 CWS Large Perchlorate ug/L 2002_02Feb 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00003T 1489002 GW 18000 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00004T 1489003 GW 18000 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00005T 1489004 GW 18000 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00001T 1489000 GW 18000 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
Page 6 of 8
AL 04 dale AL0001489 Fort Rucker 00008T 1489007 GW 18000 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
AL 04 dale AL0001489 Fort Rucker 00006T 1489005 GW 18000 CWS Large Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
AZ AZ0402078 Fort Huachuca GW 15603 CWS Large No Data
CA CA2710705 Camp Roberts - California National Guard GW 20370 CWS Large No Data
GA 04 liberty GA1790024 Fort Stewart - Main 01472 301 GW 21000 CWS Large Perchlorate ug/L 2002_06Jun 4 <MRL 0 0
GA 04 liberty GA1790024 Fort Stewart - Main 03468 303 GW 21000 CWS Large Perchlorate ug/L 2002_06Jun 4 <MRL 0 0
GA 04 liberty GA1790024 Fort Stewart - Main 01472 301 GW 21000 CWS Large Perchlorate ug/L 2002_01Jan 4 <MRL 0 0
GA 04 liberty GA1790024 Fort Stewart - Main 03687 304 GW 21000 CWS Large Perchlorate ug/L 2002_01Jan 4 <MRL 0 0
GA 04 liberty GA1790024 Fort Stewart - Main 03795 305 GW 21000 CWS Large Perchlorate ug/L 2002_01Jan 4 <MRL 0 0
GA 04 liberty GA1790024 Fort Stewart - Main 02965 302 GW 21000 CWS Large Perchlorate ug/L 2002_01Jan 4 <MRL 0 0
GA 04 liberty GA1790024 Fort Stewart - Main 03468 303 GW 21000 CWS Large Perchlorate ug/L 2002_01Jan 4 <MRL 0 0
GA 04 liberty GA1790024 Fort Stewart - Main 03687 304 GW 21000 CWS Large Perchlorate ug/L 2002_06Jun 4 <MRL 0 0
GA 04 liberty GA1790024 Fort Stewart - Main 02965 302 GW 21000 CWS Large Perchlorate ug/L 2002_06Jun 4 <MRL 0 0
GA GA2150002 Fort Benning SW 44000 CWS Large No Data
GA 04 richmond GA2450028 Fort Gordon 02054 301 SW 24000 CWS Large Perchlorate ug/L 2003_07Jul 4 <MRL 0 0
GA 04 richmond GA2450028 Fort Gordon 02054 301 SW 24000 CWS Large Perchlorate ug/L 2003_04Apr 4 <MRL 0 0
GA 04 richmond GA2450028 Fort Gordon 02054 301 SW 24000 CWS Large Perchlorate ug/L 2003_12Dec 4 <MRL 0 0
GA 04 richmond GA2450028 Fort Gordon 02054 301 SW 24000 CWS Large Perchlorate ug/L 2003_09Sep 4 <MRL 0 0
KS KS2006114 FORT RILEY GW 18000 CWS Large No Data
Page 7 of 8
KY KY0470990 FORT KNOX / ENGINEERING & HOUSING SW 42400 CWS Large No Data
MD 03 anne arundel MD0020012 FORT GEORGE G. MEADE 00001 0100000 SW 50001 CWS VL Perchlorate ug/L 2003_01Jan 4 <MRL 0 0
MD 03 anne arundel MD0020012 FORT GEORGE G. MEADE 00001 0100000 SW 50001 CWS VL Perchlorate ug/L 2002_10Oct 4 <MRL 0 0
MD 03 anne arundel MD0020012 FORT GEORGE G. MEADE 00001 0100000 SW 50001 CWS VL Perchlorate ug/L 2003_04Apr 4 <MRL 0 0
MD 03 anne arundel MD0020012 FORT GEORGE G. MEADE 00001 0100000 SW 50001 CWS VL Perchlorate ug/L 2003_07Jul 4 <MRL 1 0
MD MD0120002 Aberdeen Proving Ground - Chapel Hill SW 12002 CWS Large No Data
NC 04 cumberland NC0326344 FORT BRAGG 00004 EP1 SW 65000 CWS VL Perchlorate ug/L 2002_07Jul 4 <MRL 0 0
NC 04 cumberland NC0326344 FORT BRAGG 00004 EP1 SW 65000 CWS VL Perchlorate ug/L 2003_01Jan 4 <MRL 1 0
TN TN0000820 FORT CAMPBELL WATER SYSTEM SW 40000 CWS Large No Data
Page 8 of 8
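Each data row in the listings above follows the column layout given in the page headers, ending with a Result field that holds either a concentration, "<MRL", or "No Data". As a rough sketch of how such records might be screened for quantified detections (the field names below are my own shorthand, not an official EPA schema):

```python
# Minimal sketch: screen UCMR-style records for quantified detections.
# Field names are illustrative shorthand, not an official EPA layout.

def is_detection(result: str) -> bool:
    """True when Result holds a number rather than '<MRL' or 'No Data'."""
    try:
        float(result)
        return True
    except ValueError:
        return False

rows = [
    {"pwsid": "OH2903312", "pws_name": "WRIGHT-PATTERSON AFB, 'B'",
     "month": "2002_02Feb", "mrl_ug_l": 4, "result": "17.2"},
    {"pwsid": "GA2450028", "pws_name": "Fort Gordon",
     "month": "2003_07Jul", "mrl_ug_l": 4, "result": "<MRL"},
    {"pwsid": "KS2006114", "pws_name": "FORT RILEY",
     "month": "", "mrl_ug_l": None, "result": "No Data"},
]

detections = [r for r in rows if is_detection(r["result"])]
for r in detections:
    print(f'{r["pws_name"]}: {r["result"]} ug/L in {r["month"]}')
```

Applied to the listings above, the only quantified result is the 17.2 ug/L perchlorate reading at Wright-Patterson AFB 'B' (entry point EP2, February 2002); every other sampled row reports "<MRL".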
SITES THAT APPEAR IN EPA's UCMR DATABASE
The following sites appear in EPA's UCMR database, but are NOT included in DoD's
consolidated UCMR spreadsheet. Most of these sites may not need to be included in
DoD's current UCMR data since they report "No Data" in EPA's database under
"Result." However, there are four sites (shown below) that report "<MRL" in EPA's
database under "Result," but do not appear in DoD's consolidated UCMR spreadsheet.
Result = <MRL
• Great Lakes Naval Training Station
• U.S. Army Aviation & Missile Command
• Fort Rucker
• Fort Bragg
Result = No Data
• Elmendorf AFB
• McClellan AFB - Main Base
• Kirtland AFB
• Kelly AFB
• Randolph AFB
• Sheppard AFB
• Corry Field - Naval Air Station
• U.S. Navy Water System (Guam)
• Fort Richardson
• Fort Wainwright
• Camp Roberts - California National Guard
• Fort Irwin
• Fort Knox
• South Fort Polk
• Fort Campbell Water System
• South Fort Hood
• Fort Sam Houston
• Fort Bliss Main Base Area
SITES THAT APPEAR IN DoD's UCMR DATA
The following sites are included in DoD's consolidated UCMR spreadsheet, but NOT
included in EPA's UCMR database.
• Fort Monroe
• Fort Drum
• Commander U.S. Naval Forces, Marianas
• Coronado Navbase
• El Centro NAF
• Marine Corps Logistics Base Barstow
• NAVAIRWPNSTA CHINA LAKE CA
• Navy Public Works Center (IL)
• Navy Public Works Center, Pearl Harbor
• Pensacola NAS
• Barksdale AFB
• Beale AFB
• Cannon AFB
• Davis-Monthan AFB
• Shaw AFB
• Hanscom AFB
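The cross-check described in the two sections above amounts to a set comparison between the sites in EPA's UCMR database and those in DoD's consolidated spreadsheet. A minimal sketch, using a few abbreviated site names from the lists above (variable names are mine, and only a handful of sites are shown for illustration):

```python
# Sketch of the EPA-vs-DoD site cross-check described above,
# using a small illustrative subset of each site list.

epa_sites = {
    "Great Lakes Naval Training Station",
    "Fort Rucker",
    "Fort Bragg",
    "Fort Knox",
    "Fort Gordon",
}

dod_sites = {
    "Fort Gordon",
    "Fort Monroe",
    "Fort Drum",
    "Barksdale AFB",
}

# Set difference in each direction flags the mismatched sites.
in_epa_only = sorted(epa_sites - dod_sites)  # in EPA's database, missing from DoD's spreadsheet
in_dod_only = sorted(dod_sites - epa_sites)  # in DoD's spreadsheet, missing from EPA's database

print("EPA only:", in_epa_only)
print("DoD only:", in_dod_only)
```

Sites appearing in both sources (here, Fort Gordon) drop out of both difference lists.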
Guidelines for Ensuring and
Maximizing the Quality, Objectivity,
Utility, and Integrity of Information
Disseminated by the Environmental
Protection Agency
EPA/260R-02-008
October 2002
Prepared by:
Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of
Information Disseminated by the Environmental Protection Agency
Table of Contents
1 Introduction 3
2 EPA Mission and Commitment to Quality 5
2.1 EPA's Mission and Commitment to Public Access 5
2.2 Information Management in EPA 5
2.3 EPA's Relationship with State, Tribal, and Local Governments 8
3 OMB Guidelines 9
4 Existing Policies and Procedures that Ensure and Maximize Information Quality 10
4.1 Quality System 10
4.2 Peer Review Policy 11
4.3 Action Development Process 12
4.4 Integrated Error Correction Process 12
4.5 Information Resources Management Manual 13
4.6 Risk Characterization Policy and Handbook 13
4.7 Program-Specific Policies 13
4.8 EPA Commitment to Continuous Improvement 14
4.9 Summary of New Activities and Initiatives 14
Table of Contents
8.5 How Does EPA Expect to Process Requests for Correction of Information on Which EPA has Sought Public Comment? 32
8.6 What Should be Included in a Request Asking EPA to Reconsider its Decision on a Request for the Correction of Information? 34
8.7 How Does EPA Intend to Process Requests for Reconsideration of EPA Decisions? 34
Table of Contents 2
1 Introduction
Developed in response to guidelines issued by the Office of Management and Budget (OMB)1
under Section 515(a) of the Treasury and General Government Appropriations Act for Fiscal
Year 2001 (Public Law 106-554; H.R. 5658), the Guidelines for Ensuring and Maximizing the
Quality, Objectivity, Utility, and Integrity of Information Disseminated by the Environmental
Protection Agency (the Guidelines) contain EPA's policy and procedural guidance for ensuring
and maximizing the quality of information we disseminate. The Guidelines also outline
administrative mechanisms for EPA pre-dissemination review of information products and
describe some new mechanisms to enable affected persons to seek and obtain corrections from
EPA regarding disseminated information that they believe does not comply with EPA or OMB
guidelines. Beyond policies and procedures these Guidelines also incorporate the following
performance goals:
• The principles of information quality should be integrated into each step of EPA's
development of information, including creation, collection, maintenance, and
dissemination.
OMB encourages agencies to incorporate standards and procedures into existing information
resources management practices rather than create new, potentially duplicative processes. EPA
has taken this advice and relies on numerous existing quality-related policies in these Guidelines.
EPA will work to ensure seamless implementation into existing practices. It is expected that
EPA managers and staff will familiarize themselves with these Guidelines, and will carefully
review existing program policies and procedures in order to accommodate the principles outlined
in this document.
1 Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information
Disseminated by Federal Agencies, OMB, 2002 (67 FR 8452). Hereinafter "OMB guidelines".
http://www.whitehouse.gov/omb/fedreg/reproducible2.pdf
Introduction 3
EPA's Guidelines are intended to carry out OMB's government-wide policy regarding
information we disseminate to the public. Our Guidelines reflect EPA's best effort to present our
goals and commitments for ensuring and maximizing the quality of information we disseminate.
As such, they are not a regulation and do not change or substitute for any legal requirements.
They provide non-binding policy and procedural guidance, and are therefore not intended to
create legal rights, impose legally binding requirements or obligations on EPA or the public
when applied in particular situations, or change or impact the status of information we
disseminate, nor to contravene any other legal requirements that may apply to particular agency
determinations or other actions. EPA's intention is to fully implement these Guidelines in order
to achieve the purposes of Section 515.
These Guidelines are the product of an open, collaborative process between EPA and numerous
EPA stakeholders. The Guidelines development process is described in the Appendix to this
document. EPA received many public comments and has addressed most comments in these
Guidelines. A discussion of public comments is also provided in the Appendix and is grouped by
overarching themes and comments by Guidelines topic areas. EPA views these Guidelines as a
living document, and anticipates their revision as we work to further ensure and maximize
information quality.
Introduction 4
2 EPA Mission and Commitment to Quality

The mission of the EPA is to protect human health and safeguard the natural environment upon
which life depends. EPA is committed to making America's air cleaner, water purer, and land
better protected and to work closely with its Federal, State, Tribal, and local government
partners; with citizens; and with the regulated community to accomplish its mission. In addition,
the United States plays a leadership role in working with other nations to protect the global
environment.
EPA statutory responsibilities to protect human health and safeguard the natural environment are
described in the statutes that mandate and govern our programs. EPA manages those programs in
concert with numerous other government and private sector partners. As Congress intended, each
statute provides regulatory expectations including information quality considerations and
principles. Some statutes are more specific than others, but overall, each directs EPA and other
agencies in how we regulate to protect human health and the environment. For example, the Safe
Drinking Water Act (SDWA) Amendments of 1996 set forth certain quality principles for how
EPA should conduct human health risk assessments and characterize the potential risks to
humans from drinking water contaminants. Information quality is a key component of every
statute that governs our mission.
The collection, use, and dissemination of information of known and appropriate quality are
integral to ensuring that EPA achieves its mission. Information about human health and the
environment -- environmental characteristics; physical, chemical, and biological processes; and
chemical and other pollutants -- underlies all environmental management and health protection
decisions. The availability of, and access to, information and the analytical tools to understand it
are essential for assessing environmental and human health risks, designing appropriate and
cost-effective policies and response strategies, and measuring environmental improvements.
EPA works every day to ensure information quality, but we do not wait until the point of
dissemination to consider important quality principles. While the final review of a document
before it is published is very important to ensuring a product of high quality, we know that in
order to maximize quality, we must start much earlier. When you read an EPA report at your
local library or view EPA information on our web site, that information is the result of processes
undertaken by EPA and our partners that assured quality along each step of the way. To better
describe this interrelated information quality process, the following presents some of the major
roles that EPA plays in its effort to ensure and maximize the quality of the information:
• The final category covers information not included in any of the above
three categories: information that is either voluntarily submitted to
EPA in hopes of influencing a decision or that EPA obtains for use in
developing a policy, regulatory, or other decision. Examples of this
information include scientific studies published in journal articles and
test data obtained from other Federal agencies, industry, and others.
EPA may not have any financial ties or regulatory requirements to
control the quality of this type of information.
• EPA is a conduit for information: Another major role that EPA plays in the
management of information is as a provider of public access. Such access enables
public involvement in how EPA achieves its mission. We provide access to a
variety of information holdings. Some information distributed by EPA includes
information collected through contracts; information collected through grants and
2.3 EPA's Relationship with State, Tribal, and Local Governments

As mentioned in the previous section, EPA works with a variety of partners to achieve its
mission. Our key government partners not only provide information, they also work with EPA to
manage and implement programs and communicate with the public about issues of concern. In
addition to implementing national programs through EPA Headquarters Program Offices, a vast
network of EPA Regions and other Federal, State, Tribal and local governments implement both
mandated and voluntary programs. This same network collects, uses, and distributes a wide
range of information. EPA plans to coordinate with these partners to ensure the Guidelines are
appropriate and effective.
One major mechanism to ensure and maximize information integrity is the National
Environmental Information Exchange Network (NEIEN, or Network). The result of an important
partnership between EPA, States and Tribal governments, the Network seeks to enhance the
Agency's information architecture to ensure timely and one-stop reporting from many of EPA's
information partners. Key components include the establishment of the Central Data Exchange
(CDX) portal and a System of Access for internal and external users. When fully implemented,
the Network and its many components will enhance EPA and the public's ability to access, use,
and integrate information and the ability of external providers to report to EPA.
3 OMB Guidelines
In Section 515(a) of the Treasury and General Government Appropriations Act for Fiscal Year
2001 (Public Law 106-554; H.R. 5658), Congress directed OMB to issue government-wide
guidelines that "provide policy and procedural guidance to Federal agencies for ensuring and
maximizing the quality, objectivity, utility, and integrity of information (including statistical
information) disseminated by Federal agencies...." The OMB guidelines direct agencies subject
to the Paperwork Reduction Act (44 U.S.C. 3502(1)) to:
• Issue their own information quality guidelines to ensure and maximize the
quality, objectivity, utility, and integrity of information, including statistical
information, by no later than one year after the date of issuance of the OMB
guidelines;
• Report to the Director of OMB the number and nature of complaints received by
the agency regarding agency compliance with OMB guidelines concerning the
quality, objectivity, utility, and integrity of information and how such complaints
were resolved.
The OMB guidelines provide some basic principles for agencies to consider when developing
their own guidelines including:
• Some agency information may need to meet higher or more specific expectations
for objectivity, utility, and integrity. Information of greater importance should be
held to a higher quality standard.
OMB Guidelines 9
4 Existing Policies and Procedures that Ensure and Maximize Information Quality
EPA is dedicated to the collection, generation, and dissemination of high quality information.
We disseminate a wide variety of information products, ranging from comprehensive scientific
assessments of potential health risks,2 to web-based applications that provide compliance
information and map the location of regulated entities,3 to simple fact sheets for school children.4
As a result of this diversity of information-related products and practices, different EPA
programs have evolved specialized approaches to information quality assurance. The OMB
guidelines encourage agencies to avoid the creation of "new and potentially duplicative or
contradictory processes." Further, OMB stresses that its guidelines are not intended to "impose
unnecessary administrative burdens that would inhibit agencies from continuing to take
advantage of the Internet and other technologies to disseminate information that can be of great
benefit and value to the public." In this spirit, EPA seeks to foster the continuous improvement
of existing information quality activities and programs. In implementing these guidelines, we
note that ensuring the quality of information is a key objective alongside other EPA objectives,
such as ensuring the success of Agency missions, observing budget and resource priorities and
restraints, and providing useful information to the public. EPA intends to implement these
Guidelines in a way that will achieve all these objectives in a harmonious way in conjunction
with our existing guidelines and policies, some of which are outlined below. These examples
illustrate some of the numerous systems and practices in place that address the quality,
objectivity, utility, and integrity of information.
The EPA Agency-wide Quality System helps ensure that EPA organizations maximize the
quality of environmental information, including information disseminated by the Agency. A
graded approach is used to establish quality criteria that are appropriate for the intended use of
the information and the resources available. The Quality System is documented in EPA Order
5360.1 A2, "Policy and Program Requirements for the Mandatory Agency-wide Quality
System" and the "EPA Quality Manual."5 To implement the Quality System, EPA organizations
(1) assign a quality assurance manager, or person assigned to an equivalent position, who has
sufficient technical and management expertise and authority to conduct independent oversight of
the implementation of the organization's quality system; (2) develop a Quality Management
Plan, which documents the organization's quality system; (3) conduct an annual assessment of
the organization's quality system; (4) use a systematic planning process to develop acceptance or
performance criteria prior to the initiation of all projects that involve environmental information
2 http://cfpub.epa.gov/ncea/cfm/partmatt.cfm
3 http://www.epa.gov/enviro/wme/
4 http://www.epa.gov/kids
5 EPA Quality Manual for Environmental Programs 5360 A1, May 2000.
http://www.epa.gov/quality/qs-docs/5360.pdf
collection and/or use; (5) develop Quality Assurance Project Plan(s), or equivalent document(s)
for all applicable projects and tasks involving environmental data; (6) conduct an assessment of
existing data, when used to support Agency decisions or other secondary purposes, to verify that
they are of sufficient quantity and adequate quality for their intended use; (7) implement all
Agency-wide Quality System components in all applicable EPA-funded extramural agreements;
and (8) provide appropriate training for all levels of management and staff.
The EPA Quality System may also apply to non-EPA organizations, with key principles
incorporated in the applicable regulations governing contracts, grants, and cooperative
agreements. EPA Quality System provisions may also be invoked as part of negotiated
agreements such as memoranda of understanding. Non-EPA organizations that may be subject to
EPA Quality System requirements include (a) any organization or individual under direct
contract to EPA to furnish services or items or perform work (i.e., a contractor) under the
authority of 48 CFR part 46 (including applicable work assignments, delivery orders, and task
orders); and (b) other government agencies receiving assistance from EPA through interagency
agreements. Separate quality assurance requirements for assistance recipients are set forth in 40
CFR part 30 (governing assistance agreements with institutions of higher education, hospitals,
and other non-profit recipients of financial assistance) and 40 CFR parts 31 and 35 (government
assistance agreements with State, Tribal, and local governments).
In addition to the Quality System, EPA's Peer Review Policy provides that major scientifically
and technically based work products (including scientific, engineering, economic, or statistical
documents) related to Agency decisions should be peer-reviewed. Agency managers within
Headquarters, Regions, laboratories, and field offices determine and are accountable for the
decision whether to employ peer review in particular instances and, if so, its character, scope,
and timing. These decisions are made consistent with program goals and priorities, resource
constraints, and statutory or court-ordered deadlines. For those work products that are intended
to support the most important decisions or that have special importance in their own right,
external peer review is the procedure of choice. For other work products, internal peer review is
an acceptable alternative to external peer review. Peer review is not restricted to the penultimate
version of work products; in fact, peer review at the planning stage can often be extremely
beneficial. The basis for EPA peer review policy is articulated in Peer Review and Peer
Involvement at the U.S. Environmental Protection Agency.6 The Peer Review Policy was first
issued in January, 1993, and was updated in June, 1994. In addition to the policy, EPA has
published a Peer Review Handbook,7 which provides detailed guidance for implementing the
policy. The handbook was last revised December, 2000.
6Peer Review and Peer Involvement at the U.S. EPA. June 7, 1994.
http://www.epa.gov/osp/spc/oerevmem.htm
7 Peer Review Handbook, 2nd Edition, U.S. EPA, Science Policy Council, December 2000, EPA
100-B-00-001. http://www.epa.gov/osp/spc/prhandbk.pdf
The Agency's Action Development Process also serves to ensure and maximize the quality of
EPA disseminated information. Top Agency actions and Economically Significant actions as
designated under Executive Order 12866 are developed as part of the Agency's Action
Development Process. The Action Development Process ensures the early and timely
involvement of senior management at key decision milestones to facilitate the consideration of a
broad range of regulatory and non-regulatory options and analytic approaches. Of particular
importance to the Action Development Process is ensuring that our scientists, economists, and
others with technical expertise are appropriately involved in determining needed analyses and
research, identifying alternatives, and selecting options. Program Offices and Regional Offices
are invited to participate to provide their unique perspectives and expertise. Effective
consultation with policy advisors (e.g., Senior Policy Council, Science Policy Council),
co-regulators (e.g., States, Tribes, and local governments), and stakeholders is also part of the
process. Final Agency Review (FAR) generally takes place before the release of substantive
information associated with these actions. The FAR process ensures the consistency of any
policy determinations, as well as the quality of the information underlying each policy
determination and its presentation.
The Agency's Integrated Error Correction Process8 (IECP) is a process by which members of the
public can notify EPA of a potential data error in information EPA distributes or disseminates.
This process builds on existing data processes through which discrete, numerical errors in our
data systems are reported to EPA. The IECP has made these tools more prominent and easier to
use. Individuals who identify potential data errors on the EPA web site can contact us through
the IECP by using the "Report Error" button or error correction hypertext found on major data
bases throughout EPA's web site. EPA reviews the error notification and assists in bringing the
notification to resolution with those who are responsible for the data within or outside the
Agency, as appropriate. The IECP tracks this entire process from notification through final
resolution.
The EPA Information Resources Management (IRM) Manual9 articulates and describes many of
our information development and management procedures and policies, including information
security, data standards, records management, information collection, and library services.
Especially important in the context of the Guidelines provided in this document, the IRM
Manual describes how we maintain and ensure information integrity. We believe that
maintaining information integrity refers to keeping information "unaltered," i.e., free from
unauthorized access or revision.
Beyond addressing integrity concerns, the IRM Manual also includes Agency policy on public
access and records management. These are key chapters that enable EPA to ensure transparency
and the reproducibility of information.
The EPA Risk Characterization Policy and Handbook10 provide guidance for risk
characterization that is designed to ensure that critical information from each stage of a risk
assessment is used in forming conclusions about risk. The Policy calls for a transparent process
and products that are clear, consistent and reasonable. The Handbook is designed to provide risk
assessors, risk managers, and other decision-makers an understanding of the goals and principles
of risk characterization.
We mentioned just a few of the Agency's major policies that ensure and maximize the quality of
information we disseminate. In addition to these Agency-wide systems and procedures, Program
Offices and Regions implement many Office-level and program-specific procedures to ensure
and maximize information quality. The purpose of these Guidelines is to serve as a common
thread that ties all these policies together under the topics provided by OMB: objectivity,
integrity and utility. EPA's approach to ensuring and maximizing quality is necessarily
distributed across all levels of EPA's organizational hierarchy, including Offices, Regions,
divisions, projects, and even products. Oftentimes, there are different quality considerations for
different types of products. For example, the quality principles associated with a risk assessment
10 Risk Characterization Handbook, U.S. EPA, Science Policy Council, December 2000.
http://www.epa.gov/osp/socl2riskchr.htm
differ from those associated with developing a new model. The Agency currently has a
comprehensive but distributed system of policies to address such unique quality considerations.
These Guidelines provide us with a mechanism to help coordinate and synthesize our quality
policies and procedures.
As suggested above, we will continue to work to ensure that our many policies and procedures
are appropriately implemented, synthesized, and revised as needed. One way to build on
achievements and learn from mistakes is to document lessons learned about specific activities or
products. For example, the documents that present guidance and tools for implementing the
Quality System are routinely subjected to external peer review during their development;
comments from the reviewers are addressed and responses reviewed by management before the
document is issued. Each document is formally reviewed every five years and is either reissued,
revised as needed, or rescinded. If important new information or approaches evolve between
reviews, the document may be reviewed and revised more frequently.
In response to OMB's guidelines, EPA recognizes that it will be incorporating new policies and
administrative mechanisms. As we reaffirm our commitment to our existing policies and
procedures that ensure and maximize quality, we also plan to address the following new areas of
focus and commitment:
• Working with the public to develop assessment factors that we will use to assess
the quality of information developed by external parties, prior to EPA's use of
that information.
Consistent with the OMB guidelines, EPA is issuing these Guidelines to ensure and maximize
the quality, including objectivity, utility and integrity, of disseminated information. Objectivity,
integrity, and utility are defined here, consistent with the OMB guidelines. "Objectivity" focuses
on whether the disseminated information is being presented in an accurate, clear, complete, and
unbiased manner, and as a matter of substance, is accurate, reliable, and unbiased. "Integrity"
refers to security, such as the protection of information from unauthorized access or revision, to
ensure that the information is not compromised through corruption or falsification. "Utility"
refers to the usefulness of the information to the intended users.
The collection, use, and dissemination of information of known and appropriate quality is
integral to ensuring that EPA achieves its mission. Information about the environment and
human health underlies all environmental management decisions. Information and the analytical
tools to understand it are essential for assessing environmental and human health risks, designing
appropriate and cost-effective policies and response strategies, and measuring environmental
improvements.
These Guidelines describe EPA's policy and procedures for reviewing and substantiating the
quality of information before EPA disseminates it. They describe our administrative mechanisms
for enabling affected persons to seek and obtain, where appropriate, correction of information
disseminated by EPA that they believe does not comply with EPA or OMB guidelines.
These Guidelines apply to "information" EPA disseminates to the public. "Information," for
purposes of these Guidelines, generally includes any communication or representation of
knowledge such as facts or data, in any medium or form. Preliminary information EPA
disseminates to the public is also considered "information" for the purposes of the Guidelines.
Information generally includes material that EPA disseminates from a web page. However, not
all web content is considered "information" under these Guidelines (e.g., certain information
from outside sources that is not adopted, endorsed, or used by EPA to support an Agency
decision or position).
For purposes of these Guidelines, EPA disseminates information to the public when EPA
initiates or sponsors the distribution of information to the public.
EPA intends to use notices to explain the status of information, so that users will be aware of
whether the information is being distributed to support or represent EPA's viewpoint.
If an item is not considered "information," these Guidelines do not apply. Examples of items that
are not considered information include Internet hyperlinks and other references to information
distributed by others, and opinions, where EPA's presentation makes it clear that what is being
offered is someone's opinion rather than fact or EPA's views.
"Dissemination" for the purposes of these Guidelines does not include distributions of
information that EPA does not initiate or sponsor. Below is a sample of various types of
information that would not generally be considered disseminated by EPA to the public:
• EPA's response to requests for agency records under the Freedom of Information
Act (FOIA), the Privacy Act, the Federal Advisory Committee Act (FACA), or
other similar laws.
• Press releases, fact sheets, press conferences, or similar communications that
announce or give public notice of information EPA has disseminated
elsewhere; interviews, speeches, and similar communications that EPA does not
disseminate to the public beyond their original context, such as by placing them
on the Internet. If a speech, press release, or other "ephemeral" communication is
about an information product disseminated elsewhere by EPA, the product itself
will be covered by these Guidelines.
5.5 What Happens if Information is Initially Not Covered by these Guidelines, but EPA
Subsequently Disseminates it to the Public?
If a particular distribution of information is not covered by these Guidelines, the Guidelines may
still apply to a subsequent dissemination of the information in which EPA adopts, endorses, or
uses the information to formulate or support a regulation, guidance, or other Agency decision or
position. For example, if EPA simply makes a public filing (such as facility data required by
regulation) available to the public, these Guidelines would not apply to that distribution of
information. However, if EPA later includes the information in a background document in
support of a rulemaking, these Guidelines would apply to that later dissemination of the
information in that document.
5.6 How does EPA Ensure the Objectivity, Utility, and Integrity of Information that is
not covered by these Guidelines?
These Guidelines apply only to information EPA disseminates to the public, outlined in section
5.3, above. Other information distributed by EPA that is not covered by these Guidelines is still
subject to all applicable EPA policies, quality review processes, and correction procedures.
These include quality management plans for programs that collect, manage, and use
environmental information, peer review, and other procedures that are specific to individual
programs and, therefore, not described in these Guidelines. It is EPA's policy that all of the
information it distributes meets a basic standard of information quality, and that its utility,
objectivity, and integrity be scaled and appropriate to the nature and timeliness of the planned
and anticipated uses. Ensuring the quality of EPA information is not necessarily dependent on
any plans to disseminate the information. EPA continues to produce, collect, and use information
that is of the appropriate quality, irrespective of these Guidelines or the prospects for
dissemination of the information.
6.1 How does EPA Ensure and Maximize the Quality of Disseminated Information?
EPA ensures and maximizes the quality of the information we disseminate by implementing
well-established policies and procedures within the Agency as appropriate to the information
product. There are many tools that the Agency uses such as the Quality System,11 review by senior
management, peer review process,12 communications product review process,13 the web guide,14
and the error correction process.15 Beyond our internal quality management system, EPA also
ensures the quality of information we disseminate by seeking input from experts and the general
public. EPA consults with groups such as the Science Advisory Board and the Science Advisory
Panel, in addition to seeking public input through public comment periods and by hosting public
meetings.
For the purposes of the Guidelines, EPA recognizes that if data and analytic results are subjected
to formal, independent, external peer review, the information may generally be presumed to be
of acceptable objectivity. However, this presumption of objectivity is rebuttable. The Agency
uses a graded approach and uses these tools to establish the appropriate quality, objectivity,
utility, and integrity of information products based on the intended use of the information and
the resources available. As part of this graded approach, EPA recognizes that some of the
information it disseminates includes influential scientific, financial, or statistical information,
and that this category should meet a higher standard of quality.
6.2 How Does EPA Define Influential Information for these Guidelines?
11 EPA Quality Manual for Environmental Programs 5360 A1, May 2000.
http://www.epa.gov/quality/qs-docs/5360.pdf
12 Peer Review Handbook, 2nd Edition, U.S. EPA, Science Policy Council, December 2000, EPA
100-B-00-001. http://www.epa.gov/osp/spc/prhandbk.pdf
13 EPA's Print and Web Communications Product Review Guide. http://www.epa.gov/dced/pdf/review.pdf
16 The term "clear and substantial impact" is used as part of a definition to distinguish different categories of
information for purposes of these Guidelines. EPA does not intend the classification of information under this
definition to change or impact the status of the information in any other setting, such as for purposes of determining
whether the dissemination of the information is a final Agency action.
For purposes of these Information Quality Guidelines, EPA will generally consider the following classes of
information to be influential, and, to the extent that they contain scientific, financial, or statistical
information, that information should adhere to a rigorous standard of quality:
• Major work products undergoing peer review as called for under the Agency's
Peer Review Policy. Described in the Science Policy Council Peer Review
Handbook, the EPA Peer Review Policy regards major scientific and technical
work products as those that have a major impact, involve precedential, novel,
and/or controversial issues, or the Agency has a legal and/or statutory obligation
to conduct a peer review. These major work products are typically subjected to
external peer review. Some products that may not be considered "major" under
the EPA Peer Review Policy may be subjected to external peer review but EPA
does not consider such products influential for purposes of these Guidelines.
6.3 How Does EPA Ensure and Maximize the Quality of "Influential" Information?
EPA recognizes that influential scientific, financial, or statistical information should be subject
to a higher degree of quality (for example, transparency about data and methods) than
information that may not have a clear and substantial impact on important public policies or
private sector decisions. A higher degree of transparency about data and methods will facilitate
the reproducibility of such information by qualified third parties.
Several Agency-wide and Program- and Region-specific policies and processes that EPA uses to
ensure and maximize the quality of environmental data, including disseminated information
products, would also apply to information considered "influential" under these Guidelines.
Agency-wide processes of particular importance to ensure the quality, objectivity, and
transparency of "influential" information include the Agency's Quality System, Action
Development Process, Peer Review Policy, and related procedures. Many "influential"
information products may be subject to more than one of these processes.
6.4 How Does EPA Ensure and Maximize the Quality of "Influential" Scientific Risk
Assessment Information?
EPA conducts and disseminates a variety of risk assessments. When evaluating environmental
problems or establishing standards, EPA must comply with statutory requirements and mandates
set by Congress based on media (air, water, solid, and hazardous waste) or other environmental
interests (pesticides and chemicals). Consistent with EPA's current practices, application of these
principles involves a "weight-of-evidence" approach that considers all relevant information and
its quality, consistent with the level of effort and complexity of detail appropriate to a particular
risk assessment. In our dissemination of influential scientific information regarding human
health, safety,17 or environmental18 risk assessments, EPA will ensure, to the extent practicable
17 "Safety risk assessment" describes a variety of analyses, investigations, or case studies conducted by EPA
to respond to environmental emergencies. For example, we work to ensure that the chemical industry and state and
local entities take action to prevent, plan and prepare for, and respond to chemical emergencies through the
development and sharing of information, tools, and guidance for hazards analyses and risk assessment.
18 Because the assessment of "environmental risk" is being distinguished from "human health risk," the term
"environmental risk" as used in these Guidelines does not directly involve human health concerns. In other words, an
"environmental risk assessment" is in this case the equivalent to what EPA commonly calls an "ecological risk
assessment."
and consistent with Agency statutes and existing legislative regulations, the objectivity19 of such
information disseminated by the Agency by applying the following adaptation of the quality
principles found in the Safe Drinking Water Act20 (SDWA) Amendments of 1996:21
(A) The substance of the information is accurate, reliable and unbiased. This involves the use
of:
(i) the best available science and supporting studies conducted in accordance with
sound and objective scientific practices, including, when available, peer reviewed
science and supporting studies; and
(ii) data collected by accepted methods or best available methods (if the reliability of
the method and the nature of the decision justifies the use of the data).
(B) The presentation of information on human health or ecological risk is comprehensive,
informative, and understandable. This involves a specification of:
(i) each population addressed by any estimate of applicable human health risk or
each risk assessment endpoint, including populations if applicable, addressed by
any estimate of applicable ecological risk;22
(ii) the expected risk or central estimate of human health risk for the specific
assessment.
19 OMB stated in its guidelines that in disseminating information agencies shall develop a process for
reviewing the quality of the information. "Quality" includes objectivity, utility, and integrity. "Objectivity" involves
two distinct elements, presentation and substance. Guidelines for Ensuring and Maximizing the Quality, Objectivity,
Utility, and Integrity of Information Disseminated by Federal Agencies, OMB, 2002. (67 FR 8452)
http://www.whitehouse.gov/omb/fedreg/reproducible2.pdf
20 Safe Drinking Water Act Amendments of 1996, 42 U.S.C. 300g-1(b)(3)(A) & (B)
21 The exception is risk assessments conducted under SDWA which will adhere to the SDWA principles as
amended in 1996.
22 Agency assessments of human health risks necessarily focus on populations. Agency assessments of
ecological risks address a variety of entities, some of which can be described as populations and others (such as
ecosystems) which cannot. The phrase "assessment endpoint" is intended to reflect the broader range of interests
inherent in ecological risk assessments. As discussed in the EPA Guidelines for Ecological Risk Assessment (found
at http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=12460), assessment endpoints are explicit expressions of the
actual environmental value that is to be protected, operationally defined by an ecological entity and its attributes.
Furthermore, those Guidelines explain that an ecological entity can be a species (e.g., eelgrass, piping plover), a
community (e.g., benthic invertebrates), an ecosystem (e.g., wetland), or other entity of concern. An attribute of an
assessment endpoint is the characteristic about the entity of concern that is important to protect and potentially at
risk. Examples of attributes include abundance (of a population), species richness (of a community), or function (of
an ecosystem). Assessment endpoints and ecological risk assessments are discussed more fully in those Guidelines
as well as other EPA sources such as Ecological Risk Assessment Guidance for Superfund: Process for Designing
and Conducting Ecological Risk Assessments - Interim Final found at
http://www.epa.gov/oerrpage/superfund/programs/risk/ecorisk/ecorisk.htm
In applying these principles, "best available" usually refers to the availability at the time an
assessment is made. However, EPA also recognizes that scientific knowledge about risk is
rapidly changing and that risk information may need to be updated over time. When deciding
which influential risk assessment should be updated and when to update it, the Agency will take
into account its statutes and the extent to which the updated risk assessment will have a clear and
substantial impact on important public policies or private sector decisions. In some situations,
the Agency may need to weigh the resources needed and the potential delay associated with
incorporating additional information in comparison to the value of the new information in terms
of its potential to improve the substance and presentation of the assessment.
Adaptation clarifications
In order to provide more clarity on how EPA adapted the SDWA principles in this guidance in
light of our numerous statutes, regulations, guidance and policies that address how to conduct a
risk assessment and characterize risk, we discuss four adaptations EPA has made to the SDWA
quality principles language.
EPA adapted the SDWA principles by adding the phrase "consistent with Agency statutes and
existing legislative regulations, the objectivity of such information disseminated by the Agency"
in the introductory paragraph, therefore applying to both paragraphs (A) and (B). This was done
to explain EPA's intent regarding these quality principles and their implementation consistent
with our statutes and existing legislative regulations. Also, as noted earlier, EPA intends to
implement these quality principles in conjunction with our guidelines and policies. The
procedures set forth in other EPA guidelines set out in more detail EPA's policies for conducting
risk assessments, including Agency-wide guidance on various types of risk assessments and
program-specific guidance. EPA recognizes that the wide array of programs within EPA have
resulted not only in Agency-wide guidance, but in specific protocols that reflect the
requirements, including limitations, that are mandated by the various statutes administered by
the Agency. For example, the Agency developed several pesticide science policy papers that
explained to the public in detail how EPA would implement specific statutory requirements in
the Food Quality Protection Act (FQPA) that addressed how we perform risk assessments. We
also recognize that emerging issues such as endocrine disruption, bioengineered organisms, and
genomics may involve some modifications to the existing paradigm for assessing human health
and ecological risks. This does not mean a radical departure from existing guidance or the
SDWA principles, but rather indicates that flexibility may be warranted as new information and
approaches develop.
EPA introduced the following two adaptations in order to accommodate the range of real-world
situations that we confront in the implementation of our diverse programs. EPA adapted the
SDWA quality principles by moving the phrase "to the extent practicable" from paragraph (B) to
the introductory paragraph in this Guidelines section to cover both parts (A) and (B) of the
SDWA adaptation.24 The phrase refers to situations under (A) where EPA may be called upon to
conduct "influential" scientific risk assessments based on limited information or in novel
situations, and under (B) in recognition that all such "presentation" information may not be
available in every instance. The level of effort and complexity of a risk assessment should also
balance the information needs for decision making with the effort needed to develop such
information. For example, under the Federal Insecticide, Fungicide and Rodenticide Act25
(FIFRA) and the Toxic Substances Control Act26 (TSCA), regulated entities are obligated to
provide information to EPA concerning incidents/test data that may reveal a problem with a
pesticide or chemical. We also receive such information voluntarily from other sources. EPA
carefully reviews incident reports and factors them as appropriate into risk assessments and
decision-making, even though these may not be considered information collected by acceptable
methods or best available method as stated in A(ii). Incident information played an important
role in the Agency's conclusion that use of chlordane/heptachlor termiticides could result in
exposures to persons living in treated homes, and that the registrations needed to be modified
accordingly. Similarly, incident reports concerning bird kills and fish kills were important
components of the risk assessments for the reregistration of the pesticides phorate and terbufos,
respectively. In addition, this adaptation recognizes that while many of the studies incorporated
into risk assessments have been peer reviewed, data from other sources may not be peer
reviewed. EPA takes many actions based on studies and supporting data provided by outside
sources, including confidential or proprietary information that has not been peer reviewed. For
example, industry can be required by regulation to submit data for pesticides under FIFRA or for
chemicals under TSCA. The data are developed using test guidelines and Good Laboratory
Practices (GLPs) in accordance with EPA regulations. While there is not a requirement to have
studies peer reviewed, such studies are reviewed by Agency scientists to ensure that they were
conducted according to the appropriate test guidelines and GLPs and that the data are valid.
The flexibility provided by applying "to the extent practicable" to paragraph (A) is appropriate
in many circumstances to conserve Agency resources and those of the regulated community who
otherwise might have to generate significant additional data. This flexibility is already provided
24 The discussion in this and following paragraphs gives some examples of the types of assessments that
may under some circumstances be considered influential. These examples are representative of assessments
performed under other EPA programs, such as CERCLA.
25 7 U.S.C. 136 et seq.
26 15 U.S.C. 2601 et seq.
for paragraph (B) in the SDWA quality principles. Pesticide and chemical risk assessments are
frequently performed iteratively, with the first iteration employing protective (conservative)
assumptions to identify possible risks. Only if potential risks are identified in a screening-level
assessment is it necessary to pursue a more refined, data-intensive risk assessment. This is
exhibited, for example, in guidance developed for use in CERCLA and RCRA on tiered
approaches. In other cases, reliance on "structure activity relationship" or "bridging data" allows
the Agency to rely on data from similar chemicals rather than require the generation of new,
chemical-specific data. While such assessments may or may not be considered influential under
the Guidelines, this adaptation of the SDWA principles reflects EPA's reliance on less-refined
risk assessments where further refinement could significantly increase the cost of the risk
assessment without significantly enhancing the assessment or changing the regulatory outcome.
In emergency and other time critical circumstances, risk assessments may have to rely on
information at hand or that can be made readily available rather than data such as described in
(A). One such scenario is risk assessments addressing Emergency Exemption requests submitted
under Section 18 of FIFRA,27 which, because of the emergency nature of the request, must be
completed within a short time frame. As an example, EPA granted an emergency exemption
under Section 18 to allow use of an unregistered pesticide to decontaminate anthrax in a Senate
office building. The scientific review and risk assessment to support this action were necessarily
constrained by the urgency of the action. Other time-sensitive actions include the reviews of new
chemicals under TSCA. Under Section 5 of TSCA,28 EPA must review a large number of
pre-manufacture notifications (more than 1,000) every year, not all of which necessarily include
"influential" risk assessments, and each review must be completed within a short time frame
(generally 90 days). The nature of the reviews and risk assessment associated with these
pre-manufacture notifications are affected by the limited time available and the large volume of
notifications submitted.
The flexibility provided by applying "to the extent practicable" to paragraph (A) is appropriate
to account for safety risk assessment practices. This flexibility is already provided for paragraph
(B) in the SDWA quality principles. We applied the same SDWA adaptation for use with human
health risk assessments to safety risk assessments with the needed flexibility to apply the
principles to the extent practicable. "Safety risk assessments" include a variety of analyses,
investigations, or case studies conducted by EPA concerning safety issues. EPA works to ensure
that the chemical industry and state and local entities take action to prevent, plan and prepare for,
and respond to environmental emergencies and site specific response actions through the
development and sharing of information, tools and guidance for hazard analyses and risk
assessment. For example, although the chemical industry shoulders most of the responsibility for
safety risk assessment and management, EPA may also conduct chemical hazard analyses,
investigate the root causes and mechanisms associated with accidental chemical releases, and
assess the probability and consequences of accidental releases in support of agency risk
assessments. Although safety risk assessments can be different from traditional human health
risk assessments because they may combine a variety of available information and may use
expert judgement based on that information, these assessments provide useful information that is
sufficient for the intended purpose.
Next, EPA adapted the SDWA quality principles by adding the clause "including, when
available, peer reviewed science and supporting studies" to paragraph (A)(i). It now reads: "the
best available science and supporting studies conducted in accordance with sound and objective
scientific practices, including, when available, peer reviewed science and supporting studies." In
the Agency's development of "influential" scientific risk assessments, we intend to use all
relevant information, including peer reviewed studies, studies that have not been peer reviewed,
and incident information; evaluate that information based on sound scientific practices as
described in our risk assessment guidelines and policies; and reach a position based on careful
consideration of all such information (i.e., a process typically referred to as the
"weight-of-evidence" approach29). In this approach, a well-developed, peer-reviewed study
would generally be accorded greater weight than information from a less well-developed study
that had not been peer-reviewed, but both studies would be considered. Thus the Agency uses a
"weight-of-evidence" process when evaluating peer-reviewed studies along with all other information.
Oftentimes under various EPA-managed programs, EPA receives information that has not been
peer-reviewed and we have to make decisions based on the information available. While many
of the studies incorporated in risk assessments have been peer reviewed, data from other sources,
such as studies submitted to the Agency for pesticides under FIFRA30 and for chemicals under
TSCA, may not always be peer reviewed. Rather, such data, developed under approved
guidelines and the application of Good Laboratory Practices (GLPs), are routinely used in the
development of risk assessments. Risk assessments may also include more limited data sets such
as monitoring data used to support the exposure element of a risk assessment. In cases where
these data may not themselves have been peer reviewed, their quality and appropriate use would
be addressed as part of the peer review of the overall risk assessment as called for under the
Agency's peer review guidelines.
Lastly, EPA adapted the SDWA principles for influential environmental ("ecological") risk
assessments that are disseminated in order to use terms that are most suited for such risk
assessments. Specifically, EPA assessments of ecological risks address a variety of entities,
some of which can be described as populations and others (such as ecosystems) which cannot.
Therefore, a specific modification was made to include "assessment endpoints, including
populations if applicable" in place of the term "population" for ecological risk assessments, and
EPA added a footnote directing the reader to various EPA risk policies that discuss these
concepts in greater detail.
6.5 Does EPA Ensure and Maximize the Quality of Information from External Sources?
Ensuring and maximizing the quality of information from States, other governments, and third
parties is a complex undertaking, involving thoughtful collaboration with States, Tribes, the
scientific and technical community, and other external information providers. EPA will continue
to take steps to ensure that the quality and transparency of information provided by external
sources are sufficient for the intended use. For instance, since 1998, the use of environmental
data collected by others or for other purposes, including literature, industry surveys,
compilations from computerized data bases and information systems, and results from
computerized or mathematical models of environmental processes and conditions has been
within the scope of the Agency's Quality System.31
For information that is either voluntarily submitted to EPA in hopes of influencing a decision or
that EPA obtains for use in developing a policy, regulatory, or other decision, EPA will continue
to work with States and other governments, the scientific and technical community, and other
interested information providers to develop and publish factors that EPA would use to assess the
quality of this type of information.
For all proposed collections of information that will be disseminated to the public, EPA intends
to demonstrate in our Paperwork Reduction Act32 clearance submissions that the proposed
collection of information will result in information that will be collected, maintained and used in
ways consistent with the OMB guidelines and these EPA Guidelines. These Guidelines apply to
all information EPA disseminates to the public; accordingly, if EPA later identifies a new use for
the information that was collected, such use would not be precluded and the Guidelines would
apply to the dissemination of the information to the public.
31 EPA Quality Manual for Environmental Programs 5360 A1, May 2000, Section 1.3.1.
http://www.epa.gov/quality/qs-docs/5360.pdf
32 44 U.S.C. 3501 et seq.
Each EPA Program Office and Region will incorporate the information quality principles
outlined in section 6 of these Guidelines into their existing pre-dissemination review procedures
as appropriate. Offices and Regions may develop unique and new procedures, as needed, to
provide additional assurance that the information disseminated by or on behalf of their
organizations is consistent with these Guidelines. EPA intends to facilitate implementation of
consistent cross-Agency pre-dissemination reviews by establishing a model of minimum review
standards based on existing policies. Such a model for pre-dissemination review would still
provide that responsibility for the reviews remains in the appropriate EPA Office or Region.
For the purposes of the Guidelines, EPA recognizes that pre-dissemination review procedures
may include peer reviews and quality reviews that may occur at many steps in development of
information, not only at the point immediately prior to the dissemination of the information.
8.1 What are EPA's Administrative Mechanisms for Affected Persons to Seek and
Obtain Correction of Information?
EPA's Office of Environmental Information (OEI) manages the administrative mechanisms that
enable affected persons to seek and obtain, where appropriate, correction of information
disseminated by the Agency that does not comply with EPA or OMB Information Quality
Guidelines. Working with the Program Offices, Regions, laboratories, and field offices, OEI will
receive complaints (or copies) and distribute them to the appropriate EPA information owners.
"Information owners" are the responsible persons designated by management in the applicable
EPA Program Office, or those who have responsibility for the quality, objectivity, utility, and
integrity of the information product or data disseminated by EPA. If a person believes that
information disseminated by EPA may not comply with the Guidelines, we encourage the person
to consult informally with the contact person listed in the information product before submitting
a request for correction of information. An informal contact can result in a quick and efficient
resolution of questions about information quality.
Persons requesting a correction of information should include the following information in their
Request for Correction (RFC):
• Name and contact information for the individual or organization submitting a
complaint; identification of an individual to serve as a contact.
• A description of the information the person believes does not comply with EPA
or OMB guidelines, including specific citations to the information and to the EPA
or OMB guidelines, if applicable.
• An explanation of how the information does not comply with EPA or OMB
guidelines and a recommendation of corrective action. EPA considers that the
complainant has the burden of demonstrating that the information does not
comply with EPA or OMB guidelines and that a particular corrective action
would be appropriate.
• An explanation of how the alleged error affects or how a correction would benefit
the requestor.
• An affected person may submit an RFC via any one of the methods listed here:
• Internet at http://www.epa.gov/oei/qualityguidelines
• E-mail at quality.guidelines@epa.gov
• Fax at (202) 566-0255
8.3 When Does EPA Intend to Consider a Request for Correction of Information?
EPA seeks public and stakeholder input on a wide variety of issues, including the identification
and resolution of discrepancies in EPA data and information. EPA may decline to review an
RFC under these Guidelines and consider it for correction if:
• The request does not address information disseminated to the public covered by
these Guidelines (see section 5.3 or OMB's guidelines). In many cases, EPA
provides other correction processes for information not covered by these
Guidelines.
• The request omits one or more of the elements recommended in section 8.2 and
there is insufficient information for EPA to provide a satisfactory response.
• The request itself is "frivolous," including requests made in bad faith, made without
justification, or trivial, and for which a response would be duplicative. More
information on this subject may be found in the OMB guidelines.
8.4 How Does EPA Intend to Respond to a Request for Correction of Information?
the error. For requests involving information from outside sources, considerations
may include coordinating with the source and other practical limitations on EPA's
ability to take corrective action.
• For approved requests, EPA assigns a steward for the correction who marks the
information as designated for corrections as appropriate, establishes a schedule
for correction, and reports correction resolution to both the tracking system and to
the requestor.
OEI will provide reports on behalf of EPA to OMB on an annual basis beginning January 1,
2004, regarding the number, nature, and resolution of complaints received by EPA.
8.5 How Does EPA Expect to Process Requests for Correction of Information on Which
EPA has Sought Public Comment?
When EPA provides opportunities for public participation by seeking comments on information,
the public comment process should address concerns about EPA's information. For example,
when EPA issues a notice of proposed rulemaking supported by studies and other information
described in the proposal or included in the rulemaking docket, it disseminates this information
within the meaning of the Guidelines. The public may then raise issues in comments regarding
the information. If a group or an individual raises a question regarding information supporting a
proposed rule, EPA generally expects to treat it procedurally like a comment to the rulemaking,
addressing it in the response to comments rather than through a separate response mechanism.
This approach would also generally apply to other processes involving a structured opportunity
for public comment on a draft or proposed document before a final document is issued, such as a
draft report, risk assessment, or guidance document. EPA believes that the thorough
consideration provided by the public comment process serves the purposes of the Guidelines,
provides an opportunity for correction of any information that does not comply with the
Guidelines, and does not duplicate or interfere with the orderly conduct of the action. In cases
where the Agency disseminates a study, analysis, or other information prior to the final Agency
action or information product, it is EPA policy to consider requests for correction prior to the
final Agency action or information product in those cases where the Agency has determined that
an earlier response would not unduly delay issuance of the Agency action or information product
and the complainant has shown a reasonable likelihood of suffering actual harm from the
Agency's dissemination if the Agency does not resolve the complaint prior to the final Agency
action or information product. EPA does not expect this to be the norm in rulemakings that it
conducts, and thus will usually address information quality issues in connection with the final
Agency action or information product.
EPA generally would not consider a complaint that could have been submitted as a timely
comment in the rulemaking or other action but was submitted after the comment period. If EPA
cannot respond to a complaint in the response to comments for the action (for example, because
the complaint is submitted too late to be considered and could not have been timely submitted, or
because the complaint is not germane to the action), EPA will consider whether a separate
response to the complaint is appropriate.
8.6 What Should be Included in a Request Asking EPA to Reconsider its Decision on a
Request for the Correction of Information?
If requesters are dissatisfied with an EPA decision, they may file a Request for Reconsideration
(RFR). The RFR should contain the following information:
• An explanation of why the person disagrees with the EPA decision and a specific
recommendation for corrective action.
• An affected person may submit a Request for Reconsideration (RFR) via any one
of the methods listed here:
• Internet at http://www.epa.gov/oei/qualityguidelines
• E-mail at quality.guidelines@epa.gov
• Fax at (202) 566-0255
• Mail to Information Quality Guidelines Staff, Mail Code 2822IT, U.S.
EPA, 1200 Pennsylvania Ave., N.W., Washington, DC, 20460
• By courier or in person to Information Quality Guidelines Staff, OEI
Docket Center, Room B128, EPA West Building, 1301 Constitution
Ave., N.W., Washington, DC
EPA recommends that requesters submit their RFR within 90 days of the EPA decision. If the
RFR is sent after that time, EPA recommends that the requester include an explanation of why
the request should be considered at this time.
8.7 How Does EPA Intend to Process Requests for Reconsideration of EPA Decisions?
• OEI sends the RFR to the appropriate EPA Program Office or Region that has
responsibility for the information in question.
• For approved requests, EPA assigns a steward for the correction who marks the
information as designated for corrections as appropriate, establishes a schedule
for correction, and reports correction resolution to both the tracking system and to
the requestor.
Appendix A
A.I Introduction
EPA's Guidelines are a living document and may be revised as we learn more about how best to
address, ensure, and maximize information quality. In the process of developing these
Guidelines, we actively solicited public input at many stages. While the public was free to
comment on any aspect of the Guidelines, EPA explicitly requested input on key topics such as
influential information, reproducibility, influential risk assessment, information sources, and
error correction.
• An online Public Comment Session was held March 19-22, 2002, as the first draft of the
Guidelines was being developed. EPA received approximately 100 comments.
• A Public Meeting was held on May 15, 2002, after the draft Guidelines were issued.
There were 99 participants, 13 of whom made presentations or commented on one or
more issues.
• A 52-day Public Comment period lasted from May 1 to June 21, 2002, during which
comments could be mailed, faxed, or e-mailed to EPA. EPA received 55 comments during this
period.
• A conference call between EPA and Tribal representatives was held on June 27, 2002.
More detailed information on the public comments is available through an OEI web site, serving
as the home page for the EPA Information Quality Guidelines through the development and
implementation process. Please visit this site at http://www.epa.gov/oei/qualityguidelines.
We have established a public docket for the EPA Information Quality Guidelines under Docket
ID No. OEI-10014. The docket is the collection of materials available for public viewing at the
Information Quality Guidelines Staff, OEI Docket Center, Room B128, EPA West Building,
1301 Constitution Ave., N.W., Washington, DC, phone number 202-566-0284. This docket
consists of a copy of the Guidelines, public comments received, and other information related to
the Guidelines. The docket is open from 12:00 PM to 4:00 PM, Monday through Friday,
excluding legal holidays. An index of docket contents will be available at
http://www.epa.gov/oei/qualityguidelines.
Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by EPA
During the various public comment opportunities, EPA received input from a diverse set of
organizations and private citizens. Comments came from many of EPA's stakeholders - the
regulated community and many interest groups who we hear from frequently during the
management of EPA's Programs to protect the nation's land, air, water, and public health.
Government agencies at the Federal, State, Tribal, and local level also commented on the
Guidelines. OMB sent comments to every Federal agency and EPA received comments from two
members of Congress. Beyond our government colleagues, the private sector voiced many
concerns and helpful recommendations for these Guidelines. We would like to take this
opportunity to thank all commenters for providing their input on these Guidelines. Due to the
tight time frame for this project, this discussion of public comments generally describes the
major categories of comments and highlights some significant comments, but does not contain
an individual response to each public comment.
Comments received by EPA during the public comment period reflect a diversity of views
regarding EPA's approach to developing draft Guidelines as well as the general concept of
information quality. Some commenters included detailed review of all Guidelines sections, while
others chose to address only specific topics. In some cases, commenters provided examples to
demonstrate how current EPA procedures may not ensure adequate information quality for a
specific application. Commenters provided general observations such as stating that these
Guidelines did not sufficiently address EPA's information quality problems. Some commenters
offered that the Guidelines relied too much on existing policies. Interpretations of the intent of
the Data Quality Act were offered by some commenters. One comment noted that improvement
of data quality is not necessarily an end in and of itself. Another comment was that the goal of
Guidelines should be more to improve quality, not end uncertainty. Public interest and
environmental groups voiced concern over what they believed was an attempt by various groups
to undermine EPA's ability to act in a timely fashion to protect the environment and public
health. Some commenters stated that the directives of the Data Quality Act and OMB cannot
override EPA's mission to protect human health and the environment per the statutory mandates
under which it operates.
EPA was congratulated for the effort and, in some cases, encouraged to go even further in
addressing information quality. Some commenters encouraged EPA to provide additional
process details, provide more detailed definitions, augment existing policies that promote
transparency, and share more information about the limitations of EPA-disseminated
information. In one case, EPA was encouraged to develop a rating scheme for its disseminated
information.
This section discusses public comments and our responses to many of the important questions
and issues raised in the comments. First, we provide responses to some overarching comments
we received from many commenters, then we provide a discussion of public comments that were
received on specific topics addressed in the draft Guidelines.
• Tone: Commenters criticized the "defensive tone", "legalistic tone", and the lack
of detail afforded in the Guidelines. Some commenters said that it was not clear
what the Guidelines were explaining, or how they might apply to various types of
information. We understand and agree with many of these criticisms and have
made attempts to better communicate the purpose, applicability, and content of
these Guidelines.
Many commenters told us that we rely excessively on existing EPA information quality policies.
Commenters provided specific examples of areas they believed were demonstrative of our lack
of commitment to or uneven implementation of our existing policies. Some commenters also
pointed out that there are key areas in which we lack policies to address quality and, as a result,
the Guidelines should address such issues in more detail. Some commenters also noted that EPA
itself has highlighted lessons learned with existing approaches to information product
development.
The concept of peer review is considered in three Guidelines sections: (1) application of the
Agency's Peer Review Policy language for "major scientific and technical work products and
economic analysis used in decision making" as a class of information that can be considered
"influential" for purposes of the Guidelines; (2) use of "peer-reviewed science" as a component
of some risk assessments; and (3) use of the Agency's Peer Review Policy as one of the
Agency-wide processes to ensure the quality, objectivity, and transparency of "influential"
scientific, financial, and statistical information under the Guidelines.
We received a number of comments on section 1.1 (What is the Purpose of these Guidelines?) of
the draft Guidelines. Some commenters argued that the Guidelines should be binding on EPA,
that they are legislative rules rather than guidance, or that the Guidelines must be followed
unless we make a specific determination to the contrary. Others argued that the Guidelines
should not be binding or that we should include an explicit statement that the Guidelines do not
alter substantive agency mandates. Some suggested that our statements retaining discretion to
differ from the Guidelines sent a signal that EPA was not serious about information quality.
With respect to the nature of these Guidelines, Section 515 specifies that agencies are to issue
"guidelines." As directed by OMB's guidelines, we have issued our own guidelines containing
nonbinding policy and procedural guidance. We see no indication in either the language or
general structure of Section 515 that Congress intended EPA's guidelines to be binding rules.
We revised this section (now section 1 in this revised draft) by adding a fuller explanation of
how EPA intends to ensure the quality of information it disseminates. This section includes
language explaining the nature of our Guidelines as policy and procedural guidance. This
language is intended to give clear notice of the nonbinding legal effect of the Guidelines. It
33 http://epa.gov/osp/spc/perevmem.htm
notifies EPA staff and the public that the document is guidance rather than a substantive rule and
explains how such guidance should be implemented. Although we believe these Guidelines
would not be judicially reviewable, we agree that a statement to this effect is unnecessary and
have deleted it. In response to comments that EPA clarify that the Guidelines do not alter
existing legal requirements, we have made that change. In light of that change, we think it is
clear that decisions in particular cases will be made based on applicable statutes, regulations, and
requirements, and have deleted other text in the paragraph that essentially repeated that point.
Elsewhere in the document, EPA has made revisions to be consistent with its status as guidance.
Some commenters argued that all EPA disseminated information should be covered by the
Guidelines and that we lack authority to "exempt" information from the Guidelines. Others
thought that the coverage in EPA's draft was appropriate. EPA does not view its Guidelines as
establishing a fixed definition and then providing "exemptions." Rather, our Guidelines explain
when a distribution of information generally would or would not be considered disseminated to
the public for purposes of the Guidelines. As we respond to complaints and gain experience in
implementing these Guidelines, we may identify other instances where information is or is not
considered disseminated for the purposes of the Guidelines.
Some commenters cited the Paperwork Reduction Act (PRA), 44 U.S.C. 3501 et seq., to support
their argument that the Guidelines should cover all information EPA makes public. EPA's
Guidelines are issued under Section 515 of the Treasury and General Government
Appropriations Act for Fiscal Year 2001, which directs OMB to issue govemment-wide
guidelines providing policy and procedural guidance to Federal agencies. In turn, the OMB
guidelines provide direction and guidance to Federal agencies in issuing their own guidelines.
EPA's Guidelines are intended to carry out OMB's policy on information quality. One
commenter cited in particular the term "public information" used in the PRA as evidence of
Congress's intent under Section 515. In EPA's view, this does not show that Congress intended a
specific definition for the key terms, "information" and "disseminated," used in Section 515. In
the absence of evidence of Congressional intent regarding the meaning of the terms used in
Section 515, EPA does not believe the PRA requires a change in EPA's Guidelines.
We agree with commenters who noted that even if a particular distribution of information is not
covered by the Guidelines, the Guidelines would still apply to information disseminated in other
ways. As stated in section 1.4, if information is not initially covered by the Guidelines, a
subsequent distribution of that information will be subject to the Guidelines if EPA adopts,
endorses, or uses it.
Some commenters made specific recommendations about what should and should not be covered
by the Guidelines. In addition to the specific recommendations, some suggested that the "scope
and applicability" section was too long, while others thought it had an appropriate level of detail.
Based on other agencies' guidelines and public comments, EPA has removed much of the detail
from the discussion of Guidelines coverage. These revisions were intended to shorten and
simplify the discussion without changing the general scope of the Guidelines.
Appendix 40
Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by EPA
Some commenters thought all information from outside sources should be covered by the
Guidelines, even if EPA does not use, rely on, or endorse it. Others wished to clarify the point at
which the Guidelines cover information from outside sources. As noted above, section 1.4 of the
Guidelines explains how subsequent distributions of information in public filings may become
subject to the Guidelines. We continue to think that EPA's own public filings before other
agencies should not generally be covered by the Guidelines as long as EPA does not
simultaneously disseminate them to the public, since use of this information would be subject to
the requirements and policies of the agency to which the information is submitted.
We received a number of comments, including from OMB, arguing that the provision regarding
information related to adjudicative processes was too broad, and that the Guidelines should
cover some or all information related to adjudicative processes, particularly administrative
adjudications. In addition to shortening this section, we have limited this provision to
information in documents prepared specifically for an administrative adjudication. This would
include decisions, orders, findings, and other documents prepared specifically for the
adjudication. As indicated in the Draft Guidelines, our view is that existing standards and
protections in administrative adjudications would generally be adequate to assure the quality of
information in administrative adjudications and to provide an adequate opportunity to contest
decisions on the quality of information. For example, in permitting proceedings, parties may
submit comments on the quality of information EPA prepares for the permit proceeding, and
judicial review is available based on existing statutes and regulations. Narrowing the provision
to information prepared specifically for the adjudication should make clear that the Guidelines
would not generally provide parties with additional avenues of challenge or appeal during
adjudications, but would still apply to a separate distribution of information where EPA adopts,
endorses, or uses the information, such as when EPA disseminates it on the Internet or in a
rulemaking or guidance document. When we intend to adopt information such as models or risk
assessments for use in a class of cases or determinations (e.g., for use in all determinations under
a particular regulatory provision), EPA often disseminates this information separately and in
many instances requests public comment on it. Accordingly, it is not clear that there would be
many instances where persons who are concerned about information prepared specifically for an
adjudication would not have an opportunity to contest the quality of information.
We received many comments on how the Guidelines apply to external parties, the shared quality
responsibilities between EPA and external parties, and specific EPA responsibilities when using
or relying on information collected or compiled by external parties.
EPA roles: Some commenters emphasized that ensuring quality of information at the point of
dissemination is no substitute for vigorous efforts by EPA to receive quality information in the
first place and therefore for information providers to produce quality information. One
commenter stated that EPA cannot be responsible for all aspects of the quality of the information
we disseminate. In response to this and other comments, we have provided additional language
in these Guidelines on the various roles that EPA assumes in either ensuring the quality of the
information we disseminate or ensuring the integrity of information EPA distributes. One
comment suggested that we mention the role of the National Environmental Information
Exchange Network in ensuring information integrity, which we have done in section 2.4 of the
Guidelines.
Assessment factors: Overall, public input was positive and welcoming of our proposal to
develop assessment factors to evaluate the quality of information generated by third parties. A
few commenters offered their involvement in the development of these factors, their advice on
how to develop such factors, and some examples of what assessment factors we should consider.
EPA staff have provided such comments to the EPA Science Policy Council workgroup that was
charged with developing the assessment factors. EPA welcomes stakeholder input in the
development of these factors and published draft assessment factors for public comment in
September 2002.
Coverage of State Information: Some commenters suggested that our Guidelines must apply to
all information disseminated by EPA, including information submitted to us by States. Whereas
some commenters stressed that the quality of information received by EPA is the responsibility
of the providers, others expressed concern about the potential impact that EPA's Guidelines
could have on States. We believe it is important to differentiate between information that we
generate and data or information generated by external parties, including States. State
information, when submitted to EPA, may not be covered by these Guidelines, but our
subsequent use of the information may in fact be covered. We note, however, that there may be
practical limitations on the type of corrective action that may be taken, since EPA does not
intend to alter information submitted by States. However, EPA does intend to work closely with
our State counterparts to ensure and maximize the quality of information that EPA disseminates.
Furthermore, one commenter stated that if regulatory information is submitted to an authorized
or delegated State program, then the State is the primary custodian of the information and the
Guidelines would not cover that information. We agree with that statement.
We also received comments regarding the use of labels, or disclaimers, to notify the public
whether information is generated by EPA or an external party. We agree that disclaimers and
other notifications should be used to explain the status of information wherever possible, and we
are developing appropriate language and format.
A statement regarding Paperwork Reduction Act clearance submissions has been added in
response to comment by OMB.
Several commenters generally assert that the definition is too narrow. Other commenters
indicated that under EPA's draft definition, only Economically Significant actions, as defined in
Executive Order 12866, or only Economically Significant actions and information disseminated
in support of top Agency actions, are considered "influential." We disagree. To demonstrate the
broad range of activities covered by our adoption of OMB's definition, we reiterate the
definition below and include an example of each type of action, to illustrate the breadth of our
definition. "Influential," when used in the phrase "influential scientific, financial, or statistical
information," means that the Agency can reasonably determine that dissemination of the
information will have or does have a clear and substantial impact on important public policies or
important private sector decisions. We will generally consider the following classes of
information to be influential: information disseminated in support of top Agency actions;
information disseminated in support of "economically significant" actions; major work products
undergoing peer review; and other disseminated information that will have or does have a clear
and substantial impact (i.e., potential change or impact) on important public policies or
important private sector decisions as determined by EPA on a case-by-case basis. In general,
influential information would be the scientific, financial or statistical information that provides a
substantial basis for EPA's position on key issues in top Agency actions and Economically
Significant actions. If the information provides a substantial basis for EPA's position, EPA
believes it would generally have a clear and substantial impact.
Top Agency actions: An example of a top Agency action is the review of the National
Ambient Air Quality Standards (NAAQS) for Particulate Matter. Under the Clean Air
Act, EPA is to periodically review (1) the latest scientific knowledge about the effects on
public health and public welfare (e.g., the environment) associated with the presence of
such pollutants in the ambient air and (2) the standards, which are based on this science.
The Act further directs that the Administrator shall make any revisions to the standards
as may be appropriate, based on the latest science, that in her judgment are requisite to
protect the public health with an adequate margin of safety and to protect the public
welfare from any known or anticipated adverse effects. The standards establish allowable
levels of the pollutant in the ambient air across the United States, and States must
develop implementation plans to attain the standards. The PM NAAQS were last
revised in 1997, and the next periodic review is now being conducted.
Peer reviewed work products: An example of a major work product undergoing peer
review is the IRIS Documentation: Reference Dose for Methylmercury. Methylmercury
contamination is the basis for fish advisories. It is necessary to determine an intake to
humans that is without appreciable risk in order to devise strategies for decreasing
mercury emissions into the environment. After EPA derived a reference dose (RfD) of
0.0001 mg/kg-day in 1995, industry argued that it was not based on sound science.
Congress ordered EPA to fund a National Research Council/National Academy of
Sciences panel to determine whether our RfD was scientifically justifiable. The panel
concluded that 0.0001 mg/kg-day was an appropriate RfD, based on newer studies
than the 1995 RfD. The information in this document was evaluated, incorporated, and
subjected to comment by the Office of Water, where it contributed in large part to
Chapter 4 of Drinking Water Criteria for the Protection of Human Health:
Methylmercury (EPA/823/R-01/001), January 2001. The peer review mechanism was an
external peer review workshop and public comment session held on November 15, 2000,
accompanied by a public comment period from October 30 to November 29, 2000.
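The RfD discussed above is a per-kilogram daily intake ceiling, so it scales with body weight. A quick arithmetic sketch, noting that the 70 kg body weight is our own illustrative assumption and not a figure from this document:

```python
# Illustrative only: converting a reference dose (RfD) into a daily
# intake limit. The 70 kg body weight is an assumed example value.
rfd = 0.0001          # mg/kg-day, the methylmercury RfD discussed above
body_weight = 70.0    # kg, assumed adult body weight (hypothetical)
daily_limit = rfd * body_weight  # mg/day regarded as without appreciable risk
print(f"Allowable intake for a {body_weight:.0f} kg adult: {daily_limit:.4f} mg/day")
```

For a 70 kg adult this works out to 0.007 mg of methylmercury per day; a different assumed body weight would scale the limit proportionally.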
A number of commenters said that EPA created a limited definition of what types of information
are to be considered "influential," and that we have no rational basis to do so. A number of
commenters also stated that "all Agency information should be considered influential"; that "all
data relied upon by the Agency should meet a high standard of quality regardless of the type"; or
that "'influential' information includes information used to support any EPA action, not just
'top' Agency actions." EPA followed OMB's guidelines in establishing a definition for
"influential" information that was not all-encompassing. OMB stated that "the more important the
information, the higher the quality standards to which it should be held, for example, in those
situations involving 'influential scientific, financial or statistical information.'" OMB narrowed
the definition of "influential" in their final guidance as follows:
OMB also amended their definition to say that "each agency is authorized to define "influential"
in ways appropriate for it given the nature and multiplicity of issues for which the agency is
responsible" (67 FR 8455). We adopted OMB's "influential" definition. Once the Agency
reviewed the wide range of information disseminated to the public, such as major rulemakings,
risk assessments, rule related guidance, health advisories, annual reports, fact sheets, and
coloring books, it became apparent that there were reasons to distinguish between "influential"
information and other information. EPA adopted OMB's definition for "influential" and used
types of information the Agency disseminates to further explain what information is included.
Another commenter suggested that EPA should not indicate whether disseminated information is
"influential" when it is first disseminated but should wait to designate information as
"influential" until either an information correction request is made or a final agency action is
taken. We intend to consider this point, as well as other comments made about when
disseminated information becomes influential, as the Agency implements the Guidelines.
One commenter suggests that the definition of the term "influential" should be narrower.
Specifically, the commenter states the following:
EPA agrees with the commenter that there are significant costs associated with ensuring that
information disseminated by the Agency is of high quality. Consequently, EPA chose a
definition of the term "influential" to cover information that, when disseminated, will result in a
clear and substantial impact on important public policies and private sector decisions. We
believe that this definition balances the costs associated with implementing the Guidelines, the
need to ensure high quality information, and the Agency's mission to protect human health and
safeguard the natural environment.
Several commenters indicated that it is inappropriate for EPA to base its definition of
"influential" on categories of actions. They suggest that the definition be based instead on the
content of the information. We consider our definition to be based on information content, given
that those categories of disseminated information we defined as influential are those that EPA
can reasonably determine will or do have a clear and substantial impact on important public
policies or private sector decisions. We note here that, in addition to the specific classes of
disseminated information we have defined as "influential," EPA has reiterated the "case-by-case"
portion of the OMB "influential" definition. This general provision is intended to capture
disseminated information, based on its content, that would not otherwise rise to the level of
"influential" under the other parts of our definition (i.e., top Agency actions, Economically
Significant actions, major peer reviewed products).
Several commenters assert that EPA should categorically state that certain specific types of
disseminated information products are influential, and that we should categorically state that
certain specific types of disseminated information products are not influential. Given the vast
array of information disseminated by the Agency, and given the fact that certain information
may have a clear and substantial impact on important public policies or private sector decisions
at one time, but not have such an impact later on (and vice versa), classifying types of
information as "influential" or otherwise upfront is difficult and could be misleading. We intend
to rely on our definition in determining whether specific types of disseminated information
products are to be considered "influential" for purposes of the Guidelines.
A.3.5 Reproducibility
Some commenters stated that there needs to be more clarity in the definition of "reproducibility"
and related concepts. We have tried to provide definitions that are consistent with OMB
guidelines. Also, our Guidelines now include that EPA intends to ensure reproducibility for
disseminated original and supporting data according to commonly accepted scientific, financial,
or statistical standards. Many commenters thought there should be some kind of method to
consider reproducibility when proprietary models, methods, designs, and data are used in a
dissemination. Some commenters discourage all use of proprietary models; others suggest
proprietary model use be minimized with application limited to situations in which it is
absolutely necessary. We understand this concern, but note that there are other factors that are
appropriately considered when deciding whether to use proprietary models, including feasibility
and cost considerations (e.g., it may be more cost-effective for the Agency to use a proprietary
model in some situations than to develop its own model). In cases where the Agency relies on
proprietary models, these model applications are still subject to our Peer Review Policy. Further,
as recently directed by the Administrator, the Agency's Council on Regulatory Environmental
Modeling is now revitalizing its development of principles for evaluating the use of
environmental models with regard to model validation and certification issues, building on
current good modeling practices. In addition, these Guidelines provide for the use of especially
rigorous "robustness checks" and documentation of what checks were undertaken. These steps,
along with transparency about the sources of data used, various assumptions employed, analytic
methods applied, and statistical procedures employed should assure that analytic results are
"capable of being substantially reproduced."
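The transparency items listed above (documented data sources, assumptions, analytic methods, and statistical procedures) are what make a re-run of an analysis meaningful. A minimal sketch of reproducibility in this sense, entirely illustrative and not an EPA procedure: with the same documented inputs and random seed, an analysis should return the same result on an independent re-run.

```python
# Minimal sketch: an analysis is "substantially reproducible" when the
# documented inputs (here: a random seed and sample size) let an
# independent re-run recover the same result. Hypothetical example only.
import random
import statistics

def analysis(seed, n=1000):
    """A stand-in analytic step: sample mean of n standard-normal draws,
    fully determined by the documented seed."""
    rng = random.Random(seed)  # documented seed makes the draw repeatable
    sample = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return statistics.mean(sample)

run1 = analysis(seed=42)
run2 = analysis(seed=42)  # independent re-run with the same documented inputs
print("reproduced:", run1 == run2)
```

An undocumented assumption (an unrecorded seed, an unstated data filter) is exactly what breaks this check, which is why the Guidelines emphasize transparency about each of those elements.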
Regarding robustness checks, commenters were concerned that the EPA did not use the term
"especially rigorous robustness checks." We have modified our Guidelines to include this term.
Some commenters speculated on the ability of the Agency's Peer Review program to meet the
intent of the Guidelines and were concerned about the process to rebut a peer review used to
support the objectivity demonstration for disseminated information. Our Peer Review program
has been subject to external review and we routinely verify implementation of the program.
Affected persons wishing to rebut a formal peer review may do so using the complaint resolution
process in these Guidelines, provided that the information being questioned is considered to be
"disseminated" according to the Guidelines.
Regarding analytic results, some commenters indicated that the transparency factors identified
by EPA (section 6.3 of the Guidelines) are not a complete list of the items that would be needed
to demonstrate a higher degree of quality for influential information. EPA agreed with the list of
four items that was initially provided by the OMB and recognizes that, in some cases, additional
information regarding disseminated information would facilitate increased quality. However,
given the variety of information disseminated by the Agency, we cannot reasonably provide
additional details for such a demonstration at this time. Also, with regard to laboratory results,
which were mentioned by several commenters, these Guidelines are not the appropriate place to
set out for the science community EPA's view of what constitutes adequate demonstration of test
method validation or minimum quality assurance and quality control. Those technical
considerations should be addressed in the appropriate quality planning documentation or in
regulatory requirements.
EPA has developed general language addressing the concept of reproducibility and may provide
more detail after appropriate consultation with scientific and technical communities, as called for
by OMB in its guidelines. We have already begun to consult relevant scientific and technical
experts within the Agency, and also have planned an expedited consultation with EPA's Science
Advisory Board (SAB) on October 1, 2002. Based on these initial consultations, EPA may seek
additional input from the SAB in 2003. These consultations will allow EPA to constructively and
appropriately refine the application of existing policies and procedures, to further improve
reproducibility. In the interim, EPA intends to base the reproducibility of disseminated original
and supporting data on commonly accepted scientific, financial, or statistical standards.
For a chemical or other stressor to be "risky," it must have both an inherent adverse effect on an
Risk assessments may be performed iteratively, with the first iteration employing protective
(conservative) assumptions to identify possible risks. Only if potential risks are identified in a
screening level assessment is it necessary to pursue a more refined, data-intensive risk
assessment. The screening level assessments may not result in "central estimates" of risk or
upper and lower-bounds of risks. Nevertheless, such assessments may be useful in making
regulatory decisions, as when the absence of concern from a screening level assessment is used
(along with other information) to approve the new use of a pesticide or chemical or to decide
whether to remediate very low levels of waste contamination.
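The screen-then-refine logic described above is commonly operationalized as a hazard quotient: estimated intake divided by the reference dose, with values at or below 1 indicating no need for a refined assessment. The sketch below uses entirely hypothetical numbers and our own conservative (upper-bound) exposure assumptions; it is not drawn from an EPA assessment.

```python
# Illustrative screening-level check using a hazard quotient (HQ).
# All numeric inputs are hypothetical; the conservative assumptions
# mirror the protective "first iteration" described above.

def hazard_quotient(intake_mg_per_kg_day, rfd_mg_per_kg_day):
    """HQ = estimated intake / reference dose. HQ <= 1 at screening
    level suggests no refined, data-intensive assessment is needed."""
    return intake_mg_per_kg_day / rfd_mg_per_kg_day

# Conservative screening inputs (hypothetical example values):
conc = 0.002        # mg/L, maximum detected concentration in water
ingestion = 2.0     # L/day, upper-bound drinking-water ingestion rate
body_weight = 70.0  # kg, assumed adult body weight
rfd = 0.0001        # mg/kg-day, illustrative reference dose

intake = conc * ingestion / body_weight  # mg/kg-day, upper-bound intake
hq = hazard_quotient(intake, rfd)

if hq <= 1.0:
    print(f"HQ = {hq:.3f}: screen passes; no refined assessment triggered")
else:
    print(f"HQ = {hq:.3f}: potential risk identified; refine the assessment")
```

Because the inputs are deliberately protective, an HQ at or below 1 supports a decision not to pursue the more refined iteration, which is the role the text above assigns to screening-level results.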
OMB Guidelines
In its guidelines OMB stated that, with respect to influential information regarding health, safety,
or environmental risk assessments, agencies should either adopt or adapt the quality principles in
the Safe Drinking Water Act (SDWA) Amendments of 1996.34, 35 In the background section of
the OMB guidelines, OMB explains that "the word 'adapt' is intended to provide agencies
flexibility in applying these principles to various types of risk assessment."
EPA carefully and practically developed the adaptation of the SDWA quality principles using
our considerable experience conducting human health and ecological36 risk assessments as well
as using our existing policies and guidance.
EPA conducts many risk assessments every year. Some of these are screening level assessments
based on scientific experts' judgments using conservative assumptions and available data and can
involve human health, safety, or environmental risk assessments. Such screening assessments
provide useful information that is sufficient for regulatory purposes in instances where more
elaborate, quantitative assessments are unnecessary. For example, such assessments could
indicate, even with conservative assumptions, that the level of risk does not warrant further
investigation. Other risk assessments are more detailed and quantitative and are based on
research and supporting data that are generated outside EPA. For example, pesticide reviews are
based on scientific studies conducted by registrants in accordance with our regulations and
guidance documents. Our test guidelines and Good Laboratory Practices (GLPs)37 describe
sound scientific practices for conducting studies needed to assess human and environmental
hazards and exposures. Such studies are not required to be peer-reviewed. Risk assessments
based on these studies can include occupational, dietary, and environmental exposures.
34 Safe Drinking Water Act Amendments of 1996, 42 U.S.C. 300g-1(b)(3)(A) & (B).
35 In section III.3.ii.C. of its guidelines, OMB states that: "With regard to analysis of risks to human health,
safety and the environment maintained or disseminated by the agencies, agencies shall either adopt or adapt the
quality principles applied by Congress to risk information used and disseminated pursuant to the Safe Drinking
Water Act Amendments of 1996 (42 U.S.C. 300g-1(b)(3)(A) & (B)). Agencies responsible for dissemination of vital
health and medical information shall interpret the reproducibility and peer-review standards in a manner appropriate
to assuring the timely flow of vital information from agencies to medical providers, patients, health agencies, and the
public. Information quality standards may be waived temporarily by agencies under urgent situations (e.g., imminent
threats to public health or homeland security) in accordance with the latitude specified in agency-specific
guidelines".
36 Because the assessment of "environmental risk" is being distinguished in OMB's adaptation of the
SDWA quality principles from "human health risk", the term "environmental risk" as used in these Guidelines does
not directly involve human health concerns. In other words, "environmental risk assessment" is, in this case, the
equivalent to what EPA commonly refers to as "ecological risk assessment".
37 40 CFR part 160 for FIFRA and 40 CFR part 792 for TSCA.
These risk assessments are conducted and their results presented to policy makers to inform
their risk management decisions. EPA currently has numerous policies that provide guidance to
internal risk assessors on how to conduct a risk assessment and characterize risk. The EPA Risk
Characterization Policy38 and associated guidelines are designed to ensure that critical
information from each stage of a risk assessment is used in forming conclusions about risk and
that this information is communicated from risk assessors to policy makers.
Current EPA guidance and policies incorporate quality principles. These are designed to ensure
that critical information from each stage of a risk assessment is used in forming conclusions
about risk and that this information is communicated from risk assessors to policy makers. One
example is the EPA Risk Characterization Policy39, which provides a single, centralized body of
risk characterization implementation guidance to help EPA risk assessors and risk managers
make the risk characterization process transparent and risk characterization products clear,
consistent and reasonable (TCCR). These principles have been included in other Agency risk
assessment guidance, such as the Guidelines for Ecological Risk Assessment. 40 Other examples
of major, overarching guidelines for risk assessments include: Guidelines For Exposure
Assessment 41, Guidelines For Neurotoxicity Risk Assessment,42 and Guidelines For Reproductive
Toxicity Risk Assessment. 43 Each of these documents has undergone external scientific peer
review as well as public comment prior to publication. Additionally, individual EPA offices have
developed more specific risk assessment policies to meet the particular needs of the programs
and statutes under which they operate.44 EPA's commitment to sound science is evidenced by our
ongoing efforts to develop and continually improve Agency guidance for risk assessment.
38 http://www.epa.gov/OSP/spc/rcpolicy.htm
39 Ibid.
40 US EPA (1998). Guidelines for Ecological Risk Assessment. Federal Register 63(93):26846-26924.
http://www.epa.gov/ncea/raf/.
41 US EPA (1992). Guidelines For Exposure Assessment. Federal Register 57(104):22888-22938.
http://www.epa.gov/ncea/raf/.
42 US EPA (1998). Guidelines For Neurotoxicity Risk Assessment. Federal Register 63(93):26926-26954.
http://www.epa.gov/ncea/raf/.
43 US EPA (1996). Guidelines For Reproductive Toxicity Risk Assessment. Federal Register 61(212):56274-
56322. http://www.epa.gov/ncea/raf/.
44 The Office of Solid Waste and Emergency Response has developed Tools for Ecological Risk
Assessment for Superfund Risk Assessment. One example is the Ecological Risk Assessment Guidance for
Superfund: Process for Designing and Conducting Ecological Risk Assessments - Interim Final.
http://www.epa.gov/oerrpage/superfund/programs/risk/ecorisk/ecorisk.htm
http://www.epa.gov/oerrpage/superfund/programs/risk/tooleco.htm
The first EPA human health risk assessment guidelines45 were issued in 1986. In 1992, the
Agency produced a Framework for Ecological Risk Assessment46 which was replaced by the
1998 Ecological Risk Assessment Guidelines.47 As emphasized elsewhere in this document, the
statutes administered by EPA are diverse. Although the majority of risk assessments conducted
within the Agency are for chemical stressors, we also assess risks from biological and physical
stressors. In addition to risk assessment guidelines, both the EPA Science Policy Council and the
EPA Risk Assessment Forum have coordinated efforts to address the complex issues related to
data collection and analysis for hazard and exposure assessments. Thus, the Agency has
considerable experience in conducting both screening level and in-depth assessments for a wide
array of stressors.
Most environmental statutes obligate EPA to act to prevent adverse environmental and human
health impacts. For many of the risks that we must address, data are sparse and consensus about
assumptions is rare. In the context of data quality, we seek to strike a balance among fairness,
accuracy, and efficient implementation. Refusing to act until data quality improves can result in
substantial harm to human health, safety, and the environment.
Public Comments
We received a range of public and stakeholder comments on the adaptation of the SDWA
principles for "influential" human health, safety, and environmental risk assessments that are
disseminated by EPA. Some commenters stated that we should adopt the SDWA quality
principles for human health risk, safety and environmental risk assessments. Many commenters
sought clarification on reasons for EPA's adaptation of the SDWA quality principles for human
health risk assessments and additional information on how we plan to address this process.
Others urged us to adapt the SDWA principles rather than adopt, because of certain elements in
the SDWA principles that may not be applicable to all risk assessments such as a "central
estimate of human risk for the specific populations affected." Others stated that we should
neither adapt nor adopt SDWA principles because the "Data Quality Act" does not authorize
importing decisional criteria into statutory provisions where they do not apply. The decisional
criteria set forth in SDWA are expressly limited to SDWA. We also received comments at a
level of detail that are more appropriate for implementation of the Guidelines than for the
formulation of the Guidelines. These include comments regarding the use of clinical human test
data, and comments regarding the use of particular types of assumptions in risk assessments. To
the extent that an affected person believes that our use of data or assumptions in a particular
46 Framework For Ecological Risk Assessment, U.S. EPA, Risk Assessment Forum, 1992, EPA/630/R-
92/001.
47 Guidelines For Ecological Risk Assessment, U.S. EPA, Risk Assessment Forum, 1998, EPA/630/R-
95/002F. http://cfpub.epa.gov/ncea/cfm/ecorsk.cfm
dissemination of information is inconsistent with these Guidelines, the issue can be raised at that
time.
A few commenters raised a question regarding a conflict between EPA's existing policies and
the SDWA principles and asked us to identify the conflicting specific risk assessment standards
and make every effort to reconcile the conflicting standards with the SDWA principles. A few
commenters stated that EPA should not have two separate standards for risk assessments (i.e.,
one for influential and one for non-influential), but that all risk assessments should be considered
influential. Another stated that if there is a conflict between existing policies and the SDWA
principles, EPA should identify the conflicting specific risk assessment standards and make
every effort to reconcile the conflicting standards with the SDWA principles. Some commenters
have questioned why the "best available, peer reviewed science and supporting studies"
language of SDWA was conditioned by terms such as "to the extent practicable" or "as
appropriate."
Public comments received by the Agency on the draft Guidelines were widely divergent. As no
obvious consensus could be drawn, we carefully considered comments and arguments on
adoption and adaptation. We also reviewed our experience with the SDWA principles, existing
policies, and the applicability and appropriateness of the SDWA language with regard to the
variety of risk assessments that we conduct and have determined that, to best meet the statutory
obligations of the many statutes EPA implements, it remains most appropriate to adapt the
SDWA principles to human health, safety, and environmental risk assessments.
In response to public comments we have removed "as appropriate" from these Guidelines in our
SDWA adaptation. EPA agrees that the phrase peer reviewed science "as appropriate" was
unclear. We revised this statement in part (A) to "including, when available, peer-reviewed
science and supporting studies." EPA introduced such adaptations in order to accommodate the
range of real-world situations we address in the implementation of our diverse programs.
Numerous commenters expressed that EPA did not provide adequate clarifications of how we
adapted the principles and what our thinking was on each adaptation. In these Guidelines we
have provided detailed clarifications regarding each adaptation made to the original SDWA
language and other remarks regarding our intent during the implementation of the SDWA
adaptation for influential disseminations by EPA. We direct readers to the Guidelines text for
such clarifications.
A few commenters noted that EPA should outline how an affected person would rebut the
presumption of objectivity afforded by peer review. EPA believes this determination would be
made on a case-by-case basis considering the circumstances of a particular peer review and has
decided not to provide specific suggestions for affected persons on how to rebut the presumption
of objectivity afforded by a peer review.
OMB and other commenters noted that agencies' guidelines needed to make clear that a request
for correction can be filed if an affected person believes that information does not comply with
the EPA Guidelines and the OMB guidelines. EPA has added language in the EPA Guidelines to
make this clearer to readers.
EPA received numerous comments on the EPA definition of affected persons. In the draft
Guidelines, EPA had adopted OMB's definition. EPA agrees with comments suggesting that,
instead of elaborating on the definition of "affected person," a more open approach would be to
ask complainants to describe how they are an affected person with respect to the information that
is the subject of their complaint. EPA is asking that persons submitting requests for correction
provide, among other things, such an explanation. EPA has revised the Guidelines accordingly,
so that we may consider this information along with other information in the complaint in
deciding on how to respond.
Some commenters noted that the EPA Guidelines do not state how the process will work,
specifically, for States, municipalities, and EPA. They expressed concern about being "caught in the
middle," so to speak, when trying to get their own information corrected. EPA does not believe that
the Guidelines needed greater details on how States will work with EPA to address complaints,
but intends to work closely with States to better ensure timely correction. EPA does appreciate
the frustration of an information owner in seeing what they deem "incorrect" information in a
disseminated document or web site. However, EPA notes that this is a very complex issue that
cannot be addressed with general language in the Guidelines for all cases.
Several comments indicated that EPA appears to have given itself "carte blanche" authority to
"elect not to correct" information. The commenters stated that there was no valid reason why
EPA would opt out of correcting information and that all errors should be corrected. To the
contrary, EPA like every Federal agency wants to correct wrong information. The issue is not as
simple as the correction of an improper zip code or phone number on the EPA web site. Even
these simple errors may become complex if they involve changing data in an EPA and/or
State database. Furthermore, EPA is not certain of the volume of complaints it will receive after
October 1 and therefore needed to provide a general provision in the Guidelines to recognize that
once EPA approves a request, the corrective action may vary depending on the circumstances.
On a case-by-case basis, EPA will determine the appropriate corrective action for each
complaint. EPA determined that this was the most reasonable approach. The revision also
recognizes practical limitations on corrective action for information from outside sources.
Several commenters noted that EPA needs to establish time frames for the complaint process.
Commenters stated that EPA should establish time frames for when affected persons can submit
a complaint on an information product, when EPA needs to respond to affected persons with a
decision on discrete, factual errors, when EPA would respond to affected persons with a decision
on more complex or broader interpretive issues, and when an affected person should submit a
request for reconsideration. One commenter suggested that EPA solicit all complaints at one
time during a 6-month window or another time frame. EPA notes that commenters provided
helpful examples and well thought out proposals for such a suite of time frames and appreciates
the public input.
EPA did not agree on the need to develop two separate time frames for complaints that are more
factual in nature versus those that are more complex. One commenter suggested a 15-day time
line for discrete factual errors and a 45-day time line for all other complaints. Another
commenter recommended 30 days for factual errors and 60 days for all other complaints.
Another commenter advised EPA to model this complaint process according to the FOIA
process. This commenter also suggested a 3-week time line for more numeric corrections and 60
days for "broader interpretive issues or technical questions." While EPA appreciates the value of
these approaches, they might be problematic to implement. However, as EPA learns more about
the nature of this complaint process following some period of implementation, these suggested
approaches could be revisited.
EPA also agreed with commenters that a window of opportunity for commenters to submit a
request for reconsideration made sense. EPA has advised affected persons in these Guidelines to
submit a request for reconsideration within 90 days of the initial complaint decision by EPA.
Some commenters asked that EPA establish time lines for when EPA would take corrective
action. EPA does not anticipate that there would be any value in applying a specific time frame
for this action and prefers to look at each complaint and appropriate corrective action on a
case-by-case basis, as discussed above.
Commenters suggested that 45 days was a reasonable time frame for EPA to get back to the
affected person with either a decision or a notice that EPA needs more time. One group noted
that HHS, SSA, and NRC adopted the 45-day window. EPA disagreed with this approach and
instead opted for a 90-day time frame similar to the DOT Guidelines.
EPA received many comments on how EPA should structure its internal processes for the
complaint resolution process. Several comments specifically discussed the role that OEI should
play in the initial complaint and the requests for reconsideration. EPA does not agree that OEI
should be the arbiter on all requests for reconsideration, but does view the role of OEI in the
process as an important one. Namely, OEI may work to help ensure consistent responses to
complaints and requests for reconsideration. Other comments recommending specific internal
implementation processes are being considered as EPA designs the correction and request for
reconsideration administrative processes in greater detail.
Many commenters argued that Assistant Administrators and Regional Administrators should not
decide requests for reconsideration because they would be biased or would have a conflict of
interest when deciding complaints regarding information disseminated by their own Offices or
programs, or if they had to reconsider decisions made by their own staffs. EPA does not agree.
This type of decision making is within the delegated decision making authority of EPA's
officials, and these decisions should be presumed to be unbiased absent a specific showing that a
decision maker is not impartial in a particular case. EPA does agree with commenters who noted
that it is important to make consistent decisions on cross-cutting information quality issues. In
order to achieve appropriate consistency of response to affected persons on requests for
reconsideration and to ensure that cross-cutting information quality issues are considered across
the Agency at a senior level, EPA intends for an executive panel to make the final decisions on
all requests for reconsideration. Furthermore, we felt it important to add greater detail on the
time frame within which EPA would respond to a requestor on their request for reconsideration.
We have added that it is EPA's goal to respond to requesters regarding requests for
reconsideration within 90 days.
EPA received many recommendations in public comments to include the public in the EPA
complaint process. Specifically, commenters requested that EPA notify the public about all
pending requests to modify information and one commenter stated that EPA should allow the
public to comment on information corrections requests for information that are considered
"central to a rulemaking or other Final Agency Action" before EPA accepts or rejects the request.
As a general matter, EPA does not intend to solicit public comment on how EPA should respond
to requests for correction or reconsideration. EPA also does not intend to post requests for
correction and requests for reconsideration on the EPA web site, but we plan to revisit this and
many other aspects of the Guidelines within one year of implementation.
EPA also received many comments on how information that is currently being reviewed by EPA
in response to a complaint appears to the public on the EPA web site or some other medium.
Some commenters recommended the use of flags for all information that has a complaint pending
with a note that where appropriate, challenged information will be pulled from dissemination and
removed from EPA's web site. Other commenters stated that the information in question should
be removed from public access until the resolution process has been completed. Still other
commenters requested that EPA not embark on self-censorship. As a general rule, EPA has
decided not to flag information that has a complaint pending. EPA believes that information that
is the subject of a pending complaint should not necessarily be removed from public access based
solely on the receipt of a request for correction.
EPA is actively developing new policies and procedures, as appropriate, to improve the quality of
information disseminated to the public. Some activities specifically support ensuring and
maximizing the quality, objectivity, utility, and integrity of information. For instance, we are
consulting with the scientific community on the subject of reproducibility. The EPA Science
Advisory Board (SAB) is performing an expedited consultation on the subject on October 1,
2002. Based on this initial consultation, EPA and the SAB may consider a full review of
reproducibility and related information quality concepts in 2003. Furthermore, as noted earlier,
the EPA Science Policy Council has commissioned a workgroup to develop assessment factors
for consideration in assessing information that EPA collects or is voluntarily submitted in support
of various Agency decisions.
As new processes, policies, and procedures are considered and adopted into Agency operations,
we will consider their relationship to the Guidelines and determine the extent to which the
Guidelines may need to change to accommodate new activity.
OFFICE OF ENVIRONMENTAL INFORMATION
www.epa.gov/oei

Friday, February 22, 2002

Part IX

Office of Management and Budget

Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies; Notice; Republication
8452 Federal Register / Vol. 67, No. 36 / Friday, February 22, 2002 / Notices

OFFICE OF MANAGEMENT AND BUDGET

Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies; Republication

Editorial Note: Due to numerous errors, this document is being reprinted in its entirety. It was originally printed in the Federal Register on Thursday, January 3, 2002 at 67 FR 369-378 and was corrected on Tuesday, February 5, 2002 at 67 FR 5365.

AGENCY: Office of Management and Budget, Executive Office of the President.

ACTION: Final guidelines.

SUMMARY: These final guidelines implement section 515 of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658). Section 515 directs the Office of Management and Budget (OMB) to issue government-wide guidelines that "provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies." By October 1, 2002, agencies must issue their own implementing guidelines that include "administrative mechanisms allowing affected persons to seek and obtain correction of information maintained and disseminated by the agency" that does not comply with the OMB guidelines. These final guidelines also reflect the changes OMB made to the guidelines issued September 28, 2001, as a result of receiving additional comment on the "capable of being substantially reproduced" standard (paragraphs V.3.B, V.9, and V.10), which OMB previously issued on September 28, 2001, on an interim final basis.

DATES: Effective Date: January 3, 2002.

FOR FURTHER INFORMATION CONTACT: Brooke J. Dickson, Office of Information and Regulatory Affairs, Office of Management and Budget, Washington, DC 20503. Telephone (202) 395-3785 or by e-mail to informationquality@omb.eop.gov.

SUPPLEMENTARY INFORMATION: In section 515(a) of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658), Congress directed the Office of Management and Budget (OMB) to issue, by September 30, 2001, government-wide guidelines that "provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies * * *" Section 515(b) goes on to state that the OMB guidelines shall:

"(1) apply to the sharing by Federal agencies of, and access to, information disseminated by Federal agencies; and

"(2) require that each Federal agency to which the guidelines apply

"(A) issue guidelines ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by the agency, by not later than 1 year after the date of issuance of the guidelines under subsection (a);

"(B) establish administrative mechanisms allowing affected persons to seek and obtain correction of information maintained and disseminated by the agency that does not comply with the guidelines issued under subsection (a); and

"(C) report periodically to the Director

"(i) the number and nature of complaints received by the agency regarding the accuracy of information disseminated by the agency; and

"(ii) how such complaints were handled by the agency."

Proposed guidelines were published in the Federal Register on June 28, 2001 (66 FR 34489). Final guidelines were published in the Federal Register on September 28, 2001 (66 FR 49718). The Supplementary Information to the final guidelines published in September 2001 provides background, the underlying principles OMB followed in issuing the final guidelines, and statements of intent concerning detailed provisions in the final guidelines. In the final guidelines published in September 2001, OMB also requested additional comment on the "capable of being substantially reproduced" standard and the related definition of "influential scientific or statistical information" (paragraphs V.3.B, V.9, and V.10), which were issued on an interim final basis. The final guidelines published today discuss the public comments OMB received, the OMB response, and amendments to the final guidelines published in September 2001.

In developing agency-specific guidelines, agencies should refer both to the Supplementary Information to the final guidelines published in the Federal Register on September 28, 2001 (66 FR 49718), and also to the Supplementary Information published today. We stress that the three "Underlying Principles" that OMB followed in drafting the guidelines that we published on September 28, 2001 (66 FR 49719), are also applicable to the amended guidelines that we publish today.

In accordance with section 515, OMB has designed the guidelines to help agencies ensure and maximize the quality, utility, objectivity and integrity of the information that they disseminate (meaning to share with, or give access to, the public). It is crucial that information Federal agencies disseminate meets these guidelines. In this respect, the fact that the Internet enables agencies to communicate information quickly and easily to a wide audience not only offers great benefits to society, but also increases the potential harm that can result from the dissemination of information that does not meet basic information quality guidelines. Recognizing the wide variety of information Federal agencies disseminate and the wide variety of dissemination practices that agencies have, OMB developed the guidelines with several principles in mind.

First, OMB designed the guidelines to apply to a wide variety of government information dissemination activities that may range in importance and scope. OMB also designed the guidelines to be generic enough to fit all media, be they printed, electronic, or in other form. OMB sought to avoid the problems that would be inherent in developing detailed, prescriptive, "one-size-fits-all" government-wide guidelines that would artificially require different types of dissemination activities to be treated in the same manner. Through this flexibility, each agency will be able to incorporate the requirements of these OMB guidelines into the agency's own information resource management and administrative practices.

Second, OMB designed the guidelines so that agencies will meet basic information quality standards. Given the administrative mechanisms required by section 515 as well as the standards set forth in the Paperwork Reduction Act, it is clear that agencies should not disseminate substantive information that does not meet a basic level of quality. We recognize that some government information may need to meet higher or more specific information quality standards than those that would apply to other types of government information. The more important the information, the higher the quality standards to which it should be held, for example, in those situations involving "influential scientific, financial, or statistical information" (a phrase defined in these guidelines). The guidelines recognize, however, that
information quality comes at a cost. Accordingly, the agencies should weigh the costs (for example, including costs attributable to agency processing effort, respondent burden, maintenance of needed privacy, and assurances of suitable confidentiality) and the benefits of higher information quality in the development of information, and the level of quality to which the information disseminated will be held.

Third, OMB designed the guidelines so that agencies can apply them in a common-sense and workable manner. It is important that these guidelines do not impose unnecessary administrative burdens that would inhibit agencies from continuing to take advantage of the Internet and other technologies to disseminate information that can be of great benefit and value to the public. In this regard, OMB encourages agencies to incorporate the standards and procedures required by these guidelines into their existing information resources management and administrative practices rather than create new and potentially duplicative or contradictory processes. The primary example of this is that the guidelines recognize that, in accordance with OMB Circular A-130, agencies already have in place well established information quality standards and administrative mechanisms that allow persons to seek and obtain correction of information that is maintained and disseminated by the agency. Under the OMB guidelines, agencies need only ensure that their own guidelines are consistent with these OMB guidelines, and then ensure that their administrative mechanisms satisfy the standards and procedural requirements in the new agency guidelines. Similarly, agencies may rely on their implementation of the Federal Government's computer security laws (formerly, the Computer Security Act, and now the computer security provisions of the Paperwork Reduction Act) to establish appropriate security safeguards for ensuring the "integrity" of the information that the agencies disseminate.

In addition, in response to concerns expressed by some of the agencies, we want to emphasize that OMB recognizes that Federal agencies provide a wide variety of data and information. Accordingly, OMB understands that the guidelines discussed below cannot be implemented in the same way by each agency. In some cases, for example, the data disseminated by an agency are not collected by that agency; rather, the information the agency must provide in a timely manner is compiled from a variety of sources that are constantly updated and revised and may be confidential. In such cases, while agencies' implementation of the guidelines may differ, the essence of the guidelines will apply. That is, these agencies must make their methods transparent by providing documentation, ensure quality by reviewing the underlying methods used in developing the data and consulting (as appropriate) with experts and users, and keep users informed about corrections and revisions.

Summary of OMB Guidelines

These guidelines apply to Federal agencies subject to the Paperwork Reduction Act (44 U.S.C. chapter 35). Agencies are directed to develop information resources management procedures for reviewing and substantiating (by documentation or other means selected by the agency) the quality (including the objectivity, utility, and integrity) of information before it is disseminated. In addition, agencies are to establish administrative mechanisms allowing affected persons to seek and obtain, where appropriate, correction of information disseminated by the agency that does not comply with the OMB or agency guidelines.

Consistent with the underlying principles described above, these guidelines stress the importance of having agencies apply these standards and develop their administrative mechanisms so they can be implemented in a common sense and workable manner. Moreover, agencies must apply these standards flexibly, and in a manner appropriate to the nature and timeliness of the information to be disseminated, and incorporate them into existing agency information resources management and administrative practices.

Section 515 denotes four substantive terms regarding information disseminated by Federal agencies: quality, utility, objectivity, and integrity. It is not always clear how each substantive term relates, or how the four terms in aggregate relate, to the widely divergent types of information that agencies disseminate. The guidelines provide definitions that attempt to establish a clear meaning so that both the agency and the public can readily judge whether a particular type of information to be disseminated does or does not meet these attributes.

In the guidelines, OMB defines "quality" as the encompassing term, of which "utility," "objectivity," and "integrity" are the constituents. "Utility" refers to the usefulness of the information to the intended users. "Objectivity" focuses on whether the disseminated information is being presented in an accurate, clear, complete, and unbiased manner, and as a matter of substance, is accurate, reliable, and unbiased. "Integrity" refers to security: the protection of information from unauthorized access or revision, to ensure that the information is not compromised through corruption or falsification. OMB modeled the definitions of "information," "government information," "information dissemination product," and "dissemination" on the longstanding definitions of those terms in OMB Circular A-130, but tailored them to fit into the context of these guidelines.

In addition, Section 515 imposes two reporting requirements on the agencies. The first report, to be promulgated no later than October 1, 2002, must provide the agency's information quality guidelines that describe administrative mechanisms allowing affected persons to seek and obtain, where appropriate, correction of disseminated information that does not comply with the OMB and agency guidelines. The second report is an annual fiscal year report to OMB (to be first submitted on January 1, 2004) providing information (both quantitative and qualitative, where appropriate) on the number, nature, and resolution of complaints received by the agency regarding its perceived or confirmed failure to comply with these OMB and agency guidelines.

Public Comments and OMB Response

Applicability of Guidelines. Some comments raised concerns about the applicability of these guidelines, particularly in the context of scientific research conducted by Federally employed scientists or Federal grantees who publish and communicate their research findings in the same manner as their academic colleagues. OMB believes that information generated and disseminated in these contexts is not covered by these guidelines unless the agency represents the information as, or uses the information in support of, an official position of the agency.

As a general matter, these guidelines apply to "information" that is "disseminated" by agencies subject to the Paperwork Reduction Act (44 U.S.C. 3502(1)). See paragraphs II, V.5 and V.8. The definitions of "information" and "dissemination" establish the scope of the applicability of these guidelines. "Information" means "any communication or representation of knowledge such as facts or data * * *" This definition of information in paragraph V.5 does "not include opinions, where the agency's presentation makes it clear that what is
being offered is someone's opinion rather than fact or the agency's views."

"Dissemination" is defined to mean "agency initiated or sponsored distribution of information to the public." As used in paragraph V.8, "agency INITIATED * * * distribution of information to the public" refers to information that the agency disseminates, e.g., a risk assessment prepared by the agency to inform the agency's formulation of possible regulatory or other action. In addition, if an agency, as an institution, disseminates information prepared by an outside party in a manner that reasonably suggests that the agency agrees with the information, this appearance of having the information represent agency views makes agency dissemination of the information subject to these guidelines. By contrast, an agency does not "initiate" the dissemination of information when a Federally employed scientist or Federal grantee or contractor publishes and communicates his or her research findings in the same manner as his or her academic colleagues, even if the Federal agency retains ownership or other intellectual property rights because the Federal government paid for the research. To avoid confusion regarding whether the agency agrees with the information (and is therefore disseminating it through the employee or grantee), the researcher should include an appropriate disclaimer in the publication or speech to the effect that the "views are mine, and do not necessarily reflect the view" of the agency.

Similarly, as used in paragraph V.8, "agency * * * SPONSORED distribution of information to the public" refers to situations where an agency has directed a third-party to disseminate information, or where the agency has the authority to review and approve the information before release. Therefore, for example, if an agency through a procurement contract or a

and even if the Federal agency retains ownership or other intellectual property rights because the Federal government paid for the research. To avoid confusion regarding whether the agency is sponsoring the dissemination, the researcher should include an appropriate disclaimer in the publication or speech to the effect that the "views are mine, and do not necessarily reflect the view" of the agency. On the other hand, subsequent agency dissemination of such information requires that the information adhere to the agency's information quality guidelines. In sum, these guidelines govern an agency's dissemination of information, but generally do not govern a third-party's dissemination of information (the exception being where the agency is essentially using the third-party to disseminate information on the agency's behalf). Agencies, particularly those that fund scientific research, are encouraged to clarify the applicability of these guidelines to the various types of information they and their employees and grantees disseminate.

Paragraph V.8 also states that the definition of "dissemination" does not include "* * * distribution limited to correspondence with individuals or persons, press releases, archival records, public filings, subpoenas or adjudicative processes." The exemption from the definition of "dissemination" for "adjudicative processes" is intended to exclude, from the scope of these guidelines, the findings and determinations that an agency makes in the course of adjudications involving specific parties. There are well established procedural safeguards and rights to address the quality of adjudicatory decisions and to provide persons with an opportunity to contest decisions. These guidelines do not impose any additional requirements on agencies during adjudicative proceedings and do not provide parties to such adjudicative proceedings any

Most comments approved of the prominent role that peer review plays in the OMB guidelines. Some comments contended that peer review was not accepted as a universal standard that incorporates an established, practiced, and sufficient level of objectivity. Other comments stated that the guidelines would be better clarified by making peer review one of several factors that an agency should consider in assessing the objectivity (and quality in general) of original research. In addition, several comments noted that peer review does not establish whether analytic results are capable of being substantially reproduced. In light of the comments, the final guidelines in new paragraph V.3.b.i qualify the presumption in favor of peer-reviewed information as follows: "However, this presumption is rebuttable based on a persuasive showing by the petitioner in a particular instance."

We believe that transparency is important for peer review, and these guidelines set minimum standards for the transparency of agency-sponsored peer review. As we state in new paragraph V.3.b.i: "If data and analytic results have been subjected to formal, independent, external peer review, the information may generally be presumed to be of acceptable objectivity. However, this presumption is rebuttable based on a persuasive showing by the petitioner in a particular instance. If agency sponsored peer review is employed to help satisfy the objectivity standard, the review process employed shall meet the general criteria for competent and credible peer review recommended by OMB-OIRA to the President's Management Council (9/20/01) (http://www.whitehouse.gov/omb/inforeg/oira_review-process.html), namely, 'that (a) peer reviewers be selected primarily on the basis of necessary technical expertise, (b) peer reviewers be expected to disclose to agencies prior technical/policy positions they may have taken on the issues at hand, (c) peer reviewers be
grant prOVides for a person to conduct additional rights of challenge or appeal. expected to disclose to agencies their
research, and then the agency directs The Presumption Favoring Peer sources of personal and institutional
the person to disseminate the results (or Reviewed Information ,As a general funding (private or public sector), and
the agency reviews and approves the matter, in the scientific and research (dl peer reviews be conducted in an
results before they may be context, we regard technical information open and rigorous manner.'''
disseminated), then the agency has that has been subjected to formal, The importance of these general
"sponsored" the dissemination of this independent, external peer review as criteria for competent and credible peer
information, By contrast, if the agency presumptively objective. As the review has been supported by a number
simply provides funding to support guidelines state in paragraph V.3.b.i: "If of expert bodies. For example, "the
research, and it the researcher (not the data and analytic results have been work of fully competent peer-review
agency) who decides whether to subjected to formal, independent, panels can be undermined by
disseminate the results and-if the external peer review, the information allegations of conflict of interest and
results are to be released-who may generally be presumed to be of bias. Therefore, the best interests of the
determines the content and presentation acceptable objectivity." An example of a Board are served by effective policies
of the dissemination, then the agency formal, independent, external peer and procedures regarding potential
has not "sponsored" the dissemination review is the review process used by conflicts of interest. impartiality, and
even though it has funded the research scientific journals. panel balance." (EPA's Science Advisory
Federal Register/Vol. 67, No. 36/Friday, February 22, 2002/Notices 8455
Board Panels: Improved Policies and Procedures Needed to Ensure Independence and Balance, GAO-01-536, General Accounting Office, Washington, DC, June 2001, page 19.) As another example, "risk analyses should be peer-reviewed and accessible, both physically and intellectually, so that decision-makers at all levels will be able to respond critically to risk characterizations. The intensity of the peer reviews should be commensurate with the significance of the risk or its management implications." (Setting Priorities, Getting Results: A New Direction for EPA, Summary Report, National Academy of Public Administration, Washington, DC, April 1995, page 23.)

These criteria for peer reviewers are generally consistent with the practices now followed by the National Research Council of the National Academy of Sciences. In considering these criteria for peer reviewers, we note that there are many types of peer reviews and that agency guidelines concerning the use of peer review should tailor the rigor of peer review to the importance of the information involved. More generally, agencies should define their peer-review standards in appropriate ways, given the nature and importance of the information they disseminate.

Is Journal Peer Review Always Sufficient? Some comments argued that journal peer review should be adequate to demonstrate quality, even for influential information that can be expected to have major effects on public policy. OMB believes that this position overstates the effectiveness of journal peer review as a quality-control mechanism.

Although journal peer review is clearly valuable, there are cases where flawed science has been published in respected journals. For example, the NIH Office of Research Integrity recently reported the following case regarding environmental health research:

"Based on the report of an investigation conducted by [XX] University, dated July 16, 1999, and additional analysis conducted by ORI in its oversight review, the US Public Health Service found that Dr. [X] engaged in scientific misconduct. Dr. [X] committed scientific misconduct by intentionally falsifying the research results published in the journal SCIENCE and by providing falsified and fabricated materials to investigating officials at [XX] University in response to a request for original data to support the research results and conclusions reported in the SCIENCE paper. In addition, PHS finds that there is no original data or other corroborating evidence to support the research results and conclusions reported in the SCIENCE paper as a whole." (66 FR 52137, October 12, 2001).

Although such cases of falsification are presumably rare, there is a significant scholarly literature documenting quality problems with articles published in peer-reviewed research. "In a [peer-reviewed] meta-analysis that surprised many (and some doubt), researchers found little evidence that peer review actually improves the quality of research papers." (See, e.g., Science, Vol. 293, page 2187 (September 21, 2001).) In part for this reason, many agencies have already adopted peer review and science advisory practices that go beyond journal peer review. See, e.g., Sheila Jasanoff, The Fifth Branch: Science Advisers as Policy Makers, Cambridge, MA, Harvard University Press, 1990; Mark R. Powell, Science at EPA: Information in the Regulatory Process, Resources for the Future, Washington, DC, 1999, pages 138-139, 151-153; Implementation of the Environmental Protection Agency's Peer Review Program: An SAB Evaluation of Three Reviews, EPA-SAB-RSAC-01-009, A Review of the Research Strategies Advisory Committee (RSAC) of the EPA Science Advisory Board (SAB), Washington, DC, September 26, 2001. For information likely to have an important public policy or private sector impact, OMB believes that additional quality checks beyond peer review are appropriate.

Definition of "Influential". OMB guidelines apply stricter quality standards to the dissemination of information that is considered to be "influential." Comments noted that the breadth of the definition of "influential" in interim final paragraph V.9 requires much speculation on the part of agencies.

We believe that this criticism has merit and have therefore narrowed the definition. In this narrower definition, "influential", when used in the phrase "influential scientific, financial, or statistical information", is amended to mean that "the agency can reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions." The intent of the new phrase "clear and substantial" is to reduce the need for speculation on the part of agencies. We added the present tense ("or does have") to this narrower definition because on occasion an information dissemination may occur simultaneously with a particular policy change. In response to a public comment, we added an explicit reference to "financial" information as consistent with our original intent.

Given the differences in the many Federal agencies covered by these guidelines, and the differences in the nature of the information they disseminate, we also believe it will be helpful if agencies elaborate on this definition of "influential" in the context of their missions and duties, with due consideration of the nature of the information they disseminate. As we state in amended paragraph V.9, "Each agency is authorized to define 'influential' in ways appropriate for it given the nature and multiplicity of issues for which the agency is responsible."

Reproducibility. As we state in new paragraph V.3.b.ii: "If an agency is responsible for disseminating influential scientific, financial, or statistical information, agency guidelines shall include a high degree of transparency about data and methods to facilitate the reproducibility of such information by qualified third parties." OMB believes that a reproducibility standard is practical and appropriate for information that is considered "influential", as defined in paragraph V.9: information that "will have or does have a clear and substantial impact on important public policies or important private sector decisions." The reproducibility standard applicable to influential scientific, financial, or statistical information is intended to ensure that information disseminated by agencies is sufficiently transparent in terms of data and methods of analysis that it would be feasible for a replication to be conducted. The fact that original and supporting data and analytic results have been deemed "defensible" by peer-review procedures does not necessarily imply that the results are transparent and replicable.

Reproducibility of Original and Supporting Data. Several of the comments objected to the exclusion of original and supporting data from the reproducibility requirements. Comments instead suggested that OMB should apply the reproducibility standard to original data, and that OMB should provide flexibility to the agencies in determining what constitutes "original and supporting" data. OMB agrees and asks that agencies consider, in developing their own guidelines, which categories of original and supporting data should be subject to the reproducibility standard and which should not. To help in resolving this issue, we also ask agencies to consult directly with relevant scientific and technical communities on the feasibility of having the selected categories of original and supporting data subject to the reproducibility standard. Agencies are encouraged to address ethical, feasibility, and confidentiality issues
with care. As we state in new paragraph V.3.b.ii.A, "Agencies may identify, in consultation with the relevant scientific and technical communities, those particular types of data that can practicably be subjected to a reproducibility requirement, given ethical, feasibility, or confidentiality constraints." Further, as we state in our expanded definition of "reproducibility" in paragraph V.10, "If agencies apply the reproducibility test to specific types of original or supporting data, the associated guidelines shall provide relevant definitions of reproducibility (e.g., standards for replication of laboratory data)." OMB urges caution in the treatment of original and supporting data because it may often be impractical or even impermissible or unethical to apply the reproducibility standard to such data. For example, it may not be ethical to repeat a "negative" (ineffective) clinical (therapeutic) experiment, and it may not be feasible to replicate the radiation exposures studied after the Chernobyl accident. When agencies submit their draft agency guidelines for OMB review, agencies should include a description of the extent to which the reproducibility standard is applicable and reflect consultations with relevant scientific and technical communities that were used in developing guidelines related to applicability of the reproducibility standard to original and supporting data.

It is also important to emphasize that the reproducibility standard does not apply to all original and supporting data disseminated by agencies. As we state in new paragraph V.3.b.ii.A, "With regard to original and supporting data related [to influential scientific, financial, or statistical information], agency guidelines shall not require that all disseminated data be subjected to a reproducibility requirement." In addition, we encourage agencies to address how greater transparency can be achieved regarding original and supporting data. As we also state in new paragraph V.3.b.ii.A, "It is understood that reproducibility of data is an indication of transparency about research design and methods and thus a replication exercise (i.e., a new experiment, test, or sample) shall not be required prior to each dissemination." Agency guidelines need to achieve a high degree of transparency about data even when reproducibility is not required.

Reproducibility of Analytic Results. Many public comments were critical of the reproducibility standard and expressed concern that agencies would be required to reproduce each analytical result before it is disseminated. While several comments commended OMB for establishing an appropriate balance in the "capable of being substantially reproduced" standard, others considered this standard to be inherently subjective. There were also comments that suggested the standard would cause more burden for agencies.

It is not OMB's intent that each agency must reproduce each analytic result before it is disseminated. The purpose of the reproducibility standard is to cultivate a consistent agency commitment to transparency about how analytic results are generated: the specific data used, the various assumptions employed, the specific analytic methods applied, and the statistical procedures employed. If sufficient transparency is achieved on each of these matters, then an analytic result should meet the "capable of being substantially reproduced" standard. While there is much variation in types of analytic results, OMB believes that reproducibility is a practical standard to apply to most types of analytic results. As we state in new paragraph V.3.b.ii.B, "With regard to analytic results related [to influential scientific, financial, or statistical information], agency guidelines shall generally require sufficient transparency about data and methods that an independent reanalysis could be undertaken by a qualified member of the public. These transparency standards apply to agency analysis of data from a single study as well as to analyses that combine information from multiple studies." We elaborate upon this principle in our expanded definition of "reproducibility" in paragraph V.10: "With respect to analytic results, 'capable of being substantially reproduced' means that independent analysis of the original or supporting data using identical methods would generate similar analytic results, subject to an acceptable degree of imprecision or error."

Even in a situation where the original and supporting data are protected by confidentiality concerns, or the analytic computer models or other research methods may be kept confidential to protect intellectual property, it may still be feasible to have the analytic results subject to the reproducibility standard. For example, a qualified party, operating under the same confidentiality protections as the original analysts, may be asked to use the same data, computer model or statistical methods to replicate the analytic results reported in the original study. See, e.g., "Reanalysis of the Harvard Six Cities Study and the American Cancer Society Study of Particulate Air Pollution and Mortality," A Special Report of the Health Effects Institute's Particle Epidemiology Reanalysis Project, Cambridge, MA, 2000.

The primary benefit of public transparency is not necessarily that errors in analytic results will be detected, although error correction is clearly valuable. The more important benefit of transparency is that the public will be able to assess how much an agency's analytic result hinges on the specific analytic choices made by the agency. Concreteness about analytic choices allows, for example, the implications of alternative technical choices to be readily assessed. This type of sensitivity analysis is widely regarded as an essential feature of high quality analysis, yet sensitivity analysis cannot be undertaken by outside parties unless a high degree of transparency is achieved. The OMB guidelines do not compel such sensitivity analysis as a necessary dimension of quality, but the transparency achieved by reproducibility will allow the public to undertake sensitivity studies of interest.

We acknowledge that confidentiality concerns will sometimes preclude public access as an approach to reproducibility. In response to public comment, we have clarified that such concerns do include interests in "intellectual property." To ensure that the OMB guidelines have sufficient flexibility with regard to analytic transparency, OMB has, in new paragraph V.3.b.ii.B.i, provided agencies an alternative approach for classes or types of analytic results that cannot practically be subject to the reproducibility standard: "[In those situations involving influential scientific, financial, or statistical information * * *] making the data and methods publicly available will assist in determining whether analytic results are reproducible. However, the objectivity standard does not override other compelling interests such as privacy, trade secrets, intellectual property, and other confidentiality protections." Specifically, in cases where reproducibility will not occur due to other compelling interests, we expect agencies (1) to perform robustness checks appropriate to the importance of the information involved, e.g., determining whether a specific statistic is sensitive to the choice of analytic method, and, accompanying the information disseminated, to document their efforts to assure the needed robustness in information quality, and (2) to address in their guidelines the
degree to which they anticipate the opportunity for reproducibility to be limited by the confidentiality of underlying data. As we state in new paragraph V.3.b.ii.B.ii, "In situations where public access to data and methods will not occur due to other compelling interests, agencies shall apply especially rigorous robustness checks to analytic results and document what checks were undertaken. Agency guidelines shall, however, in all cases, require a disclosure of the specific data sources that have been used and the specific quantitative methods and assumptions that have been employed."

Given the differences in the many Federal agencies covered by these guidelines, and the differences in the robustness checks and the level of detail for documentation thereof that might be appropriate for different agencies, we also believe it will be helpful if agencies elaborate on these matters in the context of their missions and duties, with due consideration of the nature of the information they disseminate. As we state in new paragraph V.3.b.ii.B.ii, "Each agency is authorized to define the type of robustness checks, and the level of detail for documentation thereof, in ways appropriate for it given the nature and multiplicity of issues for which the agency is responsible."

We leave the determination of the appropriate degree of rigor to the discretion of agencies and the relevant scientific and technical communities that work with the agencies. We do, however, establish a general standard for the appropriate degree of rigor in our expanded definition of "reproducibility" in paragraph V.10: "'Reproducibility' means that the information is capable of being substantially reproduced, subject to an acceptable degree of imprecision. For information judged to have more (less) important impacts, the degree of imprecision that is tolerated is reduced (increased)." OMB will review each agency's treatment of this issue when reviewing the agency guidelines as a whole.

Comments also expressed concerns regarding interim final paragraph V.3.b.iii ("making the data and models publicly available will assist in determining whether analytic results are capable of being substantially reproduced") and whether it could be interpreted to constitute public dissemination of these materials, rendering moot the reproducibility test. (For the equivalent provision, see new paragraph V.3.b.ii.B.i.) The OMB guidelines do not require agencies to reproduce each disseminated analytic result by independent reanalysis. Thus, public dissemination of data and models per se does not mean that the analytic result has been reproduced. It means only that the result should be CAPABLE of being reproduced. The transparency associated with this capability of reproduction is what the OMB guidelines are designed to achieve.

We also want to build on a general observation that we made in our final guidelines published in September 2001. In those guidelines we stated: "* * * in those situations involving influential scientific[, financial,] or statistical information, the substantial reproducibility standard is added as a quality standard above and beyond some peer review quality standards" (66 FR 49722 (September 28, 2001)). A hypothetical example may serve to illustrate this point. Assume that two Federal agencies initiated or sponsored the dissemination of five scientific studies after October 1, 2002 (see paragraph III.4) that were, before dissemination, subjected to formal, independent, external peer review, i.e., that met the presumptive standard for "objectivity" under paragraph V.3.b.i. Further assume, at the time of dissemination, that neither agency reasonably expected that the dissemination of any of these studies would have "a clear and substantial impact" on important public policies, i.e., that these studies were not considered "influential" under paragraph V.9, and thus not subject to the reproducibility standards in paragraphs V.3.b.ii.A or B. Then assume, two years later, in 2005, that one of the agencies decides to issue an important and far-reaching regulation based clearly and substantially on the agency's evaluation of the analytic results set forth in these five studies, and that such agency reliance on these five studies as published in the agency's notice of proposed rulemaking would constitute dissemination of these five studies. These guidelines would require the rulemaking agency, prior to publishing the notice of proposed rulemaking, to evaluate these five studies to determine if the analytic results stated therein would meet the "capable of being substantially reproduced" standards in paragraph V.3.b.ii.B and, if necessary, related standards governing original and supporting data in paragraph V.3.b.ii.A. If the agency were to decide that any of the five studies would not meet the reproducibility standard, the agency may still rely on them, but only if they satisfy the transparency standard and, as applicable, the disclosure of robustness checks required by these guidelines. Otherwise, the agency should not disseminate any of the studies that did not meet the applicable standards in the guidelines at the time it publishes the notice of proposed rulemaking.

Some comments suggested that OMB consider replacing the reproducibility standard with a standard concerning "confirmation" of results for influential scientific and statistical information. Although we encourage agencies to consider "confirmation" as a relevant standard, at least in some cases, for assessing the objectivity of original and supporting data, we believe that "confirmation" is too stringent a standard to apply to analytic results. Often the regulatory impact analysis prepared by an agency for a major rule, for example, will be the only formal analysis of an important subject. It would be unlikely that the results of the regulatory impact analysis had already been confirmed by other analyses. The "capable of being substantially reproduced" standard is less stringent than a "confirmation" standard because it simply requires that an agency's analysis be sufficiently transparent that another qualified party could replicate it through reanalysis.

Health, Safety, and Environmental Information. We note, in the scientific context, that in 1996 the Congress, for health decisions under the Safe Drinking Water Act, adopted a basic standard of quality for the use of science in agency decisionmaking. Under 42 U.S.C. 300g-1(b)(3)(A), an agency is directed, "to the degree that an Agency action is based on science," to use "(i) the best available, peer-reviewed science and supporting studies conducted in accordance with sound and objective scientific practices; and (ii) data collected by accepted methods or best available methods (if the reliability of the method and the nature of the decision justifies use of the data)."

We further note that in the 1996 amendments to the Safe Drinking Water Act, Congress adopted a basic quality standard for the dissemination of public information about risks of adverse health effects. Under 42 U.S.C. 300g-1(b)(3)(B), the agency is directed, "to ensure that the presentation of information [on risk] effects is comprehensive, informative, and understandable." The agency is further directed, "in a document made available to the public in support of a regulation [to] specify, to the extent practicable (i) each population addressed by any estimate [of applicable risk effects]; (ii) the expected risk or central estimate of
risk for the specific populations [affected]; (iii) each appropriate upper-bound or lower-bound estimate of risk; (iv) each significant uncertainty identified in the process of the assessment of [risk] effects and the studies that would assist in resolving the uncertainty; and (v) peer-reviewed studies known to the [agency] that support, are directly relevant to, or fail to support any estimate of [risk] effects and the methodology used to reconcile inconsistencies in the scientific data."

As suggested in several comments, we have included these congressional standards directly in new paragraph V.3.b.ii.C, and made them applicable to the information disseminated by all the agencies subject to these guidelines: "With regard to analysis of risks to human health, safety and the environment maintained or disseminated by the agencies, agencies shall either adopt or adapt the quality principles applied by Congress to risk information used and disseminated pursuant to the Safe Drinking Water Act Amendments of 1996 (42 U.S.C. 300g-1(b)(3)(A) & (B))." The word "adapt" is intended to provide agencies flexibility in applying these principles to various types of risk assessment.

Comments also argued that the continued flow of vital information from agencies responsible for disseminating health and medical information to medical providers, patients, and the public may be disrupted due to these peer review and reproducibility standards. OMB responded by adding to new paragraph V.3.b.ii.C: "Agencies responsible for dissemination of vital health and medical information shall interpret the reproducibility and peer review standards in a manner appropriate to assuring the timely flow of vital information from agencies to medical providers, patients, health agencies, and the public. Information quality standards may be waived temporarily by agencies under urgent situations (e.g., imminent threats to public health or homeland security) in accordance with the latitude specified in agency-specific guidelines."

Administrative Correction Mechanisms. In addition to commenting on the substantive standards in these guidelines, many of the comments noted that the OMB guidelines on the administrative correction of information do not specify a time period in which the agency investigation and response must be made. OMB has added the following new paragraph III.3.i to direct agencies to specify appropriate time periods in which the investigation and response need to be made: "Agencies shall specify appropriate time periods for agency decisions on whether and how to correct the information, and agencies shall notify the affected persons of the corrections made."

Several comments stated that the OMB guidelines needed to direct agencies to consider incorporating an administrative appeal process into their administrative mechanisms for the correction of information. OMB agreed, and added the following new paragraph III.3.ii: "If the person who requested the correction does not agree with the agency's decision (including the corrective action, if any), the person may file for reconsideration within the agency. The agency shall establish an administrative appeal process to review the agency's initial decision, and specify appropriate time limits in which to resolve such requests for reconsideration." Recognizing that many agencies already have a process in place to respond to public concerns, it is not necessarily OMB's intent to require these agencies to establish a new or different process. Rather, our intent is to ensure that agency guidelines specify an objective administrative appeal process that, upon further complaint by the affected person, reviews an agency's decision to disagree with the correction request. An objective process will ensure that the office that originally disseminates the information does not have responsibility for both the initial response and resolution of a disagreement. In addition, the agency guidelines should specify that if the agency believes other agencies may have an interest in the resolution of any administrative appeal, the agency should consult with those other agencies about their possible interest. Overall, OMB does not envision administrative mechanisms that would burden agencies with frivolous claims. Instead, the correction process should serve to address the genuine and valid needs of the agency and its constituents without disrupting agency processes. Agencies, in making their determination of whether or not to correct information, may reject claims made in bad faith or without justification, and are required to undertake only the degree of correction that they conclude is appropriate for the nature and timeliness of the information involved, and explain such practices in their annual fiscal year reports to OMB.

OMB's issuance of these final guidelines is the beginning of an evolutionary process that will include draft agency guidelines, public comment, final agency guidelines, development of experience with OMB and agency guidelines, and continued refinement of both OMB and agency guidelines. Just as OMB sought public comment before issuing these final guidelines, OMB will refine these guidelines as experience develops and further public comment is obtained.

Dated: December 21, 2001.

John D. Graham,
Administrator, Office of Information and Regulatory Affairs.

Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies

I. OMB Responsibilities

Section 515 of the Treasury and General Government Appropriations Act for FY2001 (Public Law 106-554) directs the Office of Management and Budget to issue government-wide guidelines that provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information, including statistical information, disseminated by Federal agencies.

II. Agency Responsibilities

Section 515 directs agencies subject to the Paperwork Reduction Act (44 U.S.C. 3502(1)) to:

1. Issue their own information quality guidelines ensuring and maximizing the quality, objectivity, utility, and integrity of information, including statistical information, disseminated by the agency no later than one year after the date of issuance of the OMB guidelines;

2. Establish administrative mechanisms allowing affected persons to seek and obtain correction of information maintained and disseminated by the agency that does not comply with these OMB guidelines; and

3. Report to the Director of OMB the number and nature of complaints received by the agency regarding agency compliance with these OMB guidelines concerning the quality, objectivity, utility, and integrity of information and how such complaints were resolved.

III. Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies

1. Overall, agencies shall adopt a basic standard of quality (including objectivity, utility, and integrity) as a performance goal and should take appropriate steps to incorporate information quality criteria into agency information dissemination practices. Quality is to be ensured and established at levels appropriate to the nature and timeliness of the information to be
shall specify appropriate time periods guidelines. Just as OMB requested disseminated. Agencies shall adopt
Federal Register/Vol. 67, No. 36/Friday, February 22, 2002/Notices 8459
specific standards of quality that are appropriate for the various categories of information they disseminate.

2. As a matter of good and effective agency information resources management, agencies shall develop a process for reviewing the quality (including the objectivity, utility, and integrity) of information before it is disseminated. Agencies shall treat information quality as integral to every step of an agency's development of information, including creation, collection, maintenance, and dissemination. This process shall enable the agency to substantiate the quality of the information it has disseminated through documentation or other means appropriate to the information.

3. To facilitate public review, agencies shall establish administrative mechanisms allowing affected persons to seek and obtain, where appropriate, timely correction of information maintained and disseminated by the agency that does not comply with OMB or agency guidelines. These administrative mechanisms shall be flexible, appropriate to the nature and timeliness of the disseminated information, and incorporated into agency information resources management and administrative practices.

i. Agencies shall specify appropriate time periods for agency decisions on whether and how to correct the information, and agencies shall notify the affected persons of the corrections made.

ii. If the person who requested the correction does not agree with the agency's decision (including the corrective action, if any), the person may file for reconsideration within the agency. The agency shall establish an administrative appeal process to review the agency's initial decision, and specify appropriate time limits in which to resolve such requests for reconsideration.

4. The agency's pre-dissemination review, under paragraph III.2, shall apply to information that the agency first disseminates on or after October 1, 2002. The agency's administrative mechanisms, under paragraph III.3, shall apply to information that the agency disseminates on or after October 1, 2002, regardless of when the agency first disseminated the information.

IV. Agency Reporting Requirements

1. Agencies must designate the Chief Information Officer or another official to be responsible for agency compliance with these guidelines.

2. The agency shall respond to complaints in a manner appropriate to the nature and extent of the complaint. Examples of appropriate responses include personal contacts via letter or telephone, form letters, press releases or mass mailings that correct a widely disseminated error or address a frequently raised complaint.

3. Each agency must prepare a draft report, no later than April 1, 2002, providing the agency's information quality guidelines and explaining how such guidelines will ensure and maximize the quality, objectivity, utility, and integrity of information, including statistical information, disseminated by the agency. This report must also detail the administrative mechanisms developed by that agency to allow affected persons to seek and obtain appropriate correction of information maintained and disseminated by the agency that does not comply with the OMB or the agency guidelines.

4. The agency must publish a notice of availability of this draft report in the Federal Register, and post this report on the agency's web site, to provide an opportunity for public comment.

5. Upon consideration of public comment and after appropriate revision, the agency must submit this draft report to OMB for review regarding consistency with these OMB guidelines no later than July 1, 2002. Upon completion of that OMB review and completion of this report, agencies must publish notice of the availability of this report in its final form in the Federal Register, and post this report on the agency's web site no later than October 1, 2002.

6. On an annual fiscal-year basis, each agency must submit a report to the Director of OMB providing information (both quantitative and qualitative, where appropriate) on the number and nature of complaints received by the agency regarding agency compliance with these OMB guidelines and how such complaints were resolved. Agencies must submit these reports no later than January 1 of each following year, with the first report due January 1, 2004.

V. Definitions

1. "Quality" is an encompassing term comprising utility, objectivity, and integrity. Therefore, the guidelines sometimes refer to these four statutory terms, collectively, as "quality."

2. "Utility" refers to the usefulness of the information to its intended users, including the public. In assessing the usefulness of information that the agency disseminates to the public, the agency needs to consider the uses of the information not only from the perspective of the agency but also from the perspective of the public. As a result, when transparency of information is relevant for assessing the information's usefulness from the public's perspective, the agency must take care to ensure that transparency has been addressed in its review of the information.

3. "Objectivity" involves two distinct elements, presentation and substance.

a. "Objectivity" includes whether disseminated information is being presented in an accurate, clear, complete, and unbiased manner. This involves whether the information is presented within a proper context. Sometimes, in disseminating certain types of information to the public, other information must also be disseminated in order to ensure an accurate, clear, complete, and unbiased presentation. Also, the agency needs to identify the sources of the disseminated information (to the extent possible, consistent with confidentiality protections) and, in a scientific, financial, or statistical context, the supporting data and models, so that the public can assess for itself whether there may be some reason to question the objectivity of the sources. Where appropriate, data should have full, accurate, transparent documentation, and error sources affecting data quality should be identified and disclosed to users.

b. In addition, "objectivity" involves a focus on ensuring accurate, reliable, and unbiased information. In a scientific, financial, or statistical context, the original and supporting data shall be generated, and the analytic results shall be developed, using sound statistical and research methods.

i. If data and analytic results have been subjected to formal, independent, external peer review, the information may generally be presumed to be of acceptable objectivity. However, this presumption is rebuttable based on a persuasive showing by the petitioner in a particular instance. If agency-sponsored peer review is employed to help satisfy the objectivity standard, the review process employed shall meet the general criteria for competent and credible peer review recommended by OMB-OIRA to the President's Management Council (9/20/01) (http://www.whitehouse.gov/omb/inforeg/oira_review-process.html), namely, "that (a) peer reviewers be selected primarily on the basis of necessary technical expertise, (b) peer reviewers be expected to disclose to agencies prior technical/policy positions they may have taken on the issues at hand, (c) peer reviewers be expected to disclose to agencies their sources of personal and
institutional funding (private or public sector), and (d) peer reviews be conducted in an open and rigorous manner."

ii. If an agency is responsible for disseminating influential scientific, financial, or statistical information, agency guidelines shall include a high degree of transparency about data and methods to facilitate the reproducibility of such information by qualified third parties.

A. With regard to original and supporting data related thereto, agency guidelines shall not require that all disseminated data be subjected to a reproducibility requirement. Agencies may identify, in consultation with the relevant scientific and technical communities, those particular types of data that can practicably be subjected to a reproducibility requirement, given ethical, feasibility, or confidentiality constraints. It is understood that reproducibility of data is an indication of transparency about research design and methods and thus a replication exercise (i.e., a new experiment, test, or sample) shall not be required prior to each dissemination.

B. With regard to analytic results related thereto, agency guidelines shall generally require sufficient transparency about data and methods that an independent reanalysis could be undertaken by a qualified member of the public. These transparency standards apply to agency analysis of data from a single study as well as to analyses that combine information from multiple studies.

i. Making the data and methods publicly available will assist in determining whether analytic results are reproducible. However, the objectivity standard does not override other compelling interests such as privacy, trade secrets, intellectual property, and other confidentiality protections.

ii. In situations where public access to data and methods will not occur due to other compelling interests, agencies shall apply especially rigorous robustness checks to analytic results and document what checks were undertaken. Agency guidelines shall, however, in all cases, require a disclosure of the specific data sources that have been used and the specific quantitative methods and assumptions that have been employed. Each agency is authorized to define the type of robustness checks, and the level of detail for documentation thereof, in ways appropriate for it given the nature and multiplicity of issues for which the agency is responsible.

C. With regard to analysis of risks to human health, safety and the environment maintained or disseminated by the agencies, agencies shall either adopt or adapt the quality principles applied by Congress to risk information used and disseminated pursuant to the Safe Drinking Water Act Amendments of 1996 (42 U.S.C. 300g-1(b)(3)(A) & (B)). Agencies responsible for dissemination of vital health and medical information shall interpret the reproducibility and peer-review standards in a manner appropriate to assuring the timely flow of vital information from agencies to medical providers, patients, health agencies, and the public. Information quality standards may be waived temporarily by agencies under urgent situations (e.g., imminent threats to public health or homeland security) in accordance with the latitude specified in agency-specific guidelines.

4. "Integrity" refers to the security of information - protection of the information from unauthorized access or revision, to ensure that the information is not compromised through corruption or falsification.

5. "Information" means any communication or representation of knowledge such as facts or data, in any medium or form, including textual, numerical, graphic, cartographic, narrative, or audiovisual forms. This definition includes information that an agency disseminates from a web page, but does not include the provision of hyperlinks to information that others disseminate. This definition does not include opinions, where the agency's presentation makes it clear that what is being offered is someone's opinion rather than fact or the agency's views.

6. "Government information" means information created, collected, processed, disseminated, or disposed of by or for the Federal Government.

7. "Information dissemination product" means any book, paper, map, machine-readable material, audiovisual production, or other documentary material, regardless of physical form or characteristic, an agency disseminates to the public. This definition includes any electronic document, CD-ROM, or web page.

8. "Dissemination" means agency initiated or sponsored distribution of information to the public (see 5 CFR 1320.3(d) (definition of "Conduct or Sponsor")). Dissemination does not include distribution limited to government employees or agency contractors or grantees; intra- or inter-agency use or sharing of government information; and responses to requests for agency records under the Freedom of Information Act, the Privacy Act, the Federal Advisory Committee Act or other similar law. This definition also does not include distribution limited to correspondence with individuals or persons, press releases, archival records, public filings, subpoenas or adjudicative processes.

9. "Influential", when used in the phrase "influential scientific, financial, or statistical information", means that the agency can reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions. Each agency is authorized to define "influential" in ways appropriate for it given the nature and multiplicity of issues for which the agency is responsible.

10. "Reproducibility" means that the information is capable of being substantially reproduced, subject to an acceptable degree of imprecision. For information judged to have more (less) important impacts, the degree of imprecision that is tolerated is reduced (increased). If agencies apply the reproducibility test to specific types of original or supporting data, the associated guidelines shall provide relevant definitions of reproducibility (e.g., standards for replication of laboratory data). With respect to analytic results, "capable of being substantially reproduced" means that independent analysis of the original or supporting data using identical methods would generate similar analytic results, subject to an acceptable degree of imprecision or error.

[FR Doc. 02-59 Filed 1-2-02; 1:36 pm]
BILLING CODE 3110-01-M

Editorial Note: Due to numerous errors, this document is being reprinted in its entirety. It was originally printed in the Federal Register on Thursday, January 3, 2002 at 67 FR 369-378 and was corrected on Tuesday, February 5, 2002 at 67 FR 5365.

[FR Doc. R2-59 Filed 2-21-02; 8:45 am]
BILLING CODE 1505-01-D
Unknown
Sir,
Rick Newsome suspects that Mr. Barry Groveman, of the Inland Empire Perchlorate Task Force (IEPTF),
may contact Mr. DuBois in a further attempt to gain a commitment from DoD to participate in
Potentially Responsible Party (PRP) negotiations for cleanup of perchlorate contamination in drinking
water, in San Bernardino, California. The info paper enclosed is the latest update on the site, "Rialto."
Recommend that the MAs forward any calls to Col George Ledbetter of OGC. He is up
to date on the legal issues involved.
Kurt
Rialto Ammunition Storage Point/Inland Empire Site
Issue: Mr. Barry Groveman, of the Inland Empire Perchlorate Task Force (IEPTF), may contact
Mr. DuBois in a further attempt to gain a commitment from DoD to participate in Potentially
Responsible Party (PRP) negotiations for cleanup of perchlorate contamination in drinking water, in
San Bernardino, California.
Background: As part of the Formerly Used Defense Sites (FUDS) program, the Corps of
Engineers completed two eligibility assessments on the property, both of which suggested that
perchlorate was not present during the time of DOD ownership and control of the former Rialto
Ammunition Storage Point. The assessments are not 100% conclusive, and the Corps is
considering expanding the eligibility assessment specifically to address potential perchlorate
contamination, in FY 2003. Subsequent to DOD ownership, records indicate that several DOD
contractors operated at the property, with contracts with both the Air Force and Navy, and may have
been indemnified for their activities. Therefore, in regard to liability, the site appears to have little
FUDS interest, although it could evolve into a complex contractual/PRP situation for DoD.
Current Status: In a conference call on August 15, 2002, representatives of IEPTF requested Army
agree to participate in PRP negotiations allocating financial responsibility for cleanup of perchlorate
contamination in the drinking water of four municipalities near San Bernardino, California. IEPTF
alleged that the former Rialto Ammunition Storage Point contributed to the contamination at the
site. In subsequent telephone calls, Mr. Ryan Heite and Mr. Barry Groveman, of IEPTF, requested
Army designate an individual to participate in the PRP negotiations.
On September 4, 2002, Army informed Mr. Groveman that before they could respond to IEPTF on
any of these issues, they must first establish whether or not DOD caused or controlled activities that
could have potentially contributed to the perchlorate contamination at the site. The Army also
informed Mr. Groveman that the Department of Justice (DoJ) would represent the Army in any
discussion, or negotiation, of DOD liability at this property. Further, even if Army and IEPTF
were to come to agreement on the degree of contribution at the site, DoD could not commit to any
direct payments to IEPTF municipalities.
(b)(5)
Perchlorates - OSD Human Pathway Question
Sandy,
I am told you requested information on Army installations/locations where there may be a completed
pathway for human exposure to perchlorate. Ms. Van Brocklin et al. have prepared the information provided at
the attachment. It is provided for your use as appropriate.
Rick
-'-··Original Message-"-'
From: Van Brocklin, Connie H Ms ACSIM
Sent: Wednesday, June 25, 2003 8:47 AM
To: Newsome, Richard E, Mr, ASA-I&E
Cc: Garg, Malcolm, ACSIM/CH2M HILL
Subject: Perchlorates Info for Ms. Cotter
Rick,
Sandy Cotter called last week and said that Ben Cohen also wanted to know if there is an exposure
pathway for the sites where we have quantitative data. If you approve, can send to Ms. Cotter?
Thanks,
Connie
(b)(2)
DAIM-EDT
25 June 2003
Installations have been asked to address the question, "Is there a perchlorate
exposure pathway that could threaten public health?" Determinations of
pathways often require rigorous site investigation and interpretation of data.
These are preliminary answers based on best available information.
fyi
-----Original Message----
From: Bowling, Curtis, Mr, OSD-ATL
Sent: Tuesday, March 23, 2004 8:46 AM
To: Kratz, Kurt, , OSD-ATL; Larkin, Janice, Ms, OSD-ATL; Beard, Bruce, Mr,
OSD-ATL
Subject: FW: Perchlorate White paper
fyi
-----Original Message----
From: Bowling, Curtis, Mr, OSD-ATL
Sent: Thursday, March 18, 2004 9:32 AM
To: Beehler, Alex, Mr, OSD-ATL; Cohen, Ben, Mr, DoD OGC
Cc: Wright, William, CAPT, OSD-ATL; Kiser,
Richard, CAPT, DDESB; Bowling, Curtis, Mr, OSD-ATL; Kaminski, Art, LtCol,
OSD-ATL; Nicholls, William, Mr, OSD-ATL
Subject: Perchlorate White paper
PerchlorateWhitePaper.doc
Curtis
Document 890
Exemption 5
Christina,
As promised. Sorry it took a couple of days. Remember - close hold, background use for you.
Certainly don't mind if you use these ideas to start doing investigation on your own.
Thx,
Kurt
7/19/2005
Document 290
Title: Draft DoD Perchlorate Site Characterization and Treatment Cost Estimate
Exemption 5
Document 290
Exemption 5
Document 290
Exemption 5
Issue Background
• Perchlorate case
- Establishment of standards by guidance rather than rulemaking
o OMB lead
• Chemistry
Stable salt in soil and groundwater
Chile
o Potash formation
o 110 ppb in drinking water in surrounding communities
• Industrial Uses:
Energetic for solid rockets and munitions (~90 percent)
Fireworks, airbag inflators, medicines (~10 percent)
• Health effects
Iodide uptake inhibitor in thyroid (replaces iodide at receptors)
EPA identified sensitive subpopulation as fetal
Extremely high doses may cause developmental problems
Perchlorate Fundamentals
- Voluntary
o 2001 survey
• State standards
California: action level at 4 - 18 ppb
Massachusetts: interim drinking water advice level at 1 ppb for sensitive populations
o Industrial at 7 - 10 ppb
o Residential at 4 ppb
• Mission effects
Aberdeen
o Stopped training on selected ranges
Vandenberg
decision-making
regulatory concerns
• Sampling Policy
November 2002
o DoD Components may assess perchlorate occurrence if:
• Reasonable basis for presence
• Pathway that impacts public health
o Allows Components to react to regulator requests
• California Outreach
- Conducted by ADUSD(E)
Agreement to establish working group with California regulatory agencies
o Coordinated response to California request for all DoD installations to
sample for perchlorate plus five other unregulated chemicals
Agreement to work with Southern California water agencies
o Cooperation in testing, validation, and certification of drinking water
treatment technologies
• N-Nitrosodimethylamine (NDMA)
Component of rocket fuel
Probable human carcinogen identified by EPA
California action level is 10 ng/L
• 1,4 Dioxane
Stabilizer for chlorinated solvents
Probable human carcinogen (low hazard) identified by EPA
California action level is 2 micrograms/L
Paint and varnish remover, cleaning, degreasing agent
Cal EPA survey request
Hexavalent Chromium
Metal plating, corrosion inhibitor
No state or federal regulation
Regulation by California possible by Jan 2004 (Senate Bill 351)
Unknown
From: Kratz, Kurt, ,OSD-ATL
Sent: Wednesday, August 06, 2003 09:42
To: Cotter, Sandra, Ms, OSD-ATL
Subject: Perchlorate
fyi
Unknown
From: Kratz, Kurt, , OSD-ATL
Sent: Wednesday, August 06, 2003 09:42
To: 'Kowalczyk Daniel'
Subject: RE: FW: EPA-NAS contract
-----Original Message----
From: Kowalczyk Daniel [mailto: ]
Sir,
v/r
Dan
>
> -----Original Message----
> From: Cruz Angelyn Ctr. SAF/IEE
> Sent: Monday, August 04, 2003 10:13 AM
> To: Cornell Jeff Lt. Col SAF/IE
> Subject: FW: requested scanned documents
>
> -----Original Message----
> From: Cruz Angelyn Ctr. SAF/IEE
> Sent: Friday, July 11, 2003 8:33 AM
> To: Cornell Jeff Lt. Col SAF/IE
> Subject: requested scanned documents
>
> «NAS Perchlorate Task Order.doc»
>
> Angelyn A. Cruz
> Office of the Deputy Assistant Secretary
> (Environment, Safety & Occupational Health)
>
>
> Name: NAS Perchlorate Task Order.doc
> Type: WINWORD File (application/msword)
> Encoding: base64
> Download Status: Not downloaded with message
Daniel Kowalczyk
8283 Greensboro Dr
McLean, VA 22102
(b)(2)
Unknown
Okay, but what exactly is it you are trying to tell me to do about this?
If you want me to go to Connaughton, I can, but I will need a copy of the
document we think we should have and the document NAS is using, and
advise that our version is the proper one. I don't mind engaging
principals on this if we should.
Thanks,
MK
-----Original Message----
From: Cornell Jeff Lt. Col SAF/IE
Sent: Tuesday, August 05, 2003 9:37 PM
To: Koetz Maureen SES SAF/IE; Cohen, Ben, Mr, DoD OGC; Kratz, Kurt, ,
OSD-ATL; Meehan, Patrick, Mr, OSD-ATL; Rogers Daniel Col AFLSA/JACE
Subject: PRIVATE: FW: Perchlorate SOW on NAS website different from our
negotiated position
Importance: High
vr,
jeff
-----Original Message----
From: Cornell Jeff Lt. Col SAF/IE
To:
Sent: 8/5/2003 9:22 PM
Subject: FW: Perchlorate SOW on NAS website different from our
negotiated position
Importance: High
«Final Task Order SOW 5-13-03.doc»
Paul - as we discussed. I left another voicemail for Paul ... let's hope
this is easily resolved. I'll call you tomorrow.
Jeff
-----Original Message----
From: Cornell Jeff Lt. Col SAF/IE
To: Panastas (E-mail)
Sent: 8/4/2003 9:18 AM
Subject: Perchlorate SOW on NAS website different from our negotiated
position
Importance: High
thanks,
jeff
Unknown
From: Kratz, Kurt, , OSD-ATL
Sent: Wednesday, August 06, 2003 09:38
To: Meehan, Patrick, Mr, OSD-ATL
SUbject: FW: DOD/NASA/DOE Discussion
fyi
Message----
Rusden
Civ SAF/IEE; Ashworth Richard Col SAF/IE
Lesly
jeff
-----Original Message----
From: Bubar, Patricia
To:
Cc: Guevara, Karen; Rowley, Blaine
Sent: 8/5/2003 6:56 PM
Subject: DOD/NASA/DOE Discussion
1
Thanks
Patty Bubar
Page 1 of 1
Unknown
-----Original Message----
From: Koetz Maureen SES SAF/IE
Sent: Wednesday, August 06, 2003 8:59 AM
To: Kratz, Kurt, , OSD-ATL
Subject: RE: Brief to Wynne
Ma'am,
Mr. Grone postponed the meeting with Wynne. He wants to vet the brief with DUSD(IP) - Suzanne Patrick,
Director, Defense Systems - Glen Lamartin, and Director, Defense Procurement and Acquisition Policy,
Deidre Lee, for their comments. The next time both Mr. Wynne and Mr. Grone are in the building at the same
time will be the first week in Sep. That is the proposed new brief date.
Kurt
Daniel Kowalczyk
Booz Allen Hamilton
8283 Greensboro Dr
McLean, VA 22102
(b)(2)
Document 232
Exemption 5
PERCHLORATE REMEDIATION
.:. The potential interaction between perchlorate remediation and RRPI occurs in the legislation's
proposed changes to RCRA and CERCLA. The legislation also proposes some changes to
the Marine Mammal Protection Act, the Endangered Species Act, and the Clean Air Act.
.:. The changes to RCRA seek to codify EPA's 1997 Military Munitions Rule into statute. This
1997 rule drew the line between ranges and range activities that are exempt and those that are
subject to solid and hazardous waste regulation.
•:. The RRPI legislation would exclude from the definition of a "solid waste" under the Solid
Waste Disposal Act, explosives, unexploded ordnance (UXO), munitions or their
constituents when used for the following purposes:
~ Research and development, testing and evaluation of munitions and weapons systems;
~ Where they are deposited on an operational range incident to their expected use or are
•:. Then, to be clear, it includes in the statutory definition of solid waste explosives, unexploded
ordnance, munitions or their constituents which:
> Are removed from an operational range for reclamation, treatment, storage or disposal;
> Are deposited off-range incident to normal use and are not promptly rendered safe or
retrieved.
•:. The bill clarifies that such munitions or ordnance remaining on closed ranges remain subject
to existing legal requirements.
•:. The proposed changes to CERCLA are more complex. The major changes to DoD's
activities under CERCLA include:
~ It would exempt DoD from CERCLA's section 103 requirement to report releases of
hazardous substances.
~ It would eliminate the requirement under CERCLA section 120 that EPA assure that
DoD has conducted a preliminary assessment at all applicable Federal facilities.
~ It retains EPA's authority under Section 106 of CERCLA to order clean up - even on
active ranges exempt under the RCRA portions of RRPI and current regulations - if EPA
finds an imminent and substantial danger to public health.
How Does the Legislation Affect Current and Future Perchlorate Cleanups?
.:. The scope of activities covered by the proposed legislation is identical to, and no broader
than, those currently covered under the Clinton Administration's 1997 Military Munitions
Rule. Thus, nothing new would be exempted under RRPI that isn't already exempted by
Federal regulation.
•:. However, only about 30 States have adopted the Munitions rule in their State regulations.
Since States can be more stringent than EPA's Munitions rule, the RRPI bill would preclude
States from adopting more stringent regulations for active ranges.
•:. Elevating the RCRA exclusions into statute would also prevent States, EPA, and private
parties from filing suits to force cleanup under RCRA's corrective action program for exempt
activities at operational ranges.
•:. DoD argues that the legislation hannonizes the numerous provisions under RCRA and
CERCLA for Federal, State, and private party litigation. IfRRPI became law, EPA would be
able to address problems under its CERCLA 104 and 106 authorities. If EPA and DoD failed
to devise a solution, private parties and States could then initiate action since the constituents
were "deposited off-range incident to normal use and are not promptly rendered safe or
retrieved." Thus, the legislation elevates potential conflicts between operational readiness
and the pace of environmental cleanup to the Federal level for resolution.
.:. RRPI would not change the scope, pace, and authority of EPA, States, and private parties for
perchlorate cleanups of:
~ Closed ranges;
~ Spills and releases 'at private ordnance manufacturing and storage facilities;
.:. The vast majority of sites with known perchlorate releases would not be changed by RRPI.
Since perchlorate has only been detected since 1997, current site investigations and cleanups
have been carried out with the Munitions Rule in effect. Since current cleanup
sites have not received exemptions under the Munitions Rule, this activity is evidence that
their cleanups would be unaffected by RRPI.
Are Private Parties Exempted by RRPI?
.:. RRPI would codify the existing regulatory exemptions for private parties that engage in
exempt activities with munitions at test ranges they own. For example, if a private
manufacturer operated an active firing range to conduct research, development, and testing of
munitions, its range clearing activities meeting the regulatory definition would be exempt.
.:. Releases from munitions that migrate off any active range are - and would continue to be -
subject to Federal and State enforcement. Again, RRPI only codifies the existing provisions
of the Munitions Rule. EPA's authority to order cleanups or investigations currently
underway would not change.
•:. This narrow provision would appear not to exempt the most well-known perchlorate releases
from private facilities. Most perchlorate releases appear to be from closed manufacturing
operations, munitions disposal areas, and closed testing facilities.
.:. An environmental group, the Center for Public Environmental Oversight (CPEO), and its
director, Lenny Siegel, have circulated a "connect the dots" argument that purports to show
how RRPI exempts DoD from its perchlorate liability. This argument, and its numerous flaws,
are reviewed below.
~ DoD contends that perchlorate cleanup is not required, except under statutory authority.
~ The Bush Administration and DoD, in particular, are likely to delay promulgation of a
perchlorate standard.
~ However, DoD's proposed RRPI language would limit the applicability of California's
(and other States') standards under CERCLA and RCRA at operational ranges and
possibly other sites.
~ The argument questions the validity of testimony given by EPA Assistant Administrator
Suarez that the Safe Drinking Water Act (SDWA) gives EPA the tools it needs to address
groundwater issues at sites that might be exempted from RCRA and CERCLA under the
proposed legislation. It also claims that the SDWA does not recognize State standards.
~ Thus, DoD hopes to avoid full regulation of its perchlorate sites even if California and
other States issue protective standards.
(b)(5)
Perchlorate Executive Briefing Book
Table of Contents
Exemption 5
Sandy,
Toxicity Data
Do you have a preference for which one gets finished next? If so,
David R. Green
Hamilton 4063
(b)(2) (v)
(f)
Use of the Data Quality Act in Challenges to Health Based Standards
Problem
The Department of Defense (DoD) has repeatedly expressed concerns regarding the data used by
the U.S. Environmental Protection Agency (EPA) in the January 2002 draft perchlorate risk
assessment. This draft risk assessment proposed a reference dose (RfD) and drinking water
equivalent level (DWEL) of 1 ppb, based in part on an EPA decision to discard many of the
results and conclusions of both a 1998 risk characterization and the conclusions and
recommendations from the peer review of that risk assessment that EPA conducted in late 1998
and early 1999. Despite repeated challenges to the science and health effects analyses underlying
the decision to set the RfD for perchlorate at 1 ppb, DoD has realized little success at changing
EPA's latest proposal. This paper explores the option of a formal challenge under EPA's
Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of
Information Disseminated by the Environmental Protection Agency¹ (hereinafter the "EPA
Guidelines") (provided at Attachment A).
Discussion
Enacted as part of the Treasury and General Government Appropriations Act for Fiscal Year
2001 (Public Law 106-554, section 515), the Data Quality Act (DQA) amended the Paperwork
Reduction Act (PRA) (44 U.S.C. § 3501 et seq.) (see Attachment B). The DQA requires Federal
agencies to ensure that the information they disseminate meets a standard of quality. The
Office of Management and Budget (OMB) issued Guidelines for Ensuring and Maximizing the
Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies (67
FR 8452, February 22, 2002) (provided at Attachment C). The OMB guidance required all
Federal agencies to:
Federal agencies:
• Issue guidelines "... ensuring and maximizing the quality, objectivity, utility and integrity
of information (including statistical information) disseminated by the agency."
• Establish administrative mechanisms allowing affected persons to seek and obtain
correction of information maintained and disseminated by the agency that does not comply
with the data quality guidelines.
• Periodically report on the number and nature of complaints received by the agency and
how such complaints were handled by the agency.
The EPA Guidelines use a tiered approach that sets the level of quality, objectivity, utility, and
integrity of information products based on the intended uses of those products. Under this
tiered system, the more important the use, the higher the quality standard. As a general matter,
EPA believes that the current statutes, regulations, and scientific practices (including both
internal quality management systems and external peer review) that EPA employs implement
these guidelines. EPA has also stated that information subjected to public review and comment
will not be considered under the EPA Guidelines, and has asserted the view that formal,
independent, external peer review, in and of itself, means information is objective. EPA does,
however, allow this presumption of objectivity to be rebutted. It is this provision allowing
rebuttal of EPA's assertions as to quality, objectivity, utility, and integrity which may offer a
means to challenge the scientific analysis underlying the 2002 perchlorate risk assessment.
DRAFT
Revision 0-0
December 5, 2003
To make use of the mechanism for challenging the quality, objectivity, utility, and integrity of
information used by EPA, a petitioner must submit a Request for Correction (RFC) to EPA. As
discussed in the EPA Guidelines, an RFC must include:
(1) Name and contact information for the individual or organization submitting a complaint;
identification of an individual to serve as a contact.
(2) A description of the information the person believes does not comply with EPA or OMB
guidelines, including specific citations to the information and to the EPA or OMB
guidelines, if applicable.
(3) An explanation of how the information does not comply with EPA or OMB guidelines
and a recommendation of corrective action. EPA considers that the complainant has the
burden of demonstrating that the information does not comply with EPA or OMB
guidelines and that a particular corrective action would be appropriate.
(4) An explanation of how the error affects the requestor or how a correction would benefit
the requestor.
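Because the Guidelines place the burden of a complete petition on the filer, a petitioner may want to verify that all four elements are present before submitting. The sketch below illustrates one way to run such a completeness check; the dictionary keys and helper function are invented for illustration and are not part of the EPA Guidelines or the RFC process.

```python
# Hypothetical checklist for the four RFC elements described above.
# Field names are illustrative only; they do not appear in the EPA Guidelines.
REQUIRED_ELEMENTS = {
    "contact": "name and contact information, with a designated contact person",
    "information_identified": "description of the non-compliant information, with specific citations",
    "explanation": "explanation of the non-compliance and a recommended corrective action",
    "effect_on_requestor": "how the error affects, or a correction would benefit, the requestor",
}

def missing_elements(petition):
    """Return the required RFC elements that are absent or empty in a draft petition."""
    return [key for key in REQUIRED_ELEMENTS if not petition.get(key)]

# A draft petition still lacking two of the four required elements:
draft = {
    "contact": "J. Smith, Example Organization",
    "information_identified": "January 2002 draft perchlorate risk assessment",
}
print(missing_elements(draft))  # the explanation and effect elements are still missing
```

A real petition is a narrative document, not a form; the point of the sketch is only that each of the four elements must be identifiable before EPA will act on the request.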
It is the responsibility of the petitioner to show how the information fails to comply with the
guidelines, and how any proposed actions to address that non-compliance would resolve the data
problem. Once the
petition is submitted to the EPA Office of Environmental Information (OEI), EPA begins the
administrative process for addressing the RFC by sending the RFC to the appropriate EPA office
(i.e., the office responsible for the quality of the information or document). An RFC can only be
rejected by the official at the highest organization level in the responsible EPA Office or Region.
If unsatisfied by the response, a petitioner can appeal the decision by submitting a Request for
Reconsideration (RFR). As with the RFC, the RFR must include an explanation of the
disagreement with EPA's decision and a specific recommendation for addressing the conflict. The
RFR is reviewed by an executive-level panel comprised of EPA personnel. Beyond the
administrative review mechanism of the RFC and RFR, those with legal standing can seek judicial
review (judicial review is not an option available to DoD).
Recommendations
The first action that should be taken is an analysis of the DQA, OMB guidelines, and EPA
guidelines to determine if DoD is in any way prohibited from submitting a challenge under the
DQA and the EPA or OMB Guidelines. For example, under the construct of the unitary executive,
DoD may be prohibited from submitting an RFC challenging the analyses and data underlying the
draft risk assessments. As a matter requiring legal analysis, the Office of the Deputy Under
Secretary of Defense (Installations and Environment) should request a written legal opinion from
the Office of General Counsel. Since this opinion will answer a "go/no go" question, it must be
obtained as quickly as possible.²
Assuming that DoD is not enjoined from seeking a review of the EPA data and analysis
underlying the second draft risk assessment for perchlorate, DoD should begin development of the
RFC documentation. This represents a significant level of effort, since the petition must be
specific with regard to both the problems with the data and the solutions proposed, and requires
examination of hundreds, if not thousands, of pages of documents. The Air Force, which has led
the perchlorate effort for DoD as part of its role as the Executive Agent for the Safe Drinking
Water Act, should be tasked with this responsibility.
² It would also be advisable to request the Office of General Counsel provide guidance on how the DQA and OMB guidelines
affect data collected and used in the execution of the Defense Environmental Restoration Program.
Prepared by:
Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of
Information Disseminated by the Environmental Protection Agency
Table of Contents
1 Introduction
2 EPA Mission and Commitment to Quality
2.1 EPA's Mission and Commitment to Public Access
2.2 Information Management in EPA
2.3 EPA's Relationship with State, Tribal, and Local Governments
3 OMB Guidelines
4 Existing Policies and Procedures that Ensure and Maximize Information Quality
4.1 Quality System
4.2 Peer Review Policy
4.3 Action Development Process
4.4 Integrated Error Correction Process
4.5 Information Resources Management Manual
4.6 Risk Characterization Policy and Handbook
4.7 Program-Specific Policies
4.8 EPA Commitment to Continuous Improvement
4.9 Summary of New Activities and Initiatives
5 Guidelines Scope and Applicability
5.1 What is "Quality" According to the Guidelines?
5.2 What is the Purpose of these Guidelines?
5.3 When Do these Guidelines Apply?
5.4 What is Not Covered by these Guidelines?
5.5 What Happens if Information is Initially Not Covered by these Guidelines, but EPA Subsequently Disseminates it to the Public?
5.6 How does EPA Ensure the Objectivity, Utility, and Integrity of Information that is not covered by these Guidelines?
6 Guidelines for Ensuring and Maximizing Information Quality
6.1 How does EPA Ensure and Maximize the Quality of Disseminated Information?
6.2 How Does EPA Define Influential Information for these Guidelines?
6.3 How Does EPA Ensure and Maximize the Quality of "Influential" Information?
6.4 How Does EPA Ensure and Maximize the Quality of "Influential" Scientific Risk Assessment Information?
6.5 Does EPA Ensure and Maximize the Quality of Information from External Sources?
8.1 What are EPA's Administrative Mechanisms for Affected Persons to Seek and Obtain Correction of Information?
8.3 When Does EPA Intend to Consider a Request for Correction of Information?
8.4 How Does EPA Intend to Respond to a Request for Correction of Information?
8.5 How Does EPA Expect to Process Requests for Correction of Information on ...
8.6 What Should be Included in a Request Asking EPA to Reconsider its Decision on a Request for the Correction of Information?
8.7 How Does EPA Intend to Process Requests for Reconsideration of EPA Decisions?
A.1 Introduction
A.3.5 Reproducibility
1 Introduction
Developed in response to guidelines issued by the Office of Management and Budget (OMB)¹
under Section 515(a) of the Treasury and General Government Appropriations Act for Fiscal
Year 2001 (Public Law 106-554; H.R. 5658), the Guidelines for Ensuring and Maximizing the
Quality, Objectivity, Utility, and Integrity of Information Disseminated by the Environmental
Protection Agency (the Guidelines) contain EPA's policy and procedural guidance for ensuring
and maximizing the quality of information we disseminate. The Guidelines also outline
administrative mechanisms for EPA pre-dissemination review of information products and
describe some new mechanisms to enable affected persons to seek and obtain corrections from
EPA regarding disseminated information that they believe does not comply with EPA or OMB
guidelines. Beyond policies and procedures these Guidelines also incorporate the following
performance goals:
• The principles of information quality should be integrated into each step of EPA's
development of information, including creation, collection, maintenance, and
dissemination.
OMB encourages agencies to incorporate standards and procedures into existing information
resources management practices rather than create new, potentially duplicative processes. EPA
has taken this advice and relies on numerous existing quality-related policies in these Guidelines.
EPA will work to ensure seamless implementation into existing practices. It is expected that
EPA managers and staff will familiarize themselves with these Guidelines, and will carefully
review existing program policies and procedures in order to accommodate the principles outlined
in this document.
¹ Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information
Disseminated by Federal Agencies; OMB, 2002 (67 FR 8452). Hereinafter "OMB guidelines".
http://www.whitehouse.gov/omb/fedreg/reproducible2.pdf
Introduction 3
EPA's Guidelines are intended to carry out OMB's government-wide policy regarding
information we disseminate to the public. Our Guidelines reflect EPA's best effort to present our
goals and commitments for ensuring and maximizing the quality of information we disseminate.
As such, they are not a regulation and do not change or substitute for any legal requirements.
They provide non-binding policy and procedural guidance, and are therefore not intended to
create legal rights, impose legally binding requirements or obligations on EPA or the public
when applied in particular situations, or change or impact the status of information we
disseminate, nor to contravene any other legal requirements that may apply to particular agency
determinations or other actions. EPA's intention is to fully implement these Guidelines in order
to achieve the purposes of Section 515.
These Guidelines are the product of an open, collaborative process between EPA and numerous
EPA stakeholders. The Guidelines development process is described in the Appendix to this
document. EPA received many public comments and has addressed most comments in these
Guidelines. A discussion of public comments is also provided in the Appendix and is grouped by
overarching themes and comments by Guidelines topic areas. EPA views these Guidelines as a
living document, and anticipates their revision as we work to further ensure and maximize
information quality.
Introduction 4
2 EPA Mission and Commitment to Quality
The mission of the EPA is to protect human health and safeguard the natural environment upon
which life depends. EPA is committed to making America's air cleaner, water purer, and land
better protected and to work closely with its Federal, State, Tribal, and local government
partners; with citizens; and with the regulated community to accomplish its mission. In addition,
the United States plays a leadership role in working with other nations to protect the global
environment.
EPA statutory responsibilities to protect human health and safeguard the natural environment are
described in the statutes that mandate and govern our programs. EPA manages those programs in
concert with numerous other government and private sector partners. As Congress intended, each
statute provides regulatory expectations including information quality considerations and
principles. Some statutes are more specific than others, but overall, each directs EPA and other
agencies in how we regulate to protect human health and the environment. For example, the Safe
Drinking Water Act (SDWA) Amendments of 1996 set forth certain quality principles for how
EPA should conduct human health risk assessments and characterize the potential risks to
humans from drinking water contaminants. Information quality is a key component of every
statute that governs our mission.
The collection, use, and dissemination of information of known and appropriate quality are
integral to ensuring that EPA achieves its mission. Information about human health and the
environment -- environmental characteristics; physical, chemical, and biological processes; and
chemical and other pollutants -- underlies all environmental management and health protection
decisions. The availability of, and access to, information and the analytical tools to understand it
are essential for assessing environmental and human health risks, designing appropriate and
cost-effective policies and response strategies, and measuring environmental improvements.
EPA works every day to ensure information quality, but we do not wait until the point of
dissemination to consider important quality principles. While the final review of a document
before it is published is very important to ensuring a product of high quality, we know that in
order to maximize quality, we must start much earlier. When you read an EPA report at your
local library or view EPA information on our web site, that information is the result of processes
undertaken by EPA and our partners that assured quality along each step of the way. To better
describe this interrelated information quality process, the following presents some of the major
roles that EPA plays in its effort to ensure and maximize the quality of the information:
The final category of information that is not included in any of the above
three categories includes information that is either voluntarily
submitted to EPA in hopes of influencing a decision or that EPA
obtains for use in developing a policy, regulatory, or other decision.
Examples of this information include scientific studies published in
journal articles and test data obtained from other Federal agencies,
industry, and others. EPA may not have any financial ties or regulatory
requirements to control the quality of this type of information.
• EPA is a conduit for information: Another major role that EPA plays in the
management of information is as a provider of public access. Such access enables
public involvement in how EPA achieves its mission. We provide access to a
variety of information holdings. Some information distributed by EPA includes
information collected through contracts; information collected through grants and
As mentioned in the previous section, EPA works with a variety of partners to achieve its
mission. Our key government partners not only provide information, they also work with EPA to
manage and implement programs and communicate with the public about issues of concern. In
addition to implementing national programs through EPA Headquarters Program Offices, a vast
network of EPA Regions and other Federal, State, Tribal and local governments implement both
mandated and voluntary programs. This same network collects, uses, and distributes a wide
range of information. EPA plans to coordinate with these partners to ensure the Guidelines are
appropriate and effective.
One major mechanism to ensure and maximize information integrity is the National
Environmental Information Exchange Network (NEIEN, or Network). The result of an important
partnership between EPA, States and Tribal governments, the Network seeks to enhance the
Agency's information architecture to ensure timely and one-stop reporting from many of EPA's
information partners. Key components include the establishment of the Central Data Exchange
(CDX) portal and a System of Access for internal and external users. When fully implemented,
the Network and its many components will enhance EPA and the public's ability to access, use,
and integrate information and the ability of external providers to report to EPA.
3 OMB Guidelines
In Section 515(a) of the Treasury and General Government Appropriations Act for Fiscal Year
2001 (Public Law 106-554; H.R. 5658), Congress directed OMB to issue government-wide
guidelines that "provide policy and procedural guidance to Federal agencies for ensuring and
maximizing the quality, objectivity, utility, and integrity of information (including statistical
information) disseminated by Federal agencies...." The OMB guidelines direct agencies subject
to the Paperwork Reduction Act (44 U.S.C. 3502(1)) to:
• Issue their own information quality guidelines to ensure and maximize the
quality, objectivity, utility, and integrity of information, including statistical
information, by no later than one year after the date of issuance of the OMB
guidelines;
• Report to the Director of OMB the number and nature of complaints received by
the agency regarding agency compliance with OMB guidelines concerning the
quality, objectivity, utility, and integrity of information and how such complaints
were resolved.
The OMB guidelines provide some basic principles for agencies to consider when developing
their own guidelines including:
• Some agency information may need to meet higher or more specific expectations
for objectivity, utility, and integrity. Information of greater importance should be
held to a higher quality standard.
OMB Guidelines 9
4 Existing Policies and Procedures that Ensure and Maximize Information Quality
EPA is dedicated to the collection, generation, and dissemination of high-quality information.
We disseminate a wide variety of information products, ranging from comprehensive scientific
assessments of potential health risks,² to web-based applications that provide compliance
information and map the location of regulated entities,³ to simple fact sheets for school children.⁴
As a result of this diversity of information-related products and practices, different EPA
programs have evolved specialized approaches to information quality assurance. The OMB
guidelines encourage agencies to avoid the creation of "new and potentially duplicative or
contradictory processes." Further, OMB stresses that its guidelines are not intended to "impose
unnecessary administrative burdens that would inhibit agencies from continuing to take
advantage of the Internet and other technologies to disseminate information that can be of great
benefit and value to the public." In this spirit, EPA seeks to foster the continuous improvement
of existing information quality activities and programs. In implementing these guidelines, we
note that ensuring the quality of information is a key objective alongside other EPA objectives,
such as ensuring the success of Agency missions, observing budget and resource priorities and
restraints, and providing useful information to the public. EPA intends to implement these
Guidelines in a way that will achieve all these objectives in a harmonious way in conjunction
with our existing guidelines and policies, some of which are outlined below. These examples
illustrate some of the numerous systems and practices in place that address the quality,
objectivity, utility, and integrity of information.
The EPA Agency-wide Quality System helps ensure that EPA organizations maximize the
quality of environmental information, including information disseminated by the Agency. A
graded approach is used to establish quality criteria that are appropriate for the intended use of
the information and the resources available. The Quality System is documented in EPA Order
5360.1 A2, "Policy and Program Requirements for the Mandatory Agency-wide Quality
System" and the "EPA Quality Manual."⁵ To implement the Quality System, EPA organizations
(1) assign a quality assurance manager, or person assigned to an equivalent position, who has
sufficient technical and management expertise and authority to conduct independent oversight of
the implementation of the organization's quality system; (2) develop a Quality Management
Plan, which documents the organization's quality system; (3) conduct an annual assessment of
the organization's quality system; (4) use a systematic planning process to develop acceptance or
performance criteria prior to the initiation of all projects that involve environmental information
² http://cfpub.epa.gov/ncea/cfm/partmatt.cfm
³ http://www.epa.gov/enviro/wme/
⁴ http://www.epa.gov/kids
⁵ EPA Quality Manual for Environmental Programs 5360 A1, May 2000.
http://www.epa.gov/quality/qs-docs/5360.pdf
Existing Policies and Procedures that Ensure and Maximize Information Quality 10
collection and/or use; (5) develop Quality Assurance Project Plan(s), or equivalent document(s)
for all applicable projects and tasks involving environmental data; (6) conduct an assessment of
existing data, when used to support Agency decisions or other secondary purposes, to verify that
they are of sufficient quantity and adequate quality for their intended use; (7) implement all
Agency-wide Quality System components in all applicable EPA-funded extramural agreements;
and (8) provide appropriate training, for all levels of management and staff.
The EPA Quality System may also apply to non-EPA organizations, with key principles
incorporated in the applicable regulations governing contracts, grants, and cooperative
agreements. EPA Quality System provisions may also be invoked as part of negotiated
agreements such as memoranda of understanding. Non-EPA organizations that may be subject to
EPA Quality System requirements include (a) any organization or individual under direct
contract to EPA to furnish services or items or perform work (i.e., a contractor) under the
authority of 48 CFR part 46, (including applicable work assignments, delivery orders, and task
orders); and (b) other government agencies receiving assistance from EPA through interagency
agreements. Separate quality assurance requirements for assistance recipients are set forth in 40
CFR part 30 (governing assistance agreements with institutions of higher education, hospitals,
and other non-profit recipients of financial assistance) and 40 CFR parts 31 and 35 (governing
assistance agreements with State, Tribal, and local governments).
In addition to the Quality System, EPA's Peer Review Policy provides that major scientifically
and technically based work products (including scientific, engineering, economic, or statistical
documents) related to Agency decisions should be peer-reviewed. Agency managers within
Headquarters, Regions, laboratories, and field offices determine and are accountable for the
decision whether to employ peer review in particular instances and, if so, its character, scope,
and timing. These decisions are made consistent with program goals and priorities, resource
constraints, and statutory or court-ordered deadlines. For those work products that are intended
to support the most important decisions or that have special importance in their own right,
external peer review is the procedure of choice. For other work products, internal peer review is
an acceptable alternative to external peer review. Peer review is not restricted to the penultimate
version of work products; in fact, peer review at the planning stage can often be extremely
beneficial. The basis for EPA peer review policy is articulated in Peer Review and Peer
Involvement at the U.S. Environmental Protection Agency.⁶ The Peer Review Policy was first
issued in January, 1993, and was updated in June, 1994. In addition to the policy, EPA has
published a Peer Review Handbook,⁷ which provides detailed guidance for implementing the
policy. The handbook was last revised December, 2000.
⁶ Peer Review and Peer Involvement at the U.S. EPA. June 7, 1994.
http://www.epa.gov/osp/spc/perevmem.htm
⁷ Peer Review Handbook, 2nd Edition, U.S. EPA, Science Policy Council, December 2000, EPA
100-B-00-001. http://www.epa.gov/osp/spc/prhandbk.pdf
Existing Policies and Procedures that Ensure and Maximize Information Quality 11
The Agency's Action Development Process also serves to ensure and maximize the quality of
EPA disseminated information. Top Agency actions and Economically Significant actions as
designated under Executive Order 12866 are developed as part of the Agency's Action
Development Process. The Action Development Process ensures the early and timely
involvement of senior management at key decision milestones to facilitate the consideration of a
broad range of regulatory and non-regulatory options and analytic approaches. Of particular
importance to the Action Development Process is ensuring that our scientists, economists, and
others with technical expertise are appropriately involved in determining needed analyses and
research, identifying alternatives, and selecting options. Program Offices and Regional Offices
are invited to participate to provide their unique perspectives and expertise. Effective
consultation with policy advisors (e.g., Senior Policy Council, Science Policy Council), co-
regulators (e.g., States, Tribes, and local governments), and stakeholders is also part of the
process. Final Agency Review (FAR) generally takes place before the release of substantive
information associated with these actions. The FAR process ensures the consistency of any
policy determinations, as well as the quality of the information underlying each policy
determination and its presentation.
The Agency's Integrated Error Correction Process⁸ (IECP) is a process by which members of the
public can notify EPA of a potential data error in information EPA distributes or disseminates.
This process builds on existing data processes through which discrete, numerical errors in our
data systems are reported to EPA. The IECP has made these tools more prominent and easier to
use. Individuals who identify potential data errors on the EPA web site can contact us through
the IECP by using the "Report Error" button or error correction hypertext found on major
databases throughout EPA's web site. EPA reviews the error notification and assists in bringing the
notification to resolution with those who are responsible for the data within or outside the
Agency, as appropriate. The IECP tracks this entire process from notification through final
resolution.
Existing Policies and Procedures that Ensure and Maximize Information Quality 12
The EPA Information Resources Management (IRM) Manual⁹ articulates and describes many of
our information development and management procedures and policies, including information
security, data standards, records management, information collection, and library services.
Especially important in the context of the Guidelines provided in this document, the IRM
Manual describes how we maintain and ensure information integrity. We believe that
maintaining information integrity refers to keeping information "unaltered," i.e., free from
unauthorized or accidental modification or destruction. These integrity principles apply to all
information. Inappropriately changed or modified data or software impacts information integrity
and compromises the value of the information system. Because of the importance of EPA's
information to the decisions made by the Agency, its partners, and the public, it is our
responsibility to ensure that the information is, and remains, accurate and credible.
Beyond addressing integrity concerns, the IRM Manual also includes Agency policy on public
access and records management. These are key chapters that enable EPA to ensure transparency
and the reproducibility of information.
The EPA Risk Characterization Policy and Handbook10 provide guidance for risk
characterization that is designed to ensure that critical information from each stage of a risk
assessment is used in forming conclusions about risk. The Policy calls for a transparent process
and products that are clear, consistent and reasonable. The Handbook is designed to provide risk
assessors, risk managers, and other decision-makers an understanding of the goals and principles
of risk characterization.
We mentioned just a few of the Agency's major policies that ensure and maximize the quality of
information we disseminate. In addition to these Agency-wide systems and procedures, Program
Offices and Regions implement many Office-level and program-specific procedures to ensure
and maximize information quality. The purpose of these Guidelines is to serve as a common
thread that ties all these policies together under the topics provided by OMB: objectivity,
integrity and utility. EPA's approach to ensuring and maximizing quality is necessarily
distributed across all levels of EPA's organizational hierarchy, including Offices, Regions,
divisions, projects, and even products. Oftentimes, there are different quality considerations for
different types of products. For example, the quality principles associated with a risk assessment
10Risk Characterization Handbook, U.S. EPA, Science Policy Council, December 2000.
http://www.epa.gov/osp/spc/2riskchr.htm
differ from those associated with developing a new model. The Agency currently has a
comprehensive but distributed system of policies to address such unique quality considerations.
These Guidelines provide us with a mechanism to help coordinate and synthesize our quality
policies and procedures.
As suggested above, we will continue to work to ensure that our many policies and procedures
are appropriately implemented, synthesized, and revised as needed. One way to build on
achievements and learn from mistakes is to document lessons learned about specific activities or
products. For example, the documents that present guidance and tools for implementing the
Quality System are routinely subjected to external peer review during their development:
comments from the reviewers are addressed and responses reviewed by management before the
document is issued. Each document is formally reviewed every five years and is either reissued,
revised as needed, or rescinded. If important new information or approaches evolve between
reviews, the document may be reviewed and revised more frequently.
In response to OMB's guidelines, EPA recognizes that it will be incorporating new policies and
administrative mechanisms. As we reaffirm our commitment to our existing policies and
procedures that ensure and maximize quality, we also plan to address the following new areas of
focus and commitment:
• Working with the public to develop assessment factors that we will use to assess
the quality of information developed by external parties, prior to EPA's use of
that information.
Consistent with the OMB guidelines, EPA is issuing these Guidelines to ensure and maximize
the quality, including objectivity, utility and integrity, of disseminated information. Objectivity,
integrity, and utility are defined here, consistent with the OMB guidelines. "Objectivity" focuses
on whether the disseminated information is being presented in an accurate, clear, complete, and
unbiased manner, and as a matter of substance, is accurate, reliable, and unbiased. "Integrity"
refers to security, such as the protection of information from unauthorized access or revision, to
ensure that the information is not compromised through corruption or falsification. "Utility"
refers to the usefulness of the information to the intended users.
The collection, use, and dissemination of information of known and appropriate quality is
integral to ensuring that EPA achieves its mission. Information about the environment and
human health underlies all environmental management decisions. Information and the analytical
tools to understand it are essential for assessing environmental and human health risks, designing
appropriate and cost-effective policies and response strategies, and measuring environmental
improvements.
These Guidelines describe EPA's policy and procedures for reviewing and substantiating the
quality of information before EPA disseminates it. They describe our administrative mechanisms
for enabling affected persons to seek and obtain, where appropriate, correction of information
disseminated by EPA that they believe does not comply with EPA or OMB guidelines.
These Guidelines apply to "information" EPA disseminates to the public. "Information," for
purposes of these Guidelines, generally includes any communication or representation of
knowledge such as facts or data, in any medium or form. Preliminary information EPA
disseminates to the public is also considered "information" for the purposes of the Guidelines.
Information generally includes material that EPA disseminates from a web page. However, not
all web content is considered "information" under these Guidelines (e.g., certain information
from outside sources that is not adopted, endorsed, or used by EPA to support an Agency
decision or position).
For purposes of these Guidelines, EPA disseminates information to the public when EPA
initiates or sponsors the distribution of information to the public.
EPA intends to use notices to explain the status of information, so that users will be aware of
whether the information is being distributed to support or represent EPA's viewpoint.
If an item is not considered "information," these Guidelines do not apply. Examples of items that
are not considered information include Internet hyperlinks and other references to information
distributed by others, and opinions, where EPA's presentation makes it clear that what is being
offered is someone's opinion rather than fact or EPA's views.
"Dissemination" for the purposes of these Guidelines does not include distributions of
information that EPA does not initiate or sponsor. Below is a sample of various types of
information that would not generally be considered disseminated by EPA to the public:
• EPA's response to requests for agency records under the Freedom of Information
Act (FOIA), the Privacy Act, the Federal Advisory Committee Act (FACA), or
other similar laws.
elsewhere: interviews, speeches, and similar communications that EPA does not
disseminate to the public beyond their original context, such as by placing them
on the Internet. If a speech, press release, or other "ephemeral" communication is
about an information product disseminated elsewhere by EPA, the product itself
will be covered by these Guidelines.
5.5 What Happens if Information is Initially Not Covered by these Guidelines, but EPA
Subsequently Disseminates it to the Public?
If a particular distribution of information is not covered by these Guidelines, the Guidelines may
still apply to a subsequent dissemination of the information in which EPA adopts, endorses, or
uses the information to formulate or support a regulation, guidance, or other Agency decision or
position. For example, if EPA simply makes a public filing (such as facility data required by
regulation) available to the public, these Guidelines would not apply to that distribution of
information. However, if EPA later includes the information in a background document in
support of a rulemaking, these Guidelines would apply to that later dissemination of the
information in that document.
5.6 How does EPA Ensure the Objectivity, Utility, and Integrity of information that is
not covered by these Guidelines?
These Guidelines apply only to information EPA disseminates to the public, outlined in section
5.3, above. Other information distributed by EPA that is not covered by these Guidelines is still
subject to all applicable EPA policies, quality review processes, and correction procedures.
These include quality management plans for programs that collect, manage, and use
environmental information, peer review, and other procedures that are specific to individual
programs and, therefore, not described in these Guidelines. It is EPA's policy that all of the
information it distributes meets a basic standard of information quality, and that its utility,
objectivity, and integrity be scaled and appropriate to the nature and timeliness of the planned
and anticipated uses. Ensuring the quality of EPA information is not necessarily dependent on
any plans to disseminate the information. EPA continues to produce, collect, and use information
that is of the appropriate quality, irrespective of these Guidelines or the prospects for
dissemination of the information.
6.1 How does EPA Ensure and Maximize the Quality of Disseminated Information?
EPA ensures and maximizes the quality of the information we disseminate by implementing well
established policies and procedures within the Agency as appropriate to the information product.
There are many tools that the Agency uses, such as the Quality System,11 review by senior
management, the peer review process,12 the communications product review process,13 the web
guide,14 and the error correction process.15 Beyond our internal quality management system, EPA also
ensures the quality of information we disseminate by seeking input from experts and the general
public. EPA consults with groups such as the Science Advisory Board and the Science Advisory
Panel, in addition to seeking public input through public comment periods and by hosting public
meetings.
For the purposes of the Guidelines, EPA recognizes that if data and analytic results are subjected
to formal, independent, external peer review, the information may generally be presumed to be
of acceptable objectivity. However, this presumption of objectivity is rebuttable. The Agency
uses a graded approach and uses these tools to establish the appropriate quality, objectivity,
utility, and integrity of information products based on the intended use of the information and
the resources available. As part of this graded approach, EPA recognizes that some of the
information it disseminates includes influential scientific, financial, or statistical information,
and that this category should meet a higher standard of quality.
6.2 How Does EPA Define Influential Information for these Guidelines?
11EPA Quality Manual for Environmental Programs 5360 A1, May 2000.
http://www.epa.gov/quality/qs-docs/5360.pdf
12Peer Review Handbook, 2nd Edition, U.S. EPA, Science Policy Council, December 2000, EPA
100-B-00-001. http://www.epa.gov/osp/spc/prhandbk.pdf
16The term "clear and substantial impact" is used as part of a definition to distinguish different categories of
information for purposes of these Guidelines. EPA does not intend the classification of information under this
definition to change or impact the status of the information in any other setting, such as for purposes of determining
whether the dissemination of the information is a final Agency action.
Under these Information Quality Guidelines, EPA will generally consider the following classes of
information to be influential, and, to the extent that they contain scientific, financial, or statistical
information, that information should adhere to a rigorous standard of quality:
• Major work products undergoing peer review as called for under the Agency's
Peer Review Policy. Described in the Science Policy Council Peer Review
Handbook, the EPA Peer Review Policy regards major scientific and technical
work products as those that have a major impact, involve precedential, novel,
and/or controversial issues, or for which the Agency has a legal and/or statutory obligation
to conduct a peer review. These major work products are typically subjected to
external peer review. Some products that may not be considered "major" under
the EPA Peer Review Policy may be subjected to external peer review but EPA
does not consider such products influential for purposes of these Guidelines.
6.3 How Does EPA Ensure and Maximize the Quality of "Influential" Information?
EPA recognizes that influential scientific, financial, or statistical information should be subject
to a higher degree of quality (for example, transparency about data and methods) than
information that may not have a clear and substantial impact on important public policies or
private sector decisions. A higher degree of transparency about data and methods will facilitate
the reproducibility of such information by qualified third parties.
Several Agency-wide and Program- and Region-specific policies and processes that EPA uses to
ensure and maximize the quality of environmental data, including disseminated information
products, would also apply to information considered "influential" under these Guidelines.
Agency-wide processes of particular importance to ensure the quality, objectivity, and
transparency of "influential" information include the Agency's Quality System, Action
Development Process, Peer Review Policy, and related procedures. Many "influential"
information products may be subject to more than one of these processes.
6.4 How Does EPA Ensure and Maximize the Quality of "Influential" Scientific Risk
Assessment Information?
EPA conducts and disseminates a variety of risk assessments. When evaluating environmental
problems or establishing standards, EPA must comply with statutory requirements and mandates
set by Congress based on media (air, water, solid, and hazardous waste) or other environmental
interests (pesticides and chemicals). Consistent with EPA's current practices, application of these
principles involves a "weight-of-evidence" approach that considers all relevant information and
its quality, consistent with the level of effort and complexity of detail appropriate to a particular
risk assessment. In our dissemination of influential scientific information regarding human
health, safety17 or environmental18 risk assessments, EPA will ensure, to the extent practicable
17"Safety risk assessment" describes a variety of analyses, investigations, or case studies conducted by EPA
to respond to environmental emergencies. For example, we work to ensure that the chemical industry and state and
local entities take action to prevent, plan and prepare for, and respond to chemical emergencies through the
development and sharing of information, tools, and guidance for hazards analyses and risk assessment.
18Because the assessment of "environmental risk" is being distinguished from "human health risk," the term
"environmental risk" as used in these Guidelines does not directly involve human health concerns. In other words, an
"environmental risk assessment" is in this case the equivalent to what EPA commonly calls an "ecological risk
assessment."
and consistent with Agency statutes and existing legislative regulations, the objectivity19 of such
information disseminated by the Agency by applying the following adaptation of the quality
principles found in the Safe Drinking Water Act20 (SDWA) Amendments of 199621:
(A) The substance of the information is accurate, reliable, and unbiased. This involves the use
of:
(i) the best available science and supporting studies conducted in accordance with
sound and objective scientific practices, including, when available, peer reviewed
science and supporting studies; and
(ii) data collected by accepted methods or best available methods (if the reliability of
the method and the nature of the decision justifies the use of the data).
(B) The presentation of such risk information is comprehensive, informative, and
understandable. This involves specifying:
(i) each population addressed by any estimate of applicable human health risk or
each risk assessment endpoint, including populations if applicable, addressed by
any estimate of applicable ecological risk22;
(ii) the expected risk or central estimate of human health risk for the specific
populations addressed.
19OMB stated in its guidelines that in disseminating information agencies shall develop a process for
reviewing the quality of the information. "Quality" includes objectivity, utility, and integrity. "Objectivity" involves
two distinct elements, presentation and substance. Guidelines for Ensuring and Maximizing the Quality, Objectivity,
Utility, and Integrity of Information Disseminated by Federal Agencies, OMB, 2002. (67 FR 8452)
http://www.whitehouse.gov/omb/fedreg/reproducible2.pdf
20Safe Drinking Water Act Amendments of 1996, 42 U.S.C. 300g-1(b)(3)(A) & (B)
21The exception is risk assessments conducted under SDWA which will adhere to the SDWA principles as
amended in 1996.
22Agency assessments of human health risks necessarily focus on populations. Agency assessments of
ecological risks address a variety of entities, some of which can be described as populations and others (such as
ecosystems) which cannot. The phrase "assessment endpoint" is intended to reflect the broader range of interests
inherent in ecological risk assessments. As discussed in the EPA Guidelines for Ecological Risk Assessment (found
at http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=12460), assessment endpoints are explicit expressions of the
actual environmental value that is to be protected, operationally defined by an ecological entity and its attributes.
Furthermore, those Guidelines explain that an ecological entity can be a species (e.g., eelgrass, piping plover), a
community (e.g., benthic invertebrates), an ecosystem (e.g., wetland), or other entity of concern. An attribute of an
assessment endpoint is the characteristic about the entity of concern that is important to protect and potentially at
risk. Examples of attributes include abundance (of a population), species richness (of a community), or function (of
an ecosystem). Assessment endpoints and ecological risk assessments are discussed more fully in those Guidelines
as well as other EPA sources such as Ecological Risk Assessment Guidance for Superfund: Process for Designing
and Conducting Ecological Risk Assessments - Interim Final, found at
http://www.epa.gov/oerrpage/superfund/programs/risk/ecorisk/ecorisk.htm
In applying these principles, "best available" usually refers to the availability at the time an
assessment is made. However, EPA also recognizes that scientific knowledge about risk is
rapidly changing and that risk information may need to be updated over time. When deciding
which influential risk assessment should be updated and when to update it, the Agency will take
into account its statutes and the extent to which the updated risk assessment will have a clear and
substantial impact on important public policies or private sector decisions. In some situations,
the Agency may need to weigh the resources needed and the potential delay associated with
incorporating additional information in comparison to the value of the new information in terms
of its potential to improve the substance and presentation of the assessment.
Adaptation clarifications
In order to provide more clarity on how EPA adapted the SDWA principles in this guidance in
light of our numerous statutes, regulations, guidance, and policies that address how to conduct a
risk assessment and characterize risk, we discuss four adaptations EPA has made to the SDWA
quality principles language.
EPA adapted the SDWA principles by adding the phrase "consistent with Agency statutes and
existing legislative regulations, the objectivity of such information disseminated by the Agency"
in the introductory paragraph, therefore applying to both paragraphs (A) and (B). This was done
to explain EPA's intent regarding these quality principles and their implementation consistent
with our statutes and existing legislative regulations. Also, as noted earlier, EPA intends to
implement these quality principles in conjunction with our guidelines and policies. The
procedures set forth in other EPA guidelines set out in more detail EPA's policies for conducting
risk assessments, including Agency-wide guidance on various types of risk assessments and
program-specific guidance. EPA recognizes that the wide array of programs within EPA have
resulted not only in Agency-wide guidance, but in specific protocols that reflect the
requirements, including limitations, that are mandated by the various statutes administered by
the Agency. For example, the Agency developed several pesticide science policy papers that
explained to the public in detail how EPA would implement specific statutory requirements in
the Food Quality Protection Act (FQPA) that addressed how we perform risk assessments. We
also recognize that emerging issues such as endocrine disruption, bioengineered organisms, and
genomics may involve some modifications to the existing paradigm for assessing human health
and ecological risks. This does not mean a radical departure from existing guidance or the
SDWA principles, but rather indicates that flexibility may be warranted as new information and
approaches develop.
EPA introduced the following two adaptations in order to accommodate the range of real-world
situations that we confront in the implementation of our diverse programs. EPA adapted the
SDWA quality principles by moving the phrase "to the extent practicable" from paragraph (B) to
the introductory paragraph in this Guidelines section to cover both parts (A) and (B) of the
SDWA adaptation.24 The phrase refers to situations under (A) where EPA may be called upon to
conduct "influential" scientific risk assessments based on limited information or in novel
situations, and under (B) in recognition that all such "presentation" information may not be
available in every instance. The level of effort and complexity of a risk assessment should also
balance the information needs for decision making with the effort needed to develop such
information. For example, under the Federal Insecticide, Fungicide, and Rodenticide Act25
(FIFRA) and the Toxic Substances Control Act26 (TSCA), regulated entities are obligated to
provide information to EPA concerning incidents/test data that may reveal a problem with a
pesticide or chemical. We also receive such information voluntarily from other sources. EPA
carefully reviews incident reports and factors them as appropriate into risk assessments and
decision-making, even though these may not be considered information collected by accepted
methods or best available methods as stated in (A)(ii). Incident information played an important
role in the Agency's conclusion that use of chlordane/heptachlor termiticides could result in
exposures to persons living in treated homes, and that the registrations needed to be modified
accordingly. Similarly, incident reports concerning bird kills and fish kills were important
components of the risk assessments for the reregistration of the pesticides phorate and terbufos,
respectively. In addition, this adaptation recognizes that while many of the studies incorporated
into risk assessments have been peer reviewed, data from other sources may not be peer
reviewed. EPA takes many actions based on studies and supporting data provided by outside
sources, including confidential or proprietary information that has not been peer reviewed. For
example, industry can be required by regulation to submit data for pesticides under FIFRA or for
chemicals under TSCA. The data are developed using test guidelines and Good Laboratory
Practices (GLPs) in accordance with EPA regulations. While there is not a requirement to have
studies peer reviewed, such studies are reviewed by Agency scientists to ensure that they were
conducted according to the appropriate test guidelines and GLPs and that the data are valid.
The flexibility provided by applying "to the extent practicable" to paragraph (A) is appropriate
in many circumstances to conserve Agency resources and those of the regulated community who
otherwise might have to generate significant additional data. This flexibility is already provided
24 The discussion in this and following paragraphs gives some examples of the types of assessments that
may under some circumstances be considered influential. These examples are representative of assessments
performed under other EPA programs, such as CERCLA.
25 7 U.S.C. 136 et seq.
26 15 U.S.C. 2601 et seq.
for paragraph (B) in the SDWA quality principles. Pesticide and chemical risk assessments are
frequently performed iteratively, with the first iteration employing protective (conservative)
assumptions to identify possible risks. Only if potential risks are identified in a screening level
assessment, is it necessary to pursue a more refined, data-intensive risk assessment. This is
exhibited, for example, in guidance developed for use in CERCLA and RCRA on tiered
approaches. In other cases, reliance on "structure activity relationship" or "bridging data" allows
the Agency to rely on data from similar chemicals rather than require the generation of new,
chemical-specific data. While such assessments may or may not be considered influential under
the Guidelines, this adaptation of the SDWA principles reflects EPA's reliance on less-refined
risk assessments where further refinement could significantly increase the cost of the risk
assessment without significantly enhancing the assessment or changing the regulatory outcome.
In emergency and other time critical circumstances, risk assessments may have to rely on
information at hand or that can be made readily available rather than data such as described in
(A). One such scenario is risk assessments addressing Emergency Exemption requests submitted
under Section 18 of FIFRA27 which, because of the emergency nature of the request, must be
completed within a short time frame. As an example, EPA granted an emergency exemption
under Section 18 to allow use of an unregistered pesticide to decontaminate anthrax in a Senate
office building. The scientific review and risk assessment to support this action were necessarily
constrained by the urgency of the action. Other time-sensitive actions include the reviews of new
chemicals under TSCA. Under Section 5 of TSCA28, EPA must review a large number of
pre-manufacture notifications (more than 1,000) every year, not all of which necessarily include
"influential" risk assessments, and each review must be completed within a short time frame
(generally 90 days). The nature of the reviews and risk assessment associated with these
pre-manufacture notifications are affected by the limited time available and the large volume of
notifications submitted.
The flexibility provided by applying "to the extent practicable" to paragraph (A) is appropriate
to account for safety risk assessment practices. This flexibility is already provided for paragraph
(B) in the SDWA quality principles. We applied the same SDWA adaptation for use with human
health risk assessments to safety risk assessments with the needed flexibility to apply the
principles to the extent practicable. "Safety risk assessments" include a variety of analyses,
investigations, or case studies conducted by EPA concerning safety issues. EPA works to ensure
that the chemical industry and state and local entities take action to prevent, plan and prepare for,
and respond to environmental emergencies and site-specific response actions through the
development and sharing of information, tools, and guidance for hazard analyses and risk
assessment. For example, although the chemical industry shoulders most of the responsibility for
safety risk assessment and management, EPA may also conduct chemical hazard analyses,
investigate the root causes and mechanisms associated with accidental chemical releases, and
assess the probability and consequences of accidental releases in support of agency risk
assessments. Although safety risk assessments can be different from traditional human health
risk assessments because they may combine a variety of available information and may use
expert judgement based on that information, these assessments provide useful information that is
sufficient for the intended purpose.
Next, EPA adapted the SDWA quality principles by adding the clause "including, when
available, peer reviewed science and supporting studies" to paragraph (A)(i). It now reads: "the
best available science and supporting studies conducted in accordance with sound and objective
scientific practices, including, when available, peer reviewed science and supporting studies." In
the Agency's development of "influential" scientific risk assessments, we intend to use all
relevant information, including peer reviewed studies, studies that have not been peer reviewed,
and incident information; evaluate that information based on sound scientific practices as
described in our risk assessment guidelines and policies; and reach a position based on careful
consideration of all such information (i.e., a process typically referred to as the "weight-of
evidence" approach 29 ). In this approach, a well-developed, peer-reviewed study would generally
be accorded greater weight than information from a less well-developed study that had not been
peer-reviewed, but both studies would be considered. Thus the Agency uses a
"weight-of-evidence" process when evaluating peer-reviewed studies along with all other information.
Oftentimes under various EPA-managed programs, EPA receives information that has not been
peer-reviewed and we have to make decisions based on the information available. While many
of the studies incorporated in risk assessments have been peer reviewed, data from other sources,
such as studies submitted to the Agency for pesticides under FIFRA30 and for chemicals under
TSCA, may not always be peer reviewed. Rather, such data, developed under approved
guidelines and the application of Good Laboratory Practices (GLPs), are routinely used in the
development of risk assessments. Risk assessments may also include more limited data sets such
as monitoring data used to support the exposure element of a risk assessment. In cases where
these data may not themselves have been peer reviewed, their quality and appropriate use would
be addressed as part of the peer review of the overall risk assessment as called for under the
Agency's peer review guidelines.
Lastly, EPA adapted the SDWA principles for influential environmental ("ecological") risk
assessments that are disseminated in order to use terms that are most suited for such risk
assessments. Specifically, EPA assessments of ecological risks address a variety of entities,
some of which can be described as populations and others (such as ecosystems) which cannot.
Therefore, a specific modification was made to include "assessment endpoints, including
populations if applicable" in place of the term "population" for ecological risk assessments, and
EPA added a footnote directing the reader to various EPA risk policies that discuss these
concepts in greater detail.
6.5 Does EPA Ensure and Maximize the Quality of Information from External Sources?
Ensuring and maximizing the quality of information from States, other governments, and third
parties is a complex undertaking, involving thoughtful collaboration with States, Tribes, the
scientific and technical community, and other external information providers. EPA will continue
to take steps to ensure that the quality and transparency of information provided by external
sources are sufficient for the intended use. For instance, since 1998, the use of environmental
data collected by others or for other purposes, including literature, industry surveys,
compilations from computerized data bases and information systems, and results from
computerized or mathematical models of environmental processes and conditions has been
within the scope of the Agency's Quality System3 !.
For information that is either voluntarily submitted to EPA in hopes of influencing a decision or
that EPA obtains for use in developing a policy, regulatory, or other decision, EPA will continue
to work with States and other governments, the scientific and technical community, and other
interested information providers to develop and publish factors that EPA would use to assess the
quality of this type of information.
For all proposed collections of information that will be disseminated to the public, EPA intends
to demonstrate in our Paperwork Reduction Act³² clearance submissions that the proposed
collection of information will result in information that will be collected, maintained and used in
ways consistent with the OMB guidelines and these EPA Guidelines. These Guidelines apply to
all information EPA disseminates to the public; accordingly, if EPA later identifies a new use for
the information that was collected, such use would not be precluded and the Guidelines would
apply to the dissemination of the information to the public.
31 EPA Quality Manual for Environmental Programs 5360 A1, May 2000, Section 1.3.1.
http://www.epa.gov/quality/qs-docs/5360.pdf
32 44 U.S.C. 3501 et seq.
Each EPA Program Office and Region will incorporate the information quality principles
outlined in section 6 of these Guidelines into their existing pre-dissemination review procedures
as appropriate. Offices and Regions may develop unique and new procedures, as needed, to
provide additional assurance that the information disseminated by or on behalf of their
organizations is consistent with these Guidelines. EPA intends to facilitate implementation of
consistent cross-Agency pre-dissemination reviews by establishing a model of minimum review
standards based on existing policies. Such a model for pre-dissemination review would still
provide that responsibility for the reviews remains in the appropriate EPA Office or Region.
For the purposes of the Guidelines, EPA recognizes that pre-dissemination review procedures
may include peer reviews and quality reviews that may occur at many steps in development of
information, not only at the point immediately prior to the dissemination of the information.
8.1 What are EPA's Administrative Mechanisms for Affected Persons to Seek and
Obtain Correction of Information?
EPA's Office of Environmental Information (OEI) manages the administrative mechanisms that
enable affected persons to seek and obtain, where appropriate, correction of information
disseminated by the Agency that does not comply with EPA or OMB Information Quality
Guidelines. Working with the Program Offices, Regions, laboratories, and field offices, OEI will
receive complaints (or copies) and distribute them to the appropriate EPA information owners.
"Information owners" are the responsible persons designated by management in the applicable
EPA Program Office, or those who have responsibility for the quality, objectivity, utility, and
integrity of the information product or data disseminated by EPA. If a person believes that
information disseminated by EPA may not comply with the Guidelines, we encourage the person
to consult informally with the contact person listed in the information product before submitting
a request for correction of information. An informal contact can result in a quick and efficient
resolution of questions about information quality.
Persons requesting a correction of information should include the following information in their
Request for Correction (RFC):
• A description of the information the person believes does not comply with EPA
or OMB guidelines, including specific citations to the information and to the EPA
or OMB guidelines, if applicable.
• An explanation of how the information does not comply with EPA or OMB
guidelines and a recommendation of corrective action. EPA considers that the
complainant has the burden of demonstrating that the information does not
comply with EPA or OMB guidelines and that a particular corrective action
would be appropriate.
• An explanation of how the alleged error affects or how a correction would benefit
the requestor.
• An affected person may submit an RFC via any one of the methods listed here:
• Internet at http://www.epa.gov/oei/qualityguidelines
• E-mail at quality.guidelines@epa.gov
• Fax at (202) 566-0255
8.3 When Does EPA Intend to Consider a Request for Correction of Information?
EPA seeks public and stakeholder input on a wide variety of issues, including the identification
and resolution of discrepancies in EPA data and information. EPA may decline to review an
RFC under these Guidelines and consider it for correction if:
• The request omits one or more of the elements recommended in section 8.2 and
there is insufficient information for EPA to provide a satisfactory response.
• The request itself is "frivolous," including those made in bad faith, made without
justification, or trivial, and those for which a response would be duplicative. More
information on this subject may be found in the OMB guidelines.
8.4 How Does EPA Intend to Respond to a Request for Correction of Information?
the error. For requests involving information from outside sources, considerations
may include coordinating with the source and other practical limitations on EPA's
ability to take corrective action.
• For approved requests, EPA assigns a steward for the correction who marks the
information as designated for corrections as appropriate, establishes a schedule
for correction, and reports correction resolution to both the tracking system and to
the requestor.
OEI will provide reports on behalf of EPA to OMB on an annual basis, beginning January 1,
2004, regarding the number, nature, and resolution of complaints received by EPA.
8.5 How Does EPA Expect to Process Requests for Correction of Information on Which
EPA has Sought Public Comment?
When EPA provides opportunities for public participation by seeking comments on information,
the public comment process should address concerns about EPA's information. For example,
when EPA issues a notice of proposed rulemaking supported by studies and other information
described in the proposal or included in the rulemaking docket, it disseminates this information
within the meaning of the Guidelines. The public may then raise issues in comments regarding
the information. If a group or an individual raises a question regarding information supporting a
proposed rule, EPA generally expects to treat it procedurally like a comment to the rulemaking,
addressing it in the response to comments rather than through a separate response mechanism.
This approach would also generally apply to other processes involving a structured opportunity
for public comment on a draft or proposed document before a final document is issued, such as a
draft report, risk assessment, or guidance document. EPA believes that the thorough
consideration provided by the public comment process serves the purposes of the Guidelines,
provides an opportunity for correction of any information that does not comply with the
Guidelines, and does not duplicate or interfere with the orderly conduct of the action. In cases
where the Agency disseminates a study, analysis, or other information prior to the final Agency
action or information product, it is EPA policy to consider requests for correction prior to the
final Agency action or information product in those cases where the Agency has determined that
an earlier response would not unduly delay issuance of the Agency action or information product
and the complainant has shown a reasonable likelihood of suffering actual harm from the
Agency's dissemination if the Agency does not resolve the complaint prior to the final Agency
action or information product. EPA does not expect this to be the norm in rulemakings that it
conducts, and thus will usually address information quality issues in connection with the final
Agency action or information product.
EPA generally would not consider a complaint that could have been submitted as a timely
comment in the rulemaking or other action but was submitted after the comment period. If EPA
cannot respond to a complaint in the response to comments for the action (for example, because
the complaint is submitted too late to be considered and could not have been timely submitted, or
because the complaint is not germane to the action), EPA will consider whether a separate
response to the complaint is appropriate.
8.6 What Should be Included in a Request Asking EPA to Reconsider its Decision on a
Request for the Correction of Information?
If requesters are dissatisfied with an EPA decision, they may file a Request for Reconsideration
(RFR). The RFR should contain the following information:
• An explanation of why the person disagrees with the EPA decision and a specific
recommendation for corrective action.
• An affected person may submit a Request for Reconsideration (RFR) via any one
of the methods listed here:
• Internet at http://www.epa.gov/oei/qualityguidelines
• E-mail at quality.guidelines@epa.gov
• Fax at (202) 566-0255
• Mail to Information Quality Guidelines Staff, Mail Code 28221 T, U.S.
EPA, 1200 Pennsylvania Ave., N.W., Washington, DC, 20460
• By courier or in person to Information Quality Guidelines Staff, OEI
Docket Center, Room B128, EPA West Building, 1301 Constitution
Ave., N.W., Washington, DC
EPA recommends that requesters submit their RFR within 90 days of the EPA decision. If the
RFR is sent after that time, EPA recommends that the requester include an explanation of why
the request should be considered at this time.
8.7 How Does EPA Intend to Process Requests for Reconsideration of EPA Decisions?
• OEI sends the RFR to the appropriate EPA Program Office or Region that has
responsibility for the information in question.
• For approved requests, EPA assigns a steward for the correction who marks the
information as designated for corrections as appropriate, establishes a schedule
for correction, and reports correction resolution to both the tracking system and to
the requestor.
Appendix A
A.I Introduction
EPA's Guidelines are a living document and may be revised as we learn more about how best to
address, ensure, and maximize information quality. In the process of developing these
Guidelines, we actively solicited public input at many stages. While the public was free to
comment on any aspect of the Guidelines, EPA explicitly requested input on key topics such as
influential information, reproducibility, influential risk assessment, information sources, and
error correction.
• An online Public Comment Session was held March 19-22, 2002, as the first draft of the
Guidelines was being developed. EPA received approximately 100 comments.
• A Public Meeting was held on May 15, 2002, after the draft Guidelines were issued.
There were 99 participants, 13 of whom made presentations or commented on one or
more issues.
• A 52-day Public Comment period lasted from May 1 to June 21, 2002, during which
comments could be mailed, faxed, or e-mailed to EPA. EPA received 55 comments during this
period.
• A conference call between EPA and Tribal representatives was held on June 27, 2002.
More detailed information on the public comments is available through an OEI web site, serving
as the home page for the EPA Information Quality Guidelines through the development and
implementation process. Please visit this site at http://www.epa.gov/oei/qualityguidelines.
We have established a public docket for the EPA Information Quality Guidelines under Docket
ID No. OEI-10014. The docket is the collection of materials available for public viewing at the
Information Quality Guidelines Staff, OEI Docket Center, Room B128, EPA West Building,
1301 Constitution Ave., N.W., Washington, DC, phone number 202-566-0284. This docket
consists of a copy of the Guidelines, public comments received, and other information related to
the Guidelines. The docket is open from 12:00 PM to 4:00 PM, Monday through Friday,
excluding legal holidays. An index of docket contents will be available at
http://www.epa.gov/oei/qualityguidelines.
Appendix 36
Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by EPA
During the various public comment opportunities, EPA received input from a diverse set of
organizations and private citizens. Comments came from many of EPA's stakeholders - the
regulated community and many interest groups who we hear from frequently during the
management of EPA's Programs to protect the nation's land, air, water, and public health.
Government agencies at the Federal, State, Tribal, and local level also commented on the
Guidelines. OMB sent comments to every Federal agency and EPA received comments from two
members of Congress. Beyond our government colleagues, the private sector voiced many
concerns and helpful recommendations for these Guidelines. We would like to take this
opportunity to thank all commenters for providing their input on these Guidelines. Due to the
tight time frame for this project, this discussion of public comments generally describes the
major categories of comments and highlights some significant comments, but does not contain
an individual response to each public comment.
Comments received by EPA during the public comment period reflect a diversity of views
regarding EPA's approach to developing draft Guidelines as well as the general concept of
information quality. Some commenters included detailed review of all Guidelines sections, while
others chose to address only specific topics. In some cases, commenters provided examples to
demonstrate how current EPA procedures may not ensure adequate information quality for a
specific application. Commenters provided general observations such as stating that these
Guidelines did not sufficiently address EPA's information quality problems. Some commenters
offered that the Guidelines relied too much on existing policies. Interpretations of the intent of
the Data Quality Act were offered by some commenters. One comment noted that improvement
of data quality is not necessarily an end in and of itself. Another comment was that the goal of
Guidelines should be more to improve quality, not end uncertainty. Public interest and
environmental groups voiced concern over what they believed was an attempt by various groups
to undermine EPA's ability to act in a timely fashion to protect the environment and public
health. Some commenters stated that the directives of the Data Quality Act and OMB cannot
override EPA's mission to protect human health and the environment per the statutory mandates
under which it operates.
EPA was congratulated for the effort and, in some cases, encouraged to go even further in
addressing information quality. Some commenters encouraged EPA to provide additional
process details, provide more detailed definitions, augment existing policies that promote
transparency, and share more information about the limitations of EPA disseminated
information. In one case, EPA was encouraged to develop a rating scheme for its disseminated
information.
This section discusses public comments and our responses to many of the important questions
and issues raised in the comments. First, we provide responses to some overarching comments
we received from many commenters, then we provide a discussion of public comments that were
received on specific topics addressed in the draft Guidelines.
• Tone: Commenters criticized the "defensive tone", "legalistic tone", and the lack
of detail afforded in the Guidelines. Some commenters said that it was not clear
what the Guidelines were explaining, or how they might apply to various types of
information. We understand and agree with many of these criticisms and have
made attempts to better communicate the purpose, applicability, and content of
these Guidelines.
Many commenters told us that we rely excessively on existing EPA information quality policies.
Commenters provided specific examples of areas they believed were demonstrative of our lack
of commitment to or uneven implementation of our existing policies. Some commenters also
pointed out that there are key areas in which we lack policies to address quality and, as a result,
the Guidelines should address such issues in more detail. Some commenters also noted that EPA
itself has highlighted lessons learned with existing approaches to information product
development.
The concept of peer review is considered in three Guidelines sections: (1) Application of the
Agency's Peer Review Policy language for "major scientific and technical work products and
economic analysis used in decision making" as a class of information that can be considered
"influential" for purposes of the Guidelines; (2) Use of "peer-reviewed science" as a component
of some risk assessments; and (3) Use of the Agency's Peer Review Policy as one of the
Agency-wide processes to ensure the quality, objectivity, and transparency of "influential"
scientific, financial, and statistical information under the Guidelines.
We received a number of comments on section 1.1 (What is the Purpose of these Guidelines?) of
the draft Guidelines. Some commenters argued that the Guidelines should be binding on EPA,
that they are legislative rules rather than guidance, or that the Guidelines must be followed
unless we make a specific determination to the contrary. Others argued that the Guidelines
should not be binding or that we should include an explicit statement that the Guidelines do not
alter substantive agency mandates. Some suggested that our statements retaining discretion to
differ from the Guidelines sent a signal that EPA was not serious about information quality.
With respect to the nature of these Guidelines, Section 515 specifies that agencies are to issue
"guidelines." As directed by OMB's guidelines, we have issued our own guidelines containing
nonbinding policy and procedural guidance. We see no indication in either the language or
general structure of Section 515 that Congress intended EPA's guidelines to be binding rules.
We revised this section (now section 1 in this revised draft) by adding a fuller explanation of
how EPA intends to ensure the quality of information it disseminates. This section includes
language explaining the nature of our Guidelines as policy and procedural guidance. This
language is intended to give clear notice of the nonbinding legal effect of the Guidelines. It
33 http://epa.gov/osp/spc/perevmem.htm
notifies EPA staff and the public that the document is guidance rather than a substantive rule and
explains how such guidance should be implemented. Although we believe these Guidelines
would not be judicially reviewable, we agree that a statement to this effect is unnecessary and
have deleted it. In response to comments that EPA clarify that the Guidelines do not alter
existing legal requirements, we have made that change. In light of that change, we think it is
clear that decisions in particular cases will be made based on applicable statutes, regulations, and
requirements, and have deleted other text in the paragraph that essentially repeated that point.
Elsewhere in the document, EPA has made revisions to be consistent with its status as guidance.
Some commenters argued that all EPA disseminated information should be covered by the
Guidelines and that we lack authority to "exempt" information from the Guidelines. Others
thought that the coverage in EPA's draft was appropriate. EPA does not view its Guidelines as
establishing a fixed definition and then providing "exemptions." Rather, our Guidelines explain
when a distribution of information generally would or would not be considered disseminated to
the public for purposes of the Guidelines. As we respond to complaints and gain experience in
implementing these Guidelines, we may identify other instances where information is or is not
considered disseminated for the purposes of the Guidelines.
Some commenters cited the Paperwork Reduction Act (PRA), 44 U.S.C. 3501 et seq., to support
their argument that the Guidelines should cover all information EPA makes public. EPA's
Guidelines are issued under Section 515 of the Treasury and General Government
Appropriations Act for Fiscal Year 2001, which directs OMB to issue government-wide
guidelines providing policy and procedural guidance to Federal agencies. In turn, the OMB
guidelines provide direction and guidance to Federal agencies in issuing their own guidelines.
EPA's Guidelines are intended to carry out OMB's policy on information quality. One
commenter cited in particular the term "public information" used in the PRA as evidence of
Congress's intent under Section 515. In EPA's view, this does not show that Congress intended a
specific definition for the key terms, "information" and "disseminated," used in Section 515. In
the absence of evidence of Congressional intent regarding the meaning of the terms used in
Section 515, EPA does not believe the PRA requires a change in EPA's Guidelines.
We agree with commenters who noted that even if a particular distribution of information is not
covered by the Guidelines, the Guidelines would still apply to information disseminated in other
ways. As stated in section 1.4, if information is not initially covered by the Guidelines, a
subsequent distribution of that information will be subject to the Guidelines if EPA adopts,
endorses, or uses it.
Some commenters made specific recommendations about what should and should not be covered
by the Guidelines. In addition to the specific recommendations, some suggested that the "scope
and applicability" section was too long, while others thought it had an appropriate level of detail.
Based on other agencies' guidelines and public comments, EPA has removed much of the detail
from the discussion of Guidelines coverage. These revisions were intended to shorten and
simplify the discussion without changing the general scope of the Guidelines.
Some commenters thought all information from outside sources should be covered by the
Guidelines, even if EPA does not use, rely on, or endorse it. Others wished to clarify the point at
which the Guidelines cover information from outside sources. As noted above, section 1.4 of the
Guidelines explains how subsequent distributions of information in public filings may become
subject to the Guidelines. We continue to think that EPA's own public filings before other
agencies should not generally be covered by the Guidelines as long as EPA does not
simultaneously disseminate them to the public, since use of this information would be subject to
the requirements and policies of the agency to which the information is submitted.
We received a number of comments, including from OMB, arguing that the provision regarding
information related to adjudicative processes was too broad, and that the Guidelines should
cover some or all information related to adjudicative processes, particularly administrative
adjudications. In addition to shortening this section, we have limited this provision to
information in documents prepared specifically for an administrative adjudication. This would
include decisions, orders, findings, and other documents prepared specifically for the
adjudication. As indicated in the Draft Guidelines, our view is that existing standards and
protections in administrative adjudications would generally be adequate to assure the quality of
information in administrative adjudications and to provide an adequate opportunity to contest
decisions on the quality of information. For example, in permitting proceedings, parties may
submit comments on the quality of information EPA prepares for the permit proceeding, and
judicial review is available based on existing statutes and regulations. Narrowing the provision
to information prepared specifically for the adjudication should make clear that the Guidelines
would not generally provide parties with additional avenues of challenge or appeal during
adjudications, but would still apply to a separate distribution of information where EPA adopts,
endorses, or uses the information, such as when EPA disseminates it on the Internet or in a
rulemaking or guidance document. When we intend to adopt information such as models or risk
assessments for use in a class of cases or determinations (e.g., for use in all determinations under
a particular regulatory provision), EPA often disseminates this information separately and in
many instances requests public comment on it. Accordingly, it is not clear that there would be
many instances where persons who are concerned about information prepared specifically for an
adjudication would not have an opportunity to contest the quality of information.
We received many comments on how the Guidelines apply to external parties, the shared quality
responsibilities between EPA and external parties, and specific EPA responsibilities when using
or relying on information collected or compiled by external parties.
EPA roles: Some commenters emphasized that ensuring quality of information at the point of
dissemination is no substitute for vigorous efforts by EPA to receive quality information in the
first place, and therefore for information providers to produce quality information. One
commenter stated that EPA cannot be responsible for all aspects of the quality of the information
we disseminate. In response to this and other comments, we have provided additional language
in these Guidelines on the various roles that EPA assumes in either ensuring the quality of the
information we disseminate or ensuring the integrity of information EPA distributes. One
comment suggested that we mention the role of the National Environmental Information
Exchange Network in ensuring information integrity, which we have done in section 2.4 of the
Guidelines.
Assessment factors: Overall, public input was positive and welcoming of our proposal to
develop assessment factors to evaluate the quality of information generated by third parties. A
few commenters offered their involvement in the development of these factors, their advice on
how to develop such factors, and some examples of what assessment factors we should consider.
EPA staff have provided such comments to the EPA Science Policy Council workgroup that was
charged with developing the assessment factors. EPA welcomes stakeholder input in the
development of these factors and published draft assessment factors for public comment in
September 2002.
Coverage of State Information: Some commenters suggested that our Guidelines must apply to
all information disseminated by EPA, including information submitted to us by States. Whereas
some commenters stressed that the quality of information received by EPA is the responsibility
of the providers, others expressed concern about the potential impact that EPA's Guidelines
could have on States. We believe it is important to differentiate between information that we
generate and data or information generated by external parties, including States. State
information, when submitted to EPA, may not be covered by these Guidelines, but our
subsequent use of the information may in fact be covered. We note, however, that there may be
practical limitations on the type of corrective action that may be taken, since EPA does not
intend to alter information submitted by States. However, EPA does intend to work closely with
our State counterparts to ensure and maximize the quality of information that EPA disseminates.
Furthermore, one commenter stated that if regulatory information is submitted to an authorized
or delegated State program, then the State is the primary custodian of the information and the
Guidelines would not cover that information. We agree with that statement.
We also received comments regarding the use of labels, or disclaimers, to notify the public
whether information is generated by EPA or an external party. We agree that disclaimers and
other notifications should be used to explain the status of information wherever possible, and we
are developing appropriate language and format.
A statement regarding Paperwork Reduction Act clearance submissions has been added in
response to comment by OMB.
Several commenters generally asserted that the definition is too narrow. Other commenters
indicated that under EPA's draft definition, only Economically Significant actions, as defined in
Executive Order 12866, or only Economically Significant actions and information disseminated
in support of top Agency actions, are considered "influential." We disagree. To demonstrate the
broad range of activities covered by our adoption of OMB's definition, we reiterate the
definition below and include an example of each type of action, to illustrate the breadth of our
definition. "Influential," when used in the phrase "influential scientific, financial, or statistical
information," means that the Agency can reasonably determine that dissemination of the
information will have or does have a clear and substantial impact on important public policies or
important private sector decisions. We will generally consider the following classes of
information to be influential: information disseminated in support of top Agency actions;
information disseminated in support of "economically significant" actions; major work products
undergoing peer review; and other disseminated information that will have or does have a clear
and substantial impact (i.e., potential change or impact) on important public policies or
important private sector decisions as determined by EPA on a case-by-case basis. In general,
influential information would be the scientific, financial or statistical information that provides a
substantial basis for EPA's position on key issues in top Agency actions and Economically
Significant actions. If the information provides a substantial basis for EPA's position, EPA
believes it would generally have a clear and substantial impact.
Top Agency actions: An example of a top Agency action is the review of the National
Ambient Air Quality Standards (NAAQS) for Particulate Matter. Under the Clean Air
Act, EPA is to periodically review (1) the latest scientific knowledge about the effects on
public health and public welfare (e.g., the environment) associated with the presence of
such pollutants in the ambient air and (2) the standards, which are based on this science.
The Act further directs that the Administrator shall make any revisions to the standards
as may be appropriate, based on the latest science, that in her judgment are requisite to
protect the public health with an adequate margin of safety and to protect the public
welfare from any known or anticipated adverse effects. The standards establish allowable
levels of the pollutant in the ambient air across the United States, and States must
develop implementation plans to attain the standards. The PM NAAQS were last
revised in 1997, and the next periodic review is now being conducted.
Peer reviewed work products: An example of a major work product undergoing peer
review is the IRIS Documentation: Reference Dose for Methylmercury. Methylmercury
contamination is the basis for fish advisories. It is necessary to determine an intake to
humans that is without appreciable risk in order to devise strategies for decreasing
mercury emissions into the environment. After EPA derived a reference dose (RfD) of
0.0001 mg/kg-day in 1995, industry argued that it was not based on sound science.
Congress ordered EPA to fund a National Research Council/National Academy of
Sciences panel to determine whether our RfD was scientifically justifiable. The panel
concluded that 0.0001 mg/kg-day was an appropriate RfD, based on newer studies
than the 1995 RfD. The information in this document was evaluated, incorporated, and
subjected to comment by the Office of Water, where it contributed in large part to
Chapter 4 of Drinking Water Criteria for the Protection of Human Health:
Methylmercury (EPA-823-R-01-001), January 2001. The peer review mechanism was an
external peer review workshop and public comment session held on November 15, 2000,
accompanied by a public comment period from October 30 to November 29, 2000.
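For readers unfamiliar with how a reference dose is applied, the arithmetic is simple: the RfD (expressed per kilogram of body weight per day) scales with body weight to give an allowable daily intake. The sketch below is purely illustrative and is not part of EPA's guidance; the 70 kg body weight is a conventional assumption, not a value taken from this document.

```python
# Illustrative sketch only: converting a reference dose (RfD) into an
# allowable daily intake for a given body weight. The RfD value matches
# the 1995 methylmercury RfD discussed above; the body weight is a
# hypothetical, conventional adult value.
RFD_MG_PER_KG_DAY = 0.0001  # methylmercury RfD, mg per kg body weight per day

def allowable_daily_intake(body_weight_kg: float) -> float:
    """Return the daily intake (mg/day) corresponding to the RfD."""
    return RFD_MG_PER_KG_DAY * body_weight_kg

# For a hypothetical 70 kg adult: 0.0001 mg/kg-day x 70 kg = 0.007 mg/day
print(allowable_daily_intake(70.0))
```

The point of an RfD is exactly this kind of conversion: it lets risk assessors compare an estimated daily intake against a level considered to be without appreciable risk.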
A number of commenters said that EPA created a limited definition of what types of information
are to be considered "influential," and that we have no rational basis to do so. A number of
commenters also stated that "all Agency information should be considered influential"; that "all
data relied upon by the Agency should meet a high standard of quality regardless of the type";
or that "'influential' information includes information used to support any EPA action, not just
'top' Agency actions." EPA followed OMB's guidelines in establishing a definition for
"influential" information that was not all-encompassing. OMB stated that "the more important
the information, the higher the quality standards to which it should be held, for example, in those
situations involving 'influential scientific, financial or statistical information'...". OMB narrowed
the definition of "influential" in its final guidance as follows:
OMB also amended its definition to say that "each agency is authorized to define 'influential'
in ways appropriate for it given the nature and multiplicity of issues for which the agency is
responsible" (67 FR 8455). We adopted OMB's "influential" definition. Once the Agency
reviewed the wide range of information disseminated to the public, such as major rulemakings,
risk assessments, rule related guidance, health advisories, annual reports, fact sheets, and
coloring books, it became apparent that there were reasons to distinguish between "influential"
information and other information. EPA adopted OMB's definition for "influential" and used
types of information the Agency disseminates to further explain what information is included.
Another commenter suggested that EPA should not indicate whether disseminated information is
"influential" when it is first disseminated but should wait to designate information as
"influential" until either an information correction request is made or a final agency action is
taken. We intend to consider this point, as well as other comments made about when
disseminated information becomes influential, as the Agency implements the Guidelines.
One commenter suggests that the definition of the term "influential" should be more narrow.
Specifically, the commenter states the following:
EPA agrees with the commenter that there are significant costs associated with ensuring that
information disseminated by the Agency is of high quality. Consequently, EPA chose a
definition of the term "influential" to cover information that, when disseminated, will result in a
clear and substantial impact on important public policies and private sector decisions. We
believe that this definition balances the costs associated with implementing the Guidelines, the
need to ensure high quality information, and the Agency's mission to protect human health and
safeguard the natural environment.
Several commenters indicated that it is inappropriate for EPA to base its definition of
"influential" on categories of actions. They suggest that the definition be based instead on the
content of the information. We consider our definition to be based on information content, given
that those categories of disseminated information we defined as influential are those that EPA
can reasonably determine will or do have a clear and substantial impact on important public
policies or private sector decisions. We note here that, in addition to the specific classes of
disseminated information we have defined as "influential," EPA has reiterated the
"case-by-case" portion of the OMB "influential" definition. This general provision is intended to
capture disseminated information, based on its content, that would not otherwise rise to the level
of "influential" under the other parts of our definition (i.e., top Agency actions, Economically
Significant actions, major peer reviewed products).
Several commenters assert that EPA should categorically state that certain specific types of
disseminated information products are influential, and that we should categorically state that
certain specific types of disseminated information products are not influential. Given the vast
array of information disseminated by the Agency, and given the fact that certain information
may have a clear and substantial impact on important public policies or private sector decisions
at one time, but not have such an impact later on (and vice versa), classifying types of
information as "influential" or otherwise upfront is difficult and could be misleading. We intend
to rely on our definition in determining whether specific types of disseminated information
products are to be considered "influential" for purposes of the Guidelines.
A.3.5 Reproducibility
Some commenters stated that there needs to be more clarity in the definition of "reproducibility"
and related concepts. We have tried to provide definitions that are consistent with OMB
guidelines. Also, our Guidelines now include that EPA intends to ensure reproducibility for
disseminated original and supporting data according to commonly accepted scientific, financial,
or statistical standards. Many commenters thought there should be some kind of method to
consider reproducibility when proprietary models, methods, designs, and data are used in a
dissemination. Some commenters discourage all use of proprietary models; others suggest
proprietary model use be minimized with application limited to situations in which it is
absolutely necessary. We understand this concern, but note that there are other factors that are
appropriately considered when deciding whether to use proprietary models, including feasibility
and cost considerations (e.g., it may be more cost-effective for the Agency to use a proprietary
model in some situations than to develop its own model). In cases where the Agency relies on
proprietary models, these model applications are still subject to our Peer Review Policy. Further,
as recently directed by the Administrator, the Agency's Council on Regulatory Environmental
Modeling is now revitalizing its development of principles for evaluating the use of
environmental models with regard to model validation and certification issues, building on
current good modeling practices. In addition, these Guidelines provide for the use of especially
rigorous "robustness checks" and documentation of what checks were undertaken. These steps,
along with transparency about the sources of data used, various assumptions employed, analytic
methods applied, and statistical procedures employed should assure that analytic results are
"capable of being substantially reproduced."
Regarding robustness checks, commenters were concerned that the EPA did not use the term
"especially rigorous robustness checks." We have modified our Guidelines to include this term.
Some commenters speculated on the ability of the Agency's Peer Review program to meet the
intent of the Guidelines and were concerned about the process to rebut a peer review used to
support the objectivity demonstration for disseminated information. Our Peer Review program
has been subject to external review and we routinely verify implementation of the program.
Affected persons wishing to rebut a formal peer review may do so using the complaint resolution
process in these Guidelines, provided that the information being questioned is considered to be
"disseminated" according to the Guidelines.
Regarding analytic results, some commenters indicated that the transparency factors identified
by EPA (section 6.3 of the Guidelines) are not a complete list of the items that would be needed
to demonstrate a higher degree of quality for influential information. EPA agreed with the list of
four items that was initially provided by the OMB and recognizes that, in some cases, additional
information regarding disseminated information would facilitate increased quality. However,
given the variety of information disseminated by the Agency, we cannot reasonably provide
additional details for such a demonstration at this time. Also, in regards to laboratory results,
which were mentioned by several commenters, these Guidelines are not the appropriate place to
set out for the science community EPA's view of what constitutes adequate demonstration of test
method validation or minimum quality assurance and quality control. Those technical
considerations should be addressed in the appropriate quality planning documentation or in
regulatory requirements.
EPA has developed general language addressing the concept of reproducibility and may provide
more detail after appropriate consultation with scientific and technical communities, as called for
by OMB in its guidelines. We have already begun to consult relevant scientific and technical
experts within the Agency, and also have planned an expedited consultation with EPA's Science
Advisory Board (SAB) on October 1, 2002. Based on these initial consultations, EPA may seek
additional input from the SAB in 2003. These consultations will allow EPA to constructively and
appropriately refine the application of existing policies and procedures, to further improve
reproducibility. In the interim, EPA intends to base the reproducibility of disseminated original
and supporting data on commonly accepted scientific, financial, or statistical standards.
Risk assessments may be performed iteratively, with the first iteration employing protective
(conservative) assumptions to identify possible risks. Only if potential risks are identified in a
screening level assessment is it necessary to pursue a more refined, data-intensive risk
assessment. The screening level assessments may not result in "central estimates" of risk or
upper and lower-bounds of risks. Nevertheless, such assessments may be useful in making
regulatory decisions, as when the absence of concern from a screening level assessment is used
(along with other information) to approve the new use of a pesticide or chemical or to decide
whether to remediate very low levels of waste contamination.
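The screening logic described above can be illustrated with a hazard quotient, a common screening metric: a conservative exposure estimate divided by a reference dose, with only a quotient at or above 1 triggering a refined assessment. This sketch is illustrative only, not a method prescribed by these Guidelines, and all numeric values in it are hypothetical.

```python
# Illustrative sketch only: a hazard-quotient screen. A conservative
# (protective) exposure estimate is compared against a reference dose;
# if even the conservative estimate stays below the RfD (HQ < 1), the
# screen supports not pursuing a refined, data-intensive assessment.

def hazard_quotient(exposure_mg_per_kg_day: float,
                    rfd_mg_per_kg_day: float) -> float:
    """Ratio of a conservative exposure estimate to the reference dose."""
    return exposure_mg_per_kg_day / rfd_mg_per_kg_day

def needs_refined_assessment(hq: float) -> bool:
    """A refined assessment is pursued only when the screen flags risk."""
    return hq >= 1.0

# Hypothetical conservative exposure of 0.00005 mg/kg-day screened
# against a hypothetical RfD of 0.0001 mg/kg-day gives HQ = 0.5,
# so no refined assessment is triggered.
hq = hazard_quotient(0.00005, 0.0001)
print(hq, needs_refined_assessment(hq))
```

This mirrors the iterative structure in the text: the conservative first pass is deliberately biased toward flagging risk, so a "pass" result is informative even without central estimates or bounds.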
OMB Guidelines
In its guidelines OMB stated that, with respect to influential information regarding health, safety
or environmental risk assessments, agencies should either adopt or adapt the quality principles in
the Safe Drinking Water Act (SDWA) Amendments of 1996.34,35 In the background section of
the OMB guidelines, OMB explains that "the word 'adapt' is intended to provide agencies
flexibility in applying these principles to various types of risk assessment."
EPA carefully and practically developed the adaptation of the SDWA quality principles using
our considerable experience conducting human health and ecological36 risk assessments as well
as using our existing policies and guidance.
EPA conducts many risk assessments every year. Some of these are screening level assessments
based on scientific experts' judgments using conservative assumptions and available data, and
can involve human health, safety, or environmental risks. Such screening assessments
provide useful information that is sufficient for regulatory purposes in instances where more
elaborate, quantitative assessments are unnecessary. For example, such assessments could
indicate that, even with conservative assumptions, the level of risk does not warrant further
investigation. Other risk assessments are more detailed and quantitative and are based on
research and supporting data that are generated outside EPA. For example, pesticide reviews are
based on scientific studies conducted by registrants in accordance with our regulations and
guidance documents. Our test guidelines and Good Laboratory Practices (GLPs)37 describe
sound scientific practices for conducting studies needed to assess human and environmental
hazards and exposures. Such studies are not required to be peer-reviewed. Risk assessments
based on these studies can include occupational, dietary, and environmental exposures.
34 Safe Drinking Water Act Amendments of 1996, 42 U.S.C. 300g-1(b)(3)(A) & (B).
35 In section III.3.ii.C. of its guidelines, OMB states that: "With regard to analysis of risks to human health,
safety and the environment maintained or disseminated by the agencies, agencies shall either adopt or adapt the
quality principles applied by Congress to risk information used and disseminated pursuant to the Safe Drinking
Water Act Amendments of 1996 (42 U.S.C. 300g-1(b)(3)(A) & (B)). Agencies responsible for dissemination of vital
health and medical information shall interpret the reproducibility and peer-review standards in a manner appropriate
to assuring the timely flow of vital information from agencies to medical providers, patients, health agencies, and the
public. Information quality standards may be waived temporarily by agencies under urgent situations (e.g., imminent
threats to public health or homeland security) in accordance with the latitude specified in agency-specific
guidelines."
36 Because the assessment of "environmental risk" is being distinguished in OMB's adaptation of the
SDWA quality principles from "human health risk," the term "environmental risk" as used in these Guidelines does
not directly involve human health concerns. In other words, "environmental risk assessment" is, in this case,
equivalent to what EPA commonly refers to as "ecological risk assessment."
37 40 CFR part 160 for FIFRA and 40 CFR part 792 for TSCA.
These risk assessments are conducted, and their results presented, to policy makers to inform
their risk management decisions. EPA currently has numerous policies that provide guidance to
internal risk assessors on how to conduct a risk assessment and characterize risk. The EPA Risk
Characterization Policy38 and associated guidelines are designed to ensure that critical
information from each stage of a risk assessment is used in forming conclusions about risk and
that this information is communicated from risk assessors to policy makers.
Current EPA guidance and policies incorporate quality principles. These are designed to ensure
that critical information from each stage of a risk assessment is used in forming conclusions
about risk and that this information is communicated from risk assessors to policy makers. One
example is the EPA Risk Characterization Policy,39 which provides a single, centralized body of
risk characterization implementation guidance to help EPA risk assessors and risk managers
make the risk characterization process transparent and risk characterization products clear,
consistent and reasonable (TCCR). These principles have been included in other Agency risk
assessment guidance, such as the Guidelines for Ecological Risk Assessment.40 Other examples
of major, overarching guidelines for risk assessments include: Guidelines For Exposure
Assessment,41 Guidelines For Neurotoxicity Risk Assessment,42 and Guidelines For Reproductive
Toxicity Risk Assessment.43 Each of these documents has undergone external scientific peer
review as well as public comment prior to publication. Additionally, individual EPA offices have
developed more specific risk assessment policies to meet the particular needs of the programs
and statutes under which they operate.44 EPA's commitment to sound science is evidenced by our
ongoing efforts to develop and continually improve Agency guidance for risk assessment.
38 http://www.epa.gov/OSP/spc/rcpolicy.htm
39 Ibid.
40 US EPA (1998). Guidelines for Ecological Risk Assessment. Federal Register 63(93):26846-26924.
http://www.epa.gov/ncea/raf/.
42 US EPA (1998). Guidelines For Neurotoxicity Risk Assessment. Federal Register 63(93):26926-26954.
http://www.epa.gov/ncea/raf/.
43 US EPA (1996). Guidelines For Reproductive Toxicity Risk Assessment. Federal Register 61(212):56274-
56322. http://www.epa.gov/ncea/raf/.
44 The Office of Solid Waste and Emergency Response has developed Tools for Ecological Risk
Assessment for Superfund Risk Assessment. One example is the Ecological Risk Assessment Guidance for
Superfund: Process for Designing and Conducting Ecological Risk Assessments - Interim Final.
http://www.epa.gov/oerrpage/superfund/programs/risk/ecorisk/ecorisk.htm
http://www.epa.gov/oerrpage/superfund/programs/risk/tooleco.htm
The first EPA human health risk assessment guidelines45 were issued in 1986. In 1992, the
Agency produced a Framework for Ecological Risk Assessment,46 which was replaced by the
1998 Ecological Risk Assessment Guidelines.47 As emphasized elsewhere in this document, the
statutes administered by EPA are diverse. Although the majority of risk assessments conducted
within the Agency are for chemical stressors, we also assess risks from biological and physical
stressors. In addition to risk assessment guidelines, both the EPA Science Policy Council and the
EPA Risk Assessment Forum have coordinated efforts to address the complex issues related to
data collection and analysis for hazard and exposure assessments. Thus, the Agency has
considerable experience in conducting both screening level and in-depth assessments for a wide
array of stressors.
Most environmental statutes obligate EPA to act to prevent adverse environmental and human
health impacts. For many of the risks that we must address, data are sparse and consensus about
assumptions is rare. In the context of data quality, we seek to strike a balance among fairness,
accuracy, and efficient implementation. Refusing to act until data quality improves can result in
substantial harm to human health, safety, and the environment.
Public Comments
We received a range of public and stakeholder comments on the adaptation of the SDWA
principles for "influential" human health, safety, and environmental risk assessments that are
disseminated by EPA. Some commenters stated that we should adopt the SDWA quality
principles for human health, safety, and environmental risk assessments. Many commenters
sought clarification on the reasons for EPA's adaptation of the SDWA quality principles for human
health risk assessments and additional information on how we plan to address this process.
Others urged us to adapt the SDWA principles rather than adopt them, because certain elements in
the SDWA principles may not be applicable to all risk assessments, such as a "central
estimate of human risk for the specific populations affected." Others stated that we should
neither adapt nor adopt the SDWA principles because the "Data Quality Act" does not authorize
importing decisional criteria into statutory provisions where they do not apply; the decisional
criteria set forth in SDWA are expressly limited to SDWA. We also received comments at a
level of detail that is more appropriate for implementation of the Guidelines than for the
formulation of the Guidelines. These include comments regarding the use of clinical human test
data, and comments regarding the use of particular types of assumptions in risk assessments. To
the extent that an affected person believes that our use of data or assumptions in a particular
46 Framework For Ecological Risk Assessment, U.S. EPA, Risk Assessment Forum, 1992, EPA/630/R-
92/001.
47 Guidelines For Ecological Risk Assessment, U.S. EPA, Risk Assessment Forum, 1998. EPA/630/R-
95/002F. http://cfpub.epa.gov/ncea/cfm/ecorsk.cfm
dissemination of information is inconsistent with these Guidelines, the issue can be raised at that
time.
A few commenters raised a question regarding a conflict between EPA's existing policies and
the SDWA principles and asked us to identify the specific conflicting risk assessment standards
and make every effort to reconcile the conflicting standards with the SDWA principles. A few
commenters stated that EPA should not have two separate standards for risk assessments (i.e.,
one for influential and one for non-influential), but that all risk assessments should be considered
influential. Another stated that if there is a conflict between existing policies and the SDWA
principles, EPA should identify the specific conflicting risk assessment standards and make
every effort to reconcile the conflicting standards with the SDWA principles. Some commenters
have questioned why the "best available, peer reviewed science and supporting studies"
language of SDWA was conditioned by terms such as "to the extent practicable" or "as
appropriate."
Public comments received by the Agency on the draft Guidelines were widely divergent. As no
obvious consensus could be drawn, we carefully considered comments and arguments on
adoption and adaptation. We also reviewed our experience with the SDWA principles, existing
policies, and the applicability and appropriateness of the SDWA language with regard to the
variety of risk assessments that we conduct, and have determined that, to best meet the statutory
obligations of the many statutes EPA implements, it remains most appropriate to adapt the
SDWA principles to human health, safety, and environmental risk assessments.
In response to public comments we have removed "as appropriate" from these Guidelines in our
SDWA adaptation. EPA agrees that the phrase peer reviewed science "as appropriate" was
unclear. We revised this statement in part (A) to "including, when available, peer-reviewed
science and supporting studies." EPA introduced such adaptations in order to accommodate the
range of real-world situations we address in the implementation of our diverse programs.
Numerous commenters expressed that EPA did not provide adequate clarification of how we
adapted the principles and what our thinking was on each adaptation. In these Guidelines we
have provided detailed clarifications regarding each adaptation made to the original SDWA
language and other remarks regarding our intent during the implementation of the SDWA
adaptation for influential disseminations by EPA. We direct readers to the Guidelines text for
such clarifications.
A few commenters noted that EPA should outline how an affected person would rebut the
presumption of objectivity afforded by peer review. EPA believes this determination would be
made on a case-by-case basis considering the circumstances of a particular peer review and has
decided not to provide specific suggestions for affected persons on how to rebut the presumption
of objectivity afforded by a peer review.
OMB and other commenters noted that agencies' guidelines needed to make clear that a request
for correction can be filed if an affected person believes that information does not comply with
the EPA Guidelines and the OMB guidelines. EPA has added language in the EPA Guidelines to
make sure this is more clear to readers.
EPA received numerous comments on the EPA definition of affected persons. In the draft
Guidelines, EPA had adopted OMB's definition. EPA agrees with comments suggesting that,
instead of elaborating on the definition of "affected person," a more open approach would be to
ask complainants to describe how they are an affected person with respect to the information that
is the subject of their complaint. EPA is asking that persons submitting requests for correction
provide, among other things, such an explanation. EPA has revised the Guidelines accordingly,
so that we may consider this information along with other information in the complaint in
deciding on how to respond.
Some commenters noted that the EPA Guidelines do not state how the process will work,
specifically, for States, municipalities, and EPA. They expressed concern of being "caught in the
middle," so to speak, on trying to get their own information corrected. EPA does not believe that
the Guidelines needed greater details on how States will work with EPA to address complaints,
but intends to work closely with States to better ensure timely correction. EPA does appreciate
the frustration of an information owner in seeing what they deem "incorrect" information in a
disseminated document or web site. However, EPA notes that this is a very complex issue that
cannot be addressed with general language in the Guidelines for all cases.
Several comments indicated that EPA appears to have given itself "carte blanche" authority to
"elect not to correct" information. The commenters stated that there was no valid reason why
EPA would opt out of correcting information and that all errors should be corrected. To the
contrary, EPA, like every Federal agency, wants to correct wrong information. The issue is not as
simple as the correction of an improper zip code or phone number on the EPA web site. Even
these simple errors may be very complex if correcting them would involve changing data in an
EPA and/or State database. Furthermore, EPA is not certain of the volume of complaints it will receive after
October 1 and therefore needed to provide a general provision in the Guidelines to recognize that
once EPA approves a request, the corrective action may vary depending on the circumstances.
On a case-by-case basis, EPA will determine the appropriate corrective action for each
complaint. EPA determined that this was the most reasonable approach. The revision also
recognizes practical limitations on corrective action for information from outside sources.
Several commenters noted that EPA needs to establish time frames for the complaint process.
Commenters stated that EPA should establish time frames for when affected persons can submit
a complaint on an information product, when EPA needs to respond to affected persons with a
decision on discrete, factual errors, when EPA would respond to affected persons with a decision
on more complex or broader interpretive issues, and when an affected person should submit a
request for reconsideration. One commenter suggested that EPA solicit all complaints at one
time during a 6-month window or another time frame. EPA notes that commenters provided
helpful examples and well thought out proposals for such a suite of time frames and appreciates
the public input.
EPA did not agree on the need to develop two separate time frames for complaints that are more
factual in nature versus those that are more complex. One commenter suggests a 15-day time
line for discrete factual errors and a 45-day time line for all other complaints. Another
commenter recommended 30 days for factual errors and 60 days for all other complaints.
Another commenter advised EPA to model this complaint process according to the FOIA
process. This commenter also suggested a 3-week time line for more numeric corrections and 60
days for "broader interpretive issues or technical questions." While EPA appreciates the value of
these approaches, they might be problematic to implement. However, as EPA learns more about
the nature of this complaint process following some period of implementation, these suggested
approaches could be revisited.
EPA also agreed with commenters that a window of opportunity for commenters to submit a
request for reconsideration made sense. EPA has advised affected persons in these Guidelines to
submit a request for reconsideration within 90 days of the initial complaint decision by EPA.
Some commenters asked that EPA establish time lines for when EPA would take corrective
action. EPA does not anticipate that there would be any value in applying a specific time frame
for this action and prefers to look at each complaint and appropriate corrective action on a
case-by-case basis, as discussed above.
Commenters suggested that 45 days was a reasonable time frame for EPA to get back to the
affected person with either a decision or a notice that EPA needs more time. One group noted
that HHS, SSA, and NRC adopted the 45-day window. EPA disagreed with this approach and
instead opted for a 90-day time frame similar to the DOT Guidelines.
EPA received many comments on how EPA should structure its internal processes for the
complaint resolution process. Several comments specifically discussed the role that OEI should
play in the initial complaint and the requests for reconsideration. EPA does not agree that OEI
should be the arbiter on all requests for reconsideration, but does view the role of OEI in the
process as an important one. Namely, OEI may work to help ensure consistent responses to
complaints and requests for reconsideration. Other comments recommending specific internal
implementation processes are being considered as EPA designs the correction and request for
reconsideration administrative processes in greater detail.
Many commenters argued that Assistant Administrators and Regional Administrators should not
decide requests for reconsideration because they would be biased or would have a conflict of
interest when deciding complaints regarding information disseminated by their own Offices or
programs, or if they had to reconsider decisions made by their own staffs. EPA does not agree.
This type of decision making is within the delegated decision making authority of EPA's
officials, and these decisions should be presumed to be unbiased absent a specific showing that a
decision maker is not impartial in a particular case. EPA does agree with commenters who noted
that it is important to make consistent decisions on cross-cutting information quality issues. In
order to achieve appropriate consistency of response to affected persons on requests for
reconsideration and to ensure that cross-cutting information quality issues are considered across
the Agency at a senior level, EPA intends for an executive panel to make the final decisions on
all requests for reconsideration. Furthermore, we felt it important to add greater detail on the
time frame within which EPA would respond to a requestor on their request for reconsideration.
We have added that it is EPA's goal to respond to requesters regarding requests for
reconsideration within 90 days.
EPA received many recommendations in public comments to include the public in the EPA
complaint process. Specifically, commenters requested that EPA notify the public about all
pending requests to modify information and one commenter stated that EPA should allow the
public to comment on information corrections requests for information that are considered
"central to a rulemaking or other Final Agency Action" before EPA accepts or rejects the request.
As a general matter, EPA does not intend to solicit public comment on how EPA should respond
to requests for correction or reconsideration. EPA also does not intend to post requests for
correction and requests for reconsideration on the EPA web site, but we plan to revisit this and
many other aspects of the Guidelines within one year of implementation.
EPA also received many comments on how information that is currently being reviewed by EPA
in response to a complaint appears to the public on the EPA web site or some other medium.
Some commenters recommended the use of flags for all information that has a complaint pending
with a note that where appropriate, challenged information will be pulled from dissemination and
removed from EPA's web site. Other commenters stated that the information in question should
be removed from public access until the resolution process has been completed. Still other
commenters requested that EPA not embark on self-censorship. As a general rule, EPA has
decided not to flag information that has a complaint pending. EPA believes that information that
is the subject of a pending complaint should not necessarily be removed from public access based
solely on the receipt of a request for correction.
EPA is actively developing new policies and procedures, as appropriate, to improve the quality of
information disseminated to the public. Some activities specifically support ensuring and
maximizing the quality, objectivity, utility, and integrity of information. For instance, we are
consulting with the scientific community on the subject of reproducibility. The EPA Science
Advisory Board (SAB) is performing an expedited consultation on the subject on October 1,
2002. Based on this initial consultation, EPA and the SAB may consider a full review of
reproducibility and related information quality concepts in 2003. Furthermore, as noted earlier,
the EPA Science Policy Council has commissioned a workgroup to develop assessment factors
for consideration in assessing information that EPA collects or that is voluntarily submitted in
support of various Agency decisions.
As new processes, policies, and procedures are considered and adopted into Agency operations,
we will consider their relationship to the Guidelines and determine the extent to which the
Guidelines may need to change to accommodate new activity.
OFFICE OF ENVIRONMENTAL INFORMATION
www.epa.gov/oei
Friday, February 22, 2002

Part IX

Office of Management and Budget

Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies; Notice; Republication
8452 Federal Register / Vol. 67, No. 36 / Friday, February 22, 2002 / Notices

OFFICE OF MANAGEMENT AND BUDGET

Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies; Republication

Editorial Note: Due to numerous errors, this document is being reprinted in its entirety. It was originally printed in the Federal Register on Thursday, January 3, 2002 at 67 FR 369-378 and was corrected on Tuesday, February 5, 2002 at 67 FR 5365.

AGENCY: Office of Management and Budget, Executive Office of the President.

ACTION: Final guidelines.

SUMMARY: These final guidelines implement section 515 of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658). Section 515 directs the Office of Management and Budget (OMB) to issue government-wide guidelines that "provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies." By October 1, 2002, agencies must issue their own implementing guidelines that include "administrative mechanisms allowing affected persons to seek and obtain correction of information maintained and disseminated by the agency" that does not comply with the OMB guidelines. These final guidelines also reflect the changes OMB made to the guidelines issued September 28, 2001, as a result of receiving additional comment on the "capable of being substantially reproduced" standard (paragraphs V.3.B, V.9, and V.10), which OMB previously issued on September 28, 2001, on an interim final basis.

DATES: Effective Date: January 3, 2002.

FOR FURTHER INFORMATION CONTACT: Brooke J. Dickson, Office of Information and Regulatory Affairs, Office of Management and Budget, Washington, DC 20503. Telephone (202) 395-3785 or by e-mail to informationquality@omb.eop.gov.

SUPPLEMENTARY INFORMATION: In section 515(a) of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658), Congress directed the Office of Management and Budget (OMB) to issue, by September 30, 2001, government-wide guidelines that "provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies * * *." Section 515(b) goes on to state that the OMB guidelines shall:

"(1) apply to the sharing by Federal agencies of, and access to, information disseminated by Federal agencies; and

"(2) require that each Federal agency to which the guidelines apply

"(A) issue guidelines ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by the agency, by not later than 1 year after the date of issuance of the guidelines under subsection (a);

"(B) establish administrative mechanisms allowing affected persons to seek and obtain correction of information maintained and disseminated by the agency that does not comply with the guidelines issued under subsection (a); and

"(C) report periodically to the Director

"(i) the number and nature of complaints received by the agency regarding the accuracy of information disseminated by the agency; and

"(ii) how such complaints were handled by the agency."

Proposed guidelines were published in the Federal Register on June 28, 2001 (66 FR 34489). Final guidelines were published in the Federal Register on September 28, 2001 (66 FR 49718). The Supplementary Information to the final guidelines published in September 2001 provides background, the underlying principles OMB followed in issuing the final guidelines, and statements of intent concerning detailed provisions in the final guidelines.

In the final guidelines published in September 2001, OMB also requested additional comment on the "capable of being substantially reproduced" standard and the related definition of "influential scientific or statistical information" (paragraphs V.3.B, V.9, and V.10), which were issued on an interim final basis. The final guidelines published today discuss the public comments OMB received, the OMB response, and amendments to the final guidelines published in September 2001.

In developing agency-specific guidelines, agencies should refer both to the Supplementary Information to the final guidelines published in the Federal Register on September 28, 2001 (66 FR 49718), and also to the Supplementary Information published today. We stress that the three "Underlying Principles" that OMB followed in drafting the guidelines that we published on September 28, 2001 (66 FR 49719), are also applicable to the amended guidelines that we publish today.

In accordance with section 515, OMB has designed the guidelines to help agencies ensure and maximize the quality, utility, objectivity and integrity of the information that they disseminate (meaning to share with, or give access to, the public). It is crucial that information Federal agencies disseminate meets these guidelines. In this respect, the fact that the Internet enables agencies to communicate information quickly and easily to a wide audience not only offers great benefits to society, but also increases the potential harm that can result from the dissemination of information that does not meet basic information quality guidelines. Recognizing the wide variety of information Federal agencies disseminate and the wide variety of dissemination practices that agencies have, OMB developed the guidelines with several principles in mind.

First, OMB designed the guidelines to apply to a wide variety of government information dissemination activities that may range in importance and scope. OMB also designed the guidelines to be generic enough to fit all media, be they printed, electronic, or in other form. OMB sought to avoid the problems that would be inherent in developing detailed, prescriptive, "one-size-fits-all" government-wide guidelines that would artificially require different types of dissemination activities to be treated in the same manner. Through this flexibility, each agency will be able to incorporate the requirements of these OMB guidelines into the agency's own information resource management and administrative practices.

Second, OMB designed the guidelines so that agencies will meet basic information quality standards. Given the administrative mechanisms required by section 515 as well as the standards set forth in the Paperwork Reduction Act, it is clear that agencies should not disseminate substantive information that does not meet a basic level of quality. We recognize that some government information may need to meet higher or more specific information quality standards than those that would apply to other types of government information. The more important the information, the higher the quality standards to which it should be held, for example, in those situations involving "influential scientific, financial, or statistical information" (a phrase defined in these guidelines). The guidelines recognize, however, that
information quality comes at a cost. Accordingly, the agencies should weigh the costs (for example, including costs attributable to agency processing effort, respondent burden, maintenance of needed privacy, and assurances of suitable confidentiality) and the benefits of higher information quality in the development of information, and the level of quality to which the information disseminated will be held.

Third, OMB designed the guidelines so that agencies can apply them in a common-sense and workable manner. It is important that these guidelines do not impose unnecessary administrative burdens that would inhibit agencies from continuing to take advantage of the Internet and other technologies to disseminate information that can be of great benefit and value to the public. In this regard, OMB encourages agencies to incorporate the standards and procedures required by these guidelines into their existing information resources management and administrative practices rather than create new and potentially duplicative or contradictory processes. The primary example of this is that the guidelines recognize that, in accordance with OMB Circular A-130, agencies already have in place well-established information quality standards and administrative mechanisms that allow persons to seek and obtain correction of information that is maintained and disseminated by the agency. Under the OMB guidelines, agencies need only ensure that their own guidelines are consistent with these OMB guidelines, and then ensure that their administrative mechanisms satisfy the standards and procedural requirements in the new agency guidelines. Similarly, agencies may rely on their implementation of the Federal Government's computer security laws (formerly, the Computer Security Act, and now the computer security provisions of the Paperwork Reduction Act) to establish appropriate security safeguards for ensuring the "integrity" of the information that the agencies disseminate.

In addition, in response to concerns expressed by some of the agencies, we want to emphasize that OMB recognizes that Federal agencies provide a wide variety of data and information. Accordingly, OMB understands that the guidelines discussed below cannot be implemented in the same way by each agency. In some cases, for example, the data disseminated by an agency are not collected by that agency; rather, the information the agency must provide in a timely manner is compiled from a variety of sources that are constantly updated and revised and may be confidential. In such cases, while agencies' implementation of the guidelines may differ, the essence of the guidelines will apply. That is, these agencies must make their methods transparent by providing documentation, ensure quality by reviewing the underlying methods used in developing the data and consulting (as appropriate) with experts and users, and keep users informed about corrections and revisions.

Summary of OMB Guidelines

These guidelines apply to Federal agencies subject to the Paperwork Reduction Act (44 U.S.C. chapter 35). Agencies are directed to develop information resources management procedures for reviewing and substantiating (by documentation or other means selected by the agency) the quality (including the objectivity, utility, and integrity) of information before it is disseminated. In addition, agencies are to establish administrative mechanisms allowing affected persons to seek and obtain, where appropriate, correction of information disseminated by the agency that does not comply with the OMB or agency guidelines.

Consistent with the underlying principles described above, these guidelines stress the importance of having agencies apply these standards and develop their administrative mechanisms so they can be implemented in a common-sense and workable manner. Moreover, agencies must apply these standards flexibly, and in a manner appropriate to the nature and timeliness of the information to be disseminated, and incorporate them into existing agency information resources management and administrative practices.

Section 515 denotes four substantive terms regarding information disseminated by Federal agencies: quality, utility, objectivity, and integrity. It is not always clear how each substantive term relates, or how the four terms in aggregate relate, to the widely divergent types of information that agencies disseminate. The guidelines provide definitions that attempt to establish a clear meaning so that both the agency and the public can readily judge whether a particular type of information to be disseminated does or does not meet these attributes.

In the guidelines, OMB defines "quality" as the encompassing term, of which "utility," "objectivity," and "integrity" are the constituents. "Utility" refers to the usefulness of the information to the intended users. "Objectivity" focuses on whether the disseminated information is being presented in an accurate, clear, complete, and unbiased manner, and as a matter of substance, is accurate, reliable, and unbiased. "Integrity" refers to security, the protection of information from unauthorized access or revision, to ensure that the information is not compromised through corruption or falsification. OMB modeled the definitions of "information," "government information," "information dissemination product," and "dissemination" on the longstanding definitions of those terms in OMB Circular A-130, but tailored them to fit into the context of these guidelines.

In addition, Section 515 imposes two reporting requirements on the agencies. The first report, to be promulgated no later than October 1, 2002, must provide the agency's information quality guidelines that describe administrative mechanisms allowing affected persons to seek and obtain, where appropriate, correction of disseminated information that does not comply with the OMB and agency guidelines. The second report is an annual fiscal year report to OMB (to be first submitted on January 1, 2004) providing information (both quantitative and qualitative, where appropriate) on the number, nature, and resolution of complaints received by the agency regarding its perceived or confirmed failure to comply with these OMB and agency guidelines.

Public Comments and OMB Response

Applicability of Guidelines. Some comments raised concerns about the applicability of these guidelines, particularly in the context of scientific research conducted by Federally employed scientists or Federal grantees who publish and communicate their research findings in the same manner as their academic colleagues. OMB believes that information generated and disseminated in these contexts is not covered by these guidelines unless the agency represents the information as, or uses the information in support of, an official position of the agency.

As a general matter, these guidelines apply to "information" that is "disseminated" by agencies subject to the Paperwork Reduction Act (44 U.S.C. 3502(1)). See paragraphs II, V.5 and V.8. The definitions of "information" and "dissemination" establish the scope of the applicability of these guidelines. "Information" means "any communication or representation of knowledge such as facts or data * * *." This definition of information in paragraph V.5 does "not include opinions, where the agency's presentation makes it clear that what is
being offered is someone's opinion rather than fact or the agency's views." "Dissemination" is defined to mean "agency initiated or sponsored distribution of information to the public." As used in paragraph V.8, "agency INITIATED * * * distribution of information to the public" refers to information that the agency disseminates, e.g., a risk assessment prepared by the agency to inform the agency's formulation of possible regulatory or other action. In addition, if an agency, as an institution, disseminates information prepared by an outside party in a manner that reasonably suggests that the agency agrees with the information, this appearance of having the information represent agency views makes agency dissemination of the information subject to these guidelines. By contrast, an agency does not "initiate" the dissemination of information when a Federally employed scientist or Federal grantee or contractor publishes and communicates his or her research findings in the same manner as his or her academic colleagues, even if the Federal agency retains ownership or other intellectual property rights because the Federal government paid for the research. To avoid confusion regarding whether the agency agrees with the information (and is therefore disseminating it through the employee or grantee), the researcher should include an appropriate disclaimer in the publication or speech to the effect that the "views are mine, and do not necessarily reflect the view" of the agency.

Similarly, as used in paragraph V.8, "agency * * * SPONSORED distribution of information to the public" refers to situations where an agency has directed a third-party to disseminate information, or where the agency has the authority to review and approve the information before release. Therefore, for example, if an agency through a procurement contract or a grant provides for a person to conduct research, and then the agency directs the person to disseminate the results (or the agency reviews and approves the results before they may be disseminated), then the agency has "sponsored" the dissemination of this information. By contrast, if the agency simply provides funding to support research, and it is the researcher (not the agency) who decides whether to disseminate the results and, if the results are to be released, who determines the content and presentation of the dissemination, then the agency has not "sponsored" the dissemination even though it has funded the research and even if the Federal agency retains ownership or other intellectual property rights because the Federal government paid for the research. To avoid confusion regarding whether the agency is sponsoring the dissemination, the researcher should include an appropriate disclaimer in the publication or speech to the effect that the "views are mine, and do not necessarily reflect the view" of the agency. On the other hand, subsequent agency dissemination of such information requires that the information adhere to the agency's information quality guidelines. In sum, these guidelines govern an agency's dissemination of information, but generally do not govern a third-party's dissemination of information (the exception being where the agency is essentially using the third-party to disseminate information on the agency's behalf). Agencies, particularly those that fund scientific research, are encouraged to clarify the applicability of these guidelines to the various types of information they and their employees and grantees disseminate.

Paragraph V.8 also states that the definition of "dissemination" does not include "* * * distribution limited to correspondence with individuals or persons, press releases, archival records, public filings, subpoenas or adjudicative processes." The exemption from the definition of "dissemination" for "adjudicative processes" is intended to exclude, from the scope of these guidelines, the findings and determinations that an agency makes in the course of adjudications involving specific parties. There are well-established procedural safeguards and rights to address the quality of adjudicatory decisions and to provide persons with an opportunity to contest decisions. These guidelines do not impose any additional requirements on agencies during adjudicative proceedings and do not provide parties to such adjudicative proceedings any additional rights of challenge or appeal.

The Presumption Favoring Peer-Reviewed Information. As a general matter, in the scientific and research context, we regard technical information that has been subjected to formal, independent, external peer review as presumptively objective. As the guidelines state in paragraph V.3.b.i: "If data and analytic results have been subjected to formal, independent, external peer review, the information may generally be presumed to be of acceptable objectivity." An example of a formal, independent, external peer review is the review process used by scientific journals.

Most comments approved of the prominent role that peer review plays in the OMB guidelines. Some comments contended that peer review was not accepted as a universal standard that incorporates an established, practiced, and sufficient level of objectivity. Other comments stated that the guidelines would be better clarified by making peer review one of several factors that an agency should consider in assessing the objectivity (and quality in general) of original research. In addition, several comments noted that peer review does not establish whether analytic results are capable of being substantially reproduced. In light of the comments, the final guidelines in new paragraph V.3.b.i qualify the presumption in favor of peer-reviewed information as follows: "However, this presumption is rebuttable based on a persuasive showing by the petitioner in a particular instance."

We believe that transparency is important for peer review, and these guidelines set minimum standards for the transparency of agency-sponsored peer review. As we state in new paragraph V.3.b.i: "If data and analytic results have been subjected to formal, independent, external peer review, the information may generally be presumed to be of acceptable objectivity. However, this presumption is rebuttable based on a persuasive showing by the petitioner in a particular instance. If agency-sponsored peer review is employed to help satisfy the objectivity standard, the review process employed shall meet the general criteria for competent and credible peer review recommended by OMB-OIRA to the President's Management Council (9/20/01) (http://www.whitehouse.gov/omb/inforeg/oira_review-process.html), namely, 'that (a) peer reviewers be selected primarily on the basis of necessary technical expertise, (b) peer reviewers be expected to disclose to agencies prior technical/policy positions they may have taken on the issues at hand, (c) peer reviewers be expected to disclose to agencies their sources of personal and institutional funding (private or public sector), and (d) peer reviews be conducted in an open and rigorous manner.'"

The importance of these general criteria for competent and credible peer review has been supported by a number of expert bodies. For example, "the work of fully competent peer-review panels can be undermined by allegations of conflict of interest and bias. Therefore, the best interests of the Board are served by effective policies and procedures regarding potential conflicts of interest, impartiality, and panel balance." (EPA's Science Advisory
Board Panels: Improved Policies and Procedures Needed to Ensure Independence and Balance, GAO-01-536, General Accounting Office, Washington, DC, June 2001, page 19.) As another example, "risk analyses should be peer-reviewed and accessible, both physically and intellectually, so that decision-makers at all levels will be able to respond critically to risk characterizations. The intensity of the peer reviews should be commensurate with the significance of the risk or its management implications." (Setting Priorities, Getting Results: A New Direction for EPA, Summary Report, National Academy of Public Administration, Washington, DC, April 1995, page 23.)

These criteria for peer reviewers are generally consistent with the practices now followed by the National Research Council of the National Academy of Sciences. In considering these criteria for peer reviewers, we note that there are many types of peer reviews and that agency guidelines concerning the use of peer review should tailor the rigor of peer review to the importance of the information involved. More generally, agencies should define their peer-review standards in appropriate ways, given the nature and importance of the information they disseminate.

Is Journal Peer Review Always Sufficient? Some comments argued that journal peer review should be adequate to demonstrate quality, even for influential information that can be expected to have major effects on public policy. OMB believes that this position overstates the effectiveness of journal peer review as a quality-control mechanism.

Although journal peer review is clearly valuable, there are cases where flawed science has been published in respected journals. For example, the NIH Office of Research Integrity recently reported the following case regarding environmental health research:

"Based on the report of an investigation conducted by [XX] University, dated July 16, 1999, and additional analysis conducted by ORI in its oversight review, the US Public Health Service found that Dr. [X] engaged in scientific misconduct. Dr. [X] committed scientific misconduct by intentionally falsifying the research results published in the journal SCIENCE and by providing falsified and fabricated materials to investigating officials at [XX] University in response to a request for original data to support the research results and conclusions reported in the SCIENCE paper. In addition, PHS finds that there is no original data or other corroborating evidence to support the research results and conclusions reported in the SCIENCE paper as a whole." (66 FR 52137, October 12, 2001).

Although such cases of falsification are presumably rare, there is a significant scholarly literature documenting quality problems with articles published in peer-reviewed research. "In a [peer-reviewed] meta-analysis that surprised many, and some doubt, researchers found little evidence that peer review actually improves the quality of research papers." (See, e.g., Science, Vol. 293, page 2187 (September 21, 2001).) In part for this reason, many agencies have already adopted peer review and science advisory practices that go beyond journal peer review. See, e.g., Sheila Jasanoff, The Fifth Branch: Science Advisers as Policy Makers, Cambridge, MA, Harvard University Press, 1990; Mark R. Powell, Science at EPA: Information in the Regulatory Process, Resources for the Future, Washington, DC, 1999, pages 138-139; 151-153; Implementation of the Environmental Protection Agency's Peer Review Program: An SAB Evaluation of Three Reviews, EPA-SAB-RSAC-01-009, A Review of the Research Strategies Advisory Committee (RSAC) of the EPA Science Advisory Board (SAB), Washington, DC, September 26, 2001. For information likely to have an important public policy or private sector impact, OMB believes that additional quality checks beyond peer review are appropriate.

Definition of "Influential". OMB guidelines apply stricter quality standards to the dissemination of information that is considered to be "influential." Comments noted that the breadth of the definition of "influential" in interim final paragraph V.9 requires much speculation on the part of agencies.

We believe that this criticism has merit and have therefore narrowed the definition. In this narrower definition, "influential", when used in the phrase "influential scientific, financial, or statistical information", is amended to mean that "the agency can reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions." The intent of the new phrase "clear and substantial" is to reduce the need for speculation on the part of agencies. We added the present tense, "or does have", to this narrower definition because on occasion, an information dissemination may occur simultaneously with a particular policy change. In response to a public comment, we added an explicit reference to "financial" information as consistent with our original intent.

Given the differences in the many Federal agencies covered by these guidelines, and the differences in the nature of the information they disseminate, we also believe it will be helpful if agencies elaborate on this definition of "influential" in the context of their missions and duties, with due consideration of the nature of the information they disseminate. As we state in amended paragraph V.9, "Each agency is authorized to define 'influential' in ways appropriate for it given the nature and multiplicity of issues for which the agency is responsible."

Reproducibility. As we state in new paragraph V.3.b.ii: "If an agency is responsible for disseminating influential scientific, financial, or statistical information, agency guidelines shall include a high degree of transparency about data and methods to facilitate the reproducibility of such information by qualified third parties." OMB believes that a reproducibility standard is practical and appropriate for information that is considered "influential", as defined in paragraph V.9, that "will have or does have a clear and substantial impact on important public policies or important private sector decisions." The reproducibility standard applicable to influential scientific, financial, or statistical information is intended to ensure that information disseminated by agencies is sufficiently transparent in terms of data and methods of analysis that it would be feasible for a replication to be conducted. The fact that the use of original and supporting data and analytic results have been deemed "defensible" by peer-review procedures does not necessarily imply that the results are transparent and replicable.

Reproducibility of Original and Supporting Data. Several of the comments objected to the exclusion of original and supporting data from the reproducibility requirements. Comments instead suggested that OMB should apply the reproducibility standard to original data, and that OMB should provide flexibility to the agencies in determining what constitutes "original and supporting" data. OMB agrees and asks that agencies consider, in developing their own guidelines, which categories of original and supporting data should be subject to the reproducibility standard and which should not. To help in resolving this issue, we also ask agencies to consult directly with relevant scientific and technical communities on the feasibility of having the selected categories of original and supporting data subject to the reproducibility standard. Agencies are encouraged to address ethical, feasibility, and confidentiality issues
8456 Federal Register/Vol. 67, No. 36/Friday, February 22, 2002/Notices
with care. As we state in new paragraph V.3.b.ii.A, "Agencies may identify, in consultation with the relevant scientific and technical communities, those particular types of data that can practicably be subjected to a reproducibility requirement, given ethical, feasibility, or confidentiality constraints." Further, as we state in our expanded definition of "reproducibility" in paragraph V.10, "If agencies apply the reproducibility test to specific types of original or supporting data, the associated guidelines shall provide relevant definitions of reproducibility (e.g., standards for replication of laboratory data)." OMB urges caution in the treatment of original and supporting data because it may often be impractical or even impermissible or unethical to apply the reproducibility standard to such data. For example, it may not be ethical to repeat a "negative" (ineffective) clinical (therapeutic) experiment and it may not be feasible to replicate the radiation exposures studied after the Chernobyl accident. When agencies submit their draft agency guidelines for OMB review, agencies should include a description of the extent to which the reproducibility standard is applicable and reflect consultations with relevant scientific and technical communities that were used in developing guidelines related to applicability of the reproducibility standard to original and supporting data.

It is also important to emphasize that the reproducibility standard does not apply to all original and supporting data disseminated by agencies. As we state in new paragraph V.3.b.ii.A, "With regard to original and supporting data related [to influential scientific, financial, or statistical information], agency guidelines shall not require that all disseminated data be subjected to a reproducibility requirement." In addition, we encourage agencies to address how greater transparency can be achieved regarding original and supporting data. As we also state in new paragraph V.3.b.ii.A, "It is understood that reproducibility of data is an indication of transparency about research design and methods and thus a replication exercise (i.e., a new experiment, test, or sample) shall not be required prior to each dissemination." Agency guidelines need to achieve a high degree of transparency about data even when reproducibility is not required.

Reproducibility of Analytic Results. Many public comments were critical of the reproducibility standard and expressed concern that agencies would be required to reproduce each analytical result before it is disseminated. While several comments commended OMB for establishing an appropriate balance in the "capable of being substantially reproduced" standard, others considered this standard to be inherently subjective. There were also comments that suggested the standard would cause more burden for agencies.

It is not OMB's intent that each agency must reproduce each analytic result before it is disseminated. The purpose of the reproducibility standard is to cultivate a consistent agency commitment to transparency about how analytic results are generated: the specific data used, the various assumptions employed, the specific analytic methods applied, and the statistical procedures employed. If sufficient transparency is achieved on each of these matters, then an analytic result should meet the "capable of being substantially reproduced" standard. While there is much variation in types of analytic results, OMB believes that reproducibility is a practical standard to apply to most types of analytic results. As we state in new paragraph V.3.b.ii.B, "With regard to analytic results related [to influential scientific, financial, or statistical information], agency guidelines shall generally require sufficient transparency about data and methods that an independent reanalysis could be undertaken by a qualified member of the public. These transparency standards apply to agency analysis of data from a single study as well as to analyses that combine information from multiple studies." We elaborate upon this principle in our expanded definition of "reproducibility" in paragraph V.10: "With respect to analytic results, 'capable of being substantially reproduced' means that independent analysis of the original or supporting data using identical methods would generate similar analytic results, subject to an acceptable degree of imprecision or error."

Even in a situation where the original and supporting data are protected by confidentiality concerns, or the analytic computer models or other research methods may be kept confidential to protect intellectual property, it may still be feasible to have the analytic results subject to the reproducibility standard. For example, a qualified party, operating under the same confidentiality protections as the original analysts, may be asked to use the same data, computer model or statistical methods to replicate the analytic results reported in the original study. See, e.g., "Reanalysis of the Harvard Six Cities Study and the American Cancer Society Study of Particulate Air Pollution and Mortality." A Special Report of the Health Effects Institute's Particle Epidemiology Reanalysis Project, Cambridge, MA, 2000.

The primary benefit of public transparency is not necessarily that errors in analytic results will be detected, although error correction is clearly valuable. The more important benefit of transparency is that the public will be able to assess how much an agency's analytic result hinges on the specific analytic choices made by the agency. Concreteness about analytic choices allows, for example, the implications of alternative technical choices to be readily assessed. This type of sensitivity analysis is widely regarded as an essential feature of high quality analysis, yet sensitivity analysis cannot be undertaken by outside parties unless a high degree of transparency is achieved. The OMB guidelines do not compel such sensitivity analysis as a necessary dimension of quality, but the transparency achieved by reproducibility will allow the public to undertake sensitivity studies of interest.

We acknowledge that confidentiality concerns will sometimes preclude public access as an approach to reproducibility. In response to public comment, we have clarified that such concerns do include interests in "intellectual property." To ensure that the OMB guidelines have sufficient flexibility with regard to analytic transparency, OMB has, in new paragraph V.3.b.ii.B.i, provided agencies an alternative approach for classes or types of analytic results that cannot practically be subject to the reproducibility standard. "[In those situations involving influential scientific, financial, or statistical information * * *] making the data and methods publicly available will assist in determining whether analytic results are reproducible. However, the objectivity standard does not override other compelling interests such as privacy, trade secrets, intellectual property, and other confidentiality protections." Specifically, in cases where reproducibility will not occur due to other compelling interests, we expect agencies (1) to perform robustness checks appropriate to the importance of the information involved, e.g., determining whether a specific statistic is sensitive to the choice of analytic method, and, accompanying the information disseminated, to document their efforts to assure the needed robustness in information quality, and (2) address in their guidelines the