
Who can use this spreadsheet?

Submitters of source tests conducted prior to January 1, 2012, that did not use the ERT to compile the data should use this
spreadsheet to submit the test data to EPA electronically.

Submitters of source tests conducted on or after January 1, 2012, should use this spreadsheet only if the test method used
is not one of the EPA test methods specifically listed on the ERT web page or is not available from the drop down menu
of the ERT.

Submitters of a State/local agency test method which is essentially the same as one of the EPA test methods supported by
the ERT should not use this spreadsheet.

See the ERT website for the most up-to-date list of ERT-compatible methods.

The ERT also includes the flexibility for source testers to add Custom test methods and Custom target analytes within the
Location/Methods tab. These "Custom" methods use the basic information in typical isokinetic test methods and typical
instrumental test methods.

Users are encouraged to use the ERT with the Custom method and target analyte feature when applicable. Please see the
user's manual for directions on using the Custom method feature and guidance on when its use may not be advisable.

If the WebFIRE template spreadsheet is used in lieu of the ERT to document your test report, many of the features incorporated
into the ERT and WebFIRE will be limited; therefore, the full capabilities of the system will also be limited.

If the only rows visible are the instruction rows and you cannot see the data entry rows, you should do one of the following:
a) hide one or more of the instruction rows by placing your cursor in the row identification column, right clicking and selecting
"Hide",
b) reduce the height of one or more rows by placing your cursor in the row identification column, right clicking and selecting
"Row Height...", then enter a value lower than the one displayed, or,
c) reduce the size of the spreadsheet by selecting "View" then "Zoom" then choose a "Magnification" level lower than the one
identified.

How do I prepare the Template and/or Test Ratings for delivery?


You are to provide 2 items to EPA: (1) A completed WebFIRE spreadsheet template file, and (2) An electronic copy (PDF) of the
source test report and its supporting information. As the source test report and supporting information must be incorporated into
the WebFIRE spreadsheet fields, we recommend creating a PDF copy of the test report using the Recognize Text Using OCR
command in the Searchable Image (Exact) setting. Using this command when creating the PDF will allow you to cut text from
the test report for pasting into the appropriate cell of the template and then to edit any identified errors in the conversion of the
text.

Send the files to EPA on CD or DVD to the Measurement Policy Group, at the following address:

US EPA
Leader, Measurement Policy Group, OAQPS
Mail Code D243-05
RTP, NC 27711
For questions, contact Chief_Info@epa.gov.

Update History
Template Update August 2012: Two column headings, UNCERTAINTY_LOWER and UNCERTAINTY_UPPER, were
changed to FLAG and FLAG_DETAIL to accommodate identifying data which is below the detection limit.
Template Update July 2013: The Source Test Quality Rating Tool tab was added to this spreadsheet. The Source Test
Quality Rating Tool tab implements the method for calculating the numerical ratings which are presented in the Final Draft
Emission Factor Development Procedures Document. The descriptions of the information required in the Columns for
"TEST_REPORT_RATING" and "EVALUATION" were revised to be consistent with the revised test report rating criteria. In
addition, linkages were established to the "Source Test Quality Rating Tool" tab such that information does not need to be copied
between tabs.

Additional revisions were made as follows:


Corrections were made to the directions in Column A.
Example entries were provided for users in row 2 for select columns.
An expanded explanation of the contents of the required field "PROCESS_DESCRIPTION" was provided.
The directions to include the units in the optional fields PROCESS_PARAMETER1 were removed and directions to include the
units in the fields PROCESS_PARAMETER1_DESCRIPTION were added.
Limited the number of significant figures to provide to four and identified Agency policy limiting significant digits to three. Added
information that significant digits will be rounded to two.
Added the field "TEST_REPORT_RATING" with the example "0 to 100" and provided an explanation of the criteria used to
determine the rating. Indicated that the rating is calculated based upon the completion of the tab labeled "Test Report Rating
Assessment Tool." Provided instruction to NOT enter a value in the column. Provided instruction to copy the contents of row 25
to higher numbered rows if used.
Revised the example entry and explanation of the file name for the source test report and removed the comment.
Revised the explanation of the contents of the field "EVALUATION" and directed the user to complete the "Source Test Quality
Rating Tool" tab and to supplement the Justification field text. Directed the user to copy the contents of line number 25 to
additional rows used for test information.

Template Update April 2016: Converted format of template to accept test run-level data instead of average values. Added
columns for ZIP code, FRS ID, State ID, AFS ID, test run number, process run number, test method notes, sampling location,
control status, regulatory part, and regulatory subpart.
General Instructions for WebFIRE Template
Row 1 is a header row and is used to define the data imported into WebFIRE. DO NOT EDIT this row!

Row 2 provides guidance for entering data into the remaining rows of the template tab.

Rows 3 through 6 contain example data entries.

What information is required and what information is optional in the WebFIRE Template?


Row 3 identifies whether information requested in the spreadsheet cells is required or optional.

You are required to provide information for all columns where Rows 1 through 3 are highlighted yellow.

You should provide information in columns labeled "Optional" in Row 3 if the information is available in the test report.

Other Instructions
If the text in Rows 2 and 3 prevents you from viewing the data entered and you no longer need to view the information in those
rows, hide Rows 2 and 3 using the Format Cells function from the Home tab. Also, you may change the column widths to allow
you to view your entries.

Each row of information that you provide in the "WebFIRE Template" tab (beginning in Row 7) will be included in WebFIRE as a
separate record. Therefore, you should provide applicable information for every row where a pollutant test data value is entered.
Where the information is the same from row to row, repeat (copy) the information in every cell in that column. DO NOT use
shortcuts such as entering "see above" as this will disrupt the import of the data into WebFIRE.

DO NOT enter information into the columns labeled "TEST_REPORT_RATING" and "EVALUATION" as the entries for these columns
are generated from the responses entered in the "Test Quality Rating Tool" tab. Also, you will not see the test report rating nor the
evaluation until a value is entered into the WebFIRE Template. The WebFIRE Template includes the calculation formulas for the
"TEST_REPORT_RATING" and "EVALUATION" columns through Row 30. If you need to enter data after Row 30, copy the formulas
in the column titled "TEST_REPORT_RATING" and the column titled "EVALUATION" into the additional rows that you use.

How do I prepare the Template and/or Test Ratings for delivery?


You are to provide 2 items to EPA: (1) A completed WebFIRE spreadsheet template file that includes the WebFIRE Template and
Test Quality Rating Tool tabs, and (2) An electronic copy (PDF) of the source test report and its supporting information.

As the source test report and supporting information must be incorporated into the WebFIRE spreadsheet fields, we recommend
creating a PDF copy of the test report using the Recognize Text Using OCR command in the Searchable Image (Exact) setting.
Using this command when creating the PDF will allow you to cut text from the test report for pasting into the appropriate cell of
the template and then to edit any identified errors in the conversion of the text.

Where do I send my completed Template and/or Test Ratings?


Send the files to EPA on CD or DVD to the Measurement Policy Group, at the following address:

US EPA
Leader, Measurement Policy Group, OAQPS
Mail Code D243-05
RTP, NC 27711
For questions, contact Chief_Info@epa.gov.
Column descriptions and example entries for the WebFIRE Template tab

INDEX
Explanation of field: This column may contain any information that the stakeholder wishes to place in the cell. It is only intended to help manage large exports such as a single test covering multiple pollutants. No information in this column will be imported into WebFIRE.
Example entry: "This is an example entry. Enter your information in rows 7, 8, and 9"

FACILITY_NAME
Explanation of field: Name of facility.
Optional/Required: Required
Example entry: ACB Manufacturing

CITY
Explanation of field: City where facility is located.
Optional/Required: Required
Example entry: Raleigh

COUNTY
Explanation of field: County where facility is located.
Optional/Required: Required
Example entry: Wake

STATE
Explanation of field: State where facility is located.
Optional/Required: Required
Example entry: NC
ZIP_CODE
Explanation of field: ZIP code where facility is located.
Optional/Required: Required
Example entry: 27752

FRS_ID
Explanation of field: EPA Facility Registry Service identification.
Optional/Required: Required (if applicable)
Example entry: 119999999999

STATE_ID
Explanation of field: State facility permit ID.
Optional/Required: Required (if applicable)

AFS_ID
Explanation of field: EPA AIRS Facility Subsystem identification.
Optional/Required: Required (if applicable)

SCC
Explanation of field: Source Classification Code: standard 8-digit code for point sources. Area and mobile codes are 10 digits. The SCC describes the unique emission process tested.
Optional/Required: Required
Example entry: 50200505

APPL_SCCS
Explanation of field: Applicable SCCs. If more than one SCC is associated with the emissions data, enter the SCCs here. In order to associate the records in the spreadsheet as duplicates of each other, list all relevant SCCs in this field. The format for this field is comma-delimited. SCCs are to be separated by a comma and one space.
Optional/Required: Optional
Example entry: 50200505, 50100505

POLL_NAME
Explanation of field: WebFIRE uses the National Emissions Inventory (NEI) pollutant names to describe the pollutant (see "EIS Code Tables (including SCCs)" at https://www.epa.gov/air-emissions-inventories/emission-inventory-system-eis-gateway).
Optional/Required: Required
Example entry: 2,3,7,8-Tetrachlorodibenzo-p-Dioxin

NEI_POLLUTANT_CODE
Explanation of field: WebFIRE uses the NEI pollutant codes to describe the pollutant (see "EIS Code Tables (including SCCs)" at https://www.epa.gov/air-emissions-inventories/emission-inventory-system-eis-gateway). If no NEI pollutant code is available for the pollutant, indicate this by including "NONE" in the field, and indicate the pollutant identity in the POLL_NAME and/or CAS_NUMBER fields so that WebFIRE developers are aware that a pollutant code is needed.
Optional/Required: Required
Example entry: 1746016
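
Where multiple SCCs apply, the APPL_SCCS cell is a comma-delimited list with the SCCs separated by a comma and one space. The following short Python sketch is illustrative only (it is not part of the template or of WebFIRE, and the function name is ours); it shows one way to build that cell value when preparing the spreadsheet programmatically.

# Illustrative only: build an APPL_SCCS cell value using the
# "comma and one space" separator described above.
def format_appl_sccs(sccs):
    return ", ".join(str(scc).strip() for scc in sccs)

print(format_appl_sccs(["50200505", "50100505"]))  # -> 50200505, 50100505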


CAS_NUMBER
Explanation of field: Pollutant CAS number (no dashes). This will often be the same as the NEI pollutant code, but there may be exceptions.
Optional/Required: Required (if applicable)
Example entry: 1746016

TEST_METHOD
Explanation of field: Test method that was followed by the tester. If the method used was EPA Method 25, please use the following format: "EPA 25". Most if not all test data will be EPA Method X, where "X" is numeric or alphanumeric.
Optional/Required: Required
Example entry: EPA Method 23

TEST_RUN
Explanation of field: Test run identification (e.g., 1).
Optional/Required: Required
Example entries: 1, 2, 3

PROCESS_RUN
Explanation of field: Process run identification (e.g., 1).
Optional/Required: Required
Example entry: 1

METHOD_NOTES
Explanation of field: Notes regarding application of the test method (e.g., special circumstances, difficulties).
Optional/Required: Optional


SAMPLING_LOCATION
Explanation of field: Name or identification of sampling location.
Optional/Required: Required
Example entry: Incinerator #1 stack outlet

CONTROL_STATUS
Explanation of field: Control status of sampling location. Enter "0" if the emissions reported in the "TEST_DATA_VALUE" column are uncontrolled, or "1" if the emissions are controlled.
Optional/Required: Required
Example entry: 0

PROCESS_DESCRIPTION
Explanation of field: Process description. Cut and paste if available. This field was previously titled "process_description1". While the SCC may provide a general indication of the process, this field is used to provide detail that would distinguish this source from others with the same SCC.
Optional/Required: Required
Example entry: Ram-fed, controlled air medical waste incinerator with no add-on pollution control equipment. Make: Environmental Control Products (now Joy Energy Systems).
PROCESS_PARAMETER1
Explanation of field: The numerical value of the parameter described in PROCESS_PARAMETER1_DESCRIPTION. Include ONLY numerical values. The units of the number entered MUST be in the column PROCESS_PARAMETER1_DESCRIPTION.
Optional/Required: Optional
Example entry: 0.385

PROCESS_PARAMETER1_DESCRIPTION
Explanation of field: Use available information from the test report document to define any process parameter (usually throughput, fuel or otherwise, ash content, sulfur content) that may have a direct bearing on the emission rate. This information is meant to describe the specific conditions that were present during the test.
Optional/Required: Optional
Example entry: lb/hr design throughput rate

PROCESS_PARAMETER2
Explanation of field: If more than one relevant process parameter is present, follow the format described for process parameter 1.
Optional/Required: Optional
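
As a reminder of the split described above, the PROCESS_PARAMETER1 cell holds only a number, and its units go in PROCESS_PARAMETER1_DESCRIPTION. A minimal Python sketch (illustrative only; the helper name is ours) that enforces this split:

# Illustrative only: keep the numeric value and the units/description in separate cells.
def split_process_parameter(value, description):
    numeric = float(value)        # PROCESS_PARAMETER1 must contain ONLY a number
    return numeric, description   # units belong in PROCESS_PARAMETER1_DESCRIPTION

print(split_process_parameter("0.385", "lb/hr design throughput rate"))
# -> (0.385, 'lb/hr design throughput rate')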


PROCESS_PARAMETER2_DESCRIPTION, PROCESS_PARAMETER3, PROCESS_PARAMETER3_DESCRIPTION
Explanation of fields: If more than one relevant process parameter is present, follow the format described for process parameter 1.
Optional/Required: Optional


PROCESS_PARAMETER4, PROCESS_PARAMETER4_DESCRIPTION, PROCESS_PARAMETER5
Explanation of fields: If more than one relevant process parameter is present, follow the format described for process parameter 1.
Optional/Required: Optional


PROCESS_PARAMETER5_DESCRIPTION
Explanation of field: If more than one relevant process parameter is present, follow the format described for process parameter 1.
Optional/Required: Optional

CONTROL_DEVICE_DESCRIPTION
Explanation of field: Text description of the entire control system. If available, this should provide detailed design, operating and monitoring information on each of the control devices identified in the five control code fields.
Optional/Required: Required

CONTROL_CODE1
Explanation of field: WebFIRE uses the NEI control measure codes to describe the control device (see "EIS Code Tables (including SCCs)" at https://www.epa.gov/air-emissions-inventories/emission-inventory-system-eis-gateway).
Optional/Required: Optional
Example entry: 000

CONTROL_CODE1_DESCRIPTION
Explanation of field: Description of the control identified by CONTROL_CODE1. WebFIRE uses the NEI control measure descriptions to name the control device (see "EIS Code Tables (including SCCs)" at https://www.epa.gov/air-emissions-inventories/emission-inventory-system-eis-gateway).
Optional/Required: Required
Example entry: Uncontrolled

CONTROL_CODE2
Explanation of field: If more than one control is present, follow the format described for CONTROL_CODE1.
Optional/Required: Optional
CONTROL_CODE2_DESCRIPTION
Explanation of field: If more than one control is present, follow the format described for CONTROL_CODE1_DESCRIPTION.
Optional/Required: Required

CONTROL_CODE3
Explanation of field: If more than one control is present, follow the format described for CONTROL_CODE1.
Optional/Required: Optional

CONTROL_CODE3_DESCRIPTION
Explanation of field: If more than one control is present, follow the format described for CONTROL_CODE1_DESCRIPTION.
Optional/Required: Required

CONTROL_CODE4
Explanation of field: If more than one control is present, follow the format described for CONTROL_CODE1.
Optional/Required: Optional

CONTROL_CODE4_DESCRIPTION
Explanation of field: If more than one control is present, follow the format described for CONTROL_CODE1_DESCRIPTION.
Optional/Required: Required

CONTROL_CODE5
Explanation of field: If more than one control is present, follow the format described for CONTROL_CODE1.
Optional/Required: Optional

CONTROL_CODE5_DESCRIPTION
Explanation of field: If more than one control is present, follow the format described for CONTROL_CODE1_DESCRIPTION.
Optional/Required: Required

TEST_DATA_VALUE
Explanation of field: Express emissions data in scientific notation. Do not express the concentration with any more significant figures than four, since Agency policy limits most source tests to three significant figures. When all runs are reported as below detection level, the value reported in this column should be the lowest emissions rate value for any run, and the text "BDL" should be entered in the column labeled "FLAG".
Optional/Required: Required
Example entries: 6.71E-08, 6.67E-08, 6.63E-08

UNIT and MEASURE
Explanation of fields: Together with MATERIAL and ACTION, these four fields define the units of the emission factor. Using the example below, the emission factor is read as "6.66E-08 lbs per ton waste incinerated."
Optional/Required: Required
Example entries: lbs (UNIT), ton (MEASURE)
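
If you prepare the spreadsheet programmatically, the following Python sketch (illustrative only; not an EPA tool, and the function name is ours) formats a TEST_DATA_VALUE in scientific notation with no more than four significant figures, consistent with the limits described above.

# Illustrative only: scientific notation with a chosen number of significant
# figures (three by default, never more than four).
def format_test_data_value(value, sig_figs=3):
    if not 1 <= sig_figs <= 4:
        raise ValueError("use no more than four significant figures")
    return f"{value:.{sig_figs - 1}E}"

print(format_test_data_value(6.712e-08))      # -> 6.71E-08
print(format_test_data_value(6.6666e-08, 4))  # -> 6.667E-08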
MATERIAL and ACTION
Explanation of fields: Together with UNIT and MEASURE, these fields define the units of the emission factor (see above).
Optional/Required: Required
Example entries: waste (MATERIAL), incinerated (ACTION)

UNIT_ID, MEASURE_ID, and MATERIAL_ID
Explanation of fields: Corresponding WebFIRE ID values for the unit, measure, material, and action fields. Each is a numerical value. Valid numerical values for these fields are available by downloading the CSV file for All WebFIRE Units located near the bottom of the WebFIRE simple search page (https://cfpub.epa.gov/webfire/index.cfm?action=fire.downloadInBulk).
Optional/Required: Optional
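
To illustrate how the four unit fields are read together, this small Python sketch (illustrative only; the helper name is ours) assembles the emission factor phrase from UNIT, MEASURE, MATERIAL, and ACTION.

# Illustrative only: read UNIT, MEASURE, MATERIAL, and ACTION as a single phrase.
def emission_factor_text(value, unit, measure, material, action):
    return f"{value} {unit} per {measure} {material} {action}"

print(emission_factor_text("6.66E-08", "lbs", "ton", "waste", "incinerated"))
# -> 6.66E-08 lbs per ton waste incinerated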
ACTION_ID
Explanation of field: Corresponding WebFIRE ID value for the action field (see UNIT_ID, MEASURE_ID, and MATERIAL_ID above).
Optional/Required: Optional

FLAG
Explanation of field: An indicator qualifying the site-specific emissions factor value provided in the column labeled "TEST_DATA_VALUE." The flag "BDL" is to be used when the laboratory results for all of the test runs are reported as non-detect or below detection limits. The flag "DLL" is to be used when one or more, but not all, laboratory results are reported as non-detect or below detection limits. Additional flags may be provided but should include an explanation in the column "FLAG_DETAIL."
Optional/Required: Required for BDL test results

FLAG_DETAIL
Explanation of field: This column may be used to provide additional explanation of the Flag indicator entered in the column "FLAG".
Optional/Required: Optional
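
A hedged Python sketch of the BDL/DLL logic described above (illustrative only; the function and data layout are ours, not part of the template): given per-run results, it returns the FLAG entry and, when every run is below detection, the lowest run value to report in TEST_DATA_VALUE.

# Illustrative only: choose the FLAG entry from per-run results.
# Each run is a (value, below_detection_limit) pair.
def choose_flag(runs):
    below = [bdl for _, bdl in runs]
    if all(below):
        # All runs below detection: flag "BDL" and report the lowest run value.
        return "BDL", min(value for value, _ in runs)
    if any(below):
        return "DLL", None  # some, but not all, runs below detection
    return None, None       # no flag needed

print(choose_flag([(6.71e-08, True), (6.67e-08, True), (6.63e-08, True)]))
# -> ('BDL', 6.63e-08)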


TEST_REPORT_RATING
Explanation of field: Numeric quality rating (0 to 100) associated with the specific test as documented in the report provided in conjunction with this WebFIRE import file. A low score is associated with a report which has little or no supporting documentation. To receive a high score, the test report supplied to EPA must contain a complete process description; process records and descriptions of relevant process information; records of air pollution control device design and operations; sampling location description; and detailed sampling and analyses information, including all field data, pre- and post-sampling equipment calibrations, laboratory analyses reports and associated QA documentation. In addition, the source should have been operated at a condition that is within the normal and typical range of operations using typical feedstocks and producing products of acceptable quality. Moreover, the test method used to measure the emissions should have been valid for the pollutant measured and the source should not have encountered problems that would have significantly biased the emissions measured. DO NOT ENTER VALUES IN THIS COLUMN! The calculation of the test report rating is accomplished through the completion of the spreadsheet tab labeled "Test Quality Rating Tool." If you use more than 20 lines to enter test data, copy the equation in Cell BC7 to the remaining lines that you use.
Optional/Required: Required (use the Test Report Rating Tool tab)
Example entry: 95

SOURCE_TEST_DESCRIPTION
Explanation of field: This is a detailed description of the performance of the source test. It will include information on the test location, the test method employed, the pollutants quantified, and modifications employed to complete the test.
Optional/Required: Optional
NOTES
Explanation of field: Miscellaneous notes to further describe any information that is relevant, but does not fit in one of the other descriptive categories. Leave blank if not needed.
Optional/Required: Optional

REGULATORY_PART
Explanation of field: Regulatory part if the emissions test is conducted to demonstrate compliance with a standard.
Optional/Required: Optional
Example entry: Part 60

REGULATORY_SUBPART
Explanation of field: Regulatory subpart if the emissions test is conducted to demonstrate compliance with a standard.
Optional/Required: Optional
Example entry: Subpart Ec
REF_ID
Explanation of field: For submissions, this is the filename of the Adobe Acrobat file of the source test report containing the information in this record. The file must accompany this spreadsheet file. We recommend you develop a file name as follows: use the Facility Name (with "_" replacing spaces) followed by "_" and the test report date (using the format xx-xx-20xx) to name the file. This file will be renamed to a WebFIRE-specific format which will be used to "hot-link" to the appropriate AP42 reference.
Optional/Required: Required
Example entry: Cape_Fear_Memorial_Hospital_Wilmington_12-7-1991.pdf

REFERENCE_TEXT
Explanation of field: Enter the following required information: 1) the name of the facility and process tested, 2) the location of the facility tested (city and state), 3) the name of the company responsible for the emissions test, 4) the dates of the emissions test, 5) the name of the person supplying the test report, and 6) the date of the test report.
Optional/Required: Required
Example entry: Medical Waste Incineration Emission Test Report, Cape Fear Memorial Hospital, Wilmington, North Carolina. Prepared for the U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards. December 1991. EMB Report 90-MWI-4.
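
The recommended REF_ID file name can be built mechanically from the facility name and the test report date. A Python sketch (illustrative only; the helper name is ours):

# Illustrative only: facility name with "_" replacing spaces, then "_",
# the report date, and the ".pdf" extension.
def ref_id_filename(facility_name, report_date):
    return f"{facility_name.replace(' ', '_')}_{report_date}.pdf"

print(ref_id_filename("Cape Fear Memorial Hospital Wilmington", "12-7-1991"))
# -> Cape_Fear_Memorial_Hospital_Wilmington_12-7-1991.pdf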
REVIEW_EVALUATION
Explanation of field: This text field combines the list of items identified in the "Supporting Documentation Provided" section of the Test Quality Rating Tool and those comments provided in the "Justification" fields for the "Regulatory Agency Review" screen, thereby documenting the individual evaluator's test report assessment and the quality rating of the source test per the Emissions Factor Procedures Document. DO NOT ENTER TEXT IN THIS COLUMN! If you use more than 20 lines to input test data, copy the equation in the cell to the remaining lines that you use.
Optional/Required: Required

DATE_SOURCE_TESTED
Explanation of field: The date the test was conducted.
Optional/Required: Required
Example entry: 3/20/2016
Who answers the "Supporting Documentation Provided" questions?
You, the submitter, whether a representative of the testing company or of the facility tested, should answer the "Supporting Documentation Provided" questions. The Regulatory Agency Reviewer should also check that the answers to these questions are accurate prior to responding to the "Regulatory Agency Review" questions.

As a submitter, you are an employee of either the tested company or the company performing the emissions test. Therefore, you should use the drop down menu in the "Response" column to the right of the questions under the "Supporting Documentation Provided" column. Respond "Yes" only when the test report includes documentation that justifies each affirmative response.

Who answers the "Regulatory Agency Review" questions?


Only Regulatory Agency Reviewers (e.g., federal, state, local, or tribal agency reviewers).

As a Regulatory Agency Reviewer, when should I provide responses to the


"Supporting Documentation Provided" column?
When a response is not already provided to the questions, you should provide responses. To respond, you should use the drop down menu in the "Response" column to the right of the completeness questions. Respond "Yes" only when the test report includes documentation that justifies each affirmative response.

As a Regulatory Agency Reviewer, how do I complete the spreadsheet?


At the top of the Test Quality Rating Tool, fill in the summary information (i.e., name of facility tested, name of test company, SCC of the tested unit or units, name of the assessor (including company), and the name of the regulatory agency assessor and agency).

In the "Regulatory Agency Review" column, use the drop down menu in the "Response" column to the right of the Regulatory A
Respond to those questions where you have verified that the information in the test report met - or did not meet - the specificati
to Regulatory Agency Review questions where the Submitter provided a negative response to a completeness question. When
Agency Review response, review the entry in the "Justification" column. If you desire to edit the text in the "Justification" column
editable format (the template automatically enters text in the "Justification" column based on responses to the review questions
"Copy" the contents of the cell and paste the contents in the same cell using "Paste Special, Value". The text in the cells in the
"EVALUATION" column in the WebFIRE Template tab.

What is the Test Report Quality Indicator Value Rating?

The "Test Report Quality Indicator Value Rating" is a general indication of the level of documentation available in the test report
method requirements. The rating is based upon the system presented in the "Procedures for the Development of Emissions Fa
https://www3.epa.gov/ttn/chief/efpac/procedures/index.html. You will note that as affirmative responses to questions in either or
Methods areas are provided, the value in the cell to the right of "Test Report Quality Indicator Value Rating" is updated. This val
"TEST_REPORT_RATING" cells of the template tab. The rating value is NOT an indication of compliance or non-compliance w
the rating value is NOT an indication of the precision or accuracy of the resulting emissions data.

How do I prepare the Template and/or Test Ratings for delivery?


You are to provide 2 items to EPA: (1) A completed WebFIRE spreadsheet template file, and (2) An electronic copy (PDF) of the source test report and its supporting information. As the source test report and supporting information must be incorporated into the WebFIRE spreadsheet fields, we recommend creating a PDF copy of the test report using the Recognize Text Using OCR command in the Searchable Image (Exact) setting. Using this command when creating the PDF will allow you to cut text from the test report for pasting into the appropriate cell of the template and then to edit any identified errors in the conversion of the text.

Where do I send my completed Template and/or Test Ratings?


Send the files to EPA on CD or DVD to the Measurement Policy Group, at the following address:
US EPA
Leader, Measurement Policy Group, OAQPS
Mail Code D243-05
RTP, NC 27711
For questions, contact Chief_Info@epa.gov.
At the top of the Test Quality Rating Tool, provide the following summary information:
Name of Facility where the test was performed
Name of Company performing stationary source test
SCC of tested unit or units
Name of assessor and name of employer
Name of regulatory assessor and regulatory agency name

Test Report Quality Indicator

Supporting Documentation Provided (enter answers in the "Response" column)

General

As described in ASTM D7036-12 Standard Practice for


Competence of Air Emission Testing Bodies, does the testing
firm meet the criteria as an AETB or is the person in charge of
the field team a QI for the type of testing conducted? A
certificate from an independent organization (e.g., Stack
Testing Accreditation Council (STAC), California Air Resources
Board (CARB), National Environmental Laboratory
Accreditation Program (NELAP)) or self declaration provides
documentation of competence as an AETB.

Is a description and drawing of test location provided?

Has a description of deviations from published test methods


been provided, or is there a statement that deviations were not
required to obtain data representative of typical facility
operation?

Is a full description of the process and the unit being tested


(including installed controls) provided?

Has a detailed discussion of source operating conditions, air


pollution control device operations and the representativeness
of measurements made during the test been provided?

Were the operating parameters for the tested process unit and
associated controls described and reported?
Is there an assessment of the validity, representativeness,
achievement of DQO's and usability of the data?

Have field notes addressing issues that may influence data


quality been provided?

Manual Test Methods


Have the following been included in the report:
Dry gas meter (DGM) calibrations, pitot tube and nozzle
inspections?

Was the Method 1 sample point evaluation included in the


report?

Were the cyclonic flow checks included in the report?

Were the raw sampling data and test sheets included in the
report?
Did the report include a description and flow diagram of the
recovery procedures?

Was the laboratory certified/accredited to perform these


analyses?
Did the report include a complete laboratory report and flow
diagram of sample analysis?

Were the chain-of-custody forms included in the report?

Instrumental Test Methods


Have the following been included in the report:
Did the report include a complete description of the
instrumental method sampling system?

Did the report include calibration gas certifications?

Did report include interference tests?


Were the response time tests included in the report?

Were the calibration error tests included in the report?

Did the report include drift tests?

Did the report include system bias tests?

Were the converter efficiency tests included in the report?

Did the report include stratification checks?

Did the report include the raw data for the instrumental
method?
Test Report Quality Indicator Value Rating: 0

Regulatory Agency Review (enter answers in the "Response" column)

General

As described in ASTM D7036-12 Standard Practice for


Competence of Air Emission Testing Bodies, does the testing
firm meet the criteria as an AETB or is the person in charge of
the field team a QI for the type of testing conducted? A
certificate from an independent organization (e.g., STAC,
CARB, NELAP) or self declaration provides documentation of
competence as an AETB.

Was a representative of the regulatory agency on site during


the test?
Is a description and drawing of test location provided?
Is there documentation that the source or the test company sought and obtained approval for deviations from the published test method prior to conducting the test, or documentation supporting the tester's assertion that deviations were not required to obtain data representative of operations that are typical for the facility?

Were all test method deviations acceptable?

Is a full description of the process and the unit being tested


(including installed controls) provided?

Has a detailed discussion of source operating conditions, air


pollution control device operations and the representativeness
of measurements made during the test been provided?

Is there documentation that the required process monitors


have been calibrated and that the calibration is acceptable?
Was the process capacity documented?
Was the process operating within an appropriate range for the
test program objectives?

Were process data concurrent with testing?

Were data included in the report for all parameters for which
limits will be set?

Did the report discuss the representativeness of the facility


operations, control device operation, and the measurements of
the target pollutants, and were any changes from published
test methods or process and control device monitoring
protocols identified?

Were all sampling issues handled such that data quality was
not adversely affected?

Manual Test Methods

Was the DGM pre-test calibration within the criteria specified


by the test method?
Was the DGM post-test calibration within the criteria specified
by the test method?
Were thermocouple calibrations within method criteria?

Was the pitot tube inspection acceptable?

Were nozzle inspections acceptable?


Were flow meter calibrations acceptable?
Were the appropriate number and location of sampling points
used?
Did the cyclonic flow evaluation show the presence of an
acceptable average gas flow angle?

Were all data required by the method recorded?

Were required leak checks performed and did the checks meet
method requirements?

Was the required minimum sample volume collected?

Did probe, filter, and impinger exit temperatures meet method


criteria (as applicable)?

Did isokinetic sampling rates meet method criteria?


Was the sampling time at each point greater than 2 minutes
and the same for each point?

Was the recovery process consistent with the method?


Were all required blanks collected in the field?
Where performed, were blank corrections handled per method
requirements?
Were sample volumes clearly marked on the jar or measured
and recorded?
Was the laboratory certified/accredited to perform these
analyses?
Did the laboratory note the sample volume upon receipt?
If sample loss occurred, was the compensation method used
documented and approved for the method?
Were the physical characteristics of the samples (e.g., color,
volume, integrity, pH, temperature) recorded and consistent
with the method?

Were sample hold times within method requirements?


Does the laboratory report document the analytical procedures
and techniques?
Were all laboratory QA requirements documented?
Were analytical standards required by the method
documented?
Were required laboratory duplicates within acceptable limits?
Were required spike recoveries within method requirements?
Were method-specified analytical blanks analyzed?

If problems occurred during analysis, is there sufficient


documentation to conclude that the problems did not adversely
affect the sample results?

Was the analytical detection limit specified in the test report?


Is the reported detection limit adequate for the purposes of the
test program?
Do the chain-of-custody forms indicate acceptable
management of collected samples between collection and
analysis?
Instrumental Test Methods

Was a complete description of the sampling system provided?

Were calibration standards used prior to the end of the


expiration date?
Did calibration standards meet method criteria?
Did interference checks meet method requirements?
Was a response time test performed?

Did calibration error tests meet method requirements?


Were drift tests performed after each run and did they meet
method requirements?
Did system bias checks meet method requirements?

Was the NOX converter test acceptable?

Was a stratification assessment performed?

Was the duration of each sample run within method criteria?

Was an appropriate traverse performed during sample


collection, or was the probe placed at an appropriate center
point (if allowed by the method)?
Were sample times at each point uniform and did they meet
the method requirements?
Were sample lines heated sufficiently to prevent potential
adverse data quality issues?
Was all data required by the method recorded?

Total
Manual Test: 0
Instrumental Test: 0

Justification (populated automatically from the "Response" entries)
