Mapping doc validation
1. Verify whether the mapping doc provides the corresponding ETL information. A change log should be maintained in every mapping doc.
2. Define the default test strategy if the mapping doc misses out some optional information, e.g. data types, lengths, etc.
Structure validation
1. Validate the source and target table structure against the corresponding mapping doc.
5. The source data type length should not be less than the target data type length.
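The structure checks above can be sketched as a comparison of the actual table schema against the mapping doc. This is a minimal illustration; the names `mapping_doc`, `actual_schema`, and `validate_structure` are made up for the example, and the schema is represented as plain dictionaries rather than being read from a real database catalog.

```python
def validate_structure(mapping_doc, actual_schema):
    """Return a list of mismatches between the mapping doc and the actual table.

    Both arguments map column name -> (declared type, declared length).
    """
    issues = []
    for col, (dtype, length) in mapping_doc.items():
        if col not in actual_schema:
            issues.append(f"missing column: {col}")
            continue
        actual_type, actual_len = actual_schema[col]
        if actual_type != dtype:
            issues.append(f"{col}: type {actual_type} != {dtype}")
        if actual_len < length:
            issues.append(f"{col}: length {actual_len} < {length}")
    for col in actual_schema:
        if col not in mapping_doc:
            issues.append(f"unexpected column: {col}")
    return issues

# Hypothetical mapping doc vs. what was actually created in the database.
mapping_doc = {"ACCOUNT_NO": ("NUMBER", 9), "NAME": ("VARCHAR2", 50)}
actual_schema = {"ACCOUNT_NO": ("NUMBER", 9), "NAME": ("VARCHAR2", 30)}
print(validate_structure(mapping_doc, actual_schema))  # ['NAME: length 30 < 50']
```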
Constraint Validation
Ensure the constraints are defined for the specific table as expected.
Consistency Issues
1. The data type and length for a particular attribute may vary across files or tables even though the semantic definition is the same.
Example: an account number may be defined as Number(9) in one field or table and as Varchar2(11) in another table.
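The account-number inconsistency above can be caught mechanically by scanning schemas for the same attribute name declared with different types. A small sketch, with schemas again modeled as dictionaries and the table/column names invented for the example:

```python
from collections import defaultdict

def find_inconsistent_attributes(schemas):
    """schemas: {table: {column: declared type}}.

    Flag any column name that appears with more than one declared type
    across the scanned tables.
    """
    seen = defaultdict(set)
    for cols in schemas.values():
        for col, dtype in cols.items():
            seen[col].add(dtype)
    return {col: types for col, types in seen.items() if len(types) > 1}

schemas = {
    "accounts": {"account_number": "NUMBER(9)", "name": "VARCHAR2(50)"},
    "payments": {"account_number": "VARCHAR2(11)", "amount": "NUMBER(12)"},
}
conflicts = find_inconsistent_attributes(schemas)
print(sorted(conflicts))  # ['account_number']
```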
Completeness Issues
Ensure that all expected data is loaded into the target table.
http://testing-dwh.blogspot.in/2012/11/etl-test-scenarios-and-test-cases.html 1/5
12/27/2017 ETL Test Scenarios and Test Cases
1. Compare record counts between source and target. Check for any rejected records.
2. Null, non-unique, or out-of-range data may be stored when the integrity constraints are disabled.
Example: the primary key constraint is disabled during an import function, and data is entered into the existing data with null unique identifiers.
3. Check boundary value analysis (e.g. only data from year >= 2008 has to be loaded into the target).
4. Compare unique values of key fields between the source data and the data loaded into the warehouse. This is a valuable technique that points out a variety of possible data errors without doing a full validation on all fields.
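The count comparison and key-field comparison above can be sketched with an in-memory SQLite database standing in for the real source and target systems (table names `src`/`tgt` and the sample rows are invented for the illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amt REAL);
    CREATE TABLE tgt (id INTEGER, amt REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")

# Record-count comparison: any shortfall hints at rejected records.
src_count = conn.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
rejected = src_count - tgt_count

# Key-field comparison: keys present in source but missing in target.
missing = [r[0] for r in conn.execute(
    "SELECT id FROM src EXCEPT SELECT id FROM tgt")]
print(rejected, missing)  # 1 [3]
```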
Transformation
1. Create a spreadsheet of scenarios of input data and expected results, and validate these with the business customer. This is an excellent requirements-elicitation step during design and could also be used as part of testing.
2. Create test data that includes all scenarios. Utilize an ETL developer to automate the process of populating the data sets from the scenario spreadsheet, so that the tests stay flexible as the scenarios change.
5. Validate that the data types within the warehouse are the same as specified in the data model or design.
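The scenario-spreadsheet idea above can be sketched as a data-driven test: keep the scenarios in a CSV the business customer can review, and run every row through the transformation. The transformation rule here (uppercasing a region and mapping country names to ISO codes) is purely hypothetical, as are all the column names:

```python
import csv
import io

# Hypothetical transformation under test.
ISO = {"India": "IN", "France": "FR"}

def transform(row):
    return {"region": row["region"].upper(), "country": ISO[row["country"]]}

# The scenario "spreadsheet", kept as CSV so testers can maintain it
# outside the code. In practice this would be a real file.
scenarios = io.StringIO(
    "region,country,expected_region,expected_country\n"
    "south,India,SOUTH,IN\n"
    "west,France,WEST,FR\n"
)

failures = []
for row in csv.DictReader(scenarios):
    out = transform(row)
    if (out["region"], out["country"]) != (row["expected_region"],
                                           row["expected_country"]):
        failures.append(row)
print(len(failures))  # 0
```

Because the scenarios live in data rather than code, adding or changing a scenario only means editing the spreadsheet.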
Quality
1. Number check: if the source formats a value as xx_30 but the target should hold only 30, then it has to be loaded without the prefix (xx_); we need to validate this.
2. Date check: dates have to follow the agreed date format, and it should be the same across all records (standard format: yyyy-mm-dd, etc.).
4. Data check: based on business logic, records which do not meet certain criteria should be filtered out.
Example: only records where date_sid >= 2008 and GLAccount != 'CM001' should be loaded into the target table.
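The three quality checks above can each be expressed as a small predicate or normalizer. A sketch, with the prefix pattern, date format, and business rule taken from the examples in this section (the function names are invented):

```python
import re
from datetime import datetime

def strip_prefix(value):
    """Number check: drop an 'xx_' style prefix so only '30' is loaded."""
    return re.sub(r"^[A-Za-z]+_", "", value)

def is_standard_date(value):
    """Date check: enforce one format (yyyy-mm-dd) across all records."""
    try:
        datetime.strptime(value, "%Y-%m-%d")
        return True
    except ValueError:
        return False

def passes_filter(record):
    """Data check: the business rule from the example above."""
    return record["date_sid"] >= 2008 and record["GLAccount"] != "CM001"

print(strip_prefix("xx_30"))           # 30
print(is_standard_date("2012-11-05"))  # True
print(passes_filter({"date_sid": 2007, "GLAccount": "AB999"}))  # False
```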
Null Validation
Verify the null values in any column for which "Not Null" is specified.
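A minimal sketch of that null check, scanning rows (modeled here as dictionaries, with invented column names) for nulls in the columns declared NOT NULL:

```python
def null_violations(rows, not_null_cols):
    """Return (row index, column) pairs where a NOT NULL column is null."""
    return [(i, col)
            for i, row in enumerate(rows)
            for col in not_null_cols
            if row.get(col) is None]

rows = [{"id": 1, "name": "a"}, {"id": 2, "name": None}]
print(null_violations(rows, ["id", "name"]))  # [(1, 'name')]
```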
Duplicate check
1. Validate that the unique key, the primary key, and any other columns that should be unique per the business requirements do not have duplicate rows.
Example: one policy holder can take multiple policies and make multiple claims.
In this case we need to verify the CLAIM_NO, CLAIMANT_NO,
COVEREGE_NAME, EXPOSURE_TYPE, EXPOSURE_OPEN_DATE,
EXPOSURE_CLOSED_DATE, EXPOSURE_STATUS, PAYMENT
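A duplicate check over a business key is typically a GROUP BY ... HAVING COUNT(*) > 1 query. A sketch using an in-memory SQLite table (the table name and the reduced three-column key are illustrative, not the full column list above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (claim_no TEXT, claimant_no TEXT, exposure_type TEXT);
    INSERT INTO claims VALUES
        ('C1', 'P1', 'AUTO'),
        ('C1', 'P1', 'AUTO'),   -- duplicate business key
        ('C2', 'P1', 'HOME');
""")

# Any group with more than one row violates the expected uniqueness.
dupes = conn.execute("""
    SELECT claim_no, claimant_no, exposure_type, COUNT(*) AS cnt
    FROM claims
    GROUP BY claim_no, claimant_no, exposure_type
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [('C1', 'P1', 'AUTO', 2)]
```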
Date Validation
Date values are used in many areas of ETL development, for example:
http://testing-dwh.blogspot.in/2012/11/etl-test-scenarios-and-test-cases.html 3/5
12/27/2017 ETL Test Scenarios and Test Cases
4. Sometimes, based on the date values, the updates and inserts are generated.
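Point 4 above, deriving inserts vs. updates from a date column, can be sketched as follows. The shape of the data (a key plus a modified date, with the target modeled as a key-to-date mapping) is an assumption for the example:

```python
from datetime import date

def classify(source_rows, target):
    """Decide insert vs. update from key presence and a modified date.

    source_rows: iterable of (key, modified date).
    target: {key: modified date already loaded}.
    """
    inserts, updates = [], []
    for key, modified in source_rows:
        if key not in target:
            inserts.append(key)          # new key -> insert
        elif modified > target[key]:
            updates.append(key)          # newer date -> update
    return inserts, updates

target = {"A": date(2012, 1, 1)}
source = [("A", date(2012, 6, 1)), ("B", date(2012, 2, 1))]
print(classify(source, target))  # (['B'], ['A'])
```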
Complete Data Validation (using minus and intersect)
1. To validate the complete data set between the source and target tables, a minus query is the best solution.
2. We need to run source minus target and target minus source.
4. We also need to match the rows between source and target using an intersect statement.
6. If the minus queries return 0 rows and the intersect count is less than the source count or the target table count, then we can consider that duplicate rows exist.
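The minus/intersect logic above can be sketched with SQLite, whose EXCEPT plays the role of MINUS. The tables and rows are invented; note the target deliberately carries a duplicate row, so both minus queries come back empty while the intersect count falls short of the target count, which is exactly the duplicate signal described in point 6:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, val TEXT);
    CREATE TABLE tgt (id INTEGER, val TEXT);
    INSERT INTO src VALUES (1, 'a'), (2, 'b');
    INSERT INTO tgt VALUES (1, 'a'), (2, 'b'), (2, 'b');
""")

# EXCEPT is SQLite's MINUS; both directions, as in point 2.
src_minus_tgt = conn.execute(
    "SELECT * FROM src EXCEPT SELECT * FROM tgt").fetchall()
tgt_minus_src = conn.execute(
    "SELECT * FROM tgt EXCEPT SELECT * FROM src").fetchall()
inter = conn.execute(
    "SELECT * FROM src INTERSECT SELECT * FROM tgt").fetchall()
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]

# Point 6: empty minus results but intersect count < target count.
has_dupes = (not src_minus_tgt and not tgt_minus_src
             and len(inter) < tgt_count)
print(has_dupes)  # True
```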
Some useful test scenarios
1. Verify that the extraction process did not extract duplicate data from the source (usually this happens in repeatable processes where at point zero we need to extract all data from the source file, but during the next intervals we only need to capture the modified and new rows).
2. The QA team will maintain a set of SQL statements that are automatically run at this stage to validate that no duplicate data has been extracted from the source systems.
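The point-zero vs. incremental extraction described in scenario 1 can be sketched as a checkpoint filter: the first run takes everything, later runs take only rows modified since the last checkpoint, so nothing is re-extracted. The row shape and field names are assumptions for the example:

```python
def incremental_extract(source_rows, last_point):
    """Point-zero extract takes everything; later runs take only rows
    modified after the previous checkpoint."""
    if last_point is None:            # point zero: full extract
        return list(source_rows)
    return [r for r in source_rows if r["modified"] > last_point]

rows = [{"id": 1, "modified": 1}, {"id": 2, "modified": 5}]
full = incremental_extract(rows, None)   # first run
delta = incremental_extract(rows, 3)     # next interval, checkpoint = 3
print(len(full), [r["id"] for r in delta])  # 2 [2]
```

A QA query against this process would then assert that the union of all extracted batches contains each source row exactly once.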
Data cleanness
Unnecessary columns should be deleted before loading into the staging area.
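A minimal sketch of that pre-staging trim, keeping only a whitelist of columns (the row shape and column names are invented):

```python
def drop_columns(rows, keep):
    """Keep only the columns the staging table needs."""
    return [{k: v for k, v in row.items() if k in keep} for row in rows]

rows = [{"id": 1, "name": "a", "debug_flag": "x"}]
print(drop_columns(rows, {"id", "name"}))  # [{'id': 1, 'name': 'a'}]
```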