
SOA Testing using Keyword Driven Test Automation

Peter Gillogley
Overview
• The SOA Challenge
• 7 principles for SOA testing
Service Oriented Architecture

[Layered SOA diagram with HL7 and SOAP interfaces: service consumers at the top (Discharge Summary and other consumer systems); business processes with process choreography; atomic and composite services; service components exposing HL7 and other APIs; and, on the service provider side, the operational systems at the bottom: Patient Administration, Client Directory, Clinical Data Repository, Pathology, Theatre, Radiology, Pharmacy.]
Another view

[Message flow diagram: Patient Administration sends Demographics and CD Update messages (HL7) to the Client Directory, which exposes a Client Directory Lookup over SOAP; Pathology, Theatre and Pharmacy feed the EHR over HL7, with corresponding Pathology, Theatre Update and Pharmacy services exposed over SOAP.]
The System to be covered

Specification                                                      Complexity
Hospital demographics and admission details (HL7)                 18 messages
Patient demographics and episodes synchronisation service (HL7)   33 pages, 5 messages
                                                                   87 pages, 17 messages
                                                                   47 pages, 18 messages
CD Business Rules                                                  33 pages, ~120 rules
                                                                   47 pages, 12 functions
Patient Registration and Update Service (HL7)                      13 pages, 7 messages
Patient demographics lookup (SOAP)                                 25 pages, 3 services
                                                                   27 pages, 3 services
Investigations Service (SOAP)                                      21 pages, 1 service
Procedures Service (SOAP)                                          21 pages, 1 service
Medications Service (SOAP)                                         21 pages, 1 service
7 Principles for SOA Testing
1. Use a flexible automation framework
2. Adopt a keyword driven test approach
3. Apply data driven testing when appropriate
4. Hide mundane data setup
5. Use risk and requirements driven test design
6. Test the testware
7. Maximise the chance of surprises
The Journey

• Manual
• QTP
• FIT
• WATIR
• Model Based (UML)
• Ruby
• DIY Model
• Java / NetBeans
• JUnit
• Keyword Framework
Next Generation Open Source Testing Tool

[Diagram: an IDE and other editors sit above a framework for unit testing and a framework for system testing; these expose internal APIs, external APIs and drivers for interfaces; everything rests on a continuous integration tool and a source control system.]

Elisabeth Hendrickson, Quality Tree Software, Inc.
(inspired by the work of the Agile Alliance Functional Testing Tools Program)
Use a flexible automation framework

[Toolchain diagram: the action word framework runs in the NetBeans IDE alongside JUnit and soapUI, drives the system under test through JAX-WS, HL7 Comm and AutoIt, and is version controlled with TortoiseSVN.]
Adopt a Keyword Driven Test Approach
• Action words / Action Based Testing™ (Hans Buwalda)
• Modular, reusable test components that are built by test architects and then assembled into test scripts by test designers (Forrester 2005)
• An application-independent automation framework… independent of the test automation tool used to execute them and the test script code that "drives" the application-under-test and the data (IBM 2003)
KJRA – Action Word Framework

[Framework diagram: the Keyword Interpreter executes keyword tests, dispatching to native keyword implementations in the Keyword Server and to a Custom Keyword Library that calls the Application Interface (including SOAP translation); both use a shared Logging Class that writes an XML log, rendered for reading via an XSL translation.]
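A minimal sketch of how such an interpreter might dispatch action words to keyword implementations; all names here are illustrative, not the actual KJRA code:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Illustrative action word dispatcher; not the actual KJRA implementation. */
public class KeywordInterpreter {

    /** One keyword implementation handles the argument cells of a test line. */
    public interface Keyword {
        void execute(List<String> args) throws Exception;
    }

    private final Map<String, Keyword> keywords = new HashMap<>();

    /** Custom keyword libraries register their action words here. */
    public void register(String actionWord, Keyword implementation) {
        keywords.put(actionWord, implementation);
    }

    /** Execute one test line: first cell is the action word, the rest are arguments. */
    public void executeLine(List<String> cells) throws Exception {
        Keyword keyword = keywords.get(cells.get(0));
        if (keyword == null) {
            throw new IllegalArgumentException("Unknown action word: " + cells.get(0));
        }
        keyword.execute(cells.subList(1, cells.size()));
    }
}
```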
Example Keyword Test
Keyword Test Result
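A keyword test pairs an action word with argument cells on each line; the rows below are invented for this sketch (the original slides showed screenshots of a real test and its result):

```java
/** Illustrative keyword test in the action word style; these action words
 *  are invented for this sketch, not taken from the actual test suite. */
public class ExampleKeywordTest {
    // Each row: action word followed by its argument cells.
    static final String[][] TEST_LINES = {
        {"create_random_patient",   "patientA"},
        {"send_admission",          "patientA", "WARD3"},
        {"lookup_client_directory", "patientA"},
        {"check_field",             "//surname", "{patientA.surname}"},
        {"check_response_code",     "AA"}
    };
}
```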
Apply Data Driven Testing
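As an illustration of the principle (not from the original deck), a parameterised JUnit 4 test runs the same check over a table of inputs; the simple sex code mapping below stands in for a real keyword sequence:

```java
import static org.junit.Assert.assertEquals;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

/** Illustrative data driven test: the same checks run once per data row. */
@RunWith(Parameterized.class)
public class SexCodeMappingTest {

    @Parameters
    public static Collection<Object[]> data() {
        // Each row: HL7 input code, expected value in the SOAP response.
        return Arrays.asList(new Object[][] {
            {"M", "Male"},
            {"F", "Female"},
            {"U", "Unknown"}
        });
    }

    private final String hl7Code;
    private final String expected;

    public SexCodeMappingTest(String hl7Code, String expected) {
        this.hl7Code = hl7Code;
        this.expected = expected;
    }

    @Test
    public void sexCodeIsMappedInLookupResponse() {
        // mapSexCode stands in for the real keyword that sends an HL7
        // message and queries the lookup service.
        assertEquals(expected, mapSexCode(hl7Code));
    }

    private static String mapSexCode(String code) {
        if ("M".equals(code)) return "Male";
        if ("F".equals(code)) return "Female";
        return "Unknown";
    }
}
```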
Hide Mundane Data Driven Setup

• Creates a person with random content for approx 100 demographic fields (e.g. surname, first name, middle name, DOB, sex, religion, country_of_birth)
• Ensures names and identifiers are valid and unique
• Creates and sends an HL7 message populated with the data generated above (see the sketch after this list)
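A minimal sketch of such a setup keyword; the field list and helper names are illustrative, and the real keyword populated around 100 fields:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Random;
import java.util.concurrent.atomic.AtomicLong;

/** Illustrative setup keyword: generates a random but valid patient. */
public class PatientFactory {

    private final Random random = new Random();
    // Monotonic counter keeps generated identifiers unique within a run.
    private final AtomicLong nextId = new AtomicLong(1);

    public Map<String, String> createRandomPatient() {
        Map<String, String> patient = new HashMap<>();
        long id = nextId.getAndIncrement();
        patient.put("patient_id", String.format("TEST%08d", id));
        // Embedding the unique id in the surname guarantees uniqueness
        // while keeping the value a valid name.
        patient.put("surname", "AUTO" + id);
        patient.put("first_name", pick("JOHN", "MARY", "WEI", "PRIYA"));
        patient.put("sex", pick("M", "F", "U"));
        patient.put("dob", String.format("19%02d%02d%02d",
                random.nextInt(100), 1 + random.nextInt(12), 1 + random.nextInt(28)));
        // ... the real keyword fills approx 100 demographic fields.
        return patient;
    }

    private String pick(String... values) {
        return values[random.nextInt(values.length)];
    }
}
```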
Develop Keywords for Web Services

• Query the web service (store the response)
• Parameterised checks of the content of the response (sketched below)
• Specific keywords for regularly used checks
• Specific keywords for repeating fields
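A sketch of the query and check keywords, assuming the stored response is XML and parameterised checks are expressed as XPath; service and method names are illustrative:

```java
import static org.junit.Assert.assertEquals;

import java.io.StringReader;

import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathExpressionException;
import javax.xml.xpath.XPathFactory;

import org.xml.sax.InputSource;

/** Illustrative web service keywords: query once, then check fields by XPath. */
public class WebServiceKeywords {

    private String lastResponse;  // response stored by the query keyword

    /** Keyword: query the web service and store the SOAP response. */
    public void queryDemographicsLookup(String patientId) {
        // callService stands in for the real JAX-WS / soapUI call.
        lastResponse = callService(patientId);
    }

    /** Keyword: parameterised check of one field in the stored response. */
    public void checkField(String xpath, String expected) throws XPathExpressionException {
        XPath xp = XPathFactory.newInstance().newXPath();
        String actual = xp.evaluate(xpath, new InputSource(new StringReader(lastResponse)));
        assertEquals("Mismatch at " + xpath, expected, actual);
    }

    private String callService(String patientId) {
        // Canned response for illustration only.
        return "<lookupResponse><patient><id>" + patientId
                + "</id><surname>AUTO42</surname></patient></lookupResponse>";
    }
}
```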
Develop Keywords for HL7
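A minimal sketch of the mechanics underneath such keywords: building a pipe delimited HL7 v2 message and sending it over MLLP framing (segment content and connection details are illustrative):

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

/** Illustrative HL7 keyword support: build a v2 message and send it over MLLP. */
public class Hl7Sender {

    // MLLP framing bytes: <VT> message <FS><CR>
    private static final byte START = 0x0B;
    private static final byte END1 = 0x1C;
    private static final byte END2 = 0x0D;

    /** Build a minimal ADT^A01 message; real keywords fill many more fields. */
    public static String buildAdtA01(String patientId, String surname, String firstName) {
        String cr = "\r";  // HL7 v2 segments are separated by carriage returns
        return "MSH|^~\\&|TESTWARE|HOSP|PAS|HOSP|20081101120000||ADT^A01|MSG0001|P|2.4" + cr
             + "PID|1||" + patientId + "||" + surname + "^" + firstName + cr;
    }

    /** Send the message and return the raw acknowledgement. */
    public static String send(String host, int port, String message) throws Exception {
        try (Socket socket = new Socket(host, port)) {
            OutputStream out = socket.getOutputStream();
            out.write(START);
            out.write(message.getBytes(StandardCharsets.US_ASCII));
            out.write(END1);
            out.write(END2);
            out.flush();

            // Read the ACK up to the MLLP end-of-block marker.
            InputStream in = socket.getInputStream();
            StringBuilder ack = new StringBuilder();
            int b;
            while ((b = in.read()) != -1 && b != END1) {
                if (b != START) ack.append((char) b);
            }
            return ack.toString();
        }
    }
}
```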
Use Risk and Requirements to Drive
Test Case Design
• Test management tool used to document
risks and requirements (Quality Centre)
• Keyword tests linked to relevant risks and
requirements
• Risk and requirements coverage managed by
test management tool.
• Test execution and reporting managed by test
management tool (manually initiated)
Test the Testware

• Keywords and underlying methods / classes
• Testware regression suite (example below)
• Potential false positives
• Test framework needs to be stable
• Testware bugs sap tester morale
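For example, the testware regression suite can exercise the test code itself; a sketch using the illustrative PatientFactory above:

```java
import static org.junit.Assert.assertNotEquals;

import java.util.Map;

import org.junit.Test;

/** Part of the testware regression suite: test the test code itself. */
public class PatientFactoryTest {

    @Test
    public void generatedPatientsHaveUniqueIdentifiers() {
        PatientFactory factory = new PatientFactory();
        Map<String, String> first = factory.createRandomPatient();
        Map<String, String> second = factory.createRandomPatient();
        assertNotEquals(first.get("patient_id"), second.get("patient_id"));
    }
}
```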
Maximise Surprises

• "Simple" test cases are easier to read and maintain but…
• Unexpected outcomes may be discovered by:
  – Checking outside the focus of the test case
  – Not immediately blaming the testware for intermittent errors
  – Sufficient logging to support analysis of incomplete test case execution
  – Randomised input data (see the seed logging sketch below)
  – Frequent execution of test cases
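Randomised input only pays off if failures can be reproduced; one common tactic (an assumption of this sketch, not stated in the deck) is to log the random seed at the start of every run:

```java
import java.util.Random;
import java.util.logging.Logger;

/** Sketch: randomised data that stays reproducible via a logged seed. */
public class ReproducibleRandom {

    private static final Logger LOG = Logger.getLogger(ReproducibleRandom.class.getName());

    public static Random newLoggedRandom() {
        long seed = System.currentTimeMillis();
        // Logging the seed lets an intermittent failure be replayed exactly.
        LOG.info("Test run random seed: " + seed);
        return new Random(seed);
    }
}
```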
Lessons Learnt
• Automation is a powerful way to check complex interfaces
• Test scenarios may identify specification gaps
• Possible to get high requirements coverage
But…
• Don't underestimate the number of keywords that need to be developed
• Framework bugs affect morale and schedule
• Need domain and automation knowledge
• Business scenarios may identify specification gaps
Questions

Peter Gillogley
( http://www.gillogley.com )
