
SOFTWARE QUALITY ASSURANCE PLAN

FOR THE

Online Library Management System

CS690

Date:
10/13/2016
Prepared by:
Roja Boina

TABLE OF CONTENTS

1. Identification and Resources
   1.1 Introduction
   1.2 Organization and Resources
2. Product Evaluation Procedures
3. Product Evaluation Records
4. Product Evaluation Activities
5. Appendices


1. IDENTIFICATION AND RESOURCES

1.1 Introduction
Project name: Online Library Management System
Project description: The Online Library Management System maintains information about the books in the library, their authors, the members to whom books are issued, and the library staff. Maintaining all of this information manually is a complex and error-prone task; with the advancement of technology, organizing the library online becomes much simpler.
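The entities described above (books, authors, members, and issued books) can be sketched as a minimal Java domain model. The class and field names here are illustrative assumptions, not the project's actual design:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the library domain; names are illustrative only.
class Book {
    final String isbn;
    final String title;
    final String author;
    boolean issued = false;

    Book(String isbn, String title, String author) {
        this.isbn = isbn;
        this.title = title;
        this.author = author;
    }
}

class Member {
    final String memberId;
    final List<Book> borrowed = new ArrayList<>();

    Member(String memberId) { this.memberId = memberId; }
}

class Library {
    private final List<Book> catalog = new ArrayList<>();

    void addBook(Book b) { catalog.add(b); }

    // A copy already on loan cannot be issued again.
    boolean issue(Book b, Member m) {
        if (b.issued) return false;
        b.issued = true;
        m.borrowed.add(b);
        return true;
    }

    boolean returnBook(Book b, Member m) {
        if (!m.borrowed.remove(b)) return false;
        b.issued = false;
        return true;
    }
}
```

The `issue` check encodes the core business rule (one loan per copy) that the quality activities below would verify during functional testing.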
1.2 Organization and Resources
All members of the Online Library management system project team will play a role in quality management. It is
imperative that the team ensures that work is completed at an adequate level of quality from individual work
packages to the final project deliverable.
Quality roles and responsibilities for the Online library management system are as follows:
Project Sponsor (Professor John Izzo)

- Responsible for approving all quality standards for the project
- Review quality reports and assist in the resolution of escalated issues
- Sign-off authority on the final acceptance of the project deliverables


Project Director (Roja)

- Implement the Quality Management Plan to ensure all tasks, processes, and documentation are compliant with the plan
- Responsible for quality management throughout the duration of the project
- Collaborate with the Quality Manager, Quality Specialists, and Process Owners in the development of quality metrics and standards by phase
- Ensure team member compliance with quality management processes
- Support the Quality Manager in securing resources to perform quality management
- Participate in quality management reviews as required
- Provide oversight to the closure of corrective actions arising from quality reviews
- Communicate quality standards to the project team and stakeholders


Quality Manager (Roja)

- Provide overall leadership of quality management activities, including managing quality reviews and overseeing follow-on corrective actions
- Develop and maintain the project software quality assurance plan
- Generate and maintain a schedule of software quality assurance activities
- Collaborate with the Senior Project Director, Quality Specialists, and Process Owners in the development of quality metrics and standards
- Schedule and perform evaluations of process quality assurance reviews
- Escalate non-compliance issues to the Senior Project Director
- Update the Quality Management Plan and maintain the overall quality standards for the project processes and products
- Provide oversight to the closure of corrective actions arising from quality reviews
Quality Specialists/Team Leads/Managers (Roja)

- Oversee and support the application of quality standards for the project processes and products to their respective team members
- Collaborate with the Senior Project Director, Quality Manager, and Process Owners in the development of the quality plan, including quality metrics and standards
- Participate in quality management reviews as required
- Perform QA activities and QC inspections as appropriate
- Recommend tools and methodologies for tracking quality and standards to establish acceptable quality levels
- Create and maintain Quality Control and Assurance Logs throughout the project
- Conduct process and product assessments, as described within this plan, using objective criteria
- Communicate results from assessments to relevant stakeholders
- Ensure resolution of non-compliance instances and escalate any issues that cannot be resolved within the project
- Identify lessons learned that could improve processes for future products
- Develop and maintain metrics


Process Owners (Roja)

- Oversee and support the application of quality standards for the Online Library Management System project processes to their assigned processes
- Collaborate with the Senior Project Director, Quality Manager, and Quality Specialists in the development of quality metrics and standards
- Participate in quality management reviews as required


The Quality Assurance Team for each module is responsible for determining the resource and information requirements necessary to support the operation and monitoring of quality system processes, and for communicating these requirements to top management. Top executive management is responsible for ensuring the availability of the necessary resources and information.
The project team may use many different tools when performing quality management activities for the project. These tools are listed below:
Software Quality Tools

- Microsoft Office tools (i.e., Project, Word, Excel, and PowerPoint)
- Defect management repository and defect tracking software
- Test management software
- Test management repository

Project Management Tools

- Online Library Management System project server
- Online Library Management System risk and issue management system
- Software vendor web sites and/or software development lifecycle asset/artifact repositories (as applicable)
- Deliverables repository
- Software vendor problem reporting
- Schedule management and tracking software

2. PRODUCT EVALUATION PROCEDURES

A deep analysis of the best quality techniques will be applied when the time comes; meanwhile, quality control will be performed continuously by doing and then verifying work. Below is a list of tools and software that will be used for the project; the list is subject to change throughout the project:

- Java programming language
- MySQL
- Visual Basic
- MS Project 2000
- MS FrontPage and HTML 4
- FTP software

3. PRODUCT EVALUATION RECORDS

The project team will maintain records that document assessments performed on the project. Maintaining
these records will provide objective evidence and traceability of assessments performed throughout the
project's life cycle. Example records include the process and product assessment reports, completed
checklists, metrics, and weekly/monthly status reports. The project will use a shared document repository to
contain the reporting data and the reports produced as part of the quality activities and reviews. The records
will be maintained through the implementation phase of the project.
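The metrics kept in these records can start as simply as a defect density computed from the assessment reports. This is a sketch under the assumption that defects are counted per module and size is measured in KLOC; neither unit is mandated by this plan:

```java
// Illustrative quality metric only; this plan does not mandate a formula.
// Defect density = defects found / size in KLOC (thousand lines of code).
class DefectMetrics {
    static double defectDensity(int defectsFound, double kloc) {
        if (kloc <= 0) {
            throw new IllegalArgumentException("kloc must be positive");
        }
        return defectsFound / kloc;
    }
}
```

Tracking this value per weekly or monthly status report gives the objective, traceable evidence the records are meant to provide.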

4. PRODUCT EVALUATION ACTIVITIES

Software Product Evaluation is an instrument that supports controlling the actual implementation of investment proposals. This is done by translating the investment proposal into a quality profile, on the basis of which the actual software product can be created. Software Product Evaluation provides the procedure to compare different profiles at different moments during implementation. A clear description of the translation process from investment proposal into software product quality profile is needed, however, and is not yet available. Software Product Evaluation is an emerging area in both academia and industry: the market is growing, and the number of open research questions is increasing. The main research areas, problems, and questions divide over three aspects: building a quality profile, improving software implementation by selecting appropriate actions, and designing and executing evaluation activities.

The key component of Software Product Evaluation is specifying software product quality in a quality profile. This follows from the starting point that no evaluation is possible without a proper specification of the needs for the software product. The quality profile is used for two purposes: (a) designing and executing the evaluation, because the quality profile points out the quality priorities for the product and therefore describes the quality characteristics that must be evaluated most thoroughly; and (b) improving the software development processes, because specific actions must be taken during implementation to make sure the actual software product is created in conformance with the quality profile. Full support is not yet possible, but the available methods, techniques, and tools are currently being identified and expanded in several international research settings.

Software Product Evaluation also supports the learning process of evaluating IT investment proposals, because it bridges the time between the investment decision and the final implementation. Experience with past proposals should be captured and compared to actual implementations; such an experience base will support the evaluation of new IT investment proposals in the future. Software Product Evaluation is therefore a powerful tool to support this learning process.
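The quality profile described above can be sketched as a mapping from quality characteristics to required evaluation levels. The characteristic names follow ISO 9126; the class shape and the A/B priority rule are illustrative assumptions:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of a quality profile: each ISO 9126 characteristic
// is assigned an evaluation level from D (least stringent) to A (most stringent).
class QualityProfile {
    private final Map<String, Character> levels = new LinkedHashMap<>();

    void require(String characteristic, char level) {
        levels.put(characteristic, level);
    }

    // Characteristics at level A or B are the quality priorities and
    // must be evaluated most thoroughly (assumed threshold).
    boolean isPriority(String characteristic) {
        Character level = levels.get(characteristic);
        return level != null && (level == 'A' || level == 'B');
    }
}
```

A profile like this serves both purposes named above: it selects which characteristics the evaluation must cover most thoroughly, and it signals where implementation effort must be concentrated.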
Quality characteristics (ISO 9126, 1996):

Functionality - the capability of the software to provide functions which meet stated and implied needs when the software is used under specified conditions.
  - Suitability - the capability of the software to provide an appropriate set of functions for specified tasks and user objectives.
  - Accuracy - the capability of the software to provide right or agreed results or effects.
  - Interoperability - the capability of the software to interact with one or more specified systems.
  - Security - the capability of the software to prevent unintended access and resist deliberate attacks intended to gain unauthorized access to confidential information, or to make unauthorized modifications to information or to the program so as to provide the attacker with some advantage or so as to deny service to legitimate users.

Reliability - the capability of the software to maintain the level of performance of the system when used under specified conditions.
  - Maturity - the capability of the software to avoid failure as a result of faults in the software.
  - Fault tolerance - the capability of the software to maintain a specified level of performance in cases of software faults or of infringement of its specified interface.
  - Recoverability - the capability of the software to re-establish its level of performance and recover the data directly affected in the case of a failure.

Usability - the capability of the software to be understood, learned, used and liked by the user, when used under specified conditions.
  - Understandability - the capability of the software product to enable the user to understand whether the software is suitable, and how it can be used for particular tasks and conditions of use.
  - Learnability - the capability of the software product to enable the user to learn its application.
  - Operability - the capability of the software product to enable the user to operate and control it.
  - Attractiveness - the capability of the software product to be liked by the user. (ISO CD 9126 (1997) added some sub-characteristics to ISO 9126 (1991); 'attractiveness', under 'usability', is one example.)

Efficiency - the capability of the software to provide the required performance, relative to the amount of resources used, under stated conditions.
  - Time behavior - the capability of the software to provide appropriate response and processing times and throughput rates when performing its function, under stated conditions.
  - Resource utilization - the capability of the software to use appropriate resources in an appropriate time when the software performs its function under stated conditions.

Maintainability - the capability of the software to be modified.
  - Analyzability - the capability of the software product to be diagnosed for deficiencies or causes of failures in the software, or for the parts to be modified to be identified.
  - Changeability - the capability of the software product to enable a specified modification to be implemented.
  - Stability - the capability of the software to minimize unexpected effects from modifications of the software.
  - Testability - the capability of the software product to enable modified software to be validated.

Portability - the capability of software to be transferred from one environment to another.
  - Adaptability - the capability of the software to be modified for different specified environments without applying actions or means other than those provided for this purpose for the software considered.
  - Installability - the capability of the software to be installed in a specified environment.
  - Co-existence - the capability of the software to co-exist with other independent software in a common environment sharing common resources.
  - Replaceability - the capability of the software to be used in place of other specified software in the environment of that software.

Evaluation techniques are necessary to evaluate software products. An evaluation technique is a measurement procedure which also contains the pass/fail criteria for the measurement results. For example, an evaluation technique for analyzability that measures failure analysis time should state explicitly whether one hour or one week is acceptable. Those pass/fail criteria are necessary to decide about accepting the product. The pass/fail criteria of an evaluation technique must be applied to a restricted range of software products: only measures for which the criteria can be applied are valid for the product. Therefore, it is necessary to specify quality first, before evaluation is
conducted.

Quality characteristics and evaluation level

Evaluation techniques are related to quality characteristics or quality sub-characteristics. The level of evaluation concept was introduced by the SCOPE project (Software Certification Programme in Europe, Esprit II, P2151) to correlate the stringency of the application of an evaluation technique to the context of use of the software product (Robert, 1994). Products with different application risks must not be evaluated with equal stringency. For example, a word processor has a lower application risk than the security system of a nuclear power plant; therefore, the word processor requires a less thorough evaluation. Four levels are distinguished by increasing risk: D, C, B, and A. Table 3 presents a proposal, taken from the SCOPE project, of evaluation techniques for these four levels and quality characteristics.
Characteristic  | D                                    | C                                  | B                                   | A
----------------|--------------------------------------|------------------------------------|-------------------------------------|-------------------------------
Functionality   | functional testing                   | review (checklists)                | component testing                   | formal proof
Reliability     | programming language facilities      | fault tolerance analysis           | reliability growth model            | formal proof
Usability       | user interface inspection            | conformity to interface standards  | laboratory testing                  | user mental model
Efficiency      | execution time measurement           | benchmark testing                  | algorithmic complexity              | performance profiling analysis
Maintainability | inspection of documents (checklists) | static analysis                    | analysis of development process     | traceability evaluation
Portability     | analysis of installation             | conformity to programming rules    | environment constraints evaluation  | program design evaluation
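Since the project's tooling is Java-based, Table 3 can also be carried into code as a simple lookup from quality characteristic and evaluation level to the recommended technique. The class and method names are illustrative assumptions; the data is transcribed from the table above:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative lookup over Table 3: (characteristic, level) -> technique.
// Levels run from D (lowest risk) to A (highest risk).
class EvaluationTechniques {
    private static final Map<String, String> TABLE = new HashMap<>();

    private static void row(String characteristic, String d, String c, String b, String a) {
        TABLE.put(characteristic + ":D", d);
        TABLE.put(characteristic + ":C", c);
        TABLE.put(characteristic + ":B", b);
        TABLE.put(characteristic + ":A", a);
    }

    static {
        row("Functionality", "functional testing", "review (checklists)",
            "component testing", "formal proof");
        row("Reliability", "programming language facilities", "fault tolerance analysis",
            "reliability growth model", "formal proof");
        row("Usability", "user interface inspection", "conformity to interface standards",
            "laboratory testing", "user mental model");
        row("Efficiency", "execution time measurement", "benchmark testing",
            "algorithmic complexity", "performance profiling analysis");
        row("Maintainability", "inspection of documents (checklists)", "static analysis",
            "analysis of development process", "traceability evaluation");
        row("Portability", "analysis of installation", "conformity to programming rules",
            "environment constraints evaluation", "program design evaluation");
    }

    static String techniqueFor(String characteristic, char level) {
        return TABLE.get(characteristic + ":" + level);
    }
}
```

A lookup like this could let the quality plan's checklists be generated from the chosen evaluation level instead of maintained by hand.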

5. APPENDICES
A: Admin, Abbreviation, Acronym, Assumptions; B: Books, Business rules; C: Class, Client,
Conventions; D: Data requirement, Dependencies; G: GUI; K: Key; L: Library, Librarian; M:
Member; N: Non-functional Requirement; O: Operating environment; P: Performance, Perspective,
Purpose; R: Requirement, Requirement attributes; S: Safety, Scope, Security, System features; U:
User, User class and characteristics, User requirement
