
OCCUPATION: STRUCTURAL CONSTRUCTION WORKS

LEVEL II

UNIT OF COMPETENCY: APPLY QUALITY CONTROL

MODULE TITLE: APPLYING QUALITY CONTROL


CONTENTS

Introduction
Summary of Learning Outcomes
How to Use This TTLM
Learning Guide
Information Sheet #1
Self-Check #1
Model Answer #1
Operation Sheet #1
LAP Test 1
Reference Books
Information Sheet #2
Self-Check #2
Model Answer #2
Operation Sheet #2
LAP Test 2
Reference Books
Information Sheet #3
Self-Check #3
Model Answer #3
Operation Sheet #3
LAP Test 3
Reference Books
Information Sheet #4
Operation Sheet #4

INTRODUCTION

Welcome to the module “Applying Quality Control”.


This learner’s guide was prepared to help you achieve the required competence in
“Structural Construction Works Level II”. It will be your source of information for acquiring the
knowledge and skills of this particular occupation with minimum supervision or help from your trainer.
Summary of Learning Outcomes

At the end of the module, the trainee will be able to:

LO1 Establish quality standards

LO2 Assess quality of service delivered

LO3 Record information

LO4 Study causes of quality deviations

LO5 Complete documentation

How to Use This TTLM

• Read through the Learning Guide carefully. It is divided into sections that cover all the skills and knowledge that you need.

• Read the Information Sheets and complete the Self-Check at the end of each section to check your progress.

• Read and practice the activities in the Operation Sheets. Ask your trainer to show you the correct way to do things, or talk to a more experienced person for guidance.

• When you are ready, ask your trainer to arrange an institutional assessment and to provide you with feedback on your performance.
LEARNING GUIDE

Learning Activities and Special Instructions/Resources

Read the topics of LO1. Establish quality standards (refer to Information Sheet #1):
 • Developing and implementing quality standards, policies and procedures
 • Documenting quality standards, policies and procedures
 • Updating quality standards, policies and procedures

Read the topics of LO2. Assess quality of service delivered (refer to Information Sheet #2):
 • Evaluation techniques
 • Quality checking procedures
 • Applying corrective actions

Read the topics of LO3. Record information (refer to Information Sheet #3):
 • Recording quality performance
 • Maintaining records

Read the topics of LO4. Study causes of quality deviations (refer to Information Sheet #4):
 • Identifying faulty or poor service
 • Identifying causes of deviations
 • Identifying and applying corrective actions

Read the topics of LO5. Complete documentation (refer to Information Sheet #5):
 • Recording information on quality and other indicators
 • Reporting procedures

INFORMATION SHEET: ONE


Introduction

Quality control (QC) is a procedure or set of procedures intended to ensure that a manufactured
product or performed service adheres to a defined set of quality criteria or meets the requirements of
the client or customer. QC is similar to, but not identical with, quality assurance (QA). QA is defined as a
procedure or set of procedures intended to ensure that a product or service under development (before
work is complete, as opposed to afterwards) meets specified requirements. QA is sometimes expressed
together with QC as a single expression, quality assurance and control (QA/QC).

In order to implement an effective QC program, an enterprise must first decide which specific standards
the product or service must meet. Then the extent of QC actions must be determined (for example, the
percentage of units to be tested from each lot). Next, real-world data must be collected (for example,
the percentage of units that fail) and the results reported to management personnel. After this,
corrective action must be decided upon and taken (for example, defective units must be repaired or
rejected and poor service repeated at no charge until the customer is satisfied). If too many unit failures
or instances of poor service occur, a plan must be devised to improve the production or service process
and then that plan must be put into action. Finally, the QC process must be ongoing to ensure that
remedial efforts, if required, have produced satisfactory results and to immediately detect recurrences
or new instances of trouble.
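
To make the arithmetic in these steps concrete, the short Python sketch below computes how many units to test from a lot and whether an observed failure rate calls for corrective action. The 10% sampling rate and 2% allowable failure rate are assumed values for illustration, not figures from any particular standard.

# Hedged sketch of the quantitative QC decisions described above.
# The 10% sampling rate and 2% allowable failure rate are assumptions.

def units_to_test(lot_size: int, sample_rate: float = 0.10) -> int:
    """Extent of QC action: how many units to test from each lot."""
    return max(1, round(lot_size * sample_rate))

def needs_corrective_action(tested: int, failed: int,
                            max_failure_rate: float = 0.02) -> bool:
    """Compare the observed failure rate against the allowed rate."""
    return (failed / tested) > max_failure_rate

lot = 500
sample = units_to_test(lot)                    # 50 units from a 500-unit lot
print(f"Test {sample} of {lot} units")
print("Corrective action needed:",
      needs_corrective_action(tested=sample, failed=3))  # 6% > 2% -> True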

The purposes of this quality program are:

 • To establish the required construction quality procedures and reporting requirements for
pre-activity meetings for each activity performed on the project.

 • To provide a plan that outlines the construction quality checks to be performed during the
Project. This plan defines the roles and responsibilities of the Contractor with respect to
construction quality. All construction activities associated with this Project will be completed
under this quality program. Elements will be built by skilled craftsmen following approved
drawings and using proper materials and equipment. Inspections and tests will be performed as
outlined in this Plan to document that the construction is proper and meets Project requirements.
Periodically, the process will be refined based on worker-identified quality problems, test data,
inspection results, audits, and satisfaction with the Project. Quality control comprises the acts of
examining, witnessing, inspecting, checking, and testing and, when necessary, revising in-process
or completed design work, including plan sheets, studies, charts, and reports, to determine
conformity with contract requirements.

LO1 Establish quality standards

What is Quality Control?

• A process or system for monitoring the quality of laboratory testing, and the accuracy and
precision of results

• Routinely collects and analyzes data from every test run or procedure

• Allows for immediate corrective action

Designing a QC Program

• Establish written policies and procedures, including corrective action procedures

• Train all staff

• Design forms

• Assure complete documentation and review

Implementing a QC Program – Quantitative Tests

• Select high-quality controls

• Collect at least 20 control values over a period of 20-30 days for each level of control

• Perform statistical analysis (see the sketch after this list)

• Develop a Levey-Jennings chart

• Monitor control values using the Levey-Jennings chart and/or Westgard rules

• Take immediate corrective action, if needed, and record the actions taken
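
A minimal Python sketch of the statistical step above: it derives Levey-Jennings control limits (mean ± 1, 2, and 3 SD) from 20 baseline control values, then flags any run that violates the Westgard 1-3s rule (one control value beyond ±3 SD). The control values are invented for illustration.

from statistics import mean, stdev

# 20 baseline control values (invented numbers for illustration)
baseline = [4.0, 4.1, 3.9, 4.2, 4.0, 3.8, 4.1, 4.0, 3.9, 4.2,
            4.1, 4.0, 3.9, 4.0, 4.1, 3.8, 4.2, 4.0, 4.1, 3.9]

m, sd = mean(baseline), stdev(baseline)
# Levey-Jennings limits at 1, 2, and 3 standard deviations
limits = {k: (m - k * sd, m + k * sd) for k in (1, 2, 3)}

def violates_1_3s(value: float) -> bool:
    """Westgard 1-3s rule: reject the run if one control value
    falls outside the mean +/- 3 SD limits."""
    low, high = limits[3]
    return value < low or value > high

for run_value in (4.05, 4.55):
    status = "REJECT - corrective action" if violates_1_3s(run_value) else "in control"
    print(f"control value {run_value}: {status}")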

Selecting Control Materials

Calibrator:

• Has a known concentration of the substance (analyte) being measured

• Used to adjust an instrument, kit, or test system in order to standardize the assay

• Sometimes called a standard, although usually not a true standard

• This is not a control

Control:

• Has a known concentration of the analyte

• Use two or three levels of controls

• Include controls with patient samples when performing a test

• Used to validate the reliability of the test system


Quality Control operates under the Project Manager. Quality Control is the responsibility of the
Contractor during all phases (e.g., design, fabrication, construction, environmental compliance)
of the Project.
Staff performing QC functions have sufficient authority and organizational freedom to identify
quality problems and to initiate, recommend, provide, and verify implementation of
solutions. QC staff may be directly or indirectly involved with design or construction production
activities.

Inspection of Masonry

• Requirements based on classification of the building or nature of occupancy

• Exceptions based on design strength and type of occupancy (nonessential buildings, etc.)

Masonry Inspection Key Points

Pre-Construction Activities:

• Strength tests of masonry prisms (at least 28 days prior to masonry construction)

• Blocks are approved for use prior to construction by compressive strength tests in an approved laboratory

During Masonry Construction:

• Cell cleanliness and cleanouts

• Steel reinforcement cleanliness

• Lap length, size, spacing, number, and placement of reinforcement

• Measuring spacing between vertical steel reinforcement

• Visual inspection of steel reinforcement to check for placement, size, etc.

• Vertical steel placement

• Check cell cleanliness (i.e., no mortar protrusions beyond the allowable limit and no debris to clog cells)

• Cell cleanouts prior to grout placement

• Cleanout after grout placement shows that grout made it to the bottom of the cell

• Monitor grout placement and sample for strength tests

• Sample block of each type for compressive strength testing (a strength-check sketch follows this list)
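
As a worked illustration of the strength-testing items above, the Python sketch below averages a set of 28-day compressive strength results and compares the average against a specified strength. Both the acceptance rule (average must meet or exceed the specified value) and the numbers are assumptions for illustration; the project specification and governing masonry code define the actual criteria.

from statistics import mean

def strength_acceptable(results_mpa: list[float], specified_mpa: float) -> bool:
    """Assumed acceptance rule: the average of the test set must
    meet or exceed the specified compressive strength."""
    return mean(results_mpa) >= specified_mpa

prism_results = [14.2, 13.8, 14.6]   # hypothetical 28-day results, in MPa
print("Average strength:", round(mean(prism_results), 1), "MPa")
print("Accept:", strength_acceptable(prism_results, specified_mpa=13.8))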


SELF CHECK - 1

1. Explain the purposes of quality control (QC).

2. What is quality control?

3. Explain the factors considered during inspection of masonry.

MODEL ANSWER 1

1. Quality control (QC) is a procedure or set of procedures intended to ensure that a
manufactured product or performed service adheres to a defined set of quality criteria or
meets the requirements of the client or customer.

2. A process or system for monitoring the quality of laboratory testing, and the accuracy and
precision of results.

3. The factors considered during inspection of masonry are based on the classification of the
building or the nature of occupancy, and on design strength and type of occupancy
(nonessential buildings, etc.).
INFORMATION SHEET: TWO

LO2 Assess quality of service delivered

Districts should evaluate each potential DDM (district-determined measure) to determine quality and verify the
appropriateness of the assessment for use as a DDM. The basic process involves gathering
information about the assessment’s purpose and quality and conducting an internal review of
the assessment. During the internal review, districts should consider numerous criteria using
the core assessment concepts that were described in Section 2. Descriptions of the criteria to
be evaluated are outlined below to guide districts through the remainder of the process.

Step 1: Gather Documentation Regarding the Quality of the DDM (Reviews, Critiques).
For each promising assessment considered for use as a DDM, districts should gather
documentation regarding the quality of the instrument. When possible, information should be
gathered from sources external to the publisher or developer as well as from the authors/test
developers. For assessments developed by the district, this process may include collecting feedback
from teachers who have used the assessment. For commercially developed assessments, this
process may include reviewing documentation from the following sources, where available:

• Published information from sources external to the test developers, such as:
   - Formal reviews published by sources external to the author or publisher
   - Informal reviews of the assessment published by research or evaluation groups
• Published or unpublished information from the author or publisher of the assessment, such as:
   - Technical manual
   - Administration manual
   - Policies regarding the assessment
   - Any other available information (e.g., recent reliability data, newly created norms)
• Unpublished information from sources external to the assessment authors (e.g., teachers in
the district or other districts who are using or have used the assessment), such as:
   - Testimonials, ratings, or other user feedback
   - Data that demonstrate the quality of the assessment
   - Any other available information that can be shared
Vet the material gathered about the assessment’s quality with an eye toward how
knowledgeable and current the source is. Knowledgeable sources will include information
about the assessment concepts outlined in Section 2, including construction, reliability, validity,
bias checks, and administration and reporting procedures. Knowledgeable sources will further
address the quality of the assessment as it aligns to the purpose or use of the assessment.
(Here, reviewers will be interested in information about assessment quality for the purpose of
tracking student progress.)

Step 2: Use the Documentation to Evaluate the Assessment.


Review the documentation gathered to determine whether the assessment is of sufficient
quality for use as a DDM. Because the information may exist in one or more locations in the
gathered documentation, the criteria below are organized by topic. Following the tasks below
will assist the district in conducting a systematic evaluation. Appendix A, the Assessment
Quality Checklist and Tracking Tool, provides districts with a place to collect evidence and
information on potential DDM characteristics and quality.

The following list provides the basic components of and information about an assessment that
districts should collect and review. If components or information are missing but the assessment
is a promising candidate for use as a DDM, districts can begin to develop the missing components
and/or gather evidence of the assessment’s quality (e.g., pilot data). Note that the evaluation
process can be halted at any time if the review team determines that the assessment is not
appropriate for use as a DDM.

The Assessment Quality Checklist and Tracking Tool will aid districts in documenting the
following steps:

Identify general information about the assessment.
   - Grade/subject or course: Identify the grade/subject or course aligned to the DDM.
   - Potential DDM name or title.
   - Potential DDM source: Identify the source of the assessment (e.g., district-developed,
     commercially developed, developed by another school district).
   - Type of assessment: Refer to the types of assessment described previously in Section 2.
     Indicate whether the potential DDM is an on-demand assessment, performance/project,
     portfolio, hybrid, or other type of assessment.
   - Item types: Refer to the item types described in Section 2 of this guide.

Consider the utility and feasibility of using the assessment.
   - Utility: Districts are strongly advised to select DDMs that provide useful results for
     students and educators. Districts are encouraged to include teachers and administrators
     in the DDM identification process to determine the utility of DDMs.
   - Feasibility: Take stock of the cost, length, accommodations, technology needs, and
     report types of the DDM. (A small tracking-record sketch follows this step.)
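
One convenient way to picture a row in such a tracking tool is as a small data record. The Python sketch below is illustrative only; the field names mirror the general-information items listed above and are not the tool's actual format.

from dataclasses import dataclass, field

@dataclass
class DDMCandidate:
    """One illustrative tracking entry for a potential DDM."""
    grade_subject: str            # grade/subject or course aligned to the DDM
    name: str                     # potential DDM name or title
    source: str                   # e.g., district-developed or commercial
    assessment_type: str          # e.g., on-demand, portfolio, hybrid
    item_types: list = field(default_factory=list)
    utility_notes: str = ""       # usefulness of results for students/educators
    feasibility_notes: str = ""   # cost, length, accommodations, technology

candidate = DDMCandidate(
    grade_subject="Grade 8 Mathematics",      # hypothetical example entry
    name="Algebra Readiness Assessment",
    source="district-developed",
    assessment_type="on-demand",
    item_types=["selected response", "constructed response"],
)
print(candidate.name, "-", candidate.assessment_type)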
Identify and evaluate the components of the assessment.
   - Table of Test Specifications: The Table of Test Specifications describes the alignment
     and rigor of the instrument’s content by matching all items on the assessment with the
     tested standards (and sometimes learning objectives) and the level of rigor (such as
     Bloom’s Revised Taxonomy).
   - Administration protocol: This protocol includes proctoring directions, security
     provisions, and how to provide student accommodations.
   - Instrument: The instrument refers to the test itself.
   - Scoring method: The scoring method refers to the availability of a scoring key for
     selected-response items and scoring papers or rubrics for the scoring of
     constructed-response items. It could take the form of a scoring guide.
   - Technical documentation: The assessment may be accompanied by additional
     documentation. This documentation may include a technical manual, which describes
     the reliability and validity evidence associated with the assessment, along with
     instrument development procedures. Well-known commercial assessments will
     frequently be accompanied by technical documentation.
Evaluate the level of alignment to the curriculum and to the intended level of rigor.
   - Alignment to curriculum: Indicate the procedure(s) used to establish the alignment
     between the district curriculum and the DDM.
   - Alignment to intended level of rigor: Indicate the procedure(s) used to establish the
     intended degree of rigor on the assessment. Indicate the use of a taxonomy, such as
     Bloom’s Revised Taxonomy, for establishing rigor.
Gather evidence of and evaluate the technical qualities of the assessment. (A short
computational sketch follows this list.)
   - Reliability: Evidence for reliability is collected as described earlier in Section 2. The
     type of reliability evidence collected should conform to the type of assessment. For
     example, an internal consistency reliability coefficient is typically reported for
     on-demand tests.
   - Validity: Evidence for validity is collected as described in Section 2 of this guide.
     Districts are advised to collect three types of validity evidence: content; relationships
     to other measures, variables, or outcomes; and consequential validity evidence.
     Districts are advised to pay particular attention to content and consequential validity
     evidence.
   - Nonbias: Evidence of nonbias indicates that students who belong to particular gender,
     demographic, and cultural groups are not advantaged or disadvantaged by the
     instrument items included on the assessment. Nonbias evidence is collected by
     reviewing items singly and collectively for possible bias, including under- and
     overrepresentation of student demographic groups. On commercial tests, quantitative
     measures of test bias may also be reported.
   - Item quality: These properties provide evidence that each instrument item is
     performing well (i.e., has an appropriate level of item difficulty and discrimination)
     and that the instrument items collectively show a range of difficulty from easy to hard,
     so that the instrument shows no floor or ceiling effects as either a pretest or a posttest.
     Finally, evidence that the instrument contains “instructionally sensitive” items is also
     preferred. For additional information about item difficulty, item discrimination, and
     floor and ceiling effects, see Appendix B.
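
As a rough illustration of two statistics named above, the Python sketch below computes item difficulty (proportion of correct responses) and an internal-consistency reliability coefficient (Cronbach's alpha) from a small, invented matrix of 0/1 scored responses.

# Invented 0/1 scored responses: rows = students, columns = items.
scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]

def difficulty(scores, item):
    """Item difficulty: proportion of students answering the item correctly."""
    return sum(row[item] for row in scores) / len(scores)

def cronbach_alpha(scores):
    """Internal-consistency reliability (Cronbach's alpha)."""
    k = len(scores[0])
    def var(xs):                      # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

print("difficulty of item 1:", round(difficulty(scores, 0), 2))  # 0.67
print("Cronbach's alpha:", round(cronbach_alpha(scores), 2))     # ~0.66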

SELF CHECK - 2

Describe the basic process of gathering information about an assessment’s purpose and
quality and conducting an internal review of the assessment.

MODEL ANSWER 2

1. Gather documentation regarding the quality of the DDM (reviews, critiques).
For each promising assessment considered for use as a DDM, districts should gather
documentation regarding the quality of the instrument.

2. Use the documentation to evaluate the assessment.
The Assessment Quality Checklist and Tracking Tool provides districts with a place to collect
evidence and information on potential DDM characteristics and quality, beginning with general
information such as the grade/subject or course aligned to the DDM.

INFORMATION SHEET: THREE

LO3 Record information

Definitions

• RECORD: information captured in reproducible form, required for conducting any transaction
or activity.

• RECORDS MANAGEMENT: a logical and practical approach to the creation, maintenance,
use, and disposition of records.

• LIFE CYCLE CONCEPT: records pass through three stages, from creation and active use to
final disposition.

Effective Records Management

Why an integrated approach to RM?

• Lack of a uniform classification and filing system

• Lack of systematic and orderly transfer of inactive records

• Lack of standardization and control in the creation of forms and directives

• Loss or misfiling of records

• Lack of storage space and filing equipment

A good records system:

• contains complete and comprehensive files, thereby enabling effective decision making

• provides integrity and continuity regardless of changes in personnel

• facilitates protection and preservation of records

• provides low-cost and efficient maintenance of records

• reduces the possibility of misfiling and duplication, which means less time spent searching
for files and documents.

Goals of an RM program

• Create only the records necessary for efficient and successful operation of the
office/institution.

• Produce the records when needed.

• Retain/preserve only the records needed for continued operation of the office/institution,
and dispose of what is not needed.
SELF CHECK - 3

1. What are the definitions of RECORD, RECORDS MANAGEMENT, and LIFE CYCLE CONCEPT?

2. What are the criteria of a good records system?

MODEL ANSWER 3

1. Definitions:

   1. RECORD: information captured in reproducible form, required for conducting any
      transaction or activity.
   2. RECORDS MANAGEMENT: a logical and practical approach to the creation,
      maintenance, use, and disposition of records.
   3. LIFE CYCLE CONCEPT: records pass through three stages, from creation and active use
      to final disposition.

2. Criteria of a good records system:

   1. contains complete and comprehensive files, thereby enabling effective decision making
   2. provides integrity and continuity regardless of changes in personnel
   3. facilitates protection and preservation of records
   4. provides low-cost and efficient maintenance of records
   5. reduces the possibility of misfiling and duplication, which means less time spent
      searching for files and documents.


INFORMATION SHEET: FOUR

LO4 Study causes of quality deviations


MEASUREMENT, ANALYSIS AND IMPROVEMENT

Quality staff will review all data gathered through formal and informal audits or any of the
monitoring procedures discussed above to assess performance against plans, objectives, and
other defined Project goals and Design-Build Program goals. Through this and other analysis,
the Quality Manager, CQM, and DQAM will seek to determine the root cause of the
nonconformity.

Analysis for individual assessments will be defined prior to gathering the data.

Monitoring and Measurement

• Executive Management Committee meetings, attended by MnDOT, where feedback will be
solicited.

• Review of correspondence, meetings, and memos/letters from MnDOT.

All personnel are responsible for reporting all incidents of MnDOT dissatisfaction to the Quality
Manager. The Quality Manager will determine the cause of the dissatisfaction and work with
the Project Manager to prevent a recurrence.

INFORMATION SHEET: FIVE

LO5 Complete documentation

A quality record is a completed document or record furnishing evidence of successful
implementation of any given aspect of the Quality Management Program. These records may
exist in any number of formats or media, such as written reports, electronic media, drawings,
or charts.

To ensure that assessments are used in an appropriate and standardized way, they are typically
accompanied by documentation. This documentation ensures transparency in the development
and administration of assessments and can include the following documents:

1. Technical manual. An assessment’s technical manual is a comprehensive technical
document. It should identify the purpose of the assessment as well as when, how, and
to whom it can be appropriately administered. The document should explain how the
instrument content was identified and developed, specific requirements regarding the
administration of the instrument, the process for scoring the instrument, the types of
scores reported by the instrument, and information regarding the proper interpretation
of scores. It should include information a potential user may need for determining the
assessment’s psychometric quality, such as reliability, validity, and bias analyses. The
technical manual may also include other policies describing the appropriate use of the
instrument, such as the training requirements for instrument administration and the
interval of time before which the instrument must be reevaluated. Technical manuals
are typically developed for commercial assessments; districts wishing to use commercial
assessments for DDMs should consult the technical manual to review the instrument
quality information reported there.
2. Administration manual. This document details the instrument administration
procedures. When followed closely, it standardizes the administration procedures,
enhances instrument security, supports the equitable treatment of examinees, and
minimizes errors in instrument administration and scoring. The instrument
administration manual typically includes a list of the examinee resources (e.g.,
calculators or dictionaries) that are required and prohibited during administration, a
description of the appropriate conditions for administering the instrument, a script that
the administrator reads to students, details regarding what can and cannot be said or
done by students and by those administering or proctoring the assessment, instructions
for timed tests, insights into how to deal with emergencies that may arise during a test,
and a list of the examinee accommodations that are permitted (e.g., offering extra time
or administering the exam in a quiet setting outside the classroom). It should explain
clearly the procedures for scoring the instrument and procedures for training scorers to
score the instrument items reliably. Finally, the instrument administration manual
should include instructions to ensure instrument security, including procedures for
accounting for instrument materials (e.g., how to check out and return instrument
materials) and other administrative details (e.g., how to process or score answer sheets).

Finalizing a Decision Regarding DDMs


After conducting an evaluation of the assessment and recording results using the Assessment
Quality Checklist and Tracking Tool, district teams will have enough information about an
assessment to begin to make a determination about whether it can be used as a DDM. Often,
review teams will find that they have some but not all of the information needed to make a
decision. If more information is needed, the district can do one of two things:

1. Collect more information before making a final decision.

2. Modify the assessment to improve its technical qualities before making a final decision.

Districts may encounter a scenario in which an assessment would have sufficient quality for use
as a DDM if it were modified to improve its technical qualities. If the assessment is locally
developed or is in the public domain, then the district may decide to pursue the assessment for
use with modifications. During the process of making modifications, districts can update
pertinent information in the Assessment Quality Tracking Tool. (The updated information will
be specific to the changes in the modified assessment.) After the assessment is revised, districts
can pilot it and update the remaining information in the Tracking Tool to determine if the
revisions to the assessment had the intended effect of improving the quality of the assessment.

Monitoring the Assessment’s Use

The process of maintaining a district’s set of DDMs will be ongoing. After a district selects and
implements a DDM, the district should monitor the quality of the assessment to determine if it
is living up to its promise of being a high-quality assessment. Districts may wish to continue to
monitor the following assessment characteristics:

• Continued alignment to the district’s curriculum and intended degree of rigor

• Instrument security (i.e., procedures intended to ensure that assessment results are not
tainted by improper instrument administration procedures or by overfamiliarity with
the exam or the exam contents)

• Reliability

• Validity associated with its use as a DDM, including useful results and good score
reporting

• Feasible administration and scoring procedures

SELF CHECK - 5

1. What documentation typically accompanies assessments to ensure that they are used in an
appropriate and standardized way?
MODEL ANSWER 5

1. Technical manual. An assessment’s technical manual is a comprehensive technical
document. It should identify the purpose of the assessment as well as when, how, and to
whom it can be appropriately administered. The document should explain how the
instrument content was identified and developed, specific requirements regarding the
administration of the instrument, the process for scoring the instrument, the types of
scores reported by the instrument, and information regarding the proper interpretation of
scores. It should include information a potential user may need for determining the
assessment’s psychometric quality, such as reliability, validity, and bias analyses. The
technical manual may also include other policies describing the appropriate use of the
instrument, such as the training requirements for instrument administration and the
interval of time before which the instrument must be reevaluated.

2. Administration manual. This document details the instrument administration procedures.


When followed closely, it standardizes the administration procedures, enhances instrument
security, supports the equitable treatment of examinees, and minimizes errors in instrument
administration and scoring. The instrument administration manual typically includes a list of the
examinee resources (e.g., calculators or dictionaries) that are required and prohibited during
administration, a description of the appropriate conditions for administering the instrument, a
script that the administrator reads to students, details regarding what can and cannot be said
or done by students and by those administering or proctoring the assessment, instructions for
timed tests, insights into how to deal with emergencies that may arise during a test, and a list
of the examinee accommodations that are permitted (e.g., offering extra time or administering
the exam in a quiet setting outside the classroom). It should explain clearly the procedures for
scoring the instrument and procedures for training scorers to score the instrument items
reliably.
LAP Test 1

Name:_________________________________________Date: _________________
Time Started: _______________________ Time Finished: ___________________

Instructions: You are required to perform the following:

1. Write the procedure for the project and select the materials.
2. Show the techniques of preparation.
3. Apply safety rules.
4. Prepare a plan to inspect the column formwork.

Reference Books

Motivation Building Construction
Civil Engineering
Building Construction
Modern Carpentry & Joinery
