
Running head: EVALUATION PLAN

Evaluation Plan
Brad Boykin, Steven Hill, Meagan Oakley, Dixie Shoemaker
Georgia Southern University


Implementation Evaluation

Data Source(s)                          Question(s)
Key Informant Interview                 Who are the program participants and how were they recruited? (P3)
Professional Learning Feedback Form     What is the quality of initial program activities? (P2)
                                        What is the quality of follow-up and support activities? (P4)
Observation & Rubric                    Were the initial experience and follow-up activities implemented as planned? (P1)
                                        What is the quality of initial program activities? (P2)

Ms. Carla Lawton, Braxton County Schools Director of Professional Learning, and Dr.
Carlos Hedges, the Project Director, will serve as key informants during the implementation evaluation. These individuals will be interviewed to gather information regarding the
recruitment and selection of program participants as well as the implementation of activities.
These interviews will take place prior to the professional learning summer workshop. A few
sample questions are included below:

What process was used to recruit participants for this project?
Upon completion of recruitment, how were participants selected?
How does the process used for recruitment and selection of participants help achieve the goal/objectives for this project?

Upon conclusion of the professional learning summer workshop, participants will complete a professional learning feedback form. This form, which was adapted from the professional learning feedback form used by Forsyth County Schools, is included on page 57c of the GaDOE Professional Learning Resource Guide. The form provides opportunities for participants to give information for three levels of evaluation: participant reaction (What do you value most from this experience? How could this session be improved?), participant learning (What did you learn from this session?), and organizational change and support (What do you now need?) (Georgia State Department of Education, 2006, p. 57D). The form will also be completed by teacher participants upon conclusion of the electronic communication follow-up activities as well as at the mid-implementation meeting.
The evaluation team will observe the summer workshop training sessions and review planning documents such as outlines and agendas. A framework for professional development and a professional development rubric designed by the New York State Education Department will be used to evaluate the alignment of project goals with the planning documents and the delivery of training (New York State Education Department, n.d.). Additional evidence of implementation will be gathered through observation and evaluation of classroom PBL modules. During these observations, evidence of the practices, strategies, and behaviors learned through training will be evaluated, along with the integration of technology, teacher documents, and student work samples.
Summative Evaluation
Q1: To what extent were teachers able to develop PBL modules that were connected to local businesses and industries, were aligned with NSSM, and incorporated appropriate uses of technology?
Q2: To what extent were teachers able to implement and evaluate those modules?
Objective                                     Indicator                                  Data Source(s)
1. Create PBL Modules based on local          Modules draw on content and processes      Lesson Plan Rubric
   businesses and industries (Q1)             from local businesses and industries
2. Create PBL Modules addressing NSSM (Q1)    Module content, tasks, and assessments     Lesson Plan Rubric;
                                              are aligned with appropriate NSSM          Classroom Observation Measure
3. Integrate technology into PBL              PBL modules contain activities that        Observation Protocol for Technology
   experiences (Q1)                           effectively use technology                 Integration in the Classroom (OPTIC)
4. Implement and evaluate PBL Modules (Q2)    Record of implementation;                  Classroom Observation Measure;
                                              Record of self-evaluation                  NY Professional Learning Rubric


To address objectives one and two, a rubric will be created based on the specific types of mathematical and problem-solving skills that were addressed during the summer workshop.
These skills will directly correlate to the local businesses and industries that were visited and
also address the use of NSSM. This rubric will then be used by the University faculty to review
lesson plans and modules created by the teachers in the program. This review will take place
after the workshop but before teachers actually implement the lessons in their classrooms.
Objective two will additionally be addressed in conjunction with objective four by the
use of a classroom observation measure created by Texas A&M University to evaluate the
effectiveness of PBL modules and their correlation to content standards. This rubric measures
specific elements of PBL based on a scale from 1 (not evident) to 5 (to a great extent) (Stearns,
Morgan, Capraro, & Capraro, 2012). University faculty will be thoroughly trained in the use of
this measure prior to its application during the module implementation by teachers. A random
sample of teachers will be observed twice each using this measure.
Objective three will also be measured with a classroom observation measure, the OPTIC.
This measure uses a rubric to gauge how effectively observed classroom activities integrate technology into student learning, on a scale from 1 (no integration) to 5 (high level of integration). The
measure includes indicators such as how students are using technology, their engagement in
technology use, and specific skills embedded in the curriculum (Northwest Regional Educational
Laboratory, 2004).
Objective four will be assessed by sampling teachers' self-evaluation records and reviewing them for evidence of implementation, self-reflection, and use of professional learning, using the New York State Education Department professional development rubric.


To conclude the evaluation process, summative data will be compared to the level of
implementation by means of the professional development rubric. Summative data will be used
to show evidence of implementation and to evaluate whether or not the level of implementation
affected performance in the creation and execution of the PBL modules.
DATA COLLECTION SCHEDULE

Data set                               Date of collection            Instruments already developed?   Data collected by
Key Informant Interview                June 18                       No                               Evaluation Team
Training Observations                  June 25-29                    Yes                              Evaluation Team
  (Professional Development Rubric)
Professional Learning Feedback Form    June 29; July 20; Fall 2015   Yes                              Evaluation Team
Lesson Plan Review/Rubric              July 21-30                    No                               Evaluation Team
Classroom Observations                 Fall 2015                     Yes                              Evaluation Team
  (Professional Development Rubric)
Classroom PBL Observations             Fall 2015                     Yes                              Evaluation Team
Classroom Technology Survey (OPTIC)    Fall 2015                     Yes                              Evaluation Team
Review/survey of teacher               January 2016                  Yes                              Participating teachers and
  self-evaluation records                                                                             Evaluation Team

References
Georgia State Department of Education. (2006). Georgia standards for professional learning
resource guide. Retrieved from
http://archives.doe.k12.ga.us/DMGetDocument.aspx/Updated%20Resource
%20Guide.pdf?
p=6CC6799F8C1371F6F8823DF7FBC8A546C580E685B1489535EF8D826CFF6D501
E&Type=D%20
New York State Education Department. (n.d.). A framework for professional development.
Retrieved from http://www.p12.nysed.gov/ciai/tqpd/documents/PDFrameworkPDF.pdf
Northwest Regional Educational Laboratory. (2004). OPTIC - Observation protocol for
technology integration in the classroom. Retrieved from
http://members.tripod.com/sjbrooks_young/observationrubric.pdf
Stearns, L. M., Morgan, J., Capraro, M. M., & Capraro, R. M. (2012, May-June). A teacher
observation instrument for PBL classroom instruction. Journal of STEM Education:
Innovations and Research, 13(3), 7-16. Retrieved from
http://hub.mspnet.org/media/data/Teacher_Observation_Instrument.pdf?
media_000000008227.pdf
