Academic Documents
Professional Documents
Culture Documents
Gary Jechorek
EIDT-6130-2: Program Evaluation
2015 Summer Semester, 05/04-08/23 (PT)
[Figure: evaluation outcomes shown separately for Seminar, Lab, and Clinical sections]
The present evaluation is most closely aligned to the purposes of the participant-oriented and
expertise-oriented evaluation approaches. Proponents of participant-oriented evaluation view
participants as central to the evaluation. Using this approach, evaluators work to portray the
multiple needs, values, and perspectives of the program stakeholders in order to make judgments
about the value or worth of the program (Fitzpatrick, Sanders, & Worthen, 2004).
ISSUES TO BE ADDRESSED
The issues identified to be surveyed for accountability and quality were:
Student characteristics and practices
Faculty characteristics and practices
Curriculum design and technology
Organizational supports
METHODOLOGY
References
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines.
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Upper Saddle River, NJ: Pearson.
Martinez, R., Shijuan, L., Watson, W., & Bichelmeyer, B. (2006). Evaluation of a web-based master's degree program. Quarterly Review of Distance Education, 7(3), 267-283. Retrieved from https://ezproxy.lib.uwm.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&AuthType=ip,uid&db=tfh&AN=22941929&site=ehost-live&scope=site
The iSchool at Illinois: Graduate School of Library and Information Science. (n.d.). Instructional Technology & Design: Who we are. Retrieved June 30, 2015, from https://www.lis.illinois.edu/academics/itd
Week 2
Concept Map: Analyze Contextual
Gary Jechorek
Walden University
Program Evaluation
(EDUC - 6130 - 2)
Secondary Stakeholders
Program Directors
Program Coordinators
Course Coordinators
Academic Affairs Dean
College Dean
All stakeholders in the diagram are Faculty and Adjunct Faculty looking for improved course objective outcomes.
The evaluation design content and process are created by Faculty and Adjunct Faculty in an open-door committee.
Faculty and Adjunct Faculty receive their individual teaching and course evaluation results.
The Evaluation Design Committee does not receive results directly, but its members hold other supervisory positions.
Course Coordinators receive results only for their supervised course level (a combination of two or three courses).
Program Coordinators (4) receive results by the semester they supervise: 1st, 2nd, 3rd, or 4th Semester of the Program.
Program Directors receive results by program: Undergraduate, Graduate, and Doctorate.
The Academic Affairs Dean receives all results by semester.
The Academic Affairs Dean communicates anything notable to the College Dean.
[Concept map: Nursing Faculty and Adjunct Faculty (primary stakeholders) create the evaluation design through an open-door Evaluation Design Committee; the evaluation process runs indirectly through that committee, and results are distributed to the secondary stakeholders: Course Coordinators, Program Coordinators, Program Directors, Academic Affairs Dean, and College Dean.]
References
da Cunha Miguel. (2013). Those who can teach minority nursing (March 30, 2013 ed.).
University of Wisconsin-Milwaukee Governance Committee. (2013). University of Wisconsin-Milwaukee Faculty Document No. 2934, November 21, 2013. Retrieved July 16, 2015, from http://www4.uwm.edu/secu/docs/faculty/2934_SharedGov_Statement.pdf
Gary Jechorek
Walden University
Program Evaluation
(EDUC - 6130 - 2)
August 1, 2015
[Table: advantages and disadvantages of the CIPP Evaluation Model]
[Table: advantages and disadvantages of the Utilization-Focused Model]
Secondary Stakeholders
Program Directors
Program Coordinators
Course Coordinators
All stakeholders are Faculty and Adjunct Faculty looking for course objective outcomes.
The evaluation design content and process are created by Faculty and Adjunct Faculty in an open-door committee.
Faculty and Adjunct Faculty receive their individual teaching and course evaluation results.
The Evaluation Design Committee does not receive results directly, but its members hold other supervisory positions.
Course Coordinators receive results only for their supervised course level (a combination of two or three courses).
Program Coordinators (4) receive results by the semester they supervise: 1st, 2nd, 3rd, or 4th Semester of the Program.
Program Directors receive results by program: Undergraduate, Graduate, and Doctorate.
The Academic Affairs Dean receives all results by semester.
The Academic Affairs Dean communicates anything notable to the College Dean.
The standards reflected in the choice of formative evaluation review questions are guided by the descriptive case study design, which is particularly useful when the purpose of the evaluation is to describe a case in depth and to explore the hows and whys of a program (Fitzpatrick et al., 2011, p. 390).
The role of the stakeholders is to clarify the evaluation process so that teaching and course objectives are adequately represented.
Question for Formative Evaluation:
Are our end-of-semester evaluation processes for *FCPI and *FCPII level evaluations providing an adequate representation of the teaching and course objectives?
*Foundation of Clinical Practice I & II Courses (Seminar, Lab, & Clinical Site types)
Using the models and criteria questions should assist administrators in making good decisions through "the process of delineating, obtaining, reporting, and applying descriptive and judgmental information about some object's merit, worth, probity, and significance to guide decision making, support accountability, disseminate effective practices, and increase understanding of the involved phenomena" (Fitzpatrick et al., 2011, p. 173).
The formative evaluation will determine whether new process methods should be implemented in the current evaluation program.
Question: Are the results in FCPI and FCPII giving us an accurate assessment of Seminar, Lab, and Clinical site teaching and course evaluation outcome results?
Answer: Yes. The results are adequate for our needs in assessing the teaching and course results; no further step needs to be taken. End process.
References
da Cunha Miguel. (2013). Those who can teach minority nursing (March 30, 2013 ed.).
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Upper Saddle River, NJ: Pearson.
University of Wisconsin-Milwaukee College of Nursing Undergraduate Program Committee. (2008). Course evaluation policy, Faculty Document # (01-12) 117A. Policy and procedure. Milwaukee: University of Wisconsin-Milwaukee College of Nursing.
University of Wisconsin-Milwaukee Governance Committee. (2013). University of Wisconsin-Milwaukee Faculty Document No. 2934, November 21, 2013. Retrieved July 16, 2015, from http://www4.uwm.edu/secu/docs/faculty/2934_SharedGov_Statement.pdf
Week 4
Application: Develop a Logic Model
Gary Jechorek
Walden University
Program Evaluation
(EDUC - 6130 - 2)
Secondary Stakeholders
Program Directors
Program Coordinators
Course Coordinators
Academic Affairs Dean
College Dean
All stakeholders in the diagram are Faculty and Adjunct Faculty looking for improved course objective outcomes.
The evaluation design content and process are created by Faculty and Adjunct Faculty in an open-door committee.
Faculty and Adjunct Faculty receive their individual teaching and course evaluation results.
The Evaluation Design Committee does not receive results directly, but its members hold other supervisory positions.
Course Coordinators receive results only for their supervised course level (a combination of two or three courses).
Program Coordinators (4) receive results by the semester they supervise: 1st, 2nd, 3rd, or 4th Semester of the Program.
Program Directors receive results by program: Undergraduate, Graduate, and Doctorate.
The Academic Affairs Dean receives all results by semester.
The Academic Affairs Dean communicates anything notable to the College Dean.
Logic model steps:
Question: What is missing in the evaluation outcomes?
Finding: The results are incomplete; a process correction is needed to identify Seminar, Lab, and Clinical site in teaching and course evaluation outcomes. Each section needs to be identified as a unique evaluation location.
1. The Evaluation Process Planning Committee reviews the current evaluation process and asks what collection methods could be developed as improvements.
2. Design collection forms that separate Seminar, Lab, and Clinical Site.
3. Determine whether changes are needed in reporting methods.
4. Determine whether changes are needed in statistical calculation methods.
5. Determine whether alternative statistical calculation methods are needed as a result of the new data collection methods.
6. Determine whether new reporting methods are needed as a result of the new data collection forms.
7. Develop and implement the new evaluation procedure improvements.
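Once the collection forms tag each response with its section type, the per-section breakdown the logic model calls for becomes a simple grouping step. A minimal sketch in Python (the section names come from this document; the record format and scores are hypothetical sample data, not the program's actual survey export):

```python
# Group evaluation scores by section type (Seminar, Lab, Clinical)
# so each location is summarized as a unique evaluation unit.
from collections import defaultdict

records = [  # (section_type, score) - made-up sample data
    ("Seminar", 4.2), ("Seminar", 3.8),
    ("Lab", 3.1), ("Lab", 3.5),
    ("Clinical", 4.6), ("Clinical", 4.4),
]

by_type = defaultdict(list)
for section, score in records:
    by_type[section].append(score)

for section, scores in by_type.items():
    print(f"{section}: n={len(scores)}, mean={sum(scores)/len(scores):.2f}")
```

Reporting n alongside each mean matters here, since Seminar, Lab, and Clinical sections would have different response counts.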
References
da Cunha Miguel. (2013). Those who can teach minority nursing (March 30, 2013 ed.).
University of Wisconsin-Milwaukee Governance Committee. (2013). University of Wisconsin-Milwaukee Faculty Document No. 2934, November 21, 2013. Retrieved July 16, 2015, from http://www4.uwm.edu/secu/docs/faculty/2934_SharedGov_Statement.pdf
Gary Jechorek
Walden University
Program Evaluation
(EDUC - 6130 - 2)
August 9, 2015
The objective of this assignment was to practice interviewing by interviewing two current Master of Science in Instructional Design and Technology (IDT) students to find out what they consider the most appropriate collection strategies for evaluating the program they are completing.
The choice of interview focus:
Have MS IDT students successfully acquired the knowledge and skills taught in the IDT program?
Both interviews produced agreement that Walden's stated course objectives should be evaluated to identify whether students had been prepared for real-world tasks in instructional design.
The objectives of the evaluation would be a review and measurement of the 13 courses to understand student perceptions of program quality, to improve both student satisfaction and retention to degree completion, and to plan for the future, measured against these stated program objectives:
You'll learn to apply theory, research, creativity, and problem-solving skills to a variety of technology applications in order to improve learning. You will also develop the skills to assess, create, and manage training materials. The combination of these skills will help you to support technology-supported training in educational institutions and corporate training classrooms. Through your coursework, you will gain the experience needed to efficiently and effectively use technology and multimedia tools. (Walden University, 2015)
Data Collection Method
A pretest-posttest design will be used. The pretest was completed at the beginning of the program: a pretest measure of the outcome of interest is obtained prior to the program of learning, followed by a posttest on the same measure after learning occurs, with our cohort serving as the comparison group.
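The pretest-posttest comparison described above can be sketched as follows. This is a hypothetical illustration, not the program's actual analysis: the scores are invented, and the paired t statistic shown is one common way to test whether the mean gain differs from zero.

```python
import statistics
from math import sqrt

def paired_gain_summary(pre, post):
    """Summarize pretest-posttest gains for one cohort.

    pre, post: matched lists of scores for the same students.
    Returns the mean gain and a paired t statistic
    (mean of the per-student differences / its standard error).
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_gain = statistics.mean(diffs)
    se = statistics.stdev(diffs) / sqrt(n)  # sample std dev of differences
    return mean_gain, mean_gain / se

# Hypothetical Likert-style scores (1-5) for six students
pre_scores = [2, 3, 2, 3, 2, 3]
post_scores = [4, 4, 3, 5, 4, 4]
gain, t = paired_gain_summary(pre_scores, post_scores)
print(f"mean gain = {gain:.2f}, paired t = {t:.2f}")
```

With real survey data, the t statistic would be compared against a t distribution with n - 1 degrees of freedom to judge whether the gain is statistically meaningful.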
Strategy
Interview consensus: since this is an online program, an online survey tool would be the most appropriate medium for collection. Both quantitative questions using a Likert scale and qualitative questions would be used for program review. All students in the cohort who are finishing the program would be offered participation; there would be no sampling. Student anonymity and confidentiality would be maintained at all times.
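As a sketch of how the Likert-scale items could be summarized once the online survey is exported, assuming anonymized numeric responses keyed by question (the question IDs and scores here are made up, not actual survey data):

```python
from collections import Counter

# Hypothetical anonymized Likert responses
# (1 = strongly disagree ... 5 = strongly agree), keyed by question id.
responses = {
    "Q1": [4, 5, 3, 4, 5, 4],
    "Q2": [2, 3, 3, 4, 2, 3],
}

for qid, scores in responses.items():
    counts = Counter(scores)                 # frequency of each rating
    mean = sum(scores) / len(scores)
    dist = ", ".join(f"{k}:{counts[k]}" for k in sorted(counts))
    print(f"{qid}: mean={mean:.2f}  distribution=({dist})")
```

Reporting the full rating distribution, not just the mean, preserves information that matters for program review, such as polarized responses that average out to a neutral score.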
The following questions, discussed in the interview process, are suggested for inclusion in the final evaluation of the program, measuring student perceptions of their successful completion of the program.
1. What skills did you learn in this program that you can use?
2. How will you apply the skills you learned in this program?
3. What changes will you make in your situation based on the program?
4. How do today's program goals meet your needs?
5. In what way was this program useful to you?
6. How will this program help you set goals?
7. What goals have you set based on this program?
8. What changes did you make in your operation/situation as a result of courses you attended in the past year?
9. Why did you participate in this program?
10. What are you doing today in your operation that you did not do prior to this educational program? (Specify the program/context.)
11. What result do you expect from using information gained from this program?
12. What problems will be addressed by you being involved in this program?
13. What practices you currently use will be discontinued as a result of this program?
14. What new practice(s) will you implement as a result of this program?
15. In what way has decision-making been made easier by participation in this program?
16. What is the best thing that can happen if you use the information from this program?
17. What immediate steps/actions will you take as a result of this program?
18. What specific assistance would be helpful to you in implementing the new practices presented in this program?
19. What will it take for you to implement the new practices/information provided in this program?
20. What result(s)/impact(s) do you expect from participation in this program?
21. What was the result/impact of your participation?
(Martin, 2003)
Who do they consider to be the stakeholders in this program evaluation, and what would their
interests be?
Interview Consensus Key players:
Students - Meeting the instructional needs of students is the cornerstone of every effective
distance education program, and the test by which all efforts in the field are judged.
Faculty - The success of any distance education effort rests squarely on the shoulders of the faculty; special challenges confront those teaching at a distance.
Facilitators - The instructor often finds it beneficial to rely on a BB or D2L site facilitator to act as a bridge between the students and the instructor. To be effective, a facilitator must understand the students being served and the instructor's expectations.
Support Staff - These individuals are the silent heroes of the distance education enterprise and
ensure that the myriad details required for program success are dealt with effectively. Most
successful distance education programs consolidate support service functions to include student
registration, materials duplication and distribution, textbook ordering, securing of copyright
Gary Jechorek
Walden University
Program Evaluation
(EDUC - 6130 - 2)
Stakeholder: Evaluation Design Committee (Faculty, Adjunct Faculty, Statistician / Academic Support Staff)
Reporting Strategy: Interviews and oral/written reports producing the program's evaluation methods.
Implications / Stakeholder Involvement: "Participation of administrators, faculty, staff in decision-making makes all of us take responsibility for the success of our colleges and universities and thereby enhances everyone's commitment to excellence, efficiency, and productivity. This participation promotes developing and maintaining high academic quality" (University of Wisconsin-Milwaukee Governance Committee, 2013).

Stakeholder: Faculty / Adjunct Faculty
Implications / Stakeholder Involvement: Results are reported against course objectives and reviewed for teaching effectiveness; data and comments are used for annual review.

Stakeholder: Program Directors
Implications / Stakeholder Involvement: Results are reported against course objectives and reviewed for teaching effectiveness; data and comments are used for annual review.

Stakeholder: Program Coordinators
Implications / Stakeholder Involvement: Results are reported against course objectives and reviewed for effectiveness in meeting program objectives.
Stakeholder: Administrators: Statistician / Academic Support Staff
Reporting Strategy: Interviews and oral/written reports of the program's evaluation methods and processes.
Implications / Stakeholder Involvement - possible implications:
Confidentiality of stakeholders in the data collection process.
Incorrect forms creation.
Forms distribution problems.
Incomplete instructions for forms completion.
Incomplete forms completion by stakeholders.
Data collection not completed at all.
Missing forms packets not delivered to the return area.
Scanning problems.
Reading qualitative comments for typing.
Statistical calculation problems.
Improper collation of results, comments, and statistics.
Improper distribution of results.
Stakeholder: Academic Affairs Dean
Reporting Strategy: Written reporting of results for all programs.
Implications / Stakeholder Involvement: Results are reported to the Nursing Program(s), comparing results to objectives and to meeting Vision Statement and Mission Statement objectives.

Stakeholder: College Dean
Implications / Stakeholder Involvement: Results are reported to the Nursing Program(s), comparing results to objectives and to meeting Vision Statement and Mission Statement objectives.

Values, Standards, and Criteria: Accountability, Collaboration, Creativity, Diversity, Excellence, Integrity, Human Dignity, Social Justice.
References
Billings, D. M., & Halstead, J. A. (2013). Teaching in nursing: A guide for faculty. Elsevier Health Sciences.
University of Michigan. (2014). Guidelines for evaluating teaching. Retrieved August 13, 2015, from http://www.crlt.umich.edu/tstrategies/guidelines
University of Wisconsin-Milwaukee College of Nursing. (2015). Fast facts. Retrieved August 14, 2015, from http://www4.uwm.edu/nursing/about/fast-facts.cfm
University of Wisconsin-Milwaukee Governance Committee. (2013). University of Wisconsin-Milwaukee Faculty Document No. 2934, November 21, 2013. Retrieved July 16, 2015, from http://www4.uwm.edu/secu/docs/faculty/2934_SharedGov_Statement.pdf