
LINGOs PMD Pro 1

e-Learning Formative
Evaluation Report
Prepared by
Julaine Fowlin and Wendy Gentry

Prepared For
Dr. Katherine Cennamo
Department of Learning Sciences and Technologies
Virginia Tech

For the Benefit of
Learning in NGOs (LINGOs)

May 3, 2012







Table of Contents
Introduction ..................................................................................................................................... 2
Background ..................................................................................................................................... 2
Purpose ............................................................................................................................................ 3
Guiding Questions .......................................................................................................................... 3
Stakeholders .................................................................................................................................... 3
Methods and Instrumentation ......................................................................................................... 4
Sample............................................................................................................................................. 5
Logistics and Timeline .................................................................................................................... 6
Limitations ...................................................................................................................................... 7
Results ............................................................................................................................................. 8
Discussion and Recommendations ............................................................................................... 12


Table of Tables
Table 1: Summary of Pretest and Posttest Results.......................................................................... 8
Table 2: Breakdown of Posttest Scores by Major Content Chunk ................................................. 8

Table of Figures
Figure 1: Breakdown of Pretest and Posttest Scores by Major Content Chunks, Showing Mean,
Median, Mode, Range and Quartiles .............................................................................. 9
Figure 2: Module Navigational Map............................................................................................. 13

Table of Appendices
Evaluation Instructions .................................................................................................. Appendix A
Introductory Questionnaire ............................................................................................ Appendix B
Knowledge Questionnaire .............................................................................................. Appendix C
Survey ............................................................................................................................ Appendix D
Alignment of Key Questions, Survey Questions and Data Analysis Methods .............. Appendix E
Evaluation Results Related to Achievement of Learning Objectives .............................Appendix F
Evaluation Results Related to Usability of Interface ..................................................... Appendix G
Evaluation Results Related to Learner Engagement ...................................................... Appendix H
Evaluation Results Related to Overall Module ............................................................... Appendix I




Introduction
This report is a compilation of the results of a formative evaluation of a self-directed online
project management module. The module is part of a PMD Pro 1 online training program, a
base-level project management certification for project managers working in the development
sector. This module was developed by Virginia Tech Instructional Design and Technology
students for delivery to their client, Learning in NGOs (LINGOs). This report outlines the
background of the evaluation, including the purpose, key questions and the intended users of the
evaluation findings. Additionally, the methods, instrumentation, and sample are discussed.
Lastly, the limitations and findings are discussed and recommendations are provided.
Background
Founded in 2005, LINGOs is a non-governmental organization (NGO) that provides learning
resources to help over 80 international member organizations build staff capacity. Members of
LINGOs serve a wide range of international development and conservation efforts around the
world. Several years ago, LINGOs partnered with an organization to develop a project
management certification (PMD Pro 1 and 2) that was geared towards the unique context of the
development sector. The syllabus for the PMD Pro 1 certification was recently revised and there
was a need for instruction to prepare students for the new version of the examination.
From August 2011 through May 2012, LINGOs partnered with Virginia Tech's Instructional
Design and Technology Department via Dr. Cennamo. Dr. Cennamo teaches a course, now
titled "EDCI 5594 ID Project Development: Professional Designer," and the LINGOs project was
used to implement a "learning by doing" teaching model. Virginia Tech was assigned the task
of developing five self-directed online modules, each reflecting a phase of the project
management cycle.
Given that the modules were to be completed over a staggered time frame, the evaluation
focused on a single module titled "Module 2: Project Identification and Design." Please note that
while this is the second module in the series, knowledge obtained in Module 1 is not required to
successfully master the skills in Module 2.
In addition to using the evaluation results to make revisions to the online module prior to release
to member organizations, the instrumentation provides additional value to key stakeholders. Due
to time and resource constraints, LINGOs management has not performed formative or
summative evaluations on the prior release of the PMD Pro 1 courses. The survey
instrumentation and processes will be made available to the organization for future efforts.


Purpose
The purpose of the formative evaluation is to identify the strengths and weaknesses of a
self-directed online project management module. The evaluation collected data to determine: (a) the
extent to which the learning objectives were achieved; (b) the usability of the learning interface;
and (c) the level of the participants' engagement in completing the instruction. The results
provided in this report will be used to modify the module prior to release to LINGOs, its
member organizations and learners.
Guiding Questions
The evaluation plan is designed to address the following questions:
- To what extent does the design of the instruction facilitate achievement of the learning objectives?
- To what extent is the interface design usable?
- To what extent are learners engaged during the instruction?
These three questions serve as a framework in which the evaluation results are presented within
this report.
Stakeholders
The client and primary recipient of the results of this formative evaluation is Dr. Cennamo,
professor at Virginia Tech. She is responsible for determining which evaluation
recommendations will be incorporated, and for overseeing the implementation of any improvement
efforts. Additional primary stakeholders include students enrolled in Dr. Cennamo's EDCI 5594
ID Project Development: Professional Designer course, who designed the modules and will be
utilizing the results to support refinement efforts. Secondary stakeholders include Mike Culligan,
the Director of Content and Impact at LINGOs, who is responsible for ensuring that the
instructional modules meet the organization's needs and expectations, and LINGOs Board
Members, who desire to provide quality training resources to member organizations.




Methods and Instrumentation
The evaluation instructions provided to each participant via email are included in
Appendix A. The evaluation required participants to complete a four-step process, as summarized
below:
1. Introductory Questionnaire (Appendix B) - This pretest allowed data to be collected
about participants' knowledge of project management concepts prior to completing the
instruction.
2. Completion of the Online Module.
3. Knowledge Questionnaire (Appendix C) - This posttest allowed data to be collected
about participants' knowledge of project management concepts after completing the instruction.
4. Survey (Appendix D) - This survey was made up of closed and open-ended questions
related to the evaluation's key questions and was used to collect data about the
participants' experience in reviewing the module.






All surveys were provided to participants online via SurveyMonkey™. In order to discourage
guessing on the Introductory and Knowledge Questionnaires, an "I do not know" option was
included as an answer option for each question. The answers were presented in random order for
each participant, except for the "I do not know" option, which was always presented as the final
option. The questions were also presented in random order.
The evaluation instructions and instrumentation were pilot tested with four participants who were
representative of the target population. The collected data were analyzed using both qualitative
and quantitative methods. Open-ended questions were coded to reveal themes, and
Likert scale results were analyzed via descriptive statistics. Participant performance on the
Introductory and Knowledge Questionnaires was compared to reveal the extent to which learning
objectives were achieved. A table aligning the survey questions with the evaluation's key
questions, along with the data analysis methods employed, is provided in
Appendix E.
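As a concrete illustration of the quantitative analysis, the following Python sketch codes Likert responses numerically and computes the descriptive statistics used throughout this report (the responses shown are hypothetical, not actual participant data):

```python
from statistics import mean, median, mode

# Numeric coding of the Likert scale: SA-4, A-3, D-2, SD-1
LIKERT = {"Strongly Agree": 4, "Agree": 3, "Disagree": 2, "Strongly Disagree": 1}

# Hypothetical responses from seven participants for one survey item
responses = ["Agree", "Strongly Agree", "Agree", "Disagree",
             "Agree", "Strongly Agree", "Agree"]

scores = [LIKERT[r] for r in responses]
item_mean = round(mean(scores), 1)  # mean of the coded responses
item_median = median(scores)        # middle coded value
item_mode = mode(scores)            # most frequent coded value
```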

Sample
Based on discussions with LINGOs management, we examined the characteristics of learners
who complete training in preparation for the PMD Pro 1 certification examination. The target
population for the self-directed online training modules is diverse, and is described as having the
following characteristics:
- Gender and age: men and women, typically 20 to 55 years old
- Location: located across the globe in both developed and developing countries
- NGO Role: serve as a volunteer or employee for an NGO, or have the desire to do so
- Project Management Training and Experience: varies greatly, from those with little
project management training and experience to those with extensive project management
training and hands-on experience
- Computer and Software Experience: at minimum, have the ability to perform basic
functions on a computer, including getting on the Internet, communicating via email,
printing, navigating content, and using basic office productivity software
- Language: variety of native languages and multilingual abilities
Currently, over 700 individuals per year utilize the asynchronous training materials to
prepare for the PMD Pro 1 certification exam. After the revised modules are released, LINGOs
will begin marketing the instruction to its member organizations. As such, over 70,000
individuals signed up as participants within the LINGOs system will have immediate access
to the modules. LINGOs then intends to release the training modules into the public domain,
allowing access to any individual with an internet connection.
A combination of purposive and convenience sampling strategies was used to create a sample
of nine participants. Two participants were omitted from the final analysis because they
failed to complete all the required evaluation steps. The final seven participants included three
adult females and four adult males. Two participants were from developed countries and five
were from developing countries. All participants were proficient in English, had basic
computer competency and access to a computer connected to the internet, had little or no formal
project management training and/or certification, and had not completed the LINGOs PMD Pro 1
training.



Logistics and Timeline
The evaluators were responsible for completing the following tasks:
- Identify evaluation participants.
- Obtain assessment questions for the module from the instructional designer.
- Develop the three survey instruments for delivery in an online environment.
- Develop the instruction sheet to be provided to evaluation participants.
- Obtain evaluation and instrumentation approval from the Institutional Review Board.
- Perform the instrumentation pilot study and modify the instrumentation based on the
findings from that effort.
- Conduct the evaluation.
- Analyze the data collected through the evaluation.
- Provide documentation of results to Dr. Cennamo.
Significant project milestones are provided below:

Milestone                                                                  Target Completion Date
Assess client needs and identify evaluation scope.                         February 2012
Prepare Evaluation Plan and submit to client for approval.                 March 2012
Develop and pilot test instrumentation.                                    March 2012
Obtain evaluation and instrumentation approval from the
Institutional Review Board.                                                March 2012
Receive module from Dr. Cennamo.                                           April 2012
Perform Formative Evaluation.                                              April 2012
Complete data analysis and submit final Evaluation Report.                 April 2012



Limitations
The instructional design group was given very explicit instructions by LINGOs as to what
content to include and was only allowed to use questions specified by LINGOs for formal
assessment. However, the group had freedom to use various instructional design strategies and
supporting learning activities. Therefore, the module contains supporting content that
is not formally assessed at the end of the module. This presents a challenge when evaluating
whether learning objectives were achieved, since there is not a formal assessment for all the content
covered. At this stage it is not feasible for the instructional designers to create additional
assessment questions for the evaluation. A limitation of this evaluation is therefore that we are only able to
assess the extent to which the design of the instruction facilitates the achievement of the learning
outcomes that are tied to the assessments LINGOs provided. In addition, since the assessment was
not evaluated for validity, we are conducting our evaluation under the assumption that the
assessment is sound. In the future we recommend that the assessment be validated; if any
problems arise, the evaluators should show the client the benefits of making changes
and the consequences of not doing so.
Sample size was a significant limitation of the evaluation. We were unable to get sample learners
from LINGOs due to time and other constraints, so the evaluators recruited learners on their
own. The final sample size, as mentioned before, was seven. This is a very small number;
given that the module will be used by a large and diverse population, a sample size of at least 20
would have been preferable. The small sample size made it difficult to find themes,
especially in the open-ended responses. It also raised the question of whether the results would have
been the same in some areas with a larger sample; for example, would the outliers in our data
still be outliers? In the future we recommend over-recruiting and negotiating further with the client
to secure at least 60% of the desired sample size.
During the data analysis, the evaluators wondered about the relationships among some of
the variables; for example, whether there was a relationship between the participants' preferred learning
styles and some of their views about the module. Clarification or further discussion with
participants could have enriched the findings and recommendations. However, the IRB protocol did not
include post-survey discussions with participants, so this could not be done. In the future, the
instrument and IRB protocol should include an option for participants to check a box indicating that they
agree to be contacted after the survey for further discussion or clarification.




Results
The formative evaluation was designed so that the key questions, data collection and data
analysis methods were aligned, as illustrated in Appendix E. Therefore, to make the results more
meaningful, these key questions serve as a framework for the discussion that follows. The results are
reported using percentages, means, modes and medians. The inclusion of range and percentiles in
addition to several measures of central tendency improves the validity of the results and allows us
to make more informed recommendations.
Extent that Design of Instruction Facilitated Achievement of Learning Objectives
As highlighted in Appendix E, an analysis of survey question responses and a comparison of
pretest and posttest scores were used to evaluate the extent that the design of the instruction
facilitated the achievement of the learning objectives.
Pretest and Posttest Results
Table 1 below summarizes the pretest and posttest scores. Results indicate
an increase in scores, from a mean pretest score of 15.3% to a mean posttest score of 83.7%. The
mode also contributes to the evaluation findings and further highlights the dramatic increase in
scores, moving from a pretest mode of 0% to a posttest mode of 92.9%.




Table 1: Summary of Pretest and Posttest Results

The posttest scores were further analyzed at a micro level (see Table 2) to assess whether there were
any differences in performance across the three major content chunks within the module. This
analysis revealed that Data Collection had the highest mean of 90.5% while Data Analysis had
the lowest mean of 77.5%. Data Analysis also had the lowest mode and median: the
other two areas had modes of 100% while Data Analysis had a mode of 85.7%.

Table 2: Breakdown of Posttest Scores by Major Content Chunk

Content Chunk     Mean   Median   Mode
Data Collection   90.5   100      100
Data Analysis     77.5   85.7     85.7
Project Logic     89.3   100      100

Figure 1 provides a comprehensive comparison of all the data collected on the pretest and
posttest. It illustrates the mean, median and mode as discussed, as well as the range (whiskers on
bars) and a representation of quartiles (i.e., 50% of all scores are represented by the blue bar).
Again, please note the lowest posttest mean score of 77.5%, which participants achieved on the Data
Analysis assessment items (circled in orange).
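The range and quartile values of the kind displayed in Figure 1 can be reproduced with Python's statistics module; the scores below are illustrative placeholders rather than the actual participant data:

```python
from statistics import quantiles

# Illustrative posttest percentage scores for seven participants
scores = [71.4, 78.6, 85.7, 85.7, 92.9, 92.9, 100.0]

# Quartile cut points; q2 is the median
q1, q2, q3 = quantiles(scores, n=4, method="inclusive")

# The blue bar in Figure 1 would span q1 to q3; the whiskers span the full range
score_range = max(scores) - min(scores)
```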






















Figure 1: Breakdown of Pretest and Posttest Scores by Major Content
Chunks, Showing Mean, Median, Mode, Range and Quartiles

Survey Question Results for Achievement of Objectives
Detailed results of the survey questions related to the achievement of objectives is provided in
Appendix F. The survey (Appendix D) used a Likert scale of Strongly Agree (SA), Agree,(A)
Disagree (D), Strongly Disagree (SD). In analysing the results we assigned numbers to the scale
as follows: SA-4, A-3, D-2, SD-1 and calculated the mean, median and mode. Overall, results
indicate that participants thought that it was made clear what they were expected to learn, survey
item has mean of 3.3 and a mode of 3.0.
A discrepancy was observed between posttest scores and survey results for the content chunks.
As noted above, based on the posttest scores the Data Analysis chunk had the lowest mean.
However, when participants were asked to rate their confidence in their knowledge of the various
content chunk areas, Data Analysis had the highest mean of 3.3 and Project Logic had the lowest
mean of 2.7. Nonetheless, all areas had a mode of 3.0.
In terms of presentation of content, the following areas stood out, as they all had a mode of 4.0
and a mean within the range of 3.4 to 3.6:
- The scenario-based approach supported learning
- Practice reinforced learning
- Feedback from practice was relevant
- Confidence in answering the assessment questions

Participants' open-ended responses endorsed the scenario-based approach and the practice
activities. One participant stated, "I felt like someone was actually speaking to me," and
another stated, "The practice opportunities helped reinforce my learning."

The open-ended responses also revealed some areas of weakness. Some participants indicated that
more practice exercises could have been included. Grammatical errors were noted as well.
Knowing the time required for completion seemed to be an issue; one participant stated, "It will be helpful
to know how long the module is, so I suggest time allotment for the module and/or practices."







The Extent that Interface Design was Usable
Detailed results of the survey questions related to usability of the interface are provided in
Appendix G. Overall this was the strongest area of the module, resulting in a mean of 3.5.
Additionally, it is the only area that had both a mode and a median of 4.0.

In terms of usability of the interface design, the following areas stood out, as they all had a mode of
4.0 and a mean within the range of 3.5 to 3.6:
- Navigation allowed participants the freedom to explore
- Instructions were easy to follow
- Screen content was easy to read

Open-ended responses revealed weaknesses related to the usability of the interface. For
example, not all participants noticed the navigational arrows, and not all were aware of what
stage of the module they were in as they progressed through the lesson.
The Extent that Learners were Engaged during the Instruction
Detailed results of the survey questions related to learner engagement are provided in Appendix
H. All participants agreed that the module was enjoyable and interesting and that the content was
meaningful. Another strength that stood out, as the only item in this section to have
a mode of 4.0, was that participants were motivated to complete the module.
Connection to the characters and interest in the story's progression had the lowest mean of 2.7, but
had a mode of 3.0. The open-ended responses gave mixed feedback on the use of the characters
and explain why the related Likert survey item resulted in a lower mean. Participant statements
included, "I enjoyed the material presented by the use of characters engaging in simple
conversation, while providing situational examples of each. I also thought the body/facial
gestures were engaging; specifically, the 'lightbulb' moments in gestures were timed appropriately
with myself learning and forming those associations with the material." Participants who were
not supportive of the use of characters made statements such as, "The characters weren't
integral to my interest in the lesson so I would have continued whether or not there were
characters and a storyline."
Some participants noted that their level of motivation decreased when the material became very
technical, as stated by one participant: "My motivation changed during the lesson especially when
there was a lot of technical information."



Discussion and Recommendations
Design of Instruction to Facilitate Achievement of Learning Objectives
The evaluation results on the extent that the design of instruction facilitated achievement of
objectives highlight the value of collecting evaluation data from multiple sources. Data Collection
seemed to be a content area that was well covered, as participants felt confident in their knowledge
of this area and performed very well. Data Analysis and Project Logic were two areas in which
participants reported differing levels of confidence in their knowledge. The lowest confidence
level was related to Project Logic. Nonetheless, as discussed in our results, performance on the
posttest was better in Project Logic than in Data Analysis.
After seeing the low mean of the Data Analysis section, we compared the structure of this section
to the other two content areas and discovered that it did not have any practice exercises. The lack of
practice exercises could account for the difference, especially given that students felt that
the practice exercises helped their learning. In addition, it would be useful to examine the
nature of the assessment questions to see if there are any differences. The fact that students did
better in Project Logic than they expected suggests it was taught well, but their confidence level is
still an area of concern. Based on the data findings we propose the following
recommendations for improvements to facilitate increased achievement of learning objectives.
- Include practice exercises in the Data Analysis content chunk.
- Examine the content for Project Logic and assess what could have contributed to
students having such a low level of confidence. Perhaps the complexity of the
topic could be reduced.
- Have someone proofread the module and make final edits.
- Include an estimated time to complete the module.
- Have the assessment instrument validated to ensure alignment between
objectives, content and assessment.
Interface Design Usability
The module was very strong in usability design, as indicated in the results. As one participant put
it, "The interface design was well organized and easy to read and follow," while another stated,
"The slides are consistent which makes it easier to follow the lesson."

We had challenges formulating recommendations for this section because only two weaknesses
were identified, each by a single participant. However, after examining the suggestions, we
determined that at least one change could be helpful to other learners: making it clearer where
learners are in the module.

A map is included at the beginning of the module and of each content chunk (see Figure 2), and a
navigational number appears in the top left-hand corner of the screen when the topic changes
within the module. However, we recommend providing an explanation of this approach at the
beginning of the module so learners may be on the lookout for it as they progress through the
lessons.
Figure 2: Module Navigational Map










Learners' Engagement During Instruction
The survey items that received the highest ratings in this section reflect that the overall module
was engaging. It is commendable that most participants strongly agreed that they were motivated
to complete the module. The mixed reaction to the characters may be due to differences in
learning style and not necessarily to the characters being incorporated poorly. In fact, the
characters were part of the scenario-based approach that received high ratings under content
presentation.
The fact that motivation was reported to decrease when the material became technical
highlights how different areas of a learning context affect each other. Project Logic, the content
chunk that had the lowest knowledge confidence level, is the most technical content area in the
module. There may be a relationship between learners' motivation level and their confidence in their
knowledge. Please note that there is no empirical evidence to prove this; it is just one possible
explanation that could be further examined.



Based on the data, the following are recommended:
- Provide learners the option to view the content without the characters (for example,
provide a text-based outline of the content).
- Review how technical material is presented and determine if it is possible to reduce the
cognitive load.

In summary, the module was well developed, with most participants agreeing with all the items in
the survey. Despite the overall strength of the module, there are still areas for improvement as
discussed. If the key recommendations are implemented, the evaluators are confident that this
module will facilitate achievement of the learning objectives while keeping learners engaged. The
measures of central tendency greatly helped in identifying the strengths and weaknesses of the module.



Appendix A
Evaluation Instructions
Thank you for agreeing to evaluate this module from the PMD Pro 1 online training program!
Your input will remain confidential and will only be used to make improvements to the module
prior to release to learners.
The evaluation process involves the completion of 4 steps. You will have access to everything
online. If you have any questions or need further clarification, please do not hesitate to contact
one of the evaluators below.
Wendy Gentry: wag@vt.edu
Julaine Fowlin: jfowlin@vt.edu

Step 1: Complete the Introductory Questionnaire so that we may learn a little about you
and your knowledge of project management concepts. This questionnaire may be accessed
through the following link:
https://www.surveymonkey.com/s/LINGOSPMDPRO1INTRODUCTION

Step 2: Complete Online Module. This module may be accessed through the following
link:
http://dl.dropbox.com/u/53253012/Module%202%20-%20March29/player.html


Step 3: Complete the Knowledge Questionnaire which contains questions related to the
content examined in the online module. This questionnaire may be accessed through the
following link:
https://www.surveymonkey.com/s/LINGOSPMDPRO1KNOWLEDGE


Step 4: Complete the Survey which contains questions related to the online module. This
survey may be accessed through the following link:
https://www.surveymonkey.com/s/LINGOSPMDPRO1SURVEY
Sincerely,
Wendy and Julaine






Appendix B
Introductory Questionnaire



Appendix C
Knowledge Questionnaire


Appendix D
Survey



Appendix E
Alignment of Key Questions, Survey Questions and Data Analysis Methods
Key Question 1
To what extent does the design of the instruction facilitate achievement of the learning objectives?

Survey Questions

Below are a series of statements related to what you were expected to learn in the module;
please indicate whether you: Strongly Agree, Agree, Disagree or Strongly Disagree.
- It was made clear what I was expected to learn from the module.
- Data Collection Phase: I know the factors that influence data collection
including: triangulation, community needs and stakeholders.
- Data Analysis Phase: I know the approaches and tools for data collection
including: problem and asset based approaches, project interventions, problem
and objective trees.
- Project Logic Phase: I know the features of the project logical framework
including: vertical logic, horizontal logic and key parameters.

Below are a series of statements related to the module's Content; please indicate
whether you: Strongly Agree, Agree, Disagree or Strongly Disagree.
- The vocabulary in the module was easy to understand.
- The content was adequately broken down into subsections.
- The content was well sequenced and flowed logically.
- The Delta River Municipality examples related to the content and supported
my learning.
- The practice activities helped reinforce my learning.
- The number of practice activities was adequate.
- The feedback from the practice activities was relevant.
- The questions on the assessment at the end of the module were appropriate for
the instruction.
- I felt confident when answering the assessment questions at the end of the
module.

If you answered "Strongly Agree" or "Agree" to any of the items above, please tell
us what you liked about the CONTENT of the module.
If you answered "Strongly Disagree" or "Disagree" to any of the items above, please
tell us what you did not like about the CONTENT of the module and suggest some
changes for improvement.

Note: In addition to the survey questions above, the participants' performance on
the Introductory and Knowledge Questionnaires is compared. These instruments
test the participants' knowledge of project management concepts both before and
after viewing the module. These project management questions are provided in
Appendices B and C.

Data Analysis Method

The data was analyzed quantitatively by assigning numbers from 1-4 to the Likert
scale options, with 4 being Strongly Agree, 3 Agree, 2 Disagree and 1 Strongly
Disagree. Based on the total participant responses for each item, percentage
responses were calculated and reported for each scale option. The mean, mode
and median for each survey item were then calculated. The analysis was taken a
step further by also calculating the mean, median and mode for overall sections
of the survey. We compared the central tendencies and identified the range. For
the open-ended questions, the data was synthesized to identify common themes.
For the pretest and posttest scores, the overall mean, median and mode were
calculated, after which we calculated the mean, median and mode for each content
chunk. We also examined the ranges and percentiles of each content chunk. We
compared the pretest and posttest results.

Key Question 2
To what extent is the interface design usable?

Survey Question
Below are a series of statements related to the module's Interface Design. Please
indicate whether you Strongly Agree, Agree, Disagree, or Strongly Disagree.
It was easy to navigate the module.
The overall appearance of the module was appealing.
The organization of the module was clear.
Images and diagrams were used effectively to help me obtain additional
information.
The navigation allowed me the freedom to explore the module.
The instructions were easy to follow.
The content provided on the screens was easy to read.
I always knew where I was during the course.
The fonts were easy to read.
The practice activities were easy to navigate and I understood what I was
expected to do.
If you answered "Strongly Agree" or "Agree" to any of the items above, please tell
us which INTERFACE DESIGN feature(s) (i.e., aesthetics, ease of use, navigation)
you think we should definitely keep in the module.
If you answered "Strongly Disagree" or "Disagree" to any of the items above, please
tell us which INTERFACE DESIGN feature(s) (i.e., aesthetics, ease of use,
navigation) you did not like and suggest some changes for improvement.

Data Analysis Method

The data were analyzed quantitatively by assigning numbers from 1 to 4 to the
Likert scale options: 4 for Strongly Agree, 3 for Agree, 2 for Disagree, and 1 for
Strongly Disagree. Based on the total participant responses for each item, the
percentage of responses was calculated and reported for each scale option. The
mean, mode, and median were then calculated for each survey item and, going a
step further, for each overall section of the survey.
Key Question 3
To what extent are learners engaged during the instruction?

Survey Question
Below are a series of statements related to your Interest and Engagement while
completing the module. Please indicate whether you Strongly Agree, Agree,
Disagree, or Strongly Disagree.
The module was enjoyable and interesting.
I was motivated to complete the module.
I felt connected to the characters and was interested to see how the story
progressed.
My interest was sustained throughout the module.
The practice opportunities were rewarding.
The content was meaningful.
The module stimulated my curiosity.
I felt satisfied after completing the module.
If you answered "Strongly Agree" or "Agree" to any of the items above, please tell
us what it was about the module that helped to KEEP YOU INTERESTED AND ENGAGED
during the lesson.
If you answered "Strongly Disagree" or "Disagree" to any of the items above, please
tell us what caused you to become LESS INTERESTED OR TO LOSE YOUR FOCUS during the
lesson and suggest some changes for improvement.






Data Analysis Method

The data were analyzed quantitatively by assigning numbers from 1 to 4 to the
Likert scale options: 4 for Strongly Agree, 3 for Agree, 2 for Disagree, and 1 for
Strongly Disagree. Based on the total participant responses for each item, the
percentage of responses was calculated and reported for each scale option. The
mean, mode, and median were then calculated for each survey item and, going a
step further, for each overall section of the survey.


Appendix F
Evaluation Results Related to Achievement of Learning Objectives

Responses to Closed-Ended Questions






Survey Statement: each row below lists the percentage of responses for Strongly
Agree (4), Agree (3), Disagree (2), and Strongly Disagree (1), followed by the
mean, mode, and median for that statement.
It was made clear what I was expected
to learn from the module.
28.6% 71.4% - - 3.3 3.0 3.0
Data Collection Phase: I know the
factors that influence data collection
including: triangulation, community
needs and stakeholders.
14.3% 71.4% 14.3% - 3.0 3.0 3.0
Data Analysis Phase: I know the
approaches and tools for data analysis
including: problem and asset based
approaches, project interventions,
problem and objective trees.
28.6% 71.4% - - 3.3 3.0 3.0
Project Logic Phase: I know the features
of the project logical framework
including: vertical logic, horizontal logic
and key parameters.
14.3% 42.9% 42.9% - 2.7 3.0 3.0
Summary for All Above Survey Statements 3.1 3.0 3.0
The vocabulary in the module was easy
to understand.
42.9% 42.9% 14.3% - 3.3 3.0 3.0
The content was adequately broken down into subsections.
42.9% 57.1% - - 3.4 3.0 3.0
The content was well sequenced and
flowed logically.
42.9% 57.1% - - 3.4 3.0 3.0
The Delta River Municipality examples
related to the content and supported my
learning.
57.1% 42.9% - - 3.6 4.0 4.0
The practice activities helped reinforce
my learning.
57.1% 28.6% 14.3% - 3.4 4.0 4.0
The number of practice activities was
adequate.
28.6% 57.1% 14.3% - 3.1 3.0 3.0
The feedback from the practice
activities was relevant.
57.1% 28.6% 14.3% - 3.4 4.0 4.0
The questions on the assessment at the
end of the module were appropriate for
the instruction.
42.9% 57.1% - - 3.4 3.0 3.0
I felt confident when answering the
assessment questions at the end of the
module.
57.1% 28.6% 14.3% - 3.4 4.0 4.0
Summary for All Above Survey Statements 3.4 4.0 3.0
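As a sanity check on the table above, each reported mean can be recovered from its row's response percentages as a weighted average of the scale values. The sketch below is our illustrative addition, not part of the original analysis; `mean_from_percentages` is a hypothetical helper name.

```python
# Recover an item's mean from its reported response percentages by taking
# a weighted average of the Likert codes (4 = Strongly Agree ... 1 = Strongly
# Disagree), rounded to one decimal as in the table.
def mean_from_percentages(pct_sa, pct_a, pct_d, pct_sd):
    """Weighted mean of Likert codes 4/3/2/1 given response percentages."""
    return round((4 * pct_sa + 3 * pct_a + 2 * pct_d + 1 * pct_sd) / 100, 1)

# First row above: 28.6% Strongly Agree, 71.4% Agree -> reported mean 3.3.
print(mean_from_percentages(28.6, 71.4, 0.0, 0.0))  # 3.3

# Project Logic row: 14.3% / 42.9% / 42.9% / 0% -> reported mean 2.7.
print(mean_from_percentages(14.3, 42.9, 42.9, 0.0))  # 2.7
```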


Responses to Open-Ended Questions

If you answered "Strongly Agree" or "Agree" to any
of the items above please tell us what you liked about
the CONTENT of the module.
If you answered "Strongly Disagree" or "Disagree"
to any of the items above please tell us what you did
not like about the CONTENT of the module and
suggest some changes for improvement.

I enjoyed that the content presented examples
simply because it brings clarity to understanding
the material.

the content was clear and broken into easy-to-
learn chunks.

The number of practice activities could have been
increased. Also, the complexity of the activity
could help with a thorough appreciation of the
topic, possibly by using examples not related to
the Delta River project

While completing the course I felt like someone
was actually speaking to me. So I think the content
is well put together. There are a few grammatical
errors though and one of the questions on the
survey has an error in the responses. The
responses outlined were: Goal, outcome, activities,
and outcome.

I liked how the concepts were explained and that
the content builds on what was learned in previous
lesson. This provides the learner with a bigger
picture of the module.


Practices are definitions of terms, to reinforce
learning perhaps scenarios relevant to the lesson
should be considered. Depending on the time
available and what is to be achieved then the
practices maybe increased or reduced. Anyway,
the assessment covers most of the questions asked
in practices.

Appendix G
Evaluation Results Related to Usability of Interface


Responses to Closed-Ended Questions






Survey Statement: each row below lists the percentage of responses for Strongly
Agree (4), Agree (3), Disagree (2), and Strongly Disagree (1), followed by the
mean, mode, and median for that statement.
It was easy to navigate the module.
57.1% 28.6% 14.3% - 3.4 4.0 4.0
The overall appearance of the module
was appealing.
42.9% 57.1% - - 3.4 3.0 3.0
The organization of the module was
clear.
42.9% 57.1% - - 3.4 3.0 3.0
Images and diagrams were used
effectively to help me obtain additional
information.
42.9% 57.1% - - 3.4 3.0 3.0
The navigation allowed me the freedom
to explore the module.
57.1% 42.9% - - 3.6 4.0 4.0
The instructions were easy to follow.
57.1% 42.9% - - 3.6 4.0 4.0
The content provided on the screens
was easy to read.
57.1% 42.9% - - 3.6 4.0 4.0
I always knew where I was during the
course.
42.9% 42.9% 14.3% - 3.3 3.0 3.0
The fonts were easy to read.
42.9% 57.1% - - 3.4 3.0 3.0
The practice activities were easy to
navigate and I understood what I was
expected to do.
57.1% 28.6% 14.3% - 3.4 4.0 4.0
Summary for All Above Survey Statements 3.5 4.0 4.0


Responses to Open-Ended Questions

If you answered "Strongly Agree" or "Agree" to any
of the items above please tell us which INTERFACE
DESIGN feature(s) (i.e. aesthetics, ease of use,
navigation) that you think we should definitely keep in
the module.
If you answered "Strongly Disagree" or "Disagree"
to any of the items above please tell us which
INTERFACE DESIGN feature(s) (i.e. aesthetics, ease
of use, navigation) you did not like and suggest some
changes for improvement.

the colors, images, the buttons at the bottom

Interface design well organized and easy to read and
follow.

The course is good as it is. However, if the resources
are available you could consider adding audio

The slides are consistent which makes it easier to
follow the lesson.

Although there are blocks at the base of the screen to
follow the lesson. Due to the length of entire lesson, I
wasn't sure where I was within the lesson... Maybe a
second level of blocks could be placed below the
"Primary level" - Data Collection, Project Logic etc
which indicates step 1, 2, 3 etc as outlined in the
introductory slide of each lesson.

It was not easy to see the navigation arrows at the
beginning but latter I became comfortable. Perhaps the
arrow keys should be moved to the bottom right or
make the color darker (similar to the yes/no buttons on
the first slide). The practices i did not know what to
expect and they were not interesting.


Appendix H
Evaluation Results Related to Learner Engagement


Responses to Closed-Ended Questions






Survey Statement: each row below lists the percentage of responses for Strongly
Agree (4), Agree (3), Disagree (2), and Strongly Disagree (1), followed by the
mean, mode, and median for that statement.
The module was enjoyable and
interesting.
28.6% 71.4% - - 3.3 3.0 3.0
I was motivated to complete the
module.
42.9% 42.9% 14.3% - 3.3 4.0 3.0
I felt connected to the characters and
was interested to see how the story
progressed.
14.3% 57.1% 28.6% - 2.9 3.0 3.0
My interest was sustained throughout
the module.
28.6% 42.9% 28.6% - 3.0 3.0 3.0
The practice opportunities were
rewarding.
14.3% 71.4% 14.3% - 3.0 3.0 3.0
The content was meaningful.
28.6% 71.4% - - 3.3 3.0 3.0
The module stimulated my curiosity.
28.6% 57.1% 14.3% - 3.1 3.0 3.0
I felt satisfied after completing the
module.
28.6% 57.1% 14.3% - 3.1 3.0 3.0
Summary for All Above Survey Statements 3.1 3.0 3.0


Responses to Open-Ended Questions

If you answered "Strongly Agree" or "Agree" to any
of the items above please tell us what it was about the
module that helped to KEEP YOU INTERESTED
AND ENGAGED during the lesson.

If you answered "Strongly Disagree" or "Disagree"
to any of the items above please tell us what caused
you to become LESS INTERESTED OR TO LOSE
YOUR FOCUS during the lesson and suggest some
changes for improvement.

I enjoyed the material presented by the use of
characters engaging in simple conversation, while
providing situational examples of each. I also
thought the body/facial gestures were engaging,
specific the 'lightbulb' moments in gestures were
timed appropriately with myself learning and
forming those associations with the material.

The subject was interesting. The practice
opportunities were good. Completing the module
made me realized how technical this subject is.
And I am willing to learn more about it.

The practice opportunities helped to reinforce the
lesson learned. Likewise, the quiz at the end
helped in recalling all that was taught in the
module.

Module was designed for a good learning
environment.

The module was well designed. The content,
objectives and activities were all aligned. I could
tell that a lot of hard work and thought went into
the design and structure of the module. Job well
done!

Depending on the target audience, the content is
written in simple English and no verbiage.
Moreover the pictures makes the lesson interesting
and are coherent with the text. The maps at the
beginning provides a summary that helps to know
what to expect.


My motivation changed during the lesson
especially when there was a lot of technical
information.

The characters weren't integral to my interest in
the lesson so I would have continued whether or
not there were characters and a storyline.

I dont understand what purpose do the practices
serve, since they are definitions of project terms. I
wish there was on option to skip practice.
However, the assessment at the end is necessary.





Appendix I
Evaluation Results Related to Overall Module


Responses to Open-Ended Questions

There is always room for improvement!
If you could change anything about this module, what would it be and why?

I notice 3 mistakes (repeated words or typo now vs how). The practice activities were quite simple, so
added complexity could reinforce the lesson.

If the team decides to keep the practices as they are now, then the assessment questions should be changed
because it is repetition. However, the decision should be based on the objective of the module.

If you have other suggestions, comments, and recommendations about this module or the evaluation process,
please write them here. Your contributions are welcomed and appreciated!

I'm not sure who the audience of this tutorial is, if they are involved in community health or NGO related
activities then this could be helpful. However, if the audience is in a different field, then the examples used
or "storyline" may not be the most relevant.

It will be helpful to know how long the module is so I suggest time allotment for the module and/ or
practices.

Вам также может понравиться