
2014 American Control Conference (ACC)

June 4-6, 2014. Portland, Oregon, USA

Applying Problem-Based Learning to Instruction of System Dynamics and Controls

Nuno Filipe1 and Amy Pritchett2

Abstract: Problem-Based Learning is a pedagogy emphasizing student learning via solving authentic, difficult problems in teams. This paper describes the application of Problem-Based Learning in a large system dynamics and controls course for third-year undergraduate students in aerospace engineering at the Georgia Institute of Technology. The pros and cons of applying Problem-Based Learning to the teaching of controls are reviewed. The course design and implementation are described. Student opinions and grades (in this course and subsequent courses) are analyzed. The paper concludes with a discussion of lessons learned and recommendations for the application of problem-based learning.

I. INTRODUCTION
Problem-Based Learning (PBL) is an instructional method
which requires teams of students to solve a tough, authentic
problem as a means to an end. The problem is carefully
crafted by the instructor to require students to:
- identify what knowledge they need to acquire;
- reflect on the applicability of this information to the
problem and their depth of understanding;
- develop the skills essential to problem solving within
the domain;
- stay active in learning activities spanning gathering
information, assessing and applying knowledge as it is
gained, and problem solving; and
- assume ownership and responsibility for their learning.
Thus, PBL fits within the broader construct of instructional
methods focusing on active learning. Such methods can
include simple interventions within lecture-based instruction
such as think-pair-share activities to promote active reflection
and discussion on specific elements of content, while leaving
the instructor to determine which knowledge is presented
to the students. PBL, however, further emphasizes that the
students should also learn the skills inherent to solving
problems, including identifying which information they need
to find, reflecting upon their knowledge as it is developed
relative to its contribution towards solving the problem, and
learning how to break down the problem itself. Further, PBL
generally expects students to work within teams for two
main reasons: first, to develop team skills in the common
context of a problem too large for any one engineer to solve;
and second, to promote the benefits of near-peer instruction
between students as they individually master concepts and
then find they need to also teach these to their team mates
[1], [2].
1 N. Filipe is a Ph.D. candidate at the School of Aerospace Engineering,
Georgia Institute of Technology, Atlanta, GA 30332-0150, USA. Email:

nuno.filipe@gatech.edu
2 A. Pritchett is the David S. Lewis Associate Professor at the School
of Aerospace Engineering, Georgia Institute of Technology, Atlanta, GA
30332-0150, USA. Email: amy.pritchett@ae.gatech.edu


The intended benefits of PBL may be framed in terms of the intended student outcomes associated with accreditation of engineering programs as defined by the Accreditation Board for Engineering and Technology (ABET) [3]:
(a) an ability to apply knowledge of mathematics, science,
and engineering
(b) an ability to design and conduct experiments, as well
as to analyze and interpret data
(c) an ability to design a system, component, or process
to meet desired needs within realistic constraints such
as economic, environmental, social, political, ethical,
health and safety, manufacturability, and sustainability
(d) an ability to function on multidisciplinary teams
(e) an ability to identify, formulate, and solve engineering
problems
(f) an understanding of professional and ethical responsibility
(g) an ability to communicate effectively
(h) the broad education necessary to understand the impact
of engineering solutions in a global, economic, environmental, and societal context
(i) a recognition of the need for, and an ability to engage
in life-long learning
(j) a knowledge of contemporary issues
(k) an ability to use the techniques, skills, and modern
engineering tools necessary for engineering practice.
First, the active learning aspects of PBL are generally held to foster students' knowledge development beyond the passive
learning endemic to traditional lecturing, thus contributing to
ABET student outcome (a) [2], [4]. Second, PBL further has
a unique capability to also potentially address the other student outcomes (b) through (k) when the problem is carefully
designed to require students to demonstrate (and, perhaps,
acquire) their associated abilities. PBL problems inherently
focus on teams (outcome (d)) and problem solving (outcome
(e)). Students are required to learn on their own, providing
them with the inquiry and research skills fostering lifelong learning (outcome (i)). Students are required to apply
techniques, skills and modern engineering tools involved in
engineering practice (outcome (k)), and to design something
(outcome (c)). PBL problems can also be designed to require some experimentation (outcome (b)), to consider some
aspects of professional and ethical responsibility (outcome
(f)), and to understand the impact of their solution in context
(outcome (h)) and relative to contemporary issues (outcome
(j)). Finally, PBL is commonly constructed to require final
presentations and written submissions graded relative to
standards for effective communication (outcome (g)).


Throughout, PBL requires students to assume responsibility for their own learning and for their team's ultimate
solution to the problem. However, PBL should not be implemented as simply handing students a problem and then
standing back to see what they do with it [5]. Instead, the
problems should be carefully designed to implicitly require
the desired learning activities by the students. Then, the
course must be carefully implemented such that the student
teams are carefully facilitated, requiring the students to properly reflect on their activities and recognize the knowledge
and skills they need to attain. Throughout, the students
should be consistently challenged to attain the desired level
of understanding and capability.
When the class size is small and the learning objectives
are inherently hands-on, PBL can be easy to implement and,
indeed, has been widely applied. Some domains, notably
medical schools, have so extensively applied PBL that their
curriculum may be designed around it. However, the application of PBL to large engineering classes is novel, as is
its application to the instruction of the theory underlying
system dynamics and controls. This paper describes the
design and implementation of a large junior-level course
titled System Dynamics and Automatic Control. Its outcomes
are analyzed in terms of both student opinions and in
terms of learning outcomes as identified by comparing final
exam scores, course grades, and subsequent progress of the
students through the curriculum between the section using
PBL and other sections applying a lecture-and-homework
instructional method. The paper concludes with a summary
of lessons learned and implications for other instructors in
teaching similar topics in engineering curricula.
II. COURSE DESIGN
In designing a PBL course on system dynamics and
controls, the first and most crucial step is to detail the
course objectives. Common curricula only specify high-level
goals; these must be further broken down into detailed
learning objectives spanning the ABET skill set (roughly 180
detailed objectives). For this course, these learning objectives
included about 150 content-related objectives of a typical
lecture-based class (ABET (a)), such as:
i) ability to break dynamic systems down into component
dynamics which can be modeled as first and second
order systems,
ii) use toolset of fundamental modeling concepts to create
useful models of system dynamics, and
iii) use toolset of controller design concepts to create useful
compensators.
PBL also has the unique ability to address roughly 30 skill-related objectives for this course (ABET (b) through (k)),
such as:
iv) model a physical system in a manner allowing for
analysis of system dynamics and for controller design,
v) design a control system to customer requirements, and
vi) present to a technical audience using proper terminology
and mathematical models.
For each course objective, detailed course content and criteria for mastery were identified. After several months of brainstorming, consulting with experts in PBL and with other technical experts, these course objectives were then grouped thematically as best as possible into four problems:
1) Design of a Seismograph, where the students had to,
among other things:
a) design and build a prototype seismograph using cheap
common materials, and
b) develop the system's transfer function and identify its parameters experimentally (a worked sketch of this kind of model is given after this list);
2) Structural Dynamic Model of a Bending Wing and
Supporting Aerodynamic Analysis, where the students
had to, among other things:
a) create a structural dynamics model of a B787 wing using the panel method,
b) evaluate the deflection and twist of the wing in
response to constant loads, sudden turbulence, and
deflections of the aileron, and
c) simulate the complete model in Matlab;
3) Design of a Heading Controller for an Air Transport
Aircraft, where the students had to, among other
things:
a) develop a PID heading controller and implement and
demonstrate it in a provided computer simulator of a
Boeing 747-400 programmed in C++, and
b) meet control performance requirements specified by
the client; and
4) Design of a Guidance Controller for a Launch Vehicle, where the students had to, among other things:
a) develop a pitch controller for a launch vehicle with
unstable dynamics and a lightly damped flexible body
mode,
b) meet control performance requirements specified by
the client, and
c) make a trade-off analysis between SISO, MIMO, and
full-state feedback control.
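To give a concrete, deliberately minimal illustration of the physics-based modeling expected in Problem 1, consider the textbook idealization of a seismograph as a single-axis mass-spring-damper: a proof mass m attached through a spring of stiffness k and a damper with coefficient c to a case that follows the ground displacement x_g, with z the displacement of the mass relative to the case. This idealization is an assumption made here for illustration, not the students' actual design; it yields

\[
m\ddot{z} + c\dot{z} + kz = -m\ddot{x}_g
\quad\Longrightarrow\quad
\frac{Z(s)}{X_g(s)} = \frac{-s^2}{s^2 + 2\zeta\omega_n s + \omega_n^2},
\qquad
\omega_n = \sqrt{k/m},\quad
\zeta = \frac{c}{2\sqrt{km}}.
\]

Identifying the parameters experimentally, as item 1(b) requires, then reduces to estimating \(\omega_n\) and \(\zeta\), for example from the period and logarithmic decrement of the prototype's free response.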
The wording of each problem statement was carefully tailored to steer the students towards the desired course objectives. For example, the statement of Problem 3 noted that, in
questioning during their final presentation, the presenter may
be asked to hand-sketch a root locus illustrating the impact
of any changes in the aircraft flight dynamics on the predicted
performance of their proposed controller design; thus, a
standard was given to students for learning that required
sketching a root locus and understanding how to predict
stability, transient response and frequency response from the
location of closed-loop poles on the complex plane. Projects
were cumulative; i.e., each project added new content and
supported prior content. A unique scoring rubric was created
for each project based on the course objectives planned for
each project.
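To illustrate the workflow this standard points students towards, the following is a minimal sketch in Matlab (assuming the Control System Toolbox) of a root locus and closed-loop step-response check for a hypothetical first-order-plus-integrator heading plant under PID control; the plant, gains, and variable names are assumptions for illustration only, not the provided Boeing 747-400 simulator or any team's actual design.

% Minimal sketch: a hypothetical heading plant, NOT the provided 747-400 simulator.
s = tf('s');
G = 1/(s*(5*s + 1));      % assumed roll-to-heading dynamics: first-order lag plus integrator
C = pid(2, 0.1, 1.5);     % PID gains chosen purely for illustration
L = C*G;                  % open-loop transfer function
figure; rlocus(L);        % the hand-sketchable root locus referred to in the problem statement
T = feedback(L, 1);       % unity-feedback closed loop
figure; step(T, 30);      % transient response, to be checked against the client's requirements
stepinfo(T)               % overshoot and settling-time summary

Re-plotting the locus after perturbing a pole of G shows how the closed-loop pole locations, and hence the predicted stability and transient response, shift with changes in the flight dynamics, which is exactly the kind of reasoning the panel questioning was designed to probe.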
Further, the required deliverables of the problems were
carefully detailed to include a presentation, a report and, in
some problems, models or simulations conducted in Matlab.
Along with the problem statement, the student teams were
provided with a detailed rubric that highlighted the standards
that the problems must attain, particularly emphasizing the
use of underlying physics-based, mathematical models to
drive their solutions rather than a control design made by
trial-and-error. To further require learning by all students within the team, the requirement for the presentation noted that one student would be chosen at random out of the group to give the presentation for the entire team; this was specifically designed into the problem to encourage peer instruction and teamwork.
III. COURSE IMPLEMENTATION
The implementation of the course design described in
Section II started with the release of the course syllabus to the
students. The syllabus laid out the ground rules of PBL to the
students for the first time. It made sure the project deadlines
were clear from the first day of class and established the
grade cut-offs.
Whereas lectures were generally anticipated by the instructor, specific dates and content were not built into the
syllabus, with the exception of review/debrief lectures after
every project. Lectures were only offered when students could articulate what they wanted to know and why, at a level illustrating genuine reflection on the recommended references. When individual groups could articulate their own isolated stopping points, in addition to the facilitator's feedback, the instructor attended a group meeting for a
focused tutorial. Finally, four office hours per week were
established in the syllabus with the possibility of extra office
hours per arrangement.
During the first day of classes, groups were randomly
formed and, for each one of them, project sites were generated in the university's online learning management system.
With this system, group members could share documents
and collaborate with each other online. In total, 10 groups
were formed, each containing 12 or 13 students. Starting
with the second class period, most contact time between the
students and the teaching staff was during scheduled class
times used for teams to meet with each other, each time
for one hour and thirty minutes. Every two groups had to
share a room and a facilitator. The facilitators were present
in every group meeting and divided their time between the
two groups assigned to them; the instructor rotated among the rooms to monitor the overall progress of each
group, answer specific questions and, on occasion, provide a
mini-lecture in response to a cogent request from a student
team. The facilitators were especially mentored on how to
use their time efficiently with each group (for key training
materials, see [6], [5]). Every week, the facilitators were in
charge of scoring the inquiry/progress reports turned in by
the students, of providing feedback to the students on their
performance (individually and as a group), and of keeping
the instructor informed of the progress of the groups. It was
the task of the instructor to oversee this scoring and mentor
the facilitators, especially at the beginning.
One of the goals of the PBL method is to teach the students
how to search for relevant technical information on their
own. Hence, while some references were provided to help
the students complete the projects (notably a recommended
textbook, and videos of past lectures and past homework
with solutions as further examples of applications of system
dynamics and controls to aerospace), the students were also
mentored on how to search rigorous data sources for further
research such as databases of peer-reviewed journal articles.

Detailed assessments were made throughout the semester that were both summative and formative. The summative assessments were designed to identify and quantify the students' learning at key points throughout the semester
(particularly at the conclusion of each problem), and the
formative assessments were intended to direct the students
to higher-performance behaviors. Examples of the summative
assessments included:
i) project scores,
ii) midterms after each project, and
iii) final peer evaluation after the last project.
Examples of the formative assessments included:
i) weekly inquiry/progress updates delivered by the students,
ii) comments by the facilitators on those inquiry/progress
updates and on the performance of the group during
group meetings (facilitator comments were based on
the scoring rubric for each project to promote high
standards and inter-rater reliability),
iii) interim self and peer evaluations provided to facilitators
as preparation of leading teams in self-assessment
sessions as the teams reflected on performance on a
just-completed problem and planned their process on
a just-assigned problem. (These were also all read by
the instructor to detect potential problems requiring
discreet, direct, individualized intervention), and
iv) detailed written feedback on project presentations and
reports.
The interim self and peer evaluations were based on an
assessment rubric distributed to the students at the beginning
of the semester. The rubric was divided into four areas: inquiry
skills, knowledge building, problem solving, and team skills.
For each one of these areas, the rubric established the
prerequisites for an "exceptional" (A), "proficient" (B), "fair" (C), or "poor" (D) grade. The students were asked to assess themselves and their group peers on these four areas based on this rubric. As an example, the prerequisites for an "A" in inquiry skills were:
- actively looks for and recognizes inadequacies of existing knowledge,
- consistently seeks and asks probing questions,
- identifies learning needs and sets learning objectives,
- utilizes advanced search strategies,
- always evaluates inquiry by assessing reliability and
appropriateness of sources.
At the end of each project, in addition to having to
deliver a complete project report, the groups had to present
their work to a review panel formed by usually three or
four persons, each reviewing two or three 25-minute presentations. The review panels were formed by at least one
faculty member from the school, facilitators (not reviewing
their own group), industry/operational experts when possible,
other faculty, and experts on teaching and learning familiar
with methods of questioning and challenging students. Each
panelist was given a detailed scoring rubric. Feedback on the
presentations was provided to the groups within 24 hours,
often on the same business day. Groups were allowed a
few days to finish their project reports after receiving this
feedback. Detailed written feedback was also provided on the problem presentations and reports, to encourage and inform student performance on the subsequent problems.
At the conclusion of each problem, a review session was
held on the underlying technical content. The students then
individually completed mid-term exams (and a final exam after the last problem), providing the opportunity to assess them
individually and additional motivation for all students to
thoroughly learn the required course material individually.
IV. RESULTS
This section starts with the students' experience during the PBL course and their reflections on it at the end of the semester. Ultimately, the most important results are the impact of this instructional method on student learning; significant gains were found, as detailed at the end of this section.
Examining how students fared during the course, the first
result worth noting is an improvement of roughly half a letter
grade in project scores over the semester. For example, the
project scores of one of the groups from the first to the
last project were 85%, 89%, 88%, and 90%. We attribute
this improvement to three main factors: the adjustment of
the students to the PBL method, which was new to all of
them; more specifically, the acquisition of professional skills
in inquiry and knowledge development, problem solving
and teamwork which served as enablers to better addressing
their assigned problems; and a higher level of understanding
of dynamics and controls by the students as the semester
progressed.
To understand the students' experience within and opinions about the efficacy and impact of PBL on teaching system
dynamics and automatic control, two surveys were completed
by students. The first one was a standard survey sent out by
Georgia Tech at the end of the term, asking the students to
evaluate the class and its instructors. Georgia Tech uses the
same survey for every class on campus. It is important to note
that this survey was designed with the lecture-based method
in mind and, hence, many of its questions are not adequate to
gather the opinion of the students on the PBL method. Hence,
a second voluntary web-based survey was later sent out to
the students by the instructor, more pinpointed to analyze
their opinion on specific aspects of the PBL method and
administered after the students had had more time to reflect
on the course.
In the first survey, the main complaint students had about
the class was that it was too hard and required too much
work. On previous lecture-based offerings of the same class,
that was also the main complaint. Something that changed,
however, was the spread of scores obtained by the class.
The survey revealed a wider spread in the PBL class than in previous offerings, suggesting that the students' opinion about the class was less homogeneous than before. Figure 1 shows how the students rated their agreement with the statement "Considering everything, this was an effective course." on a scale from 0 (strongly disagree) to 5 (strongly agree).
The relatively low score compared to other lecture-based
offerings of the same class can be explained in part by this
wider spread, with some zero marks pushing the average
down. On the other hand, some questions consistently received scores higher than 4 out of 5: the instructor's enthusiasm and the instructor's respect and concern for the students.

Fig. 1. Students' agreement with the statement "Considering everything, this was an effective course.", for different offerings of the class.

Out of 124 students, 62 answered a second, voluntary, web-based survey later sent out by the instructor. The students were asked seven questions. The results of three of
those questions are shown in Figures 2, 3, and 4. They reveal
that a majority of the students believed that the PBL class
was very effective in helping them learn the contents of the
class and that, if given a choice in the future between the
PBL method and the lecture-based method, they would prefer
the PBL method.

Fig. 2. Student responses to the survey question "How effective was this course in helping you learn the course content?"

The next set of analyses examined student learning of course content, as reflected in both final exam scores and
in subsequent performance in follow-on courses. Four final
exam questions were kept identical (except for superficial
details and specific parameter values) to questions applied
in a previous lecture-based offering of the course with the


Fig. 3. Student responses to the survey question "How effective was this course in helping you attain each of the ABET learning objectives?"

same instructor. The scores obtained by the students on these four questions are compared with the previous scores in Table I. The data shows that the students obtained nearly identical scores in the course applying PBL (and thus
also addressing development of skills beyond what can be
assessed in a final exam) and in a prior course offering
focused exclusively on the content covered in the final exam.
TABLE I
COMPARISON BETWEEN SCORES OBTAINED WITH THE PBL METHOD AND WITH THE LECTURE-BASED METHOD ON QUESTIONS OF THE FINAL EXAM

Question                               PBL       Lecture-based
Transfer functions                     96.72 %   91.78 %
Modeling (EOM to transfer function)    61.04 %   55.91 %
Root locus                             83.2 %    84.93 %
Frequency response/Bode diagram        71.72 %   73.31 %

The most significant results were found in an analysis of the students' final grades in two subsequent classes. This course, AE3515 System Dynamics and Automatic Control, served as a prerequisite for both AE3521 Introduction to Flight Mechanics and AE4525 Controls Lab. The scores of students in the PBL offering of 3515 and prior lecture-based offerings of 3515 were compiled in a dataset that also matched each individual's score in 3515 with their subsequent scores in 3521 and 4525. A factor analysis was used
to identify which factors statistically impact students' scores;
differences between instructors were found that were then
corrected for in the following analysis. With these corrections
for confounds, students in the PBL offering of AE3515 were
found to later perform significantly better in the subsequent
controls lab AE4525: the numerical average score improved
by 0.35 (roughly 1/3 of a letter grade), a significant difference
with 95% confidence. Students in the PBL offering of this
class were also found to later perform slightly better in the
flight mechanics course AE3521: the numerical average score
improved by 0.16, although this effect size was not profound
enough to be statistically significant with 95% confidence.
Indeed, the scores of the students in the PBL section were found to correlate significantly with (i.e., statistically predict)
their performance in subsequent courses; in contrast, student
scores in the lecture-based offering of AE3515 were found to
have a weak correlation with subsequent scores in AE3521
and no correlation with subsequent scores in AE4525; no other factor, such as the instructor of an AE3515 section, was found to cause a statistical correlation between their
scores in AE3515 and subsequent courses.
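The statistical analysis itself is not reproduced here, but the kind of confound-corrected comparison described above could be sketched in Matlab (assuming the Statistics and Machine Learning Toolbox) as a linear model with the instructor as a categorical factor. The file and column names below are hypothetical placeholders, and this regression-style sketch stands in for, rather than reproduces, the authors' factor analysis.

% Hypothetical sketch: one row per student, with assumed column names; not the authors' dataset or code.
T = readtable('grades.csv');                       % assumed file of matched course grades
T.pbl = categorical(T.pbl);                        % 1 if the student took the PBL section of AE3515
T.instructor3515 = categorical(T.instructor3515);  % instructor treated as a categorical confound
% Regress the follow-on course grade on the PBL flag while correcting for
% instructor differences and the student's own AE3515 grade.
mdl = fitlm(T, 'grade4525 ~ pbl + instructor3515 + grade3515');
disp(mdl.Coefficients)                             % the pbl coefficient estimates the PBL effect on AE4525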


Fig. 4. Student responses to the survey question "All other things being equal, if you have the option in the future to take [a class in your major] in a section using problem-based learning or in a section using traditional lecture-and-homework format, which would you pick?"

V. CONCLUSIONS AND LESSONS LEARNED


1) Follow-on statistical analysis proved with 95% confidence that students taking this course using PBL had
statistically higher scores in a subsequent course, and
a trend to higher scores in another subsequent course.
Indeed, controlling for a range of factors, this analysis
found that scores in this course were only correlated
with their performance in subsequent courses when they
took this course using PBL.
2) In the final exam, students in the PBL section demonstrated mastery of course content commensurate with
prior offerings using lecture-based methods.
3) PBL further addresses several ABET criteria in addition
to the focus on content in criterion (a), notably including
teamwork, life-long learning, problem-solving, design,
and use of engineering tools and techniques. Moreover,
PBL is particularly suited to help students overcome the "white page syndrome", where students, faced with a real-life engineering problem, have difficulties applying
their knowledge. PBL provides the opportunity to first
experience the novelty (and perhaps stress) of solving
a large, open-ended, authentic problem in a relatively
safe academic environment.
4) Student morale goes through profound swings during
the semester. At the lowest points, students may feel
abandoned without the usual comforting structures of
lectures and frequent homeworks, particularly when
they uncover what they do not know and struggle with
the material. This is a necessary (and usually transient)
aspect of the transition to taking responsibility for
their own learning. Indeed, the students actually receive
substantial oversight, but the facilitators challenge them
to struggle with the material to support deeper learning
and the skill development. The instructor must also constantly monitor for the need for discreet interventions,
but these should only seek to return the students to a
condition where their struggles will bear fruit.
5) Students may perceive that they are not learning all
the material. For example, many times students hand in projects wishing that they knew more, as solving only 90% of a problem highlights the students' remaining 10% gap in knowledge much more saliently than, say, a homework assignment that is 90% complete and given an "A". These perceptions are erroneous. Thus, the instructor needs to properly calibrate the students' perception of
their learning, particularly with feedback on the final
problem and the final exam, to convey to the students
the significant progress that they have made.
6) Further, low points in students' morale are natural,
transitory and (if carefully monitored) can be healthy.
The key point for the instructor is to not intervene out
of sympathy or lack of faith in the students' abilities, as
such interventions only further project that the students
are not actually responsible for their learning or capable
of solving authentic problems.
7) PBL places an intensive workload on the instructor, a workload that can be mitigated by the use of facilitators.
The use of facilitators has the additional upside of
providing unique personal contact and mentoring of
potential future faculty.
8) Whereas some students like the PBL method, others really don't. This is also true of lecture-based methods.
The lessons learned here highlight the benefits and difficulties of implementing PBL to teach system dynamics and
automatic control. Learning is improved both in terms of
course content and in terms of the other ABET student outcomes (b) through (k), and this learning is then also reflected
in higher scores in subsequent classes. However, it is an
intensive experience for the students, and requires significant
dedication and effort by a sizeable number of facilitators
(ideally, a ratio of 1 facilitator for every 7 students). Thus, its
effective implementation must be supported not only by the
instructor, but by the entire school (support for facilitators)
and by the curriculum in terms of not overlapping with other
workload-intensive courses in the same semester.
REFERENCES
[1] E. Mazur, Peer Instruction: A User's Manual, ser. Prentice Hall Series in Educational Innovation. Prentice Hall, 1997. [Online]. Available: http://books.google.com/books?id=tjcbAQAAIAAJ
[2] J. Michael, "Where's the evidence that active learning works?" Advances in Physiology Education, vol. 30, no. 4, pp. 159-167, 2006.
[3] Criteria for Accrediting Engineering Programs (2013-2014), ABET Engineering Accreditation Commission Std., October 27, 2012. [Online]. Available: http://www.abet.org/uploadedFiles/Accreditation/Accreditation Step by Step/Accreditation Documents/Current/2013 - 2014/eac-criteria-20132014.pdf
[4] D. L. Darmofal, Frontiers of Computational Fluid Dynamics 2006, ser. Computational Fluid Dynamics Series. World Scientific, 2005, ch. Educating the Future: Impact of Pedagogical Reform in Aerodynamics.
[5] A. Walsh, The Tutor in Problem Based Learning: A Novice's Guide, A. F. Sciarra, Ed. Hamilton, ON, Canada: McMaster University, Faculty of Health Sciences, 2005.
[6] H. Barrows, The Tutorial Process. Southern Illinois University School of Medicine, 1992.

