pp. 233-241, 1996
Copyright © 1996 The American University
Pergamon. Printed in Great Britain. All rights reserved
0889-4906/96 $15.00 + 0.00
S0889-4906(96)00007-5
Process Assessment in ESP: Input, Throughput and Output

Julio César Giménez
Introduction
Students' performance in ESP courses has been assessed as the end-product
of a process rather than as the reflection of an on-going process which, if
carefully controlled, would assure a more reliable final product. This well-estab-
lished practice of evaluating students' final production has been transferred
from the field of ESL/EFL to the field of ESP without any adaptation. The
reason for this is simply that assessment has been for a long time a neglected
area in ESP (Alderson 1988b). However, as a way of evaluating ESP students'
performance, this practice is quite limited in scope because it is applied at the
end of the process, when it is usually too late to improve the final product
or too difficult to eradicate deeply-rooted erroneous linguistic habits.
Process assessment, on the other hand, offers an alternative form of evalu-
ating students' performance that can be successfully implemented in ESP
Address correspondence to: Julio César Giménez, 9 de Julio 635-3 "A", (5000) Córdoba, Argentina.
[Figure: the feedback process]
2. Assessing Data
The next step was to assess all the data (input and throughput individual
and organizational variables) we had collected. This step is usually considered
the most troublesome (Berwick 1989; Nunan 1989; Seedhouse 1995).
An analysis of the information gathered through the instruments to collect
input individual variables revealed:
and throughput variables plus all the adjustments made to the input after the
throughput assessment.
At IES, in evaluating the output after the first week of class, we had to assess
the students' performance and consider adjustment of the input variables. Some
instructors were unable to interpret the throughput data collected and so
needed to make new decisions to ensure that students' performance would
be judged on the basis of learning (Krashen & Terrell 1983), and not on inappro-
priate variables.
This, in my view, is one of the main aspects that makes ESP assessment
different from assessment in other courses. That is, the learning process, choice
of materials, assessment and evaluation process become more the responsibil-
ity of both parties, teachers and students, in ESP courses than they might be
in other General English courses. Although there is a need for students'
participation and responsibility in the learning and assessing process in any
kind of English course (Biria & Hassan Tahririan 1994; Harris & McCann
1994; Murphey 1994-1995; Palacios Martinez 1993; Smith 1994), this need is
much more concrete in ESP. More often than not, ESP students possess the
subject-matter knowledge their ESP instructors lack and thus can help them
make more appropriate choices.
To sum up, input assessment gives ESP instructors the foundations to start
building, while throughput assessment indicates how well the structure is
being erected and if some bricks have not been properly laid and need adjust-
ment. Only then can the evaluation of the building activity start to take place.
But this final evaluation should consider factors such as the proper selection
of bricks and mortar, a change of materials if they fail to meet workers' needs,
and so on. Regular meetings, interviews, observations and record keeping are
considered a fundamental part of this final evaluation (Allerson & Grabe
1986).
The Impact of Process Assessment on an ESP Course
To implement process assessment at IES some changes had to be made.
These changes involved teachers and students as well as the institution itself.
ESP instructors participated in a series of seminars centered around three
aspects: new trends in ESP, classroom management and process assessment
training. The seminar activities aimed to instruct the participants in new teach-
ing methodologies, data interpretation, record keeping procedures and process
assessment implementation.
ESP instructors then held regular meetings with subject-matter teachers to
decide on topics to discuss cooperatively in their classes, and to talk about new
ways of assessing the course content. These meetings proved beneficial for both
kinds of teachers. ESP instructors were helped with doubts they had about
content, and subject-matter teachers increased their knowledge of English by
helping ESP instructors choose the class materials. Cooperative teaching has
proved highly beneficial in any course of English; for ESP courses it has meant
an effective solution to the long-standing debate on ESP instructors' subject-matter
Conclusion
Assessment in ESP courses should be a reflection of the very nature of
ESP. That is, assessment in ESP courses should aim to meet the specific
needs of a specific group of students. In other words, "If one teaches ESP
then one clearly has to test ESP..." (Alderson 1988a: p. 16). And, "Naturally,
theory shapes assessment" (Allerson & Grabe 1986: p. 163). Assessing ESP
students' performance as process rather than as product means a different
and, at times, more complex way of looking at testing. Nonetheless, it should
assure instructors of achieving a more reliable final product which would, in
REFERENCES
Alderson, J. C. (1988a). Testing English for specific purposes: how specific can
we get? In A. Hughes (Ed.), Testing English for university study (pp. 16-28).
ELT Documents: 127. Oxford: Modern English Publications.
Alderson, J. C. (1988b). Testing and its administration in ESP. In D. Chamberlain
& R. J. Baumgardner (Eds.), ESP in the classroom: Practice and evaluation (pp.
87-97). ELT Documents: 128. Oxford: Modern English Publications.
Allerson, S. & Grabe, W. (1986). Reading assessment. In F. Dubin, D. E. Eskey,
& W. Grabe (Eds.), Teaching second language reading for academic purposes
(pp. 161-181). Reading, MA: Addison-Wesley.
Aydellot, J. R. (1995). Foreign language curriculum organization. English
Teaching Forum, 33(1), 30-31.
Berwick, R. (1989). Needs assessment in language programming. In R. K.
Johnson (Ed.), The second language curriculum. Cambridge: Cambridge
University Press.
Biria, R. & Hassan Tahririan, M. (1994). The methodology factor in teaching
ESP. English for Specific Purposes, 13(1), 93-101.
Boswood, T. & Marriot, A. (1994). Ethnography for specific purposes: teaching
and training in parallel. English for Specific Purposes, 13(1), 3-21.
Ellis, M. & Johnson, C. (1994). Teaching business English. Oxford: Oxford
University Press.
Giménez, J. C. (1995). Cooperative teaching in ESP: a third view. TESOL
Matters, February/March 1995, 11.
Goldstein, L. & Liu, N-F. (1994). An integrated approach to the design of an
immersion program. TESOL Quarterly, 28(4), 705-725.
Harris, M. & McCann, P. (1994). Assessment. Oxford: Heinemann.
Hutchinson, T. & Waters, A. (1989). English for specific purposes. Cambridge:
Cambridge University Press.
Krashen, S. & Terrell, T. D. (1983). The natural approach: language acquisition
in the classroom. Oxford: Pergamon Press.
van Lier, L. (1994). Some features of a theory of practice. TESOL Journal, 28(1),
6-10.
Murphey, T. (1994-1995). Tests: learning through negotiated interaction.
TESOL Journal, 4(2), 12-16.
Nunan, D. (1989). Understanding language classrooms. Cambridge: Prentice Hall
International.
Palacios Martinez, I. M. (1993). Learning from the learner. English Teaching
Forum, 31(2), 44-47.
Prodromou, L. (1995). The backwash effect: from testing to teaching. ELT
Journal, 49(1), 13-25.
Rea-Dickins, P. & Germaine, K. (1993). Evaluation. Oxford: Oxford University
Press.
Seedhouse, P. (1995). Needs analysis and the general English classroom. ELT
Journal, 49(1), 59-65.
Smith, P. (1994). Learner self-assessment in reading comprehension: the case for
student-constructed tests. English Teaching Forum, 32(4), 41-43.
Spada, N. (1994). Classroom interaction analysis. TESOL Quarterly, 28(4),
685-688.