MJLTM

[MODERN JOURNAL OF LANGUAGE TEACHING METHODS]

THE REASONING TOWARDS USING DYNAMIC ASSESSMENT IN EFL (ENGLISH AS A FOREIGN LANGUAGE) EDUCATIONAL SYSTEM IN IRAN
Saeideh Ahangari
Department of English, Tabriz Branch, Islamic Azad University, Tabriz, Iran
S_ahangari@yahoo.com

Khodaverdi Alizadeh
Department of English, Tabriz Branch, Islamic Azad University, Tabriz, Iran
khodaverdializadeh@gmail.com
ABSTRACT
Dynamic assessment, as a leading strategy in today's teaching world, has become one of the
most influential trends in EFL educational systems worldwide. Many scholars have conducted research
projects to establish the importance of using this strategy in teaching EFL students. However, not enough
investigations have been carried out in Iran to show the advantages of using this strategy in the
educational and higher education systems of the country. The present study investigates
the importance of dynamic assessment (DA) versus static assessment and presents some reasoning as to why
it should be taken seriously by our educational system. The study is a library study drawing on the
historical information about the issue, and our goal is to emphasize the need to establish a new
trend that includes this strategy in all EFL teaching settings within our country.
Keywords: static assessment, dynamic assessment, strategy, teaching, EFL students, reasoning
1- Introduction
The term dynamic assessment (DA), in a broad sense, refers to the assessment, through an
active teaching process, of a learner's perception, learning, thinking, and problem solving. This
process is aimed at modifying an individual's cognitive functioning and observing subsequent
changes in learning and problem-solving patterns within the testing situation. The goals of
DA are to: (a) assess the capacity of the learner to grasp the principle underlying an initial
problem and to solve it, (b) assess the nature and amount of investment (teaching) that is required to
teach a learner a given rule or principle, and (c) identify the specific deficient cognitive functions (e.g.,
lack of systematic exploratory behavior) and non-intellective factors (e.g., need for mastery) that are
responsible for failure in performance, and how modifiable they are as a result of the teaching process.
In contrast, the term static test (ST) generally refers to a standardized testing procedure in which
an examiner presents items to an examinee without any attempt to intervene to change, guide, or
improve the learner's performance. On the whole, a static test usually has graduated levels of
difficulty, with the tester merely recording and scoring the responses.
The development of DA has been motivated by the inadequacy of standardized tests administered
in educational settings. This inadequacy can be summarized in the following points:
(1) Static tests do not provide crucial information about learning processes, about the deficient cognitive
functions that are responsible for learning difficulties, or about the mediational strategies that facilitate
learning.
(2) The low performance level manifested by many learners, as revealed in static tests, very frequently falls
short of revealing their learning potential, especially for those identified as coming from
disadvantaged social backgrounds or as having some sort of learning difficulty. Many students fail
static tests because of a lack of opportunities for learning experiences, cultural differences, specific
learning difficulties, or traumatic life experiences.

(3) In many static tests, learners are described mostly in relation to their relative position within their peer
group; such tests do not provide clear descriptions of the processes involved in learning or
recommendations for prescriptive teaching and remedial learning strategies.
(4) Static tests do not take account of non-intellective factors that can influence individuals' cognitive
performance, sometimes more than the "pure" cognitive factors do.
In comparison with ST, DA is designed to provide accurate information about: (a) an individual's
current learning ability and learning processes; (b) specific cognitive factors (e.g., impulsivity,
planning behavior) responsible for problem-solving ability and academic success or failure; (c)
efficient teaching strategies for the learner being studied; and (d) motivational, emotional, and
personality factors that affect cognitive processes.
2- The Nature of Dynamic Assessment
Dynamic Assessment (DA) offers a conceptual framework for teaching and assessment according to
which the goals of understanding individuals' abilities and promoting their development are not only
complementary but are in fact dialectically integrated. More specifically, DA follows Vygotsky's
proposal of the Zone of Proximal Development (ZPD) by offering learners external forms of
mediation in order to help them perform beyond their current level of independent functioning. Lev
Vygotsky's concept of the zone of proximal development (ZPD) is defined as the difference between a
child's "actual developmental level as determined by independent problem solving" and the higher
level of "potential development as determined through problem solving under adult guidance or in
collaboration with more capable peers" (Vygotsky, 1981). In a DA context, the examiner mediates the
rules and strategies for solving specific problems on an individual basis, and assesses the level of
internalization (i.e., deep understanding) of these rules and strategies as well as their transfer value to
other problems of increased complexity, novelty, and abstraction.
DA is meant to be a complement to standardized testing, not a substitute for it. It is presented as a
broad approach, not as a particular test. Different criteria of change are used in DA: pre- to post-teaching
gains, the amount and type of teaching required, and the degree of transfer of learning. The
choice to use change criteria to predict future cognitive performance (as well as the predicted outcome of
intervention programs) is based on the belief that measures of change are more closely related to
teaching processes than conventional measures of intelligence are.
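To make these change criteria concrete, the short sketch below computes a pre- to post-teaching gain and a simple mediation-adjusted index for a single learner. It is a minimal illustrative sketch only: the function name, its inputs, and the way mediation moves are weighted are our own assumptions and do not reproduce any scoring formula prescribed in the DA literature.

# Hypothetical illustration of DA "change criteria": pre- to post-teaching gain
# and a simple mediation-adjusted index. The formulas are illustrative
# assumptions, not a standard prescribed by the DA literature.

def change_profile(pre_score, post_score, max_score, mediation_moves):
    """Summarize a learner's change between the static pre-test and the
    mediated post-test.

    pre_score / post_score : raw scores before and after mediation
    max_score              : maximum attainable score on the task
    mediation_moves        : number of teaching moves (hints, prompts) needed
    """
    gain = post_score - pre_score                     # absolute gain
    relative_gain = gain / max_score                  # gain as a share of the scale
    # Crude "learning potential" style index: high post-test performance
    # reached with little mediation suggests a wide zone of proximal development.
    potential_index = (post_score / max_score) / (1 + mediation_moves)
    return {"gain": gain,
            "relative_gain": round(relative_gain, 2),
            "potential_index": round(potential_index, 2)}

# Example: a learner moves from 10/20 to 16/20 after four mediating prompts.
print(change_profile(pre_score=10, post_score=16, max_score=20, mediation_moves=4))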
Clinical experience has shown that it is most useful to use DA when standardized tests result in low
scores; when standardized tests hover around margins of adequacy in cognitive functioning; when
there are serious discrepancies between a child's test scores and academic performance; when a child
comes from a low socioeconomic or culturally or linguistically different background; or when a child
shows some emotional disturbance, personality disorder, or learning disability.
3- Dynamic vs. Static Assessment
One important distinction in assessment is between static and dynamic assessment. In static
assessment, the evaluator administers an assessment, and the individual's performance on that
assessment is determined by comparison to norms or set criteria. A static assessment assesses the
skills and knowledge the individual has gained from his or her prior experiences. It does not assess
the individual's ability to acquire skills and knowledge, since that would have happened before the
assessment was completed. Currently, commercially available assessment materials are static
assessments; they are generally either norm-referenced or criterion-referenced tests.
Norm-referenced tests compare the test-taker's performance to the performance of individuals in the
normative sample. A criterion-referenced test, however, compares the test-taker's ability to a number
of criteria assumed to have been acquired by the test-taker at that age (e.g., linguistic forms,
vocabulary, spelling skills, etc.). Norm-referenced tests are usually not valid for culturally and
linguistically diverse children because the norming sample is not representative of the individual's
background. Criterion-referenced tests, in comparison, are very susceptible to the bias of the test
developers, since the test developers are the ones determining what skills and knowledge should be
present by what age.
Dynamic assessment, in contrast to static assessment, looks at an individual's ability to acquire skills
or knowledge during the evaluation. Clinical judgment is required to administer a dynamic
assessment accurately because the evaluator is responsible for comparing the individual's performance
on the assessment tasks with the performance of typically developing children from the same speech
community. In dynamic assessment, a skill is tested, then taught, and then retested. With this
procedure, the individual is given the chance to learn the skill or knowledge being tested.
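The following sketch lays out this test-teach-retest cycle in schematic form. It is a hypothetical illustration, not an implementation of any published DA instrument: the item format, the placeholder teaching step, and the scoring are assumptions made purely to show the order of the three phases.

# Minimal, hypothetical sketch of a dynamic-assessment cycle:
# test the skill, teach it, then retest it. All task content is a placeholder.

from dataclasses import dataclass

@dataclass
class DAResult:
    pretest: int
    posttest: int

    @property
    def gain(self) -> int:
        return self.posttest - self.pretest

def administer(items, answers):
    """Static administration: score responses without any intervention."""
    return sum(1 for item, answer in zip(items, answers) if item["key"] == answer)

def teach(items):
    """Mediation phase: in a real DA session the examiner would model the
    rule, prompt, and give feedback; here we only mark items as taught."""
    for item in items:
        item["taught"] = True

def dynamic_assessment(items, pre_answers, post_answers) -> DAResult:
    pre = administer(items, pre_answers)    # 1. test
    teach(items)                            # 2. teach (mediate)
    post = administer(items, post_answers)  # 3. retest
    return DAResult(pretest=pre, posttest=post)

# Toy example with two vocabulary items.
items = [{"prompt": "plural of 'child'", "key": "children"},
         {"prompt": "past of 'go'", "key": "went"}]
result = dynamic_assessment(items, ["childs", "went"], ["children", "went"])
print(result, "gain:", result.gain)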
There are currently no commercially published dynamic language assessments available. However,
research has demonstrated the usefulness of several dynamic assessment procedures, including fast
mapping and non-word repetition tests, with fewer of the bias and validity issues that are common to
static assessment (Dollaghan & Campbell, 1998).
4- Studies regarding DA
Since the early 1960s, a range of approaches to DA has been developed in different contexts such as
Germany, Denmark, and the US. As noted by Haywood and Lidz (2007), the hallmark of the studies
that fall under the umbrella of DA is active intervention provided by examiners during the test
procedure and assessment of the examinee's response to that intervention.
Antón (2003) demonstrated the utility of a DA procedure for testing the language proficiency of advanced L2 learners.
The DA procedure included mediation to observe what learners were able to do with the language
while being exposed to dialogic teacher-learner interactions. The participants of the study were
five undergraduate learners majoring in Spanish at an urban US university. The results of the study
also showed that the inclusion of a mediation-driven DA procedure in the placement test increased
the test's ability to differentiate learners' writing and speaking skills and provided the learners with
more accurate recommendations concerning their particular academic needs.
Ableeva (2008) reported on a study focusing primarily on the effects of DA on developing L2 French
learners' listening comprehension at the university level, in which participants achieved better
comprehension with mediator guidance. This revealed that learners' abilities were more developed
than one would have expected under an unmediated condition. Lantolf and Poehner (2011) examined the
implementation of DA in a combined fourth- and fifth-grade Spanish classroom. In this study, the
classroom teacher used standardized mediation prompts to dynamically assess noun/adjective
agreement in Spanish. They incorporated dynamic assessment into daily lessons without changing
instructional objectives or curricular goals by teaching within the ZPD of learners to promote the
development of the grammatical structures in question, and found positive results in
promoting the group's ZPD.
Shrestha and Coffin (2012) probed the value of tutor mediation in the context of academic writing
development among undergraduate business studies learners in open and distance learning. The
authors concluded that DA can help to identify and respond to the areas in which learners need the
most support (in this study, managing information flow). However, the authors acknowledged that the
study was limited to a particular sociocultural context in higher education (the Open University) and
that their findings could not be generalized to other contexts.
Sadeghi and Khanahmadi (2011) investigated the role of mediated learning experience in the L2
grammar development of Iranian EFL learners. Sixty EFL learners (30 male and 30 female) in two institutes in Iran
participated in the study. The results showed that the type of assessment-based instruction
or mediation (DA-based versus NDA-based) made a significant difference in the learning of grammar
by Iranian EFL learners.
Pishghadam, Barabadi, and Kamrood (2011) examined the effectiveness of using a computerized
dynamic reading comprehension test (CDRT) with Iranian EFL learners of moderate
proficiency. Findings showed that providing mediation in the form of hints significantly increased the
learners' scores and consequently their reading comprehension. DA seemed to be a bigger help to
weaker learners than to stronger ones.
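The sketch below shows, in purely hypothetical form, how graduated hints might be administered and scored in a computerized dynamic test: a learner who needs fewer hints earns more credit for an item. The hint texts, weights, and item format are our assumptions and are not taken from the actual CDRT used by Pishghadam, Barabadi, and Kamrood (2011).

# Hypothetical sketch of graduated-hint scoring in a computerized dynamic
# reading test: the fewer hints a learner needs, the more credit the item earns.
# Hint texts, weights, and the item format are illustrative assumptions only.

def score_item(key: str, responses: list, hints: list) -> float:
    """Present an item, offering one hint after each wrong response.

    responses : the learner's successive answers (simulated here)
    hints     : graduated hints, from most implicit to most explicit
    Returns a weighted score between 0 and 1.
    """
    max_attempts = len(hints) + 1
    for attempt, response in enumerate(responses[:max_attempts]):
        if response == key:
            # Full credit with no hints, less credit the more hints were needed.
            return 1.0 - attempt / max_attempts
        if attempt < len(hints):
            print(f"Hint {attempt + 1}: {hints[attempt]}")
    return 0.0

# Toy reading item: the learner succeeds after one hint.
hints = ["Look at the first paragraph again.",
         "Focus on the sentence containing 'however'.",
         "The answer restates the contrast introduced by 'however'."]
print(score_item(key="B", responses=["C", "B"], hints=hints))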
Naeini and Duvali (2012) studied improvements in English Language Teaching (ELT) university
learners' reading comprehension performance by applying the mediations of a dynamic assessment
approach to instruction and assessment. The descriptive and analytic analyses of the results revealed
dramatic and measurable progress in the participants' reading comprehension performance.
DA can be traced back to Vygotsky (1981, 1986), whose work stresses the social environment as a facilitator of
the learning process (Karpov & Haywood 1998; Kozulin & Garb 2002). DA has gained momentum in
research (e.g., Leung 2007; Poehner and van Compernolle 2011; Rea-Dickins 2006; Tzuriel 2011) and
has also been applied to classroom-based assessment (Ableeva 2008; Ableeva & Lantolf 2011;
Sternberg & Grigorenko 2002). In DA, teaching and testing are intertwined into a joint activity which
targets the activation of the learner's cognitive and meta-cognitive processes (Ableeva & Lantolf 2011;
Tzuriel 2011). Research (e.g., Gass 1997; Lidz 2002; Swain 2001) has shown that learners become
co-constructors of meaning in collective joint activities where knowledge and meaning can be negotiated
and mediated. This negotiation is context-bound.
Mediation, the zone of proximal development (ZPD), contingency, and scaffolding are cornerstones of
DA. Vygotsky's theory of learning stresses mediation in that it can instruct learners in how to use
their cognitive and meta-cognitive strategies, for instance in a problem-solving activity. Gibbons
(2003) defines the ZPD as the cognitive gap between what learners can do unaided and what they
can do in collaboration with a more competent other. To this end, learners can only perform
successfully in the presence of another participant, such as a teacher. Contingency consists of the
assistance required by the learner on the basis of moment-to-moment understanding (Gibbons
2003); i.e., teachers modulate the kind of support based on the learner's reaction to, and attitudes
towards, this support. Scaffolding, in turn, mediates learners' acquisition of new strategies so that they become
able to finish the task independently (Kozulin & Garb 2002). An awareness of such strategies can be
conducive to success in language learning and assessment. In this regard, Vandergrift et al. (2006)
note that awareness of listening strategies can have a positive influence on language learners'
listening development and, by extension, on accessing test items more easily. Such awareness is a
cornerstone in assessing language learning dynamically. Adherence to DA in both teaching and testing
depends on teaching experience, experience with language, motivation, and views of language
and language learning.
The duality between dynamic and static assessment can in fact be blended, with the goal of
forming a comprehensive view of the language learning ability of the test-takers. Though they might appear
complementary, static and dynamic assessment have methodological differences.
Whereas static assessment considers the learner's abilities as already matured, i.e., fixed and stable
across time (Leung 2007), in DA such abilities are treated as malleable and flexible (Sternberg & Grigorenko,
2002). In addition, while scores in SA may be praised for their objectivity, they nevertheless fail to
reveal much about the learner's cognitive processes.
5- EFL teachers in Iran and their role in DA administration
Language learning processes in Iran largely consist of teachers doing most of the talking in class. Based on
interviews with teachers and cross-sectional visits to some EFL classes in Iran, it was found that
most of the teachers' attention during classroom interaction was geared towards the treatment of
students' grammatical errors, even in tasks calling for greater attention to communication, discourse,
and sociolinguistic appropriateness. In Iranian EFL classes, learners study to pass exams which
are still informed by structuralist and behaviorist views of language and language learning. This
view of teaching held by most Iranian teachers is also reflected in testing. Teachers are not trained
in how to carry out classroom-based assessment such as DA, nor are they exposed to developing
effective teaching strategies. This remains the current situation: they learn test design out of teaching
experience.
In this regard, in most Iranian EFL classrooms the number of students exceeds the standards, and
teachers still stick to the traditional way of assessing learners by one-shot multiple-choice or essay-like
exams; in fact, teachers are not trained enough to practice DA in this particular EFL context. Also,
according to the parameter of practicality, a method should be applicable in real situations; otherwise,
the practice-theory relationship cannot be realized. Kumaravadivelu (2003) argues against the existing dichotomous
distinction, perceived in applied linguistics, in which the teacher is spoon-fed with whatever
knowledge and theory theorists produce. Regarding the role of teachers in
Iranian EFL classrooms, the dominant pluralistic society of Iran often influences the educational
contexts of EFL, which leads to ignoring teachers' sense of plausibility (Prabhu, 1990) and dictating
some pre-determined set of materials and methods to be implemented in classrooms. However, this
restricted view of methodology is limited mostly to school classrooms; in other language institutes,
teachers have more liberty to decide on the methodology and materials. On the basis of the
principle of possibility, the authors encourage critical thinking by teachers and students to question the
status quo that restrains them regarding what to teach, how to teach, etc. This parameter, moreover,
highlights the importance of the experience they bring to the classroom; their values and background,
including culture, education, language, race, and other variables, directly or indirectly influence the
content and character of classroom input and interaction (Benesch, 2001, cited in Kumaravadivelu,
2006).

As for Iranian EFL classrooms, the trend of critical thinking and of giving teachers a voice in questioning
the current methods of assessment and teaching is gradually gathering momentum; but compared to
the global tempo, in Iranian EFL contexts it remains relatively restrained and slow. In fact, EFL teachers in
Iran cannot bring about a radical change in the existing traditional static testing that dominates educational
settings. Moreover, there is no tendency in educational settings to keep up with the pace of the
paradigm shift in ELT and to replace the present system with DA or any other alternative assessment
tools.
6- Implications
DA may be beneficial for learners who are mediated to activate their cognitive and meta-cognitive
strategies and to notice things. In classical standardized testing, however, such mediation is not offered.
DA may be called into question where validity and reliability are concerned. These two notions have been extensively
addressed in dealing with psychometric standardized testing. DA researchers, however, have not
managed to find strong arguments for validity and reliability, with the exception of Lantolf (2009) and
Poehner (2011). Lantolf (2009) believed that DA makes a strong claim with regard to predictive
validity. DA concentrates on moving the learner towards better levels of linguistic attainment. Since the
use of effective dynamic instruction leads test-takers to perform better in the future, proponents
of DA (e.g., Lantolf & Poehner 2009) remark that this future success does in fact echo predictive
validity.
This study addressed a need to examine and improve current assessments of language learning. It
has theoretical, pedagogical, and methodological implications which could be addressed in future
research. First, regarding the theoretical implications, results of DA bring to light the fact that there should
be an interface between language learning and language testing. This interface has been addressed in
research (e.g., Alderson 2005; Bachman 1989; Bachman & Cohen 1998; Douglas & Selinker 1985). This
link integrates instruction and assessment in class to help learners meet their needs and reach the
stage where they can perform independently. DA is not an alternative to classroom assessment, nor
can it replace other types of assessment. Rather, it is integrated with classroom instruction to help
test-takers overcome their testing difficulties by, for instance, developing their cognitive and
meta-cognitive processes. The findings of DA interactions can be considered additional contributions to the
link between assessment and learning.
Second, the pedagogical implications address the different steps through which teaching and
testing can be improved. In this regard, assessing learners in a progressive dynamic test can help
locate the areas of weakness in the language program or in the learners' cognitive and meta-cognitive
strategies. In addition, drawing the test-takers' attention to notice things and encouraging them
to overcome their difficulties are in fact at the heart of any learning process. Research on DA and on
learning in general highlights this endeavor. Despite the threats to the validity and reliability of the test,
assessing learners in a dynamic way in Iran may be practical and useful given the tremendous
language problems these learners have. In terms of authenticity, DA echoes the authentic tasks and
activities that learners are supposed to meet in everyday life, unlike psychometric standardized
tests. In short, implementing DA has the goal of changing learners' behavior and their perception of
the different courses undertaken at the university level in Iran.
Third, the methodological implications call for the importance of using qualitative instruments (interaction in
the dynamic test and interviews) and quantitative instruments (test scores). As in other studies (Buck
1994), the use of qualitative and quantitative methods plays a crucial role in assessment. The
feedback teachers provide about the nature of problems has immediate implications for teaching as
well as for testing. In the light of this feedback, teachers can address and remedy these
shortcomings in teaching and, therefore, in testing.
7- Conclusion
Dominating the field of language testing, static assessment has traditionally been used to determine whether some
pre-determined achievement level has been reached. Traditional static assessment was limited because it
did not directly aim to stimulate learners into becoming independent knowledge constructors and
problem solvers. Unlike static assessment, DA gives the language teacher a chance to gauge students'
understanding and ability level appropriately and to determine how to promote their development.
To put it another way, by engaging in DA activity, teachers may be able to challenge
individuals to reach higher levels of functioning (Poehner, 2005, cited in Naeini and Duvali, 2012). As
a matter of fact, DA, with its monistic view toward teaching and testing, not only assesses the learners'
abilities but also provides them with opportunities for learning and development. This in turn has
positive results both for teachers and learners; therefore, the implications can be manifold.
Mixing assessment and instruction can be beneficial for EFL learners. Process-oriented
dynamic assessment can improve the learning of EFL learners. The researchers believe that adopting
DA in EFL classes leads to more involvement of learners in the process of learning. It also increases
learners' motivation and reduces test-taking anxiety. On the other hand, teachers can exploit
DA to gauge learners' understanding and awareness and to diagnose the areas where learners need
more help. Teachers may be able to challenge learners to reach higher levels of functioning by
engaging in DA. The current study may offer suggestions to EFL test developers as well as those
involved in educational administration. EFL teachers, syllabus designers, curriculum planners, and
materials developers, and also learners interested in learning EFL, can benefit from the study.
REFERENCES
Ableeva, R. (2008). The effects of dynamic assessment on L2 listening comprehension. In J. P. Lantolf & M. E. Poehner (Eds.), Sociocultural theory and the teaching of second languages (pp. 57-86). London: Equinox.
Ableeva, R., & Lantolf, J. P. (2011). Mediated dialogue and the microgenesis of second language listening comprehension. Assessment in Education, 18, 133-149.
Alderson, J. C. (2005). Diagnosing foreign language proficiency: The interfaces between learning and assessment. London: Continuum.
Antón, M. (2003). Dynamic assessment of advanced foreign language learners. Paper presented at the American Association of Applied Linguistics, Washington, D.C., March 2003.
Bachman, L. F., & Cohen, A. D. (1998). Language testing-SLA interfaces: An update. In L. F. Bachman & A. D. Cohen (Eds.), Interfaces between second language acquisition and language testing research. Cambridge: Cambridge University Press.
Buck, G. (1994). The appropriacy of psychometric measurement models for testing second language listening comprehension. Language Testing, 11(2), 145-170.
Dollaghan, C., & Campbell, T. F. (1998). Nonword repetition and child language impairment. Journal of Speech, Language, and Hearing Research, 41, 1136-1146.
Douglas, D., & Selinker, L. (1985). Principles for language tests within the 'discourse domains' theory of interlanguage: Research, test construction and interpretation. Language Testing, 2, 205-226.
Gass, S. (1997). Input, interaction, and the second language learner. Mahwah, NJ: Lawrence Erlbaum.
Gibbons, P. (2003). Mediating language learning: Teacher interactions with ESL students in a content-based classroom. TESOL Quarterly, 37(2), 247-273.
Karpov, Y. V., & Haywood, H. C. (1998). Two ways to elaborate Vygotsky's concept of mediation: Implications for instruction. American Psychologist, 53(1), 27-36.
Kozulin, A., & Garb, E. (2002). Dynamic assessment of EFL text comprehension. School Psychology International, 23(1), 112-127.
Kumaravadivelu, B. (2003). Critical language pedagogy: A postmethod perspective on English language teaching. World Englishes, 22(4), 539-550. http://dx.doi.org/10.1111/j.1467-971X.2003.00317.x
Kumaravadivelu, B. (2006). TESOL methods: Changing tracks, challenging trends. TESOL Quarterly, 40, 59-81.
Lantolf, J. P. (2009). Dynamic assessment: The dialectic integration of instruction and assessment. Language Teaching, 42(3), 355-368.
Lantolf, J. P., & Poehner, M. E. (2009). The artificial development of second language ability: A sociocultural approach. In W. C. Ritchie & T. K. Bhatia (Eds.), The new handbook of second language acquisition (pp. 138-159). Bingley, UK: Emerald Press.
Leung, C. (2007). Dynamic assessment: Assessment for and as teaching. Language Assessment Quarterly, 4(3), 257-278.
Lidz, C. S. (2002). Mediated learning experiences (MLE) as a basis for an alternative approach to assessment. School Psychology International, 23(1), 68-84.
Lidz, C. S., & Peña, E. D. (2009). Response to intervention and dynamic assessment: Do we just appear to be speaking the same language? Seminars in Speech and Language, 30(2), 121-133.
Naeini, J., & Duvali, E. (2012). Dynamic assessment and the impact on English language learners' reading comprehension performance. Language Testing in Asia, 2(2).
Pishghadam, R., Barabadi, E., & Kamrood, A. M. (2011). The differing effect of computerized dynamic assessment of L2 reading comprehension on high and low achievers. Journal of Language Teaching and Research, 2(6), 1353-1358.
Poehner, M. E., & Lantolf, J. P. (2005). Dynamic assessment in the language classroom. Language Teaching Research, 9, 233-265.
Prabhu, N. S. (1990). There is no best method - why? TESOL Quarterly, 24(2), 161-176.
Rea-Dickins, P. (2006). Currents and eddies in the discourse of assessment: A learning-focused interpretation. International Journal of Applied Linguistics, 16, 164-189.
Sadeghi, K., & Khanahmadi, F. (2011). Dynamic assessment of L2 grammar of Iranian EFL learners: The role of mediated learning experience. International Journal of Academic Research, 3(2), 931-935.
Shrestha, P., & Coffin, C. (2012). Dynamic assessment, tutor mediation and academic writing development. Assessing Writing, 17(1), 55-70. doi:10.1016/j.asw.2011.11.003
Sternberg, R. J., & Grigorenko, E. L. (2002). Dynamic testing: The nature and measurement of learning potential. Cambridge: Cambridge University Press.
Swain, M. (2001). Examining dialogue: Another approach to content specification and to validating inferences drawn from test scores. Language Testing, 18(3), 275-282.
Tzuriel, D. (2011). Revealing the effects of cognitive education programs through dynamic assessment. Assessment in Education: Principles, Policy & Practice, 18, 113-131.
Vandergrift, L., Goh, C. C. M., Mareschal, C. J., & Tafaghodtari, M. H. (2006). The metacognitive awareness listening questionnaire: Development and validation. Language Learning, 56(3), 431-462.
Vygotsky, L. (1981). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Vygotsky, L. (1986). Thought and language. Cambridge, MA: MIT Press.
