
2011 IEEE International Conference on Information and Education Technology (ICIET 2011)

Customized Continuous Assessment online version, using Scientific WorkPlace

Plinio del Carmen Teherán Sermeño


Department of Physics, Applied Physics Group
Universidad Nacional de Colombia
Bogotá, Colombia
e-mail: pdteherans@bt.unal.edu.co
Abstract—This article presents our findings on the construction of reliable and robust kinematics exams, built with the Scientific WorkPlace software, to be taken online. We propose a primary criterion of invulnerability for tests administered online repeatedly or simultaneously. We found that a database of at least 200 questions sufficiently guarantees the invulnerability of a test in a typical population of students. This pioneering work turns the concept of Customized Continuous Assessment for massive university-level physics courses into a reality. The developed tool enables both the evaluator and the evaluated to identify, in a timely manner, the errors and faults present in the teaching-learning process.

Keywords—Customized Continuous Assessment; online assessment; question database; massive physics courses; Scientific WorkPlace

Rafael David Castro Torres, Julio César León Lúquez


Department of Physics, Lev Vygotsky Group
Universidad Nacional de Colombia
Bogotá, Colombia
e-mail: rdcastrot@unal.edu.co, jcleon@bt.unal.edu.co

I. INTRODUCTION

The motivation for this work is the need for a learning evaluation system that allows the student to find his/her own mistakes [4-6] (provided that he/she makes them) almost immediately after the evaluation. This first attempt demonstrates an evaluation model for a course in mechanics in which only the kinematics portion has been developed. To be more specific, questions were asked about the following topics [3]:
- Displacement, time and average speed
- Instantaneous velocity
- Average and instantaneous acceleration
- Motion with constant acceleration
- Free fall
- Position and velocity vectors
- Acceleration vector
- Projectile motion
- Relative velocity

The database was constructed with the program Scientific WorkPlace using the Exam Builder tool, which allows assessments to be created with different types of questions:
- Multiple choice with a single answer
- Multiple choice with multiple answers
- Open question

Additionally, the program grades the exam automatically if it is taken online. If the exam is presented in printed form, the teacher can obtain the solutions of the proposed exam questions, reducing the grading time, which we know is the most tedious part of teaching from the teacher's point of view. Having reduced the grading time, this method of evaluation allows the teacher to optimize his/her time, spending more of it on teaching, research and other endeavors. The student does not have to wait long to see his/her score and, most importantly, how his/her learning is improving.

A database of 200 questions was created for the formulation of the exam (this amount is justified later). The idea is that from this database only five questions are randomly selected to create a separate test for each student. This work is performed by the computer after a few brief instructions from the professor proposing the test. This random process eliminates the teacher's inclination towards certain types of exercises and topics of his/her preference. Another great advantage of this approach is that, with so many questions to choose from, a student can retake a test in which he/she performed poorly without the teacher having to devise a new one: the professor simply instructs the program to create a new test, a process that takes a few seconds, as does grading.

We must remember that current guidelines in education emphasize the assessment of learning, especially when it comes to identifying the capabilities and difficulties of "every" student, and that is precisely one of the accomplishments of this tool [5]. This work aims to be the basis of an assessment system that can be implemented in all physics courses (it could also be very useful in mathematics courses) offered by the National University of Colombia, especially for the courses that the university offers remotely (via video conference), since it does not require the presence of the teacher: everything can be managed online [4]. Installing these question databases on the servers of the University was not carried out, for reasons such as program compatibility, because it is not an essential part of the problem we want to solve.

To end this introduction, we describe the most special function of the tool. The program is able to assign random values to the variables involved in each problem, which allows the same problem to be formulated in many ways.
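To make the random assembly concrete, here is a minimal Python sketch (independent of Scientific WorkPlace, which performs this step internally): it draws a five-question test for each student from a 200-question bank and counts how many pairs of students end up sharing at least one question. The bank size, class size and test length are the figures assumed in this work; the question identifiers and function names are purely illustrative.

```python
import random
from itertools import combinations

BANK_SIZE = 200   # questions in the database
TEST_SIZE = 5     # questions per exam
STUDENTS = 40     # typical course size

def build_tests(seed=0):
    """Assign each student a random 5-question test drawn from the bank."""
    rng = random.Random(seed)
    bank = list(range(BANK_SIZE))          # question identifiers 0..199
    return {s: set(rng.sample(bank, TEST_SIZE)) for s in range(STUDENTS)}

def shared_pairs(tests):
    """Count pairs of students whose tests have at least one common question."""
    return sum(1 for a, b in combinations(tests, 2) if tests[a] & tests[b])

if __name__ == "__main__":
    tests = build_tests()
    print(f"pairs sharing a question: {shared_pairs(tests)} of "
          f"{STUDENTS * (STUDENTS - 1) // 2}")
```

In a typical run most pairs of students share no question at all, which is the behaviour the pool-sizing argument below aims for.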



Returning to the random variables: in a free-fall problem, for example, we can change the initial height and the initial velocity of the object, which gives many possible values for the fall time. The 200 questions therefore become, in reality, a "much higher" number. We do not have the exact figure (as will be seen later, it is not easy to calculate), but we are confident that more than 3,000 different statements can be generated from this database.

It is appropriate at this point to clarify the reason for the 200 questions. If a class has m students and we want a test with n questions, a good way to reduce the chance that a student and his/her adjacent peer have even a single question in common is to take, as a first approximation, a total of m x n questions. For our specific case, we took a typical course of 40 students and a test of five questions, which is the reason for having 200 questions.

II. DEVELOPMENT OF THE PROBLEM

The questions were not devised by us, since that was not the purpose of the work; they were taken from the book University Physics by Francis Sears [2], who is well known in our field. The responses were taken from the answer key of the book, although some answers had to be modified, either for lack of clarity in the explanation or because they were simply erroneous (very few cases). Each of the following types of questions provides a wide variety of choices.

A. Multiple Choice with a Single Answer

In this type of question the application allows the response options to appear in fixed or random positions, and each option can be of one of four types [1]:
- Text or mathematical expressions
- Numeric values (generated by evaluating functions)
- Graphs
- Images

Answers given as text or mathematical expressions are limited in that the response options are always the same for every student. Numeric-value answers have the advantage that the value can be generated by an algorithm from input data that is either fixed or generated at random within restrictions given by the creator of the question during its setup. For graph-type answers, the application offers the option of calling up a graph from a .bmp file; this option has the limitation that the choices are fixed, so it would only be useful for a test with a single question. Another option is function graphs (2D or 3D) that the computer itself generates from a pre-defined function to plot, again with the advantage that the function data can be taken at random; its limitation is that it demands a powerful computer in order to generate the plots quickly. Fig. 1 shows an example of randomly generated variables.
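As an illustration of the "numeric values generated by evaluating functions" option, the sketch below mimics the free-fall example in plain Python. It is not Exam Builder's internal mechanism; the ranges chosen for the initial height and speed are arbitrary assumptions made only for the example.

```python
import math
import random

G = 9.8  # m/s^2, magnitude of the gravitational acceleration

def free_fall_question(rng):
    """Generate one instance of a free-fall question with randomized data.

    The initial height h and initial downward speed v0 are drawn within
    restrictions set by the question's author; the numeric answer (time to
    reach the ground) is computed from the same randomized data.
    """
    h = rng.uniform(5.0, 50.0)    # initial height in metres
    v0 = rng.uniform(0.0, 10.0)   # initial downward speed in m/s
    # h = v0*t + (1/2)*G*t^2  ->  take the positive root of the quadratic
    t = (-v0 + math.sqrt(v0**2 + 2 * G * h)) / G
    statement = (f"An object is thrown downward at {v0:.1f} m/s from a height "
                 f"of {h:.1f} m. How long does it take to reach the ground?")
    return statement, round(t, 2)

if __name__ == "__main__":
    rng = random.Random(1)
    for _ in range(3):   # three different statements of the same problem
        text, answer = free_fall_question(rng)
        print(text, "->", answer, "s")
```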

B. Multiple Choice with Multiple Answers

This type of question has the same response options as the multiple choice with a single answer. The novelty is that we can assign a "weight" to the responses. For example, if the test gives the student ten response options of which only two are correct, the student could select all ten responses and, of course, would have selected the correct ones. However, the program can be instructed to deduct for incorrect answers, for instance assigning a value of 1 to correct answers and -1 to incorrect answers [1]. In that case, if the student were to check all the response options, he/she would receive a score of -6 (a small scoring sketch is given at the end of this section). Fig. 2 shows an example.

C. Short-Answer Questions

In this type of question the answer should always be an algebraic expression. Great care is needed when preparing the answer, because the program has trouble recognizing answers that include roots, trigonometric functions, logarithms and exponents. The application works very well with addition and multiplication; the following examples clarify this:
- If the answer to an exercise is a + 2b and the student responds 3b + a - b, the program marks the answer as correct.
- If the answer is cos x and the student responds with the same expression, the program does not recognize it as valid.
- If the answer is 2√(gh) and the student responds √(4gh), the program does not recognize the response as valid.

From these examples it is clear that the program simplifies expressions involving the four basic operations very well, but in other cases it has several limitations. When it comes to performing numerical calculations we have not found any error by the program; where wrong answers appear, they are due to programming errors by the user, for example taking the root of a negative number, evaluating sin⁻¹ x or cos⁻¹ x for x ∈ (−∞, −1) ∪ (1, ∞), dividing by zero, and other such errors. With the application it is also possible to divide a question with several parts into sub-questions, with the limitation that multiple-choice and short-answer sub-questions cannot be combined.
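The ±1 weighting scheme of Section II.B is easy to reproduce outside the tool. The following Python sketch (not Exam Builder's grading engine; the option labels and function name are illustrative) scores a ten-option question with two correct answers and reproduces the score of -6 obtained when a student selects everything.

```python
def score_multiple_answer(selected, correct, weight_correct=1, weight_wrong=-1):
    """Score a multiple-answer question, deducting points for wrong selections."""
    points = 0
    for option in selected:
        points += weight_correct if option in correct else weight_wrong
    return points

if __name__ == "__main__":
    correct = {2, 7}                  # two correct options out of ten
    everything = set(range(10))       # the student ticks all ten boxes
    print(score_multiple_answer(everything, correct))   # -> -6 (2 correct - 8 wrong)
    print(score_multiple_answer({2, 7}, correct))       # -> 2
```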



III. ADVANTAGES AND DISADVANTAGES

In assessment there is a tendency to reproduce forms of evaluation that follow certain parameters within a reasonable cost-benefit ratio. The proposal presented in this work requires a large initial investment, especially the time spent constructing the database, but the benefit in the medium and long term is exceedingly large. Our assessment proposal has the following advantages and disadvantages [7-8]:

A. Advantages
- Reduced emphasis on memorization.
- Emphasis on the ability to apply knowledge to real situations.
- Quick evaluation (in fact, it is instantaneous).
- Immediate feedback.
- Flexibility in the duration, time and place of presentation of the tests.
- It allows students to develop the effective use of information sources.
- The test is designed once and then only improved, never rebuilt from scratch.
- Comprehensive coverage of content.
- It allows data to be stored for further study of the systematic errors committed by students.

B. Disadvantages
- Long preparation time.
- High probability of errors in the preparation.
- Originality skills cannot be evaluated.

As can be seen, our assessment proposal presents many more advantages than disadvantages. However, we must not ignore the fact that the time required to prepare the database is a substantial limitation. The work presented here covers only 200 questions that were already well formulated and structured in a typical physics textbook for first-semester students; it evaluates only one of the themes of a mechanics course and was prepared in approximately three months, obviously with the handicap of starting from scratch. Remember that this is the first time the Exam Builder tool of Scientific WorkPlace has been used in our field (Physics at the National University of Colombia). We estimate that the average time needed to build the database for a particular chapter is closer to one month, so a full course would take approximately one year.

IV. CONCLUSIONS
- We constructed a database of 200 questions on kinematics using the Exam Builder tool of the Scientific WorkPlace software.
- We created a tool for the assessment of learning in kinematics that can immediately provide feedback to the participants in the teaching-learning process.
- It became evident that creating robust questions for online application requires a considerable amount of time in the virtualization stage.
- Preliminary tests indicate that administering, grading, giving feedback on and processing the information from the exams represents a significant time saving for both teacher and learner.
- A primary criterion of invulnerability of exams for online or repetitive application was developed, involving the number of students and the number of questions on each test.

ACKNOWLEDGMENT

The authors acknowledge the support received from the Universidad Nacional de Colombia for the completion of these educational experiences under project HERMES No. 8894, "Evaluación Continua Personalizada aplicada al curso Fundamentos de Física - Fase Conceptual".

REFERENCES
[1] User's Manual, Exam Builder, Scientific WorkPlace v5.5.
[2] http://iopireland.org/activity/education/Science_on_Stage/Physics_on_Stage_1/page_19123.html
[3] F. Sears, Física Universitaria, 11th ed., Vol. 1, Ch. 2 and 3.
[4] P. Teherán and J.C. León, "Física conceptual en versión b-learning," XIV Congreso Internacional de Tecnología y Educación a Distancia, San José, Costa Rica, 2008.
[5] P. Teherán and C.A. Cuesta, "Valoración de cambios conceptuales en estudiantes de Física I" (Evaluation of Conceptual Changes of Physics I Students), 2005. Available at http://dis.unal.edu.co/~hernandg/esip/
[6] V. García Hoz, ¿Qué es la educación personalizada? (What is Personalized Education?), 2nd ed. Buenos Aires: Docencia, 1981, pp. 46-50.
[7] http://tecnologiaedu.us.es/edutec/paginas/92.html
[8] P. Teherán, J.G. Carriazo and J.C. León, "Blended Learning Applied to an Introductory Course on Conceptual Physics," iJOE, vol. 6, no. 3, pp. 50-53, August 2010. doi:10.3991/ijoe.v6i3.1303
[9] http://www.online-journals.org/index.php/i-joe/issue/current

Figure 1. Example of variables generated randomly.



Figure 2. Example of negative scores in multiple choice with multiple answer questions.

