The terms clinical reasoning, clinical judgment, problem solving, decision making and critical thinking are often used interchangeably. We use the term clinical reasoning to describe the process by which clinicians collect cues, process the information, come to an understanding of a patient problem or situation, plan and implement interventions, evaluate outcomes, and reflect on and learn from the process (1,2,3). The clinical reasoning process depends upon a critical thinking "disposition" (4) and is influenced by a person's attitude, philosophical perspective and preconceptions. Clinical reasoning is not a linear process but can be conceptualized as a series or spiral of linked and ongoing clinical encounters (5). Clinical judgment is the ability to make appropriate decisions in uncertain or ambiguous situations (6). It is this capacity to reason in contexts of uncertainty and to solve ill-defined problems that defines professional competence (7). The knowledge needed to reason successfully in these contexts is called professional knowledge (8).

In addition to acquiring a large body of knowledge, medical students must develop clinical reasoning skills. Clinical reasoning has mainly been studied in the context of medical diagnosis, which is considered to involve a problem-solving process. Problem-based learning (PBL) was developed to facilitate the student's acquisition, organization and retrieval of knowledge (9,10). One aim of PBL is to help the student integrate new information into a rich and connected knowledge network (illness scripts) that can be activated later. Through such mechanisms, PBL promotes clinical reasoning skills (11). Assessing the development of clinical reasoning in medical education has typically been difficult, expensive, and time-consuming.
The most common form of assessment in medical education, the multiple-choice question, works well when there is a single predetermined right answer, but is not appropriate for capturing the shades of uncertainty inherent in a clinical scenario (12). An assessment like the Triple Jump, an essay-based examination that evaluates clinical reasoning, is resource-intensive both to complete and to evaluate, and suffers from poor inter-rater reliability (13).

A relatively new assessment tool, the Script Concordance Test (SCT), presents another option. It is based on the script theory of medical decision-making popularized by Schmidt (14). The SCT, which stems from this cognitive theory of the development of clinical expertise, places examinees in written but authentic clinical situations in which they have to interpret data to make decisions. The test belongs to the class of written simulations, which can be either paper- or computer-based, and can be used in undergraduate, postgraduate, or continuing medical education (15). The adaptation of cognitive psychology's script theory to the characteristics of reasoning in the professions provides a promising way to build a theory-based assessment tool (16,17). A script is a goal-directed knowledge structure adapted to perform tasks efficiently (18). Scripts begin to appear when students are confronted with real professional tasks, and are then developed and refined throughout one's professional life (17). The theory implies that, to make sense of a situation and act on it adequately, professionals activate scripts relevant to that situation. These structures are used to actively process information in order to confirm or eliminate hypotheses or management options (16). According to this theory, reasoning proceeds through a series of qualitative judgments. Each of these judgments can be measured and compared to those of a reference panel of experienced practitioners.
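The comparison with a reference panel described above is usually operationalized through aggregate scoring: the credit awarded for each answer on an item is proportional to the number of panel members who chose it, normalized by the count of the modal (most frequent) answer, so the majority answer earns full credit and minority answers earn partial credit. The sketch below is a minimal illustration of this scoring arithmetic; the function names, the two items, and the panel responses are hypothetical examples, not data from any published test.

```python
from collections import Counter

def sct_item_credits(panel_answers):
    """Derive the credit for each Likert option on one SCT item.

    Credit = (number of panelists choosing the option) / (count of the
    modal option). The option favored by most panelists earns 1.0;
    options chosen by fewer panelists earn proportional partial credit.
    """
    counts = Counter(panel_answers)
    modal_count = max(counts.values())
    return {option: n / modal_count for option, n in counts.items()}

def score_examinee(responses, panel):
    """Sum the per-item credits earned by an examinee's answers."""
    total = 0.0
    for item, answer in responses.items():
        credits = sct_item_credits(panel[item])
        total += credits.get(answer, 0.0)  # unchosen options earn 0
    return total

# Hypothetical panel of 10 experts answering two items on a -2..+2 scale
panel = {
    "item1": [1, 1, 1, 1, 1, 0, 0, 0, 2, 2],     # modal answer +1 (5 votes)
    "item2": [-1, -1, -1, -1, 0, 0, 0, 0, 0, 1],  # modal answer 0 (5 votes)
}

examinee = {"item1": 0, "item2": -1}  # two partial-credit answers
print(round(score_examinee(examinee, panel), 2))  # 3/5 + 4/5 = 1.4
```

The point of the normalization is that an examinee is not penalized for disagreeing with a minority of the panel: any answer that some experts also gave earns a fraction of the credit, which is what lets the test reward defensible judgments in genuinely uncertain situations.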
This provides a method for assessing reasoning on ill-defined problems and in contexts of uncertainty; it is named the script concordance approach (16). The SCT has practical relevance for clinical educators. First, it introduces an evidence-based tool into our evaluation materials for the assessment of clinical decision making. Second, the SCT has great potential as a tool for the formative evaluation of trainees, from medical students at the clerkship stage to residents at all levels. Armed with the results of an SCT performance, the educator could sit down with the trainee and, by questioning, determine what the trainee was thinking when an incorrect response was chosen. This approach helps clarify why a particular choice was made and, even more importantly, provides the teacher with information that can be used to help the trainee improve their clinical decision-making capabilities (16).

A significant part of professional competence relies on the capacity to deal with uncertainty and to solve ill-defined problems. Traditional written assessment tools, such as rich-context multiple-choice questions, properly and reliably test the ability of students to apply well-known solutions to well-defined problems. However, assessment of reasoning competence should also include tools that measure the ability to rationally solve ill-defined problems (8). The SCT was developed to expand the material assessed in clinical reasoning to include ill-defined problems. It is designed to be added to existing tools, not substituted for them (19). As a PBL school, we need an assessment method that tests clinical reasoning as an independent activity. In Phase 1 we have the Triple Jump examination for this purpose, but in the clinical clerkship years clinical reasoning is assessed only through oral or clinical examinations, and a dedicated tool for assessing this important skill is still needed.