
A peer-reviewed electronic journal. ISSN 1531-7714



Copyright 2001, PAREonline.net. Copyright is retained by the first or sole author, who grants right of first publication to Practical Assessment, Research & Evaluation. Permission is granted to distribute this article for nonprofit, educational purposes if it is copied in its entirety and the journal is credited. PARE has the right to authorize third party reproduction of this article in print, electronic and database forms. Mertler, Craig A. (2001). Designing scoring rubrics for your classroom. Practical Assessment, Research & Evaluation, 7(25). Retrieved February 27, 2013 from http://PAREonline.net/getvn.asp?v=7&n=25 . This paper has been viewed 324,307 times since 12/11/2001.

Designing Scoring Rubrics for Your Classroom


Craig A. Mertler
Bowling Green State University

Rubrics are rating scales, as opposed to checklists, that are used with performance assessments. They are formally defined as scoring guides, consisting of specific pre-established performance criteria, used in evaluating student work on performance assessments. Rubrics are typically the specific form of scoring instrument used when evaluating student performances or products resulting from a performance task. There are two types of rubrics: holistic and analytic (see Figure 1). A holistic rubric requires the teacher to score the overall process or product as a whole, without judging the component parts separately (Nitko, 2001). In contrast, with an analytic rubric, the teacher scores separate, individual parts of the product or performance first, then sums the individual scores to obtain a total score (Moskal, 2000; Nitko, 2001).

Figure 1: Types of scoring instruments for performance assessments

Holistic rubrics are customarily used when errors in some part of the process can be tolerated provided the overall quality is high (Chase, 1999). Nitko (2001) further states that holistic rubrics are probably more appropriate when performance tasks require students to create some sort of response and there is no definitive correct answer. A score reported with a holistic rubric focuses on the overall quality, proficiency, or understanding of the specific content and skills; it involves assessment on a unidimensional level (Mertler, 2001). Use of holistic rubrics can result in a somewhat quicker scoring process than use of analytic rubrics (Nitko, 2001). This is because the teacher reads through or otherwise examines the student product or performance only once, in order to get an "overall" sense of what the student was able to accomplish (Mertler, 2001). Since assessment of the overall performance is the key, holistic rubrics are also typically, though not exclusively, used when the purpose of the performance assessment is summative in nature. At most, only limited feedback is provided to the student as a result of scoring performance tasks in this manner. A template for holistic scoring rubrics is presented in Table 1.

Table 1: Template for Holistic Rubrics
Score   Description
5       Demonstrates complete understanding of the problem. All requirements of task are included in response.
4       Demonstrates considerable understanding of the problem. All requirements of task are included.
3       Demonstrates partial understanding of the problem. Most requirements of task are included.
2       Demonstrates little understanding of the problem. Many requirements of task are missing.
1       Demonstrates no understanding of the problem.
0       No response/task not attempted.

Analytic rubrics are usually preferred when a fairly focused type of response is required (Nitko, 2001); that is, for performance tasks in which there may be one or two acceptable responses and creativity is not an essential feature of the students' responses. Furthermore, analytic rubrics result initially in several scores, followed by a summed total score; their use represents assessment on a multidimensional level (Mertler, 2001). As previously mentioned, the use of analytic rubrics can make the scoring process substantially slower, mainly because assessing several different skills or characteristics individually requires a teacher to examine the product several times. Both their construction and use can be quite time-consuming. A general rule of thumb is that an individual's work should be examined a separate time for each of the specific performance tasks or scoring criteria (Mertler, 2001). However, the advantage of analytic rubrics is quite substantial: the degree of feedback offered to students (and to teachers) is significant. Students receive specific feedback on their performance with respect to each of the individual scoring criteria, something that does not happen when using holistic rubrics (Nitko, 2001). It is then possible to create a "profile" of specific student strengths and weaknesses (Mertler, 2001). A template for analytic scoring rubrics is presented in Table 2.

Table 2: Template for Analytic Rubrics
Criteria #1
  Beginning (1): Description reflecting beginning level of performance
  Developing (2): Description reflecting movement toward mastery level of performance
  Accomplished (3): Description reflecting achievement of mastery level of performance
  Exemplary (4): Description reflecting highest level of performance
  Score: ____

Criteria #2, Criteria #3, and Criteria #4 follow the same pattern, with a description reflecting each of the four levels of performance and a score recorded for each criterion.
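The analytic approach (score each criterion on a 1-4 continuum, then sum the criterion scores and read off a profile of strengths and weaknesses) can be illustrated with a short sketch. This is a hypothetical example, not part of the original article; the level labels mirror the template above, but the function, criterion names, and scores are invented:

```python
# Illustrative sketch of analytic-rubric scoring (hypothetical code).
# Level labels follow the four-level template: Beginning..Exemplary.

LEVELS = {1: "Beginning", 2: "Developing", 3: "Accomplished", 4: "Exemplary"}

def score_analytic(scores: dict[str, int]) -> tuple[int, dict[str, str]]:
    """Sum per-criterion scores and build a strengths/weaknesses profile."""
    for criterion, score in scores.items():
        if score not in LEVELS:
            raise ValueError(f"{criterion}: score must be 1-4, got {score}")
    total = sum(scores.values())                      # summed total score
    profile = {c: LEVELS[s] for c, s in scores.items()}  # per-criterion label
    return total, profile

total, profile = score_analytic({
    "Criteria #1": 4,
    "Criteria #2": 3,
    "Criteria #3": 2,
    "Criteria #4": 3,
})
print(total)                   # 12 of a possible 16
print(profile["Criteria #3"])  # "Developing" flags a relative weakness
```

The per-criterion labels are what make the feedback formative: the total alone would hide that "Criteria #3" lags the others.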

Prior to designing a specific rubric, a teacher must decide whether the performance or product will be scored holistically or analytically (Airasian, 2000, 2001). Regardless of which type of rubric is selected, specific performance criteria and observable indicators must be identified as an initial step to development. The decision regarding the use of a holistic or analytic approach to scoring has several possible implications. The most important of these is that teachers must consider first how they intend to use the results. If an overall, summative score is desired, a holistic scoring approach would be more desirable. In contrast, if formative feedback is the goal, an analytic scoring rubric should be used. It is important to note that one type of rubric is not inherently better than the other; you must find a format that works best for your purposes (Montgomery, 2001). Other implications include the time requirements, the nature of the task itself, and the specific performance criteria being observed.

As demonstrated in the templates (Tables 1 and 2), the various levels of student performance can be defined using either quantitative (i.e., numerical) or qualitative (i.e., descriptive) labels. In some instances, teachers might want to use both quantitative and qualitative labels. If a rubric contains four levels of proficiency or understanding on a continuum, quantitative labels would typically range from "1" to "4." When using qualitative labels, teachers have much more flexibility and can be more creative. A common type of qualitative scale might include the following labels: master, expert, apprentice, and novice. Nearly any type of qualitative scale will suffice, provided it "fits" the task.

One potentially frustrating aspect of scoring student work with rubrics is the issue of converting the scores to "grades." It is not a good idea to think of rubrics in terms of percentages (Trice, 2000). For example, if a rubric has six levels (or "points"), a score of 3 should not be equated to 50% (an "F" in most letter grading systems). The process of converting rubric scores to grades or categories is more a process of logic than a mathematical one. Trice (2000) suggests that in a rubric scoring system, there are typically more scores at the average and above average categories (i.e., equating to grades of "C" or better) than there are below average categories. For instance, if a rubric consisted of nine score categories, the equivalent grades and categories might look like this:

Table 3: Sample grades and categories

Rubric Score   Grade   Category
8              A+      Excellent
7              A       Excellent
6              B+      Good
5              B       Good
4              C+      Fair
3              C       Fair
2              U       Unsatisfactory
1              U       Unsatisfactory
0              U       Unsatisfactory
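The logic-based (rather than percentage-based) conversion in Table 3 amounts to a simple lookup. The following sketch is illustrative only, not from the article; the mapping is the one shown in the table:

```python
# Rubric-score-to-grade conversion from Table 3 (nine score categories).
# Deliberately not a percentage: 4 out of 8 maps to "C+"/"Fair",
# not to 50% (which would be an "F" in most letter-grading systems).

CONVERSION = {
    8: ("A+", "Excellent"), 7: ("A", "Excellent"),
    6: ("B+", "Good"),      5: ("B", "Good"),
    4: ("C+", "Fair"),      3: ("C", "Fair"),
    2: ("U", "Unsatisfactory"), 1: ("U", "Unsatisfactory"),
    0: ("U", "Unsatisfactory"),
}

def rubric_to_grade(score: int) -> tuple[str, str]:
    """Look up the letter grade and descriptive category for a rubric score."""
    if score not in CONVERSION:
        raise ValueError(f"rubric score must be 0-8, got {score}")
    return CONVERSION[score]

print(rubric_to_grade(4))  # ('C+', 'Fair')
```

Note that more than half of the score categories map to "C" or better, reflecting Trice's observation that rubric distributions cluster at and above average.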

When converting rubric scores to grades (typical at the secondary level) or descriptive feedback (typical at the elementary level), it is important to remember that there is not necessarily one correct way to accomplish this. The bottom line for classroom teachers is that they must find a system of conversion that works for them and fits comfortably into their individual system of reporting student performance.

Steps in the Design of Scoring Rubrics

A step-by-step process for designing scoring rubrics for classroom use is presented below. Information for these procedures was compiled from various sources (Airasian, 2000, 2001; Mertler, 2001; Montgomery, 2001; Nitko, 2001; Tombari & Borich, 1999). The steps are summarized and discussed, followed by presentations of two sample scoring rubrics.

Step 1: Re-examine the learning objectives to be addressed by the task. This allows you to match your scoring guide with your objectives and actual instruction.

Step 2: Identify specific observable attributes that you want to see (as well as those you don't want to see) your students demonstrate in their product, process, or performance. Specify the characteristics, skills, or behaviors that you will be looking for, as well as common mistakes you do not want to see.

Step 3: Brainstorm characteristics that describe each attribute. Identify ways to describe above average, average, and below average performance for each observable attribute identified in Step 2.

Step 4a: For holistic rubrics, write thorough narrative descriptions for excellent work and poor work, incorporating each attribute into the description. Describe the highest and lowest levels of performance, combining the descriptors for all attributes.

Step 4b: For analytic rubrics, write thorough narrative descriptions for excellent work and poor work for each individual attribute. Describe the highest and lowest levels of performance using the descriptors for each attribute separately.

Step 5a: For holistic rubrics, complete the rubric by describing other levels on the continuum that ranges from excellent to poor work for the collective attributes. Write descriptions for all intermediate levels of performance.

Step 5b: For analytic rubrics, complete the rubric by describing other levels on the continuum that ranges from excellent to poor work for each attribute. Write descriptions for all intermediate levels of performance for each attribute separately.

Step 6: Collect samples of student work that exemplify each level. These will help you score in the future by serving as benchmarks.

Step 7: Revise the rubric, as necessary. Be prepared to reflect on the effectiveness of the rubric and revise it prior to its next implementation.

The steps involved in the design of rubrics are summarized in Figure 2.

Figure 2: Designing Scoring Rubrics: Step-by-step procedures

Two Examples

Two sample scoring rubrics corresponding to specific performance assessment tasks are presented next. Brief discussions precede the actual rubrics. For illustrative purposes, a holistic rubric is presented for the first task and an analytic rubric for the second. It should be noted that either a holistic or an analytic rubric could have been designed for either task.

Example 1: Subject - Mathematics; Grade Level(s) - Upper Elementary


Mr. Harris, a fourth-grade teacher, is planning a unit on the topic of data analysis, focusing primarily on the skills of estimation and interpretation of graphs. Specifically, at the end of this unit, he wants to be able to assess his students' mastery of the following instructional objectives:

Students will properly interpret a bar graph. Students will accurately estimate values from within a bar graph. (step 1)

Since the purpose of his performance task is summative in nature (the results will be incorporated into the students' grades), he decides to develop a holistic rubric. He identifies the following four attributes on which to focus his rubric: estimation, mathematical computation, conclusions, and communication of explanations (steps 2 & 3). Finally, he begins drafting descriptions of the various levels of performance for the observable attributes (steps 4 & 5). The final rubric for his task appears in Table 4.

Table 4: Math Performance Task Scoring Rubric Data Analysis


Name _____________________________                         Date ___________

Score   Description
4       Makes accurate estimations. Uses appropriate mathematical operations with no mistakes. Draws logical conclusions supported by graph. Sound explanations of thinking.
3       Makes good estimations. Uses appropriate mathematical operations with few mistakes. Draws logical conclusions supported by graph. Good explanations of thinking.
2       Attempts estimations, although many inaccurate. Uses inappropriate mathematical operations, but with no mistakes. Draws conclusions not supported by graph. Offers little explanation.
1       Makes inaccurate estimations. Uses inappropriate mathematical operations. Draws no conclusions related to graph. Offers no explanations of thinking.
0       No response/task not attempted.

Example 2: Subjects - Social Studies; Probability & Statistics

Grade Level(s) - 9 - 12
Mrs. Wolfe is a high school American government teacher. She is beginning a unit on the electoral process and knows from past years that her students sometimes have difficulty with the concepts of sampling and election polling. She decides to give her students a performance assessment so they can demonstrate their levels of understanding of these concepts. The main idea that she wants to focus on is that samples (surveys) can accurately predict the viewpoints of an entire population. Specifically, she wants to be able to assess her students on the following instructional objectives:

Students will collect data using appropriate methods. Students will accurately analyze and summarize their data. Students will effectively communicate their results. (step 1)

Since the purpose of this performance task is formative in nature, she decides to develop an analytic rubric focusing on the following attributes: sampling technique, data collection, statistical analyses, and communication of results (steps 2 & 3). She drafts descriptions of the various levels of performance for the observable attributes (steps 4 & 5). The final rubric for this task appears in Table 5.

Table 5: Performance Task Scoring Rubric - Population Sampling

Name ____________________________                          Date ________________

Sampling Technique (Score: ____)
  Beginning (1): Inappropriate sampling technique used
  Developing (2): Appropriate technique used to select sample; major errors in execution
  Accomplished (3): Appropriate technique used to select sample; minor errors in execution
  Exemplary (4): Appropriate technique used to select sample; no errors in procedures

Survey/Interview Questions (Score: ____)
  Beginning (1): Inappropriate questions asked to gather needed information
  Developing (2): Few pertinent questions asked; data on sample is inadequate
  Accomplished (3): Most pertinent questions asked; data on sample is adequate
  Exemplary (4): All pertinent questions asked; data on sample is complete

Statistical Analyses (Score: ____)
  Beginning (1): No attempt at summarizing collected data
  Developing (2): Attempts analysis of data, but inappropriate procedures
  Accomplished (3): Proper analytical procedures used, but analysis incomplete
  Exemplary (4): All proper analytical procedures used to summarize data

Communication of Results (Score: ____)
  Beginning (1): Communication of results is incomplete, unorganized, and difficult to follow
  Developing (2): Communicates some important information; not organized well enough to support decision
  Accomplished (3): Communicates most of important information; shows support for decision
  Exemplary (4): Communication of results is very thorough; shows insight into how data predicted outcome

Total Score = ____

Resources for Rubrics on the Web The following is just a partial list of some Web resources for information about and samples of scoring rubrics.

"Scoring Rubrics: What, When, & How?" (http://pareonline.net/getvn.asp?v=7&n=3). This article appears in Practical Assessment, Research & Evaluation and is authored by Barbara M. Moskal. The article discusses what rubrics are and distinguishes between holistic and analytic types. Examples and additional resources are provided.

"Performance Assessment - Scoring" (http://www.pgcps.pg.k12.md.us/~elc/scoringtasks.html). Staff in the Prince George's County (MD) Public Schools have developed a series of pages that provide descriptions of the steps involved in the design of performance tasks. This particular page provides several rubric samples.

"Rubrics from the Staff Room for Ontario Teachers" (http://www.quadro.net/~ecoxon/Reporting/rubrics.htm). This site is a collection of literally hundreds of teacher-developed rubrics for scoring performance tasks. The rubrics are categorized by subject area and type of task. This is a fantastic resource; check it out!

"Rubistar Rubric Generator" (http://rubistar.4teachers.org/) and "Teacher Rubric Maker" (http://www.teach-nology.com/web_tools/rubrics/). These two sites house Web-based rubric generators for teachers. Teachers can customize their own rubrics based on templates on each site. In both cases, rubric templates are organized by subject area and/or type of performance task. These are wonderful resources for teachers!

References

Airasian, P. W. (2000). Assessment in the classroom: A concise approach (2nd ed.). Boston: McGraw-Hill.
Airasian, P. W. (2001). Classroom assessment: Concepts and applications (4th ed.). Boston: McGraw-Hill.
Chase, C. I. (1999). Contemporary assessment for educators. New York: Longman.
Mertler, C. A. (2001). Using performance assessment in your classroom. Unpublished manuscript, Bowling Green State University.
Montgomery, K. (2001). Authentic assessment: A guide for elementary teachers. New York: Longman.
Moskal, B. M. (2000). Scoring rubrics: What, when, and how? Practical Assessment, Research & Evaluation, 7(3). Available online: http://pareonline.net/getvn.asp?v=7&n=3
Nitko, A. J. (2001). Educational assessment of students (3rd ed.). Upper Saddle River, NJ: Merrill.
Tombari, M., & Borich, G. (1999). Authentic assessment in the classroom: Applications and practice. Upper Saddle River, NJ: Merrill.
Trice, A. D. (2000). A handbook of classroom assessment. New York: Longman.

Contact information
Craig A. Mertler
Educational Foundations & Inquiry Program
College of Education & Human Development
Bowling Green State University
Bowling Green, OH 43403
mertler@bgnet.bgsu.edu
Phone: 419-372-9357
Fax: 419-372-8265

Descriptors: *Rubrics; *Scoring; *Student Evaluation; *Test Construction; *Evaluation Methods; Grades; Grading

Create Your Own Electronic Portfolio


Using Off-the-Shelf Software to Showcase Your Own or Student Work
By Helen C. Barrett
Published in Learning & Leading with Technology, April 2000

In the October 1998 issue of Learning & Leading with Technology, I outlined the strategic questions to ask when developing electronic portfolios. This article describes the electronic portfolio development process further and covers seven different software and hardware tools for creating portfolios. Some very good commercial electronic portfolio programs are on the market, although they often reflect the developers' style or are constrained by the limits of the software structure. Many educators who want to develop electronic portfolios tend to design their own, using off-the-shelf software or generic strategies. Here, I discuss the structure of each type of program, the advantages and disadvantages of each strategy, the relative ease of learning the software, the level of technology required, and related issues. The seven generic types of software are:

1. Relational databases
2. Hypermedia "card" software
3. Multimedia authoring software
4. World Wide Web (HTML) pages
5. Adobe Acrobat (PDF files)
6. Multimedia slideshows
7. Video (digital and analog)

Why Portfolios?

Portfolio assessment has become more commonplace in schools because it allows teachers to assess student development over periods of time, sometimes across several years. People develop portfolios at all phases of the lifespan. Educators in the Pacific Northwest (Northwest Evaluation Association, 1990) developed the following definition of portfolio:

A portfolio is a purposeful collection of student work that exhibits the student's efforts, progress, and achievements in one or more areas. The collection must include student participation in selecting contents, the criteria for selection, the criteria for judging merit, and evidence of student self-reflection.

Electronic Portfolios

My definition of electronic portfolio includes the use of electronic technologies that allow the portfolio developer to collect and organize artifacts in many formats (audio, video, graphics, and text). A standards-based electronic portfolio uses hypertext links to organize the material to connect artifacts to appropriate goals or standards. Often, the terms electronic portfolio and digital portfolio are used interchangeably. However, I make a distinction: an electronic portfolio contains artifacts that may be in analog (e.g., videotape) or computer-readable form.
In a digital portfolio, all artifacts have been transformed into computer-readable form. An electronic portfolio is not a haphazard collection of artifacts (i.e., a digital scrapbook or multimedia presentation) but rather a reflective tool that demonstrates growth over time.

Electronic Portfolio Development

Electronic portfolio development brings together two different processes: multimedia project development and portfolio development. When developing an electronic portfolio, equal attention should be paid to these complementary processes, as both are essential for effective electronic portfolio development. (See the online supplement at www.iste.org/L&L for a complete discussion of these processes.)

Danielson and Abrutyn (1997) lay out a process for developing a portfolio:

1. Collection: The portfolio's purpose, audience, and future use of the artifacts will determine what artifacts to collect.
2. Selection: Selection criteria for materials to include should reflect the learning objectives established for the portfolio. These should follow from national, state, or local standards and their associated evaluation rubrics or performance indicators.
3. Reflection: Include reflections on every piece in your portfolio and an overall reflection.
4. Projection (or, I prefer, Direction): Review your reflections on learning, look ahead, and set goals for the future.

I add a Connection stage, in which you create hypertext links and publish your portfolio to enable feedback from others; this can occur before or after the projection/direction stage.

Multimedia project development usually includes the following steps (Ivers & Barron, 1998):

1. Assess/Decide
2. Plan/Design
3. Develop
4. Implement
5. Evaluate

Assess/Decide. The focus is on needs assessment of the audience, the presentation goals, and the appropriate tools for the final portfolio presentation.

Design/Plan. In the second stage, focus on organizing or designing the presentation. Determine audience-appropriate content, software, storage medium, and presentation sequence. Construct flow charts and write storyboards.

Develop. Gather materials to include in the presentation and organize them into a sequence (or use hyperlinks) for the best presentation of the material, using an appropriate multimedia authoring program.

Implement. The developer presents the portfolio to the intended audience.

Evaluate. In this final stage of multimedia development, the focus is on evaluating the presentation's effectiveness in light of its purpose and the assessment context.

Five Stages

I have created a process for developing an electronic portfolio based on the general portfolio and multimedia development processes (Table 1).
Table 1: Stages of Electronic Portfolio Development

Stage of Electronic Portfolio Development  | Portfolio Development               | Multimedia Development
1. Defining the Portfolio Context & Goals  | Purpose & Audience                  | Decide, Assess
2. The Working Portfolio                   | Collect, Interject                  | Design, Plan
3. The Reflective Portfolio                | Select, Reflect, Direct             | Develop
4. The Connected Portfolio                 | Inspect, Perfect, Connect, Respect  | Implement, Evaluate
5. The Presentation Portfolio (Celebrate)  | Present, Publish                    |

Differentiating the Levels of Electronic Portfolio Implementation

In addition to the stages of portfolio development, there appear to be at least five levels of electronic portfolio development. Just as there are developmental levels in student learning, there are developmental levels in digital portfolio development. Table 2 presents different levels for electronic portfolio development, which are closely aligned with the technology skills of the portfolio developer.
Table 2: Levels of electronic portfolio software strategies based on ease of use

0. All documents are in paper format. Some portfolio data may be stored on videotape.
1. All documents are in digital file formats, using word processing or other commonly used software, and stored in electronic folders on a hard drive, floppy disk, or LAN server.
2. Portfolio data is entered into a structured format, such as a database or HyperStudio template or slide show (such as PowerPoint or AppleWorks), and stored on a hard drive, Zip, floppy disk, or LAN.
3. Documents are translated into Portable Document Format with hyperlinks between standards, artifacts, and reflections using Adobe Acrobat Exchange and stored on a hard drive, Zip, Jaz, CD-R/W, or LAN server.
4. Documents are translated into HTML, complete with hyperlinks between standards, artifacts, and reflections, using a Web authoring program and posted to a Web server.
5. Portfolio is organized with a multimedia authoring program, incorporating digital sound and video. Then it is converted to digital format and pressed to CD-R/W or posted to the Web in streaming format.

Based on these levels and stages, I offer a few items to consider as you make this software selection.

Stage 1: Defining the Portfolio Context and Goals (Keywords: Purpose, Audience, Decide, Assess)

What is the assessment context, including the purpose of the portfolio? Is it based on learner outcome goals (which should follow from national, state, or local standards and their associated evaluation rubrics or observable behaviors)? Setting the assessment context frames the rest of the portfolio development process. What resources are available for electronic portfolio development? What hardware and software do you have and how often do students have access to it? What are the technology skills of the students and teachers? Some possible options are outlined in Tables 3 & 4.
Table 3: Technology skill levels

1. Limited experience with desktop computers but able to use mouse and menus and run simple programs
2. Level 1 plus proficient with a word processor, basic e-mail, and Internet browsing; can enter data into a predesigned database
3. Level 2 plus able to build a simple hypertext (nonlinear) document with links using a hypermedia program such as HyperStudio or Adobe Acrobat Exchange or an HTML WYSIWYG editor
4. Level 3 plus able to record sounds, scan images, output computer screens to a VCR, and design an original database
5. Level 4 plus multimedia programming or HTML authoring; can also create QuickTime movies live or from tape; able to program a relational database

Table 4: Technology available

1. No computer
2. Single computer with 16 MB RAM, 500 MB HD, no AV input/output
3. One or two computers with 32 MB RAM, 1+ GB HD, simple AV input (such as QuickCam)
4. Three or four computers, one of which has 64+ MB RAM, 2+ GB HD, AV input and output, scanner, VCR, video camera, high-density floppy (such as a Zip drive)
5. Level 4 and CD-ROM recorder, at least two computers with 128+ MB RAM; digital video editing hardware and software; extra GB+ storage (such as Jaz drive)

Who is the audience for the portfolio: student, parent, professor, or employer? The primary audience for the portfolio affects the decisions made about the format and storage of the presentation portfolio. Choose a format the audience will most likely have access to (e.g., a home computer, VCR, or the Web). You will know you are ready for the next stage when you have:

identified the purpose and primary audience for your portfolio,
identified the standards or goals you will use to organize your portfolio, and
selected your development software and completed the first stage using that software.

Stage 2: The Working Portfolio (Keywords: Collect, Interject, Design, Plan)

What is the content of portfolio items (determined by the assessment context) and the type of evidence to be collected? This is where the standards become a very important part of the planning process. Knowing which standards or goals you are trying to demonstrate should help determine the types of portfolio artifacts to collect. For example, if the portfolio goal is to demonstrate the standard of clear communication, then examples should reflect students' writing (scanned or imported from a word processing document) and speaking abilities (sound or video clips).

Which software tools are most appropriate for the portfolio context and the resources available? This question is the theme of the rest of this article. The software used to create the electronic portfolio will control, restrict, or enhance the portfolio development process. The electronic portfolio software should match the vision and style of the portfolio developer.

Which storage and presentation medium is most appropriate for the situation (computer hard disk, videotape, LAN, the Web, CD-ROM)? The type of audience for the portfolio will determine this answer. There are also multiple options, depending on the software chosen.

What multimedia materials will you gather to represent a learner's achievement? Once you have answered the questions about portfolio context and content and addressed the limitations on the available equipment and users' skills (both teachers' and students'), you will be able to determine the type of materials you will digitize. This can include written work, images of 3-D projects, speech recordings, and video clips of performances. You will want to collect artifacts from different time periods to demonstrate growth and learning achieved over time. You will know you are ready for the next stage when you have:

collected digital portfolio artifacts that represent your efforts and achievement throughout the course of your learning experiences, and
used the graphics and layout capability of your chosen software to interject your vision and style into the portfolio artifacts.

Stage 3: The Reflective Portfolio (Keywords: Select, Reflect, Direct, Develop)

How will you select the specific artifacts from the abundance of the working portfolio to demonstrate achieving the portfolio's goals? What are your criteria for selecting artifacts and for judging merit? Having a clear set of rubrics at this stage will help guide portfolio development and evaluation.

How will you record self-reflection on work and achievement of goals? The quality of the learning that results from the portfolio development process may be in direct proportion to the quality of the students' self-reflection on their work. One challenge in this process is to keep these reflections confidential. The personal, private reflections of the learner need to be guarded and not published in a public medium.

How will you record teacher feedback on student work and achievement of goals, when appropriate? Even more critical is the confidential nature of the assessment process. Teachers' feedback should also be kept confidential so that only

the student, parents, and other appropriate audiences have access. Security, in the form of password protection to control access, is an important factor when choosing electronic portfolio development software. How will you record goals for future learning based on the personal reflections and feedback? The primary benefit of a portfolio is to see growth over time, which should inspire goal setting for future learning. It is this process of setting learning goals that turns the portfolio into a powerful tool for long-term growth and development. You will know you are ready for the next stage when you have:

- selected the artifacts for your formal or presentation portfolio, and
- written reflective statements and identified learning goals.

Stage 4: The Connected Portfolio (Keywords: Inspect, Perfect, Connect, Implement, Evaluate). How will you organize the digital artifacts? Have you selected software that allows you to create hyperlinks between goals, student work samples, rubrics, and assessment? The choice of software can either restrict or enhance the development process and the quality of the final product. Different software packages each have unique characteristics that can limit or expand the electronic portfolio options.

How will you evaluate the portfolio's effectiveness in light of its purpose and the assessment context? In an environment of continuous improvement, a portfolio should be viewed as an ongoing learning tool, and its effectiveness should be reviewed on a regular basis to be sure it is meeting the goals set.

Depending on the portfolio context, how will you use portfolio evidence to make instruction/learning decisions? Whether the portfolio is developed with a young child or a practicing professional, the artifacts collected, along with the self-reflection, should help guide learning decisions. This process brings together instruction and assessment in the most effective way.

Will you develop a collection of exemplary portfolio artifacts for comparison purposes? Many portfolio development guidebooks recommend collecting model portfolio artifacts that demonstrate achievement of specific standards. This provides the audience with a frame of reference for judging a specific student's work. It also provides concrete examples of good work for students to emulate.

You will know you are ready for the next stage when:

- your documents are converted into a format that allows hyperlinks and you can navigate using them,
- you have inserted the appropriate multimedia artifacts into the document, and
- you are ready to share your portfolio with others.

Stage 5: The Presentation Portfolio (Keywords: Respect, Celebrate, Present, Publish). How will you record the portfolio to an appropriate presentation and storage medium? These will be different for a working portfolio and a presentation portfolio. I find that the best media for a working portfolio are videotape, computer hard disk, Zip disk, or a network server; the best media for a formal portfolio are CD-Recordable disc, a Web server, or videotape.

How will you or your students present the finished portfolio to an appropriate audience? This will be a very individual strategy, depending on the context. An emerging strategy is student-led conferences, which enable learners to share their portfolios with a targeted audience, whether parents, peers, or potential employers. This is also an opportunity for professionals to share their teaching portfolios with colleagues for meaningful feedback and collaboration in self-assessment.

Software Selection

One of the key criteria for software selection should be its capability to allow teachers and students to create hypertext links between goals, outcomes, and various student artifacts (products and projects) displayed in multimedia format. Another is Web accessibility. With seven options to choose from, you should be able to find software to fit your audience, goals, technology skills, and available equipment. (See Table 5 for a comparison of software. Find detailed descriptions, software resources, comparison information, and selection guidelines throughout the process online at www.iste.org/L&L.)

Relational Databases (e.g., FileMaker Pro, Microsoft Access). In recent years, new database management tools have become available that allow teachers to easily create whole-class records of student achievement. A relational database is a series of structured data files linked together by common fields. One data file could include the students' names, addresses, and other individual information; another could include a list of the standards that each student should be achieving; still another could include portfolio artifacts that demonstrate each student's achievement of those standards. The purpose of using a relational database is to link the students with their individual portfolio artifacts and with the standards these artifacts should clearly demonstrate.
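The three linked data files described above can be sketched as a small relational schema. This is a minimal illustration using Python's built-in sqlite3 module; the table names, field names, and sample records are hypothetical, not drawn from FileMaker Pro, Access, or any particular portfolio product:

```python
import sqlite3

# In-memory database standing in for the three linked data files.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE students  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE standards (id INTEGER PRIMARY KEY, description TEXT);
-- Each artifact is linked to one student and one standard by common fields.
CREATE TABLE artifacts (
    id          INTEGER PRIMARY KEY,
    student_id  INTEGER REFERENCES students(id),
    standard_id INTEGER REFERENCES standards(id),
    title       TEXT
);
""")
con.execute("INSERT INTO students VALUES (1, 'Ana')")
con.execute("INSERT INTO standards VALUES (10, 'Writes a persuasive essay')")
con.execute("INSERT INTO artifacts VALUES (100, 1, 10, 'essay_draft3.pdf')")

# Join the three tables: which artifact shows which student meeting which standard?
row = con.execute("""
    SELECT s.name, a.title, st.description
    FROM artifacts a
    JOIN students  s  ON a.student_id  = s.id
    JOIN standards st ON a.standard_id = st.id
""").fetchone()
print(row)  # ('Ana', 'essay_draft3.pdf', 'Writes a persuasive essay')
```

The common fields (student_id, standard_id) are what make the database "relational": one query ties a student, an artifact, and the standard it demonstrates into a single row of a whole-class report.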

Advantages include flexibility, network and Web capabilities, cross-platform capabilities, tracking and reporting, multimedia support, and security. Disadvantages include the size of relational database files (they can become very large and unwieldy); they may not be accessible to users who do not have the software; and they require a high level of skill to use effectively. Databases are really teacher-centered portfolio tools. They allow teachers to keep track of student achievement at every level, but they are less appropriate for students to use to maintain their own portfolios. You may save appropriate pages from the database as PDF files for students to include in their own portfolios.

Hypermedia "Card" Programs (e.g., HyperStudio, Digital Chisel, Toolbook, and SuperLink). A hypermedia program allows the integration of various media types in a single file, with construction tools for graphics, sound, and movies. The basic structure of a hypermedia file is described as electronic cards, which are really individual screens that can be linked together by buttons the user creates. Hypermedia programs are widely available in classrooms, usually all-inclusive, cross-platform, multimedia capable, and secure. Disadvantages include the lack of integrated Web accessibility, size and resolution constraints, and the increased effort of linking artifacts to standards. Hypermedia programs are most appropriate for elementary or middle school portfolios. Templates and strategies are widely available to help you begin using your chosen hypermedia tool as a portfolio development and assessment tool.

Multimedia Authoring Software (e.g., Macromedia Director or Authorware). In recent years, multimedia authoring software has emerged from such companies as Macromedia and mTropolis. Authorware is an icon-based authoring environment, in which a user builds a flow chart to create a presentation. Director is a time-based authoring environment, in which the user creates an interactive presentation with a cast of various multimedia elements. Both programs allow the user to create standalone applications that run on Windows and Macintosh platforms, so presentations are self-running and need no separate player software. They were designed to incorporate multimedia elements and are ideal for CD-ROM publishing, but they have a steep learning curve, require extra effort to link artifacts to standards, and may not offer the necessary security. Multimedia authoring programs are most appropriate for high school, college, or professional portfolio creation.
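The card-and-button structure that the hypermedia section describes can be modeled as a tiny navigation graph: each card is a screen, and each button names the card it jumps to. This is an illustrative sketch with hypothetical card names, not the file format of HyperStudio or any real hypermedia tool:

```python
# Each "card" is one screen; its buttons map a label to a destination card.
cards = {
    "title": {"text": "My Portfolio",           "buttons": {"Next": "goals"}},
    "goals": {"text": "Learning goals",         "buttons": {"Artifact": "essay", "Back": "title"}},
    "essay": {"text": "Persuasive essay draft", "buttons": {"Back": "goals"}},
}

def follow(start, *clicks):
    """Simulate a viewer clicking a sequence of buttons from a starting card."""
    card = start
    for button in clicks:
        card = cards[card]["buttons"][button]
    return card

print(follow("title", "Next", "Artifact"))  # essay
```

Linking artifacts to standards in such a tool means hand-building these button paths card by card, which is why the article lists that linkage as extra effort.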

Web Pages (e.g., Adobe PageMill, Claris Home Page, Microsoft FrontPage, Netscape Composer). An emerging trend in the development of electronic portfolios is to publish them in HTML format. With wide accessibility to the Web, many schools are encouraging students to publish their portfolios in this format. Students convert word processing documents into Web pages with tools built into those programs and create hyperlinks between goals and the artifacts that demonstrate achievement. The advantages of creating Web-based portfolios center on their multimedia, cross-platform, and Web capabilities: any potential viewer simply needs Internet access and a Web browser. However, the learning curve is steep, Web pages require much more file-management skill than other types of portfolio development tools, and security can be a problem. Students in upper-elementary grades and beyond can create Web pages, but this type of portfolio is especially appropriate for those who wish to showcase their work for a potential employer.

PDF Documents (Adobe Acrobat). One of the more interesting development environments for electronic portfolios is Adobe Acrobat's Portable Document Format (PDF). PDF files are based on the PostScript page layout language originally developed for printing to a laser printer. PDF files are created using the tools provided by Adobe, either the PDF Writer or the Distiller program. Adobe Acrobat files are called Portable Document Format because the same file can be read on a variety of computer platforms and requires only the free Acrobat Reader software. The process of creating an Acrobat file can be as easy as printing to a printer; in fact, the PDF Writer is a printer driver that is selected when the user wants to convert a document from any application into a PDF file. Another software package, PrintToPDF, is a less powerful shareware Macintosh printer driver that creates simple PDFs for a much lower price ($20). Once a PDF file is created, the user can navigate page by page, with bookmarks, or with hypertext links or buttons created in the Acrobat Exchange program. My own electronic teaching portfolio is published on CD-ROM with Adobe Acrobat. PDF files are easy to access and read, can be created from multiple applications, can include multimedia elements, are easily published to CD-ROM, have few size and resolution constraints, and are secure. Disadvantages include large file sizes, the need for separate creation software, and the effort required to link artifacts to standards. Students at all levels can create PDF files, but PDF is a more appropriate tool for high school and older students.
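The hyperlinking described in the Web Pages section, with goals linked to the artifacts that demonstrate them, can be sketched in a few lines of Python that generate a minimal portfolio page. The goal and artifact file names here are hypothetical:

```python
# Minimal sketch: generate an HTML portfolio page that links each
# learning goal to the artifact demonstrating it. Names are hypothetical.
goals = [
    ("Clear written communication", "essay_draft3.html"),
    ("Oral presentation skills", "speech_video.html"),
]

items = "\n".join(
    f'  <li><a href="{artifact}">{goal}</a></li>'
    for goal, artifact in goals
)
page = f"<html><body>\n<h1>My Portfolio</h1>\n<ul>\n{items}\n</ul>\n</body></html>"
print(page)
```

Any browser can follow these anchor links, which is the cross-platform advantage the article notes; the trade-off is that the student must manage each artifact file and its relative path by hand.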

Multimedia Slideshows (e.g., AppleWorks and PowerPoint). These programs allow the user to create electronic slideshows, most often shown in a linear sequence. Most of these tools allow the integration of sound and video, and Microsoft PowerPoint supports some buttons and links. Other software, such as a word processor or spreadsheet, can also be used to create electronic portfolio documents. Advantages of multimedia slideshows include easy access to the software and its multimedia capabilities. But it is challenging to link artifacts to standards, files can be very large, Web publishing requires conversion to HTML, and password protection may not be available. Multimedia slideshows are most appropriate for middle school and older students.

Video (digital and analog). Digital video can be a powerful addition to many of the other portfolio development tools outlined here. Nonlinear digital video editing could be used to organize videotaped portfolio artifacts. Analog video can be used to gather evidence of student learning in a low-cost storage medium, and videotape is a popular final publishing medium for sharing student presentation portfolios with family and friends. Advantages of analog video include its widespread use, cheap storage medium, acceptable quality, and relatively low-cost hardware. Analog video, however, allows only linear access, has low interactivity, offers no Web accessibility, takes a lot of storage space, and is hard to edit. Digital video adds Web accessibility, high interactivity, random access, and easy editing, but it can also be low quality, have huge file size and bandwidth requirements, and require expensive equipment to digitize and edit. Video is appropriate for a wide range of students and audiences. It is the best way to capture classroom interaction, including nonverbal cues, and it is often the method by which final portfolios are shared.

[Table 5: Comparing electronic portfolio software]

Conclusion

With all of these choices, which strategy should you choose? Are different tools more appropriate at different stages of the electronic portfolio development process? These questions can only be answered after addressing some of the questions posed at the beginning of the article, especially the purpose and audience for the portfolio, the resources available (equipment and technology skills required), and where the advantages of a strategy outweigh its disadvantages for your situation. I would be interested in hearing from educators who have used any of these (or other) tools and who would be willing to share their successes or frustrations in a case study. (See the online supplement at www.iste.org/L&L for more on how to participate.)

References

Barrett, H. (1998). Strategic questions: What to consider when planning for electronic portfolios. Learning & Leading with Technology, 26(2), 6–13. Available: www.iste.org/L&L (select Archive, then Volume 26, Number 2).

Barrett, H. (1999). Using technology to support alternative assessment and electronic portfolios [Online document]. Anchorage: University of Alaska–Anchorage. Available: http://transition.alaska.edu/www/portfolios.html

Danielson, C., & Abrutyn, L. (1997). An introduction to using portfolios in the classroom. Alexandria, VA: Association for Supervision and Curriculum Development.

Ivers, K., & Barron, A. E. (1998). Multimedia projects in education. Englewood, CO: Libraries Unlimited, Inc.

Software Publisher Websites


Relational databases
FileMaker Pro (http://www.filemaker.com/products/fm_home.html)
Microsoft Access (http://www.microsoft.com/office/access/default.htm)

Hypermedia card formats
HyperStudio (http://www.hyperstudio.com/)
HyperCard (http://www.apple.com/hypercard/)
Digital Chisel (http://www.pierian.com/products/authoring_tools/digital_chisel3/dc3.htm)
Asymetrix Toolbook (http://www.asymetrix.com/products/)

SuperLink (http://www.alchemediainc.com/)
Some commercially available electronic portfolio templates use some of these programs.

Multimedia authoring software
Macromedia Authorware (http://www.macromedia.com/software/authorware/)
Macromedia Director (http://www.macromedia.com/software/director/)

Network-compatible hypermedia
HTML Web page editors
Adobe PageMill (http://www.adobe.com/products/pagemill/main.html)
Adobe GoLive (http://www.adobe.com/products/golive/main.html)
FileMaker Home Page (http://www.filemaker.com/products/hp_home.html)
Macromedia Dreamweaver (http://www.macromedia.com/software/dreamweaver/)
Netscape Composer (http://home.netscape.com/communicator/composer/v4.0/)
and many, many more HTML editors

Adobe Acrobat portable document format (PDF) files (http://www.adobe.com/products/acrobat/main.html)

PDF conversion programs
PrintToPDF (http://www.jwwalker.com/pages/pdf.html) (Macintosh only), $20 shareware single user, $300 site license
Win2PDF (http://www.daneprairie.com/) (Windows NT or 2000), free for noncommercial use, $35 single-user license fee
5D PDF Creator (http://www.pdfstore.com/mainpage.asp?webpageid=216&pdfsproductid=1057) (Windows or Mac), $99
PDF Converter (http://www.amyuni.com/pdfpd.htm) (Windows 3.1/95/98/2000/NT), $129 single user
PDF Driver 4.0 (http://www.zeon.com.tw/pdfdrive.htm) (Windows 95/98/NT/2000), $12.50 education version

Other tool software programs
Integrated "Works" programs, especially those that allow creation of slideshows
AppleWorks (http://www.apple.com/appleworks/)
Microsoft PowerPoint (http://www.microsoft.com/office/powerpoint/default.htm)
