
NOTE: This rubric was presented by Allan Gyorke (Penn State) at the 2012 ELI Fall Focus Session, "Developing and Testing a Framework for Analyzing Innovation," with Chris Brooks (University of Minnesota) and Josh Morrill (University of Wisconsin). http://www.educause.edu/eli/events/eli-fall-focus-session/resources/participant-resources/recordings

Name:   Affiliation:   Email address:

UTILITY: How well does this project solve or promise to solve a problem?
KEY TERMS: purpose, usefulness, improvement, function, benefit, problem statement, goal, solution
Scale: -3 (Does Not Solve a Problem: Novelty for its own sake.) | -2 | -1 | 0 (Useful: Solves a specific problem. Does not create problems.) | 1 | 2 | 3 (Creates Problems: Addresses some, but creates more. Unintended consequences.)
Guiding questions: What is the problem this project addresses? How appropriate is the proposed solution to the nature of the problem? What is the need for this project? Why is the problem significant? Why is the problem worth solving? Who will benefit?

CREATIVITY: How does the proposed project do something new?
KEY TERMS: convergent thinking, divergent thinking, revolutionary thinking, novelty, newness, reform, synthesis, transcend, transformation, compelling, imagination, original
Scale: -3 (Slight Modifications: Tweaking existing methods, practices, and approaches.) | -2 | -1 | 0 (Moderate Modifications: Synthesizing existing methods, practices, and approaches.) | 1 | 2 | 3 (Major Modifications: Transcending existing methods, practices, and approaches.)
Guiding questions: How much of a departure from the established methods of doing things does the proposed project make? Does the project build upon existing models to make them better? Does the project combine multiple approaches into something new? Does the project do something for which there is no precedent? To whom is the project new?

EFFICACY: What evidence is there that this solution has been or will be effective?
KEY TERMS: assessment, evaluation, measurement, benchmark, data collection, research design, evidence, literature, innovators, early adopters, incubation
Scale: -3 (No Supporting Evidence: A shot in the dark, a random idea.) | -2 | -1 | 0 (Efficacious Innovation: Evidence that this can be successful, but we'll still learn something.) | 1 | 2 | 3 (Standardized Solution: Boring! We know this will work.)
Guiding questions: For a proposed project, what evidence is there that this approach will be effective? If no evidence exists, what evidence will be collected to show that this will be effective? This may come from a review of published research, results from a pilot implementation, or benchmarking with a similar institution. For an existing project, what are we doing to measure its effect on teaching and learning? How will the results be shared with others?

FEASIBILITY: How realistic is the proposed project within the given context and resources available?
KEY TERMS: sustainability, scalable, viability, product, predictable, certainty, achievable, feasibility study, reasonableness
Scale: -3 (Status Quo: Doable.) | -2 | -1 | 0 (Possible, Realistic within a Range: May require growing pains.) | 1 | 2 | 3 (Impossible or Unrealistic.)
Guiding questions: How does the project address the issues surrounding feasibility? What is the potential viability of the product? What does the roadmap of the product(s) in the project look like? What is the intended life cycle? What does sustainability look like? Does the project over-estimate or under-estimate the costs, resources, and products associated with the ideas and the innovation?

RISK: What is the probability that the proposed project will fail to produce desirable outcomes?
KEY TERMS: cost-benefit analysis, risk-reward, outcome, uncertainty, liability, gamble, potential, gains, estimation
Scale: -3 (0% Risk: Completely safe. Outcomes are perfectly known/certain.) | -2 | -1 | 0 (50% Risk: Reasonable amount of risk. Outcomes are estimated.) | 1 | 2 | 3 (100% Risk: Completely risky. Outcomes are perfectly unknown/uncertain.)
Guiding questions: What evidence is there that a project will fail (or, conversely, succeed) given the stated parameters and expectations? How might the costs and/or liabilities associated with a project outweigh the potential benefits? Does the proposal consider different types of risk (e.g., political, social, financial)? How does the proposal address issues related to uncertainty? How does the project account for the level of support for uncertainty in the relevant context?

RESISTANCE: How will the proposed practice or solution be received and/or adopted?
KEY TERMS: change, disruption, critique, adoption curve, diffusion, dissemination, barriers, skepticism, acceptance
Scale: -3 (Universal, Uncritical Acceptance: Mainstream.) | -2 | -1 | 0 (Reasonable Resistance: Skepticism can be overcome over time.) | 1 | 2 | 3 (Refusal to Accept: No adoption.)
Guiding questions: What is the relative advantage of the solution/project over previous projects? Is the solution or proposed practice easy or hard to understand and use? What are the real and/or perceived barriers to adoption? How compatible is the solution/practice with the norm? In what ways does the solution/practice challenge the accepted norm or practice? How easily can the solution/practice be piloted/trialed or experimented with as it is being adopted? What are the cultural, social, or political impacts of adoption? How much exposure/visibility will the project have? What is the plan for disseminating the results and/or products?

Description of Rubric: Technology Fee Instructional Technology Enhancement Projects (ITEP)

1. Alignment to UWF Strategic Plan: While all proposals should outline an alignment to the UWF strategic plan, the question to answer here is: To what extent does the specific item of alignment directly relate to students? For example, a proposal that aligns with a component of the strategic plan that has little student impact should not be rated as highly as one that has a direct impact on students.

2. Perceived value on enhancement of instructional technology: In evaluating this criterion, ask the question: How is the application/project innovative, and to what extent does the initiative have potential for long-term, sustained benefits on the use of instructional technology, including at least one of the following: (1) enhances student technology access; (2) enhances innovation in teaching and learning; (3) enhances the teaching and learning experience, promoting student engagement, assessment, and success; (4) provides enhanced resources for students and faculty with special needs or disabilities; and/or (5) provides enhanced training for students and faculty for use of technology.

3. Impact on number of students: In evaluating this criterion, ask the question: How will students directly benefit from the proposed technology initiative? The more students who could directly benefit from the initiative, the more highly the proposal should be rated. Implicit in this is the ability of the University to measure the success/impact of the proposed use of the funds. From the description given, do you feel that you will know whether the proposed use of the funds will produce the desired outcome(s)? The greater your confidence in answering yes, the higher the score you should award.

4. Cost/rate of return on investment: A cap on the amount requested in proposals was not included in the call for proposals. However, only about $300,000 is available to fund ITEP projects. Because funding availability is limited, in rating this criterion you should carefully evaluate cost versus benefit. For example, a high-cost proposal that benefits a limited number of students should receive a much lower rating than a low-cost proposal that benefits a similarly limited number of students.

5. Budget adequacy/organization's ability to implement: In evaluating this criterion, ask the question: Does the proposal sufficiently describe the needed resources and the sustainability of the project in light of the requirement that any funding provided will be time-limited? Does the proposal consider scalability and continued operation once funding is no longer available?

6. Perceived value on student engagement, assessment, and success: In evaluating this criterion, ask the question: Will the project enhance student engagement, assessment, and success?

NOTE: This rubric was developed by the University of West Florida to evaluate proposals for technology innovation. General information about the UWF program: http://uwf.edu/academic/techfee/itep/itep.cfm

EVALUATION RUBRIC FOR IPOD/IPAD APPS

Curriculum Connection
4: Skill(s) reinforced are strongly connected to the targeted skill or concept
3: Skill(s) reinforced are related to the targeted skill or concept
2: Skill(s) reinforced are prerequisite or foundation skills for the targeted skill or concept
1: Skill(s) reinforced in the app are not clearly connected to the targeted skill or concept

Authenticity
4: Targeted skills are practiced in an authentic format/problem-based learning environment
3: Some aspects of the app are presented in an authentic learning environment
2: Skills are practiced in a contrived game/simulation format
1: Skills are practiced in a rote or isolated fashion (e.g., flashcards)

Feedback
4: Feedback is specific and results in improved student performance
3: Feedback is specific and results in improved student performance (may include tutorial aids)
2: Feedback is limited to the correctness of student responses and may allow students to try again
1: Feedback is limited to the correctness of student responses

Differentiation
4: App offers complete flexibility to alter settings to meet student needs
3: App offers more than one degree of flexibility to adjust settings to meet student needs
2: App offers limited flexibility to adjust settings to meet student needs (e.g., few levels such as easy, medium, hard)
1: App offers no flexibility to adjust settings to meet student needs (settings cannot be altered)

User Friendliness
4: Students can launch and navigate within the app independently
3: Students need to have the teacher review how to use the app
2: Students need to have the teacher review how to use the app on more than one occasion
1: Students need constant teacher supervision in order to use the app

Student Motivation
4: Students are highly motivated to use the app and select it as their first choice from a selection of related choices of apps
3: Students use the app as directed by the teacher
2: Students view the app as more schoolwork and may be off-task when directed by the teacher to use the app
1: Students avoid the use of the app or complain when use of the app is assigned

Reporting
4: Data is available electronically to the student and teacher as a part of the app
3: Data is available electronically to the student on a summary page and may be a screenshot to share with the teacher
2: Data is available electronically to the student, but is not presented on a single summary page
1: The app does not contain a summary page

Created by Harry Walker, Johns Hopkins University, 10/18/2010. Edited, with permission, by Kathy Schrock, 02/25/2011.

Stakeholder Involvement Rubric / Matrix

Rows (key stakeholder groups): Faculty; Students; IT Staff; Librarians; Administrators; Funders/Pilot Sponsors; (insert additional groups as needed). A final row totals the engagement score by evaluation phase.

Columns (pilot evaluation participation phases): Contributing/reviewing evaluation questions, metrics & success criteria; Participating in data-gathering activities; Reviewing analysis or participating in analysis; Drafting recommendations or findings; Communicating results and advocating for use of results. A final column totals the engagement score by group.

Suggested scores for levels of engagement: High (3), Medium (2), Low (1), None (0).

This rubric is intended to be used in two ways. First, it supports brainstorming and organizing how to engage different stakeholders across the phases of a pilot project evaluation or other program evaluation. Second, each cell can be scored, and the totals across rows and columns can be used to assess whether stakeholder engagement is sufficient within each stakeholder group and within each phase of the pilot evaluation (a minimal worked example of this scoring follows below).
Rubric: Assessing Stakeholder Engagement in Pilot Evaluation. Yvonne Belanger, Duke University, September 27, 2012.
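The scoring step can be made concrete with a short Python sketch. This is a minimal illustration, not part of Belanger's rubric: the stakeholder groups and phase names come from the matrix above, while the individual cell scores (and the choice of Python) are assumptions for demonstration. It sums each row to get an engagement score by group and each column to get an engagement score by evaluation phase.

# Minimal sketch: totaling stakeholder engagement scores by group and by phase.
# Scores use the rubric's suggested levels: High (3), Medium (2), Low (1), None (0).
# The cell values below are hypothetical, for illustration only.

PHASES = [
    "Contributing/reviewing evaluation questions, metrics & success criteria",
    "Participating in data-gathering activities",
    "Reviewing analysis or participating in analysis",
    "Drafting recommendations or findings",
    "Communicating results and advocating for use of results",
]

# One row per stakeholder group; one score per phase (hypothetical values).
scores = {
    "Faculty":                [3, 2, 2, 1, 2],
    "Students":               [1, 3, 0, 0, 1],
    "IT Staff":               [2, 2, 1, 1, 0],
    "Librarians":             [1, 1, 1, 0, 1],
    "Administrators":         [2, 0, 1, 2, 3],
    "Funders/Pilot Sponsors": [1, 0, 0, 1, 2],
}

# Engagement score by group (row totals).
by_group = {group: sum(row) for group, row in scores.items()}

# Engagement score by evaluation phase (column totals).
by_phase = {phase: sum(row[i] for row in scores.values())
            for i, phase in enumerate(PHASES)}

print("Engagement score by group:")
for group, total in by_group.items():
    print(f"  {group}: {total}")

print("Engagement score by evaluation phase:")
for phase, total in by_phase.items():
    print(f"  {phase}: {total}")

In use, low row totals flag stakeholder groups that are under-engaged, and low column totals flag evaluation phases where engagement is thin, which is the same judgment the matrix supports when filled in by hand.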