phase 2 report
January 2007
ISBN 978-1-904158-79-0
First published in Great Britain 2007 by Goldsmiths, University of London, New Cross,
London SE14 6NW.
All rights reserved. No part of this publication may be reproduced in any form or by any
means without the permission of the publishers or the authors concerned.
Additional copies of this publication are available from Department of Design, Goldsmiths,
University of London, New Cross, London SE14 6NW, price £30. Cheques, made payable to
Goldsmiths College, should be sent with the order.
e-scape e-solutions for creative assessment in portfolio environments
Acknowledgements
TERU / Goldsmiths Chloe Nast, Tony Wheeler, Richard Kimbell
Activity administrators Kay Stables, Ruth Wright, Tristram Shepard, Soo Miller, Tony
Wheeler, Richard Kimbell
Assessors / judges Kay Stables, Ruth Wright, Tristram Shepard, Gillian Whitehouse,
Tony Wheeler, Richard Kimbell, Jo Hayes
TAG Learning Will Wharfe, Karim Derrick, Wayne Barry, Andrew Campbell,
Declan Lynch
Awarding Bodies
AQA: Bob Penrose, Steve Healey
Edexcel: Paul Humphries, Gillian Whitehouse, Dale Hinch
Dedicated to Paul Humphries who was instrumental in getting e-scape off the ground
1
e-scape e-solutions for creative assessment in portfolio environments
Contents:
page
executive summary
1. e-scape phase 1
1.1 context 8
1.2 starting points 16
1.3 brief for e-scape phase 1 20
1.4 methodology for e-scape phase 1 21
1.5 findings from e-scape phase 1 25
1.6 specifying e-scape phase 2 29
1.7 emerging research questions for phase 2 31
2. e-scape phase 2
2.1 task trials 32
2.2 system components 33
2.3 system trials (May 06) 37
2.4 training the teachers 38
2.5 the national pilot (June/July 06) 39
2.6 an overview of the activity 41
2.7 the response in schools 44
2.8 the KS2 trial 48
2.9 Teachers TV 52
2.10 the paper test 53
2.11 the e-scape website 54
2.12 an approach to assessment for e-scape 55
2.13 e-scape performance analysis 60
2.14 the response of the judging team 64
2.15 assessing ‘Light Fantastic’ 65
2.16 illustrating performance 66
2.17 findings 82
2.18 issues arising:
concerning the classroom activity 87
concerning assessment 89
concerning the technology 93
2.19 conclusions and next steps 95
references
appendices on CD
executive summary
In 2003, the Technology Education Research Unit at Goldsmiths, University of London, was
asked to undertake research to examine the extent to which - and the ways in which -
innovation and teamwork might be more fully recognised and rewarded in assessment
processes, particularly within GCSE. The project ‘assessing design innovation’ was
launched in Jan 2003 and concluded in Dec 2004.
The principal outcome of that project was a developed portfolio assessment system that sat
somewhere between a formal examination and a piece of coursework. It was designed to
operate in 6 hours - typically 2 mornings - and presented learners with a design task that
was to be taken through to a prototype. The outcomes of learners’ work during this project
were most encouraging. It was possible to demonstrate that different levels of innovation
were identifiable in the work and that the best work was highly innovative. Critically, the
consensus of teachers and learners was that the portfolio system acted as a dynamic force
to drive the activity forward with pace and purpose. The data from the trials of this system is
fully reported in the project report (Kimbell et al 2004).
Assessment for learning has become a major concern of educators. It places the teacher
(rather than any external body) at the heart of the assessment process and presents
teachers with large amounts of personalised-learning information to manage. Within this
emerging field, we see much value in exploring the use of digital systems to support
teachers and learners.
In this digital context, e-learning is a term that has emerged to describe a wide range of
digitally enhanced educational experiences; from a straightforward internet search or the
completion of a simple screen-based multiple choice question, to full blown multimedia
managed learning environments providing access to complete courses. The DfES e-
learning strategy identifies the provision of a centralised e-portfolio as an important priority
for reform, second only to the provision of the infrastructure to make it work.
In the context of design & technology alone, Awarding Bodies are responsible for the
assessment of approx half a million students annually using portfolios in which learners
develop a design solution to a task of their own choosing, simultaneously telling the story of
their development process. Approx 50% of learners’ GCSE marks are allocated on the basis
of the quality of their portfolio. The Awarding Bodies responsible for these assessments –
particularly at GCSE – are increasingly seeking to exploit the power of digital technologies.
The proof of concept operated at four levels: technological, pedagogic, manageable, and
functional. Each of these four ‘proof of concept’ deliverables was explored in schools
through a series of small-scale trials. We explored the system from both ends. At the
classroom activity end, pedagogic priorities and the need to evidence capability dominated
our concerns. But at the assessment end we were required to explore the manageability and
functionality of an e-portfolio for assessment purposes.
Specifically, the activity we were seeking to enhance was the 6-hour ‘light fantastic’ activity
developed for the assessing design innovation project. This activity was capable of
subdivision into a series of component parts, and – for the purposes of exploration with
digital peripherals – we divided the activity into a series of ‘work-parcels’ – some focussed
on supporting learners’ designing and some on supporting teachers’ assessment. We
undertook a series of school-based trials between Jan and May 2005, with learners from
year 6 to year 12. The second area of work concerned the technical systems that would
need to be in place for the learners to be able to develop their solution to the task in a web-
space - accessible to the learners themselves, and their teachers, and (ultimately) to
examination board assessors.
The outcome of this phase 1 proof of concept was a body of digital work from learners and
evidence from the associated e-assessment trials of that work. DfES, QCA and the
Awarding Bodies were persuaded of the concept, and we were invited to take the project to
the next stage.
Whilst the technology partners were working on the system, in TERU we worked on the
classroom activity and the protocols that would render it manageable as a design activity
worked largely through digital means. Whilst we worked from the 6 hr activity structure of the
paper-based version in assessing design innovation, we modified it substantially to
capitalise on the potential that is made available through the PDA. The drawing and writing
tools can replicate paper based
drawing and writing, and the
camera removed the need for a
separate one that we had
previously used. But the speech
st
tool was entirely new. For the 1
time we could capture the authentic
voice of the learner at points
through the activity. And
throughout, we retained the
importance of real materials – as
learners struggle to create their own prototype solutions to the design task.
By May 2006 we had a working e-scape portfolio system, and all the resources needed to
make it operate as an assessment system in schools. We launched the national pilot in 14
schools across the country through June and July 2006 and as a result accumulated 250 e-
portfolios of year 10 learners’ performance in the website. The system worked remarkably
smoothly and we are grateful for all the support and enthusiasm from teachers and learners.
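The actual e-scape system was engineered by TAG Learning and its internals are not documented here. Purely as an illustrative sketch, a real-time portfolio of this kind can be thought of as a time-ordered stream of evidence items (drawings, photos, notes, voice recordings) captured on the hand-held device and accumulated against a learner's record; all class and field names below are hypothetical, not taken from the e-scape system itself:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PortfolioItem:
    """One piece of evidence captured during the activity."""
    learner_id: str
    kind: str            # e.g. "drawing", "photo", "note", "voice"
    payload: bytes       # the captured media
    captured_at: datetime

@dataclass
class Portfolio:
    """A learner's growing, real-time evidence stream."""
    learner_id: str
    items: list = field(default_factory=list)

    def add(self, kind, payload, captured_at=None):
        # stamp each item at capture time so the story of the
        # activity is preserved even if uploads arrive out of order
        ts = captured_at or datetime.now(timezone.utc)
        self.items.append(PortfolioItem(self.learner_id, kind, payload, ts))

    def timeline(self):
        """Items in the order they were captured, regardless of upload order."""
        return sorted(self.items, key=lambda it: it.captured_at)
```

The key design point this sketch illustrates is that the portfolio is assembled from timestamped captures as the activity unfolds, rather than reconstructed by the learner afterwards.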
Making assessments
Assessing web-based portfolios can be done using the same systems as are conventionally
used for assessing paper-based work, by allocating scores for elements within a rubric. But
having all the work in a website opens the possibility of using a quite different approach. In
association with assessment specialists we used an approach of ‘comparative pairs’
judgements that was developed originally in the 1920s. Essentially this involves a judge
looking at two portfolios and deciding (only) which – on the basis of agreed criteria - is the
better / stronger piece of work. Then looking at another pair, and another pair, and another
pair. Many paired judgements are made, enabling each piece of work to be compared to
many other pieces, and the process is undertaken by a team of judges so that each piece is
seen by many different judges (see sect 2.12). The combined effect of all this is two-fold.
First, a rank order emerges: at the top are those that win every comparison, and at the
bottom those that lose every time. In the middle are those that win half the time and lose
half the time.
[Figure: the ‘value’ derived for each portfolio (vertical axis, approx -1 to 3) plotted
against its position in the rank order of the 250 portfolios (horizontal axis, 0 to 250)]
Second, since in our case each portfolio was judged at least 17 times (sometimes
significantly more) and by 7 judges, the judging process renders highly reliable results. The
standard error attaching to the placement of individuals within the rank order is significantly
lower than would be the case in conventional portfolio assessment. (See sect 2.13)
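The report does not specify the statistical model used by the assessment specialists to condense the judgements. As a hedged illustration only, the sketch below uses a lightly regularised Bradley-Terry model, a standard modern descendant of the 1920s comparative-judgement work, to show how many pairwise 'which is better?' decisions can be turned into a single value per portfolio; the function name and regularisation choice are the author's assumptions, not e-scape's method:

```python
import math
import random
from collections import defaultdict

def comparative_pairs_values(comparisons, n_iter=200):
    """Condense pairwise judgements into one value per portfolio.

    comparisons : list of (winner_id, loser_id) tuples, one per judgement.
    Returns dict mapping portfolio id -> value (log-strength, centred on 0).
    """
    wins = defaultdict(int)    # judgements won by each portfolio
    n_pair = defaultdict(int)  # judgements made on each unordered pair
    items = set()
    for winner, loser in comparisons:
        wins[winner] += 1
        n_pair[frozenset((winner, loser))] += 1
        items.update((winner, loser))

    strength = {i: 1.0 for i in items}
    for _ in range(n_iter):  # minorise-maximise iterations
        new = {}
        for i in items:
            # one virtual half-won comparison against a reference opponent
            # of strength 1.0 keeps unbeaten / winless portfolios finite
            denom = 1.0 / (strength[i] + 1.0)
            for j in items:
                if j == i:
                    continue
                n = n_pair[frozenset((i, j))]
                if n:
                    denom += n / (strength[i] + strength[j])
            new[i] = (wins[i] + 0.5) / denom
        strength = new

    # report on a log scale, centred on zero
    mean_log = sum(math.log(s) for s in strength.values()) / len(strength)
    return {i: math.log(s) - mean_log for i, s in strength.items()}
```

Because every portfolio enters many comparisons seen by many judges, the fitted values are highly insensitive to any one judge's decision, which is the source of the low standard errors the report describes.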
The judging process (including training of judges and 3 rounds of judging) was undertaken in
September and October 2006 and the resulting data used for analysis of learners’
performance. Whilst the pilot was principally designed to test the system itself, it was
necessary to test learners within the system and the resulting data has proved interesting
(see sect 2.17) concerning for example the relationships in performance between designing
on paper and designing digitally; between girls and boys performance; and in relation to
different ‘general ability’ groups. Overall however, we were concerned to gauge teachers’
and learners’ reaction to the activity system in schools (see sect 2.7) and the judges’
reaction to the assessment system (see sect 2.14). The power of this dynamic real-time e-
portfolio system is best captured in section 2.16, where we illustrate performance at various
levels using the data (drawings, photos, notes, and sound files) direct from the website.
Conclusions
The successful conclusion of phase 2 of project e-scape raises many issues of importance
for the future of e-learning and e-assessment. We summarise these below in relation to the
four categories of research question that launched project e-scape.
The key point is the infusion of technology into activity. Real-time activity in studios,
workshops, playing fields, theatres, science labs and the like, is typically not aligned with
digital power. That power typically sits in splendid isolation in the shimmering purity of IT
suites. In e-scape we have shown how the technology can get down and dirty and unleash
its digital power where it is really needed. And in the national pilot we demonstrated that it
was manageable.
Next steps
The two major innovations in e-scape have been (i) to create a system in which hand-held
digital tools link dynamically to a website to create portfolios, and (ii) enabling ‘comparative-
pairs’ judging for reliable assessment. These two innovations are at the centre of a new
proposal that is currently being negotiated between DfES, QCA, Becta, Awarding Bodies
and TERU. In this next step we propose to move forward from the prototype system,
creating an authoring tool that allows teachers to create many different kinds of structured
coursework activity – with steps of their choosing and timings for them as best fits their
school. These activities might be in design & technology, but equally might be in geography
or some other curriculum area. At the same time we propose to work alongside Awarding
Bodies to explore how such real time e-portfolios can be seamlessly integrated into the data
management processes that lead to the award of qualifications – and specifically GCSE at
age 16.
Discussions are advanced for this new project – e-scape phase 3 – to run from Feb 2007 to
March 2009, by which time a scalable national system will be operational.
e-scape phase 1
1.1 Context
The story that underpins this project brings together a number of strands of educational
debate:
a. design & technology
b. assessment for learning
c. e-learning
d. e-learning in design & technology
e. portfolios - what they are and what they aren’t
f. e-assessment & Awarding Body innovation
At the time of publication, the DfEE, in concert with the Design & Technology Association
(DATA), established a Strategy Group for design & technology, charged with the task of
steering the subject through the following years. The group
undertook a number of development tasks, including an
externally commissioned review of the literature
concerning the impact of Design & Technology and a
review of new technologies that might be encouraged to
support the growth of design & technology in the
immediate future. One task - undertaken by members of
the group itself - was to review the internal coherence of
design & technology as presented in NC2000, with
particular regard to the match between the vision
statement, the Programmes of Study (PoS) and the
Attainment Target (AT).
It was noted that the vision statement encapsulated the need for creativity, innovation and
teamwork in design & technology.
• 'intervene creatively'
• 'creative problem solvers'
• 'members of a team'
• 'become innovators'
It was also noted that whilst the PoS are less clear on these points, there is at least an
implicit recognition of their importance and the scope or flexibility to interpret these
imperatives into school curricula. However it was noted that the Attainment Target is starkly
bereft of any reference to, or recognition of, these key factors.
Beyond NC requirements, related problems were evident with GCSE assessments, partly
through the syllabus specifications themselves (which lack reference to innovation, creativity
and teamwork), and partly, inadvertently, through the impact of 'league-tables'. Teachers,
departments and schools are now almost as dependent upon the GCSE results as are their
learners, and a typical response in schools is that teachers impose ever-more rigid formulas
on learner project portfolios to guarantee success. The concern of the DfES Strategy Group
was that as GCSE project work portfolios become more formulaic, innovative learners may
be penalised by comparison with well organised, rule-following learners. This has had the
result that - in relation to the design & technology vision statement - the wrong learners (or
at least some of the wrong learners) are rewarded with the best grades in GCSE
assessments.
Black et al (2003) informed this debate with the launch of their book "Assessment for
Learning: putting it into practice”, and the Tomlinson Report “14-19: Extending opportunities,
raising standards” took the debate further. The argument from both is essentially that too
much time, effort and expense is tied up in external assessments and that more attention
should be devoted to the kinds of assessment that are classroom and teacher based, and
that are designed to inform the processes of learning and teaching. Assessment arises in
almost every exchange between teacher and learner, and operates as a feedback device,
informing the teacher of any misunderstandings, or half understandings that stand in the
way of learners’ progress.
Seen in this way, formative assessment, or assessment for learning helps the teacher to
shape their next intervention and it casts the debate on assessment into a personalised form
– customised towards the needs of individual learners and their progress. Phrases such as
‘individualised assessment’ and ‘learner-centred assessment’ thereby become key parts of
the lexicon. As OFSTED has pointed out:
But achieving these benefits increases substantially the requirement for interaction with
learners and presents teachers with a significant increase in the amount of information they
have to manage. Within this emerging field, we believe that there is enormous potential in
the use of digital systems to support teachers and learners with appropriate tools to manage
these rich and complex data. And there is a paradox here that is worth noting. Assessment
reform is being led by groups with limited understanding and experience of digital systems,
whilst digital developments for the classroom are pressed forward by groups with limited
understanding and experience of learning and assessment.
We believe that sympathetically designed digital systems could provide both a framework of
support to better understand these processes of assessment as integral to learning, and at
the same time provide flexible tools to manage and implement this (largely) new emphasis.
c) e-learning
The present government has embarked on a major programme to digitise many of the
activities and services it offers, driven by (among other things) the promise of greater
control, improved efficiencies, cost savings and better standards of service. This focus on
developing new ICT systems straddles many aspects of government from (e.g.) taxation,
registration, legislation, communication, health and education.
These initiatives have developed as largely isolated programmes and we have now reached
a point where it has become clear that there is a pressing need to, and significant additional
benefits to be gained from, joining these systems up. An obvious common denominator to
facilitate a more connected approach is the individual citizen and recent e-government
proposals anticipate binding existing systems together through new bridging services such
as personalised e-learning systems and e-identity cards. E-learning is a term that has
emerged to describe a wide range of digitally enhanced educational experiences; from a
straightforward internet search or the completion of a simple screen-based multiple choice
question, to full blown multimedia managed learning environments providing access to
complete courses.
With the new focus on joining up e-services, e-learning has gained an additional,
longitudinal dimension through the proposal to provide “personal online learning spaces”.
Interestingly, this requirement is identified not just by the DfES but comes as part of an
overarching policy direction from the Prime Minister’s Strategy Unit. In a document entitled
“Connecting the UK: the Digital Strategy”, action 1 is defined as “Transforming Learning with
ICT” and describes the need for everyone to have an electronic portfolio for lifelong learning:
Over time we should see the technology join up better across institutions, so that this is
available to learners to build on wherever they go – to further learning, or to work-based
learning. And in the future it will be more than simply a storage space - a digital site that
is personalised, that remembers what the learner is interested in and suggests relevant
web sites, or alerts them to courses and learning opportunities that fit their needs. We
will encourage all organisations to support a personal online learning space for their
learners that can develop eventually into an electronic portfolio for lifelong learning.
(Prime Minister’s Strategy Unit, 2005)
Developing a similar theme, the DfES e-learning strategy identifies the provision of a
centralised e-portfolio as an important priority for reform, second only to the provision of the
infrastructure to make it work:
Our second priority extends this personalised support to learners, helping with all stages
of education and with progression to the next stage. We will encourage every institution
to offer a personal online learning space to store coursework, course resources, results,
and achievements. We will work towards developing a personal identifier for each
learner, so that education organisations can support an individual’s progression more
effectively. Together, these facilities will become an electronic portfolio, making it simpler
for learners to build their record of achievement throughout their lifelong learning.
(DfES e-strategy 2005)
Moreover, it is not just that learners have access to the technology; they also use it, with
9 out of 10 texting at least once a day and over 25% taking photos daily.
The statistics in this DfES survey suggest that design & technology makes the best use of
ICT when compared to other secondary subjects, and this is reinforced in the OFSTED
report of 2004.
This report goes on to note the range of ICT related activities that are typical in design &
technology.
We note however, that this list – pleasing though it might be – tends to place the focus for
learners’ use of ICT in design & technology onto doing and recording activities: ‘to control’,
‘to simulate’, ‘to manufacture’. There is little here that suggests the ICT is being used
formatively to generate, initiate, stimulate, and develop learners’ ideas. Nor is there much
scope in this list for acknowledging any ICT role in relation to learners’ reflecting, reviewing,
critiquing and evaluating their ideas. These are the designerly, intellectual qualities that lie at
the heart of learner portfolios in design & technology.
As a starting point, we recognise that there are many purposes to which portfolios might be
applied. These have been articulated by IMS Global Learning (developing specifications for
e-learning environments) in the following terms.
• Assessment portfolios
• Presentation portfolios
• Learning portfolios
• Personal development portfolios
• Multiple owner project portfolios
• Working portfolio
(IMS Sept 2004)
For the purposes of this project we believe it would be helpful to clarify our understanding of
what a portfolio is and how it works in design & technology. Whilst d&t portfolios have been
refined over the years and attuned in particular to the priorities of assessment, nonetheless,
the essence of a d&t portfolio involves a mix of what the IMS lists as an assessment
portfolio, a learning portfolio and a working portfolio.
Through custom & practice in design & technology it is possible to observe several forms of
what a portfolio might be.
i. The most common meaning of ‘portfolio’ defines it as something akin to a box-file into
which the learner (or perhaps the learner’s teacher) can place work to demonstrate that
certain operations, or skills, or processes have been experienced. Viewed in assessment
terms, the learner’s portfolio becomes a collection of evidence that is then judged against
some rubric to arrive at a mark or a level. A portfolio of this kind is conceived as little more
than a container for evidence.
Translated into the e-portfolio world, it is possible to conceive of many ways in which the
evidence being ‘contained’ could be enhanced through the application of database or
spreadsheet systems, which might even be designed to automate the process of
containment, standardising, streamlining and potentially removing the need for human
interaction.
ii. A somewhat more sophisticated view of portfolio arises from process-rich areas of the
curriculum, where teachers encourage learners to document the story of a developing
project or experience. This results in learners reporting what they have done at various
points in the process.
In this kind of ‘presenting’ or ‘reporting’ e-portfolio, it is not unusual for learners to use linear
digital presentation technologies - e.g. PowerPoint - to give a blow-by-blow account of
where they have been in the project - and how they finally got to the end.
However, whilst these two accounts might be seen as part of the picture, neither of them
captures the dynamic capability dimension that informs our view of a design & technology
portfolio.
The central problem - in both cases - is that the portfolio construction is conceived as a
second-hand activity. First you do the activity - whatever it is - and then afterwards you
construct a portfolio that somehow documents it. The portfolio is a backward-looking
reflection on the experience.
iii. A third and far richer view of the concept of the portfolio is evidenced in schools
(particularly in design & technology) where teachers have embraced the challenge of linking
learning and working concepts of the portfolio to the more commonplace assessment
portfolio.
In this rich form, the portfolio is transformed into an entity that is integrated into and grows
dynamically with the project - and in the process it shapes and pushes forward the project.
The best analogy is neither a container nor a reported story, but is rather a dialogue. The
designer is having a conversation with him/herself through the medium of the portfolio. So it
has ideas that pop up but may appear to go nowhere - and it has good ideas that emerge
from somewhere and grow into part solutions - and it has thoughts arising from others’
comments and reflections on the ideas. Any of these thoughts and ideas may arise from
procedural prompts that are deliberately located in the activity to lubricate the dialogue.
Looking in on this form of portfolio is closer to looking inside the head of the learner –
revealing more of what they are thinking and feeling, and witnessing the live real-time
struggle to resolve the issues that surround and make up the task. Importantly, this dynamic
version of the portfolio does not place an unreal post-activity burden on learners to
reconstruct a sanitised account of the process. Creative learners are particularly resistant to
what they see as such unnecessary and unconnected tasks, and this significantly accounts
for their underperformance in portfolio assessments that demand such post-hoc story telling.
But real-time dynamic portfolios are not tidy, nor is it possible to present them in a pre-
determined PowerPoint template. They are more like a designer’s sketchbook - full of notes
and jottings, sketches, ideas, thoughts, images, recordings and clippings. These manifestations
are not random - but are tuned to the challenge of resolving the task in hand. And the point
of the portfolio is that the process of working on it shapes and develops the activity and the
emerging solution.
Our three categories of portfolio are somewhat dissimilar to those identified by Ridgway,
McCusker and Pead for Nesta Futurelab in their literature review of e-portfolios.
Whilst their 1st category is the same as ours, their 3rd seems to be little more than an
extension of this – allowing for the repository to contain work selected over time and used –
inter alia - for assessment purposes. It is a container with some display potential.
Furthermore, whilst their 2nd category contains some elements of dialogue potential, it does
not capture the dynamic creative essence of portfolios as we see them operating in design &
technology.
These disagreements demonstrate the thorny territory that is conjured-up merely by the use
of the term e-portfolio. We are very conscious of these issues and it demonstrates the
absolute necessity of being very clear about what is proposed within phase 2 of project e-
scape.
Awarding bodies have faced the challenge of learners using commercial software systems
(particularly CAD/CAM) as part of their product development work, and increasingly
teachers have sought to obtain permission to submit this work digitally. Whatever view one
takes of what the portfolio is, if the work is being done digitally it then seems somewhat
perverse - and inauthentic - to print it all out as though the work had been
done on paper.
Quite apart from the issue of authenticity, there is a practical issue. Awarding Bodies can
see the advantage of submitting such work digitally (e.g. on a disc or via a secure website)
simply because of the reduced labour, resource (e.g. paper) and costs (e.g. postage)
involved.
As regulator of the activities of Awarding Bodies, the Qualifications and Curriculum Authority
(QCA) developed its own strategy for addressing e-assessment. In 2004, QCA’s 5-year
objectives were as shown below, though it should be noted that, as events have turned out,
this has proved an over-optimistic schedule:
by 2009:
• all new qualifications should include an option for on-screen assessment
• all awarding bodies should be set up to accept and assess e-portfolios
• most GCSEs, AS and A2 examinations should be available on-screen
• National Curriculum Tests should be available on-screen
• on-demand assessments will begin to be a feature of GCSEs
• 10 new qualifications, designed for electronic delivery and assessment, should be
developed, accredited and live
by 2005
• Field trials successfully completed by awarding bodies in at least two subjects
• 75% of basic and key skills tests delivered on-screen
by 2006
• A code of practice, plus audit and other regulatory criteria, is developed
• AQA, OCR and Edexcel offer live GCSE exams in two subjects each
• Pilot of at least one qualification, specifically designed for e-assessment
by 2007
• 10% of GCSE examinations administered on-screen
by 2008
• On-demand testing introduced for GCSEs in at least two subjects
by 2009
• e-assessment becomes increasingly routine
(QCA 2004)
Beyond QCA and the Awarding Bodies however, it should be noted that the importance of e-
portfolios within this strategy has been underlined by OFSTED in their recommendation
concerning the development of ICT in schools. They make clear that at the school level
there is a need to:
“develop electronic portfolios of learners’ work alongside the use of web- or intranet-
based applications that enable assessed work to be easily accessed by teachers,
learners and parents”
(OFSTED 2004 [ii])
The Strategy Group recommended that research be undertaken to examine the extent to
which - and the ways in which - innovation and teamwork might be more fully recognised
and rewarded in assessment processes, particularly within GCSE. The Technology
Education Research Unit (TERU) at Goldsmiths was asked to undertake the work and
develop a system of assessment that would measure and reward design innovators. The
project was launched in Jan 2003 and concluded in Dec 2004. The thrust of our work arising
from this brief has been to reinvigorate a view of portfolio assessment that transforms it back
into dynamic dialogue mode.
The principal outcome of the project was a developed portfolio assessment system that sat
somewhere between a formal examination and a piece of coursework. It was designed to
operate in 6 hours - typically 2 mornings - and presented learners with a design task that
was to be taken through to a prototype.
The following structure is characteristic of the activities developed. The task ('light fantastic')
centres on re-design of a light-bulb packaging box, so that, once the bulb is taken out for
use, the package/box can be transformed into a lighting feature - either by itself or in
association with other 'liberated' light-bulb package/boxes.
(i) read the task to the group and (through brief Q&A) establish what is involved
(ii) explore a series of 'idea-objects' on an 'inspiration table' and in a handling collection
designed to promote ideas about how boxes / packages / containers might transform
into other forms and functions.
(iii) put down first ideas in a designated box in the booklet
(iv) working in groups of 3, learners swap their booklets and each team-mate adds ideas
to the original
(v) team-mates swap again so that each team member has the ideas of the other two
members
(vi) booklets return to their 'owner' and team members discuss the ideas generated
(vii) the teacher introduces the modelling/resource kit that can be used throughout the 2
mornings
(viii) learners develop their ideas in the booklet - and/or through modelling with the
resources
(ix) learners stop to reflect on the user of the end product and on the
context of use, before continuing with development
(x) at intervals, learners are asked to pause and roll a die - with questions on each
face. The questions focus on procedural understanding e.g. “how would your ideas
change if you had to make 100?” and learners answer the questions in their booklet
(xi) photographs are used at approx 1 hr intervals to develop a visual story line to
illustrate the evolution of models & prototypes
(xii) at the end of the 1st morning, learners - and their team members - reflect on the
strengths and weaknesses of their evolving ideas
(xiii) the 2nd morning starts with a celebration of the work emerging
from day 1. This is based on post-it labels that highlight
learners' thoughts about the qualities in their ideas
(xiv) further prototype development
(xv) regular hourly photos and pauses for reflective thought on
strengths and weaknesses
(xvi) final team reflections, when (in turn) team members review
each others' ideas and progress
(xvii) individually, learners then 'fast-forward' their idea illustrating
what the product will look like when completely finished and
set-up in context
(xviii) learners finally review their work from start to finish.
All the learners’ work was structured into an A4 workbook that folded out to become an A2
sheet. The activity was designed to be administered by teachers in ordinary design &
technology facilities. The workbooks were carefully designed to unfold throughout the
activity, ensuring that learners always had sight of the instructions for the sub task they were
currently working on at the same time as being able to see the work they had just
completed.
The illustrations below show two learner booklets. The photo story lines demonstrate the
progress of the ideas from inception to final prototype. In the 1st case it is clear that the
strength of this idea emerges predominantly through the medium of 3D modelling. The 2nd
case however illustrates a learner who is equally comfortable with developing ideas through
drawings and with 3D modelling. In both cases, the booklet allows us to ‘see’ their very
different modes of working in operation.
The concept underlying the 1st piece of work was named by
the learner ‘your name in lights’, and was for light-bulb
packaging to become pentagonal and tapering, allowing
‘used’ boxes to build into a spherical lighting feature that –
when illuminated by a light at the centre - projected letters
around the wall. In this case the learner developed his
prototype using a combination of graphic modelling and 3D
modelling, supported by a considerable amount of reflective
comments and critique.
In the process of working on this project, we were able to identify other features of the
portfolio – or of the setting within which it works – that significantly impact on its
effectiveness. The key one is the learning and teaching culture created by the teacher in the
workshops and studios in which learners operate. This culture in turn influences each of the
following features:
• motivation
For learners to be fully engaged and performing at their best requires levels of motivation
that – in design & technology at GCSE level – must be maintained over an extended period
(typically 6 months). Our 6-hour activity was equally dependent upon generating
enthusiasm for the task and we used a number of techniques to generate and maintain it.
• ownership
Who is the portfolio seen to belong to? Is it the learner’s, or the teacher’s, or the
department’s, or the GCSE Awarding Body’s? Learners’ sense of ownership of the work is
typically a pre-requisite for fully engaged performance.
• environment
For dynamic creative work to be generated by learners, the environment must be one in
which the working atmosphere is conducive to those values. In terms of our project, this
required teachers to be open not just to learners’ ideas but also very flexible in how they
encouraged learners to express and develop them.
• ideas
At the heart of dynamic creative portfolios are ideas. We were explicit in encouraging
learners to have ideas, grow their ideas and prove their ideas. Equally we encouraged
teachers to facilitate these features of learners’ performance.
Each of these four will be seen to have an e-equivalent within the e-scape project.
digital enhancement
It was during the development of the activities for this previous project (assessing design
innovation) that we became aware of the potential for digital enrichment of the activity.
Learners increasingly use digital technologies as part of their work in design & technology.
They use digital photography to record their designing and manufacturing processes. They
increasingly use the internet for information searches; computer aided design (CAD)
systems for design development work; and - in some cases - this extends to computer aided
manufacturing (CAM). Also, they increasingly access, complete and store their work on
school networks and intranets that allow access from their home computers. This extends
the working environment beyond school workshops and studios and allows them time-
unlimited access to their work. It also broadens the tool set that is available to them to
envision, manipulate and develop their ideas, and in the process it raises important cultural
issues associated with the origins of ideas, the ownership of work, team-work and
plagiarism.
These thoughts led us to develop a proposal to QCA/DfES for a digital approach to portfolio
assessment. Learning activities in design & technology studios and workshops are
increasingly influenced by digital technology, and the portfolio assessment system that we
had developed in the previous project “Assessing Design Innovation” provided a useful
model to explore the possibilities of extending digital working in design & technology into
digital assessment of learners’ performance.
This development involved introducing new technologies into the classroom, as well as
extending the range of existing technologies into the domain of assessment. The extension
of these digital technologies into the realm of assessment will have significant impacts
on current approaches to teaching and learning. We are absolutely committed to
undertaking these developments without compromise to the underlying concepts of design &
technology as expressed in the ‘importance of design & technology’ statement in Curriculum
2000. Indeed we believe that the work may contribute to taking forward our collective
understanding of the power of design & technology as a learning vehicle.
We start from assumptions about the nature of design & technology – the circumstances of
which are almost always workshops and studios. Two of the constants of these typical
design & technology spaces are that
• they are full of materials, apparatus, machinery, and specialist work-spaces
• they are associated with the detritus of manufacturing
They therefore make challenging locations for computers, keyboards and screens. First,
there is not enough space; second, the space is not clean (glue, paint, flour & water,
sawdust); and third, learners themselves get oily or painty or gluey or floury fingers that are
not then ideally suited to keyboard use.
For all these reasons we do not believe that digital enhancement of the designing activity
will involve computers, keyboards and screens. At least we do not believe that these tools
will be at the leading edge of activity. Rather we think that peripheral, back-pocket
technologies will be more appropriate: mini digital cameras, digital pens, digital PDAs.
At least at the ‘input’ level these technologies enable activities in workshops and studios to
go ahead almost as normal. The computing power does not take up too much space and
(because they can be pocketed) the devices are not too sensitive to the clutter of the working
space. Our trials in schools showed that the use of hand-held technology was indeed
manageable and effective.
Interestingly, learners at KS4 now (almost universally) have access to mobile phones, a
significant proportion of which have digital cameras as a built-in feature. As the telecoms
companies race to differentiate their systems through enhanced features, the current
distinction between handheld PDAs and mobile handsets is disappearing as the two
previously unconnected technology strands merge. While ‘smart’ phones, with all the
features of a PDA, are currently not marketed to learners, camera phones are becoming
more ubiquitous and other ‘smart’ features will increasingly work their way onto phones for
children. This trend will be all the quicker if it is seen (or marketed) as providing valuable
tools for learning, thereby justifying additional parental expenditure.
In short, we are witnessing the growth of 3rd generation computing. Mainframe computer
technologies of the 1960s and 70s gradually faded with the emergence of 2nd generation
‘desktop’ computers. These completely transformed our working relationship with computers
– providing us with far greater interactivity, apparently unmediated by the programmers
whose services had formerly been essential. We could ‘drive’ our own 2nd generation
computers in the 1980s and 90s. As the technologies shrank, the growth of laptop
computers, particularly in the final decade of the 20th century, did not materially change our
relationship to computers. They operated merely as slightly (very slightly) more mobile
versions of the desktop. But the new 3rd generation of computers is radically different. They
are FAR more mobile, are equally powerful, and can now genuinely be regarded as ‘back-
pocket’ computers. As such, they are in the process of transforming – once again - our
working relationship with computers. The transition to 3rd generation mobile technologies will
be just as dramatic as was the transition from the 1st to the 2nd generation. In the contexts of
learning, teaching, curriculum and schools, these transformations will be profound. We
believe that the e-scape project will provide us with many insights into the educational
implications of this 3rd generation.
Phase 1 of the project (Nov 04-Jun 05) has been - in several senses - a “proof of concept”
phase, to explore the feasibility of the concept outlined above. This proof of concept
operates at four levels:
i) technological
Concerning the extent to which existing technologies can be adapted for assessment
purposes within the portfolio system as currently designed for the DfES “Assessing Design
Innovation” project. This will include the applicability of other international work in this area
and of any relevant system standards.
ii) pedagogic
Concerning the extent to which the use (for assessment purposes) of such a system can
support and enrich the learning experience of design & technology
iii) manageable
Concerning issues of making such assessments do-able in ‘normal’ d&t classrooms /
studios / workshops
• the training / cpd implications for teachers and schools
• the scalability of the system (including security issues) for national implementation
iv) functional
Concerning the factors that an assessment system based on such technologies needs to
address:
• the reliability & validity of assessments in this form
• the comparability of data from such e-assessments in d&t with non e-assessments
Each of these four ‘proof of concept’ deliverables was explored in schools through a series
of small-scale trials. The research report (Kimbell et al 2004) – covering the four ‘proof of
concept’ factors – was the required ‘deliverable’ for phase 1 of the e-scape project. We
summarise in section 1.5 of this report the main findings from phase 1 and then translate
those findings into a detailed specification of what a working system might be like. This
specification – in section 1.6 of this report – then becomes our working template for
developing the prototype in phase 2 of project e-scape.
The second area of work concerned the technical systems that would need to be in place for
the learners to be able to develop their solution to the task in a webspace - accessible to
the learners themselves, and their teachers, and (ultimately) to examination board
assessors.
e-scape work-parcels
As explained above in 2(c), we intended to start our explorations with a range of
‘peripheral’ digital technologies – typically hand-held – that we might use to enhance the
designing activity.
Specifically, the activity we were seeking to enhance was the 6-hour ‘light fantastic’ activity
developed for the assessing design innovation project.
This activity was capable of subdivision into a series of component parts, and – for the
purposes of exploration with digital peripherals – we divided the activity into the following
‘work-parcels’.
These work-parcels were developed iteratively. Initially we worked with a new technology -
and sometimes with the supplier of a new technology - until we had developed it to the point
where we felt it might be useful to support learners’ designing. At that point we arranged a
school trial - often just so we could see what happened. We were frequently unsure about
what learners would do with the products and systems, and we were continually astonished
at their ability to assimilate the new technologies and make purposeful use of them. We
outline some of these experiences in section (b) below.
The second area of work - to support teachers’ assessment – was also developed into a
series of work-parcels.
The challenge here was somewhat different, and therefore our methodology was different.
We did not focus these work-parcels towards school trials, in part because schools are just
not equipped with the technology systems to do what needs doing. Our approach here was
to engage in a series of meetings with leading-edge systems developers – and to a lesser
extent Awarding Bodies – to discuss the possibilities for developing systems that might be
able to achieve what we increasingly saw as necessary. These discussions are outlined in
section (c) below.
• to explore the impact with mature users – albeit that they were the youngest learners to try
the activity.
In addition to undertaking the formal activity trial with the year 12 AS group at Saltash, we
also provided the group with a PDA each from Easter to the end of their AS project work.
The purpose of the exercise from our point of view was to see what happened when
learners have regular and free access to the technology. The assumption was that they
would move beyond regarding them as toys – to be experimented with – and begin to use
them more naturally as tools to support their designing. We also encouraged them to
explore the value of using them in their other curriculum subjects and in their extra-curricular
activities.
Each learner produced a report for us, focusing on their use of the PDA over this extended
period.
While there was no system at that time that offered the dynamic integration and presentation
features we required for this project, there were a number of e-portfolio platforms that
provided the core data management systems necessary to drive the system we envisaged.
TAG Learning had significant experience in two of the key technology aspects of this
project:
• handheld digital peripheral devices
• web based, contributory, moderated, portfolio assessment systems.
Qinetic had a useful Show-n-talk system, operated the SIMS assessment manager and a
system of portable electronic school registration.
Extended Systems operated the OneBridge system, and with the Social Services had
developed a remote voice-to-text system based on web-enabled, hand-held (Zire) machines
linked to a remote server using DragonTalk.
As part of these developments, we also worked with Dudley LEA who operated hand-held
systems (Zire) linked to Show-n-talk and OneBridge, and these discussions were helpful
particularly in relation to scalable implementation issues. Finally, we had meetings with
Awarding Bodies to explore with them the technologies that they see as emerging in their
systems.
The central feature of our requirements in relation to any system that we might adopt was
connectivity: the capability to beam data automatically from classroom-based, hand-held
technologies into pre-designated web-spaces. We explored several possibilities (e.g. using
USB, IR [infra-red], Bluetooth and Wireless systems) and identified our priorities for phase 2.
These connectivity and presentation tools needed an additional feasibility study to resolve
outstanding technical issues, and we undertook this during the summer of 2005, as an
extension to phase 1 of the project.
The work outlined in section 1.4 resulted in a set of findings reported here under the four
headings from the brief:
• technological findings
• pedagogic findings
• manageability findings
• functional findings
i) technological findings
Concerning the extent to which existing technologies can be adapted for assessment
purposes within the portfolio system as currently designed for the DfES “Assessing Design
Innovation” project. This will include the applicability of other international work in this area
and of any relevant system standards.
digital cameras
We explored the potential of a number of different types of camera:
• Kodak EasyShare system (single function)
• Mobile Phone cameras (multiple function)
• PDA with built in camera (multiple function)
Our initial trials reflected the Mori survey findings that many learners already had digital
cameras in their mobile phones. In this first phase learners reported that these cameras
were typically of a low resolution and not capable of capturing sufficient detail in close up to
adequately record their modelling. By phase 2 of the project learners’ attitudes had changed
and some commented that the cameras in their phone were better than those we provided
in the PDA. Also they were much more familiar with transferring and sharing data from their
phones using Bluetooth.
Putting learners in control of the recording process freed teachers to concentrate on other
aspects of facilitating the assessment task. It also engaged learners in the important
process of selecting appropriate evidence from the wide range of photographic material they
were able to collect. Combining the camera with the integrated features of a PDA meant that
it was also possible to annotate and sketch over photos to convey additional meaning.
digital pens
We explored the potential of the following types of digital pen:
• Logitech iO (V1.0 and V2.0)
• Nokia (bluetooth)
Beyond the technical management of the hardware there was also a software requirement
to provide automatic systems that securely collected, collated and managed the data from
the pens, tagging and formatting it and sending it to individual learner web workspaces. In
addition we considered a range of other enhancements to the basic digital pen that would
make them more suitable as d&t assessment tools, for example:
• fingerprint recognition system
• interchangeable mark-making tools
• integral microphone, camera and printer
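The collect-collate-tag-forward pipeline described above can be sketched in outline. This is a minimal illustration only, not the system that was actually built by our technology partners; the learner IDs, the file-naming scheme and the workspace paths are all hypothetical.

```python
import re
from datetime import datetime, timezone

def tag_pen_file(raw_name: str, learner_id: str) -> dict:
    """Attach a learner ID and a timestamp to a raw pen-capture file,
    producing a record ready to forward to that learner's web workspace.
    (Illustrative sketch: naming scheme and paths are hypothetical.)"""
    # Normalise the raw file name so it is safe to use in a URL or path.
    stem = re.sub(r"[^A-Za-z0-9_-]", "_", raw_name.rsplit(".", 1)[0])
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    return {
        "learner": learner_id,
        "filename": f"{learner_id}_{stamp}_{stem}.png",  # hypothetical scheme
        "workspace": f"/workspaces/{learner_id}/pen/",   # hypothetical path
    }

record = tag_pen_file("page 01.pgc", "L0042")
```

The point of the sketch is simply that each captured page must carry enough metadata (who, when, where it belongs) to be filed automatically into the right learner's web workspace without teacher intervention.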
All the speech-to-text systems we trialled were too demanding for the technology we were
likely to have in place for phase 2. The team noted that it would be at least 1 to 2 years
before the processing power of a PDA was sufficient to handle natural language in this way.
We also explored the potential of creating a conversational ‘chatterbot’ to support learners
working on the 6hr design task.
Both devices offered similar functionality and, largely for reasons of cost, we chose to pilot
with the Palm Zire 72. We had not intended (or budgeted) to commission any software
development at this stage of the project; however, in order to judge the ability of the small
Data transfer from the PDA to a secure web based system was a critical aspect of the
system and we explored a number of routes to achieve this. Initially we explored the
systems provided with the Palm, including USB, IrDA, Bluetooth, SD card and wireless.
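Whichever transport is used, the underlying pattern is the same: work is buffered on the handheld and handed off in batches when a connection is available. The sketch below illustrates that pattern only; the class and method names are hypothetical, and the real e-scape transfer layer (built by the technology partners) is not reproduced here.

```python
class SyncQueue:
    """Buffer work items on the handheld and hand them off in one batch
    when a connection (USB, IrDA, Bluetooth, wireless...) is available.
    Illustrative sketch only - names and structure are hypothetical."""

    def __init__(self, learner_id: str):
        self.learner_id = learner_id
        self.pending: list[dict] = []

    def capture(self, kind: str, payload: bytes) -> None:
        # Queue a drawing, photo or sound clip for later transfer.
        self.pending.append({"kind": kind, "bytes": len(payload)})

    def flush(self, send) -> int:
        # `send` stands in for whichever transport is currently available.
        batch = {"learner": self.learner_id, "items": self.pending}
        send(batch)
        sent_count = len(self.pending)
        self.pending = []  # queue is cleared only after hand-off
        return sent_count

q = SyncQueue("L0042")
q.capture("photo", b"...jpeg bytes...")
q.capture("drawing", b"...ink data...")
outbox = []
count = q.flush(outbox.append)
```

Buffering locally and clearing the queue only after hand-off is one simple way to address the data-loss concern raised later in this report: if the connection drops mid-activity, the work remains on the device.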
The market for digital equipment moves very quickly. In the time leading up to Phase 2 of
the project Palm stopped making the Zire 72 PDA. We worked with our technical partners
and selected an alternative devise, the HP iPaq RX3175 which although more expensive
had a better specification (particularly the bundled software packages). This product was
also withdrawn before we had secured funding for phase 2 and we were forced to make a
speculative purchase to ensure we had the technology we required for the phase 2 pilot.
We should note at this point that whilst the phase 2 pilot went ahead with PDAs as the basic
data entry tool, it would have been possible to use alternative devices e.g. laptops, tablets,
digital cameras etc. The choice of PDAs was driven by a combination of concerns,
principally:
• the need for the hardware to be right inside the action, typically in the d&t workshop
• the necessity of having one machine for each learner throughout the activity
• the need for learners to respond in many ways (drawing, writing, with photos and speech)
• the related (organisational) need not to have many different bits of technology
• the need not to dominate the workspace with bulky hardware
• the need for it to be robust in challenging workshop environments
• the need for it to be as cheap as possible.
So, whilst we recognise that future scenarios for e-portfolio creation may involve many types
of hardware our choice for phase 2 was the PDA. However, as we point out later in this
report, the ambition would be that the e-portfolio system should be agnostic about whatever
hardware is used to create the work in the classroom/studio/workshop.
display technologies
Assessing design & technology capability on-screen from portfolios of digital evidence was a
new endeavour. From analysis of the assessment processes carried out with ‘real’ scripts by
markers on previous projects, the two key display functions appeared to be ‘comparison’
and ‘scale’. While there were no bespoke assessment systems available to achieve this, the
team considered a range of display technologies that could help to augment the assessment
process.
In this area, the findings from the phase 1 trials were quite unequivocal. It was clear that the
use of peripheral digital tools offered the opportunity for considerable enhancement of the
teaching and learning environment. The following examples illustrate some of this potential:
• the use of the PDA as a device on which preliminary design ideas can be generated was
extended hugely by the potential for ‘beaming’ work between learners.
• the PDA enabled learners to build a digital scrapbook to enrich projects that were
underway.
• the facility to take regular photos of emerging models and prototypes was welcomed by
learners and teachers alike.
• the design talk feature was widely welcomed by teachers who were keen to accentuate
the peer-reflection / peer-review potential of the system.
The issues of manageability were at the forefront of our thinking and these phase 1 trials
indicated a number of important considerations:
• was the hardware/software manageable for learners?
• did the hardware raise theft, damage and loss issues?
• was the activity manageable on hand-held technologies?
• could teachers manage the activity?
• was the outcome of learners’ work assessable?
• was the system scalable for national assessment purposes?
We were also concerned to establish how familiar learners needed to be with the kit and
how much training teachers needed to be comfortable with the system.
The three big issues here were – validity, reliability, and comparability – and from the phase
1 trials we were in a position to comment principally on the first.
Validity takes several forms, but for the purposes of our work may be summarised as the
extent to which the activity, as we have designed it, represents ‘good’ design & technology.
First, the activity is a direct development from the booklet-based activity that was initially
developed in the previous project; ‘assessing design innovation’. The assessment activities
that were devised in that project originated with the research team but were then shared
with the principal subject officers of the four Awarding Bodies. These Bodies were
sufficiently impressed with the work to each ‘volunteer’ two of their principal design &
technology moderators to work with us in developing more activities of the same kind.
These then formed the basis of the extended pilot, and were warmly received in the 8
schools in which they were administered.
Second, the e-scape activities have been developed directly from these former activities.
We conducted one trial of the 6hr kind that was a complete reflection of the previous project
(even using one of the same tasks), and we additionally conducted a series of shorter trial
activities to test parts of the process. All of these trials were done in booklets that – whilst
being modifications of the original – were recognisably the same format and had a number
of identical features. There is prima facie evidence that the resulting activities might be
regarded as good models of design & technology merely through association with their
predecessors.
However, the third arm of the validity case lies with the teachers who have undertaken these
e-scape activities. These teachers have been clear in their view that whilst the tools have
changed (from analogue to digital) and that this has resulted in a very different form of
representation of designing, the activity itself remains true to design & technology.
The purpose of the e-scape phase 2 prototype is to create a system where the individual
components, explored in principle in phase 1 of the project (and judged to support d&t
capability and make it available for assessment) are built into a working prototype. The
elements must work together sufficiently well in the field to ensure we can put enough
learners through the system to collect sufficient data to answer key research questions for
the various stakeholders, such that they are confident to move forward to a commercial,
scalable implementation in phase 3.
Throughout this process, data integrity and security was a particularly important issue.
Specifically we needed to minimise the possibility of data loss during the 6hr activity where
learners were working on potentially volatile handheld systems. We also needed to ensure
that the interfaces we developed (both on and between the technologies) were intuitive and
easy for teachers (and learners) to understand and operate. We used the outline
specification below to guide our development.
In preparation for phase 2 we worked closely with our technology partners TAG Learning
and Handheld Learning to develop the specification for a working system. While we had a
clear idea of what we wanted the e-scape system to do in principle we needed lots of help to
convert this into a working prototype that functioned technically as well as pedagogically.
We are grateful for the support of our two technology partners without whom the project
would not have been possible.
TAG Learning has a long and successful history of working in partnership with a range of
organisations to develop innovative web based portfolio systems across a wide range of
active learning and assessment contexts. Their MAPS e-portfolio assessment system
provided the foundation for the research team to imagine and model how the web
components of the e-scape system might function. TAG developers worked closely with the
e-scape team during the first phase of the project and through an iterative specification
process modified the components of their core system to deliver the functionality we
required.
Handheld Learning are the leading organisation promoting mobile technology in education in
the UK. As well as developing PDA hardware and software they have initiated and
supported a wide range of projects in schools and local authorities. In the first phase of the
project Handheld Learning developed a virtual handling collection for the Palm PDA that
presented prompts to encourage learners to think differently about the objects presented.
Handheld Learning developers then worked closely with the e-scape team to convert the
stages of the paper-based activity into a format that could be delivered within the confines of
a PDA screen.
The development process was further complicated by the need for the web and PDA
systems to talk seamlessly to each other throughout the activity. TAG and Handheld
Learning collaborated closely to make this happen, sharing protocols and aspects of their
individual systems in a way that transcended normal commercial expectation, and that
guaranteed the system worked.
The system modelled as closely as possible what had been shown to work for design &
technology in the paper-based environment. No changes were made for purely technical
reasons. The changes that were made were for reasons of pedagogy or manageability. As
far as possible we avoided making demands on school ICT systems, over which we had
little or no control.
Standards
The e-learning, e-portfolio, e-assessment territory is informed by many sets of technical
standards (some international) that seek to ‘standardise’ that territory in terms of systems
input, user protocols and/or output processes. Wherever possible and appropriate
we will take due note of these standards. However it should be noted that the challenge for
the e-scape phase 2 prototype was to create something that did not exist anywhere in the
world, and accordingly there were no standards that entirely circumscribed our work. The e-
scape project will inform international standards as much as be led by them.
The development of the prototype system in phase 2 of the e-scape project was based on
the specification outlined in section 1.6 above, and following its development, a school pilot
was conducted to explore the efficacy and the effects of the e-scape, e-portfolio, e-
assessment system. A number of research questions informed this work.
However, the process of developing the prototype was also informed by pedagogic,
manageability and functional assessment questions, for example:
pedagogic: how will the construction and appearance of the virtual booklet impact upon the
questions and sub-activities that need to be built into the activity? How is the designing
activity changed by the system? What backwash effects would teachers anticipate in
relation to KS3 practices?
manageability: how often will the PDA need to be synced to the web-space? How long
does the process take and can a class of (say) 24 learners manage this process
simultaneously? How do-able is the digital activity in normal studios/workshops? How much
cpd/training do teachers need to prepare for this mode of assessment?
functional: how does the assessment process change when viewing the virtual booklet in
the web-site as opposed to real paper-based booklets? Does the system enable valid and
reliable assessments?
The work of phase 1 is more fully described and analysed in the phase 1 research report
(see App 4.1).
e-scape phase 2
Following a series of negotiations with DfES, QCA, Awarding Bodies and TERU, phase 2 of
the e-scape project was launched in November 2005 and ran to the end of January 2007.
The story of phase 2 is told through the following sections 2.1 to 2.19.
The e-scape task, entitled ‘the pill organiser project’, involves learners in a product design
activity, developing a container/dispenser for pills. Learners have to identify the user group
(maybe a 6yr old on a school trip, an activity-sports enthusiast or an elderly lady living
alone) and think about all the issues involved:
how many pills?
taken how often?
how to remember to take them?
how to keep them secure?
how to make the container/dispenser desirable? etc.
Learners develop their design solutions using the basic approach developed in our previous
DfES/QCA project ‘assessing design innovation’. The activity runs for 6 hours through 2
mornings in schools, and includes a significant amount of ‘soft material’ modelling activity to
support design development.
• ‘client’ or ‘user’ cards – profiling particular users and their pill requirements
• the central ‘inspiration’ collection
• the central modelling kit
Additionally the activity is managed through the administrator ‘script’ and the e-scape
application on the PDA.
All of these had to be trialled to make sure that the task would provide opportunities for
learners to do their best and most imaginative designing, and to check that it was do-able
across the specialist material territories of d&t.
Throughout this process we developed not only the test activity and script, but also the
resource lists. It is interesting how the availability of particular resources influences the
emerging designs. At Bulmershe, despite some interesting ideas emerging, there was a
‘boxy’ feel to much of the work. When (in later trials) we supplied more fabrics and
(particularly) some plasticine, the variety of responses blossomed. We concluded that:
• sheet materials (paper/card) best enable ‘boxy’ forms
• strip materials (dowel rod/straws/wire) best enable skeletal forms
• fluid materials (plasticine/clay) best enable organic forms
• and also that textile materials (fabrics) often link to and operate across these types
By the end of the task-trialling we were confident that we had a task (and a set of resources)
that learners could have a good run at, and show us what they could do.
At this point the box switches from edit mode to view mode, so
the learners can review what they have done but not change
it. This means that everything learners add to the PDA is
time-stamped and, unlike the paper version of the test where we
had to juggle with different coloured pens to establish when
something happened, we can be sure that material in box 1 was produced in the first 5
minutes of the activity and not added later. The ‘fast back’ facility was added at the very end
of the task to allow learners to go back and review everything they had done and add digital
post-it review notes of what they might have done differently.
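The edit-to-view locking described above can be sketched as a small data model: each box time-stamps entries on arrival and refuses further entries once its slot ends. This is an illustrative sketch only; the class and method names are our own assumptions, not those of the actual PDA software:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Box:
    """One timed sub-task 'box'; every entry is time-stamped on arrival."""
    number: int
    entries: list = field(default_factory=list)   # (timestamp, content) pairs
    locked: bool = False                          # False = edit mode, True = view mode

    def add(self, content: str) -> None:
        if self.locked:
            raise PermissionError(f"box {self.number} is in view mode")
        self.entries.append((datetime.now(timezone.utc), content))

    def lock(self) -> None:
        """Switch the box from edit mode to view mode when its slot ends."""
        self.locked = True

# Usage: box 1 accepts work during its 5-minute slot, then locks,
# so nothing can be added to it later in the activity.
box1 = Box(number=1)
box1.add("first design sketch")
box1.lock()
try:
    box1.add("late addition")
except PermissionError:
    print("box 1 is read-only")   # prints: box 1 is read-only
```

Because every entry carries its own timestamp, the provenance question the coloured pens answered on paper (when was this added?) is answered automatically.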
This primary system involves linking the laptop to an internet-connected computer so that all
data is uploaded directly into a secure web-space. Throughout the activity the portable
server system ‘sniffs’ for an external wireless connection to the internet (it could also be
connected through a cable system). If it detects one and makes a connection, the locally
cached material can be instantly uploaded to the web-based portfolio system.
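The ‘sniffing’ behaviour described above is essentially opportunistic synchronisation: work is cached locally and flushed upstream whenever a route to the internet appears. The sketch below uses hypothetical function names and a hypothetical item format; it is not the TAG / Handheld Learning code:

```python
import queue

def has_connection() -> bool:
    """Placeholder for the 'sniff': the real system probed for a wireless
    (or cabled) route to the internet. Here we assume a link exists."""
    return True

def sync_cache(cache: queue.Queue, upload) -> int:
    """Drain locally cached portfolio items upstream while a connection
    holds; anything left stays cached for the next attempt."""
    sent = 0
    while not cache.empty():
        if not has_connection():
            break                      # no link: keep the rest cached
        upload(cache.get())            # push one item to the web-space
        sent += 1
    return sent

# Usage: two cached sub-task entries are flushed once a link is detected.
cache = queue.Queue()
cache.put({"box": 1, "content": "sketch"})
cache.put({"box": 2, "content": "voice memo"})
uploaded = []
print(sync_cache(cache, uploaded.append))   # prints: 2
```

The design means a school network outage never loses work: items simply stay in the local cache until the next successful sniff.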
TAG also included a marker training section on the website where the research team could
upload exemplars, mark schemes and other materials to help the assessors mark the
portfolios effectively. There was also a facility for the markers to upload their marksheets
once they were completed.
This trial illustrated what a challenge it is to get boxes 1/2/3 and 4 completed properly.
Boxes 1-3 are where learners initiate their ideas and then swap them around the group – for
team-supportive comments and drawings. Box 4 then allows them to review all these comments
and ideas before moving on. In short, the whole potential of e-scape is put seriously
to the test in the first 20 minutes of the activity. After that it eases somewhat.
The first trial demonstrated that we needed to be far more careful about our management of
that opening 20 minutes. Re-running it all on the second morning began to show us how to do
that, and by the end of the second morning we had got to the middle of the activity. We then used
some spare time to show the learners what they had done. We projected their work back for
them and they were not only impressed and fascinated by it, but made the point that it really
would have helped to see (at their initial training session) this ‘big picture’ of how it is all
supposed to work.
As the activity went right through this time, we were dealing with far more of the
photography and sound files – both of which caused great interest with the learners. Whilst
recording the sound files initially caused some embarrassment, they soon learned the
process and it rapidly became just a normal part of the activity.
Since we were developing and refining these protocols during the second trial, it is not surprising
that we still had some ‘lost’ data from individuals in the Camborne trial. Nonetheless we
got to the end of the activity and most of the learners uploaded all their data.
In the end, we tried to do the whole familiarisation session in 1 hr, and concluded that it
really was not possible to cover everything in that time. The trials showed that ninety
minutes (often a double period or an afternoon) proved adequate, and we accordingly
contacted all the main e-scape schools to find out the exact timing of sessions – both for
training and for the activity – to make sure that we would get the necessary time.
The Saltash trial went reasonably smoothly. We put in place all the protocols, and by now
we were becoming more comfortable with the set-up and management of the system.
Throughout the two mornings of the activity, very little data went missing, and the system
worked well.
Overall, the trials did what they were designed to do. They enabled us to tweak the
application and taught us how to manage it in the classroom / studio / workshop setting.
Throughout the process we were really impressed by the schools, the teachers and the
learners – without whom we would not have been able to take this important step forward.
Despite this change of position, we still felt it well worthwhile to train the teachers in advance
and accordingly we scheduled two one-day training sessions, one in Birmingham on
Tuesday 23 May and the other in Newcastle on Wednesday 24 May. The locations, schools,
and attendees for these training sessions are shown below.
Tues 23rd May The Bond Company, Birmingham
Nancledra C.P. School Pauline Hannigan
Alexandra Park School Ross McGill
Redruth School Mike Laramy
Meole Brace School Stephen Cox
Ashfield School Joanna Hayes & Dave Rogers
Edensor Technology College Nick Bradbury
TERU Goldsmiths Kay Stables
Edexcel Awarding Body Dale Hinch
Wed 24th May Blackfriars Cafe Bar, Newcastle upon Tyne
Duchess’s High School Craig Watson & Diane Murphy
Dixons Academy Maria Eccles
Coquet High School David Coils
Hirst High School Bob Miller & Mark Raper
One teacher (Fiona Mather from Bedlington High School, Northumberland) was unable to
make either of those dates due to other school examinations, so we arranged a private
training session in her school on the day after the Newcastle training.
At the end of the training sessions, the teachers reported that they understood the system
and were confident that they could set up the facilities appropriately in their schools.
The training sessions were run in the week before half term in the summer term, and the e-
scape pilot was planned to start immediately after that half term. To run the national pilot we
had acquired four sets of kit:
• four class sets of PDAs (24 per set - for 7 groups of 3 learners with 3 spare)
• four administration laptops running the local network
• four wireless routers
This enabled us to run the e-scape activity in four sites simultaneously, provided that we
also had four administrators. In addition to Tony Wheeler and Richard Kimbell, four
colleagues who had been involved in earlier stages of the project were trained to help
facilitate the national pilot: Soo Miller, Tristram Shepard, Kay Stables and Ruth Wright.
Eleven schools took part in the pilot, and since we could run four at a time, they were
clustered as follows, with the activity in each school run by the TERU team member shown
in brackets.
Round 1 (June 5th-9th)
Dixons City Academy – Maria Eccles (Tony)
Nancledra CP School – Pauline Hannigan (Soo)
Edensor Technology College – Nick Bradbury (Tristram)
Redruth School – Mike Laramy (Richard)
Round 2 (June 19th-23rd)
Bedlington Community High School – Fiona Mather (Tristram)
Hirst High Technology College – Bob Miller (Ruth)
Coquet High School – Steve Thompson (Richard)
Duchess’s High School – Craig Watson (Tony)
Round 3 (July 10th-14th)
Alexandra Park School – Ross McGill (Kay)
Meole Brace School – Stephen Cox (Tony)
Ashfield School – Joanne Hayes (Richard)
The pilot week in any school had the following broad framework. We undertook a training
session with the learner group to familiarise them with the e-scape application and the PDA
and specifically with the sketching, text entry, voice recording and camera functions that are
important in the e-scape activity. This training typically took a double period (approx 1.5
hrs) on the afternoon preceding the first day of the activity. Thereafter the activity itself ran for
two consecutive mornings providing a total of approx 6 hours of activity. The activity was run
by one of the research team. The lead teacher in the school was present throughout the
activity to observe and assist as necessary.
Having completed the learner training / familiarisation, the administrator had to strip out from
the system all the learners’ work completed during the training and prepare it for the activity
the following morning. It is important to recognise that preparing the hardware involved a
serious level of personalisation. The laptop was programmed with the class names and
details of the upcoming centre, and each of the PDAs was allocated to one of the individual
learners. So when the learner switched on the PDA for the first time it would immediately
come live with the interface ‘please confirm that you are Sam Walker of group 2’ (there were
7 groups of 3 learners in each class). Since the PDAs each had sticky labels with the
learners’ names, there was rarely any difficulty with this arrangement.
Where difficulties did arise it was typically because of absence. If Sam Walker was absent,
then one of the three reserves took her place. This involved re-assigning one of these
replacements into group 2 in place of Sam Walker and updating the ‘register’ in the
administration laptop. This happened in several schools, sometimes with more than one
learner. We sought to create the final working group at the training session on the afternoon
preceding the first morning of the activity. It was rare (but not unknown) for this updating to
be overtaken by another absence on the morning of the activity itself. The easiest way
to resolve this (with the time pressure of getting the activity started) was simply to ask the
new substitute to adopt the name programmed into the system. In addition to setting up the
system, the studio/workshop space had to be prepared with the handling collections and the
general resources of the activity.
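The register updating described above, retiring an absent learner’s PDA and promoting a reserve into the vacated group, can be sketched as follows. The register structure and names here are illustrative assumptions, not the actual admin-laptop implementation:

```python
# Hypothetical register mapping PDA id -> (learner name, group number);
# group 0 stands for the reserve pool of three spare learners.
register = {
    "pda-01": ("Sam Walker", 2),
    "pda-22": ("Reserve A", 0),
}

def reassign(register: dict, absent_pda: str, reserve_pda: str) -> None:
    """Retire the absent learner's PDA and move the reserve learner
    into the vacated group, mirroring the update on the admin laptop."""
    _name, group = register.pop(absent_pda)      # drop the absentee
    reserve_name, _pool = register[reserve_pda]
    register[reserve_pda] = (reserve_name, group)

# Usage: Sam Walker is absent, so Reserve A joins group 2.
reassign(register, "pda-01", "pda-22")
print(register["pda-22"])   # prints: ('Reserve A', 2)
```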
The learners also had their PDA, their booklet (see App 2.1), a set of fine-liner pens for use
with the booklet, the user profile cards, and a set of wooden ‘pills’ (twice full size).
Elsewhere in the room was a table or work surface set aside for the additional handling
collections.
The final element of the room set-up was the table or work-surface set aside for the
modelling resources (see App 3.1). Following the explorations described above we made
sure that this contained sheet, strip, fluid, and textile materials as well as a collection of
ways to cut, form and fix them together. There was also an assortment of plastic balls,
springs, elastic bands and other odds and ends of gadgetry.
(http://www.teachers.tv/video/3306)
At the end of the first morning and then again towards the end of the activity as a whole, we
asked team-mates to be part of the review process – advising each other on the strengths
and weaknesses of their work as seen through the eyes of their team-mates.
The opening part of the questionnaire asks learners about their previous experience of using
PDAs, and 84% said that they had little or no previous experience of using them. When
asked how much more time would be needed to get used to working with them, 82%
reported that they would need little or no more time. These data confirm the informal
impression created in the activity that learners very rapidly got to grips with the device and
its associated software. Whilst teachers were somewhat more nervous of them, the learners
adopted them as merely an extension of the mobile phones that they all use ubiquitously.
This is confirmed by learners’ response to the statement ‘it was easy to learn to use the
PDA’. By a massive majority of 127:1 (99%) learners agreed or strongly agreed with the
statement. Equally their response to the statement ‘it was fun using the PDA’ showed a 50:1
(98%) majority agreeing or strongly agreeing.
The second part of the questionnaire asks about particular features of the PDA and learners’
reactions to them as part of the activity.
96% agreed or strongly agreed that it was easy to follow the activity through the e-scape
interface.
94% agreed or strongly agreed that it was good for making the photo story lines.
92% agreed or strongly agreed that it was a good tool for designing.
90% agreed or strongly agreed that the text tool was good for explaining ideas.
89% agreed or strongly agreed that they were able to show what they could do.
Within all of these sections of the data there is no significant gender variance.
The only significant gender effect is observable in response to the statement ‘the voice
memos were good for explaining my ideas’. 50 boys but only 24 girls strongly agreed with
this statement, whilst one boy but 14 girls strongly disagreed. In response to the voice-
memos therefore, at the extremes of the data there is a clear effect that suggests girls are
less likely to appreciate it. The less extreme data (agree or disagree with the statement) is
gender balanced, and overall 70% of learners agreed or strongly agreed that the voice
memos were good for explaining their ideas. We believe that the identifiable gender effect at
the extremes is related to the embarrassment / discomfort that some learners felt in talking
about their work in the public arena of the activity and the working groups.
The final part of the questionnaire is a free-response section inviting learners to tell us the
three best things (thumbs up) and the three worst things (thumbs down) about working with
the PDA. Their responses have been analysed into categories of comment, and we show
below the top four categories in each case. Naturally – since these are free-response
comments – the percentage mentioning any particular issue is smaller than those from the
fixed elements of the questionnaire discussed above.
Thumbs up
category mentioned by..
quick / easy to use 48 i.e. 19% of learners
photos / camera 41 i.e. 16% of learners
fun / different 38 i.e. 15% of learners
voice recording 20 i.e. 8% of learners (16 boys / 4 girls)
Thumbs down
category mentioned by..
slow / crashed / it went to sleep 37 i.e. 15% of learners
sketching awkward on screen 31 i.e. 12% of learners
transcribing / stylus too small 31 i.e. 12% of learners
voice recording 27 i.e. 11% of learners (6 boys / 21 girls)
It is interesting to note that the top category of ‘thumbs-down’ comments is purely technical;
clear evidence of technical failure with the PDA during the activity. And we should note that
the technical difficulties giving rise to these comments are rapidly being ironed out through
the emergence of more stable platforms and devices. By contrast two of the top three
‘thumbs-up’ categories are about the emotional response of learners to operating on the
PDA – it’s fun / easy / different. It will be even easier, quicker and more fun as the stability of
the devices improves.
• teacher feedback
Comments were sought from teachers on several issues and the following comments were
typical of those returned to us.
• I was particularly impressed with how they used the voice recordings and took them so
seriously. I feel this has tremendous potential for capturing their thoughts and feelings about
their work as they are doing it. (Nan)
• They found the novelty and ease of use of the PDAs a positive motivator.
One of the teachers had written a brief report to his head-teacher, describing the event and
the reaction of his learners. With his permission we re-produce it here to indicate the general
response of the group and of the department.
Learners took part in four three-hour sessions over two weeks. They were given two specific design
tasks (innovation challenges) that required them to develop their ideas and produce a three-
dimensional model of their solution. For the first challenge all recording of ideas was through the
hand held PC, sketching ideas or making notes directly onto the screen using a stylus, making voice
recordings to evaluate their ideas and annotating photographs they took of their models. All
information from the PCs was automatically stored on a central PC for assessment and evaluative
purposes. The same learners then took part in a second challenge, this time recording ideas in a
workbook. The results of the two approaches will be compared in the final analysis of the results.
o the use of handling collections of three dimensional artefacts that the learners can study and derive
ideas from.
o the use of modelling as the main strategy for developing ideas.
o the use of “post-it” notes to conduct a review of work at the halfway stage, using some of the ideas
and principles of six hats thinking.
o the regular incorporation of photographs of learners’ work as it progresses.
o the use of partners for peer assessment and in sharing and contributing to each other’s ideas.
2.8 The KS2 trial
In e-scape phase 1 we took the opportunity of trialling the ‘light fantastic’ activity in several
schools at Key Stage 2. The ease with which young learners adapted to the use of the
technology encouraged us to include at least one school in the national pilot for phase 2 and
we report the results here. The school is Nancledra Primary School in Cornwall.
The children with whom we worked were in Years 4 and 5 and the brief was complex for
such young children. Having discussed it with them, however, they seemed immediately to
understand it and began to address what they could manage. Inevitably some adjustment
was made in the language used within the classroom context and to the timings within the
sequence of the activity.
playback and delete voice recordings without any adult assistance. These discoveries were
soon shared with others.
Voice memo
The children were apparently very comfortable with the multi-
functionality of the PDA with photographic, voice, writing and
sketching capability. But it was most noticeable that they
approached the voice recording activities with particular
attention. Whilst they were not at all self-conscious about talking
into their PDA, we could detect a difference in capability with these
young learners. Whilst older learners responded by articulating
sophisticated ideas such as ‘easy to manufacture, stylish and
attractive, target audience’ etc., younger learners tended to focus on their own experience of
the struggle to create their working prototype: ‘very nearly finished - everything is working
perfectly – I really want to test it out’, ‘it’s looking good’.
The voice memo particularly enhanced the learning experience within the realms of
appraising ideas. Learners appeared more in control of their reflections through this review
process since they did not need to write anything. They could articulate the thoughts in their
head and talk to the PDA as a surrogate teacher, explaining what they were doing, and
perhaps why it hadn’t gone according to plan.
‘I don’t know about the spring, it might not be long enough - I don’t have the mechanism I
wanted, it was way too complicated.’ ‘I need to fix up the teeth but I don’t know whether they
will work because they’re not even.’ ‘I somehow need to put the pills in but it is a bit of a
problem, because as you put your wrist in they all fall out.’
Screen drawing
The sketching on the PDA tended (for most learners) to flatten the difference in technical
quality of drawing/sketching between the younger and older learners. And the ideas
expressed are impressively comparable across the age ranges.
The relentless pace of the activity probably exacerbated this difficulty, as did the unreliability
of the ‘transcriber’ tool when used in the context of emergent writers. The transcriber did not
reliably convert handwriting that was not well formed.
Generating Ideas
The influence of the ‘handling collections’ was
evident in the generation of ideas, which ranged
across a mass of product types: pens, slides,
necklaces, bracelets, key rings and headphones.
And whilst young learners’ proposals for the
not about the PDA or the digital system more widely. Rather it was about the concept of
design teaching and learning.
Collaboration
These young learners worked well both independently and particularly collaboratively;
continually offering ideas and assistance to each other. In addition to the in-built
collaboration of box 1/2/3 and the later team reflections, the depth of thought reflected in
their spontaneous discussion showed a real awareness of the subtleties of the task:
‘What happens if she has a heart condition?’
‘How will she know when to take the tablet?’
Development styles
Just as with the year 10 groups, many styles of development were evident. For some, ideas
popped up but appeared to go nowhere, for others good ideas emerged and grew into part
solutions, some followed their ideas tenaciously, others saw complexities and made trade-offs,
and some abandoned their risky ideas totally because they could not manage the
technicality of making them work. Most of the group developed one idea and stuck with it
throughout, only a few offering many alternative possibilities. This was evident in previous
KS2 trials of the paper-based activity (light fantastic), suggesting that the new technology
did not impact fundamentally on their ways of working.
Overall evaluation
Despite the intensive 6 hours the activity spanned, it proved completely ‘doable’
with KS2 learners. They were quite comfortable – and accomplished - when designing,
sharing and discussing, modelling, collaborating and reflecting throughout the activity. The
teacher – along with the head and other staff in the school - was equally enthusiastic about
the potential of the PDA to enrich design learning & teaching. And it is perhaps appropriate
to leave the final comments on the Nancledra year 5 experience to her and her class.
“They liked the task and were very excited by the PDAs”
“I was very pleased how the children stayed on task even though they must have been
flagging by the end. I think this was due to how the task was structured as well as the
eagerness to do all things digital.”
“I was amazed how quickly the children grasped the technology and were in no way
overawed by it.”
“I was particularly impressed with how they used the voice recordings and took them so
seriously. I feel this has tremendous potential for capturing their thoughts and feelings about
their work as they are doing it.”
2.9 Teachers TV
Through QCA, we were contacted by a Teachers TV company (Evans-Woolfe) with whom
Richard Kimbell had worked on previous projects. They had heard of the trials and were
interested to film them as part of a design & technology series that they had been
commissioned to produce. They agreed to our conditions concerning the filming and we
agreed to allow them to film the second Cornwall trial in Saltash.net Community School. The
school were enthusiastic about this and all appropriate protocols were followed.
The outcome has been two Teachers TV films (15 minutes each); one focused on the
e-scape activity over 2 days (‘The future’s hand-held’) and the other based on a series of
discussions about how the activity worked and what its significance might be for the future
(‘The issues’). This second film includes reactions and discussions with the learners, the
teacher, another teacher from a different department, and the TERU team. Both of these
films were broadcast for the first time on Monday 8th Jan 2007, the first at 5pm immediately
followed by the second at 5.15pm.
Having watched the two films, we were aware of the vast quantity of film that had been
captured in the 2 days in Saltash but that Guiver had not been able to use in the relatively
short TV programmes. Accordingly we commissioned a further programme from Evans
Woolfe; a research-based documentary in which we describe and illustrate (through the
extensive film shot in Saltash) the structure of the activity and the rationale for it. This
programme runs to approx 90 minutes and is built around 8 ‘chapters’ which vary in length,
the shortest being approx 6 minutes and the longest approx 12 minutes. We are delighted
with this DVD – since each of the chapters captures a subtle combination of rationale and
exemplification. We include a copy of the DVD with this report.
As we were completing the additional filming for this documentary, we were contacted by
another production company, Real Life Media Productions Ltd, who wanted to film e-scape
for a quite different Teachers TV programme. The focus of the programme was not on
design & technology but on e-assessment. RLMP gained permission from Evans Woolfe to
use enough of their film to portray the classroom/studio activity in Saltash and then we
filmed more in the TERU offices – essentially debating issues of e-assessment.
The two Evans-Woolfe films were first screened on 8th Jan 2007:
• ‘the future’s hand-held’ at 5pm
http://www.teachers.tv/video/3306
• ‘new technology; the issues’ at 5.15pm
http://www.teachers.tv/video/3307
The Real-Life-Media film was first broadcast the following day, at 12.30pm on 9th Jan 2007:
• e-assessment – where next? (approx 30 minutes).
http://www.teachers.tv/video/5431
Once broadcast, the films are available for viewing and downloading through the Teachers
TV website:
http://www.teachers.tv/subject
Whilst the filming of these programmes has represented additional unplanned work for the
team, we believe that it has been well worthwhile. First of course the programmes provide
excellent dissemination to schools about the approach we have been developing and some
of its potential and consequences in the future. But second, we have been provided with
free access to a professional film crew who could record the whole e-scape process in
detail. This has been a fantastically valuable resource, and the Evans Woolfe documentary
in particular stands as a terrific reference product for the project.
2.10 The paper test
During e-scape week 1, we took the paper-test resources with us, and after completing the
e-scape activity we left them the paper resources to run that activity at some point of their
choosing within the following two weeks. We then re-visited all the e-scape wk 1 schools,
collected all the work and transported all the resources to the e-scape wk 3 schools. They
then had two weeks to complete the paper activity before we turned up to run the digital e-
scape activity.
This plan eliminated the Northumberland/Newcastle schools from the paper test. And we
recognise that there may be a difference between those schools that did the e-scape activity
first and those that did it second. It has been interesting to see the extent to which the
experience of the paper activity (before or after the digital one) affected the performance of
those involved in both tests.
The paper-based test activity is the ‘light fantastic’ activity developed for Assessing Design
Innovation. (See App 2.4 and 2.5) It has been fully reported in the research report of that
project and little more need be said here. The form of the e-scape activity was evolved
directly from this Light Fantastic structure with differences only where the new technology
provided some new and important opportunities (e.g. with the voice memos). Otherwise, the
timings, structures, and resources for the activity are the same.
We therefore have a total of 12 schools for assessment at age 15 and a total of 249
portfolios. Considering that the maximum possible number of portfolios was 252 (21 learners
in each of 12 schools), we consider that being only 3 short (for absence and other technical
reasons) is a substantial achievement and we are immensely grateful for all the efforts of the
schools, our administrator colleagues, Handheld Learning and TAG Learning. All the learner
portfolios are in the website at the following address. http://212.100.251.115/e-scape/
Because of the nature of the data on the site – with real names of
schools and learners and including some photos and their voice
files - access to the site is restricted to the research team, Ian
Williams at QCA and research staff at AQA and Edexcel.
Double-clicking on the school will open the class list for that school,
and double-clicking on a learner name on that list will open their portfolio.
The portfolio is structured through the 23 sub-tasks of the 6-hour activity, with response
modes of various kinds (drawing, writing, photographing and speaking) and with both
individual and team-based purposes. Like the paper-portfolios that were the precursor to e-
scape, these web-screens provide a very real picture of the learners’ evolving prototype and
their thoughts about it along the way.
This snapshot of box 8 illustrates the richness of these data. The three
photographs show the drawing up to that moment and two photos of the model -
from different angles. Clicking on the magnifying glass brings the images to full-
screen size. The two sound files are the authentic recorded voice of the learner
responding to two questions – ‘what is working well?’ and ‘what needs further
development?’ – and together these provide a real insight into their understanding
of their work. It is important to note that this ‘photo and sound file’ routine recurs
throughout the activity – essentially once an hour for the 6 hours. At least three
significant things result from this. First, many of them get better – more articulate –
in describing their work and the circumstances surrounding it. Second, the routine
– taken together – leaves a real-time visual/audio evidence trail that is unique in the
assessment of performance in design & technology. Third, learners’ approach to the task is
enriched as they are more motivated, braver (take more risks), and think more deeply about
what they are doing.
Finally, the review comment (below the sound files) is a reflection by the learner made at the
very end of the 6 hours of activity. Looking back over their whole work, we invite them to
think about what they might have done differently if they had known then what they know
now. Sometimes these meta-cognitive responses are descriptive – as in this case – and
sometimes they are deeply analytic of their own performance.
We are not aware of any equivalent system of real-time, dynamic, e-portfolio assessment for
any subject in any country. We believe this to be a world first. The 249 rich portfolios that
inhabit the website have now become the focus of our work in the project. And the 1st
challenge was to develop an approach for the assessment process.
However, during some early discussions of our web-based portfolios, our attention was
drawn to the work of Alistair Pollitt – formerly the research director for the University of
Cambridge Local Examinations Syndicate. He had been advocating an alternative form of
assessment that was particularly appropriate for performance disciplines because it is based
on judges making whole judgments about pieces of work rather than allocating marks to
criteria and then adding up the result.
We have long argued for the pre-eminence of holistic assessment in design & technology
and in 1991 had demonstrated through APU data that such holistic assessment could be
very reliable with suitably trained assessors. It was for this reason that the holistic judgment
was the 1st judgment to be made with the form discussed above. The resulting assessment
was not to be seen as the arithmetic sum of the 4 part-judgments. Rather the four part-
judgments were used to illuminate the nature of the performance that had already been
judged holistically.
Pollitt’s approach is based on a theory of judgment derived initially from Thurstone (1927)
and based on what we might call ‘differentiated pairs’. The assessment process involves
making a single judgment about which of two portfolios is the better, and – if enough
judgments of this kind are made (and by enough judges) – a rank order of performance can
result. After a series of discussions with QCA, Edexcel, AQA, Pollitt and the TERU team, we
decided on a trial of the system using our existing archive of paper-based performance from
the ‘Assessing Design Innovation’ archive.
The case for the approach – and the outcome of that assessment trial – have been written
up by Pollitt, and we include here his report to us.
But while scoring and aggregation seems to suit those examinations composed of many small questions, examiners
in some other subjects, where they want to assess created objects or performances, have often used the method
only reluctantly. In the assessment of ‘writing’, for example, the debate between advocates of analytic and holistic
approaches has never been resolved; it appears that examiners of Art at GCSE and A Level apply the method
‘backwards’, assessing the overall result first and then applying ‘suitable’ numbers as marks. An examination that
assesses design ability belongs to this group where marking may be inappropriate.
Given the Fundamental Requirement, it is not obviously necessary that exams should be marked. The requirement
is to find some way to judge the learners’ performances in order to create the scale that is needed, and marking
items to add up their scores is just the way we seem to have chosen to do this. An alternative method does exist, in
which the examiners are asked to make holistic judgments of the quality of learners’ work.
In the words of a recent book on the psychology of judgment, “There is no absolute judgment. All judgments are
comparisons of one thing with another” (Laming, 2004). In other words, all judgments are relative. Since 1995 almost
all (non-statistical) studies of examination comparability in England & Wales have used a method of relative
judgment (Pollitt, 1994), in which examiners are asked to compare pairs of ‘scripts’ from different exam syllabuses,
simply reporting which is the ‘better’ of the two.
Whether it is used for studying comparability or simply for grading a single examination, the method is based on the
psychophysical research of Louis Thurstone, and specifically on his Law of Comparative Judgment (Thurstone,
1927). The essential principle in this law is that, whenever a judge compares two performances (using their own
personal ‘standard’ or internalised criteria) the judge’s personal standard cancels out. The greater the true difference
between the quality of the two performances the more likely it is that the better one will win each time they are
compared. Thus a large set of comparisons does more than just generate a rank order; the relative frequency of
success of one performance against another also indicates how far apart they are in quality.
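Thurstone's model can be rendered in a few lines. The following is our own illustrative sketch of its simplest ('Case V') normal form, not code from the project or from Pollitt's program: the probability that the genuinely better performance wins a comparison grows with the true quality difference.

```python
# An illustrative rendering of Thurstone's Law of Comparative Judgment
# in its simplest ('Case V') normal form; this is our sketch, not code
# from the project or from Pollitt's program.
from math import erf, sqrt

def p_win(quality_a: float, quality_b: float) -> float:
    """Probability that A is judged better than B under the normal model."""
    difference = quality_a - quality_b
    return 0.5 * (1 + erf(difference / sqrt(2)))   # standard normal CDF

# A large true difference makes the outcome nearly certain...
print(round(p_win(3.0, 0.0), 3))   # 0.999
# ...while similar performances are close to a coin toss.
print(round(p_win(0.2, 0.0), 3))   # 0.579
```

This is why the relative frequency of success over many comparisons indicates not just the rank order but how far apart two performances are.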
Statistical analysis of a matrix of comparative judgments of ‘scripts’ can construct a measurement scale expressing
the relative value of the performances. The result of comparisons of this kind is objective relative measurement, on a
scale with a constant unit. Furthermore, if a few scripts that have already been agreed to represent grade
boundaries – perhaps from a previous sitting of the examination – are included in the comparisons, the whole
process of marking, grading and comparability of standards can be replaced by the collection and analysis of paired
comparative judgments.
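As an illustration of how a matrix of paired outcomes yields a measurement scale, here is a minimal Bradley-Terry style iterative fit of our own devising. The function name, the toy data, and the fitting method are all our assumptions; the actual program used in the trial is Pollitt's and will differ in detail.

```python
# A minimal sketch of turning a matrix of paired-comparison outcomes
# into a measurement scale. This is an illustrative Bradley-Terry
# style fit, NOT the program Pollitt used for the analysis.
import math

def fit_scale(wins, n_items, iters=200):
    """wins[i][j] = number of times portfolio i beat portfolio j."""
    strength = [1.0] * n_items
    for _ in range(iters):
        for i in range(n_items):
            w = sum(wins[i][j] for j in range(n_items) if j != i)
            d = sum((wins[i][j] + wins[j][i]) / (strength[i] + strength[j])
                    for j in range(n_items)
                    if j != i and wins[i][j] + wins[j][i] > 0)
            if w > 0 and d > 0:
                strength[i] = w / d
    # report on a log scale centred on 0, like the parameter listings below
    logs = [math.log(s) for s in strength]
    mean = sum(logs) / n_items
    return [v - mean for v in logs]

# Toy example: item 2 usually beats item 1, which usually beats item 0.
wins = [[0, 1, 0],
        [4, 0, 1],
        [5, 4, 0]]
values = fit_scale(wins, 3)
print(values[0] < values[1] < values[2])   # True: rank order recovered
```

The point of the sketch is simply that relative win frequencies, corrected for whom each item was compared against, determine positions on a scale with a constant unit.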
| 3| -08- -17-
| |
| |
| 2| -03- -15-
| | -14- -20-
| | -02- -09-
| 1| -16-
| | -07- -11-
| | -04- -05-
| 0| -19-
| |
| | -10-
|-1|
| | -13-
| |
|-2|
| |
| |
|-3|
| |
| | -06-
|-4| -01-
| | -12-
| | -18-
|-5|
The scale runs from a low of –5 to a high of +3; the average of the 20 scripts’ parameters is 0.00. To interpret this
scale it needs to be “anchored”, as mentioned above, with a few scripts that have already been graded. The scale
reliability was estimated to be 0.92, at least as high as would be expected in a GCSE marking study.
To confirm the results of the analysis the scripts’ parameters were plotted against the marks previously assigned to
them.
[Scatter plot: the scripts’ parameters (vertical axis, −5 to 1) plotted against their marks (horizontal axis, 0 to 12).]
As expected, there is a strong but non-linear relationship between the parameters and the marks. (The relationship
is expected to be non-linear because the mark scale is bounded, with a minimum of 0 and a maximum of 12, while
the parameter scale runs from −∞ to +∞.) The value of R² was 0.81, corresponding to a correlation of 0.90 between
two linear variables, as high as could be expected in a case like this.
We should make it clear that the trial outlined above led to the creation of data sets that
were separately analysed by Alistair Pollitt (using his system) and Malcolm Hayes (using the
Edexcel system). Reassuringly, both sets of analysis produced the same results. The
outcome of this trial was very interesting. Not only was the resulting rank-order virtually
identical to that which we had derived from the conventional assessment process, but
moreover the six ‘judges’ who had been involved all felt that the holistic comparative
judgment process was both easier and intuitively more appropriate than trying to allocate
numbers to parts of the portfolios and then summing these numbers.
Accordingly, following the success of the trial, and after further debate, we agreed that the
approach to assessment with the e-scape portfolios should be by using Pollitt-style
comparative pairs judgment, and we then set about designing the system. It is important to
recognize that the process has never been used before for ‘front-line’ assessment. Its use
has been restricted (for manageability reasons) to inter-board research studies of
comparability, and such studies are based on only a handful of scripts.
Pollitt proposed 16 as the basic number of comparisons that need to be made for any script;
i.e. each portfolio is compared with 16 other portfolios. Moreover some of the judgments
would need to be repeated by different judges. After a good deal of debate we settled on a
3-cycle approach using the 7 judges who had agreed to take part in the process.
• Round 1 would involve each judge in making 140 comparisons. The outcome of the
resulting 980 or so pairs (i.e. involving 1,960 viewings of scripts) would be an approximate
rank-order based on about 8 comparisons per script. Some of these judgments would be
very easy as some of the comparisons would be excellent work compared with poor work.
• Round 2 pairings (another 140 each) would be drawn up using the approximate rank-order
from round 1. In this round we would no longer encounter those big differences, but rather
the pairs selection would be focused on closer judgments to refine the rank-order.
• Finally round 3 would be used to target notional grade boundaries. We decided that we
should model the Awarding Body awarding process, using round 3 to firm up the data at the
boundaries between notional grades.
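The round structure above can be sketched as follows. This is our own simplified illustration; the function names and the neighbour-window heuristic for round 2 are assumptions, not the project's actual pair-generation software.

```python
# Illustrative sketch of the pairing logic for rounds 1 and 2.
# Our own simplification, not the software used in the pilot.
import random

def round1_pairs(n_portfolios, comparisons_each=8, seed=0):
    """Random round-1 pairings: each portfolio enters ~comparisons_each pairs."""
    rng = random.Random(seed)
    pool = list(range(n_portfolios)) * comparisons_each
    rng.shuffle(pool)
    pairs = []
    while len(pool) >= 2:
        a, b = pool.pop(), pool.pop()
        if a != b:                 # discard self-pairs (a slight undercount)
            pairs.append((a, b))
    return pairs

def round2_pairs(rank_order, window=3):
    """Closer round-2 pairings: each portfolio against its near neighbours."""
    pairs = []
    for i, item in enumerate(rank_order):
        for j in range(i + 1, min(i + 1 + window, len(rank_order))):
            pairs.append((item, rank_order[j]))
    return pairs
```

With 249 portfolios and about 8 comparisons each, `round1_pairs` yields roughly 249 × 8 / 2 ≈ 996 pairs, the same order of magnitude as the 980 or so round-1 judgments described above.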
To make the pairs comparisons somewhat more manageable for the judges, a system of
‘chained-pairing’ was adopted. So, for example, judge 1 might be asked to undertake the
following pairings:
25 : 210
210 : 77
77 : 125
125 : 48
Having got into the work of the 1st pair of scripts, and having made a judgment about which
is better, I put away No 25 and just open No 77 to compare with No 210, with which I am
already familiar.
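In code terms, a chained worklist is just consecutive overlapping pairs drawn from one sequence; a minimal sketch (the function name is ours):

```python
# A chained worklist is consecutive overlapping pairs from one sequence:
# each judgment re-uses one portfolio from the previous pair, so only
# one fresh portfolio has to be opened each time.
def chain_to_pairs(chain):
    return list(zip(chain, chain[1:]))

# The example worklist from the text:
print(chain_to_pairs([25, 210, 77, 125, 48]))
# [(25, 210), (210, 77), (77, 125), (125, 48)]
```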
The judging system was developed through August and early September 2006 and we ran a
training session in TERU (8th September) for all the judges to become familiar with the
website and how to navigate around the work within it. Each judge was issued unique
access codes for the website, and during the training we examined several pairs and shared
our thoughts both about the work and about the process of arriving at a decision.
The first round of judging was undertaken in the last two weeks of September. Pollitt
undertook the resulting analysis and returned new pairings for the team. Round two then
took place in the first two weeks of October. At this point, having analysed the resulting data,
Pollitt suggested an alternative strategy for round 3. Instead of doing more pairs, he asked
whether we would be able to examine (say) 6 pieces of work and place them in a rank order.
We believed that this would be possible and accordingly round 3 involved (for each judge) a
few more pairs and then two sets of 6 to be ranked. Each ranking of 6 pieces provides an
amount of data equivalent to 15 paired comparisons, and we were confident that each set of
6 would take far less time than 15 separate pairs would.
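The equivalence comes from simple combinatorics: a full rank order of 6 pieces settles all C(6, 2) = 15 pairwise comparisons at once. A one-line illustration (the function name is ours):

```python
# A full rank order of n pieces settles every one of the C(n, 2)
# pairwise comparisons at once, which is why ranking 6 pieces is worth
# 15 paired judgments (6 * 5 / 2 = 15).
from itertools import combinations

def implied_pairs(ranked):
    """Every (better, worse) pair implied by a rank order, best first."""
    return list(combinations(ranked, 2))

print(len(implied_pairs(["A", "B", "C", "D", "E", "F"])))   # 15
```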
Paired-comparison scale creation for e-scape: a report from Alistair Pollitt: 8th November 2006
------------------------------------------------------------------
Object | Wins Losses Comparisons %
------------------------------------------------------------------
1 | 1 10 12 22 45.5
2 | 2 10 26 36 27.8
3 | 3 10 6 16 62.5
4 | 4 13 7 20 65.0
5 | 5 6 10 16 37.5
6 | 6 12 4 16 75.0
...
247 | 247 9 7 16 56.2
248 | 248 12 6 18 66.7
249 | 249 18 2 20 90.0
------------------------------------------------------------------
Total number of comparisons = 2322
These percentage scores are the starting values for estimating appropriate values for each portfolio: because some of them were
mostly compared to ‘better’ portfolios and others to ‘poorer’ we cannot use the percentages directly. After correcting for these
differences in comparators, the program reports a list of values:
1 | -0.716 0.484 | 1
2 | -5.543 0.398 | 2
3 | -0.026 0.717 | 3
4 | 0.987 0.499 | 4
5 | -2.953 0.706 | 5
6 | 2.838 0.664 | 6
..
247 | -0.335 0.586 | 247
248 | 0.094 0.569 | 248
249 | 4.788 0.776 | 249
At this stage the average value for all portfolios is, by definition, 0.00. After a simple rescaling, as described above, the report is:
1 | 3.02 0.35 | 1
2 | -0.42 0.62 | 2
3 | 3.52 0.51 | 3
4 | 4.24 0.36 | 4
5 | 1.43 0.50 | 5
6 | 5.56 0.47 | 6
...
247 | 3.30 0.42 | 247
248 | 3.60 0.41 | 248
249 | 6.95 0.55 | 249
These are the final values – equivalent to the final marks – for each portfolio. The first digit indicates the grade (although anything
less than 2 is considered Grade 1, and anything greater than 5 is a Grade 5). Note that portfolio 2 was judged to be extremely poor,
and portfolio 249 extremely good.
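The 'simple rescaling' appears to be a linear map from the zero-centred parameter scale onto the grade scale. The slope and intercept below are our own back-calculation from the two listings above (they reproduce the quoted values to within rounding); they are not figures stated in the report.

```python
# A sketch of the 'simple rescaling' step: a linear map from the
# zero-centred parameter scale onto the grade scale. The slope and
# intercept are back-calculated by us from the listings above;
# they are NOT figures quoted in the report.
def rescale(value, slope=0.7165, intercept=3.533):
    return slope * value + intercept

def grade(final_value):
    """First digit gives the grade, clamped to the 1-5 range used here."""
    return min(5, max(1, int(final_value)))

# Portfolio 2: raw -5.543 maps to about -0.44 (listed as -0.42), Grade 1.
# Portfolio 249: raw 4.788 maps to about 6.96 (listed as 6.95), Grade 5.
print(round(rescale(-5.543), 2), grade(rescale(-5.543)))
print(round(rescale(4.788), 2), grade(rescale(4.788)))
```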
In addition to Value or Grade, these reports give a ‘Standard Error’ for each portfolio. Traditional marking fails to report these,
although assessment specialists are all aware that the assessment process should acknowledge the amount of uncertainty in any exam
result. It’s hard to generalise, but the standard errors for GCSEs are probably between 0.5 and 1.0 grades for most subjects. For this
analysis the average standard error is 0.46 grades, which seems acceptable: if we exclude the 30 best and 30 worst portfolios the
average standard error is 0.41 grades. Of course, there are just five grades in this exercise. However, increasing this to nine grades
would mostly involve splitting the two extreme grades – Grade 5 would give A* and A, Grade 1 would give F, G and N/U – and the
middle three grades would only be expanded to four, giving us an average standard error still only about 0.55 grades for most
candidates.
The complete set of results can best be seen in the diagram below. In it, the portfolios have been sorted into order and are shown with
their standard errors. In formal statistical terms 68% of the portfolios’ “true” values will lie within one standard error of the reported
value. (This is the basis for Paul Black’s much-quoted remark that “They give the wrong grade or level about 30 per cent of the
time.”)
Vertical lines are drawn through the grade boundaries to show how many learners would fall into each grade. Note that Grades 2-4
are, by definition, equal size, and that this leads to more learners lying in the central grade than the ones either side of it.
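The interaction between standard errors and grade boundaries can be sketched directly: a portfolio whose value lies within one standard error of a boundary is a natural target for the extra Round 3 comparisons. The boundary positions below follow the 'first digit = grade' convention described above; the function name is ours.

```python
# Sketch: flag portfolios whose value is within one standard error of a
# grade boundary as candidates for extra 'borderlining' comparisons.
# Boundary positions follow the 'first digit = grade' convention above.
BOUNDARIES = [2.0, 3.0, 4.0, 5.0]   # between grades 1|2, 2|3, 3|4, 4|5

def needs_borderlining(value, standard_error):
    return any(abs(value - b) < standard_error for b in BOUNDARIES)

print(needs_borderlining(3.02, 0.35))   # True: just above the 3.0 boundary
print(needs_borderlining(6.95, 0.55))   # False: safely inside Grade 5
```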
[Diagram: the 249 portfolio values (vertical axis, −1 to 4) sorted into rank order (horizontal axis, 0 to 250), each shown with its standard error; vertical lines mark the grade boundaries.]
The diagram also shows the effect of Round 3. This concentrated on the portfolios close to the grade boundaries at the end of Round
2. Each of them was put into 2-5 extra comparisons, in addition to the 16 (usually) they had already been in; as a result their standard
errors are smaller than average. The effect is not large, but only because Round 3 was a rather small exercise. In an operational
system more of these ‘borderline’ judgments would be made so that as many as possible of the portfolios would be assigned with
high confidence to the appropriate grade. Note that this focus on borderline portfolios is not possible in a mark-based system, unless
we go back to the procedures that used to be applied – rather unsystematically – after the award was complete. In this system
‘borderlining’ can be applied routinely, and as much or as little as is desired.
The analysis of the judgments also gives a traditional indication of the quality of the measurement process:
The key figure here is the reliability coefficient of 0.93. This figure allows for unreliability between markers as well as for lack of
internal consistency within the examination – most traditional reliability coefficients only allow for one of these. Only a few current
GCSEs are likely to be as reliable as this if we consider both sources of unreliability.
Finally, the report checks the consistency of each judge and each portfolio. The report on the judges is:
The column ‘WmnSq’ (weighted mean square) is the most important. A figure greater than the mean plus twice the S.D. – in this
case 1.05 – would indicate cause for concern, a judge who was not behaving in the same way as the others. None of these judges fails
the test, but Judge 3 should be monitored for a while in future to make sure that he does not drift further away from the others.
The corresponding report on the portfolios shows that a few of them ought to be checked (at least 2 of the 249). The criterion would be
0.85+2*0.23, or 1.31; portfolio number 247 exceeds this, suggesting that there is something about it that is unusual enough to warrant a
further look – perhaps different judges valued them in different ways.
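The flagging criterion in both checks is the same 'mean plus twice the standard deviation' rule, easily reproduced (figures as quoted in the text; the 1.35 comparison value below is illustrative, since portfolio 247's exact statistic is not given):

```python
# The 'mean plus twice the standard deviation' misfit rule used for both
# the judge check and the portfolio check. Figures as quoted in the text;
# the 1.35 value is illustrative, since portfolio 247's exact statistic
# is not given.
def misfit_threshold(mean_wmnsq, sd_wmnsq):
    return mean_wmnsq + 2 * sd_wmnsq

threshold = misfit_threshold(0.85, 0.23)   # portfolio criterion: 1.31
print(round(threshold, 2))                 # 1.31
print(1.35 > threshold)                    # True: such a portfolio is flagged
```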
There are three nuggets in this report to which we would – in particular – draw the attention
of readers, quite apart from the performance scale itself.
First, there is the reliability coefficient of 0.93. This is hardly surprising. Each piece of work
has been compared with many others, and the judgments had been made by many judges.
Any idiosyncratic judgments were soon outweighed by the consensus of the team. The
process is almost inevitably more reliable than current GCSE practices, where much of the
work is assessed by the teacher alone, or at best by the teacher and one external moderator.
Second, it is important to note the consistency of the judges. In this comparative pairs
approach, the analysis automatically produces a reading of the judging team, specifically
concerning their consensuality. The system notes how often – and by how much – my
judgments are at variance with the other judges and in the end produces a mean score for
the whole sample. If I am more than two Standard Deviations from that score, then I am a
cause for concern.
Third, the system also automatically produces data on the consensuality of judgments
applied to individual portfolios. Reference to the ‘plot of values’ (above) shows some
portfolios with much longer standard error ‘tails’ than others. These are the portfolios over
which there was a considerable amount of disagreement within the judging team. In the
process, the system automatically highlights the pieces of work that need closer attention.
“It shows that a few of them ought to be checked (at least 2 of the 249). The criterion would be
0.85+2*0.23, or 1.31; portfolio number 247 exceeds this, suggesting that there is something about it
that is unusual enough to warrant a further look – perhaps different judges valued them in different
ways.” (see Pollitt above)
These three are all automatic virtues of the comparative pairs judging process.
We were interested initially in the time that the process takes, and there was a degree of
conformity on the matter. Round one was generally agreed to be the toughest and the early
pairs (say the 1st 20 or 30) took as long as 10 minutes per pair to decide, but gradually we
got quicker. This speeding-up resulted in part from being more skilled in working our way
around the portfolio, and in part from the fact that the pairings inevitably threw up repeats.
Having got properly inside a piece of work at the 1st time of asking, it took only a much
briefer scan 2nd time around to remind us of its qualities. By the end of the 140 pairs we
were typically doing each pair in 2 minutes. On average, for the whole sample, we can
estimate 4.5 minutes per pair, amounting to approx 10 hrs for each judge. The 1st round
therefore used 70 hrs of judging time to produce a rank order for 249 portfolios. Put another
way, each portfolio took approx 17 minutes to locate into the rank order.
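The arithmetic behind those round figures, step by step (following the report's own rounding of 10.5 hours per judge to 'approx 10 hrs'):

```python
# The Round 1 timing arithmetic, reproduced step by step.
pairs_per_judge = 140
minutes_per_pair = 4.5        # whole-sample average estimated above
judges = 7
portfolios = 249

hours_per_judge = pairs_per_judge * minutes_per_pair / 60   # 10.5
total_hours = 10 * judges              # the round figure used in the text: 70
minutes_per_portfolio = total_hours * 60 / portfolios       # ~16.9, 'approx 17'

print(hours_per_judge, total_hours, round(minutes_per_portfolio))
```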
The pairs in the 2nd round were closer together, but the relative difficulty of these round 2
decisions was offset by the familiarity (by now) with much of the work. Generally round 2
was quicker than round 1. Round three was a very limited scrutiny of the grade boundaries
and was relatively easy and quick.
The wider feedback from judges about the process of undertaking the three rounds – about
their attitudes to it and about the things that were easy/hard about it – is summarised below.
There is far more feedback data from the judges than we have reported here, and these
snippets are included merely to illuminate the process from the inside. Other sections of the
feedback data (particularly concerning the wider potential of the system in the future) will
appear later in the report under the ‘issues arising’ from the project.
The marking procedure was a blend of the original approach to assessing ‘light fantastic’ as
part of the previous project (Assessing Design Innovation), and the approach we have used
for e-scape.
In a nutshell, we used the original assessment rubric (see Appendix 2.6), but only as far as
step 1. We divided the scripts into 2 sets and Tony Wheeler marked one and Richard
Kimbell the other. This was done partly by rank-ordering the pieces in our respective sets
and then deciding where to latch the rank order onto the quantitative scale. Since this was
done separately, we had in effect created 2 latched rank orders (TWs and RKs) and we then
spent a considerable time moderating the two to reconcile any disagreement either about
the ranking or the latching onto the scale (or both). In this way we created a simple holistic
mark on a scale of 1-12 for each of the 84 ‘light fantastic’ pieces of work.
The research team at TERU have explored the nature of performance in design and
technology in a number of previous projects over the past 25 years. Comprehensive
explanations of the principles behind this work can be found in The Assessment of
Performance in Design & Technology (Kimbell et al 1991) and more recently Assessing
Design Innovation (Kimbell et al 2004).
These prior projects have described the wide range of procedural competences necessary
to make effective progress towards the resolution of design challenges. Furthermore they
describe two characteristic qualities of performance within this overall process, a reflective
ability, allowing us to think around the task, and an active ability, allowing us to take action
in response to the task. These two qualities are linked through the critical quality of appraisal,
and together the three qualities account for the iterative process of ‘to-ing and fro-ing’
between thought and action. The interaction of these three qualities formed the basis of a
holistic assessment framework that has underpinned much of our subsequent work.
Assessing Design Innovation, the precursor to the e-scape project, extended the team’s
original procedural assessment framework to focus more specifically on innovation (design
ideas) and group work, two qualities that have been significantly undervalued in most
assessment systems for design and technology in school. Ideas, rather than outcomes, lie
at the heart of the creative process, and we developed this to embrace having ideas
(sparkiness), growing ideas (modelling) and proving ideas (criticality).
The key purpose of the e-scape project has been to explore the potential of digital
enhancements to the existing paper-based assessment systems developed in previous
projects. Even though the technology has changed the tools and format of both the
evidence we can collect of learners’ performance and the processes we use to judge it, we
have worked throughout to the same procedural and assessment principles. Rather than
reiterate the qualities of performance here, we feel it is more important to describe some of
the key differences and similarities between performance in the paper and digital activities.
The following section is divided into comparisons between the four main media types
available on the PDA, sketching, photography, text entry and audio notes.
just sketches
It is possible to find box 1 examples in both the
paper and PDA tests where learners have only
used sketches with no notes, but these are not
typical, and it could be that the learner ran out of
time before getting round to annotating their work.
just notes
In the paper version of the test we noted a number of
responses to box 1 where learners had just used
notes with no sketches. Interestingly while there are
PDA box 1 examples that are predominantly text,
there do not seem to be any with no sketching at all.
This might be because while writing legibly is
relatively easy on screen, the style of the handwriting
is not as precise as it is on paper, which may have
discouraged some learners from writing.
use of colour
With very little training and only a few minutes to jot down their ideas, learners made
surprising, appropriate and dynamic use of the various colour tools to enhance their
sketches and notes in box 1.
use of tools
The use of the different paint tools allowed learners to create dramatically different
responses, from straight-line technical drawings, to more freeform artistic responses.
number of ideas
Just as we had seen in the paper tests, some learners started by jotting down lots of
different ideas, while others focussed on a single idea and developed this in more detail.
In reviewing learners’ use of the PDA to collect and develop their early ideas and the quality
of the sketches they produced, our overall impression of the sample is that the digital tools
did not appear to hinder the development of ideas in box 1, indeed the range of drawing
tools available has facilitated a much richer and more diverse set of responses than we
typically saw in the paper tests.
else’s ideas, they mostly made notes on the PDA. This was not the case in the paper tests
where there appeared to be more of a balance between sketching and notes across all 3
boxes. In fact on paper, learners tended to follow what had come before with a text-rich box
1 prompting a text-rich response in boxes 2 & 3. We think this difference can in part be
explained by the lack of space on the PDA as learners filled up much more of box 1 on the
PDA than they had on paper, often leaving only small gaps to add further ideas in box 2 & 3.
high quality Box 1s for whole group (average performance for each overall)
typical quality box 1s for whole group Y10 (some high overall performance)
Y5 sketching performance
Although the majority of the learners who completed the PDA task were from year 9 and 10,
we also trialled the system with one group of Y5 learners, and looking through their sketches
it is often difficult to tell the difference between primary and secondary early ideas.
Overall, almost twice as many learners agreed or strongly agreed that the PDA was good
for sketching their early ideas (65%) as disagreed or strongly disagreed (35%).
In conversation with learners during and after the activity their response to the PDA drawing
tools depended on how you asked the question. It was clear, and not surprising, that in the
context of a formal test most learners felt they could create better presentation drawings on
paper using familiar tools and techniques. Most had not thought about the difference
between this form of formal drawing and quick personal sketches to help get their ideas
sorted out, and if they had, they did not consider that these would be valued in a test.
Once a distinction was made between drawing to present, and drawing to sort out your
ideas, only a few learners still felt they would have preferred to draw on paper throughout. In
a future development of the system, where learners have been working with the technology
throughout their course of study, it would be possible to offer learners a choice of medium
that in itself would provide interesting insights into their attitude to communication.
Many learners were frustrated by the size of the drawing area. We had purposely restricted
the size of box 1 on the PDA to the same dimensions as the paper version of box 1. This
was partly for technical reasons (we were concerned that big files would clog up the wireless
system) but also so we could compare digital and paper based responses.
The small screen on the device was also a problem for a number of learners, some reported
it as too fiddly and some screens were misaligned and needed resetting. We are confident
that with sufficient time and access to familiarise themselves with the tools and format of the
device and to personalise it to meet their particular needs, most learners would be satisfied
that the PDA offered an appropriate or better platform for collecting early ideas and taking
notes.
In its present form the PDA is certainly not the best platform for creating presentation
drawings, and even after significant time to get used to the tools, it would require a much
larger input/drawing area and display screen. We are aware of a number of technologies
currently in development, such as mini projectors, virtual keyboards and tabletops, which
could be harnessed in the near future to create a more suitable interface for this type of
work.
The PDA system also allowed us to collect more images (3 per hour) so learners could take
shots from different angles to show different aspects of their work and provide a far richer
pictorial record of the development of their ideas.
The cameras in the PDA were not designed to take pictures close up in poor lighting
conditions. While all the learners picked up the PDA drawing tools quickly, it was difficult to
get all learners to take quality photos of the sketches in their workbooks. We provided black
felt tip drawing pens to ensure high contrast in the drawings (pencil marks were grey on
grey). With the right lighting and positioning it was possible to collect very clear pictures;
however, learners really needed more time to experiment, and better feedback within the
e-scape system to help them select the best photos to keep and transfer to their portfolio.
As with digital sketching it is important to reiterate that good photography skills do not
necessarily mean you are a good designer, and even if you cannot use a camera well, it
doesn’t mean you are a bad designer.
[Figure: examples of learners' text entry: good photo, good sketching, typical D&T (Y10); typical photo, typical sketching, good D&T (Y10)]
[Figure: examples of learners' text entry: good photography, good sketching, good D&T (Y6); good photography, typical sketching, good D&T (Y10)]
[Figure: three box 1 screens, each produced independently, showing a user group implied, a user group suggested, and a specific user group specified]
We provided a set of user profiles, and learners either chose one of these, extended or
amalgamated them, or created their own user profile for their particular design. The
screen shots illustrate the wide range of responses to this sub-task:
As with all the other aspects of the PDA task it is important to reinforce that good performance
in individual sections is not necessarily related to good D&T performance. It’s not the
snapshot of where they are that counts but how this changes throughout the activity.
Even though this was the last sub task and most
learners were exhausted by this stage of the
activity, their texting skills ensured that these text
frames were a rich source of evidence, not only of
their design ideas but also of their attitude and
approach.
second audio note (a sound-bite) explaining what was going well with their design and what
could still be improved. We were keen to explore the role of talking in the design process, an
issue that arose in the previous project (Assessing Design Innovation) when a teacher had
pointed out a particularly animated conversation between two learners. “If only we could
eavesdrop on that conversation we would know so much more about what they are trying to
do…” The e-scape system allowed us to explore the potential of design-talk. We outlined
above (in part one of this report) our early trials of some possible approaches, and
eventually we settled on the sound-bite approach, illuminating the photo-story-line.
We were careful to restrict the length of each audio clip to 30 seconds to avoid data
overload for the judges and to force learners to summarise the key points they wanted to
make. Typically early sound files are descriptive of the modelling process:
“What is working well is that I have got the shape how I want it, I’ve glued it and put a pipe
cleaner inside the corrugated cardboard to make it secure”
Most learners identified strengths and weaknesses in the progress of their work. Their
comments still appeared to be more positively than negatively skewed, but not to the same
extent that we had seen in the paper versions.
“What isn’t going so well is the top bit of my model because I find it hard to cut out the shape
that I want so I need to keep trying with that – but the rest of it’s OK”
Some learners demonstrated developmental growth across the series of audio clips,
identifying a weakness in an earlier clip and reporting it resolved in the next.
“the thing that will need further development is the basket where you pop up the pills and it
has to get caught in the basket because at the moment the basket is in the way of the pills
popping up”
“ I have found out how to make the pills pop up by making a piece of card with a hole in it
and putting some springy stuff over it”
Although some learners, particularly the girls, reported feeling uncomfortable recording the
audio notes, it was surprising how quickly, after the first few attempts, they relaxed and
allowed their authentic voice to come through. Some assessors commented that they felt they
could identify additional aspects of learners’ attitudes as well as intentions, which was not evident
Lift Pitch
The last audio task we set the learners was to ‘pitch’ their ideas to the company directors
during a short (30 second) lift ride from the first to fifth floor. Given how little training,
guidance and previous experience the learners had with this technique the team were
amazed how well they coped with it, both in terms of the quality of the pitches and the
diversity of the styles they used.
Some made a sales pitch to the user reflecting some understanding of why someone might
want their thing
- bright and pink for girly types
- chunky, clear and easy to use for old people
Some made a design pitch to the manufacturer reflecting more analytically about why they
have done things
- why it's made the way it is
- why it looks like that
- why it's good the way it works
- why it will be good for users
- commercial/economic advantages
There were also different ways in which learners presented their pitch, including:
- natural/relaxed conversational style
- humorous/whacky/off-the-wall
- dynamic/gripping/intense/dramatic documentary style
- formal/structured and business like
- concise and to the point
We even had one girl who rapped her presentation.
The following is typical of the type of response to this subtask. As with the strengths and
weaknesses, the transcript is not representative of the subtle dimensions the live audio
conveys and we have included a selection of lift pitch recordings in appendix 1.
“My pill dispenser is unique and individual and it’s got a hand wrist strap that can be worn
around the wrist or attached to things like bags. It’s got 2 main compartments, which inside
have 2 secondary compartments so it can hold up to four different types of pill without
confusion. The lids are tight so the pills don’t fall out and the boxes can be separated and
taken if the person only needs 2 types. It also comes in different colours so that people with
different tastes will like them”
This is not cheating. It is evidence of sophisticated design & technology capability in action
as learners merge and synthesise ideas, an important quality which we struggled to
evidence in the paper activities. With the digital portfolio, it has proved to be far easier to
follow the elusive path of ideas as they pass from one teammate to another.
Despite a somewhat shaky start, the detail above from this Y5 portfolio illustrates significant
growth in this learner’s thinking between box 1 and box 4, a clear indicator of capability.
Compare this to the limited growth between boxes 1 and 4 for this Y10 learner. (Both
learners’ work then develops rapidly and the portfolios rank highly overall.)
[Figure: ‘Plot of Values with Standard Errors’. Portfolio values (y axis, -1 to 7) plotted against rank order (x axis, 0 to 250), with the standard error shown for each placement.]
The distribution of performance across the whole of the e-scape sample is shown in this
chart. It also shows the Standard Error attached to each portfolio placement.
In his analysis of this distribution Pollitt points out that:
if we exclude the 30 best and 30 worst portfolios the average standard error is 0.41
grades… In formal statistical terms 68% of the portfolios’ “true” values will lie within one
standard error of the reported value.
Vertical lines are drawn through the grade boundaries to show how many learners would fall
into each grade. Note that Grades 2-4 are, by definition, of equal size, and that this leads to
more learners lying in the central grade than the ones either side of it. (see Pollitt section
2.13)
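The mechanics of reading learner counts off such boundaries can be sketched briefly. The report does not give the actual boundary values, so the cut-points below are purely hypothetical; the sketch only shows how counts per grade follow from vertical boundary lines on the value scale:

```python
import bisect

def grade_counts(values, boundaries):
    """Count learners per grade, given ascending cut-points on the
    value scale: below the first boundary is grade 1, between the
    first and second boundaries is grade 2, and so on."""
    counts = [0] * (len(boundaries) + 1)
    for v in values:
        counts[bisect.bisect_left(boundaries, v)] += 1
    return counts

# hypothetical portfolio values and grade boundaries
values = [0.5, 1.5, 2.5, 3.0, 4.5]
boundaries = [1.0, 2.0, 4.0]
counts = grade_counts(values, boundaries)  # -> [1, 1, 2, 1]
```

Because the central grades span equal widths of the scale where the distribution is densest, they naturally capture more learners than the outer ones.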
Performance by gender
The e-scape sample was not a gender-balanced sample. Despite our attempts to achieve
this balance – by asking teachers to create such groupings – we often worked with the
pre-existing GCSE groups that ‘belonged’ to the teacher responsible for managing the e-
scape pilot in that school. These groups were not always balanced and we ended with a
sample of 109 girls and 140 boys.
Performance by school
It gradually became apparent to us as we worked through the judging process, that the
schools were not all equivalent in their learners’ performance.
In the APU survey that we conducted for DES in 1991, we demonstrated a far higher ‘school
effect’ than was normal for most subjects and we attributed that to the relative newness of
the subject. We speculated that where design and technology was well established in the
culture of the school, performance was likely to be better than in those schools where it was
new. But this new data is not so easily explained, since in all the schools visited there was
evidently well-established practice.
We suspect that the differences evident here are attributable to a combination of factors.
• greater difficulty (in some schools) with the technology
• less familiarity (in some schools) with innovation-centred designing
• less familiarity (in some schools) with hand-held digital tools
• less flexibility (in some schools) with dealing with the e-scape challenge
The distribution appears to be very similar to that which we noted in the assessments of the
work from 2004. As we reported at that time ...
[Figure: bar chart of year 10 holistic scores (1 to 12), with a polynomial trend line, showing the percentage of the sample at each score falling away towards the upper scores.]
the performance levels to be ‘bottom-heavy’ and for progressively smaller percentages
of the sample to be able to achieve the upper levels. This we believe is a reflection of
the current general performance of learners in design & technology… Design innovation
has not received the attention that it deserves and this is one of the reasons why this
project was established.
(Kimbell et al 2004 p 42)
Despite the thin sample therefore, there is reason to believe that the 69 learners performed
in line with the expectations for that activity.
One of the questions that this sample was intended to help us to answer concerns the
relationship between performance on screen and on paper. Did good ‘on-screen’ designers
also perform well on paper, and vice versa?
There is only a tiny sample of 22 girls and 47 boys, and there is a very limited correlation
between the two sets of figures. For the girls, the correlation is positive but small at 0.2. In
the case of boys there is no correlation at all. When we look closer into the data to see why
this might be the case, it becomes evident that the inter-quartile statistic suggests that there
is indeed a relationship between the datasets, but that any possible correlation is being
destroyed by a small number of extremely interesting cases. In these cases, learners have
a very high score for e-scape and a very low score for Light Fantastic. As examples, the
highest scoring girl in e-scape (5.77) achieved only 3/12 for Light Fantastic, and the
highest scoring boy in e-scape (5.7) achieved only 2/12 for Light Fantastic.
How are we to explain these very large discrepancies? There are very few girls in this
category, and just removing from the list the one extreme-case girl identified above raises
the positive correlation to 0.4. There are rather more boys, but by removing just 4 from the
sample of 47 the correlation rises to above 0.3. Interestingly, one of these 4 was the
profoundly deaf young man mentioned earlier. His work with the e-scape task was the fifth
best of the whole boys’ sample (scoring 4.8), but with Light Fantastic he scored only 2/12
and was in the bottom 15% of the boys’ sample.
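The sensitivity of a correlation to a handful of extreme cases can be illustrated with a short sketch. The scores below are synthetic, not the project's data; the point is only that a single high-e-scape, low-paper case can sharply depress an otherwise strong Pearson correlation:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

# synthetic, broadly linear paper (Light Fantastic) vs e-scape scores
paper = [2, 4, 5, 6, 7, 8, 9, 10]
escape = [1.2, 2.0, 2.4, 2.9, 3.3, 3.8, 4.2, 4.9]
r_full = pearson(paper, escape)  # close to 1

# one extreme case: very high e-scape value, very low paper score
r_outlier = pearson(paper + [3], escape + [5.77])  # much weaker
```

Removing the extreme case restores the correlation, which is exactly the pattern reported for the girls' and boys' samples above.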
Having identified these 5 individuals, we contacted the schools to ask specifically about the
performance of these few. The following comments were reported to us by the teachers we
contacted. At our prompting, they had asked the learners about their reaction to the two
tests.
"I didn't take the light fantastic seriously…..I didn't plan my idea carefully enough….thought
my design wouldn't work…...I wasted a lot of time. We didn't have enough time to make
anything!"
"I enjoyed using the PDA and found it very easy to use…..my idea was better because I had
a chance to improve my paper version."
"I loved using the PDA to create my design...really good way to design ideas……I enjoyed
seeing my friends look at my design idea."
It is interesting that whilst there are these few cases of really good e-scape performance
associated with really poor Light Fantastic performance, there are none in the reverse
category. None of the very high scoring Light Fantastic learners performed very poorly in e-
scape.
It does seem as though for at least a small group within the sample a motivational element
goes some way to explaining the misfit between the two data sets. Learners almost always
found the digital form of the activity unusual and engaging, and perhaps they performed
better than they might have been expected to, and certainly much better than they did in
Light Fantastic.
But the major transformation from previous projects was of course the digital tools and the
web-based portfolio, and these innovations have created some dramatic possibilities for the
future:
• design-talk – using voice files via the PDA – has enabled us (for the very first time in
assessment history) to collect the authentic voice of the learner – on task – and present it in
the web-portfolio in a time-stamped slot. Furthermore, we are able to seam together the
sequence of voice files into a continuous file of approximately 2 minutes, providing a
continuous account (from the learner) of the evolving design product over the 6 hours of the
activity. This account highlights the strengths and weaknesses of their emerging solutions
and, taken as a whole, provides a really good indication of the capabilities of the learner. In
the current version of e-scape, this design-talk is steered by a series of question-prompts
and it seems self-evident that the nature of learners’ response will be driven by the
questions we pose. We need to do further work on this matter to identify an optimum
question set.
• the drawing and text tools were digital replacements for existing paper drawing and writing,
but – in the hands of learners – became more creative tools than we had dared to hope.
Learners’ familiarity with texting enabled them to communicate far more than their teachers
could with the same tool, and section 2.15 demonstrates the imaginative design response of
learners in their use of the drawing tool. In the current version of e-scape, the budget did not
allow for the creation of a bespoke drawing package and we rather adopted an existing tool.
In the future we would ideally evolve a customised tool-set that would be simpler and easier
to access for the very brief time available in the activity (10 minutes here and 5 minutes
there) for this early-stage concept drawing.
• the interplay between digital and non-digital (both paper and materials) worked apparently
seamlessly and encouraged us to the conclusion that it is not a matter of creating entirely
digital or entirely paper-based activities. We can mix and match to maximise the best
combination. The advantage of the digital system is that we can seamlessly collect an
evidence trace of how this interplay is working. In future systems we can imagine a feedback
system that helps learners, as well as their teachers, to monitor this and make judgments
about how effective their choice of tools and designing processes is as they work through a
task.
• we showed in Assessing Design Innovation that a single task can be replicated into (in that
case) 9 different tasks covering the whole spectrum of design & technology (textiles,
systems & control, graphics etc) with an identical activity structure. All that changes is the
task. We can reasonably assume therefore that we could do exactly the same with the e-
scape version; enabling us to create a matrix of tasks that – taken together – cover the
whole spectrum of design & technology.
• moreover, we see no reason why these activities should be restricted to design &
technology. Any activity-based task could be structured using the same toolset – be it an
English composition, a science investigation, a drama improvisation or a geography
problem-solving activity. The key thing is that it is an activity to be pursued in a way that
demands some kind of performance within which learners can demonstrate capability.
Within those parameters it would be possible to design an assessment activity using the
TERU / PDA toolset so that learners create a web-based performance portfolio that can be
assessed remotely.
• this issue is particularly significant in the context of the current debates on the status of
GCSE coursework. An e-scape model of coursework would be structured supervised
coursework managed with the TERU / e-scape toolset and undertaken in controlled
conditions in schools. We believe that there is a great deal of potential in this approach and
have opened negotiations with DfES, QCA, Becta and GCSE Awarding Bodies to create a
nationally scalable version of e-scape for this purpose. See 2.18 below: ‘next steps’.
Within this ‘next step’ we also propose to explore the possibilities of teachers’ monitoring of
coursework beyond the classroom. Time-stamped data in the portfolio show when the work
was done; GPS data can show where the work was done, voice memos show (at least to
some degree) who did it. These technological tools are available and offer a degree of
confidence about the authorship of a piece of work, even when conducted remotely. There
is much potential to be explored here.
• the e-scape national pilot (June / July 2006) demonstrated that learners adopt the system
VERY rapidly. It seems to be a natural extension of their mobile technology/gaming/media-
rich youth culture. We should note however that teachers need more support and more time
to get to grips not just with the technology, but also with the transformations that this creates
for their interactions with learners. There is a tendency for teachers to feel de-skilled by
(most) youngsters’ digital dexterity and to believe that because of this they have nothing to
offer in support of this area. Teachers need help to rebuild confidence in their skills as
educational/learning experts, and to see that combining their learning expertise with young
people’s digital capability offers great potential for developing rich and compelling learning
experiences. This too has messages for future work – see again 2.18 below.
• finally in this section it is worth pointing out that we saw very little discernible difference (in
terms of attitude and response to the e-scape task) with the year 5, KS2 learners. There are
some differences in the quality of the outcomes, and these are attributable principally to less
developed modelling skills and – to some extent – a somewhat more naïve approach to
design. But when we consider these relatively small performance differences between the
year 5 group and the mass of the responses in year 10, we are forced to wonder what has
been happening in the 5 years that separate the two groups.
concerning assessment
i) designing the web-portfolio
Since the design of the web-based portfolio had to be undertaken before we had any
portfolios to put into it, the decisions we made were based on a series of best guesses about
how we might tackle the assessment process. Our approach was to order the sub-task
boxes sequentially on the screen, so that the whole of a learner’s work was laid out
effectively in a time line, giving us an instant overview (‘glanceware’ in software jargon), and
to provide a zooming feature to allow assessors to zoom in and out of the detail of each of
the sections (see section 2.11).
Several difficulties subsequently arose with this layout that made the use of the ‘comparative
pairs’ assessment more difficult than it might have been. As an example, the layout of box 1-
2-3-4 locates the work of the individual learner:
1  their 1st ideas
2  their response to team-mate (a)’s 1st ideas
3  their response to team-mate (b)’s 1st ideas
4  a photo of their consolidated drawing in the booklet
This 2nd arrangement has the weakness that the flow of work from the principal learner is
interspersed with the supportive interventions from the learner’s teammates, but the benefit
is that the consolidated drawing in box 4 makes more sense – since it is a direct result of the
work in box 1/2/3.
Several design issues of this kind arose through the judging process and inform how we
might re-design the web-based portfolio to make it more effective.
Moreover, the structural hierarchy of the site starts with a list of schools (click on one) that
reveals the list of learners (click on one) that reveals his/her portfolio. Having worked for
many days through three phases of judging (see section 2.12/13/14) a number of other
priorities became apparent in terms of accessing the appropriate portfolios. Three priorities
in particular would feature in any subsequent re-design of the web-site.
i) to anonymise the source of the portfolio by using a unique candidate number
system that is independent of the school (which could also have a unique centre
number).
ii) to display the work not merely sequentially – but in ways that allow judges to
customise its layout to suit their preferences (e.g. the relationship between voice
files and photos).
iii) to enable judges to ‘home-in’ on critical areas of evidence more quickly for the
purpose of facilitating their judgments.
In particular it would be desirable in a future system to provide a set of tools to allow each
judge to arrange/view/setup the portfolio in their preferred way, for example by changing the
order of components, or the size of each item on screen, or back-grounding some
components while fore-grounding others or getting some to play automatically in the
background. It was not possible to imagine and model these possibilities at the outset since
we had not at that time been introduced to the judgmental pairs system.
The comparative pairs judging process that we adopted with the support of Alistair Pollitt
(see section 2.12) was premised on the holistic approach to assessment that we have
always advocated for design & technology (see e.g. Kimbell et al 1991, Kimbell et al 2004).
The case for holistic assessment in design & technology lies in a combination of validity,
reliability and manageability, and the comparative pairs approach extends the argument
significantly, particularly in terms of reliability and manageability.
The reliability of the judging process is significantly enhanced over conventional approaches
to assessment, partly because of the multiple comparisons that are made (each portfolio is
seen against many other portfolios) and the multiple judges that do the comparisons.
Furthermore the system automatically flags up both the portfolios that are causing any
difficulty and the consensuality of each of the judges. In either case, anomalies are identified
and can be dealt with. We have described this in section 2.12, but it is interesting here to
consider some further possibilities of the system.
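The report does not specify the statistical model behind the scaling, but the standard way to derive a value scale from many pairwise judgments is a Bradley-Terry-style model. The following is a minimal illustrative sketch; the function name and the simple fixed-point fitting loop are our own, not the e-scape software:

```python
import math
from collections import defaultdict

def bradley_terry(comparisons, n_iter=200):
    """Fit Bradley-Terry strengths from (winner, loser) judgments,
    where P(i beats j) = p_i / (p_i + p_j). Every portfolio must win
    and lose at least once for the estimates to be finite."""
    wins = defaultdict(int)
    meets = defaultdict(int)  # times each unordered pair was judged
    items = set()
    for w, l in comparisons:
        wins[w] += 1
        meets[frozenset((w, l))] += 1
        items.update((w, l))
    p = {i: 1.0 for i in items}
    for _ in range(n_iter):
        new_p = {}
        for i in items:
            denom = sum(n / (p[i] + p[min(pair - {i})])
                        for pair, n in meets.items() if i in pair)
            new_p[i] = wins[i] / denom
        # fix the scale: geometric mean of strengths = 1
        g = math.exp(sum(math.log(v) for v in new_p.values()) / len(new_p))
        p = {k: v / g for k, v in new_p.items()}
    return p

# nine hypothetical judgments among three portfolios
judgments = [("A", "B"), ("A", "B"), ("B", "A"),
             ("A", "C"), ("A", "C"), ("C", "A"),
             ("B", "C"), ("B", "C"), ("C", "B")]
strengths = bradley_terry(judgments)  # A wins most often, C least
```

The fitted strengths give the rank order of the portfolios, and the spread of each estimate under resampling of the judgments is the kind of standard error discussed in section 2.13.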
The potential exists to expand the assessment and learning impact on the people who are
involved in the assessing / judging process. Currently (for GCSE) teachers assess their own
learners’ portfolios – but do so with very limited external reference. They do not see work
from other schools and the teachers from other schools do not see their learners’ work. This
insularity has two downsides. First, teachers’ assessments are unable to recognise the
wider local, regional or national picture of capability, and second (even more important) the
teacher does not get a sense of the variety and strength of other work that is being
undertaken. The professional development potential of the current arrangement is therefore
limited. It has long been recognised that the real benefit of becoming an external examiner
or moderator is that one gets to see so much other work from so many places and at many
levels of capability.
Imagine then a situation in which all teachers are e-scape-style judges. They get to see all
kinds of work in the website and can compare their own learners’ portfolios with those from
many other schools. The effect would be to make all teachers into external assessors /
judges. Quite apart from the assessment benefit of this, the professional development
benefit could be substantial.
Going further however, imagine if learners themselves were able to access the website –
albeit probably through a different set of gateways. Teachers often try to keep hold of copies
of previous work to provide models of performance for their current learners to see and gain
inspiration from. Imagine a situation in which learners were not only able to see this other
work (in the website) – but were encouraged also to engage in the judging process.
Comparing one’s own work with that of many others could – if managed properly – provide a
very valuable learning tool.
Current arrangements continually suffer from the challenge of assessing trends over time. Is
this year’s work better than last year’s – or worse? Are standards going up or down? With
an e-scape style system it would be possible to integrate previous portfolios (from last year)
into the sample for this year – and see where they end up in the rank order. This would
provide an immediate measure of the stability (or otherwise) of the scale.
Moreover these insertions from previous years might be benchmarked to indicate grade
boundaries and again the system would provide an automatic register of how these
boundaries appear in the current scale. Particularly in the context of coursework
assessment, one can imagine a new order in which many benefits could flow from the wider
application of e-scape style portfolios assessed through a comparative pairs judging
process.
Finally however it is important to note that there would also be a knock-on effect in the
classroom – concerning in particular the relationship between formative and summative
assessments.
Currently teachers often use project-work assessment rubrics as teaching tools, pointing out
for learners the kinds of things that they will need to do to get all the marks for this section or
that section of the rubric. In an e-scape world, however, whilst there would still be criteria to
guide the judging process, they would not work in the same way. Marks would not be
allocated against individual criteria and then added up. Rather the judge makes an overall
balancing decision about the strength of this piece as against the strength of that one. The
summative assessment process therefore becomes far simpler and quicker than is currently
the case. Our judges in this project (particularly the teacher who is currently involved in
GCSE assessments) were clear about the relative speed and simplicity of the approach.
See section 2.14.
But there is a consequence for formative classroom assessment, for the rubric is no longer
something that can be used as a scoring guide to show learners how they might improve
their work. In its place, however, one can imagine all sorts of alternative and more holistic
support systems for learners. Since the judging process is based on a holistic judgment –
learners would need to understand what it is about their current work that makes judges say
it is better (or worse) than another piece of work. The focus of formative assessment would
be on how the overall quality and impact of the work might be enhanced.
Furthermore, it also seems perfectly possible to adapt the pairs judging process so that it is
more than just holistic. We can imagine a rubric in which the holistic judgment is followed by
3 or 4 major category sub-judgments – again using direct comparison with other portfolios.
                Portfolio A    Portfolio B
holistically    wins           loses
category (a)    wins           loses
category (b)    loses          wins
category (c)    wins           loses
The comparative pairs system has never previously been used in this way, but in the context
of exploring the relationships between summative and formative assessment this would
seem to us to be fertile and important territory.
For these reasons we are interested in the future direction of this technology, and the
following examples illustrate the way things are moving.
See http://www.alpern.org/weblog/stories/2003/01/09/projectionKeyboards.html
We can however speculate on an extension of this idea. If the projector can project a
keyboard there is no reason why it could not also project a blank sheet onto the tabletop. In
this situation one might draw with a pencil on a ‘virtual’ plain sheet of paper and have it
‘sucked up’ by the projection sheet as a digitised drawing. As with the keyboard, the
drawing could equally be transferred to the PDA by Bluetooth. The PDA is therefore no
longer quite so constricting as a drawing tool, since, just like the keyboard, the digital
sketchpad has become external to it.
Speech-to-text systems
During the early stages of the project we experimented with speech to text systems and
Chatterbots. The processing power and software developments for/on desktop computers
are such that these systems are now viable. Presenting young people with the facility to talk
their ideas into a web portfolio and then the option to access this evidence as audio or
automatically generated transcripts would be liberating for some learners. While we can
demonstrate how this system might work, we will have to wait a while for it to be available
for groups, since identifying individual voice profiles in a multiple-voice free flowing
discussion is currently beyond the scope of the technology.
Mini-projection
We are all familiar with data projectors and with the fact that they continue to get smaller
and smaller. Five years ago the smallest were like cornflake boxes, but now the smallest are
more like individual cornflakes.
See http://www.lightblueoptics.com/
described as digital sellotape that binds together some elements of pre-existing software (for
drawing / writing / photography and dictation) into a form that could facilitate the e-scape
activity.
We have described above how the hardware may well develop in the next few years, and
the software is equally capable of development. Perhaps most critical would be the creation
of an authoring tool that allows teachers to build activities of their own choosing rather than
being entirely constrained by the design of the activity we built into the software. We can
imagine activities in science, history, music and geography that have an e-scape-like
framework of sub-tasks and timings. Such an application could be enormously empowering
for teachers, putting them in the driving seat and allowing them to develop and customise
activities for their own setting, their own learners, and their own timescales.
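Such an authoring tool does not yet exist, but the shape of what it might produce can be sketched. In the Python sketch below, the sub-task prompts, evidence modes and timings are illustrative assumptions only, not the actual e-scape task structure:

```python
from dataclasses import dataclass

@dataclass
class SubTask:
    prompt: str    # instruction pushed to each learner's handheld device
    mode: str      # evidence expected back: "sketch", "text", "photo" or "audio"
    minutes: int   # time-box before the teacher moves the class on

# A hypothetical activity a teacher might author (content is illustrative):
activity = [
    SubTask("Sketch your first ideas for the product", "sketch", 10),
    SubTask("Describe the strengths of your best idea", "text", 5),
    SubTask("Photograph your model so far", "photo", 3),
    SubTask("Record what you would change next time", "audio", 4),
]

def total_minutes(steps):
    """Total time-boxed length of the authored activity."""
    return sum(s.minutes for s in steps)
```

A tool of this kind would let a science or history teacher swap the prompts, modes and timings while the server-side sequencing and portfolio-building machinery stayed unchanged.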
The key point here is that the whole system is driven by a remote server dynamically
sending and receiving data to and from the PDA, putting the teacher in control of the
sequences of the task and automatically building an evidence trail in the web portfolio.
Concerning the pedagogic challenges (e.g. the most effective structuring of activity sequences
to maximise learners’ performance), we resolved these through an extended set of trials of
the task, the activity structure, the booklet, the timings and the resources. Each school trial
focused on teasing out a particular set of issues and enabled us to arrive at a
satisfactory end point. The best evidence for its success is that the learners all
responded to the activity so well. Whilst the technology was no doubt part of its attraction,
the task succeeded in engaging them and the activity enabled them to demonstrate their
capability.
The key point here is that everything we did for the purposes of collecting evidence for
assessment also helps to scaffold the progress of the activity for learners.
Concerning the manageability challenges (e.g. is it possible for all learners to have
ubiquitous access to digital tools in a normal workshop setting?), we again sought to deal with
these through the trialling process. We were aware, for example, that learners would need
training with the PDA – but did not know how much or for how long. We are aware of the state
of most workshop environments and were interested to explore the robustness of the PDA in
these potentially harsh environments (e.g. getting dropped on concrete floors). We started
with 100 devices and after all the trialling and the national pilot had one screen broken and
none lost. The reaction of teachers both to the PDA and the activity was enthusiastic. We
have established that the approach adopted for e-scape was indeed manageable for
learners, for teachers and for the research team.
The key point here is the infusion of technology into activity. Real-time activity in studios,
workshops, playing fields, theatres, science labs and the like is typically not aligned with
digital power, which instead sits in splendid isolation in the shimmering purity of IT
suites. In e-scape we have shown how the technology can get down and dirty and unleash
its digital power where it is really needed. And in the national pilot we demonstrated that it
was manageable.
Finally, concerning the functionality of the assessment system, it is perhaps here that the
most dramatic conclusions might be drawn. The e-scape approach enables learners to
create web-based portfolios directly from their classroom design activity. The web-based
nature of the portfolios has in turn enabled us to explore a quite new paradigm for
assessment – comparative pairs. The direct connection to real-time activity in the
studio/workshop supports the validity of the assessment, and the comparative pairs model
of assessment enabled us to achieve high levels of reliability in the assessment judgments.
Moreover the system reports on the effectiveness (consensuality) of each of the judges, and
we were all well within acceptable tolerances.
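The comparative pairs model rests on well-established pairwise-comparison statistics (Thurstone 1927; Pollitt 2004). The Python sketch below conveys the general idea using a simple Bradley-Terry style iteration; it is not the project's actual algorithm, and the 'consensuality' check is likewise only an illustration of how per-judge agreement might be reported:

```python
from collections import defaultdict

def pairwise_scale(comparisons, iterations=50):
    """Turn 'portfolio X beat portfolio Y' judgments into a quality scale.

    comparisons: list of (winner, loser) portfolio ids.
    Returns {portfolio_id: strength}; higher strength = judged better.
    Uses a basic Bradley-Terry majorisation update.
    """
    wins = defaultdict(int)
    pair_counts = defaultdict(int)
    items = set()
    for winner, loser in comparisons:
        wins[winner] += 1
        pair_counts[frozenset((winner, loser))] += 1
        items.update((winner, loser))
    strength = {i: 1.0 for i in items}
    for _ in range(iterations):
        updated = {}
        for i in items:
            denom = 0.0
            for pair, n in pair_counts.items():
                if i in pair:
                    j = next(x for x in pair if x != i)
                    denom += n / (strength[i] + strength[j])
            updated[i] = wins[i] / denom if denom else strength[i]
        total = sum(updated.values()) or 1.0
        # rescale each round: only relative strengths are meaningful
        strength = {i: v * len(items) / total for i, v in updated.items()}
    return strength

def consensuality(judgments, strength):
    """Fraction of each judge's decisions that agree with the final scale."""
    agree, total = defaultdict(int), defaultdict(int)
    for judge, winner, loser in judgments:
        total[judge] += 1
        if strength[winner] > strength[loser]:
            agree[judge] += 1
    return {j: agree[j] / total[j] for j in total}
```

A judge whose consensuality falls well below the others is flagging either idiosyncratic judging or portfolios that genuinely divide opinion; either way the system surfaces it automatically.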
The key point here is that performance assessment is notoriously difficult, and at both ends
of the system. It is difficult to manage the performance itself in ways that assure equity to all
learners and it is difficult to ensure reliability in the assessment. Within e-scape we have
created a system that operates right across this terrain. Learners and teachers report that it
worked well at the performance end, and the data shows that it produced reliable statistics
at the assessment end. The prototype has done enough to demonstrate that it is a functional
system for assessment.
Next steps
As we write this report, we are in discussions with DfES, QCA, Becta and the Awarding
Bodies concerning the new directions that might be taken with project e-scape. It is clear
that the system works and opens up many possibilities for development – both for teaching
& learning and for assessment – and the challenge is to decide on the most appropriate next
step.
Whilst the 1st one could stand alone – and might be developed by further explorations of
hand-held digital tools linking to websites – the 2nd is naturally linked to the 1st, since it is only
possible because of the 1st. And both innovations are essentially about access.
This 1st innovation enabled us to capture genuine designing/problem-solving classroom
activity directly into the website. Every learner had direct access to significant digital
processing power in the midst of (and throughout) the workshop activity and without being
tied into a computer lab or IT suite. They were liberated to operate as autonomous
designers but with ‘back-pocket’ access to the website. Thereafter, since the judges had
immediate access to all the portfolios, all the time, we could – for the 1st time in assessment
history – exploit the possibilities of comparative pairs judging for ‘front line’ assessment.
It is these two innovations that will be at the centre of the proposed next step for e-scape.
With the 1st innovation, we see no reason in theory why any kind of activity-based
assessment task could not be substituted for the design & technology task that we
developed. English composition, science investigation, or music performance might be the
focus of such development and we propose to explore the extent to which the system can
be made to operate across disciplines. As part of this, we will create an ‘authoring’ interface
that allows the teacher to build an activity through a series of sub-task steps, each with
a specified timing.
The 2nd part of the ‘next step’ concerns the integration of the e-scape system into Awarding
Body data management systems. For the assessment to work as genuine front-line
assessment it has to be seamlessly linked to a national system that allows schools to ‘enter’
candidates for the assessment. Thereafter it has to enable their performance to be judged and
the outcome to be managed, once again, through the awarding process that results in
individuals achieving an authenticated grading from an Awarding Body.
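The linkage described here amounts to a simple pipeline from candidate entry, through judging, to an authenticated grade. The record fields and grade boundaries in the Python sketch below are illustrative assumptions, not an actual Awarding Body schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Entry:
    candidate_id: str
    centre: str                    # the entering school
    portfolio_url: str             # web portfolio built during the activity
    grade: Optional[str] = None    # filled in by the awarding stage

def award(entries, scale_score, boundaries):
    """Map each candidate's position on the judged scale to a grade.

    scale_score: {candidate_id: score} from the comparative-pairs judging.
    boundaries: list of (minimum_score, grade), highest cut-off first.
    Both are illustrative stand-ins for Awarding Body processes.
    """
    for entry in entries:
        score = scale_score[entry.candidate_id]
        for cutoff, grade in boundaries:
            if score >= cutoff:
                entry.grade = grade
                break
```

The essential point the sketch makes is that the web portfolio is the hinge: the same record that the school enters is the one the judges see and the one to which the grade is ultimately attached.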
We propose to develop a scalable national system of assessment built around – and linking
together – these two priorities. The matter has been given added urgency by the current
concern over GCSE coursework assessment, which has increasingly been criticised for its
lack of trustworthiness in terms of activity administration (whose work is it?) and
the reliability of assessment judgments.
The proposed next step would enable us to retain some of the important aspects of
coursework, particularly the necessity for learners to tackle real tasks, over time, and
evolve individual solutions, but would enable this to happen in a tighter framework of school-
administered activity. Thereafter, learners’ performance can be judged – using a form of
comparative pairs – to arrive at a highly reliable assessment outcome. The two major
drawbacks with coursework assessment are thereby nullified.
Discussions are advanced for this new project – e-scape phase 3 – to run from Feb 2007 to
March 2009, by which time a scalable national system will be operational.
References
Black P, Harrison C, Lee C, Marshall B, Wiliam D 2003 Assessment for Learning: putting it
into practice Buckingham: Open University Press ISBN 0-335-21297-2
Black P & Harrison C 2004 Science inside the black box: assessment for learning in the
science classroom London: NFER Nelson ISBN 0-7087-1444-7
Department for Education and Employment (DfEE) 1999 Design and technology: The
National Curriculum for England: Department for Education & Employment (DfEE) and the
Qualifications and Curriculum Authority (QCA). London
Department for Education & Skills (DFES) 2003 Survey of Information and
Communications Technology in Schools Oct 2003 HMSO - available at
http://www.dfes.gov.uk/rsgateway/DB/SBU/b000421/index.shtml
Department for Education & Skills (DFES) 2005 Harnessing Technology: Transforming
Learning and Children’s services: DFES e-strategy - available at
www.dfes.gov.uk/publications/e-strategy
Haste H 2005 Joined-up texting: the role of mobile phones in young people’s lives
Nestlé Social Research Programme – available at
http://www.mori.com/polls/2004/nestlesrp3.shtml
IMS Global Learning Consortium Inc. Sept 2004 IMS ePortfolio Best Practice and
Implementation Guide available at
http://www.imsglobal.org/
Laming, D. (2004) Human Judgment: the eye of the beholder. London, Thomson.
Office for Standards in Education (OFSTED) 2004 ICT in schools – the impact of
government initiatives:
Secondary design and technology May 2004 OFSTED publications - available at
http://www.ofsted.gov.uk/publications/index.cfm?fuseaction=pubs.summary&id=3649
Office for Standards in Education (OFSTED) 2004 (ii) ICT in schools: The impact of
government initiatives 5 years on May 2004 OFSTED publications - available at
http://www.ofsted.gov.uk/publications/index.cfm?fuseaction=pubs.summary&id=3652
Prime Minister’s Strategy Unit 2005 Connecting the UK: the Digital Strategy. A joint
report with Department of Trade and Industry - available at
http://www.strategy.gov.uk/work_areas/digital_strategy/index.asp
Pollitt, A (2004). “Let’s stop marking exams”. Paper given at the IAEA Conference,
Philadelphia, September. Available at:
http://www.cambridgeassessment.org.uk/research/confproceedingsetc/IAEA2004AP
Qualifications & Curriculum Authority (QCA) 2004 E-assessment expert seminar BECTa
9th Dec 2004
Qualifications & Curriculum Authority (QCA) May 2005 Assessment for Learning: website
http://www.qca.org.uk/7659.html
Thurstone, LL (1927) “A law of comparative judgment” Psychological Review, 34, 273-286
Tomlinson Report 2004 14-19: Curriculum and Qualifications reform. Final report of the
Working Group on 14-19 reform DfES Publications DFE-0976-2004