
Data collection instruments

The data collected for this action research project was large in scope. The data collection
instruments included a pre-test, a post-test, a pre-survey, a post-survey, students’ portfolios
containing the students’ work and results in the different inquiry-based tasks, and teacher
field notes.

The sources of data for this action research project were the students and me, the science
teacher. To collect data from the students, I administered the tests and surveys and asked them
to keep portfolios that included the whole collection of their work throughout the intervention
(Hopkins, 2008). The data from me, the teacher researcher, came mainly from the field notes
that I took throughout the research project.

The pre-test and the post-test were administered and scored in a consistent manner: the
questions, the test conditions, the scoring procedures, and the interpretations were all
predetermined and standardized so that the results would be valid and relevant. Each test
consisted of seven multiple-choice questions and three open questions on the most important
issues related to the systems of the human body. The tests were scored using criterion-referenced
score interpretation, in which each test-taker is compared to a criterion, a formal definition
of the scientific content, regardless of the scores of other examinees (Brown, 1990).
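The criterion-referenced scoring described above can be sketched in a few lines. The raw scores and the 70% mastery cutoff below are hypothetical, not taken from the study; the point is only that each examinee is judged against a fixed criterion, never against the other test-takers.

```python
# Criterion-referenced scoring: each test-taker is compared to a fixed
# criterion (here a hypothetical 70% mastery cutoff), regardless of how
# the other examinees scored.
MAX_SCORE = 10          # 7 multiple-choice + 3 open questions
MASTERY_CUTOFF = 0.70   # hypothetical criterion, not stated in the study

def criterion_score(raw_score, max_score=MAX_SCORE, cutoff=MASTERY_CUTOFF):
    """Return (proportion correct, mastery flag) for one examinee."""
    pct = raw_score / max_score
    return pct, pct >= cutoff

# Hypothetical results for three students
for name, raw in [("Student A", 4), ("Student B", 7), ("Student C", 9)]:
    pct, mastered = criterion_score(raw)
    print(name, f"{pct:.0%}", "mastered" if mastered else "not yet")
```

A norm-referenced test would instead rank each student against the rest of the class; here the cutoff never moves, no matter how the group performs.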

The content validity of these two achievement tests was evaluated by making sure that the test
items matched the instructional objectives proposed in the fourth-grade science program and the
content of the unit on the systems of the human body in the students’ Scott Foresman Science
textbooks for this grade. The internal consistency of the tests was high, as the questions were
taken from the Scott Foresman Science series for grade four, which means that they were written
by experienced science teachers, pretested, and selected on the basis of the results of a
quantitative item analysis (Foresman, 2004).

The pre-survey and the post-survey, used to collect data about students’ progress in terms of
inquiry, learning styles, and learner autonomy, were designed to be strong on validity, the
degree to which the study accurately reflected or assessed the specific concepts I was
attempting to measure (Burns, 1999), and strong on reliability, the extent to which these
instruments yielded the same result on repeated trials.
The survey format put some strain on validity, since the students’ real feelings were hard to
capture in response options such as "totally agree," "agree," "neutral," "disagree," and
"totally disagree"; these were only approximate indicators of what I had in mind when I created
the survey questions. Reliability, on the other hand, was a clearer matter. The two surveys
presented all students with a standardized stimulus, and so they went a long way toward
eliminating unreliability in my observations (Schwalbach, 2003). Careful wording, format, and
content also significantly reduced unreliability on the students’ side.

In general, the surveys were reliable because they measured things consistently, and valid
because they measured what they claimed to measure.
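The text does not say how the surveys' reliability was quantified. One common internal-consistency statistic for Likert-type items is Cronbach's alpha, sketched below with hypothetical responses purely as an illustration of how "measuring things consistently" can be checked.

```python
# Cronbach's alpha, a common internal-consistency statistic for
# Likert-type survey items. Not mentioned in the source; shown only
# to illustrate one way reliability can be quantified.

def cronbach_alpha(responses):
    """responses: one row of item scores per student."""
    k = len(responses[0])                       # number of items

    def variance(xs):                           # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [variance([row[i] for row in responses]) for i in range(k)]
    total_var = variance([sum(row) for row in responses])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical 1-5 Likert responses: four students, three items
data = [[4, 5, 4], [2, 2, 3], [5, 5, 5], [3, 3, 2]]
print(f"alpha = {cronbach_alpha(data):.2f}")    # values near 1.0 = consistent
```

Students who answer the items in a consistent pattern push alpha toward 1.0; random answering pushes it toward 0.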

The field notes allowed me to make a written account of the inquiry-based lessons on the
systems of the human body as they were being taught to this group of fourth-grade students.
Notes were taken while events were happening in the science sessions so that the information
collected was fresh and not distorted (Jimenez, 2006). The field notes took the form of a
record of work (Burns, 1999) for the students in the science lessons, including the date, time,
class, objectives, work done, the way the students did the work, homework, participation,
things that worked well, and things that did not work very well. This record-keeping provided
valuable data about the intervention for drawing conclusions about how the inquiry-based
learning unit increased the level of autonomy awareness in this group of fourth-grade students,
thus contributing to validity and relevance.

When the students’ portfolios were created, I made sure that they were appropriate and fair for
the students, as this was essential in terms of validity and reliability. Validity and
reliability were considered for both the individual pieces and the portfolio as a whole, since
the portfolio was a collection of students’ work and assessments. Both the audience and the
purpose of the portfolio played a major role in determining content validity (Freeman, 1998).
This instrument had high content validity because it was developed for a specific science class
working toward a certain purpose and specific objectives.

To evaluate the content validity I asked myself if the portfolio matched the instructional
purpose and objectives of the inquiry-based project and whether the portfolio assessed what I
set out to assess from the beginning. To ensure content validity, I set the purpose of the
portfolio in line with project objectives, matched the contents of the portfolio to the purpose,
and established clear criteria in relation to the original objectives (McFarland, 1997). In
general, the students’ portfolios had high content validity because they integrated instruction
and assessment as the work that students produced in the classroom and at home showed
improvement in terms of autonomy awareness.

Data collection procedures

Because the data I collected was large in scope, including standardized test results, survey
results, portfolios, and field notes, I sifted through every piece of information I had and put
it into categories that then became groups (Burns, 1999). One group was related to the tests I
administered; another contained the survey results; another contained the field notes; and the
last contained the student portfolios, with the students’ work and results in the different
inquiry-based tasks. After painstaking sifting, sorting, and grouping, some of my research
findings began to surface.

Since sorting into categories is one of the most important steps in the data analysis process and
I had already grouped the events that took place during the implementation of the inquiry-
based learning unit, I followed the steps described below in order to make sense of the data I
collected. The main purpose was to start drawing out some theories and explanations in order
to interpret the meaning of those trends, characteristics or features that became apparent
(Burns, 1999).

Bearing in mind that this was an action research project, I knew I had to analyze the data,
interpret it and develop a theory about what the data meant in order to improve my teaching
practice in science classes while increasing the students’ autonomy awareness.

To shape the overall process of data analysis, I used the framework adapted from McKernan
(1996). I started by assembling the data that I had collected over the period of my research:
pre-test, pre-survey, students’ portfolios, field notes, post-test and post-survey. The initial
questions that began my research project provided me with a starting point for rereading the
data, which I first scanned in a general way.

I noted down thoughts, ideas and impressions that occurred to me during this initial
examination of the information that I gathered especially from the field notes that I took
throughout the intervention. At this stage, broad patterns began to show up which I thought
could be compared and contrasted later on to analyze what fit together (Burns, 1999). Once I
had gathered all the data collected through these instruments, I found that students:

• were more willing to start the class.
• enjoyed doing the experiments and watching the videos I played them.
• were attentive to the teacher’s instructions and read the instructions to perform some tasks.
• wanted to show the teacher their portfolios and their findings at home.
• searched for information on their own about the topics.
• included in their portfolios a good number of different items related to the systems of the
human body.
• started asking more questions than they used to at the beginning of the unit.
• tried to find the answers to the questions on their own using their textbooks and materials
from their portfolios.
• enjoyed working with their classmates in pairs and groups.
• participated actively in experiments and hands-on activities.
• were able to draw conclusions after inquiry activities.
• showed in their pre-test results that they didn’t know much about the systems of the human
body even though they had studied this topic in previous years.
• showed in their post-test results improvement in their knowledge about the systems of the
human body.
• showed in the pre-survey results that most of them preferred student-centered classes.
• showed in the pre-survey results that they did not feel very autonomous.
• showed in the pre-survey results that they were hesitant about their preferred learning
styles.
• showed in the post-survey results that they felt more autonomous.
• showed in the post-survey results that they identified their preferred learning styles better.
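The pre-/post-test findings in the list above can be summarized numerically with a simple gain calculation. The scores below are hypothetical, since the study's actual class data is not reproduced here.

```python
# Paired pre-/post-test comparison. The scores (out of 10) are
# hypothetical; the study's actual class data is not reproduced here.
pre  = [3, 4, 2, 5, 4]
post = [7, 8, 6, 9, 7]

gains = [b - a for a, b in zip(pre, post)]
mean_gain = sum(gains) / len(gains)

print(f"mean pre-test  = {sum(pre) / len(pre):.1f}")
print(f"mean post-test = {sum(post) / len(post):.1f}")
print(f"mean gain      = {mean_gain:.1f}")
```

A consistently positive per-student gain is what the "improvement in their knowledge" finding above amounts to in numeric form.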

Once there had been some overall examination of the data that I thought illuminated the
question I was researching, I created categories related to inquiry-based learning and its effect
on the students’ autonomy awareness and jotted a name for each one of them until I
completed a list of categories (Burns, 1999). The objective was to reduce the amount of data
that I had collected to more manageable categories of concepts, themes or types. The
following are the categories for my data:
Category 1: Student-centered methodology vs. teacher-centered methodology
Category 2: Encouraging students to inquire
Category 3: Learning styles
Category 4: Learning autonomy
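The coding step can be sketched as mapping field-note excerpts onto these four categories by keyword. Both the keywords and the excerpt below are hypothetical; in the actual project the coding was done through the researcher's own reading of the notes, not automatically.

```python
# Reducing field-note excerpts to the four analysis categories by simple
# keyword matching. Keywords and excerpt are hypothetical illustrations;
# real qualitative coding relies on the researcher's judgment.
CATEGORIES = {
    "Student-centered vs. teacher-centered methodology": ["group work", "lecture"],
    "Encouraging students to inquire": ["question", "experiment"],
    "Learning styles": ["visual", "hands-on"],
    "Learning autonomy": ["on their own", "portfolio"],
}

def code_excerpt(excerpt):
    """Return every category whose keywords appear in the excerpt."""
    text = excerpt.lower()
    return [cat for cat, keywords in CATEGORIES.items()
            if any(k in text for k in keywords)]

note = "Students searched for information on their own and asked more questions."
print(code_excerpt(note))
```

One excerpt can land in several categories at once, which is exactly why the later comparing-and-connecting step across categories is needed.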

To move beyond describing, categorizing, coding, and comparing in order to make sense of the
data I had collected, I reached a point in the analysis that demanded a certain amount of
creative thinking: it was time to articulate underlying concepts and develop theories about why
particular patterns of behavior, interactions, or attitudes had emerged.

I came back to the data I had collected several times and I posed questions about it, rethought
the connections among the data and developed explanations of the bigger picture underpinning
my action research project (Burns, 1999). Then I discussed the data patterns and themes with
some of the school’s science teachers trying to find new discoveries or interpretations.

Using Data Collection Instruments


Using Data Collection Instruments for People
There are many kinds of data collection instruments: highly scientific instruments designed to
analyze data for scientists and simple tests to check babies' developmental progress are both
data collection instruments. People are by far the most complicated subjects a company,
scientist, or historian will ever encounter, and people's feelings and opinions are especially
sought after by companies.
People sometimes feel differently about certain subjects when in a group than they might when
left on their own; conversely, they may feel differently about a subject in front of an
interviewer than they do when left alone with their thoughts and able to express themselves in
writing. For this reason, there are three main data collection instruments scientists or
company analysts commonly use to gauge people's opinions.
Written Questionnaires
One data collection instrument often used is the written questionnaire or quiz. A written
questionnaire is the least expensive and least biased data collection instrument readily
available to scientists: the questions always stay constant, and it can be more convenient for
respondents to answer. On the downside, because no one is monitoring the questions, some people
may choose not to answer the questionnaire at all.
Face to Face Interview
Sometimes scientists or interviewers use face-to-face interviews as a data collection
instrument. Face-to-face interviews allow for more open-ended questions and, as a result, a
more personal view of the question at hand. The interviewer or scientist could, however, skew
the results with their own judgment or interpretation of the information given; this method can
also be costly, because employees need to be paid to conduct the interviews.
Focus Groups
Another data collection instrument commonly used by companies and scientists is the focus group
interview. These interviews can yield responses that are challenging to turn into readable
data, because so many different opinions are given at once and because much of the population
would rather agree with the people around them, in order to be accepted, than stand out from
the crowd with their opinions.
However, focus groups can also be a terrific data collection instrument, because there is
greater potential for more concise data, and because an interviewer is on site, more open-ended
questions can be asked and therefore more in-depth information obtained from the subjects.
Data collection instruments are varied, and the term itself has a broad range of meanings;
every field uses some form of data collection instrument to help its business or research
progress.
Doctors use sophisticated technology to collect data about diseases and cures, while businesses
use other people to collect data about many aspects of their company: some businesses use data
collection instruments to improve efficiency, others to train their employees or make them
happier. In any circumstance, data collection proves the old saying that knowledge is power.

ACTIVITY LIST
1. Collections development
• Review and identify materials for selection, including gift and exchange and
aggregations
• Review and identify materials or formats for cancellation
• Collection analysis and work with collection reports (including vendors, in-house)
• Maintain relevant collections development statistics
• Usage statistics gathering and analysis
2. Negotiations and licensing
• Work with consortia, vendors, publishers, etc.
• Discuss and attempt to alter pricing and other terms
3. Subscription processing, routine renewal, and termination
• Order new subscriptions, not including collection development (see #1 above).
Includes downloading bibliographic record, verifying title information, and
creating the purchase order.
• Register and activate electronic subscriptions
• Renew existing subscriptions and licenses, including receiving, verifying,
accepting vendor quotes. Does not include negotiating (see #2 above).
• Order periodical back-orders, microfilm backfiles
• Maintain access to electronic subscriptions, including claiming missing or
incomplete items and communicating with vendors and publishers regarding
access problems
• Cancel subscriptions or licenses, not including collection development decisions
(see #1 above).
• Notify vendors of IP range changes for electronic subscriptions
• Claiming missing items
• Identify and place orders for missing/lost items
• Set up vendor information in payment system, post invoices there
• Verify and approve payments and transfer information to accounts payable
• Investigate invoice payments for vendors and publishers
4. Receipt and check-in
• Periodicals delivery to campus (preparing bins, boxes, etc)
• Periodicals check-in (for the currently received issues)
• Identify and make changes to current issue display (includes addition of notes
and setting up or changing check-in patterns)
5. Routing of issues and/or tables of contents
• Create and maintain periodical route lists
• Perform actual routing for periodicals and related follow-up
6. Cataloging
• Copy, original, and enhanced cataloging for new periodicals and for title
changes, cessations, etc
• Catalog maintenance, including updating URLs
• Create or maintain a list of journals, Web-based or otherwise, other than the
OPAC itself
• Perform authority control functions on records
• Create and update volume holdings
• Correct holdings and check in errors
APPENDIX B: Data-Collection Instruments
Roger C. Schonfeld, Donald W. King, Ann Okerson, Eileen Gifford Fenton


• Withdrawal activities (location information and last copy withdrawal)
• Union listing activities with OCLC, RLG, etc
7. Linking services
• Maintain and enhance linking services such as SFX
8. Physical processing
• Spine labeling
• Bar coding
• Inserting and applying bookplates
• Tattle-taping
• Stamping and marking
• Binding, rebinding, and related activities
• Initial shelving of item upon receipt
9. Stacks maintenance (including microform and current issues areas)
• Shelf-reading of current periodicals and bound volumes
• Shelf maintenance; i.e., labeling shelves/ranges
• Collection shifting
• Collection weeding, including transfer of journals to remote storage
• Cleaning of stacks and materials
10. Circulation
• Checkout
• Paging
• Searching for missing items
• Recalling overdue materials
• Check-in
• Reserves activities
• Reshelving as a result of circulation or other use
11. Reference and research
• Directional/access questions
• Reference assistance, including over the phone, Internet, and in person
• Assistance that requires going “off the desk” (such as to the stacks)
• Creation of resources/guides
12. User instruction
• Prepare for and conduct tours, briefings, sessions, demonstrations
• Other user instruction
13. Preservation
• Conservation and repair
• Preservation microfilming
• All preservation/archiving associated with electronic periodicals
• Disaster recovery planning and activities
• Binding is not included in this category: see item #8 above
14. Electronic infrastructure and support
• This category is intended to capture those activities, for any format, that require
electronic infrastructure and associated support, including:
Maintaining hardware and software for OPAC, Library Management
System, and other relevant servers
LAN support
Workstation support
Other relevant systems office activities
15. Other
• Please explain in detail on the activity log

The Nonsubscription Side of Periodicals
ACTIVITY LOG
Study of the Operating Costs of
Periodicals Collections in Various Formats
Staff Activity Log – Representative Month
The materials that you now have in hand are part of a study that is being conducted
in order to learn about library operating costs for different kinds of periodical
collections. The data that this study will gather are important to us because they will
help to shape JSTOR’s new Electronic-Archiving Initiative, which has as its goal the
long-term preservation of electronic versions of scholarly materials. Ensuring the
longevity of these materials is a challenging task, and this study is a most important
early step in the effort. Your personal help with this important effort is sincerely
appreciated. Your library is one of a small number of academic libraries partnering
with us in this research effort. Through this study, we are hoping in particular to
understand the economic effects of the transition from print toward electronic
journals, which will in turn help us to understand how an archive of electronic
journals will relate to existing library costs.
There are two components to this study. The most important component is the one
that you are now reading, the Staff Activity Log. We hope that you will help us by
carefully completing this document, which will allow us to understand how you and
the other staff of your library contribute to the periodicals operation. Be assured that
this study has been carefully designed to ensure your personal anonymity. The
second component to this study is an Institutional Survey, which is being completed
by your library to document other components of periodicals work. Together, these
two components should provide JSTOR, and the broader scholarly community, with
unprecedented data on the internal operating costs of the various periodicals
formats. This in turn will help to inform all manner of decisions about periodicals
collecting and storage.
Thank you very much in advance for your assistance in this effort. We appreciate the
time and attention that you are giving to this project.
-Eileen Gifford Fenton and Roger Schonfeld
Instructions
We ask you to complete the Staff Activity Log on the attached sheet to help
determine what activities related to periodicals you have performed in a recent
representative month and to indicate how much time you spent on these activities.
This will be easiest to do if you begin by identifying your work-related activities and
locate them on the Activity List (which is provided separately). Then, specify the
format related to each activity. Finally, estimate the percentage of the representative
month that you devoted to each activity. Please be sure to read the definition of
periodicals carefully and consult the more detailed directions below.
Definition of periodicals. Please note that periodicals are defined as serial
publications that contain separate articles, stories, other writings, etc., and are
published or distributed generally more frequently than annual. Newspapers and
monographic serials are NOT periodicals.*

* This definition is substantively identical to, and was adapted from, the 006 code for Type of
Continuing Resource, which appears in the Online Computer Library Center’s Bibliographic
Formats and Standards, Third Edition, available at http://www.oclc.org/bibformats/ .
Identify your activities. Please look through the Activity List and select the activities
that best describe your work related to periodicals. Make a note of any periodicals-
related activity that you performed in the month in any capacity of your library job. If
you did not hold your present position for the entire month, please estimate the work
that would have been completed if you had worked the entire month. Don’t worry
about listing the activities in any particular order. Find the activity number on the
Activity List and record it on the Staff Activity Log along with a brief description.
If periodicals work is only part of your job (say, half the day in serials cataloging,
the other half in general reference), you can lump together all the non-periodical
activities without breaking them down further.
Vacation, sick leave, and holidays should be indicated as a separate activity. Breaks,
not including lunch, should also be indicated as a separate activity.
In past studies, participants have found it useful to first think about occasional or
irregular activities. Then think about your daily activities, such as lunch,
coffee breaks, checking and responding to email, and so forth. Next, identify your
regular periodicals-related activities. Finally, be sure to complete the Staff Activity
Log’s last line, indicating any non-periodicals work that you perform.
Note the periodical format. It is vitally important for this study that you note as
accurately as possible how your activities are distributed among the four formats,
Hardcopy Current Issues (C), Hardcopy Backfiles (H), Microform (M), or Electronic (E).
Please record the format on the appropriate column of the Staff Activity Log. If a
given activity involves more than one format, please split it into separate activities,
one for each format.
Estimate the time spent. You can provide this information using either hours or
percentages, whichever will be easier and more accurate for you. Record the amount
or proportion of time you spent in the month performing each activity you have
listed.
As a guide, if you work 9 to 5, i.e. a 40-hour week, each day is about 5% of your
month. Therefore, if you took one week of vacation, it would account for 25% of your
month. Two 15-minute coffee breaks taken each day account for about 6% of your
monthly time. On the reverse of this page is a guide for converting actual time spent
to a percentage of total time. Be sure the percentage column totals 100%.
Send your completed Staff Activity log in the stamped, addressed envelope provided.
Guide for converting “Actual Time” to “Percentage of Time” for a 40-hour Work Week

Actual Time                                    Percentage (Rounded) of the Month
Two hours                                      1%
One day, assuming 6 hours worked               4%
One full day, 8 hours total                    5%
One full week, or 40 hours total               25%
Two coffee breaks at 15 minutes each day       6%
One hour per day, every day                    13%
Formula. Often, it will be easier to convert the amount of time you worked on an
activity into a percentage based on this formula:
% = (# of hours spent on one activity) ÷ (# of hours you work in a given month)
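Assuming a 40-hour week over a four-week (160-hour) month, as the guide above does, the formula can be written as a short function:

```python
# Convert time spent on an activity into a percentage of the month,
# assuming a 40-hour week and a four-week (160-hour) month as the
# guide above does.
HOURS_PER_MONTH = 40 * 4

def pct_of_month(hours, month_hours=HOURS_PER_MONTH):
    return hours / month_hours * 100

print(pct_of_month(8))         # one full 8-hour day
print(pct_of_month(40))        # one full week
print(pct_of_month(0.5 * 20))  # two 15-minute breaks over 20 workdays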
SAMPLE COMPLETED ACTIVITY LOG

Columns: Activity Number (1-15; see Activity List) | Activity (please take language from the
Activity List or jot a more detailed activity description) | Format: Hardcopy Current Issues
(C), Hardcopy Backfiles (B), Microform (M), Electronic (E) | Time Spent in the Representative
Month (express as a number of hours or as a percentage)

3   | Maintain access to e-subscriptions    | E   | 20%
5   | Route electronic TOCs to faculty      | E   | 10%
11  | Provide reference services            | E   | 10%
11  | Provide reference services            | B   | 10%
11  | Provide reference services            | C   | 5%
1   | Select periodicals for the collection | E   | 10%
N/A | Breaks (not including lunch)          | N/A | 20%
N/A | Vacation, holidays, and sick leave    | N/A | 5%
N/A | Non-Periodicals Work                  | N/A | 10%
    | Total                                 |     | 100%
Creating Data Collection Instruments

Welcome! This resource page was designed to accompany our various workshops on creating
online survey instruments. We keep this page active as a service to those of you who would
like to learn more about creating and using online data collection tools.

Background Material

Surveys are just one part of a complete data collection and evaluation
strategy. Within this context, surveys are powerful tools that can provide
insight into skills, competencies, and attitudes among teachers, administrators,
and other members of your educational community. Nevertheless, we have
found that all too many technology coordinators and/or district staff have
simply substituted "survey" for evaluation when it comes to assessing
technology integration and impact. In the NECC workshop and in the article referenced
below, we have articulated the need for a richer evaluation and assessment process.

• Powerpoint presentation for NECC MP356
• An article on data collection strategies
• Information on our technology evaluation process

Sample Surveys

These samples are designed to showcase some of the most common features of
simple online surveys. All of these make use of the BNBFORM cgi script which is
referenced later on this page. Feel free to view the samples, but note that
they will not actually "work" in terms of being a vehicle for your actual data
collection. To use or adapt them, you would first need to install the BNBFORM
script on your webserver and modify various form variables for your own
environment per the instructions provided with the script.

The Script

OK, for those of you who were looking for the techie part, here it is. Online
surveys make use of a scripted set of commands that instruct your web server
to collect, parse, and report the data submitted via a web-based survey form.
This is all of the behind-the-scenes technical stuff that happens when you click
the SUBMIT button on a survey form. In our workshop, we have used a freeware
cgi script which is available from a cgi script resource site called
BigNoseBird.com.
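BNBFORM itself is a Perl cgi script and its code is not reproduced here; as a language-neutral illustration of the "collect and parse" step that happens behind the SUBMIT button, a URL-encoded form submission can be parsed like this (the field names and values are hypothetical):

```python
# What happens behind the SUBMIT button, sketched in Python: the server
# receives a URL-encoded body and parses it into fields. (BNBFORM itself
# is a Perl cgi script; this is an illustration, not its code.)
from urllib.parse import parse_qs

# A hypothetical body, as a browser would POST it from a survey form
body = "name=Pat&q1=totally+agree&q2=neutral&comments=Great+unit%21"

fields = {key: values[0] for key, values in parse_qs(body).items()}
for key, value in fields.items():
    print(f"{key}: {value}")
```

A script like BNBFORM does this parsing and then reports the results, typically by email or by appending them to a data file on the server.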

While this script is well documented and can be installed by someone with a
relatively small amount of technical know-how, it's still important that you
(assuming you're the person with the "relatively small amount of technical
know-how") have access to a technical web professional who understands how
your server is configured and operates. This person can explain some of the
more mysterious elements of the script configuration, and in particular help
you with the locations (on your server) of various utilities and programs that
are necessary for script functionality. Yes, this is a bit more complicated than
the use of day-to-day software applications.

A good test to see if you need technical assistance -- or to see if you might be
getting in over your head -- is to view the readme.txt file for the script. If this
stuff makes sense to you, and seems like something you can handle...go for it.
Otherwise, maybe you should show this to your webmaster.

• Download cgi script from BigNoseBird.com.


• View the script as a text file
• Read the readme.txt file for the script

HTML Tools

We highly suggest that you use a web design tool such as Dreamweaver, FrontPage, or Go
Live! to create and edit your survey pages. While it is possible to develop a survey page
with a plain text editor (by typing in raw HTML code) or a word processor (such as Word),
such tools do not have the integrated capacity to do things like easily switch between
code and WYSIWYG views or upload your completed pages to a server. Frankly, the
frustration you might experience using somewhat inappropriate tools is not worth it when
you can purchase a real web design tool for under $200 (academic pricing).

• We use Macromedia's Dreamweaver to produce our web pages. You can download a preview
version of Dreamweaver from the Macromedia website.
• If you must use a text editor, one of the best is Bare Bones Software's BBEdit Lite
6.1. BBEdit Lite is free, but the paid version includes expanded functionality
(especially for creating web pages).

Creating Your Own Survey

The best way to get started creating online surveys is to modify an existing
survey form. Before you do this, you need to understand the principles of basic
HTML (or use one of the tools mentioned above) and the basic structure of the
bnbform.cgi script. Once you have these prerequisites, here's a very simple script to play
with. From your browser, view this as SOURCE and save it as a local file. Then you can
proceed to edit it using an HTML tool (or, if you dare, a simple text editor).