Measuring Quality: Choosing Among Surveys and Other Assessments of College Quality
Introduction ... 1
General Issues ... 3
National Assessments of Institutional Quality ... 5
Using Assessment Results Effectively ... 11
Conclusion ... 17
Table 1. Instrument, Administrator, Purpose, Use of Data, History, and Information Collected ... 18
Table 2. Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline ... 26
Table 3. Reporting, Data Availability, Local Items, Costs, and Contact Information ... 34
As the public and political leaders have come to perceive higher education as both more important and more expensive than ever, demand has grown for accountability data and consumer information on the relative quality of individual colleges. The Survey of College and University Quality (SCUQ) was developed by leaders in the field of higher education assessment to help postsecondary institutions meet the demands of governing boards, accrediting agencies, and other stakeholders. Participating institutions have found this to be a rich source of data for marketing and recruitment as well. Perhaps most importantly, college and university faculty have embraced the results of the SCUQ as the most credible and useful evidence of student learning in college.

We invite you to join the hundreds of leading colleges and universities that participate in this survey ...
2 Measuring Quality: Choosing Among Surveys and Other Assessments of College Quality
General Issues
National Assessments of Institutional Quality
The three tables at the end of this guide summarize the characteristics of 27 national
assessment instruments and services. The first 21 instruments and services
assess the attitudes, experiences, and learning goals and gains of entering students
(6), various groups of enrolled undergraduates (8), student proficiencies and
learning outcomes (5), and alumni (2). Two services offer a series of instruments for
students at varying points in their academic careers. The final four instruments and
services assess institutional and program effectiveness through the views of various
constituents, including faculty, administrators, students, and board members.
(NSSE), which was developed by a panel of leading assessment scholars as a model for quality in undergraduate education. The NSSE uses principles similar to those that guided the development of the CSEQ; however, the CSEQ is a longer instrument that covers specific outcomes and experiences in more detail. The NSSE, although relatively new, has a larger base of institutional participants than the CSEQ.

HERI's newest assessment instrument was developed in collaboration with the Policy Center on the First Year of College at Brevard College. Your First College Year (YFCY) is both a follow-up to the CIRP freshman survey and an assessment of students' experiences with first-year programs such as learning communities, residential interest groups, and introductory courses. Similar to the NSSE, the YFCY focuses on specific types of programs and student behaviors that have emerged from higher education literature as best practices in undergraduate learning.

During the past 10 years, many institutions have adopted the Noel-Levitz Student Satisfaction Inventory (SSI) as part of a strategic enrollment management initiative. The Noel-Levitz instrument uses a gap analysis technique to array students' satisfaction against their perceived importance of various aspects of the college experience. Noel-Levitz also has recently released a version of the SSI tailored to the needs of adult learners, called the Adult Student Priorities Survey (ASPS).

Student proficiencies and learning outcomes

The enrolled student surveys reviewed to this point focus on attitudinal and behavioral aspects of the student experience. Faculty at a number of colleges and universities now use one of several assessment instruments focusing on student learning outcomes in specific content areas and general education. ACT's Collegiate Assessment of Academic Proficiency (CAAP) consists of a set of modules that assess student proficiency in writing, reading, math, science reasoning, and critical thinking. Users may customize these modules based on institutional needs. The Academic Profile developed by the Educational Testing Service (ETS) provides a similar assessment of general education skills and proficiencies. ETS also offers a specific assessment for critical thinking as well as a set of major field tests in 14 academic subject areas, such as biology, economics, music, and psychology. The Project for Area Concentration Achievement Testing (PACAT), housed at Austin Peay State University, offers flexible content in eight subject-specific Area Concentration Achievement Tests (ACAT), which it can customize for individual institutions. PACAT also offers three additional subject area tests that do not yet have the flexible content option.

Institutions typically use the general education and subject area instruments from ACT, ETS, and PACAT for internal assessment and improvement purposes. However, many institutions tap into the rich source of learning outcomes information from these instruments to demonstrate their effectiveness to accrediting agencies. College adminis-
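The gap analysis technique mentioned above, as used conceptually by instruments like the Noel-Levitz SSI, can be sketched in a few lines: for each aspect of the college experience, mean satisfaction is subtracted from mean importance, and large positive gaps flag areas students value but find lacking. The data, aspect names, and function below are all hypothetical; this is an illustration of the general technique, not Noel-Levitz's actual scoring.

```python
# Illustrative importance-satisfaction gap analysis on 1-5 Likert ratings.
# All responses below are hypothetical.

def mean(values):
    return sum(values) / len(values)

def gap_analysis(responses):
    """responses maps each aspect to a list of (importance, satisfaction) pairs.
    Returns (aspect, mean importance, mean satisfaction, gap) tuples, sorted so
    the largest gaps (valued but under-delivered areas) come first."""
    results = []
    for aspect, pairs in responses.items():
        imp = mean([i for i, _ in pairs])
        sat = mean([s for _, s in pairs])
        results.append((aspect, round(imp, 2), round(sat, 2), round(imp - sat, 2)))
    return sorted(results, key=lambda r: r[3], reverse=True)

responses = {
    "advising": [(5, 2), (4, 3), (5, 3)],
    "campus life": [(3, 4), (4, 4), (3, 3)],
}
for aspect, imp, sat, gap in gap_analysis(responses):
    print(f"{aspect}: importance={imp} satisfaction={sat} gap={gap}")
```

In this toy data, advising shows a large positive gap (highly important, low satisfaction) while campus life shows a small negative one, which is exactly the contrast the gap technique is designed to surface.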
among the priorities of various constituents, including students, faculty, and staff. NCHEMS offers an instrument, in both two- and four-year institution versions, that asks faculty, students, and staff to answer similar sets of questions about institutional performance and effectiveness. Finally, ETS offers the Program Self-Assessment Service, in both undergraduate and graduate versions, which assists academic programs engaging in a self-guided review.

The instruments and services reviewed in this section are by no means an exhaustive representation of all the assessments currently available; they offer a sampling of the various tools that college and university faculty and administrators can use in their assessment and accountability efforts. Too often, institutions see the results of one of these assessments as the endpoint of the process. The next section considers the questions and issues that determine whether such assessment efforts provide useful information for planning, evaluation, and decision-making processes that promote accountability and improvement.
often limited. Furthermore, adding to the length of any assessment instrument detracts from the response rate and response quality. Response formats for local items usually are limited to five-point Likert-type scales (for example, strongly disagree, disagree, neutral, agree, and strongly agree). The institution may be able to vary the response scale, but doing so often detracts from the reliability of the instrument.

How can campuses, governing boards, policy makers, or other constituents use the results for public accountability?

With all the inherent limitations of any particular assessment, college and university presidents must concern themselves with who receives the results of these assessments and how their institution packages and disseminates these results to external audiences. The data from most of the instruments described in this guide are considered the property of the institution. As such, the administrators, faculty, and staff of the institution control the way the information is packaged and presented.

Several of the instruments discussed in this guide were designed with public accountability as a primary purpose. One explicit goal of the NSSE is to impact national rankings of colleges and universities. Peterson's College Results Survey, which students and alumni can complete outside the control of their institution, may become a centerpiece of Peterson's consumer information services. An increasing number of state and university systems are using common assessment surveys to benchmark institutional effectiveness.

Because of this trend, the use and control of results from some of these national assessments for public accountability may become more complicated in the future. In the short run, these instruments provide a valuable source of campus-level accountability information for governing boards, public constituencies, and accreditation agencies. In the long run, the use of these and other types of assessments for internal planning and improvement may be the best support for external accountability. Colleges and universities that aggressively evaluate their programs and services, and act on that information to improve those programs and services, will gather a rich body of evidence to support their claims of institutional effectiveness.

What are the advantages and disadvantages of using national surveys, compared to local instruments?

National surveys are a relatively cost-effective way to gather assessment information. The organizations that develop these instruments often devote greater technical and financial resources than can an individual institution. They also test these instruments on a broader population than can a single institution. The availability of comparative data from other institutional participants is an important advantage to using a national assessment, but, as mentioned earlier, comparative benchmarks may be of limited use if the comparison group does not include institutions with student profiles similar to the target institution or if student profile differences otherwise are not taken into account. Moreover, comparative data often are not available at disaggregate levels, where results can be most potent.

Local assessment instruments provide much greater control and attention to
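The benchmarking caveat discussed above, that comparisons mislead when the norm group does not share the target institution's student profile, can be made concrete with a small sketch. The institutions, residential-share figures, scores, and the profile-matching rule below are all hypothetical:

```python
# Illustrative benchmarking comparison (hypothetical data): a mostly commuter
# campus compared against all survey participants vs. a profile-matched peer
# group. The verdict can flip depending on the comparison group chosen.

def mean(xs):
    return sum(xs) / len(xs)

# (institution, residential share of students, mean engagement score)
participants = [
    ("A", 0.90, 3.8), ("B", 0.80, 3.9), ("C", 0.20, 3.1),
    ("D", 0.10, 3.0), ("E", 0.15, 3.2),
]
target_profile, target_score = 0.12, 3.25  # the mostly commuter campus

# Norm over all participants, dominated here by residential campuses.
all_norm = mean([score for _, _, score in participants])

# Norm over peers with a similar residential share (within 0.1).
peer_norm = mean([score for _, share, score in participants
                  if abs(share - target_profile) <= 0.1])

print(f"vs. all participants: {target_score - all_norm:+.2f}")
print(f"vs. commuter peers:   {target_score - peer_norm:+.2f}")
```

With this toy data, the campus sits below the all-participant norm but above the norm for institutions with a similar commuter profile, illustrating why the composition of the comparison group matters.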
well-developed assessment programs typically use a variety of assessments, including but not limited to national instruments. No single method offers the best approach. Each has benefits and limitations, which is why a comprehensive assessment program usually includes multiple, convergent methods. For institutions that do not have well-established assessment programs, the national surveys described in this guide can be catalysts for further development. Specific findings can lead to action. Institutions can use perceived limitations as points of departure to identify the focus of future assessments.

Do these assessments encompass the views of all major student and other constituent groups?

Most of the instruments reviewed in this guide draw information about the college student experience directly from students: new, continuing, and graduated. A few instruments gather input on institutional quality from faculty, administrators, and staff. One voice that is completely absent from these assessments is that of the individuals and organizations who interact with and hire college and university graduates: employers, graduate school administrators, social agencies, and so forth. Several specialized accrediting agencies require college program administrators to survey employers, but there is as yet no popular national instrument.

Many of the popular surveys of the college student experience (for example, the CIRP Freshman Survey, CSEQ, and NSSE) originate from research on traditional-aged students attending residential universities. Some instruments are tailored for two-year colleges, such as the CCSEQ and Faces of the Future. Among the 15 instruments included in ACT's Evaluation and Survey Services program is an assessment designed to evaluate the experiences of adult learners (for example, nontraditional-aged and commuter students). Noel-Levitz's new Adult Student Priorities Survey (ASPS) also is tailored toward this group. However, the research base still is somewhat skewed toward the definitions of quality that emerge from the traditional college experience.
Table 1. Instrument, Administrator, Purpose, Use of Data, History, and Information Collected

ENTERING UNDERGRADUATES

Cooperative Institutional Research Program (CIRP) Freshman Survey/Entering Student Survey (ESS)
Administrator: Higher Education Research Institute (HERI) at UCLA and American Council on Education (ACE)
Purpose: Collects demographic and attitudinal information on incoming students. Serves as baseline for longitudinal follow-up. Measures trends in higher education and characteristics of American college freshmen.
Use of data: Admissions and recruitment; academic program development and review; self-study and accreditation; public relations and development; institutional research and assessment; retention studies; longitudinal research about the impacts of policies and programs.

Freshman Class Profile Service
Administrator: American College Testing (ACT)
Purpose: Summarizes the characteristics of ACT-tested enrolled and nonenrolled students by institution.
Use of data: Institutional marketing and recruitment: knowledge of competitors, characteristics of enrolled students, feeder high schools, etc.

Student Descriptive Questionnaire (SDQ)
Administrator: The College Board
Purpose: Provides a basic profile of students who took the SAT.
Use of data: Admissions and recruitment; institutional research and assessment; retention studies.

Admitted Student Questionnaire (ASQ) and Admitted Student Questionnaire Plus (ASQ Plus)
Administrator: The College Board
Purpose: Studies students' perceptions of their institution and its admissions process. Facilitates competitor and overlap comparisons.
Use of data: Recruitment; understanding of market position; evaluation of institutional image; calculation of overlap win/loss; evaluation of financial aid packaging.

College Student Expectations Questionnaire (CSXQ)
Administrator: Center for Postsecondary Research and Planning (CPRP) at Indiana University
Purpose: Assesses new students' expectations upon matriculation. Findings can be compared with student reports of their actual experiences as measured by the College Student Experiences Questionnaire (CSEQ).
Use of data: Comparison with CSEQ data to identify areas where the first-year experience can be improved. Also can be used for campus research and assessment initiatives.

ENROLLED UNDERGRADUATES

College Student Survey (CSS)
Administrator: HERI
Purpose: Evaluates students' experiences and satisfaction to assess how students have changed since entering college. Can be used longitudinally with the CIRP Freshman Survey.
Use of data: Student assessment activities; accreditation and self-study; campus planning; policy analysis; retention analysis; and study of other campus issues.
ENTERING UNDERGRADUATES

Admitted Student Questionnaire (ASQ) and Admitted Student Questionnaire Plus (ASQ Plus)
History: ASQ was developed in 1987, ASQ Plus in 1991.
Information collected: Student assessment of programs, admissions procedures, literature, institutional image; financial aid packages; common acceptances; comparative evaluations. ASQ Plus provides specific institutional comparisons.

ENROLLED UNDERGRADUATES

College Student Survey (CSS)
History: The CSS was initiated in 1993 to permit individual campuses to survey undergraduates at any level and to conduct follow-up studies of their CIRP Freshman Survey respondents.
Information collected: Satisfaction with college experience; student involvement; cognitive and affective development; student values, attitudes, and goals; degree aspirations and career plans; Internet, e-mail, and other computer uses.
ENROLLED UNDERGRADUATES

Faces of the Future
Administrator: American Association of Community Colleges (AACC) and ACT
Purpose: Assesses the current state of the community college population and explores the role community colleges play in students' lives.
Use of data: Community college students; benchmarking; comparisons to national data; tracking of trends in student population.

College Student Experiences Questionnaire (CSEQ)
Administrator: CPRP
Purpose: Measures quality of students' experiences inside and outside the classroom, perceptions of environment, satisfaction, and progress toward 25 desired learning and personal development outcomes.
Use of data: Outcomes of college; accreditation review; institutional research, evaluation, and assessment; student recruitment and retention; assessment of undergraduate education.

Community College Student Experiences Questionnaire (CCSEQ)
Administrator: University of Memphis, Center for the Study of Higher Education
Purpose: Measures students' progress and experiences.
Use of data: Self-study and accreditation review; assessment of institutional effectiveness; evaluation of general education, transfer, and vocational programs; use of technology; measurement of student interest, impressions, and satisfaction.

National Survey of Student Engagement (NSSE)
Administrator: CPRP
Purpose: Gathers outcomes assessment, undergraduate quality, and accountability data. Measures students' engagement in effective educational practices (level of challenge, active learning, student-faculty interaction, supportive environment, etc.).
Use of data: Institutional improvement and benchmarking; monitoring of progress over time; self-studies and accreditation; and other private and public accountability efforts.

Your First College Year (YFCY)
Administrator: HERI and Policy Center on the First Year of College at Brevard College
Purpose: Designed as a follow-up survey to the CIRP Freshman Survey. Assesses student development during the first year of college.
Use of data: Admissions and recruitment; academic program development and review; self-study and accreditation; public relations and development; institutional research and assessment; retention studies; longitudinal research; first-year curriculum efforts.

Student Satisfaction Inventory (SSI)
Administrator: Noel-Levitz
Purpose: Measures students' satisfaction.
Use of data: Student retention; student recruitment; strategic planning and institutional effectiveness.
History: SSI was piloted in 1993 and became available to institutions in 1994.
Information collected: Ratings on importance of and satisfaction with various aspects of campus. The survey covers most aspects of student experience.

Adult Student Priorities Survey (ASPS)
Administrator: Noel-Levitz
Purpose: Measures satisfaction of students age 25 and older.
Use of data: Student retention; student recruitment; strategic planning and institutional effectiveness.
History: ASPS was piloted and became available to institutions in 2000.
Information collected: Ratings on importance of and satisfaction with various aspects of campus. The survey is specific to the experience of adult students.
Collegiate Assessment of Academic Proficiency (CAAP)
Administrator: ACT
Purpose: Assesses college students' academic achievement in general education skills.
Use of data: Document levels of proficiency; compare local populations via user norms; establish eligibility requirements; report educational outcomes for accountability and accreditation; improve teaching; and enhance student learning.

Academic Profile
Administrator: Educational Testing Service (ETS) and The College Board
Purpose: Assesses college-level general education skills.
Use of data: Describe performance of individuals and groups; measure growth in learning; use data as a guidance tool and performance standard.

Tasks in Critical Thinking
Administrator: ETS
Purpose: Assesses proficiency in college-level, higher order thinking skills.
Use of data: Each student receives a confidential report on skills performance. Data can help institutions learn more about teaching and program effectiveness.

Major Field Tests
Administrator: ETS
Purpose: Assesses students' academic achievement in major field of study.
Use of data: Measure student academic achievement and growth and assess effectiveness of departmental curricula for planning and development.
History: These tests originally were based on the GRE subject tests and are jointly sponsored by ETS and the GRE Board. Each test is periodically updated to maintain currency with standard undergraduate curricula.
Information collected: Factual knowledge; ability to analyze and solve problems; ability to understand relationships; and ability to interpret material. Available for 15 disciplines; see www.ets.org/hea for listing.

Area Concentration Achievement Tests (ACAT)
Administrator: Project for Area Concentration Achievement Testing (PACAT) at Austin Peay State University
Purpose: Assesses outcomes and provides curriculum-specific feedback on student achievement.
Use of data: Provide specific program analysis.

ALUMNI

Comprehensive Alumni Assessment Survey (CAAS)
Administrator: National Center for Higher Education Management Systems (NCHEMS)
Purpose: Measures evidence of institutional effectiveness and reports on alumni personal development and career preparation.
Use of data: Help clarify institutional mission and goals and assist in developing new goals. Evaluate mission attainment and impact of general education programs, core requirements, and academic support services.

College Results Survey (CRS)
Administrator: Peterson's, a Thomson Learning Company
Purpose: Identifies personal values, abilities, occupations, work skills, and participation in lifelong learning of college graduates. Uses alumni responses to establish a unique institutional profile.
Use of data: Peterson's uses data collected online for consumer information at http://www.bestcollegepicks.com. Institutions use data collected in collaboration with Peterson's for self-study.

SERIES OF INSTRUMENTS

Student Outcomes Information Survey (SOIS)
Administrator: NCHEMS
Purpose: Collects information about students' needs and reactions to their educational experiences.
Use of data: Longitudinal assessment of students' experiences and opinions.

Evaluation/Survey Services
Administrator: ACT
Purpose: Assesses needs, development, attitudes, and opinions of students and alumni.
Use of data: Accreditation; program and service assessment; outcomes assessment; retention; alumni follow-up; institutional self-study.

Faculty Survey
Administrator: HERI
Purpose: Collects information about the workload, teaching practices, job satisfaction, and professional activities of collegiate faculty and administrators.
Use of data: Accreditation and self-study reports; campus planning and policy analysis; faculty development programs; benchmarking faculty characteristics.

Institutional Priorities Survey (IPS)
Administrator: Noel-Levitz
Purpose: Assesses faculty, staff, and administrative perceptions and priorities (recommended with the SSI to determine where priorities overlap with those of students).
Use of data: Student retention; student recruitment; strategic planning and institutional effectiveness. Institutions can pinpoint areas of consensus on campus.

Program Self-Assessment Service (PSAS) and Graduate Program Self-Assessment Service (GPSAS)
Administrator: ETS
Purpose: Assesses students' opinions on undergraduate and graduate programs.
Use of data: Used by departments for self-study and as additional indicators of program quality for accreditation purposes.
Table 2. Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline

ENTERING UNDERGRADUATES

Cooperative Institutional Research Program (CIRP) Freshman Survey/Entering Student Survey (ESS)
Administrator: Higher Education Research Institute (HERI) at UCLA and American Council on Education (ACE)
Target institutions and samples: All types/Incoming students (ESS specifically designed for two-year institutions).
Participation: Since 1966, 1,700 institutions and 10 million students have participated. In fall 2000, 717 institutions and 404,000 students participated.

Freshman Class Profile Service
Administrator: American College Testing (ACT)
Target institutions and samples: All types/All ACT test-takers.
Participation: Over 1 million high school students are tested each year. This service includes more than 550,000 enrolled students from 900 institutions each year.

Student Descriptive Questionnaire (SDQ)
Administrator: The College Board
Target institutions and samples: All types/All SAT test-takers.
Participation: All students who participate in the SAT complete the SDQ. Responses are sent only if the student indicates 'Yes' to being included in the Student Search Service.

Admitted Student Questionnaire (ASQ) and Admitted Student Questionnaire Plus (ASQ Plus)
Administrator: The College Board
Target institutions and samples: All types/All admitted students.
Participation: Every year, 220 institutions participate and 400,000 students are surveyed.

College Student Expectations Questionnaire (CSXQ)
Administrator: Center for Postsecondary Research and Planning (CPRP) at Indiana University
Target institutions and samples: Four-year public and private institutions/Incoming students.
Participation: More than 33,000 students at two dozen different types of colleges and universities participate.

ENROLLED UNDERGRADUATES

College Student Survey (CSS)
Administrator: HERI
Target institutions and samples: All types/All students.
Participation: CSS has collected data from more than 230,000 students at 750 institutions.
ENTERING UNDERGRADUATES

CIRP Freshman Survey/ESS
Format: Four-page paper survey.
Administration procedure: Colleges order surveys from HERI, administer surveys on campus, and return completed surveys for processing. Most campuses administer the survey in proctored groups.
Timeline: Register for survey in the spring. Surveys are administered in the summer and fall, usually during orientation. Report available in December.

Freshman Class Profile Service
Format: Responses are collected via paper as part of the registration materials for the ACT Assessment. They are later electronically combined with assessment results for reporting and research purposes.
Administration procedure: Students complete this when registering for the ACT. Responses and ACT scores are sent to schools and institutions.
Timeline: Institutions register in July of each year. Enrollment information is sent to ACT from September through June; reports are produced within 30 to 60 days.

Student Descriptive Questionnaire (SDQ)
Format: This paper-and-pencil instrument is completed as part of the test registration process.
Administration procedure: Students complete the questionnaire prior to taking the SAT. Responses and SAT scores are sent to schools.
Timeline: Tapes and/or diskettes are sent to institutions six times per year as part of SAT test reports.

ASQ and ASQ Plus
Format: Each program has matriculating and nonmatriculating student versions of a standardized paper survey. Optional web version also is available.
Administration procedure: Colleges administer and collect surveys, then send them to The College Board for processing. ASQ Plus asks colleges to identify their major competitors, and students rate their college choice vs. other top choices.
Timeline: Institutions determine when to mail surveys, but The College Board recommends that they do so as soon as they know who will enroll (usually mid- to late May). Follow-up strongly recommended.

College Student Expectations Questionnaire (CSXQ)
Format: Four-page paper survey; web version under development. Takes 10 minutes to complete. Demo version at www.indiana.edu/~cseq.
Administration procedure: Institutions administer surveys and return completed instruments to CPRP for processing. Web version is administered via a server at Indiana University; institutions provide student contact information.
Timeline: Most institutions administer the survey during fall orientation. To compare student expectations with actual experiences, colleges administer the CSEQ to the same students the following spring.

ENROLLED UNDERGRADUATES

College Student Survey (CSS)
Format: Four-page paper survey.
Administration procedure: Campuses administer surveys and return them to the data processing center. Campuses may choose to survey students who completed the CIRP for the purposes of longitudinal study.
Timeline: Register January 1 or May 1. Two administration periods available: January through June and July through December. Reports from first period available in fall, from second period in February of subsequent year.
ENROLLED UNDERGRADUATES

College Student Experiences Questionnaire (CSEQ)
Administrator: CPRP
Target institutions and samples: Four-year public and private institutions/All students.
Participation: More than 500 colleges and universities and approximately 250,000 students have participated since 1983 (when the second edition was published).

Community College Student Experiences Questionnaire (CCSEQ)
Administrator: University of Memphis, Center for the Study of Higher Education
Target institutions and samples: Community colleges/All students.
Participation: The 1991 edition collected data from 45,823 students at 57 institutions. The 1999 edition collected data from 18,483 students at 40 institutions.

Faces of the Future
Administrator: American Association of Community Colleges (AACC) and ACT
Target institutions and samples: Community colleges/All students (credit and noncredit).
Participation: In fall 1999, more than 100,000 students at 250 institutions participated in the survey.

National Survey of Student Engagement (NSSE)
Administrator: CPRP
Target institutions and samples: Four-year public and private institutions/First-year and senior students.
Participation: After the 1999 field test, the first national administration was in spring 2000 with 195,000 students at 276 institutions. CPRP annually surveys approximately 200,000 students at 275 to 325 colleges and universities.

Your First College Year (YFCY)
Administrator: HERI and Policy Center on the First Year of College at Brevard College
Target institutions and samples: All types/Students near the end of the first year of college.
Participation: A total of 58 institutions and 19,000 first-year students will participate in the spring 2001 pilot. Participation expected to be open to all institutions in spring 2002.

Student Satisfaction Inventory (SSI)
Administrator: Noel-Levitz
Target institutions and samples: All types (four-year, two-year, and career school versions are available)/All students.
Participation: SSI is used by more than 1,200 colleges and universities. More than 800,000 student records are in the national database.

Adult Student Priorities Survey (ASPS)
Administrator: Noel-Levitz
Target institutions and samples: All types/All students 25 years and older.
Participation: ASPS was piloted by more than 30 institutions and more than 4,000 students in spring 2000.
ENROLLED UNDERGRADUATES

College Student Experiences Questionnaire (CSEQ)
Format: Eight-page paper survey; identical web survey also available. Takes 20 to 25 minutes to complete. Demo version at www.indiana.edu/~cseq.
Administration procedure: Institutions administer surveys and return completed instruments to CPRP for processing. Web version is administered via a server at Indiana University; institutions provide student contact information.
Timeline: Most institutions administer the survey at the mid-point or later in the spring term so that students have enough experience on campus to provide valid, reliable judgments. For research purposes, the CSEQ also can be administered at other times.

Community College Student Experiences Questionnaire (CCSEQ)
Format: Paper survey, self-report (Likert scale).
Administration procedure: Instruments can be mailed to students or distributed in classes, through student organizations, or at other student assemblies. Completion of the instrument takes 20 to 30 minutes.
Timeline: The Center provides surveys upon receipt of an order. Scoring is completed and results are mailed two to three weeks after colleges return instruments to the Center.

Faces of the Future
Format: Paper survey.
Administration procedure: Colleges order materials from ACT and administer surveys on campus. Completed surveys are returned to ACT for scoring and processing.
Timeline: Surveys can be administered during the AACC/ACT fall administration (October) for a reduced cost, or can be administered at other times at regular cost.

National Survey of Student Engagement (NSSE)
Format: Students can complete either a four-page paper survey or the identical online version. Students at one-fifth of participating schools complete the web survey. Demo version at www.indiana.edu/~nsse.
Administration procedure: Schools send student data files, letterhead, and invitation letters to CPRP, which handles data collection, including random sampling, sending surveys to students, and conducting follow-ups. Students return surveys to CPRP.
Timeline: Institutions send data files to CPRP in late fall. Surveys are mailed to students in late winter and early spring. Follow-ups continue through the spring. CPRP sends institutional reports and data to schools in late summer.

Your First College Year (YFCY)
Format: Four-page paper survey; web survey also available.
Administration procedure: HERI oversees administration of the paper or web-based survey instrument; students return completed survey forms to the data processing center.
Timeline: Institutions register for the survey in the fall and administer the survey in the spring. Reports are available in late summer.

Student Satisfaction Inventory (SSI)
Format: Paper survey. In spring 2001, the survey also will be available on the web. Web version takes 15 to 20 minutes.
Administration procedure: SSI is generally administered in a classroom setting and takes 25 to 30 minutes. The URL is e-mailed to students along with a specific student numeric password to enter the survey area.
Timeline: Students can complete the survey anytime during the academic year. Surveys generally arrive on campus within one week of ordering. Institutions send completed surveys to Noel-Levitz for processing. Reports are ready for shipment in 12 to 15 business days.

Adult Student Priorities Survey (ASPS)
Format: Paper survey. In spring 2001, the survey will be available on the web.
Administration procedure: ASPS is administered in a classroom setting and takes 25 to 30 minutes. Web completion takes students 15 to 20 minutes. For the web version, the URL and password are e-mailed to students.
Timeline: Students can complete the survey anytime during the academic year. Surveys generally arrive on campus within one week of ordering. Institutions send completed surveys to Noel-Levitz for processing. Reports are ready for shipment in 12 to 15 business days.
Collegiate Assessment of Academic Proficiency (CAAP)
Administrator: ACT
Target institutions and samples: All types/All students.
Participation: More than 600 institutions have used CAAP since 1988. More than 450,000 students have tested between 1998 and 2000.

Academic Profile
Administrator: Educational Testing Service (ETS) and The College Board
Target institutions and samples: All types/All students.
Participation: This survey has been used by 375 institutions and 1 million students.

Major Field Tests
Administrator: ETS
Target institutions and samples: Four-year colleges and universities/Senior students.
Participation: In the 1999–2000 academic year, more than 1,000 departments from 606 higher education institutions administered nearly 70,000 tests. Current national comparative data include accumulated scores from 96,802 seniors.

Area Concentration Achievement Tests (ACAT)
Administrator: Project for Area Concentration Achievement Testing (PACAT) at Austin Peay State University
Target institutions and samples: Two- and four-year public and private institutions/Generally seniors, although ACAT can serve as a pre-test.
Participation: Approximately 300 institutions and more than 50,000 students have participated.

ALUMNI

Comprehensive Alumni Assessment Survey (CAAS)
Administrator: NCHEMS
Target institutions and samples: All types (two-year and four-year versions available)/Alumni.
Participation: Information not available.

College Results Survey (CRS)
Administrator: Peterson's, a Thomson Learning Company
Target institutions and samples: Bachelor's degree-granting institutions/Alumni, preferably four to 10 years following degree attainment. Recommended sample size is 2,000.
Participation: The pilot study included 80 institutions and 40,000 instruments. The web-based survey is open to any graduate. There is no limit on the number of participants.
30 Measuring Quality: Choosing Among Surveys and Other Assessments of College Quality
Format: Demographic questions collected on paper with the assessment battery. Users may add up to nine additional items; they also may design their own assessment test battery by choosing from the six different skill modules.
Administration procedure: Colleges order the assessment battery from ACT, administer it during a locally determined two-week test period, and return it to ACT for processing.
Timeline: Flexible administration schedule. Each assessment module can be administered within a 50-minute class period. Institutions must order assessments at least two weeks prior to the administration period.

Format: Paper survey (long and short forms). Long form contains 108 multiple-choice questions and takes 100 minutes. Short form contains 36 questions. Optional essay is available.
Administration procedure: Colleges order materials from ETS and administer them to students. Colleges return tests to ETS for scoring.
Timeline: Institutions administer tests on their own timeline. Tests are scored weekly, and reports are issued approximately three weeks after ETS receives tests.

Format: Open-ended or performance-based 90-minute “tasks” in humanities, social sciences, or natural sciences. The score range for each skill is 1 to 6, with 4 as the core score.
Administration procedure: Colleges order materials from ETS and administer them to students. ETS trains faculty to score students’ responses, or ETS scores the tasks. There are nine separate tasks; three tasks can be used for assessing fewer than 100 students.
Timeline: Colleges decide whom and when to test. Faculty decide the scoring schedule, or ETS provides a three- to four-week turnaround for issuing a report.

Format: Paper-and-pencil test.
Administration procedure: Institutions order tests, administer them onsite to students, and return them to ETS for processing.
Timeline: Must order three to four weeks prior to administration for standard shipping. Answer sheets received by the beginning of each month are scored that month (no scoring in January or September). Reports are mailed three weeks after scoring.

Format: Paper survey. Multiple-choice test requiring 48 to 120 minutes, depending on content.
Administration procedure: Institutions order surveys, administer them to students, and return them to PACAT for scoring and analysis.
Timeline: Must order surveys at least 15 days prior to the administration date. PACAT scores surveys during the last full working week of the month and mails reports the first working week of the month.
ALUMNI

Format: Paper survey.
Administration procedure: Colleges order surveys from NCHEMS, administer surveys, and return them to NCHEMS for scoring.
Timeline: NCHEMS mails results three weeks from the date surveys are returned for scoring.

Format: Web-based survey comprising four sections. Takes 15 to 20 minutes to complete.
Administration procedure: Alumni visit the web site to complete the survey. Models for working with individual institutions are under development. Institutions identify alumni cohorts, whom Peterson's then contacts and directs to the online instrument.
Timeline: Unlimited online availability, or as arranged.
SERIES OF INSTRUMENTS

Student Outcomes Information Survey (SOIS)
Administrator: NCHEMS
Target institutions and samples: All types (two- and four-year versions available)/Questionnaires for entering students, continuing students, former students, graduating students, recent alumni, and long-term alumni.
Participation: Information not available.

Evaluation/Survey Services
Administrator: ACT
Target institutions and samples: All types/New students, enrolled students, non-returning students, and alumni.
Participation: Since 1979, 1,000 institutions have administered more than 6 million standardized surveys nationwide.

Faculty Survey
Administrator: HERI
Target institutions and samples: All types/Full-time undergraduate faculty and academic administrators.
Participation: In 1998–99, data were collected from more than 55,000 faculty at 429 colleges and universities.

Institutional Performance Survey (IPS)
Administrator: NCHEMS
Target institutions and samples: All types (two-year and four-year versions available)/Faculty, administrators, and board members.
Participation: Information not available.

Institutional Priorities Survey (IPS)
Administrator: Noel-Levitz
Target institutions and samples: All types (two-year and four-year versions available)/Faculty, administrators, and staff.
Participation: More than 400 institutions have used the IPS.

Program Self-Assessment Service (PSAS) and Graduate Program Self-Assessment Service (GPSAS)
Administrator: ETS
Target institutions and samples: College and university programs/Students, faculty, and alumni (separate questionnaires for each group). GPSAS has separate questionnaires for master's and Ph.D. programs.
Participation: In 1999–2000, 65 institutions and 12,000 students, faculty members, and alumni participated.
SERIES OF INSTRUMENTS

Format: Paper survey.
Administration procedure: Colleges order surveys from NCHEMS, administer surveys, and return them to NCHEMS for scoring.
Timeline: NCHEMS mails results two weeks from the date surveys are returned for scoring.

Format: Most surveys are four-page paper documents; one is two pages in length.
Administration procedure: Administration procedures are established at the discretion of the institution.
Timeline: Institutions mail completed surveys to ACT for processing. Scanning occurs every second and fourth Friday; ACT produces and mails reports three to four weeks after scanning.

Format: Four-page paper survey.
Administration procedure: Faculty surveys are sent to campuses in the fall. Campuses are responsible for survey distribution. HERI provides outgoing envelopes and pre-addressed, postage-paid return envelopes that respondents mail directly to HERI's survey processing center.
Timeline: Institutions register in the spring and summer. HERI administers surveys in the fall and winter. HERI issues campus profile reports the following spring and summer.

Format: Paper survey.
Administration procedure: Colleges order surveys and distribute them. Surveys include a postage-paid return envelope for respondents to return the survey directly to NCHEMS to maintain anonymity.
Timeline: NCHEMS returns results three weeks after the institutionally determined cut-off date.

Format: Paper survey. In spring 2001, the survey also will be available on the web.
Administration procedure: The paper survey takes about 30 minutes and can be distributed via various methods on campus, including campus mail, face-to-face distribution, and staff meetings. The web version takes about 20 minutes. The URL and password can be e-mailed to staff.
Timeline: Institutions can administer the IPS anytime during the academic year. Surveys generally arrive on campus within a week of ordering. Institutions return completed surveys to Noel-Levitz for processing. Reports are ready for shipment within 12 to 15 business days.

Format: Paper survey.
Administration procedure: Institutions purchase and administer the questionnaires and send completed questionnaires back to ETS for reporting.
Timeline: Processing begins the first working day of each month. ETS ships reports about three weeks after the start of processing.
ENTERING UNDERGRADUATES

Cooperative Institutional Research Program (CIRP) Freshman Survey/Entering Student Survey (ESS)
Administrator: Higher Education Research Institute (HERI) at UCLA and American Council on Education (ACE)
Reporting: Paper report with local results, and aggregate results for similar institutions derived from the national norms. Separate profiles for transfer and part-time students. Special reports and data file available for a fee.
Data availability: Yes; national results included in standard report.

Freshman Class Profile Service
Administrator: American College Testing (ACT)
Reporting: Paper report containing an executive summary; college attractions; academic achievement, goals and aspirations; plans and special needs; high school information; competing institutions; and year-to-year trends. A range of free and for-fee reports available.
Data availability: Yes; national user data and college student profiles available.

Student Descriptive Questionnaire (SDQ)
Administrator: The College Board
Reporting: Institutions receive SDQ responses for students who indicate “yes” to the Student Search Service on the registration form.
Data availability: Yes; national and state-level benchmark reports available on paper and on The College Board web site.

Admitted Student Questionnaire (ASQ) and Admitted Student Questionnaire Plus (ASQ Plus)
Administrator: The College Board
Reporting: Highlight report (executive summary), detailed report with all data, competitor report for ASQ Plus only, and norms report with national data. Data file also available.
Data availability: Yes; included in standard report.

College Student Expectations Questionnaire (CSXQ)
Administrator: Center for Postsecondary Research and Planning (CPRP) at Indiana University
Reporting: Computer diskette containing raw institutional data file and output file with descriptive statistics. Schools also receive a hard copy of the output file. Additional analyses available for a fee.
Data availability: No; tentative norms are under development and will be available summer 2001. Norms reports will include relevant comparison group data by Carnegie type.
ENTERING UNDERGRADUATES

Local items and consortia options: Contains up to 21 additional local questions. Consortia analyses available for a fee.
Cost: Participation fee of $400 plus $1 per returned survey for processing.
Contact: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall, Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: heri@ucla.edu. www.gseis.ucla.edu/heri/cirp.htm

Local items and consortia options: By providing additional data, campuses can use this service to summarize variables at all stages of the enrollment funnel: students who submitted their ACT scores, those who applied, those who were admitted, and those who enrolled.
Cost: There is no cost for the basic information.
Contact: Freshman Class Profile Service Coordinator. Phone: 319-337-1113. www.act.org/research/services/freshman/index.html

Local items and consortia options: Standard overlap with all common acceptances in both surveys; specific overlap analysis includes five competitor schools in ASQ Plus. Both surveys can be customized by specifying characteristics of interest to the school. Limited local questions are available.
Cost: ASQ $600; ASQ Plus $925. Questionnaire printing fee: ASQ $.55 per form; ASQ Plus $.60 per form. Processing fee: ASQ $2.00 per form returned; ASQ Plus $2.25 per form returned.
Contact: Phone: 800-927-4302. E-mail: info@aes.collegeboard.org. www.collegeboard.org/aes/asq/html/index000.htm

Local items and consortia options: Local additional questions and consortia analyses are available.
Cost: For the regular paper survey administered by the institution, the cost is $125 plus $.75 per survey and a $1.50 scoring fee per completed questionnaire.
Contact: College Student Expectations Questionnaire, Center for Postsecondary Research and Planning, Indiana University, Ashton Aley Hall Suite 102, 1913 East 7th St., Bloomington, IN 47405-7510. Phone: 812-856-5825. Fax: 812-856-5150. E-mail: cseq@indiana.edu. www.indiana.edu/~cseq
ENROLLED UNDERGRADUATES

College Student Survey (CSS)
Administrator: HERI
Reporting: The Campus Profile Report includes the results of all respondents. The Follow-up Report contains matched CIRP and CSS results for easy comparison. Special reports and data files available for a fee.
Data availability: Yes; national aggregates for similar institutions. Complete national aggregates available from HERI.

College Student Experiences Questionnaire (CSEQ)
Administrator: CPRP
Reporting: Computer diskette containing raw institutional data file and output file with descriptive statistics. Schools also receive a hard copy of the output file. Additional analyses available for a fee.
Data availability: No; an annual national report is not planned. However, norms reports are regularly updated, and institutional reports include relevant aggregated comparison group data by Carnegie type.

Community College Student Experiences Questionnaire (CCSEQ)
Administrator: University of Memphis, Center for the Study of Higher Education
Reporting: Diskette containing all responses and scores for students and a summary computer report are available for a fee of $75.
Data availability: Yes; national data can be found in the CCSEQ manual, which is available for $12.
ENROLLED UNDERGRADUATES

Local items and consortia options: Local questions are available. Consortia analyses are available for a fee.
Cost: $450 participation fee plus $1 for each survey returned for processing.
Contact: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall, Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: heri@ucla.edu. www.gseis.ucla.edu/heri/cirp.htm

Local items and consortia options: Local questions are available for a $250 charge. Consortia analyses are available.
Cost: For regular paper administration: $125 institutional registration fee plus $.75 per survey ordered and a $1.50 scoring fee per completed questionnaire. Web administration: $495 institutional registration fee plus $2.25 per completed survey.
Contact: College Student Experiences Questionnaire, Center for Postsecondary Research and Planning, Indiana University, Ashton Aley Hall Suite 102, 1913 East 7th St., Bloomington, IN 47405-7510. Phone: 812-856-5825. Fax: 812-856-5150. E-mail: cseq@indiana.edu. www.indiana.edu/~cseq
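Fee schedules like the one above mix a flat registration fee with per-unit charges, so paper and web administration break even at different volumes. A rough cost sketch (hypothetical Python; the sample sizes are invented, and the rates are those quoted for the CSEQ above):

```python
def cseq_paper_cost(surveys_ordered: int, completed: int) -> float:
    # $125 registration + $0.75 per survey ordered
    # + $1.50 scoring fee per completed questionnaire
    return 125 + 0.75 * surveys_ordered + 1.50 * completed

def cseq_web_cost(completed: int) -> float:
    # $495 registration + $2.25 per completed survey
    return 495 + 2.25 * completed

# Invented example: 1,000 paper surveys ordered, 400 completed
print(cseq_paper_cost(1000, 400))  # 1475.0
print(cseq_web_cost(400))          # 1395.0
```

At these invented volumes the web administration comes out slightly cheaper; the comparison shifts with the response rate and the number of surveys ordered.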
Local items and consortia options: Up to 20 local questions are available. CCSEQ can be used in statewide assessment efforts to provide data for strategic planning and staff development.
Cost: $.75 per survey purchased and $1.50 per survey for scoring; $75 for print report and data on diskette.
Contact: Center for the Study of Higher Education, 308 Browning Hall, The University of Memphis, Memphis, TN 38152. Phone: 901-678-2775. Fax: 901-678-4291. E-mail: ccseqlib@memphis.edu. www.people.memphis.edu/~coe_cshe/CCSEQ_main.htm

Local items and consortia options: Colleges may add up to 10 local items. Statewide administration is available.
Cost: AACC/ACT administration: $.75 per survey (includes scoring) plus a $50 processing and reporting fee. Standard administration: $13.65 per 25 surveys plus $.80 each for scanning, a $50 processing fee, and a $50 reporting fee.
Contact: Kent Phillippe, Senior Research Associate, AACC. Phone: 202-728-0200, ext. 222. E-mail: kphillippe@aacc.nche.edu. www.aacc.nche.edu/initiatives/faces/f_index.htm

Local items and consortia options: Schools or state systems (e.g., urban, research, selective privates) may form a consortium of at least eight institutions and can ask up to 20 additional consortium-specific questions.
Cost: $275 participation fee plus a per-student sampling fee based on undergraduate enrollment. Total cost ranges from approximately $2,500 to $5,500. Targeted over-sampling is available for an additional per-student fee.
Contact: National Survey of Student Engagement, Center for Postsecondary Research and Planning, Indiana University, Ashton Aley Hall Suite 102, 1913 East 7th St., Bloomington, IN 47405-7510. Phone: 812-856-5824. Fax: 812-856-5150. E-mail: nsse@indiana.edu. www.indiana.edu/~nsse
ENROLLED UNDERGRADUATES

Your First College Year (YFCY)
Administrator: HERI and Policy Center on the First Year of College at Brevard College
Reporting: Paper report provides an in-depth profile of first-year students by sex, and comparative data for similar institutions. Data file also available.
Data availability: Yes.

Student Satisfaction Inventory (SSI)
Administrator: Noel-Levitz
Reporting: The standard campus report includes the mean data for all students alongside national averages. Optional reports and raw data are available for an additional fee.
Data availability: Yes; four national comparison groups are standard, are available based on institution type, and are updated twice a year.

Adult Student Priorities Survey (ASPS)
Administrator: Noel-Levitz
Reporting: The standard campus report includes the mean data for all students alongside national averages. Optional reports and raw data are available for an additional fee.
Data availability: Yes; the national comparison group includes data from four-year and two-year institutions and is updated twice a year. As of May 2000, the national group included 4,063 students from 32 institutions.

Collegiate Assessment of Academic Proficiency (CAAP)
Administrator: ACT
Reporting: Institutional summary report and two copies of each student's score report. Certificate of achievement for students scoring at or above the national average on one or more modules. Supplemental reports and data file available for a fee.
Data availability: Yes; for freshmen or sophomores at two- or four-year, public or private institutions.

Academic Profile
Administrator: Educational Testing Service (ETS) and The College Board
Reporting: Summary score report contains both criterion-referenced proficiency levels and norm-referenced scores. Scores vary slightly from long form to short form. Data diskette included in fee.
Data availability: Yes; provided by class level and by Carnegie classification.
ENROLLED UNDERGRADUATES

Local items and consortia options: Not available during pilot stages. Local items and consortia options will be available with the full-scale administration beginning in 2002.
Cost: No fees during pilot stages.
Contact: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall, Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: yfcy@ucla.edu. www.gseis.ucla.edu/heri/yfcy; www.brevard.edu/fyc/Survey/YFCYsurvey.htm

Local items and consortia options: Special comparison group reports are available for a fee.
Cost: $50 processing and setup fee plus $1.50 to $1.95 per survey, depending on the quantity ordered.
Contact: Julie Bryant, Program Consultant: julie-bryant@noellevitz.com, or Lisa Logan, Program Consultant: lisa-logan@noellevitz.com. Phone: 800-876-1117. www.noellevitz.com

Local items and consortia options: Special comparison group reports are available for a fee.
Cost: $50 processing and setup fee plus $1.50 to $2.95 per survey, depending on the quantity ordered.
Contact: Julie Bryant, Program Consultant: julie-bryant@noellevitz.com, or Lisa Logan, Program Consultant: lisa-logan@noellevitz.com. Phone: 800-876-1117. www.noellevitz.com

Local items and consortia options: Nine optional local questions may be added at no additional charge.
Cost: $330 participation fee plus $8.95 to $16.55 per student, depending on the number of students and the number of modules purchased (includes instruments, scoring, and reporting).
Contact: ACT, Outcomes Assessment, P.O. Box 168, Iowa City, IA 52243-0168. Phone: 319-337-1053. Fax: 319-337-1790. E-mail: outcomes@act.org. www.act.org/caap/index.html
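Because the CAAP per-student rate varies with volume and modules, a total cost is best expressed as a range. A hypothetical sketch (the cohort size is invented; the flat fee and per-student rates are those quoted above):

```python
def caap_cost_range(students: int) -> tuple[float, float]:
    # $330 participation fee plus $8.95 to $16.55 per student,
    # depending on student count and modules purchased
    return (330 + 8.95 * students, 330 + 16.55 * students)

low, high = caap_cost_range(200)  # invented cohort of 200 students
print(f"{low:.2f} to {high:.2f}")  # 2120.00 to 3640.00
```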
Local items and consortia options: Up to 50 local questions are available. Institutions can customize comparison groups from a list of participating schools (minimum of eight per group).
Cost: $300 annual institutional fee. Price varies by form and number purchased ($9 to $11.25 for the short form and $14.50 to $16.75 for the long form). $2.25 each for the optional essay (includes scoring guide). Minimum order of 50 tests.
Contact: Jan Lewis at 609-683-2271. Fax: 609-683-2270. E-mail: jlewis@ets.org. www.ets.org/hea/heaweb.html

Local items and consortia options: None.
Cost: $16.50 each for the first 30 to 100.
Contact: Jan Lewis at 609-683-2271. Fax: 609-683-2270. E-mail: jlewis@ets.org. www.ets.org/hea/heaweb.html
Major Field Tests
Administrator: ETS
Reporting: Reports include individual scaled scores, departmental summary with department mean-scaled scores, and demographic information. Special score reports available for an additional fee.
Data availability: Yes; for each test. Percentile tables for all seniors taking the current form of each test are published each year. Departments may obtain custom comparative data for an additional fee.

Area Concentration Achievement Tests (ACAT)
Administrator: Project for Area Concentration Achievement Testing (PACAT) at Austin Peay State University
Reporting: Schools receive two copies of the score report for each student. Standard scores compare students to a five-year national sample. Raw percentage scores of items correct are also included. Additional analyses and data file available for a fee.
Data availability: Yes.

ALUMNI

College Results Survey (CRS)
Administrator: Peterson's, a Thomson Learning Company
Reporting: Institutions receive a data file of responses in spreadsheet format for analyses. Analytic tools for institution-based analyses and peer comparisons are being explored.
Data availability: No. Analytic tools for peer comparisons have been developed and are available to participating institutions at a secure web site.

SERIES OF INSTRUMENTS
Local items and consortia options: Group scores are reported for up to 50 locally written questions.
Cost: $23.50 per test ($23 for 100 or more), plus shipping. Includes Test Administration Manual, standard processing, and national comparative data.
Contact: Dina Langrana at 609-683-2272. E-mail: dlangrana@ets.org. www.ets.org/hea
Local items and consortia options: Schools can customize most tests to model the test after major requirements. Art and Literature in English cannot be customized. Social work customization will be available in June 2001.
Cost: Price ranges from $4 to $11 per student survey, depending on discipline, pre-test vs. senior test, and two-year vs. four-year school. Price includes use of materials, scoring, two copies of the score report, and long-term maintenance of score histories.
Contact: PACAT, Box 4568, Austin Peay State University, Clarksville, TN 37044. Phone: 931-221-7451. Fax: 931-221-6127. E-mail: pacat@pacat.apsu.edu. http://pacat.apsu.edu/pacat

ALUMNI

Local items and consortia options: Up to 20 local questions are available for a data entry fee of $1.25 per question.
Cost: $.85 per questionnaire plus shipping and handling. $200 for analysis (includes one analytical report).
Contact: NCHEMS, P.O. Box 9752, Boulder, CO 80301-9752. Clara Roberts at 303-497-0390. E-mail: clara@nchems.org. www.nchems.org/surveys/caas.htm

Local items and consortia options: Collaborative administration among institutions can be explored.
Cost: There is no respondent cost to complete the online CRS. Costs for institutional applications of the CRS are being explored as collaborative models are identified.
Contact: Rocco P. Russo, VP, Research, Peterson's, a Thomson Learning Company, Princeton Pike Corporate Center, 2000 Lenox Drive, P.O. Box 67005, Lawrenceville, NJ 08648. Phone: 609-896-1800 ext. 3250; toll-free: 800-338-3282 ext. 3250. Fax: 609-896-4535. E-mail: rocco.russo@petersons.com. www.petersons.com/collegeresults

SERIES OF INSTRUMENTS

Local items and consortia options: Up to 15 local questions are available for a data entry fee of $1.25 per question.
Cost: $.30 per questionnaire plus shipping and handling. $150 for analysis, which includes one analytical report.
Contact: NCHEMS, P.O. Box 9752, Boulder, CO 80301-9752. Clara Roberts at 303-497-0390. E-mail: clara@nchems.org. www.nchems.org/sois.htm

Local items and consortia options: Up to 30 local questions are available. Consortia reports are available for a fee.
Cost: $14.35 for 25 four-page surveys. $.84 per survey returned for processing. $168 for the basic reporting package (summary report, graphics report, and normative report).
Contact: ACT, Postsecondary Services, Outcomes Assessment, P.O. Box 168, Iowa City, IA 52243-0168. Phone: 319-337-1053. Fax: 319-337-1790. E-mail: outcomes@act.org. www.act.org/ess/index.html
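The Evaluation/Survey Services fees combine a per-pack survey price, a per-return processing charge, and a flat reporting package. A hypothetical budget sketch (quantities invented; the rates are those quoted above):

```python
import math

def ess_cost(surveys_needed: int, returned: int) -> float:
    # $14.35 per pack of 25 surveys, $0.84 per returned survey,
    # and a flat $168 basic reporting package
    packs = math.ceil(surveys_needed / 25)
    return 14.35 * packs + 0.84 * returned + 168

# Invented example: 500 surveys distributed, 430 returned
print(f"{ess_cost(500, 430):.2f}")  # 816.20
```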
Faculty Survey
Administrator: HERI
Reporting: Campus profile report includes faculty responses by gender. Separate profiles of teaching faculty and academic administrators are also provided. Normative profile includes national data by institutional type. Data file is also available.
Data availability: Yes; in the normative profile report.

Institutional Performance Survey (IPS)
Administrator: National Center for Higher Education Management Systems (NCHEMS)
Reporting: Report contains data for total campus, total faculty, and targeted populations.
Data availability: No.

Institutional Priorities Survey (IPS)
Administrator: Noel-Levitz
Reporting: The standard campus report includes the mean data for all respondents alongside national averages for like-type institutions. Optional reports (including IPS/SSI reports) and raw data are available for an additional fee.
Data availability: Yes; three national comparison groups are standard, are available based on institution type, and are updated twice a year.
Local items and consortia options: Local questions are available. Consortia analyses are available for a fee.
Cost: $325 plus $3.25 per returned survey.
Contact: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall, Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: heri@ucla.edu. www.gseis.ucla.edu/heri/cirp.htm

Local items and consortia options: Up to 20 local questions are available.
Cost: $1,600 for 100 questionnaires. Includes survey, pre-paid return postage, standard analyses, and report summary. After the first 100 questionnaires, $150 for each additional 50.
Contact: NCHEMS, P.O. Box 9752, Boulder, CO 80301-9752. Clara Roberts at 303-497-0390. E-mail: clara@nchems.org. www.nchems.org/surveys/ips.html

Local items and consortia options: Special comparison group reports are available for a fee.
Cost: $140 processing and setup fee plus $1.50 to $2.95 per survey, depending on the quantity ordered.
Contact: Julie Bryant, Program Consultant: julie-bryant@noellevitz.com, or Lisa Logan, Program Consultant: lisa-logan@noellevitz.com. Phone: 800-876-1117. www.noellevitz.com

Local items and consortia options: Local questions are available.
Cost: $37 for 25 questionnaires plus shipping and handling (minimum purchase of 75 questionnaires). $150 for summary data report plus $3.99 per booklet processed.
Contact: Karen Krueger at 609-683-2273. Fax: 609-683-2270. E-mail: kkrueger@ets.org. www.ets.org/hea/heaweb.html#psas