
Running head: HIGHER LEVEL QUESTIONING 1

Encouraging Higher Level Questioning: Exploring the Effects of Professional Development and

Co-planning on a Novice Teacher's Questioning Practices

Meredith Mitchell

George Mason University

EDRS 823 Fall 2016

Dr. Anya Evmenova



Abstract

A single subject study was used to explore the functional relation between a combined professional development and lesson co-planning intervention and the rate of higher level thinking questions posed by an elementary school teacher to a class of students. One participant was studied using an ABAB reversal design by recording the first 15 minutes of direct instruction during the mathematics block. The recordings allowed the questions posed by the teacher to later be coded using Bloom's Taxonomy and reviewed by a second observer. The study demonstrated moderate to strong evidence of a functional relation, suggesting that professional development and lesson co-planning may be an effective strategy for supporting novice teachers in their implementation of higher level questioning in the classroom.



Encouraging Higher Level Questioning: Exploring the Effects of Professional Development

and Co-planning on a Novice Teacher's Questioning Practices

Organizations such as the Partnership for 21st Century Learning, along with many other research-based organizations, have begun to highlight the necessity of refocusing our educational system on the significant issue of equipping students with the skillsets they need to be productive in today's global economy (Kay & Greenhill, 2011; Voogt & Roblin, 2012).

Their research suggests that it is not rote memorization of facts and content knowledge, which is what the majority of mandated national and state standardized assessments currently measure, that matters; it is the ability to create, collaborate, communicate, and think critically that will equip young scholars with what they need in today's job market (Kay & Greenhill, 2011; Supovitz, 2009; Voogt & Roblin, 2012). As the body of research on

21st century teaching and learning gains momentum, many school districts have adopted

measures and practices to commit to teaching these educational ideals, which is an important

first step in reforming education (Gunn & Hollingsworth, 2013). While the explicit adoption

of a shared vision of 21st century learning is important, merely stating our intention of better

equipping students for the future cannot and will not suffice.

Schools that will adequately refocus their intentions on developing 21st century skills

must intentionally plan for the strategic implementation of these skillsets in the classroom.

One of these skillsets, critical thinking, has a rich and robust literature base that demonstrates

how teachers might elicit higher level thinking from their students (Ennis, 1985; Kugelman,

n.d.; Seker & Komur, 2008). One such strategy for doing so is a framework of questioning

called Bloom's Taxonomy, which delineates six different levels of thinking that become

progressively more complex (Ennis, 1985; Kugelman, n.d.; Seker & Komur, 2008). The

extensive research conducted on Bloom's Taxonomy has yielded a variety of training and

reference guides that can be utilized in the classroom and throughout professional development

experiences as exemplars of questioning for varied levels of thinking.

While there is a wealth of information about critical thinking, the utility of Bloom's Taxonomy, and higher level thinking questioning, it will be important to identify which

training and teaching methods result in teachers actually implementing this research for the

benefit of students. Research related to teacher learning and professional development

suggests that leadership theory, including instructional leadership theory and distributed

leadership theory, holds utility in contributing to a culture of professional growth and sustained

instructional practices (Robinson & Timperley, 2007; Spillane, Halverson, & Diamond, 2001).

Additionally, professional learning communities (PLCs), as group learning organizations within school buildings, have been shown to engage educators in learning and implementation processes that sustain school improvement (Vescio, Ross, & Adams, 2008).

Currently, significant research would suggest creativity, communication, critical thinking, and

collaboration are skillsets that are important for all our students (Jacobsen-Lundeberg, 2013; Kay & Greenhill, 2011), but more research needs to be done to determine how to create the

learning environments where these skillsets are practiced. Practically speaking, it will be

important to explore how we change teachers' teaching behaviors in order for this vast body of

research to be realized.

The purpose of this research is to determine whether targeted professional

development and a co-planning intervention in critical thinking questioning techniques can

improve new teachers' ability to provide elementary students with greater opportunities to

answer critical thinking questions. While there is a gap in the literature regarding the utility of

these two interventions in higher level thinking questioning, the evidence the literature

provides on effectively implemented PLCs as collaborative structures in which teachers can

plan instruction together is promising for understanding the utility of co-planning as an

intervention (Vescio, Ross, & Adams, 2008). This research will demonstrate whether such

interventions are effective strategies for implementing a district-wide vision for students' 21st

century learning.

Research Question

This study will contribute to these understandings by addressing the following

question: Does providing a new elementary school teacher with professional development on higher level thinking questioning and a lesson co-planning experience focused specifically on critical thinking increase the frequency of higher level thinking questioning that occurs in their classroom?

Method

This question will be addressed using a single subject research study. Single subject

research allows a researcher to quantitatively evaluate how a participant responds to both

baseline and treatment conditions (Gast & Ledford, 2014). In this study, a novice elementary

school teacher will be recorded during her mathematics direct instruction in order to determine

the percentage of questions she asks the class that would be classified as higher level thinking

questions on Blooms Taxonomy. The percentage of higher level thinking questions recorded

during this time will represent the teacher's baseline data. For the treatment phase, the participant will be trained on Bloom's Taxonomy and the six levels of questioning and will

also participate in a lesson co-planning experience where the researcher and participant work

together to generate a set of higher level thinking questions that align to the content being

taught in subsequent math lessons. The study will take place within the novice teacher's

mathematics class, which is part of a small neighborhood school in a large Mid-Atlantic school

district. It is anticipated that the Bloom's Taxonomy professional development and lesson co-

planning experience will result in an increase in the number of higher level thinking questions

posed to the class during the teacher's direct instruction.

Participant

In this study, one novice elementary teacher was monitored during direct instruction

during the first 15 minutes of the mathematics block. This teacher was selected because she is

new to the district and, having taught in the county for less than three months, is largely

unfamiliar with the district level strategic plan and vision for critical thinking and 21st century

learning. The teacher is in her late 30s and is a career switcher who moved from California less

than a year ago. She has had some previous teaching experience in a neighboring district in the

prior school year. She was hired by the school in May 2016 and after being evaluated by an

administrator, was the recipient of some instructional support in an effort to improve the quality

of instruction in certain subject areas, namely mathematics and language arts. The teacher is in

charge of one homeroom class of 28 students and she teaches them all core curricular subject

areas. The students represent a diverse population of learners, including seven English Language Learners. She meets weekly with a collaborative team within her grade level whose members support each

other in planning, instruction and assessment. This collaborative team includes two other

general education teachers, one special education teacher, and one special education instructional

assistant.

The participant was selected from a pool of teachers at the study site who have fewer than three years of teaching experience and are unfamiliar with the district's strategic plan for 21st century

learning. This made it more likely that the participant was not currently utilizing critical thinking

questioning techniques, which would leave more room for improvement in this regard. There

were two other new teachers who fit these selection criteria but were excluded from the study.

These two teachers both had at least one additional year of teaching experience and have also

been evaluated by an administrator who determined they were currently meeting an acceptable

standard for instructional quality, and therefore would not have as much to gain from such an

intervention. For this reason, the researcher focused intervention efforts on the single selected

participant. Because the researcher did not intend to conduct a formal study with IRB approval, in lieu of an informed consent document, the researcher had a face-to-face conversation with the participant, explaining that the researcher would serve as a support during the mathematics block, and asked permission to utilize this time in her classroom to conduct some research on questioning practices. While the researcher was concerned that the participant might not accept the conditions of the study, the participant was in fact extremely gracious: she was eager to accept both support within the classroom and the chance to learn about higher level thinking questioning techniques, and was excited to reciprocate some help and support back to the researcher.

Setting

The study was conducted within a school district that will be referred to as Large Mid-

Atlantic County. Large Mid-Atlantic County is named for its actual geographic location and

demographics, and was selected not only for convenience purposes, but also because it has

recently adopted a vision and mission at the district level to provide students with more

frequent and authentic 21st century learning experiences. The school selected within this

district is a smaller neighborhood school serving nearly 500 students from PreK to 6th grade.

This school has recently become a Local Level IV services provider, meaning it has a

specialized Advanced Academics curriculum it can offer to qualifying students within the 3rd

through 6th grade. The school is predominantly comprised of students who identify as White

(56% of students), with the remaining students identifying as Asian (25%), Hispanic (8%),

Black (4%) or Other (7%). Approximately 85% of students are proficient in English, meaning

about 15% of the students are classified as English Learners and receive special services to

support their language acquisition. About 88% of students are not recipients of free or reduced-price meals, which can be used as a proxy for determining that about 12% of students live in homes with a socioeconomic disadvantage.

The classroom within the school is a general education classroom and is not one of the

classrooms receiving the specialized advanced curriculum. The grade level is considered an

upper elementary classroom and participates in the mandated state testing requirements in both

mathematics and reading. There are 28 students in the classroom and all participate in a

mathematics instructional block from 9:00–10:00 am each morning throughout the school

week. The mathematics block follows a math workshop model, also known as guided math,

which begins with a brief opportunity for students to check their previous night's homework,

then a 15 minute direct instruction lesson about the topic, followed by two to four independent

or small group activities for students to complete. The 15 minute direct instruction lesson

served as the source of the data the researcher collected because it was during this time when

the teacher was addressing the group of students in order to build upon their understandings,

teach new content, and check for understanding. This teacher directed time is when the

teacher herself was posing the most questions for student learning. During this instructional

time, students gather at the front of the room on a large carpet and sit arranged in rows facing

the board and SMARTboard. The teacher then directs a lesson about the content for the day

using either the white board to show examples or the SMARTboard to show examples and

sometimes instructional videos used to teach the content to the students. This time also

incorporates frequent questioning of students; when posed a question, students are directed

to turn and talk to a student sitting near them or to raise their hand to respond to the teacher

directly. This direct instruction time typically begins when students leave their desk area,

where they were checking homework, and concludes when students are dismissed back to their

seats for independent or small group working time.

Independent Variables

This study explores the effect of a combination of a professional development session

and lesson co-planning experience on the rate of higher level thinking questions posed during

instruction. For each treatment phase, the participant received both the professional

development training and the planning experience in the school day prior to the beginning of

the treatment phase. These experiences were provided by the researcher during either the

participant's planning block from 10:50–11:55 am, for the first treatment phase, or at the end of

the school day at 1:30pm, for the second treatment phase. The second treatment phase training

occurred at the end of an early closing day, where students were dismissed from school two

hours early before a holiday weekend.

The professional development and co-planning session included 15 minutes of

training on the six levels of questioning in Bloom's Taxonomy, using the reference provided in

Appendix A. After describing the meaning of each of the six levels of questioning and going over each level's keywords and question starters, the participant was asked to generate their

own question that would be classified within each level of the taxonomy, using the reference

as a guide. These exemplars were confirmed by the researcher and assessed for

appropriateness. Had it been necessary, the researcher would have provided an immediate correction if any of the participant's responses did not fit the six levels of questioning; however, this was not an issue during the training.

The next part of the intervention consisted of the researcher and participant planning

the math content, resources, and learning activities, as well as generating higher level thinking

questions to pose to the class for each of the subsequent five math lessons that the participant

was going to teach. The planning template in Appendix B was used to structure and guide

these planning sessions. Additionally, the Bloom's Taxonomy questioning guide was readily

available for reference throughout these co-planning sessions. It is notable that the template

prompted the researcher and participant to generate up to six higher level thinking questions,

classified as a Level IV, V, or VI on Bloom's Taxonomy.

Dependent Variable

The study aimed to see how this intervention, a professional development and lesson

co-planning session, affected the percentage of higher level thinking questions asked during

mathematics direct instruction. To calculate this percentage, the researcher tallied the

frequency of lower level questions during the 15 minute observation block and also tallied the

frequency of higher level questions during the same block of time. The percentage of higher level questions posed was calculated by dividing the number of higher level questions asked by the total number of questions asked and then converting that decimal to a percentage.
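The arithmetic above can be sketched as follows; the session tallies shown are hypothetical stand-ins for the counts recorded on the tool in Appendix C and are included only to illustrate the calculation.

```python
# Sketch of the dependent variable calculation described above.
# The tallies are hypothetical; actual counts came from the recording
# tool in Appendix C.

def percent_higher_level(higher_count, lower_count):
    """Percentage of questions in a session coded as higher level (Bloom's IV-VI)."""
    total = higher_count + lower_count
    if total == 0:
        return 0.0
    return round(100 * higher_count / total, 1)

# Example: 4 higher level and 16 lower level questions in one session.
print(percent_higher_level(4, 16))  # → 20.0
```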

Questions that are considered Level I (Knowledge), Level II (Comprehension), or Level III (Application) on Bloom's Taxonomy were categorized as lower level thinking questions. Knowledge questions are those that ask students to recall basic facts or vocabulary and include keywords such as: what is, who, why, when, where, which, choose, find, name, recall, select, and describe. Comprehension questions are characterized as questions that require students to interpret or relate concepts, and include key terms such as: compare, contrast, interpret, relate, summarize, classify, show, explain, and infer. Application questions require students to apply knowledge or skills in a new way and include question frames such as: build, apply, plan, model, organize, and solve.

Questions that are classified as Level IV (Analysis), Level V (Synthesis), or Level VI (Evaluation) on Bloom's Taxonomy were categorized as higher level thinking questions. Analysis questions require students to find evidence within given information in order to support their ideas or inferences. These questions use key terms including: analyze, categorize, classify, compare, contrast, examine, infer, conclusion, and relationship. Synthesis questions require students to combine given information to propose a new solution and include key words such as: build, choose, combine, construct, create, predict, plan, improve, formulate, theorize, and develop. Evaluation questions require students to make judgments about quality or validity based on a set of criteria. These questions may include keywords such as determine, justify, measure, compare, prioritize, prove, disprove, assess, and interpret.
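As a rough illustration of this coding scheme, a first-pass classifier might match a transcribed question against abridged keyword lists. The actual coding relied on the researcher's judgment of each full question, so this sketch is only an approximation; keywords that appear at both lower and higher levels in the lists above (e.g., compare, infer) are deliberately omitted, and any unmatched question is flagged for hand coding.

```python
# Rough illustration of the lower/higher coding scheme described above.
# Keyword lists are abridged from the text; ambiguous terms that appear
# at both lower and higher levels (compare, contrast, infer) are omitted.

LOWER_KEYWORDS = {  # Levels I-III: Knowledge, Comprehension, Application
    "who", "when", "where", "name", "recall", "describe",
    "summarize", "explain", "show", "solve", "apply", "model",
}
HIGHER_KEYWORDS = {  # Levels IV-VI: Analysis, Synthesis, Evaluation
    "analyze", "examine", "predict", "construct", "formulate",
    "justify", "prove", "prioritize", "assess", "theorize",
}

def code_question(question):
    """Return 'higher', 'lower', or 'unclear' for a transcribed question."""
    words = set(question.lower().replace("?", "").split())
    if words & HIGHER_KEYWORDS:
        return "higher"
    if words & LOWER_KEYWORDS:
        return "lower"
    return "unclear"  # flagged for the coder to classify by hand

print(code_question("Can you justify your estimate?"))    # → higher
print(code_question("Who can recall yesterday's rule?"))  # → lower
```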

Materials

Baseline materials. The researcher provided the participant with no instructional

materials or support during the two baseline phases of the study. While the participant

continued to plan mathematics content with her collaborative learning team, which consisted

of her grade level teammates and a special education teacher, this support represents the status

quo of what was received by the participant in terms of instructional support. The team's planning sessions were not generally focused on questioning techniques to use during

instruction.

The researcher used an iPhone to voice record the 15 minute mathematics direct

instruction lesson during the baseline phases. The researcher also used the recording tool in

Appendix C to record the level of lower and higher level questions posed during each recorded

session of the mathematics direct instruction shortly after the session was recorded. A

sample record from the baseline phase is attached in Appendix D.

Treatment materials. The researcher provided the participant with a Bloom's Taxonomy guide and five lesson planning templates during the two treatment phases of the study. The participant was allowed to keep the Bloom's Taxonomy guide after the initial training and was also allowed to use it throughout the lesson planning session, which immediately followed the training. The Bloom's Taxonomy guide is attached in Appendix A and

the co-planning lesson template is provided in Appendix B. A sample record of the co-

planning template in the treatment phase is attached in Appendix E.

The researcher used an iPhone to voice record the 15 minute mathematics direct

instruction lesson during the treatment phases, just as in the baseline phases. The researcher

also used the same recording tool from the baseline phase, in Appendix C, to record the level

of lower and higher level questions posed during each recorded session of the mathematics

direct instruction shortly after the session was recorded. A sample record from the

treatment phase is attached in Appendix F.

Research Design

The design of this study is characterized as an ABAB design, or reversal design (Gast & Ledford, 2014). This design allows the researcher to use the same participant in order to

replicate the first basic effect demonstrated within the first two conditions. This allows for

some experimental control and also holds an advantage over ABA designs, which do not leave the participant in a treatment phase and do not allow for another demonstration of

effect from baseline into treatment. This design is particularly appropriate for the current study

because a return to baseline causes no immediate harm or ethical dilemma, as questioning

behavior is a relatively low stakes behavior. It is largely indeterminable whether this questioning behavior is likely to be reversed without the co-planning intervention, and while ABAB designs are best suited for behaviors that are likely to be reversed at baseline (i.e., not learned behaviors), it is reasonable to assume that a novice teacher likely won't plan for higher level thinking questioning on their own, unless provided the structure and support to do so.

This study was designed to meet all four design standards, including: a systematic

manipulation of the independent variable, an adequate standard of inter-assessor agreement,

three opportunities for a basic effect to be demonstrated, and a total of five data points in each of

the four phases (Kratochwill, Hitchcock, Horner, Levin, Odom, Rindskopf, & Shadish, 2010). In

this study, a professional development and co-planning session were employed only for the five

sessions in each treatment phase; no other supports were given to the participant during this time

or during baseline. This allowed the researcher to determine precisely how conditions differed

from baseline to treatment. At least one session in each of the four phases was planned to be

assessed by an additional observer using the permanent product of an audio recording. This

allows for 20% of all data to be assessed in each condition. Adequate training and the ability to review the permanent product helped ensure that at least 80% agreement was met in each

phase. The reversal design allows for three basic effects to be demonstrated: between A1 and B1,

between B1 and A2, and between A2 and B2. This design also requires at least five data points within each phase, which is reflected in this study's procedures, which include precisely

five sessions of data collection in each phase.

Data Collection Procedures

Baseline procedures. The researcher entered the classroom at 9:00am each morning and

set the recording device, an iPhone, to voice record the events taking place in the classroom.

Shortly after 9:00am, the participant would instruct students to get out their homework, check

their answers with a partner, and upon finishing, make their way to the carpet located in the front of

the classroom. Students were in the routine of transitioning to the carpet and arranging

themselves in neat rows facing the white board and SMARTboard mounted on the front wall.

When students were settled, the researcher listened for the precise moment when the participant

began direct instruction related to the content of the day's math block. For example, if the participant first asked students how they were doing or whether they had completed their previous night's homework, these questions were ignored by the researcher. When direct

instruction began, the researcher would look at the timer on the voice recording and record the

minutes and seconds, for use as a reference for when the permanent product was later reviewed.

The participant engaged in a 15 to 20 minute lesson that was developed entirely on her own or

with the help of her grade level teammates in a previous planning session. After students were

dismissed to their seats to work on independent or group activities, the researcher stopped the

audio recording. Within seven days of each recording, the researcher reviewed the permanent

product, using the recording tool in Appendix C to record the frequency of lower and higher

level questions posed during the direct instruction time. These data were then used to calculate the

percentage of higher level questions asked during the session. The researcher also logged the

experience, noting any special circumstances, on a data collection spreadsheet included in

Appendix G.

Treatment procedures. During the school day prior to each treatment phase, the

researcher scheduled a time to meet with the participant and at least one other member of the

participant's grade level team to conduct the Bloom's Taxonomy-based professional

development and engage in a lesson co-planning session for the lessons that were to be taught in

the subsequent five days of mathematics instruction. This single training session served to

prepare the participant for all lessons that would be taking place during the treatment phase.

The grade level teammates were provided the fidelity of treatment checklist in Appendix H, and

were asked to complete the checklist throughout the professional development portion of the

meeting.

The researcher presented the Bloom's Taxonomy reference guide, included in Appendix A, to the participant and focused on each of the six levels of questioning. The training included going over the keywords and question starters used at each level and concluded with the researcher checking for the participant's understanding by asking them to generate an exemplar question at each level. Each training lasted about 15 minutes and, upon completion, the researcher and participant immediately engaged in planning for the subsequent math lessons.

The researcher and participant used the grade level's mathematics pacing guide, provided

by the county, to identify the lessons and standards that were to be taught over the next five days.

After the standards were identified, the participant and researcher planned activities and

resources to be used during each day's lesson. This planning process was recorded on a lesson

planning template, included in Appendix B. The template also required the researcher and

participant to develop up to six possible higher level thinking questions that would be suited for

that individual lesson, with a possibility of two questions at levels IV, V, and VI of Bloom's

Taxonomy. A sample lesson plan is included in Appendix E. After this process was completed

for all five lessons of the treatment phase, the training session concluded.

After this professional development and planning session, the data collection sessions for

the treatment phase utilized the same procedures as the baseline phase. There were instances

when disruptions during planned data collection sessions of the treatment phase resulted in the

researcher meeting an additional time to co-plan additional lessons for the subsequent teaching

days. The additional lesson co-planning session did not include the Bloom's Taxonomy training; the researcher simply worked with the participant on two additional lesson plan templates so that the treatment phase could still include five data points and continue to meet design standards without reservations (Kratochwill et al., 2010).

Interobserver Agreement

The recording of one session of each baseline and each treatment phase was also shared

with an additional observer who, as a graduate student, has experience with data coding and, as a teacher, has experience observing instruction. The additional observer was trained using

the same procedures as the participant, and was provided the same Bloom's Taxonomy guide,

attached in Appendix A. After the researcher reviewed the six levels of questioning, described

their key characteristics, and listed the common keywords and sentence starters for each level,

the interobserver was then asked to generate an exemplar question for each level of questioning.

The interobserver responded with appropriate exemplars for each level of questioning, indicating

that they had the knowledge and understanding necessary to discern whether the participant's questions were to be classified as lower level or higher level questions. The observer's responses represent training to 100% agreement with the researcher, which exceeds the minimum of 85% agreement required during observer training.

Because each session was voice recorded on an iPhone, the observer randomly selected a

session from each of the four phases to listen to and record data from. The selection of one

session from each phase represents 20% of the total collected data, and also represents 20% of

the data within each baseline and each treatment phase (Kratochwill et al., 2010). Due to scheduling conflicts and time constraints, all four of the interobserver's data coding sessions were conducted at the same time at the end of the study using the permanent products. While this scenario is not ideal, using the total agreement formula to calculate IOA, (smaller count / larger count) × 100, the interobserver agreement coefficient was calculated for each of the four sessions. In the first baseline, second baseline, and second treatment phases, there was an interobserver agreement of 100%, whereby both the observer and the researcher coded the same percentage of higher level thinking questions posed by the participant. However, in the first treatment phase, the interobserver agreement was calculated to be just 91%, as there was a discrepancy between the researcher and observer as to the total number of questions posed to the group. The average of these four sessions resulted in an overall interobserver agreement of 98%.
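The total agreement computation can be sketched as follows. The per-session question counts are hypothetical and chosen only so that the resulting coefficients (100%, 91%, 100%, 100%, averaging 98%) mirror those reported above.

```python
# Sketch of the total agreement IOA computation described above.
# Session counts are hypothetical; only the resulting percentages
# mirror those reported in the text.

def total_agreement(count_a, count_b):
    """Total agreement IOA: (smaller count / larger count) x 100."""
    smaller, larger = sorted((count_a, count_b))
    return 100 * smaller / larger

# (researcher count, observer count) for one session per phase -- hypothetical
sessions = [(18, 18), (22, 20), (15, 15), (19, 19)]
ioas = [total_agreement(r, o) for r, o in sessions]
print([round(x) for x in ioas])      # → [100, 91, 100, 100]
print(round(sum(ioas) / len(ioas)))  # → 98
```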

Fidelity of Treatment

During the professional development and co-planning intervention, either one or two observers were present to verify that the researcher adhered to the intervention procedures. For

the first treatment phase, which was conducted during the participant's planning time, two of the participant's grade level team members were available to observe the session and utilize the checklist in Appendix H, which denoted the steps to be taken by the researcher throughout the

training. These included introducing each level of Bloom's Taxonomy, sharing the key

characteristics, keywords, and question frames used within each of the six levels, and prompting

the participant to generate their own examples of questions that could be classified within each

level of the taxonomy.



Each of the fidelity of treatment observers was familiar with Bloom's Taxonomy and

agreed to check off each step as they saw it implemented by the researcher. While both

observers participated prior to the first treatment phase, only one of the observers was available

to participate before the second treatment phase, due to the early school closing prior to a long

holiday weekend. These observers also checked over the completed co-planning templates to

ensure completion. While there were no discrepancies concerning the adequate completion of each

planning template, if there had been, the researcher and participant would have gone back to the

templates in question to fill them out more completely.

The fidelity of treatment was calculated as the number of checklist steps each observer

marked as implemented, divided by the total number of steps on the checklist, expressed as a

percentage. In the case of the first treatment, the average of these two

percentages was calculated. The fidelity of treatment for the first treatment phase was 100%, as

both observers indicated that all steps on the checklist occurred during the intervention. The

fidelity of treatment of the second treatment phase was also 100%, as the remaining observer

indicated that all steps in the intervention were carried out a second time.
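The checklist arithmetic described above amounts to a simple percentage; a minimal sketch, assuming a hypothetical ten-step checklist for illustration:

```python
def fidelity_percentage(steps_checked, steps_total):
    """Percentage of checklist steps an observer marked as implemented."""
    return steps_checked / steps_total * 100

# Hypothetical ten-step checklist; both observers checked every step,
# so the phase average is 100%.
observer_scores = [fidelity_percentage(10, 10), fidelity_percentage(10, 10)]
phase_fidelity = sum(observer_scores) / len(observer_scores)  # 100.0
```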

Social Validity

A subjective evaluation of the social validity was collected throughout the course of the

research study using interviews with the participant and observers. Prior to the study's first

intervention phase, the researcher asked the participant and grade-level team about their

perceptions of the importance of questioning in the classroom, to which the team unanimously

responded that they felt questioning was an important teacher practice, but also explained that it was

difficult to master on the spot. The teachers also lamented how it can be difficult to ask the kinds

of questions that require students to make broader connections across the curriculum. This

feedback validates the concept of taking the time to explicitly plan out questions for lessons, to

ensure students are exposed to rich, thought-provoking questions that are often difficult to

develop on the spot, especially for teachers who are new to the field.

After the second intervention, the fidelity of treatment observer was asked about the

utility of the training tools and lesson planning template utilized throughout the intervention and

whether or not these procedures were reasonable and adequate to support the study's goals. The

observer felt that the materials effectively prompted teachers to engage with different levels of

questioning and provided a "no excuses" bank of questions from which the teacher could draw

throughout their lessons, but also noted that it would be difficult to maintain this level of

training and this level of detail throughout all subject areas for every lesson taught during the

school day. This concern about feasibility and practicality will certainly be noted as a

limitation of the study, even though the observer commented on its potential efficacy.

Data Analysis

Visual analysis. The researcher utilized the evidence standards to conduct a visual

analysis of the collected data (Kratochwill et al., 2010). Visual analysis requires, first, that

baseline data be assessed to determine whether or not they are indicative of a relevant problem.

Second, each phase must be assessed to

determine whether the data points demonstrate a predictable pattern within the phase. Third, a

visual analysis must include an assessment of the basic effect between phases. This is

accomplished by considering the six components of basic effect: level, trend, variability,

immediacy, consistency, and overlap of data. Three of these components are analyzed within the

phase: level, which represents the mean of the data within a phase; trend, which represents the

slope of the best-fit line within a phase; and variability, which is the deviation of the data around

the best-fit line. The remaining three components are analyzed between phases: immediacy

demonstrates the magnitude of change between one phase and the next; consistency is a measure

that describes the extent to which data patterns are similar across like phases; and overlap

indicates the percentage of data in a treatment phase that is in the same data range as the phase

that comes before it.
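The three within-phase components can be sketched computationally as follows (the function name and the sample values are hypothetical, for illustration only):

```python
from statistics import mean

def phase_summary(values):
    """Within-phase components: level (the mean), trend (least-squares slope of
    the best-fit line), and variability (root-mean-square deviation of the data
    around that line)."""
    n = len(values)
    xs = list(range(n))
    x_bar, y_bar = mean(xs), mean(values)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values)) / \
            sum((x - x_bar) ** 2 for x in xs)
    residuals = [y - (y_bar + slope * (x - x_bar)) for x, y in zip(xs, values)]
    variability = (sum(r * r for r in residuals) / n) ** 0.5
    return {"level": y_bar, "trend": slope, "variability": variability}

# Hypothetical treatment-phase percentages, for illustration only.
summary = phase_summary([5, 10, 15, 20, 15])
```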

For the overlap measure, the most commonly used effect size, the Percentage of Non-Overlapping

Data (PND), will be calculated within this study. This method identifies the highest

point in baseline, and then assesses the percent of points in treatment that exceed that level. This

is an appropriate measure for this study because it was not anticipated that there would be any

extreme outlier data, which is a limitation of PND. The researcher will use Scruggs, Mastropieri,

and Casto's (1987) guidelines, which indicate that a PND greater than 70% means the treatment

was effective, 50-70% means the intervention has questionable effectiveness, and less than 50%

means there was no observed effect. Lastly, visual analysis includes an assessment of

experimental control, wherein studies with strong effects show three different demonstrations of

effect at different times and no intervention phase failed to demonstrate an effect.
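The PND procedure and the Scruggs, Mastropieri, and Casto (1987) interpretation guidelines described above can be sketched as follows (the function names and data are illustrative, not from the study):

```python
def pnd(baseline, treatment):
    """Percentage of treatment points exceeding the highest baseline point."""
    ceiling = max(baseline)
    exceeding = sum(1 for point in treatment if point > ceiling)
    return exceeding / len(treatment) * 100

def interpret(value):
    """Thresholds from Scruggs, Mastropieri, and Casto (1987)."""
    if value > 70:
        return "effective"
    if value >= 50:
        return "questionable"
    return "no observed effect"

# Hypothetical data: an all-zero baseline and a mixed treatment phase.
score = pnd([0, 0, 0, 0, 0], [10, 15, 0, 20, 5])  # 80.0
```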

Results

Visual Analysis

During this study, the participant alternated between a baseline phase where the percent

of higher level thinking questions during the teacher's mathematics direct instruction was

calculated, and a treatment phase, where the percent of higher level thinking questions was

calculated after having participated in a professional development and co-planning session. In

each session, every question posed during the mathematics direct instruction was coded as either

higher level or lower level using Bloom's Taxonomy, and the percent of total questions that were

classified as higher level thinking questions was calculated. The data collected from the study

indicate that the participant demonstrated a mean increase between baseline and treatment (See

Figure 1), which was the anticipated direction of change. In the baseline phases, the participant

demonstrated a mean of 0 (SD = 0) for percentage of higher level questions posed to the class

during direct instruction. In treatment phases, the participant demonstrated a mean of 10.10 (SD

= 6.74) for percentage of higher level thinking questions asked, demonstrating a mean increase

of 10.10% from baseline to treatment. The overall trend across the treatment phases was slightly

negative, as the level of responding declined from the first treatment phase to the second. Within each phase, variability

was nonexistent in baseline and was much higher in treatment phases, with a range of 20

percentage points between the highest and lowest data points in the treatment phases. The

participant demonstrated an immediacy of change upon the introduction of the intervention. The

overlap between phases was measured using Percentage of Non-Overlapping Data (PND;

Scruggs, Mastropieri, & Casto, 1987). The PND across all phases was 80%, which indicated that

the treatment was effective given the standard that a PND greater than 70% is considered

effective (Scruggs & Mastropieri, 1998). The data from the participant also demonstrated

consistency, as an increase in higher level thinking questioning occurred during each intervention

phase. Based on the visual inspection of data presented in Figure 1, there is evidence of a

moderate to strong effect of professional development and co-planning on teachers' usage of

higher level thinking questions.
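The per-session computation described above (coding each question, then taking the proportion classified as higher level) reduces to a simple percentage; a minimal sketch with hypothetical labels:

```python
# Hypothetical coded labels for the questions in one recorded session.
questions = ["lower", "lower", "higher", "lower", "higher"]
higher = sum(1 for q in questions if q == "higher")
percent_higher = higher / len(questions) * 100  # 40.0
```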



[Figure 1 here. Line graph: y-axis, rate of higher level questions (0%-25%); x-axis, sessions 1-20; phases labeled Baseline 1, Treatment 1, Baseline 2, Treatment 2.]

Figure 1. Higher level thinking questioning. The percent of higher level thinking questions

asked during baseline and intervention phases for one participant during direct

instruction in a mathematics block.

In the first baseline phase, it is clear that there is a predictable pattern of no higher level

thinking questions being posed to the students in the classroom. This consistent lack of higher

level thinking questions being asked yields a mean of 0 (SD = 0) within the phase. The stability

of the data after the collection of five data points indicated to the researcher that it was

acceptable to initiate the first treatment phase.

In the first treatment phase, there was an immediate effect compared to the percentage of

higher level thinking questions asked in baseline, and an apparent increase in the level of the

treatment phase data. The mean of the data in the first treatment phase was 13.2 (SD = 5.89). The

trend of this data is increasing throughout the duration of the phase and a PND of 100% was

calculated, indicating that all treatment data in this phase exceeded that of the baseline data.

While the data pattern in the first treatment phase lacks a strong pattern of consistency, the other

measures of basic effect demonstrate strong evidence for the potential efficacy of the

intervention at this point in time.

In the second baseline phase, there is another predictable pattern of no higher level thinking

questions being posed to the students in the classroom. The data continue to be consistent

throughout the phase and result in a mean of 0 (SD = 0) within the phase. The stability of the

data after the collection of five data points again justifies the initiation of the second treatment

phase.

In the second treatment phase, there was only a slight immediacy of effect compared to the

percentage of higher level thinking questions asked in the previous baseline phase, but a clearer

increase in the level of the treatment phase data. The mean of the data in the second treatment

phase was 7.0 (SD = 6.59). There is no clear trend in these data throughout the duration of the

phase and a PND of 60% was calculated, indicating that there is only a moderate demonstration

of effect compared to the previous phase. While the data pattern in the second treatment phase

also lacks a strong pattern of consistency, the other measures of basic effect demonstrate at least

moderate evidence for the potential efficacy of the intervention compared to the previous phase.

Discussion

The overall pattern of baseline and treatment data can be interpreted as providing

moderate to strong evidence of a functional relation given the evidence standards (Kratochwill

et al., 2010). The baseline data demonstrated a problem pertaining to the research question, in

that there is a predictable pattern of no higher level thinking questions being posed to the class

throughout phases where no intervention was in place. In each phase, there was a predictably

stable pattern of responses and a basic effect is evident between phases: the percentage of higher

level thinking questions posed to the class consistently increased when the participant was

trained and planned with the researcher, and the percentage of higher level thinking questions

dropped when no intervention was in place. Experimental control was established through the

three demonstrations of effect at different points in time, between each of the four phases, and at

no time did an intervention phase fail to produce a mean increase compared to baseline.

Practical Implications

Students will need 21st century learning skills to be successful in their futures, and this

includes opportunities for students to think critically at higher levels. It will be important for

teachers in today's 21st century classrooms to understand how to pose higher level thinking

questions to their students and to regularly implement this type of questioning in their

classrooms. As the body of research related to 21st century skills grows, and the importance of

preparing today's students for new, dynamic challenges in a globalized economy becomes

more evident, it will be crucial for educational leaders and administrators to be able to identify

which teachers are successfully providing their pupils with higher level thinking opportunities

and how best to support teachers who can improve in this regard. While it may prove

impractical for teachers to co-plan every lesson with an educator skilled at posing higher level

thinking questions, it may be possible to develop these co-planning capacities within schools'

professional learning communities (PLCs).

Limitations and Future Research

While this study yielded some noteworthy findings, it is important to address the major

limitations of this study. One of the largest considerations is the fact that there was a preexisting

professional relationship between the researcher and participant prior to the beginning of the

study, which may have influenced the willingness of the participant to put forth the effort

necessary to make a change. The establishment of a relationship prior to professional

development or training may have yielded behavioral changes that may not have emerged

otherwise. It is also possible that these changes were intentionally made in the presence of the

researcher, with little intention to continue the behavior beyond the scope of the study.

Another limitation of the study pertains to the timing of the second intervention training

and co-planning session. Due to the tight timeline of the study, this session necessarily fell after

the fifth session in the second baseline phase, which also happened to be the Wednesday before

students and teachers left on their Thanksgiving break. This timing was not ideal in terms of the

participant's level of professional focus or engagement. Additionally, at this session the

participant also divulged to the researcher that she would be resigning from her current teaching

position and would only be carrying out her professional responsibilities for another two and a half

weeks. A combination of these factors could understandably result in a decline in responsiveness

to the treatment procedures.

Another limitation to the study concerns the time that is necessarily expended in order to

plan each individual lesson with the participant. This level of support would not be practical for

the long term, so without future research to determine whether or not the procedures resulted in

the improvement of teachers' planning behaviors, or their ability to spontaneously generate higher

level thinking questions throughout the course of their instructional delivery, it is difficult to

ascertain whether or not the procedures would be worthwhile. Future research should include a

maintenance phase or focus on developing the capacity to plan for questioning within grade level

teams in a school. Future research may also employ some randomization techniques to discern

whether or not the small increases in the percentage of higher level questions posed during

treatment are significant.

Reflection

My experiences in learning about single subject research designs and planning and

conducting my own study were very valuable in demonstrating the multiple factors and

considerations one must anticipate when conducting research. The greatest implication of this

experience for me was understanding that even though I thought I had planned for and deeply

considered my methodology and procedures, the process of data collection showed me there

were many things I had in fact not factored into my design ahead of time and needed to adjust

for throughout the research process. For example, I had planned to conduct my data collection

during the direct instruction portion of the math workshop lesson, which, I realized in practice, is

perhaps too loose a definition. Prior to my research, I tended to think of this instructional time

as the time shortly after the beginning of class, when students have just completed checking their

homework in pairs and gathered on the carpet for the mini-lesson. I have been accustomed to

observing and participating in this routine, but when in the process of data collection, I realized

that this direct instruction time is not always fully utilized for new instruction; it is sometimes

used for some routine bookkeeping, e.g., questioning students about their partners' homework

completion, going over the schedule for the day or noting upcoming assignments and quizzes,

etc. This type of questioning was not instructional in nature, and often made it difficult for me to

ascertain the precise moment at which actual instruction was beginning. Even though I was

familiar with the research setting prior to data collection, I never would have realized how much

time was devoted to these items prior to conducting this study. I ultimately had to readjust my

assumptions of when to start recording the mini-lesson, as it was not as compartmentalized to a

given time as one would ideally hope.

Another realization that only came about through the process of data collection is that

even with operationalized definitions based in a largely researched concept, it was still difficult

to categorize the level of questioning for some questions. It is true that in Bloom's Taxonomy, it

is possible to use a particular keyword in multiple levels of questioning, so that ambiguity often

made it much more difficult to discern a level for each question asked. For this reason, I am

extremely glad to have gone through with collecting a permanent product, a voice recording of

each direct instruction lesson, in order to capture each question asked and be afforded more time

to deliberate on the best categorization for each. I ultimately only cared about whether or not a

question fell in one of two categories, lower level or higher level, as opposed to Bloom's six

levels of questioning, which made these determinations easier, but quickly realized that it would

have been an almost insurmountable task to come up with a consistent level of interobserver

agreement had I tried to distinguish between all six levels.

While the permanent products certainly relieved the pressure of coding data on the spot,

one of my greatest lessons was learning not to defer coding the data. I did not make time to code

any of my initial baseline data until the very end of the phase, and it was only then that I realized

how difficult the recordings were to hear, given where I had placed my iPhone, the recording device I

used, relative to my participant. I ultimately had to hook up the recording device to a pair of

large external speakers which proved to be very cumbersome. At the very least, I was able to

catch this issue early enough in the process to adjust for future recordings. This initial recording

issue was not my only technological faux pas, however; during one session my recording device,

my personal iPhone, was not adequately charged, ran out of battery, and resulted in me needing

to exclude the session from my data collection. This was supposed to be a session in my first

treatment phase, which then also meant I had to go back to my participant and conduct an

additional lesson co-planning session to reflect the material covered for an additional

instructional day.

While these lessons are largely a result of my own learning process as a researcher, I also

learned how much factors outside of my control have changed, and will continue to change, my research.

While the structure of the math block is largely consistent throughout the school year, and

largely conforms to the math workshop model, there are times when the typical structure was

broken by the teacher's choice or an external factor, such as school assemblies, an assessment

that day, and participant absenteeism. While I ultimately had to adjust to these schedule

disruptions, whether it meant skipping a planned session because the participant was absent or

recording a more limited direct instruction time than what usually occurred, it helped me to

realize just how difficult it is to control for the variety of variables that may impact how a

particular session might go. As a researcher, these adjustments can be quite frustrating, but as a

practitioner, I also understand that these changes in plans are sometimes unavoidable.

A final reflection I have for my research study is an issue I have grappled with

extensively throughout my project planning process, as well: what recording and/or

transformation of my data best illustrates how a teacher's questioning practices impact students'

exposure to higher level thinking opportunities? On some occasions during my study, a very rich

discussion occurred after a higher level thinking question was posed by the teacher; however,

sometimes the teacher also happened to ask a higher frequency of lower level questions in these

sessions, essentially washing out the data for what actually occurred that day in the classroom.

One well-posed question can certainly result in many minutes of discussion, so in the future it

might be useful to look at the duration of time students have to consider the higher level thinking

questions posed during their instructional time. While this ultimately changes the behaviors

being observed in the teacher and necessitates a different recording tool and procedure, along

with factoring in things such as how much wait time should ideally be provided to students, it

may yield results more reflective of what students were experiencing in terms of higher level

thinking as a direct result of a changing teacher behavior.

While single subject research design is not a typical methodology used within my field of

study, this class and this project have been a very worthwhile experience. I am glad to have

expanded my methodological understandings beyond that of group research and as a future

educational leader, can see the utility in understanding this methodology for use with students

who may have special learning circumstances. Additionally, the lessons I have learned through

the process of conducting a study are not necessarily specific to this research design, and will

ultimately shape my way of thinking and encourage my preparedness when I embark on future

research endeavors.

References

Ennis, R.H. (1985). A logical basis for measuring critical thinking skills. Educational

Leadership, 43(2), 44-48.

Gast, D.L. & Ledford, J.R. (2014). Applied research in education and behavioral sciences. In D.

Gast & J. Ledford (Eds.) Single case research methodology: Applications in special

education and behavioral sciences (pp. 1-18). New York, NY: Routledge.

Gast, D.L. & Baekey, D.H. (2014). Withdrawal and reversal designs. In D. Gast & J. Ledford

(Eds.) Single case research methodology: Applications in special education and

behavioral sciences (pp. 211-250). New York, NY: Routledge.

Gunn, T. & Hollingsworth, M. (2013). The implementation and assessment of shared 21st

century learning vision: A district-based approach. Journal of Research on Technology

in Education, 45(3), 201-228. doi:10.1080/15391523.2013.10782603

Jacobsen-Lundeberg, V. (2013). Communication, collaboration and credibility: Empowering

marginalized youth with 21st century skills. International Journal of Vocational

Education & Training, 21(2).

Kay, K. & Greenhill, V. (2011). Twenty-first century students need 21st century skills. In G. Wan

& D. Gut (Eds.), Bringing schools into the 21st century (pp. 41-65). Netherlands:

Springer. http://dx.doi.org/10.1007/978-94-007-0268-4_3

Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., &

Shadish, W. R. (2010). Single-case designs technical documentation. Retrieved from

What Works Clearinghouse website: http://ies.ed.gov/ncee/wwc/pdf/wwc_scd.pdf.

Kugelman, F. (n.d.). Bloom's taxonomy. Retrieved from

http://www.bloomstaxonomy.org/Blooms%20Taxonomy%20questions.pdf.

Robinson, V. & Timperley, H. (2007). The leadership of the improvement of teaching and

learning: Lessons from initiatives with positive outcomes for students. Australian Journal

of Education, 51, 247-262.

Scruggs, T. E., Mastropieri, M. A., & Casto, G. (1987). The quantitative synthesis of single

subject research: Methodology and validation. Remedial and Special Education, 8, 24-33.

Scruggs, T. E., & Mastropieri, M. A. (1998). Summarizing single-subject research: Issues and

applications. Behavior Modification, 22, 221-242.

Seker, J. & Komur, S. (2008). The relationship between critical thinking skills and in-class

questioning behaviors of English language teaching students. European Journal of

Teacher Education, 31(4), 389-402.

Spillane, J. P., Halverson, R., & Diamond, J. B. (2001). Investigating school leadership practice:

A distributed perspective. Educational Researcher, 30, 23-28.

Supovitz, J. (2009). Can high stakes testing leverage educational improvement? Prospects from

the last decade of testing and accountability reform. Journal of Educational Change, 10,

211-227.

Vescio, V., Ross, D., & Adams, A. (2008). A review of research on the impact of professional

learning communities on teaching practice and student learning. Teaching and Teacher

Education, 24, 80-91. doi:10.1016/j.tate.2007.01.004

Voogt, J. & Roblin, N. (2012). A comparative analysis of international frameworks for 21st

century competences: Implications for national curriculum policies. Journal of

Curriculum Studies, 44(3), 299-321. doi:10.1080/00220272.2012.668938



Appendix A

Blooms Taxonomy Reference Guide

The following reference was printed out in hard copy and used at the beginning of both

treatment phases as an introduction and training tool on Bloom's Taxonomy. It was adapted

from an online resource to provide both the participant and the interobserver a succinct guide

with useful keywords and question starters for each of the six levels of questioning (Kugelman,

F., n.d). After the initial training, both the participant and interobserver were given the reference

guide to use at their discretion.

BLOOM'S TAXONOMY

Bloom's Taxonomy provides an important framework for teachers to use to focus on

higher order thinking. By providing a hierarchy of levels, this taxonomy can assist teachers in

designing performance tasks, crafting questions for conferring with students, and providing

feedback on student work.

This resource is divided into different levels, each with keywords that exemplify the level

and questions that focus on that same critical thinking level. Questions for Critical Thinking can

be used in the classroom to develop all levels of thinking within the cognitive domain. The

results will be improved attention to detail, increased comprehension, and expanded problem-

solving skills. Use the keywords as guides to structuring questions and tasks. Finish the

questions with content appropriate to the learner.

The six levels are:

Level I Knowledge

Level II Comprehension

Level III Application

Level IV Analysis

Level V Synthesis

Level VI Evaluation

Bloom's Level I: Knowledge

Exhibits memory of previously learned material by recalling fundamental facts, terms, basic

concepts and answers about the selection.

Keywords: who, what, why, when, omit, where, which, choose, find, how, define, label, show,

spell, list, match, name, relate, tell, recall, select

Questions:

What is . . . ? Can you select . . . ? Where is . . . ? When did ____ happen?

Who were the main . . . ? Which one . . . ? Why did . . . ? How would you describe . . . ?

When did . . . ? Can you recall . . . ? Who was . . . ? How would you explain . . . ?

How did ___ happen? Can you list the three . . . ? How is . . . ?

How would you show . . . ?

Bloom's Level II: Comprehension

Demonstrate understanding of facts and ideas by organizing, comparing, translating, interpreting,

giving descriptors and stating main ideas.

Keywords: compare, contrast, demonstrate, interpret, explain, extend, illustrate, infer, outline,

relate, rephrase, translate, summarize, show, classify

Questions:

How would you classify the type of . . . ? How would you compare . . . ? Contrast . . . ?

Will you state or interpret in your own words . . . ?

How would you rephrase the meaning . . . ?

What facts or ideas show . . . ? What is the main idea of . . . ?

Which statements support . . . ? Which is the best answer . . . ?

What can you say about . . . ? How would you summarize . . . ?

Can you explain what is happening . . . ? What is meant by . . . ?

Bloom's Level III: Application

Solve problems in new situations by applying acquired knowledge, facts, techniques and rules in

a different, or new way.

Keywords: apply, build, choose, construct, develop, interview, make use of, organize,

experiment with, plan, select, solve, utilize, model, identify

Questions:

How would you use . . . ? How would you solve ___ using what you've learned?

What examples can you find to . . . ? How would you show your understanding of . . . ?

How would you organize _______ to show . . . ?

How would you apply what you learned to develop . . . ?

What approach would you use to . . . ? What other way would you plan to . . . ?

What would result if . . . ? Can you make use of the facts to . . . ?

What elements would you use to change . . . ? What facts would you select to show . . . ?

What questions would you ask during an interview?

Bloom's Level IV: Analysis

Examine and break information into parts by identifying motives or causes. Make inferences and

find evidence to support generalizations.

Keywords: analyze, categorize, classify, compare, contrast, discover, dissect, divide, examine,

inspect, simplify, survey, test for, distinguish, list, distinction, theme, relationships, function,

motive, inference, assumption, conclusion, take part in

Questions:

What are the parts or features of . . . ? How is _______ related to . . . ?

Why do you think . . . ? What is the theme . . . ? What motive is there . . . ?

Can you list the parts . . . ? What inference can you make . . . ?

What conclusions can you draw . . . ? How would you classify . . . ?

How would you categorize . . . ? Can you identify the different parts . . . ?

What evidence can you find . . . ? What is the relationship between . . . ?

Can you make a distinction between . . . ? What is the function of . . . ?

What ideas justify . . . ?

Bloom's Level V: Synthesis

Compile information together in a different way by combining elements in a new pattern or

proposing alternative solutions.

Keywords: build, choose, combine, compile, compose, construct, create, design, develop,

estimate, formulate, imagine, invent, make up, originate, plan, predict, propose, solve, solution,

suppose, discuss, modify, change, original, improve, adapt, minimize, maximize, theorize,

elaborate, test, happen, delete

Questions:

What changes would you make to solve . . . ? How would you improve . . . ?

What would happen if . . . ? Can you elaborate on the reason . . . ?

Can you propose an alternative . . . ? Can you invent . . . ?

How would you adapt ____________ to create a different . . . ?

How could you change (modify) the plot (plan) . . . ? What facts can you compile . . . ?

What way would you design . . . ? What could be combined to improve (change) . . . ?

Suppose you could _____, what would you do? How would you test . . . ?

Can you formulate a theory for . . . ? Can you predict the outcome if . . . ?

How would you estimate the results for . . . ? What could be done to minimize (maximize) . . . ?

Can you construct a model that would change . . . ? How is _____ related to . . . ?

Can you think of an original way for the . . . ? What are the parts or features of . . . ?

Why do you think . . . ? What is the theme . . . ? What motive is there . . . ?

Can you list the parts . . . ? What inference can you make . . . ? What ideas justify . . . ?

What conclusions can you draw . . . ? How would you classify . . . ?

How would you categorize . . . ? Can you identify the different parts . . . ?

What evidence can you find . . . ? What is the relationship between . . . ?

Can you make the distinction between . . . ? What is the function of . . . ?

Bloom's Level VI: Evaluation

Present and defend opinions by making judgments about information, validity of ideas or quality

of work based on a set of criteria.

Keywords: award, choose, conclude, criticize, decide, defend, determine, dispute, evaluate,

judge, justify, measure, compare, mark, rate, recommend, rule on, select, agree, appraise,

prioritize, opinion, interpret, explain, support, importance, criteria, prove, disprove, assess,

influence, perceive, value, estimate, deduct

Questions:

Do you agree with the actions/outcome . . . ? What is your opinion of . . . ?

How would you prove/disprove . . . ? Can you assess the value or importance of . . . ?

Would it be better if . . . ? Why did they (the character) choose . . . ?

What would you recommend . . . ? How would you rate the . . . ?

How would you evaluate . . . ? How would you compare the ideas . . . ? the people . . . ?

How could you determine . . . ? What choice would you have made . . . ?

What would you select . . . ? How would you prioritize . . . ? How would you justify . . . ?

What judgment would you make about . . . ? Why was it better that . . . ?

How would you prioritize the facts . . . ? What would you cite to defend the actions . . . ?

What data was used to make the conclusion . . . ?

What information would you use to support the view . . . ?

Based on what you know, how would you explain . . . ?



Appendix B

Lesson Co-Planning Template

Date of Lesson:

Standard:

Key Vocabulary:

Key Understandings:

Direct Instruction Activity:

Higher Level Thinking Questions During Direct Instruction:

Level 4:
-
-

Level 5:
-
-

Level 6:
-
-

Independent Activities:

Methods of Checking for Understanding:

Appendix C

Recording Tool

Levels of Questioning Recording Tool


Date:          Start Time:          End Time:          Duration of Observation:

Phase (circle one):   A   B          Session #:

A (Baseline): teacher plans and implements lesson without support
B (Treatment): teacher implements lesson after Bloom's Taxonomy training and lesson co-planning session

Teacher Behavior                              Frequency

Lower Level Thinking Questions
    Poses Knowledge Question
    Poses Comprehension Question
    Poses Application Question

Higher Level Thinking Questions
    Poses Analysis Question
    Poses Synthesis Question
    Poses Evaluation Question

Total Questions:

Percentage of Questions that are Higher Level Thinking:

Appendix D

Sample Baseline Record



Appendix E

Sample Lesson Co-Planning Template



Appendix F

Sample Treatment Record



Appendix G

Data Collection Spreadsheet

Primary Observer

Day   Date    Session #   Phase   # of LL   # of HL   Total Questions   % HL   Notes
F     10/28   1           B1      16        0         16                0%
M     10/31   2           B1      10        0         10                0%
T     11/1    3           B1      7         0         7                 0%
W     11/2    4           B1      12        0         12                0%
R     11/3    5           B1      8         0         8                 0%     Treatment training during Th planning block
F     11/4    6           T1      9         1         10                10%
R     11/10   7           T1      19        1         20                5%     M/T school holidays; W recording issue
F     11/11   8           T1      8         2         10                20%
M     11/14   9           T1      5         1         6                 17%    Tues: teacher absent
W     11/16   10          T1      12        2         14                14%
R     11/17   11          B2      13        0         13                0%
F     11/18   12          B2      6         0         6                 0%
M     11/21   13          B2      12        0         12                0%
T     11/22   14          B2      16        0         16                0%
W     11/23   15          B2      14        0         14                0%     Treatment training after school at 1:20 before Thanksgiving holiday
M     11/28   16          T2      8         0         8                 0%
T     11/29   17          T2      7         1         8                 13%
W     11/30   18          T2      10        1         11                9%
R     12/1    19          T2      11        0         11                0%
F     12/2    20          T2      13        2         15                13%

Interobserver

Day   Date    Session #   Phase   # of LL   # of HL   Total Questions   % HL   IOA
F     10/28   1           B1      16        0         16                0%     100%
F     11/4    6           T1      8         1         9                 11%    91%
R     11/17   11          B2      13        0         13                0%     100%
M     11/28   16          T2      8         0         8                 0%     100%

AVG IOA: 98%

Appendix H
Fidelity of Treatment Checklist

Checklist for Bloom's Taxonomy Professional Development

DIRECTIONS: Please place a check mark by each bulleted item as you observe it being
completed. If you do not observe the item taking place, do not check the bulleted point.

o Bloom's Taxonomy summary pages were distributed to the teacher

o The teacher was informed that they would be learning about different levels of

questioning during today's training

o All six levels of Bloom's Taxonomy were read aloud to the teacher

o Keywords at each of the six levels were read to the teacher

o Sentence starters at each of the six levels were read to the teacher

o The teacher was asked to generate an exemplar question for each of the six levels

o The researcher affirmed or corrected the teacher's exemplar question, pointing out why

the question would or would not be classified within each level

o Bloom's Taxonomy summary pages were left with the teacher, with an explanation that

these materials could be used as a reference or guide at any time
