
Scot W. McNary Ph.D.

Associate Professor

Department of Educational Technology and Literacy

College of Education

Detailed Supporting Statement: Teaching


For all my courses I create and maintain Blackboard sites containing assignments, course notes in the form of PowerPoint presentations, example papers, public datasets used in class examples, the syllabus, and external resources helpful for the course. I use the discussion board extensively in some courses (EDUC 761/ISTC 685), with assigned posts, less so in others (EDUC 605, EDUC 715), and not at all in still others (EDUC 790). I use interactive whiteboards (Polyvision and Smart Board) extensively for projecting slide presentations, viewing video, demonstrating software, and making on-the-fly sketches and computations. Increasingly, I have relied on Blackboard to assign, collect, and grade written products. I have experimented with audio feedback and flipped classrooms with mixed results. I have continued to use online probability and effect size calculators for in-class and homework exercises, but more recently have begun to ask students to use simulation exercises in both master's and doctoral level classes to help clarify concepts related to probability and statistical analysis.
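To give a concrete sense of the kind of simulation exercise I mean, the sketch below is a minimal illustration of my own (not a specific class assignment; the skewed population, sample sizes, and number of replications are arbitrary choices). It lets students watch the sampling distribution of the mean narrow and approach normality as the sample size grows.

# Minimal sketch of a simulation exercise on sampling distributions
# (illustrative only; population, sample sizes, and replication count are arbitrary).
import numpy as np

rng = np.random.default_rng(seed=2016)
population = rng.exponential(scale=2.0, size=100_000)  # a skewed "population"

for n in (5, 30, 100):
    # Draw 5,000 samples of size n and record each sample mean.
    sample_means = np.array([
        rng.choice(population, size=n, replace=False).mean()
        for _ in range(5_000)
    ])
    print(f"n = {n:3d}: mean of sample means = {sample_means.mean():.3f}, "
          f"empirical SE = {sample_means.std(ddof=1):.3f}, "
          f"theoretical SE = {population.std(ddof=1) / np.sqrt(n):.3f}")

Students can compare the empirical and theoretical standard errors, or plot histograms of the sample means, to see the Central Limit Theorem at work.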

In the spring of 2016, Dr. Lohnes Watulak, Dr. Sadera, and I developed a new course as part of a change to the research sequence in the doctoral program. Prior to 2016, the research sequence consisted of intermediate and advanced versions of both quantitative and qualitative analysis, yielding four courses from which doctoral students were required to choose three. The revision we implemented created three courses: one foundational course (EDUC 789) and two intermediate-advanced courses, one in quantitative methods (EDUC 790) and one in qualitative methods (EDUC 791). Along with Dr. Lohnes Watulak, I taught the initial offering of EDUC 789 (then labeled a special topics course, EDUC 670) in the spring of 2016.

Doctoral Program courses

EDUC 715 Statistical Principles of Research Design and Analysis

This course was a required methods course for doctoral students prior to the change in the research sequence in 2016. I taught this course twice in the review period. Most students have some apprehension about taking the course, as indicated by their conversations in and outside of the classroom. However, there is also considerable variability in how prepared students are to take it. This presents a challenge since no pace will match all students' needs, but it is a rewarding class to teach because I believe it equips students with the tools they need to be critical readers of the literature they will be compiling and to develop conceptual plans for their dissertations. A few non-ISTC doctoral students have taken the course as well.

EDUC 790 Advanced Measurements and Statistics in Education

I taught this course twice in the review period. The first time was before the change in the research sequence, when this course was an elective that followed EDUC 715. Only one student enrolled in the course, and we treated it like an extended independent study in statistical analysis, including generalized linear model approaches (e.g., logistic regression) and latent variable approaches (e.g., principal components and factor analysis). I had never before covered that breadth of topics in the course. The student was capable and interested, so we were able to move quite quickly through basic material and on to more advanced material that matched his interests. The second time I taught this course it was part of the new research sequence, which meant that more students with less prior exposure to statistical analysis would take it. It is also now required rather than a potential elective. This meant I needed to adapt my focus to make sure students mastered a common set of material. I decided to focus on the assumptions of significance tests and on non-parametric alternatives. My intention was to provide heuristics to help students with future statistical reasoning challenges by considering the assumptions each analysis technique requires and demonstrating alternatives for data that violate those assumptions. I found this very satisfying since it took us away from a “cookbook” approach to analysis and allowed students to think through what assumptions each analysis makes and why. During this same semester I was inspired by a lecture on gamification, so I edited my homework assignments to include elements of play, discovery, and challenge. Student feedback at the end of the semester was mixed, but comments within the classroom suggested the revised assignments were engaging. This course has an inherent challenge: students understand statistical analysis best when they have an immediate application for their learning. Although I provide practice datasets, there is no substitute for one's own data when it comes to learning data analysis. I am hopeful that introducing simulation into the coursework may be a compromise between canned problems (minimal interest but minimal difficulty to obtain) and dissertation data (maximal interest but maximal difficulty to obtain).
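As one illustration of that emphasis on assumptions and alternatives, the sketch below is my own (the scenario, group sizes, and seed are invented, and it is not an actual course assignment). It simulates two skewed groups, checks the normality assumption, and then runs both an independent-samples t-test and its non-parametric alternative, the Mann-Whitney U test.

# Sketch: simulated skewed groups, an assumption check, and a non-parametric alternative.
# (Illustrative only; not a course assignment.)
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=790)
group_a = rng.lognormal(mean=1.0, sigma=0.6, size=25)  # skewed outcome scores
group_b = rng.lognormal(mean=1.3, sigma=0.6, size=25)

# Informal check of the normality assumption for each group (Shapiro-Wilk).
for label, group in (("A", group_a), ("B", group_b)):
    w, p = stats.shapiro(group)
    print(f"Group {label}: Shapiro-Wilk W = {w:.3f}, p = {p:.3f}")

t, p_t = stats.ttest_ind(group_a, group_b)     # parametric comparison
u, p_u = stats.mannwhitneyu(group_a, group_b)  # non-parametric alternative
print(f"t-test:       t = {t:.2f}, p = {p_t:.4f}")
print(f"Mann-Whitney: U = {u:.1f}, p = {p_u:.4f}")

Because the data are simulated, students can vary the skew or the sample sizes and watch how the two tests respond, which speaks to the compromise between canned problems and dissertation data described above.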

ISTC 695 Independent Study

I have taught six independent study courses, all of which entailed students developing elements of their dissertations (e.g., a data analysis plan). These experiences were entirely self-paced and resulted in a written final project.

ISTC 691/694 Directed Reading

Directed Reading courses are self-paced, one-credit experiences that provide students with a refresher on, or introduction to, statistical analysis. They are assigned by the admissions committee on the basis of perceived need. I taught nine Directed Reading experiences in the review period. There are three required assignments for each student: 1) five chapter summaries from Salkind, N. (2011), Statistics for People Who (Think They) Hate Statistics (5th ed., Sage); 2) five computation problems; and 3) three article critiques, with the articles chosen by the students. The first two assignments can be corrected after grading until mastery; the article critiques cannot be re-attempted. I offer to meet with the students as frequently as they wish, in person, by phone, or by email. Most students interact with me via email when they turn in assignments; however, a few students have requested regular in-person meetings. Most students do quite well in the experience; however, the two students who received poor grades in the course did so because they failed to complete assignments.

Master’s Program courses

EDUC 605 Research and Information Technology

I have taught this course five times in the review period. In principle, it is a prerequisite for EDUC 761 or ISTC 685, and a literature review is the key assignment for the course. In addition to learning where to find academic literature and how to evaluate it for credibility and suitability, students must learn how to synthesize the literature they compile in a persuasive way. The last three times, I taught this course in a hybrid format. Online modules consist of exercises in which students put together elements of their literature review for evaluation by me and by peers. During the in-person meetings, I discuss how to evaluate empirical research articles, and students use a workbook by Pyrczak (Pyrczak, F., Evaluating Research in Academic Journals, 5th ed.) to practice evaluating the different sections of those articles. I also ask students to complete in-class writing assignments to flesh out their draft literature reviews (e.g., background, statement of purpose).
EDUC 761 Research in Education

I taught this course twice in the review period. The M.Ed. programs for elementary, secondary, and middle schools did not have students enrolling in this course. The few who did take it were admitted to the ISTC 685 course. I teach both courses in the same way, so including both groups in the same course presents no obvious difficulty for me or the students. There is some suggestion that the M.Ed. programs will have more candidates who may take EDUC 761 in the future, so its status as a standalone course may be re-established.

ISTC 685 Research in Instructional Technology

I taught this course ten times in the review period, many of those offerings off campus for cohorts. As mentioned above, I have had EDUC 761 students take this course. The key assignment for ISTC 685/EDUC 761 is to write a full research proposal, including a literature review and methodology, for a study on a topic of the student's choice. The course entails learning about research design, sampling, measurement, ethics, data analysis, and writing in a technical genre. More recently, I have introduced simulations, through third-party applications and some of my own, to help students understand measurement concepts and data analysis procedures. I have the most experience teaching this course, and so am most comfortable both teaching it and attempting new procedures in it.

Towson Seminar

I have taught this course four times in the review period. This is the only course I teach to undergraduates. The goal of the course is to teach students to write a research paper using APA format. I rely heavily on the ideas from Graff & Birkenstein's “They Say, I Say” college writing text: students complete weekly in-class writing assignments in which they summarize their understanding of the readings (“They Say”) and their responses to them (“I Say”). I have them read four to five contentious topics in education (e.g., paying students for grades), in a variety of forms (e.g., expert opinion, original primary research) and from various sides of the argument each week, and I allow them to pick one of those topics, or some other topic of interest, for their final paper. In the last two offerings of the course, I attempted to gamify instruction on APA formatting concepts by having students work in pairs to find as many APA formatting mistakes as possible in poorly formatted writing samples within a fixed time period. I had five weekly sessions of “Find-A-Goof” during the semester.

Consulting

Doctoral Students

Working with dissertation students is one of the joys of my career. It is a rare opportunity to watch the development of an idea and see it come to fruition. I have consulted with between 12 and 15 doctoral students per year, many of whom carry over from year to year. Including both students for whom I serve as a committee member and those for whom I do not, I have spent a total of 615 hours in consultation with doctoral students in the review period. I have been a committee member for 23 doctoral students at Towson since 2013 and am currently a member of 11 dissertation committees. I have also consulted with five doctoral students who did not elect to have me on their committees.

Master's Students

I began to offer statistical and design consultation to ISTC 685 students during their courses with me. Since 2013, two students have taken me up on the offer. I helped them with their Institutional Review Board (IRB) applications as faculty sponsor, and then met with them about data collection, management, and analysis. I also provided minor edits to their drafts as needed.
Faculty

One of the joys of my career is consulting with valued colleagues to help them produce the research they are invested in. Since 2013, I have consulted with 21 different faculty members, between five and 13 per year, for a total of 388 hours, an average of 76 hours per year. Our work has resulted in the submission of several papers and presentations on which I am a joint author (see below for an explication of all papers submitted).

Trajectory

I expect to continue teaching research courses for M.Ed. and Ed.D. students. The relatively new EDUC 789 course will need refinement as it is taught for the second time in spring 2019. I am considering teaching EDUC 790, the quantitative methodology course, as a flipped class again. I will continue to teach for the Towson Learning Network and on campus. I am interested in developing the idea of helping students use simulation to generate data for analysis, so I will continue to explore third-party and home-grown applications for that purpose.
Teaching Philosophy

The students who take my courses are adults, typically education professionals with full-time careers and family obligations. It is rare to have a full-time student in class. Because they are adult learners, I try to accommodate their needs and wishes as I plan my courses and interact with them in class. First, I try to make what I teach as relevant to their experience as possible by providing real-world examples and ample opportunity for interaction with me and with classmates. Second, I believe in providing case-like problems (in the doctoral program classes) and individual projects (in Master's program classes) to allow for individuality in learning and demonstration of knowledge. I have more recently begun to introduce simulations into graduate-level coursework to further support individualized learning and trial and error. Third, I provide considerable feedback and multiple opportunities to redo assignments because I believe this promotes mastery learning, a method of grading more appropriate for self-motivated students. Finally, for graduate students, but especially for doctoral students, I believe presenting controversial points of view on methodology and research design develops both the fluency and the critical perspective necessary for the independent researcher.


Scholarship Artifact

There are several reasons to showcase this article as an example of scholarship. First, I appreciate how our work flowed naturally from a qualitative study, begun before I became involved, into a mixed-methods study after I began working with Drs. McQuitty and Ballock. Researchers who specialize in mixed-methods design (e.g., Creswell, J. W., & Plano Clark, V. L. (2018). Designing and Conducting Mixed Methods Research. Sage) might call this a QUAL>quan (exploratory sequential) design, in which the qualitative analysis preceded and was primary to the quantitative data analysis. Drs. McQuitty and Ballock had explored their open and axial coding as far as they felt they could, yet still felt there was more to learn from the data. Dr. McQuitty approached me for ideas about how to look at their data in a quantitative way. They wanted to see how all of their coding categories of student noticing collectively hung together. In this way, we could make maximal use of something humans do well (making meaning from socially mediated events) while also making maximal use of what computer algorithms do best, which is to efficiently find patterns in multivariate space. In keeping with mixed-methods practice, we took the patterns discerned from the quantitative analysis back to the transcripts and used cases to determine whether the quantitative findings made sense. In our case, the qualitative data were the standard and the quantitative analysis was the tool compared against that standard.

Second, I enjoyed how this project stretched me to learn about a quantitative method, Multiple Correspondence Analysis, that I had been playing with and was curious about. There is nothing like a real problem in statistics to drive discovery. I needed to do my homework and practice the method with sample problems before I felt able to apply it to the noticing data.
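For readers unfamiliar with the method, the sketch below is a toy illustration of my own (the codes are invented and this is not the noticing data or the analysis reported in the paper). It shows the core computation of Multiple Correspondence Analysis: a correspondence analysis of the indicator matrix built from dummy-coded categories.

# Toy sketch of Multiple Correspondence Analysis (invented codes, not the study data).
import numpy as np
import pandas as pd

codes = pd.DataFrame({
    "focus":  ["content", "student", "student", "content", "self", "student"],
    "stance": ["evaluate", "describe", "interpret", "describe", "evaluate", "interpret"],
})

Z = pd.get_dummies(codes).to_numpy(dtype=float)   # indicator matrix (cases x categories)
P = Z / Z.sum()                                   # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)               # row and column masses
S = np.diag(r ** -0.5) @ (P - np.outer(r, c)) @ np.diag(c ** -0.5)  # standardized residuals
U, sing, Vt = np.linalg.svd(S, full_matrices=False)

# Principal coordinates of cases (rows) and code categories (columns) on the
# first two dimensions; squared singular values give each dimension's inertia.
row_coords = np.diag(r ** -0.5) @ U[:, :2] * sing[:2]
col_coords = np.diag(c ** -0.5) @ Vt.T[:, :2] * sing[:2]
print("share of inertia, first two dimensions:",
      np.round(sing[:2] ** 2 / (sing ** 2).sum(), 3))

Plotting the row and column coordinates together is what lets categories that tend to co-occur appear near one another, which is the sense in which the method finds patterns in multivariate space.
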
Third, the iterative process we went through was labor-intensive but very much like how we solve problems in real life: we analyze, then interpret, then test our interpretations against our data, then cycle back to perhaps re-code the data, re-analyze, re-interpret, and repeat. The anticipation of poring over the tabular findings and graphical displays for what new insight might be revealed was stimulating. On a typical afternoon, I would run some analyses, share them with Drs. McQuitty and Ballock to see if they made sense, and then incorporate their feedback into a new round of analyses. We had all agreed before beginning that the quantitative results had to make sense before we used them. Because the qualitative analyses preceded and were conceptually prioritized over the quantitative analysis, we used the qualitative results as the standard by which the meaningfulness of the quantitative results was judged. But we also found that the quantitative findings caused Drs. McQuitty and Ballock to look back at their transcripts and coding scheme with fresh eyes. Their subsequent recoding of the data led to another round of quantitative analyses. There were at least two rounds of re-coding and re-analysis like this. In the end, the qualitative and quantitative analyses each informed, and were informed by, the other. If there were a path forward I could point to for our colleagues and our students about how to do mixed-methods research, I would give this as an example. Both approaches led to mutually reinforcing interpretations.


Fourth, I think we did a very nice job of not making our paper a tutorial on Multiple Correspondence Analysis. When one writes in a journal about a method that is not commonly used, there is some responsibility to inform the reader about what one is doing. The risk of writing at length about methods is that it can overwhelm the substantive content of the paper, which is the real focus. I believe we were clear enough in our description that the method was not swept under the rug, but not so wordy that the focus on the results of the noticing research was diluted. I am proud of our collaboration.

Mary Neville with The Journal of Teacher Education interviewed us for the JTE Insider

podcast. The link to that interview is here:

https://edwp.educ.msu.edu/jte-insider/2018/podcast-interview-ballock-mcquitty-mcnary/
