Gregory A. Petrow, Associate Professor of Political Science, University of Nebraska, Omaha
Teaching quantitative analysis is important in disciplines such as political science that study human society scientifically. It allows students to become better consumers of political science research, opens up the empirical exploration of the political world to them, and makes them more discerning consumers of news (see, for example, Lupia 2000 on the public and applied values of political science research). Additionally, we recognize that some of our students will continue on to advanced study and conduct statistical analysis themselves; for these students, we want to begin the training they will need to analyze political data. Reflecting how important this training is to the mission of political science departments, most require students to pass a research methods course before graduating, and these courses frequently include a data analysis component (Thies and Hogan 2005). Presumably thousands, or tens of thousands, of students take such courses every semester, which makes understanding the pedagogy of teaching data analysis to political science students very important.
Unfortunately, teaching political statistics well is difficult. Students are afraid of data analysis due to math anxiety (e.g., Papanastasiou and Zembylas 2006), and they are uninterested in the topic because they do not perceive it as relevant to their future occupations (Murtonen, Olkinuora, Tynjala, and Lehtinen 2008). Anecdotally, many students claim to study political science precisely in order to avoid quantitative analysis, so when they end up in required data analysis courses, they arrive uninterested and unmotivated (see Earley 2014 for a review). To make the teaching of political analysis even more challenging, undergraduate degree programs are moving on-line. Students who wrestled with math anxiety while sitting in a classroom with the professor and fellow students may now also feel alone and isolated, which may contribute even more to a feeling of being overwhelmed.
Against this backdrop, scholars study the efficacy of different teaching techniques for increasing learning and student satisfaction in the on-line delivery mode. Researchers find that multimedia presentations of material improve student satisfaction and engagement. The value of multimedia is greatest on-line, where the lack of a physical classroom also removes the personal dynamics found there (Mandernach 2009). Because of these limits of on-line education, instructor-created content may have a greater effect in that delivery mode. After all, instructor-created multimedia lets on-line students experience instructors as living, breathing people, so that the teaching comes across in ways more similar to how it would in the flesh-and-blood environment.
Commonly used multimedia tools include recorded PowerPoint lectures and recordings of live lectures. Another tool available to some instructors is the light board: an illuminated sheet of glass situated between a video camera and the instructor. The instructor writes on it with fluorescent markers, much as one would write on a chalkboard or whiteboard, while addressing the camera directly. Of course, the writing would appear backward to the camera, so the entire image is flipped in software. As a result, students' computer screens show what looks like a chalkboard, but with the added benefit that the writing has a fluorescent, multi-colored glow. Another effect is that the backdrop is jet black and the instructor is lit, creating a strong contrast with the background that draws focus to the instructor. The benefit of this type of tool is that the professor engages with the course material and the students simultaneously. This is an improvement over the chalkboard or whiteboard, where the instructor's back faces the audience while he or she writes. The light board maintains the pedagogical advantages and ease of use of the traditional whiteboard lecture, while improving upon them for the on-line learning format (Friedland, Knipping, Schulte, and Tapia 2004).
The instructors who can use this technology are limited to those at universities that have purchased light boards. I find the light board a great method for teaching statistics because I can write equations and work algebraic problems on the board, as well as draw out statistical concepts (like the bell-shaped curve of the normal distribution), all while keeping my attention focused on the camera. When a chalkboard is full, the instructor has to take time to erase everything while students sit passively. With the light board, however, the recording technician can either edit out the cleaning of the board or pause the recording; from the students' perspective, the writing simply vanishes and the professor continues with the new material.
There is evidence that students especially appreciate it when instructors use the light board (Southard and Young 2018). My own experience supports this conclusion as well. At this point in my career, I have been teaching statistics on-line for about 15 years. I started recording lectures early on, writing on a computer tablet while capturing the accompanying video and audio. I have been updating these lectures using the light board, and also by recording screen captures on my computer while I speak and perform analysis with software. In my course evaluations, students clearly prefer the light board presentations, and they request more of them. Unfortunately, right after I recorded my first few light board lectures, the COVID pandemic hit, and the lab housing the light board had to close. I am anticipating the time when I can return to record more lectures.
Earley, Mark A. 2014. “A Synthesis of the Literature on Research Methods Education.” Teaching in Higher Education 19(3): 242-53.
Friedland, Gerald, Lars Knipping, Joachim Schulte, and Ernesto Tapia. 2004. “E-Chalk: A Lecture Recording System using the Chalkboard Metaphor.” Interactive Technology & Smart Education 1: 9-20.
Lupia, Arthur. 2000. “Evaluating Political Science Research: Information for Buyers and Sellers.” PS: Political Science & Politics 33 (March): 7-13.
Mandernach, B. Jean. 2009. "Effect of Instructor-Personalized Multimedia in the Online Classroom." International Review of Research in Open and Distance Learning 10(3): 1-19.
Murtonen, Mari, Erkki Olkinuora, Paivi Tynjala, and Erno Lehtinen. 2008. "'Do I Need Research Skills in Working Life?': University Students' Motivation and Difficulties in Quantitative Methods Courses." Higher Education 56 (5): 599-612.
Papanastasiou, Elena C. and Michalinos Zembylas. 2006. “Anxiety in Undergraduate Research Methods Courses: Its Nature and Implications.” Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA, April.
Southard, Sheryne and Karen Young. 2018. "An Exploration of Online Students' Impressions of Contextualization, Segmentation, and Incorporation of Light Board Lectures in Multimedia Instructional Content." The Journal of Public and Professional Sociology 10(1): Article 7. https://digitalcommons.kennesaw.edu/jpps/vol10/iss1/7/ (Accessed July 29, 2021).
Thies, Cameron G. and Robert E. Hogan. 2005. “The State of Undergraduate Research Methods Training in Political Science.” PS: Political Science & Politics 38 (April): 293-97.
Published since 2005, The Political Science Educator is the newsletter of the Political Science Education Section of the American Political Science Association. Dr. Bobbi Gentry (Bridgewater College) was the Editor for the Fall 2021 edition. Since 2020, APSA Educate has co-published The Political Science Educator.