Political Science Educator: volume 28, issue 1
Reflections
By Janet L. Donavan (janet.donavan@colorado.edu)
As a political scientist who regularly teaches Introduction to American Politics and Media and Politics, I treat information literacy as a key learning goal in my teaching. An important topic in these courses is how data is used to tailor content and advertising and to build behavioral models. The importance of understanding how and why our data is being used has led me to reconsider some of the educational technology, or “edtech,” that I regularly use in teaching. Because many of us understand how behavioral modeling works and how regulatory systems do or could work, we as political scientists have a role in helping students and others weigh the benefits of edtech against the potential problems with these technologies.
I tend to put edtech into three categories: useful for my students and myself; interesting but not useful for my students and myself; or not a good technology. Until recently, I had a largely positive take on integrating technology into instruction. I have served as a Faculty Fellow with the Arts & Sciences Support of Education through Technology (ASSETT) program for two separate two-year periods, and I had little trouble pivoting online during COVID-19 because I already knew the technologies. However, I have had some troubling encounters with these technologies, either in my own courses or as the Director of Undergraduate Studies in our department. Edtech products are in some cases violating student privacy, using student and faculty data to create and improve products (including training AI) without compensation, using our data to develop behavioral models and predictive analytics without compensation, and retaining data on us indefinitely. Helping students understand how their data is being used, allowing them to consent, and giving them the tools to decide whether regulation is necessary or desirable is part of understanding the role of government and the nature of data and data analysis.
Like my own, the assessment in the literature on the use of edtech in higher education has mostly been positive. Among other findings, technology helps students visualize content, promotes interaction with the instructor and other students, supports meaningful student reflection, provides opportunities for authentic learning, and improves the quality and quantity of student practice (West and Graham 2005). Learning management systems (LMSs) increase engagement and participation in simulation activities (Mathews and LaTronica-Herb 2013), clickers increase engagement among students who otherwise might not participate (Holland, Schwartz-Shea, and Yim 2013; Ulbig 2016; Ulbig and Notman 2012), and technologies like online video production and social media build students’ skills (Florez-Morris and Tafur 2010; Le and Pole 2023). Although the literature is largely positive, other scholars have been more mixed or negative on effectiveness, with Michels (2023) warning about both the promise and the potential peril of artificial intelligence (AI) and Mancillas and Brusoe (2016) finding no significant differences in learning outcomes among students taught with varying levels of edtech.
Beyond effectiveness, it is important to grapple with the ethics of new technologies, especially the privacy and intellectual property implications for our students, as well as the potential that the data created by our use of these technologies is being monetized in ways that exploit our students and ourselves. Edtech products collect learning analytics (LA) data, which institutions or individual faculty can use to address learner needs but which the companies behind the products can also use for their own purposes (Daniel 2017). Anti-plagiarism software collects student work as its database and trains both AI and AI-detection products on that content (Brown et al. 2007; Morris and Stommel 2017). Various applications track students’ personal information and location in ways that threaten individual privacy and treat students and their data as products (Apps, Beckman, and Howard 2022). These concerns apply to some of the most popular edtech tools. I have avoided naming specific products because the issues identified are similar across most products in each category.
Plagiarism and AI Detectors
I confess to adopting plagiarism detection without much thought. Our LMS has an integrated plagiarism detector, and I “checked the box” for plagiarism detection for years. My thinking was that it would help students to see their “plagiarism score” before submitting a paper and make adjustments. I design assignments using best practices to make them difficult to plagiarize (WPA 2019), and across thousands of students I have identified only a few cases of plagiarism using this software. Most were cases of sloppy attribution, or of an overly aggressive algorithm flagging common phrases as plagiarism. The main value is encouraging students to catch plagiarism themselves before submitting work. This year, an instructor reached out to me for advice after being contacted by the plagiarism detection software provider, which asked that a student assignment be sent to an instructor at another institution for review. Because the student had saved the assignment with their name in the file name (which students are often instructed to do, for example 101_Donavan_Midterm1), the student’s name had been retained in the plagiarism system and revealed to this other instructor. I became aware that student papers, as well as instructor feedback, are being saved in these systems and fed through plagiarism algorithms in ways that are sloppy at best at protecting student privacy, with student or faculty names being shared by the system whenever they appear in the file name. The significant work of both the student and the professor is being used as data, without compensation. This led me to stop “checking the box.” If I read a paper and suspect plagiarism, I will track it down myself in the future.
In another case, an instructor reached out for advice on how to proceed after suspecting AI use. By communicating with our office of information technology and running the assignment through multiple AI detectors, we learned that there is no clear evidence that AI detectors are accurate. With both plagiarism detectors and AI detectors, student writing is used to program and train software that we are being charged to use, or likely will be charged for in the future, and for which we are not receiving compensation. Whether to use plagiarism- or AI-detection software is a question on which good instructors may differ, but I encourage considering the implications. I have decided to use other means of evaluating papers that may have used AI: checking citations to make sure they are real, reading the paper for inaccurate ideas or attributions, and advising students to be more concise and specific in their writing. I allow students to use AI, but I warn them that they are responsible for the content and require them to specify that they used AI and why.
Learning Management Systems
I first used an LMS in 2007 to teach distance learning courses. Prior to that, I had taught distance learning courses with both mailed assignments and emailed assignments; teaching through an LMS for distance learning was certainly an improvement. However, I have always tried to ensure that for courses with an in-person modality, students could choose to use the LMS as a tool or not. During COVID-19, this became more difficult; since COVID-19, inclusive access policies at my institution have encouraged using textbooks that are integrated into the LMS. It is difficult to ask or require students to turn in hard copies, as there are few printing facilities on campus. The expectation has become that everything is done through the LMS.
These developments have increased the data being collected, as each individual is recorded clicking on material, spending time on activities, and completing assessments. They have also increased the amount of data collected on how we teach, how much time we spend grading, and more. In fact, LMS software is constantly collecting data on students and faculty, and companies use this data to develop products and to research how different people teach and learn (Jones 2019). The data also feeds predictive analytics that forecast which students will succeed in which courses with which pedagogical approaches (Daniel 2017). This holds promise for inclusive student success, but it also creates the possibility of selecting students for admission based on expected performance. Either way, we are being used in studies without our consent. I am still using the LMS, but I am exploring ways to mitigate privacy concerns, such as reducing integrations (lecture capture, textbooks, plagiarism detectors) and having work submitted and graded outside the LMS. I believe a broader discussion is necessary, including institutional negotiation with edtech companies and consideration of privacy legislation.
Conclusions
Developing our own course policies on edtech and explaining our decisions to students can be a launching point for discussing governmental regulation of technology, the right to privacy, data collection and analysis, behavioral modeling, and the political implications of AI. I recommend addressing these issues as part of a “tech check” when setting up a class for the semester. Important questions to ask about the tools we use include whether the current tool is the best tool for the job, whether there are ways to minimize the exposure of both student and instructor data and behavior, whether and how to make the use of data more transparent, and whether and how our edtech policies can become teachable political science issues. Then we can discuss and share our policies with our students and with colleagues across the academy. Information and technological literacy are among the biggest challenges of our times, and opportunities to examine these issues are embedded in all of our courses that use edtech tools.
References
Apps, Tiffani, Karley Beckman and Sarah K. Howard. 2022. “Edtech is Treating Students Like Products: Here’s How We Can Protect Children’s Digital Rights” The Conversation. https://theconversation.com/edtech-is-treating-students-like-products-heres-how-we-can-protect-childrens-digital-rights-184312.
Brown, Renee, Brian Fallon, Jessica Lott, Elizabeth Matthews, and Elizabeth Mintie. 2007. “Taking on Turnitin: Tutors Advocating Change.” The Writing Center Journal 27 (1): 7–28. http://www.jstor.org/stable/43442823.
Council of Writing Program Administrators (WPA). 2019. “Defining and Avoiding Plagiarism: The WPA Statement on Best Practices.” https://wpacouncil.org/aws/CWPA/pt/sd/news_article/272555/_PARENT/layout_details/false.
Daniel, Ben. 2017. “Contemporary Research Discourse and Issues on Big Data in Higher Education.” Educational Technology 57(1):18–22. http://www.jstor.org/stable/44430536.
Florez-Morris, Mauricio, and Irene Tafur. 2010. “Using Video Production in Political Science Courses as an Instructional Strategy for Engaging Students in Active Learning.” Journal of Political Science Education 6(3): 315–19. doi:10.1080/15512169.2010.494472.
Holland, Lauren, Peregrine Schwartz-Shea, and Jennifer M. J. Yim. 2013. “Adapting Clicker Technology to Diversity Courses: New Research Insights.” Journal of Political Science Education 9(3): 273–91. doi:10.1080/15512169.2013.796234.
Jones, Kyle M. L. 2019. “Learning Analytics and Higher Education: A Proposed Model for Establishing Informed Consent Mechanisms to Promote Student Privacy and Autonomy.” International Journal of Educational Technology in Higher Education 16: Article 24. https://educationaltechnologyjournal.springeropen.com/articles/10.1186/s41239-019-0155-0#Sec1.
Le, Danvy, and Antoinette Pole. 2023. “Beyond Learning Management Systems: Teaching Digital Fluency.” Journal of Political Science Education 19(1): 134–53. doi:10.1080/15512169.2022.2139268.
Mancillas, Linda K., and Peter W. Brusoe. 2016. “Born Digital: Integrating Media Technology in the Political Science Classroom.” Journal of Political Science Education 12(4): 375–86. doi:10.1080/15512169.2015.1096792.
Mathews, A. Lanethea, and Alexandra LaTronica-Herb. 2013. “Using Blackboard to Increase Student Learning and Assessment Outcomes in a Congressional Simulation.” Journal of Political Science Education 9(2): 168–83. doi:10.1080/15512169.2013.770986.
Michels, Steven. 2023. “Teaching (with) Artificial Intelligence: The Next Twenty Years.” Journal of Political Science Education 1–12. doi:10.1080/15512169.2023.2266848.
Morris, Sean Michael, and Jesse Stommel. 2017. “A Guide for Resisting Ed Tech: The Case Against Turnitin.” Hybrid Pedagogy. https://hybridpedagogy.org/resisting-edtech/.
Paris, Britt, Rebecca Reynolds, and Catherine McGowen. 2021. “Platforms like Canvas Play Fast and Loose with Student Data.” The Nation. https://www.thenation.com/article/society/canvas-surveillance/.
Ulbig, Stacy G. 2016. “I Like the Way This Feels: Using Classroom Response System Technology to Enhance Tactile Learners’ Introductory American Government Experience.” Journal of Political Science Education 12(1): 41–57. doi:10.1080/15512169.2015.1063435.
Ulbig, Stacy G., and Fondren Notman. 2012. “Is Class Appreciation Just a Click Away?: Using Student Response System Technology to Enhance Shy Students’ Introductory American Government Experience.” Journal of Political Science Education 8(4): 352–71. doi:10.1080/15512169.2012.729450.
West, Richard E., and Charles R. Graham. 2005. “Five Powerful Ways Technology Can Enhance Teaching and Learning in Higher Education.” Educational Technology 45(3): 20–27. http://www.jstor.org/stable/44429208.
—
Dr. Janet L. Donavan is a Teaching Professor, Associate Chair and Director of Undergraduate Studies in the Political Science Department at the University of Colorado Boulder. She teaches courses in American politics and American political thought.
Published since 2005, The Political Science Educator is the newsletter of the Political Science Education Section of the American Political Science Association. All issues of The Political Science Educator can be viewed here.
Editors: Colin Brown (Northeastern University), Matt Evans (Northwest Arkansas Community College)
Submissions: editor.PSE.newsletter@gmail.com
As part of APSA’s mission to support political science education across the discipline, APSA Educate has republished The Political Science Educator since 2021. Please visit APSA Educate’s Political Science Educator digital collection here.



