Discipline: Technology and Engineering
Subcategory: Education
Session: 2
Room: Exhibit Hall
Dhvani Jain - Princeton University
Over recent decades, technology has become a critical component of most classroom environments, facilitating learning by giving students access to tools that broaden access to resources and build skills in writing, reading, and critical thinking. Especially in the wake of the shift to online learning during the COVID-19 pandemic, the Educational Technology ("EdTech") industry has grown from a burgeoning sector into a giant. EdTech includes games, websites, testing platforms, and the programs students and teachers use to write and create projects; in writing and language specifically, examples include Google Docs and Microsoft Word and PowerPoint. These programs rely heavily on the natural language processing algorithms underlying grammar and spell checkers and are used in nearly every educational setting that involves digital writing. One major consequence of these language tools, as predicted and found in the present study, is linguistic discrimination. Because research on racial bias in, and the negative effects of, the language tools and algorithms currently used in educational spaces is limited, the EdTech industry may not be fully apprised of the severity of these consequences or of the urgent need for action and improvements to existing technology.

In the present study, I predicted that educational tools built on language algorithms would promote linguistic discrimination by discounting African-American English (AAE) – a dialect of English used in African-American communities, with complex, systematic differences from Standardized American English (SAE) – or by labeling AAE as incorrect English rather than as another variety. I also hypothesized that this discrimination would produce harmful sociocultural and sociolinguistic effects in the classroom, both for Black students themselves and in other students' attitudes toward them. I combined a review of the literature with an examination of the algorithms in use on writing platforms. In addition, I evaluated conventional grammar and spell checkers, such as those used by Google and Microsoft, on AAE, and recorded and compared their performance on AAE relative to SAE. Drawing on the literature and connecting it to the findings from this evaluation, I examined the sociolinguistic implications of these grammar checkers' performance in educational settings.

I found that language technology in everyday classroom use propagates Ladson-Billings' educational debt and creates linguistic discrimination by furthering notions of the inferiority of AAE; everyday EdTech thus blocks the bridging of gaps in our society. Drawing on this research, I propose applications for the development of new technology and additions to current technologies. Future research should include the prototypical development of sociolinguistically aware grammar and spell checkers and other language algorithms relevant to the educational setting.
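As a concrete illustration of the AAE-versus-SAE comparison described above, the sketch below counts how many issues a grammar checker flags on paired AAE and SAE sentences. It is a minimal sketch only: it assumes the open-source LanguageTool checker (via the language_tool_python package) as a stand-in for the proprietary checkers in Google Docs and Microsoft Word, and the sentence pairs are illustrative examples of well-documented AAE features, not items from the study's evaluation set.

```python
# Minimal sketch: compare how many "errors" a grammar checker flags on
# paired AAE and SAE sentences. Uses the open-source LanguageTool checker
# (language_tool_python) as a stand-in for the proprietary checkers in
# Google Docs and Microsoft Word; the sentence pairs are illustrative,
# not drawn from the study's evaluation data.
import language_tool_python

# Each pair expresses the same meaning in AAE and in SAE.
SENTENCE_PAIRS = [
    ("She been working there for years.",   # AAE: stressed BIN (remote past)
     "She has been working there for years."),
    ("He don't know nothing about it.",     # AAE: negative concord
     "He doesn't know anything about it."),
    ("They was at the store yesterday.",    # AAE: leveled past-tense 'was'
     "They were at the store yesterday."),
]

def flag_count(tool, sentence):
    """Return the number of grammar/spelling issues the checker flags."""
    return len(tool.check(sentence))

def main():
    tool = language_tool_python.LanguageTool("en-US")
    aae_total = sae_total = 0
    for aae, sae in SENTENCE_PAIRS:
        aae_flags = flag_count(tool, aae)
        sae_flags = flag_count(tool, sae)
        aae_total += aae_flags
        sae_total += sae_flags
        print(f"AAE: {aae!r:50} flags={aae_flags}")
        print(f"SAE: {sae!r:50} flags={sae_flags}\n")
    print(f"Total flags -- AAE: {aae_total}, SAE: {sae_total}")
    tool.close()

if __name__ == "__main__":
    main()
```

If the checker were dialect-aware, the AAE and SAE totals would be comparable; a consistently higher flag count on the AAE side is the kind of disparity the study treats as evidence of linguistic discrimination.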
Funder Acknowledgement(s): This study was supported by a grant from the NSF awarded to Dr. Anne Charity Hudley, Associate Dean of the Graduate School of Education, Stanford University, Palo Alto, California.
Faculty Advisor: Dr. Anne Charity Hudley, acharityhudley@stanford.edu
Role: I worked on this project independently under the guidance of Dr. Anne Charity Hudley through my REU program.