My research examines the social structures that help shape the forms and content of data science, as well as the use of data science in social spheres and its implications. In my ethnographic and participatory design-based work, I have explored: 1) the motivations behind the move to use data science in online fraud detection, and the potential for embedding structural discrimination that could arise from inattention to diverse cultural norms and disparate regional internet activities; 2) how teachers in online and hybrid school environments navigate data-driven decision making, using the products of data science to understand and engage students in otherwise opaque situations while also applying their own professional judgment to complement algorithmic outputs; and 3) how youth envision the role of data science and other advanced computing methods and technologies in future school environments, and what this can tell us about current technology policies and designs.
In a forthcoming paper at ACM CHI, my co-authors and I examine how educators and policymakers increasingly try to control youth access to technology in the classroom while simultaneously deploying technology for surveillance and behavioral control. While many scholars have explored the implications of the intensifying dataveillance and disciplinary practices deployed by teachers in K-12 schooling, few have investigated how students’ visions of technology deployment and use might align with or diverge from those of designers and teachers. Drawing on data from participatory design workshops and ethnographic research with students and staff in alternative hybrid schools, we explore students’ concepts of future classroom technologies and how these artifacts reflect student perceptions of safety and good behavior. Rather than simply accepting or resisting the role of technology in discipline and punishment as presented by technology creators, in which teachers make disciplinary decisions using technology, students actively rework these narratives, envisioning tools that would increase the objectivity and accuracy of punishment. The results of this work show how visions of future technology can sometimes reify new forms of power and at other times respond to students’ unmet needs to exert control in the classroom.
In my dissertation project, “Blank Slate: Freedom, Connection, and Accountability in U.S. Virtual Schools,” I draw on interviews and observations with teachers, parents, students, and industry professionals across the U.S. to show that virtual schools are fueled by existing social inequality, precarity, and the destruction of public infrastructure, and that they contribute to shifting responsibilities from the public and private sectors onto individuals and families. However, this shift is often welcomed, or at least tolerated, by participants in exchange for the promise of accommodating the diverse conditions under which students and teachers work and of transforming their varied labors into commensurable data without requiring adherence to rigid norms of behavior. I explore how this process unfolds in a constant tangle between precarious webs of care, developed through intensive relational labor, and coercive efforts to produce quantifiable metrics legible to the state. While digitization often increases the sense of distance and opacity between teachers and students, interfering with teachers’ core ability to practice attunement, teachers develop strategies to recuperate visibility and practice caring responsiveness. Affective and relational labor undergirds the functioning of these virtual school systems, producing the data needed to meet accountability standards without standardization and emotionally sustaining teachers, students, and parents through difficult times. Yet this labor also gives workers leverage against attempts at automation and “de-skilling,” leaving both companies and cash-strapped districts with reasons to diminish or replace it in pursuit of higher profits and unchecked authority. These processes threaten to reinstate the very harms that participants sought to escape by turning to virtual schools.
I was an English major in college, focusing on gender and sexuality studies and media studies, but I wanted to move beyond analyzing media artifacts and narratives to actually observing and talking to people. After working in the grassroots nonprofit open-source technology space as an AmeriCorps VISTA, I became further invested in exploring the development and uses of digital technologies and the burgeoning role of automation and data science. While working at the Barnard Center for Research on Women, I took classes at Teachers College and served on a committee on online and on-campus learning that considered best practices for incorporating new digital methods into high-quality, hands-on pedagogical techniques. I joined the UC Berkeley School of Information for graduate school to be part of an interdisciplinary community working at the cutting edge of weaving together Computer Science, Law & Policy, Sociology, and more. There, I participated in the Algorithmic Fairness and Opacity Working Group and attended the first FAccT conference, engaging with the developing interest in the ethics of data science and in the threats and possibilities for social justice posed by the staggering ubiquity of data science throughout our daily lives. After receiving my PhD during the COVID-19 pandemic restrictions, I became a Computing Innovation Fellow at Michigan State University, where my postdoctoral research centers LGBTQ+ youth through participatory design methods.