The DUB Shorts format focuses on sharing a research paper in a 15- to 20-minute talk, similar to a traditional conference presentation of a paper. Speakers will first present the paper, then participate in Q&A.
DUB Shorts will be conducted using Zoom, via an invitation distributed to the DUB mailing list. Participants who are logged in to Zoom with a UW account will be admitted directly; participants who are not logged in with a UW account will be admitted through a Zoom waiting room.
Computer Science & Engineering
https://homes.cs.washington.edu/~kentrell/
"You Gotta Watch What You Say": Surveillance of Communication with Incarcerated People
Surveillance of communication between incarcerated and non-incarcerated people has steadily increased, enabled partly by technological advancements. Third-party vendors control communication tools for most U.S. prisons and jails and offer surveillance capabilities beyond what individual facilities could realistically implement. Frequent communication with family improves mental health and post-carceral outcomes for incarcerated people, but does discomfort about surveillance affect how their relatives communicate with them? To explore this question, along with participants' understanding of, attitudes toward, and reactions to surveillance, we conducted 16 semi-structured interviews with participants who have incarcerated relatives. Among other findings, we learn that participants communicate despite privacy concerns that they feel helpless to address. We also observe inaccuracies in participants' beliefs about surveillance practices. We discuss the implications of inaccurate understandings of surveillance, misaligned incentives between end users and vendors, how our findings contribute to ongoing conversations about carceral justice, and recommendations for more privacy-sensitive communication tools.
Information School
Domain Experts’ Interpretations of Assessment Bias in a Scaled, Online Computer Science Curriculum
Understanding inequity at scale is necessary for designing equitable online learning experiences, but also difficult. Statistical techniques like differential item functioning (DIF) can help identify whether items/questions in an assessment exhibit potential bias by disadvantaging certain groups (e.g., whether an item disadvantages women versus men of equivalent knowledge). While testing companies typically use DIF to identify items to remove, we explored how domain experts such as curriculum designers could use DIF to understand how to design instructional materials that better serve students from diverse groups. Using Code.org's online Computer Science Discoveries (CSD) curriculum, we analyzed 139,097 responses from 19,617 students to identify DIF by gender and race in assessment items (e.g., multiple-choice questions). Of the 17 items, we identified six that disadvantaged students who reported as female when compared to students who reported as non-binary or male. We also identified that most (13) items disadvantaged AHNP (African/Black, Hispanic/Latinx, Native American/Alaskan Native, Pacific Islander) students compared to WA (white, Asian) students. We then conducted a workshop and interviews with seven curriculum designers and found that they interpreted item bias relative to an intersection of item features and student identity, the broader curriculum, and differing uses for assessments. We interpret these findings in the broader context of using data on assessment bias to inform domain experts' efforts to design more equitable learning experiences.