The DUB Shorts format focuses on sharing a research paper in a 15- to 20-minute talk, similar to a traditional conference presentation of a paper. Speakers will first present the paper, then participate in Q&A.
DUB Shorts will be conducted using Zoom, via an invitation distributed to the DUB mailing list. Participants who are logged into Zoom using a UW account will be directly admitted; participants who are not logged into a UW account will be admitted via a Zoom waiting room.
Carnegie Mellon University
http://www.cs.cmu.edu/~minhunl/
A Human-AI Collaborative Approach for Clinical Decision Making on Rehabilitation Assessment
Advances in artificial intelligence (AI) have made it increasingly applicable to supplement experts' decision making in the form of decision support systems for various tasks. For instance, an AI-based system can provide therapists with quantitative analysis of a patient's status to improve the practice of rehabilitation assessment. However, there is limited knowledge of the potential of these systems. In this paper, we present the development and evaluation of an interactive AI-based system that supports collaborative decision making with therapists for rehabilitation assessment. This system automatically identifies salient features of assessment to generate patient-specific analysis for therapists, and is tuned with their feedback. In two evaluations with therapists, we found that our system supports therapists in reaching significantly higher agreement on assessment (0.71 average F1-score) than a traditional system without analysis (0.66 average F1-score, p < 0.05). After tuning with therapists' feedback, our system significantly improves its performance from 0.8377 to 0.9116 average F1-score (p < 0.01). This work discusses the potential of a human-AI collaborative system to support more accurate decision making while human and AI learn from each other's strengths.
Human Centered Design & Engineering
The Effects of User Comments on Science News Engagement
Online sources such as social media have become increasingly important for the proliferation of science news, and past research has shown that reading user-generated comments after an article (i.e., in typical online news and blog formats) can impact the way people perceive the article. However, recent studies have shown that people are likely to read the comments before the article itself, due to the affordances of platforms like Reddit, which display them up front. This leads to questions about how comments can affect people's expectations of an article, and how those expectations impact their interest in reading it at all. This paper presents two experimental studies to better understand how the presentation of comments on Reddit affects people's engagement with science news, testing potential mediators such as the expected difficulty and quality of the article. Study 1 provides experimental evidence that difficult comments can reduce people's interest in reading an associated article. Study 2 is a pre-registered follow-up that uncovers a similarity heuristic: the various qualities of a comment (difficulty, information quality, entertainment value) signal that the article will be of similar quality, ultimately affecting participants' interest in reading it. We conclude by discussing design implications for online science news communities.