Augmented Reality technology combined with an Artificial Intelligence assistant holds the promise of providing assistance at the right time, in the right place, and with the right information. The vision of always-on, AI-assisted AR that can be used continuously for an entire day depends on solving many difficult problems, including display technology, computing power, batteries, localization, tracking, and contextual sensing, in addition to delivering the multimodal AI models that power the inference and assistance. However, to deliver truly useful all-day mobile AI+AR experiences, we must also solve the fundamental problem of how to interact with this technology effectively, beyond simply speaking to it. Solving the interaction problem requires that we invent novel sensing and haptic technologies for all-day wearable devices, as well as leverage ever more powerful contextual AI understanding, to yield truly frictionless and expressive experiences. In this talk, I will cover recent advances in this research area from Meta Reality Labs Research.
11:45am - 12:15pm: Food and community socializing.
12:15pm - 1:15pm: Presentation with Q&A. Available hybrid via Zoom.
1:30pm - 2:15pm: Student meeting with speaker, held in CSE2 371. Students will walk there from the seminar.
Dr. Hrvoje Benko is a Director of Research Science at Meta Reality Labs Research, where he is developing novel interactions, devices, and interfaces for Contextualized AI, Augmented Reality, and Virtual Reality. He leads efforts to invent novel wearable devices that enable people to use their gestures, gaze, and voice to express themselves while harnessing a contextualized understanding of their environment for better interactions. He currently leads a multi-disciplinary organization of scientists and engineers with expertise in human-computer interaction, computer vision, machine learning, AI, design, neuroscience, and cognitive psychology.
He is an expert in the field of Human-Computer Interaction (HCI), where he has coauthored more than 80 scientific articles and holds more than 70 issued patents. His research has received 13 best paper awards or honorable mentions at top HCI conferences, and he received the 2022 ACM UIST Lasting Impact Award for his co-authored work "OmniTouch: Wearable Multitouch Interaction Everywhere". He has been active in the organization of the ACM Symposium on User Interface Software and Technology (UIST), the premier technical conference in HCI, serving as program chair in 2012 and as general chair in 2014. He sits on the editorial board of the TOCHI Journal, the premier journal in the HCI field.
He also holds an Affiliate Full Professor position at the University of Washington Information School. Prior to his current role at Meta, he was a Principal Researcher at Microsoft Research, where he worked on novel haptic handheld devices, multi-touch interactions, large-scale projection-mapping environments, and novel AR/VR interactions and technologies. He received his Ph.D. in Computer Science from Columbia University in 2007 for research investigating mobile augmented reality and multi-touch interactive technologies. In 2023, he was inducted into the SIGCHI Academy for his research contributions to the field of Human-Computer Interaction.