November 2021
Event Details
Abstract:
In most cases, deep neural networks (DNNs) require a great deal of data. There are approaches, such as zero-shot and few-shot learning, that can produce high-quality DNNs with little or no data. However, these approaches still assume a large source dataset or a large secondary dataset to guide the transfer of knowledge. These assumptions do not hold when our goal is to model individual humans, who tend to produce much less data. In this talk we present a novel transfer learning method for producing a DNN that models the behaviour of a specific individual on an unseen target task, by leveraging a small dataset produced by that same individual on a secondary task. We make use of a specialized transfer learning representation and Monte Carlo Tree Search (MCTS). We demonstrate that our approach outperforms standard transfer learning approaches and other optimization methods on two human modeling domains: financial health and video game design.
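The abstract describes the approach only at a high level. As a rough illustration of how MCTS can drive a search over transfer decisions, the minimal Python sketch below searches over per-layer "keep vs. reinitialize" choices for a hypothetical four-layer network. The layer count, action set, and the placeholder evaluate() function are assumptions for illustration only and are not the method presented in the talk.

```python
import math
import random

# Illustrative sketch only (not the speaker's method): MCTS over per-layer
# transfer decisions ("keep" a source-network layer vs. "reinit" it) for a
# hypothetical 4-layer network. evaluate() is a stand-in for training and
# validating a candidate DNN on the individual's small dataset.

LAYERS = 4
ACTIONS = ("keep", "reinit")


def evaluate(decisions):
    # Placeholder score; in practice this would train the candidate network
    # on the individual's secondary-task data and return target-task
    # validation performance.
    rng = random.Random(decisions)
    return sum(1.0 if d == "keep" else 0.3 for d in decisions) / LAYERS + rng.uniform(-0.1, 0.1)


class Node:
    def __init__(self, decisions=()):
        self.decisions = decisions  # transfer choices made so far
        self.children = {}          # action -> Node
        self.visits = 0
        self.value = 0.0

    def is_terminal(self):
        return len(self.decisions) == LAYERS

    def ucb_child(self, c=1.4):
        # Upper Confidence Bound selection among expanded children.
        return max(
            self.children.values(),
            key=lambda n: n.value / (n.visits + 1e-9)
            + c * math.sqrt(math.log(self.visits + 1) / (n.visits + 1e-9)),
        )


def rollout(decisions):
    # Complete the remaining layer choices at random, then score.
    while len(decisions) < LAYERS:
        decisions = decisions + (random.choice(ACTIONS),)
    return evaluate(decisions)


def mcts(iterations=500):
    root = Node()
    for _ in range(iterations):
        node, path = root, [root]
        # Selection: descend while the node is fully expanded.
        while not node.is_terminal() and len(node.children) == len(ACTIONS):
            node = node.ucb_child()
            path.append(node)
        # Expansion: add one untried child.
        if not node.is_terminal():
            action = random.choice([a for a in ACTIONS if a not in node.children])
            node.children[action] = Node(node.decisions + (action,))
            node = node.children[action]
            path.append(node)
        # Simulation and backpropagation.
        reward = rollout(node.decisions)
        for n in path:
            n.visits += 1
            n.value += reward
    # Read out the most-visited path as the chosen transfer configuration.
    best, node = [], root
    while node.children:
        action, node = max(node.children.items(), key=lambda kv: kv[1].visits)
        best.append(action)
    return best


if __name__ == "__main__":
    print("Best per-layer transfer decisions:", mcts())
```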
What You’ll Learn:
About a novel approach for modelling users
Speaker Bio:
Matthew Guzdial is an Assistant Professor in the Computing Science department of the University of Alberta and a Canada CIFAR AI Chair at the Alberta Machine Intelligence Institute (Amii). He is a recipient of an Early Career Researcher Award from NSERC, a Unity Graduate Fellowship, and two best conference paper awards. His work has been featured in the BBC, WIRED, Popular Science, and Time.
Time
(Thursday) 10:55 AM - 11:40 AM