Emotional robots and magical objects: What part of our internal experience is readable by a touch-sensitive machine?
UPDATE: This seminar has been postponed out of respect for Strike for Black Lives (#ShutDownStem). Read more here.
(Prof. MacLean's talk will be rescheduled for a later date.)
We communicate emotion through touch. Information about emotion is present in our touch, sometimes deliberate, sometimes involuntary. Watch people fidget; study how your cat responds to your strokes. Notice how intertwined sensation and action are: you touch emotively because of the way the cat moves under your hand. Why does it feel so good to both of you? If a cat can do this to you, shouldn't a machine be able to manage some part of it? Over the last 10+ years, we've been breaking the challenge down into manageable chunks and putting them back together: the movement, specifying the movement, the sensors, making sense of the touch, wrapping it all into an interactive loop. We've learned a lot and gotten help from many directions: emotion psychologists and pediatric therapists, neuroscientists, theatre and voice actors, materials scientists and chemists, and experts in artificial intelligence and machine learning. There's lots more to do.