Social Robots: From Research to Commercialization
Dr. Cynthia Breazeal: MIT Media Lab, Jibo Inc., USA
The fields of Social Robotics and Human-Robot Interaction are undergoing rapid growth, motivated by important societal challenges in areas such as aging in place, healthcare, education, manufacturing, and transportation. Such applications motivate the development of ever more intelligent and capable autonomous robots and technologies that can work collaboratively with people in human environments. In the consumer marketplace, intelligent conversational technologies are entering the home at a surprising rate as Internet of Things devices are enabled by AI-based SDKs. Affordable social robots are also poised to enter the market as mass consumer products and 3rd party developer platforms.
The dual importance of developing such technologies and understanding their longitudinal impact has never been more relevant. This keynote presentation highlights a number of provocative research findings from the Personal Robots Group at the MIT Media Lab. We develop social robots and apply them as a new kind of scientific tool to understand human behavior. We then use these insights to design and develop social robots that can longitudinally engage people to enhance quality of life outcomes. In this presentation, we highlight relevant work in healthcare and early childhood education.
Research in social robotics builds a critical foundation for real world applications. The entrepreneurial journey of bringing a social robot consumer product to market in the USA will also be discussed in the context of Jibo, Inc.
Dr. Cynthia Breazeal is an Associate Professor of Media Arts and Sciences at the Massachusetts Institute of Technology, where she founded and directs the Personal Robots Group at the Media Lab. She is also founder and Chief Scientist of Jibo, Inc. She is a pioneer of Social Robotics and Human-Robot Interaction. Her recent work investigates the impact of social robots on helping people of all ages to achieve personal goals that contribute to quality of life in domains such as education, health, wellbeing, and empathy and engagement despite distance. Her work balances technical innovation and design with understanding the human psychology of engaging with social robots to maximize beneficial outcomes. She has authored the book Designing Sociable Robots and has published over 150 peer-reviewed articles. She is an award-winning innovator, designer, and entrepreneur. She did her graduate work at the MIT Artificial Intelligence Lab, and received her doctorate in 2000 in Electrical Engineering and Computer Science from the Massachusetts Institute of Technology.
Tue, Mar 7
Wed, Mar 8
Thu, Mar 9
Acting, Interacting, Collaborative Robots
Dr Danica Kragic, Royal Institute of Technology (KTH), Centre for Autonomous Systems, Stockholm, Sweden, email@example.com
The current trend in computer vision is the development of data-driven approaches, where the use of large amounts of data tries to compensate for the complexity of the world captured by cameras. Are these approaches also viable solutions in robotics? Apart from ‘seeing’, a robot is capable of acting, and can thus purposively change what and how it sees the world around it. There is a need for an interplay between processes such as attention, segmentation, object detection, recognition and categorization in order to interact with the environment. In addition, the parameterization of these processes is inevitably guided by the task or the goal a robot is supposed to achieve. In this talk, I will present the current state of the art in the area of robot vision and discuss open problems in the area. I will also show how visual input can be integrated with proprioception, tactile and force-torque feedback in order to plan, guide and assess a robot’s action and interaction with the environment.
Interaction between two agents builds on the ability to engage in mutual prediction and signaling. Thus, human-robot interaction requires a system that can interpret and make use of human signaling strategies in a social context. Our work in this area focuses on developing a framework for human motion prediction in the context of joint action in HRI. We base this framework on the idea that social interaction is highly influenced by sensorimotor contingencies (SMCs). Instead of constructing explicit cognitive models, we rely on the interaction between actions and the perceptual change that they induce in both the human and the robot. This approach allows us to employ a single model for motion prediction and goal inference and to seamlessly integrate the human actions into the environment and task context.
We employ a deep generative model that makes inferences over future human motion trajectories given the intention of the human, the history, and the task setting of the interaction. With the help of predictions drawn from the model, we can determine the most likely future motion trajectory and make inferences over intentions and objects of interest.
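The inference pattern described above — sampling candidate future trajectories conditioned on each possible goal and scoring them against observed motion — can be illustrated with a toy sketch. This is not the authors' model: the goal positions, the linear drift-toward-goal sampler (a stand-in for a learned deep generative model), and the Monte Carlo scoring are all simplifying assumptions for illustration only.

```python
import math
import random

random.seed(0)

# Hypothetical goal positions the human might be reaching for (assumed for illustration).
GOALS = {"cup": (0.6, 0.2), "phone": (0.1, 0.8)}

def sample_trajectory(start, goal, horizon, noise=0.02):
    """One noisy future trajectory drifting from `start` toward a candidate goal.
    A stand-in for drawing a sample from a learned generative motion model."""
    traj = []
    for t in range(1, horizon + 1):
        a = t / horizon  # fraction of the way to the goal at step t
        x = start[0] + a * (goal[0] - start[0]) + random.gauss(0, noise)
        y = start[1] + a * (goal[1] - start[1]) + random.gauss(0, noise)
        traj.append((x, y))
    return traj

def infer_goal(start, observed, n_samples=50):
    """Score each candidate goal by how closely its sampled trajectories match
    the observed motion; return the best-explaining goal (Monte Carlo inference)."""
    scores = {}
    for name, goal in GOALS.items():
        best = float("inf")
        for _ in range(n_samples):
            traj = sample_trajectory(start, goal, len(observed))
            err = sum(math.dist(p, q) for p, q in zip(traj, observed)) / len(observed)
            best = min(best, err)
        scores[name] = best
    return min(scores, key=scores.get), scores

# The human starts near the origin and moves straight toward the cup.
start = (0.05, 0.02)
observed = sample_trajectory(start, GOALS["cup"], horizon=5, noise=0.0)
goal, scores = infer_goal(start, observed)
print(goal)  # "cup": its predicted trajectories best explain the observed motion
```

The same sampled trajectories serve both purposes mentioned in the abstract: the lowest-error sample is a prediction of the future motion, and the goal whose samples score best is the inferred intention.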
Danica Kragic is a Professor at the School of Computer Science and Communication at the Royal Institute of Technology, KTH. She received her MSc in Mechanical Engineering from the Technical University of Rijeka, Croatia in 1995 and her PhD in Computer Science from KTH in 2001. She has been a visiting researcher at Columbia University, Johns Hopkins University and INRIA Rennes. She is the Director of the Centre for Autonomous Systems. Danica received the 2007 IEEE Robotics and Automation Society Early Academic Career Award. She is a member of the Royal Swedish Academy of Sciences, the Royal Swedish Academy of Engineering Sciences and the Young Academy of Sweden. She holds an Honorary Doctorate from the Lappeenranta University of Technology. She chaired the IEEE RAS Technical Committee on Computer and Robot Vision and served as an IEEE RAS AdCom member. Her research is in the area of robotics, computer vision and machine learning. In 2012, she received an ERC Starting Grant. Her research is supported by the EU, the Knut and Alice Wallenberg Foundation, the Swedish Foundation for Strategic Research and the Swedish Research Council. She is an IEEE Fellow.
Of Space and Smell: The Strange Evolution of the Human Nose
Dr. Lucia Jacobs, University of California, Berkeley, Berkeley, California
We humans are the Cyrano de Bergeracs of the primate world, with conspicuously large external noses compared to other great apes. To understand why our nose evolved, we must first understand the function of olfaction. To do this requires traveling back in time to the evolution of the first brain. I will describe the hypothesis that the sense of smell evolved as a sense of direction, playing a critical role in navigation, and that this function explains why olfactory systems are so plastic and variable in size across animals. The navigation function of olfaction in humans has been largely neglected, and I will describe studies showing that humans can orient accurately to odors in real-world and virtual reality environments. But why such a nose? To understand this, I return to an evolutionary framework to describe how the first nose, a structure used both for respiration and olfaction, appeared in air-breathing fish. I describe a new hypothesis (PROUST: perceiving and representing odor utility in space and time) to explain how the evolution of air-breathing could have forced vertebrates to segregate the olfactory mapping of space to the main olfactory system and the mapping of odors across time to the newly evolved second (vomeronasal) olfactory system. This dichotomy of function, and the subsequent conflict between mapping time versus space using odors, could have led to a number of novel vertebrate solutions to sampling odors, including the forked tongue of the snake and the human external nose. I will end by proposing that this perspective on the evolution and function of human olfaction could enhance current paradigms in human-robot communication and decision making.
Professor Jacobs heads the Cognitive Biology Lab in the Department of Psychology and the Helen Wills Institute of Neuroscience at the University of California, Berkeley. After training in animal behavior (1978 B.S., Cornell), ecology and evolutionary biology (1987 Ph.D., Princeton) and postdoctoral fellowships in neuroscience (Universities of Toronto, Pittsburgh and Utah), she joined the Berkeley faculty in 1993. Her group focuses on the ecology and evolution of navigating choices: how animals make choices about what and where to eat, how to navigate and map new terrains, and how generally to integrate diverse sources of information to make adaptive decisions in uncertain environments. Animal species include humans, search dogs and rodents (domestic and wild). Her theoretical work on navigation focuses on the evolution of limbic structures (hippocampus, olfactory systems) and their integrated role in spatial navigation. She is the recipient of an NSF CAREER award, a Hellman Junior Faculty Award, a Prytanean Faculty Award, a Mary Rennie Epilepsy Award and a 2015 NSF Ideas Lab Collaborative Grant for olfactory navigation. She has published over 50 papers in the fields of animal behavior, animal cognition, behavioral neuroscience and brain and behavioral evolution.