Code3: A System for End-to-End Programming of Mobile Manipulator Robots for Novices and Experts
Justin Huang, Maya Cakmak
This paper introduces Code3, a system for user-friendly and rapid programming of mobile manipulator robots. The system enables general programmers with little to no robotics experience to program robots. Code3 consists of three integrated components: perception, manipulation, and high-level programming. The perception component helps users define a library of object and scene parts that the robot can later detect. The manipulation component lets users define actions for manipulating objects or scene parts through programming by demonstration techniques. Finally, the high-level programming component offers a drag-and-drop interface with which users can define logic and control flow to accomplish a task using their previously specified perception and manipulation capabilities. We present findings from a two-session user study with non-roboticist programmers (N=10) that demonstrate their ability to quickly learn Code3 and program a PR2 robot to do useful tasks. We also demonstrate how an expert can use the system to program complex tasks in orders of magnitude less time than it would take to code by hand in traditional robot programming frameworks such as ROS.
Kinesthetic Teaching of Re-usable Skills – Rapid and Simple Programming of a Safe Industrial Robot
Maj Stenmark, Elin Anna Topp, Mathias Haage
The development of robust non-expert programming systems is a long-standing challenge in robotics, now emphasized by recently emerged collaborative industrial robots with a new feature set, such as built-in force-controlled motion, vision, 7-degree-of-freedom arms, and dual arms. These features, together with the fact that the operator is able to stay in close proximity during both the programming and execution phases, call for a revisit of shop-floor programming tools. This paper presents a prototype tool for iconic robot programming with a hybrid programming and execution mode. The tool was evaluated with 21 non-expert users with varying programming and robotics experience, including one nine-year-old. We also present a comparison of programming times for an expert robot programmer using traditional tools versus this new method. The expert could program the same tasks in one fifth of the time required with traditional tools, and the non-experts were able to program and debug a LEGO building task using the robot within 30 minutes.
Situated Tangible Robot Programming
Yasaman Sefidgar, Prerna Agarwal, Maya Cakmak
This paper introduces situated tangible robot programming in which a robot is programmed by placing specially designed tangible “blocks” in its workspace. These blocks are used for annotating objects, locations, or regions, and specifying actions and their ordering. The robot compiles a program by detecting blocks and objects in its workspace and grouping them into instructions by solving constraints. We present a proof-of-concept implementation using blocks with unique visual markers in a pick-and-place task domain. Three user studies evaluate the intuitiveness and learnability of situated tangible programming and iterate the block design. We characterize common challenges and gather feedback on how to further improve the design of blocks. Our studies demonstrate that people can interpret, generalize, and create many different situated tangible programs with minimal instruction or even with no instruction.
Not your cup of tea? How teaching a robot can increase perceived self-efficacy in HRI and technology acceptance
Astrid Rosenthal von der Pütten, Nikolai Bock, Katharina Brockmann
The goal of this work is to explore the influence of do-it-yourself customization of a robot on technologically experienced students' and inexperienced elderly users' perceived self-efficacy in HRI, uncertainty, and technology acceptance. We introduce the Self-Efficacy in HRI Scale and present two experimental studies. In study 1 (students, n=40), we found that actively teaching a robot objects relevant for a subsequent social interaction significantly increases perceived self-efficacy in HRI in comparison to reading a fact sheet about the robot's capabilities. Moreover, interacting with the robot itself, regardless of the previous treatment, increased self-efficacy. In a second study with elderly users (n=60), we could replicate the positive effect of the interaction on self-efficacy, but not the effect of do-it-yourself customization by training the robot. We discuss limitations of the setting and implications for questionnaire design for elderly participants.
Thu, Mar 9