Enabling Robots to Understand Indirect Speech Acts in Task-Based Interactions

Gordon Briggs, Tom Williams, Matthias Scheutz

Abstract


An important open problem for enabling truly taskable robots is the lack of task-general natural language mechanisms within cognitive robot architectures that enable robots to understand typical forms of human directives and generate appropriate responses. In this paper, we first provide experimental evidence that humans tend to phrase their directives to robots indirectly, especially in socially conventionalized contexts. We then introduce pragmatic and dialogue-based mechanisms to infer intended meanings from such indirect speech acts and demonstrate that these mechanisms can handle all indirect speech acts found in our experiment as well as other common forms of requests.
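The abstract describes pragmatic mechanisms that map conventionalized indirect phrasings (e.g., "Could you bring me the mug?") onto the directives they actually convey. A minimal, hypothetical sketch of such rule-based indirect speech act interpretation might look as follows; the rule forms, utterances, and context flag are illustrative assumptions, not the authors' implementation:

```python
import re

# Each pragmatic rule pairs a conventionalized surface form with the
# speech act it conventionally conveys (hypothetical rules for illustration).
PRAGMATIC_RULES = [
    # Ability/willingness question used as a request: "Can/Could you X?"
    (re.compile(r"^(?:can|could|would|will) you (.+?)\??$", re.I), "directive"),
    # Statement of desire used as a request: "I need/want X."
    (re.compile(r"^i (?:need|want) (.+?)\.?$", re.I), "directive"),
]

def interpret(utterance: str, context: dict) -> tuple[str, str]:
    """Return (speech_act_type, content); fall back to the literal reading."""
    for pattern, act in PRAGMATIC_RULES:
        m = pattern.match(utterance.strip())
        # Reinterpret as a directive only when the context licenses it
        # (a crude stand-in for richer contextual reasoning).
        if m and context.get("addressee_is_agent", False):
            return act, m.group(1)
    return "literal", utterance

print(interpret("Could you bring me the mug?", {"addressee_is_agent": True}))
```

In a fuller system, the context check would consult dialogue state and social conventions rather than a single flag, and unmatched utterances would go to a general semantic parser instead of a literal fallback.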


Keywords


Human-robot dialogue, human perceptions of robot communication, robot architectures, speech act theory, intention understanding



DOI: https://doi.org/10.5898/JHRI.6.1.Briggs





This work is licensed under a Creative Commons Attribution 3.0 License.