Type of publication: research article
Type of publication (PDB): Article in Clarivate Analytics Web of Science (S1)
Field of Science: Informatics (N009)
Author(s): Woergoetter, Florentin;Tamošiūnaitė, Minija
Title: Helping a robot to understand human actions and objects: a grammatical view
Is part of: Artificial life and robotics. New York: Springer, 2020, Vol. 25, iss. 3
Extent: p. 388-392
Date: 2020
Keywords: Action;Human observation;Object;Semantics;Syntax
Abstract: Humans are able to perform a wide variety of complex actions manipulating a very large number of objects. We can make predictions about the outcomes of our actions and about how to use different objects. Hence, we have excellent action and object understanding. Artificial agents, on the other hand, still fail miserably in this respect. It is particularly puzzling how inexperienced, young humans can acquire such knowledge, bootstrapped by exploration and extended by supervision. In this study, we have therefore addressed the question of how to structure the realm of actions and objects into dynamic representations that allow for the easy learning of different action and object concepts. Performing different manipulation actions on a tabletop (e.g. the actions involved in “making a breakfast”), we show with our robots that this indeed leads to a kind of implicit (unreflected) understanding of action and object concepts, allowing the agent to generalize actions and redefine object uses according to need.
Appears in Collections: University Research Publications
