PlumX Metrics

Generalization-based acquisition of training data for motor primitive learning by neural networks

Applied Sciences (Switzerland), ISSN: 2076-3417, Vol: 11, Issue: 3, Page: 1-17
2021
  • Citations: 11
  • Usage: 0
  • Captures: 6
  • Mentions: 0
  • Social Media: 0

Metrics Details

  • Citations: 11
    • Citation Indexes: 11
  • Captures: 6

Article Description

Autonomous robot learning in unstructured environments often faces the problem that the dimensionality of the search space is too large for practical applications. Dimensionality reduction techniques have been developed to address this problem and describe motor skills in low-dimensional latent spaces. Most of these techniques require the availability of a sufficiently large database of example task executions to compute the latent space. However, the generation of many example task executions on a real robot is tedious and prone to errors and equipment failures. The main result of this paper is a new approach for efficient database gathering: a small number of task executions is performed with a real robot, and statistical generalization, e.g., Gaussian process regression, is applied to generate more data. We have shown in our experiments that the data generated this way can be used for dimensionality reduction with autoencoder neural networks. The resulting latent spaces can be exploited to implement robot learning more efficiently. The proposed approach has been evaluated on the problem of robotic throwing at a target. Simulation and real-world results with the humanoid robot TALOS are provided. They confirm the effectiveness of generalization-based database acquisition and the efficiency of learning in a low-dimensional latent space.
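The core idea of the abstract — performing a few real executions and statistically generalizing them into a larger training database — can be sketched with Gaussian process regression. The following is a minimal, hypothetical illustration (the query/parameter setup and values are assumptions, not the paper's actual throwing data): a handful of executions, each described by a query point (e.g., target distance) and a motion-parameter vector, are interpolated by a GP to synthesize many new parameter vectors.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical stand-in for a few real task executions: each row pairs a
# query (target distance in meters) with a motion-parameter vector
# (in practice these might be DMP weights; here they are synthetic).
queries = np.linspace(0.5, 2.5, 5).reshape(-1, 1)
params = np.column_stack([np.sin(queries[:, 0]),
                          np.cos(queries[:, 0]),
                          queries[:, 0] ** 2])

# Fit one GP over the sparse real data (small alpha = near-noiseless).
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6)
gpr.fit(queries, params)

# Generalize to a dense grid of queries to build a large synthetic database,
# which could then be used to train an autoencoder for dimensionality reduction.
dense_queries = np.linspace(0.5, 2.5, 200).reshape(-1, 1)
generated_params = gpr.predict(dense_queries)
print(generated_params.shape)  # 200 generated executions, 3 parameters each
```

This only illustrates the database-acquisition step; the subsequent autoencoder training and latent-space learning described in the abstract are separate stages.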

Bibliographic Details

Authors: Zvezdan Lončarević; Rok Pahič; Aleš Ude; Andrej Gams

Publisher: MDPI AG

Subject Areas: Materials Science; Physics and Astronomy; Engineering; Chemical Engineering; Computer Science
