Language-Conditioned Goal Generation: a New Approach to Language Grounding in RL

Preprint, working paper. Year: 2021

Abstract

In the real world, linguistic agents are also embodied agents: they perceive and act in the physical world. The notion of Language Grounding questions the interactions between language and embodiment: how do learning agents connect, or ground, linguistic representations to the physical world? This question has recently been approached by the Reinforcement Learning community under the framework of instruction-following agents, in which behavioral policies or reward functions are conditioned on the embedding of an instruction expressed in natural language. This paper proposes another approach: using language to condition goal generators. Given any goal-conditioned policy, one can train a language-conditioned goal generator to produce language-agnostic goals for the agent. This method decouples sensorimotor learning from language acquisition and enables agents to demonstrate a diversity of behaviors for any given instruction. We propose a particular instantiation of this approach and demonstrate its benefits.
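The decoupling described in the abstract can be sketched in a few lines. The snippet below is an illustrative sketch, not the authors' implementation: the module names (LanguageConditionedGoalGenerator, GoalConditionedPolicy), the Gaussian goal generator, the dimensions, and the use of PyTorch are assumptions made for illustration only; the paper's actual instantiation may differ.

    import torch
    import torch.nn as nn

    class LanguageConditionedGoalGenerator(nn.Module):
        # Maps an instruction embedding to a distribution over goals in the agent's
        # language-agnostic goal space; sampling yields diverse goals per instruction.
        def __init__(self, instr_dim, goal_dim, hidden=128):
            super().__init__()
            self.trunk = nn.Sequential(nn.Linear(instr_dim, hidden), nn.ReLU())
            self.mu = nn.Linear(hidden, goal_dim)       # mean of the goal distribution
            self.log_std = nn.Linear(hidden, goal_dim)  # log std: one instruction, many goals

        def sample(self, instr_embedding):
            h = self.trunk(instr_embedding)
            mu, std = self.mu(h), self.log_std(h).exp()
            return mu + std * torch.randn_like(std)     # reparameterized goal sample

    class GoalConditionedPolicy(nn.Module):
        # Standard goal-conditioned policy pi(a | s, g), trained without any language input.
        def __init__(self, obs_dim, goal_dim, act_dim, hidden=128):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(obs_dim + goal_dim, hidden), nn.ReLU(), nn.Linear(hidden, act_dim))

        def forward(self, obs, goal):
            return self.net(torch.cat([obs, goal], dim=-1))

    # Language only selects the goal; sensorimotor control remains language-agnostic.
    generator = LanguageConditionedGoalGenerator(instr_dim=32, goal_dim=8)
    policy = GoalConditionedPolicy(obs_dim=16, goal_dim=8, act_dim=4)
    instruction = torch.randn(1, 32)           # placeholder instruction embedding
    goal = generator.sample(instruction)       # repeated samples can yield different valid goals
    action = policy(torch.randn(1, 16), goal)  # the policy never sees the instruction itself

Because the policy only ever consumes goals, sensorimotor learning can proceed (or be reused) independently of language acquisition, and sampling several goals for the same instruction naturally produces a diversity of behaviors.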
Main file: 2006.07043.pdf (1013.19 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03099887, version 1 (06-01-2021)

Identifiers

  • HAL Id: hal-03099887, version 1

Cite

Cédric Colas, Ahmed Akakzia, Pierre-Yves Oudeyer, Mohamed Chetouani, Olivier Sigaud. Language-Conditioned Goal Generation: a New Approach to Language Grounding in RL. 2021. ⟨hal-03099887⟩
