Joint Intrinsic Motivation for Coordinated Exploration in Multi-Agent Deep Reinforcement Learning

M Toquebiau, N Bredeche, F Benamar, JY Jun
arXiv preprint arXiv:2402.03972, 2024, arxiv.org
Multi-agent deep reinforcement learning (MADRL) problems often encounter the challenge of sparse rewards. This challenge becomes even more pronounced when coordination among agents is necessary. As performance depends not only on one agent's behavior but rather on the joint behavior of multiple agents, finding an adequate solution becomes significantly harder. In this context, a group of agents can benefit from actively exploring different joint strategies in order to determine the most efficient one. In this paper, we propose an approach for rewarding strategies where agents collectively exhibit novel behaviors. We present JIM (Joint Intrinsic Motivation), a multi-agent intrinsic motivation method that follows the centralized learning with decentralized execution paradigm. JIM rewards joint trajectories based on a centralized measure of novelty designed to function in continuous environments. We demonstrate the strengths of this approach both in a synthetic environment designed to reveal shortcomings of state-of-the-art MADRL methods, and in simulated robotic tasks. Results show that joint exploration is crucial for solving tasks where the optimal strategy requires a high level of coordination.
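The core idea described above, rewarding jointly novel behavior through a single centralized novelty signal, can be sketched as follows. This is an illustrative stand-in only: the paper's actual novelty measure is learned and designed for continuous state spaces, whereas the sketch below uses a simple nearest-neighbor distance over stored joint observations, and the class name and `beta` coefficient are assumptions, not the authors' API.

```python
import numpy as np

class CentralizedNoveltyBonus:
    """Hypothetical sketch of a centralized intrinsic-motivation bonus.

    Novelty of a *joint* observation (all agents' observations
    concatenated) is approximated by its distance to the nearest
    previously seen joint observation. JIM itself uses a learned
    novelty measure; this k-NN stand-in only illustrates the idea
    of rewarding states that are novel for the group as a whole.
    """

    def __init__(self, k=1):
        self.k = k
        self.memory = []  # previously seen joint observations

    def bonus(self, joint_obs):
        joint_obs = np.asarray(joint_obs, dtype=float)
        if len(self.memory) < self.k:
            novelty = 1.0  # nothing to compare against: treat as novel
        else:
            dists = sorted(np.linalg.norm(m - joint_obs) for m in self.memory)
            novelty = float(np.mean(dists[: self.k]))
        self.memory.append(joint_obs)
        return novelty

# Joint observation = concatenation of both agents' 2-D observations.
nov = CentralizedNoveltyBonus(k=1)
r1 = nov.bonus([0.0, 0.0, 0.0, 0.0])  # first visit: novel
r2 = nov.bonus([0.0, 0.0, 0.0, 0.0])  # revisit: zero novelty
r3 = nov.bonus([1.0, 1.0, 1.0, 1.0])  # new joint state: novel again

# During training, the bonus would be mixed into the team reward,
# e.g. total_reward = extrinsic_reward + beta * novelty (beta assumed).
```

Because the novelty is computed on the joint observation, a state is only rewarded when the *combination* of agent observations is new, which is what drives exploration of coordinated joint strategies rather than independent per-agent novelty.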