Learning Efficient Multi-agent Communication: An Information Bottleneck Approach - 2020


Research Area:  Machine Learning

Abstract:

We consider the problem of limited-bandwidth communication in multi-agent reinforcement learning, where agents cooperate with the assistance of a communication protocol and a scheduler. The protocol and scheduler jointly determine which agent communicates what message, and to whom. Under the limited-bandwidth constraint, the communication protocol must generate informative messages; at the same time, unnecessary communication connections should not be established, because they occupy limited resources in vain. In this paper, we develop an Informative Multi-Agent Communication (IMAC) method that learns efficient communication protocols as well as scheduling. First, from the perspective of communication theory, we prove that the limited-bandwidth constraint requires low-entropy messages throughout the transmission. Then, inspired by the information bottleneck principle, we learn a valuable and compact communication protocol together with a weight-based scheduler. To demonstrate the efficiency of our method, we conduct extensive experiments on various cooperative and competitive multi-agent tasks with different numbers of agents and different bandwidths. We show that IMAC converges faster and yields more efficient communication among agents under limited bandwidth than many baseline methods.
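The information bottleneck idea described above is commonly realized with a variational bound: a KL-divergence term between the learned message distribution and a fixed prior penalizes high-entropy messages, keeping them compact under a bandwidth budget. The sketch below illustrates this regularizer for Gaussian messages with a standard-normal prior; it is a minimal, hypothetical illustration of the general technique, not the authors' IMAC implementation, and all function names are assumptions.

```python
import numpy as np

def gaussian_kl_to_standard_normal(mu, log_var):
    """KL( N(mu, sigma^2) || N(0, I) ), summed over message dimensions.

    In variational information-bottleneck training, this term upper-bounds
    the mutual information between an agent's input and its message,
    encouraging low-entropy (compact) messages under a bandwidth constraint.
    """
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

def sample_message(mu, log_var, rng):
    """Reparameterized sample of a stochastic message: z = mu + sigma * eps."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

# Hypothetical usage: a batch of 2 agents, each emitting a 4-dim message.
rng = np.random.default_rng(0)
mu = np.zeros((2, 4))       # message means predicted by an encoder network
log_var = np.zeros((2, 4))  # message log-variances from the same encoder
kl_penalty = gaussian_kl_to_standard_normal(mu, log_var)  # 0 when q == prior
message = sample_message(mu, log_var, rng)
```

In a full training loop, `kl_penalty` would be added (scaled by a trade-off coefficient) to the reinforcement-learning loss, so that messages stay informative for the task while remaining low-entropy.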

Keywords:  
Multi-agent Communication
Information Bottleneck
Multi-agent Reinforcement Learning
Deep Learning

Author(s) Name:  Rundong Wang, Xu He, Runsheng Yu, Wei Qiu, Bo An, Zinovi Rabinovich

Journal name:  

Conference name:  Proceedings of the 37th International Conference on Machine Learning

Publisher name:  PMLR (Proceedings of Machine Learning Research)

DOI:  

Volume Information: