Simplified Priors for Object-Centric Learning - 2024


Research Paper on Simplified Priors for Object-Centric Learning

Research Area:  Machine Learning

Abstract:

Humans excel at abstracting data and constructing "reusable" concepts, a capability lacking in current continual learning systems. The field of object-centric learning addresses this by developing abstract representations, or slots, from data without human supervision. Various methods have been proposed to tackle this task for images, but most are overly complex, non-differentiable, or poorly scalable. In this paper, we introduce a conceptually simple, fully differentiable, non-iterative, and scalable method called SAMP (Simplified Slot Attention with Max Pool Priors). It is implementable using only Convolution, MaxPool, and Attention layers. Our method encodes the input image with a Convolutional Neural Network and then uses a branch of alternating Convolution and MaxPool layers to create specialized sub-networks and extract primitive slots. These primitive slots are then used as queries for a Simplified Slot Attention over the encoded image. Despite its simplicity, our method is competitive with or outperforms previous methods on standard benchmarks.
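The pipeline described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the 1x1 linear maps standing in for convolutions, the number of pooling stages, and the single-pass attention normalization are all assumptions made for clarity; the key structural ideas retained are (a) primitive slots obtained from alternating conv/MaxPool stages and (b) a non-iterative cross-attention in which slots compete for image features.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def maxpool2x2(x):
    # (H, W, C) -> (H//2, W//2, C), taking the max over 2x2 windows
    H, W, C = x.shape
    return x.reshape(H // 2, 2, W // 2, 2, C).max(axis=(1, 3))

rng = np.random.default_rng(0)
D = 16                                       # feature dimension (assumed)
feat = rng.standard_normal((8, 8, D))        # stand-in for the CNN-encoded image

# Branch of alternating "conv" and MaxPool stages: spatial resolution shrinks
# 8x8 -> 4x4 -> 2x2, and each remaining cell becomes one primitive slot.
x = feat
for _ in range(2):
    W_conv = rng.standard_normal((D, D)) / np.sqrt(D)  # 1x1 conv stand-in
    x = np.maximum(x @ W_conv, 0.0)                    # ReLU
    x = maxpool2x2(x)
slots = x.reshape(-1, D)                     # 4 primitive slots

# Simplified (non-iterative) slot attention: primitive slots act as queries
# over the encoded image; softmax over the slot axis makes slots compete
# for spatial positions, then each slot takes a weighted mean of features.
keys = vals = feat.reshape(-1, D)                     # 64 spatial positions
attn = softmax(slots @ keys.T / np.sqrt(D), axis=0)   # competition over slots
attn = attn / attn.sum(axis=1, keepdims=True)         # weighted-mean weights
refined = attn @ vals                                 # refined slot vectors
print(refined.shape)                                  # → (4, 16)
```

Because every step is a plain tensor operation (no iterative refinement loop, no discrete assignment), the whole sketch is differentiable end to end, which mirrors the paper's stated design goal.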

Keywords:  

Author(s) Name:  Vihang Patil, Andreas Radler, Daniel Klotz, Sepp Hochreiter

Journal name:  Computer Vision and Pattern Recognition

Conference name:  

Publisher name:  arXiv

DOI:  10.48550/arXiv.2410.00728

Volume Information:  Volume 31 (2024)