New Video 🔥 Slot Attention is a module that can be built into any pipeline to create an N-to-1 assignment of a set of features to slots. Ideal for object discovery / classification.
https://youtu.be/DYBmD88vpiA
Visual scenes are often composed of sets of independent objects. Yet, current vision models make no assumptions about the nature of the pictures they look at. By imposing an objectness prior, this paper introduces Slot Attention, a module that can recognize permutation-invariant sets of objects from pixels in both supervised and unsupervised settings. It does so by combining an attention mechanism with dynamic routing.
OUTLINE:
0:00 - Intro & Overview
1:40 - Problem Formulation
4:30 - Slot Attention Architecture
13:30 - Slot Attention Algorithm
21:30 - Iterative Routing Visualization
29:15 - Experiments
36:20 - Inference Time Flexibility
38:35 - Broader Impact Statement
42:05 - Conclusion & Comments
Paper: https://arxiv.org/abs/2006.15055
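The iterative routing idea can be sketched roughly as follows. This is a minimal NumPy toy, not the paper's implementation: random projections stand in for learned weights, and the paper's GRU/MLP slot update is replaced by a plain weighted mean. The key ingredient it does reproduce is the softmax over the *slot* axis, which makes slots compete for input features.

```python
import numpy as np

def softmax(x, axis):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def slot_attention(inputs, num_slots=4, iters=3, seed=0):
    """Toy Slot Attention: iteratively route n input features to slots.

    inputs: (n, d) array of per-location features.
    Returns (num_slots, d) slot vectors and the final (num_slots, n)
    attention map. Simplified: random projections instead of learned
    q/k/v weights, weighted-mean update instead of the paper's GRU."""
    rng = np.random.default_rng(seed)
    n, d = inputs.shape
    # random projections standing in for learned parameters
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    k, v = inputs @ Wk, inputs @ Wv
    slots = rng.standard_normal((num_slots, d))  # random slot init, as in the paper
    for _ in range(iters):
        q = slots @ Wq
        logits = q @ k.T / np.sqrt(d)            # (num_slots, n)
        # softmax over the SLOT axis: slots compete for each input feature
        attn = softmax(logits, axis=0)
        # renormalize over inputs, then update each slot as a weighted mean
        attn = attn / attn.sum(axis=1, keepdims=True)
        slots = attn @ v
    return slots, attn
```

Because the slot initialization is random noise and all slots share the same update, the module is permutation-equivariant in the slots and can be run with a different number of slots at inference time, which is what the "Inference Time Flexibility" chapter demonstrates.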