Associative Memory Augmented Asynchronous Spatiotemporal Representation Learning for Event-based Perception


Uday Kamal, Saurabh Dash, Saibal Mukhopadhyay


We propose EventFormer, a computationally efficient event-based representation learning framework for asynchronously processing event camera data. EventFormer treats sparse input events as a spatially unordered set and models their spatial interactions using a self-attention mechanism. An associative-memory-augmented recurrent module correlates each incoming event with the stored representation computed from past events. A memory addressing mechanism is proposed to store and retrieve the latent states only where these events occur and to update them only when they occur. Shifting representation learning from the input space to the latent memory space reduces the computation cost of processing each event. We show that EventFormer achieves 0.5% and 9% better accuracy with 30000× and 200× less computation compared to the state-of-the-art dense and event-based methods, respectively, on event-based object recognition datasets.
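The core idea of the memory addressing mechanism can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the single-head attention, the pixel-aligned memory grid, and the fixed blend factor `alpha` (standing in for a learned recurrent update) are all simplifying assumptions made for illustration. The point it shows is that latent states are read and written only at coordinates where events occur, so per-event cost does not scale with the full input space.

```python
import numpy as np

def self_attention(X):
    # Single-head scaled dot-product self-attention over the event set,
    # modeling spatial interactions among the (unordered) input events.
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=-1, keepdims=True)
    return A @ X

H, W, D = 4, 4, 8
memory = np.zeros((H, W, D))  # latent-state grid, one slot per spatial location
rng = np.random.default_rng(0)

# A small batch of events: (x, y) coordinates plus hypothetical feature embeddings.
coords = np.array([[0, 1], [2, 3], [0, 1]])
feats = rng.standard_normal((3, D))

attended = self_attention(feats)  # correlate events with each other
alpha = 0.5                       # fixed blend in place of a learned recurrent gate
for (x, y), f in zip(coords, attended):
    # Read and update the latent state ONLY where an event occurred.
    memory[y, x] = (1 - alpha) * memory[y, x] + alpha * f

# Locations that received no events keep their previous (here, zero) state.
print(np.count_nonzero(memory.any(axis=-1)))  # number of updated slots
```

Only the two distinct event coordinates are touched; the remaining 14 memory slots are never read or written, which is the source of the claimed computational savings over dense, frame-based processing.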