
∞-former: Infinite Memory Transformer

Martins, P. H.; Marinho, Z.; Martins, A.

∞-former: Infinite Memory Transformer, Proc. Annual Meeting of the Association for Computational Linguistics (ACL), Dublin, Ireland, May 2022.


Abstract
Transformers are unable to model long-term memories effectively, since the amount of computation they need to perform grows with the context length. While variations of efficient transformers have been proposed, they all have a finite memory capacity and are forced to drop old information. In this paper, we propose the ∞-former, which extends the vanilla transformer with an unbounded long-term memory. By making use of a continuous-space attention mechanism to attend over the long-term memory, the ∞-former's attention complexity becomes independent of the context length, trading off memory length with precision. In order to control where precision is more important, the ∞-former maintains "sticky memories," being able to model arbitrarily long contexts while keeping the computation budget fixed. Experiments on a synthetic sorting task, language modeling, and document grounded dialogue generation demonstrate the ∞-former's ability to retain information from long sequences.
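The abstract's central idea is that the long-term memory is compressed into a continuous signal built from a fixed number of basis functions, so that attending over it costs the same regardless of how many past tokens it summarizes. The sketch below illustrates that idea only in broad strokes, under assumptions not taken from the paper: Gaussian radial basis functions, a ridge-regression fit, and a single Gaussian query density; all function names (rbf_basis, compress_memory, continuous_attention) and parameter choices are illustrative, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch (not the paper's code): a long sequence of token
# embeddings is compressed into coefficients C of a continuous signal
# X(t) = B(t)^T C over t in [0, 1]; attention then uses a probability
# density over t, so its cost depends on the number of basis functions,
# not on the number of stored tokens.

def rbf_basis(t, num_basis=16, width=0.05):
    """Evaluate num_basis Gaussian radial basis functions at positions t in [0, 1]."""
    centers = np.linspace(0.0, 1.0, num_basis)
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def compress_memory(token_embeddings, num_basis=16, ridge=1e-3):
    """Fit coefficients C so that B(t_i)^T C approximates token i placed at t_i in [0, 1]."""
    L, _ = token_embeddings.shape
    t = np.linspace(0.0, 1.0, L)
    B = rbf_basis(t, num_basis)                      # (L, num_basis)
    # Ridge regression: C = (B^T B + lambda I)^{-1} B^T X, shape (num_basis, d)
    return np.linalg.solve(B.T @ B + ridge * np.eye(num_basis), B.T @ token_embeddings)

def continuous_attention(query_mu, query_sigma, C, num_samples=256):
    """Attend with a Gaussian density N(mu, sigma^2) over t in [0, 1];
    the context vector is the expectation of X(t) under that density."""
    t = np.linspace(0.0, 1.0, num_samples)
    density = np.exp(-0.5 * ((t - query_mu) / query_sigma) ** 2)
    density /= density.sum()                         # discretised density over [0, 1]
    X_t = rbf_basis(t, C.shape[0]) @ C               # reconstructed signal, (num_samples, d)
    return density @ X_t                             # (d,) context vector

# Toy usage: 10,000 "past tokens" summarized by 16 basis coefficients.
rng = np.random.default_rng(0)
memory = rng.normal(size=(10_000, 64))
C = compress_memory(memory)
context = continuous_attention(query_mu=0.8, query_sigma=0.05, C=C)
print(context.shape)  # (64,)
```

In this toy version, doubling the number of stored tokens only changes the one-off fitting step; the attention call itself touches a fixed-size coefficient matrix, which is the trade of memory length for precision described in the abstract.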