ICML 2024 - Memory Efficient Neural Processes via Constant Memory Attention Block
Author: Borealis AI
Uploaded: 2024-07-19
Views: 117
Description:
Memory Efficient Neural Processes via Constant Memory Attention Block, L. Feng, F. Tung, H. Hajimirsadeghi, Y. Bengio, M. O. Ahmed, International Conference on Machine Learning (ICML), 2024.
This paper proposes Constant Memory Attention Block (CMAB), a novel general-purpose attention block that (1) is permutation invariant, (2) computes its output in constant memory, and (3) performs updates in constant computation. Building on CMAB, we propose Constant Memory Attentive Neural Processes (CMANPs), an NP variant which only requires constant memory. Empirically, we show CMANPs achieve state-of-the-art results on popular NP benchmarks (meta-regression and image completion) while being significantly more memory efficient than prior methods.
A link to the paper can be found here: https://arxiv.org/pdf/2305.14567
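To make the constant-memory idea concrete, below is a minimal sketch (not the authors' exact CMAB implementation, whose details are in the linked paper) of cross-attention from a fixed set of learned latent vectors to a context set, computed chunk by chunk with an online softmax so that peak memory is independent of the number of context points. Because the result is a sum over context elements, it is permutation invariant, and new context points can be folded in by continuing the same running accumulators. All names (StreamingLatentCrossAttention, chunk_size) are illustrative assumptions, not identifiers from the paper or its code.

```python
import torch
import torch.nn as nn


class StreamingLatentCrossAttention(nn.Module):
    """Cross-attention from a fixed number of learned latents to a context set,
    accumulated in fixed-size chunks so memory does not grow with context size.
    Sketch only; the paper's CMAB may differ in structure and parameterization."""

    def __init__(self, num_latents: int, dim: int, chunk_size: int = 128):
        super().__init__()
        self.latents = nn.Parameter(torch.randn(num_latents, dim) / dim ** 0.5)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.chunk_size = chunk_size
        self.scale = dim ** -0.5

    def forward(self, context: torch.Tensor) -> torch.Tensor:
        # context: (n_context, dim) -> output: (num_latents, dim), size independent of n_context.
        q = self.latents * self.scale
        device = context.device
        m = torch.full((q.shape[0], 1), float("-inf"), device=device)  # running max (stability)
        num = torch.zeros_like(self.latents)                           # running softmax numerator
        den = torch.zeros(q.shape[0], 1, device=device)                # running softmax denominator
        for chunk in context.split(self.chunk_size, dim=0):            # constant-size working set
            k, v = self.to_k(chunk), self.to_v(chunk)
            scores = q @ k.t()                                         # (num_latents, chunk)
            m_new = torch.maximum(m, scores.max(dim=1, keepdim=True).values)
            correction = torch.exp(m - m_new)                          # rescale old accumulators
            w = torch.exp(scores - m_new)
            num = num * correction + w @ v
            den = den * correction + w.sum(dim=1, keepdim=True)
            m = m_new
        return num / den


# Usage: the latent summary has the same shape regardless of how many context points arrive.
block = StreamingLatentCrossAttention(num_latents=8, dim=64)
print(block(torch.randn(1000, 64)).shape)  # torch.Size([8, 64])
```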