Modern Hopfield Networks With Continuous-Time Memories

Official implementation of the paper Modern Hopfield Networks With Continuous-Time Memories.

Saul Santos, António Farinhas, Daniel McNamee and André Martins

Abstract: Recent research has established a connection between modern Hopfield networks (HNs) and transformer attention heads, with guarantees of exponential storage capacity. However, these models still face challenges scaling storage efficiently. Inspired by psychological theories of continuous neural resource allocation in working memory, we propose an approach that compresses large discrete Hopfield memories into smaller, continuous-time memories. Leveraging continuous attention, our new energy function modifies the update rule of HNs, replacing the traditional softmax-based probability mass function with a probability density over the continuous memory. This formulation aligns with modern perspectives on human executive function, offering a principled link between attractor dynamics in working memory and resource-efficient memory allocation. Our framework maintains competitive performance with HNs while leveraging a compressed memory, reducing computational costs across synthetic and video datasets.
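
To make the idea concrete, here is a minimal, self-contained sketch (not the repository's implementation) contrasting the standard softmax-based modern Hopfield update with an update over a compressed continuous-time memory. The Gaussian RBF basis, the ridge-regression fit of the continuous memory, and the way the Gaussian attention density is parameterised are illustrative assumptions only; see the paper and the code in this repository for the actual formulation.

```python
# Illustrative sketch of the abstract's idea (assumptions noted below), not the
# repository's actual implementation: a discrete modern Hopfield update vs. an
# update over a compressed continuous-time memory with a density-based attention.
import numpy as np

def discrete_hopfield_update(X, q, beta=1.0):
    """Standard modern Hopfield update: softmax attention over N stored patterns."""
    scores = beta * X @ q                      # (N,)
    p = np.exp(scores - scores.max())
    p /= p.sum()                               # probability mass over discrete memories
    return X.T @ p                             # retrieved pattern, shape (d,)

def fit_continuous_memory(X, num_basis=16):
    """Compress N discrete patterns into a continuous-time memory x_bar(t) = B^T psi(t)
    via ridge regression onto Gaussian RBF basis functions on t in [0, 1] (an assumption)."""
    N = X.shape[0]
    t = np.linspace(0, 1, N)
    centers = np.linspace(0, 1, num_basis)
    width = 1.0 / num_basis
    Psi = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / width) ** 2)      # (N, num_basis)
    B = np.linalg.solve(Psi.T @ Psi + 1e-3 * np.eye(num_basis), Psi.T @ X)   # (num_basis, d)
    return B, centers, width

def continuous_hopfield_update(B, centers, width, q, num_samples=256):
    """Replace the softmax mass function with a probability density over the
    continuous memory. Here the density is a Gaussian over t whose moments are
    taken from the query-score profile; this parameterisation is an assumption."""
    t = np.linspace(0, 1, num_samples)
    Psi = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / width) ** 2)      # (S, num_basis)
    x_bar = Psi @ B                                                          # (S, d) reconstructed memory
    scores = x_bar @ q
    w = np.exp(scores - scores.max())
    w /= w.sum()
    mu = w @ t
    var = w @ (t - mu) ** 2 + 1e-6
    p = np.exp(-0.5 * (t - mu) ** 2 / var)
    p /= p.sum()                                                             # discretised density over t
    return x_bar.T @ p                                                       # expectation of x_bar(t) under p(t)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((512, 32))          # 512 stored patterns of dimension 32
    q = X[100] + 0.1 * rng.standard_normal(32)  # noisy query near a stored pattern
    B, centers, width = fit_continuous_memory(X, num_basis=16)
    print(discrete_hopfield_update(X, q)[:4])
    print(continuous_hopfield_update(B, centers, width, q)[:4])
```

Note that the continuous update only touches the basis coefficients B (num_basis x d) rather than all N stored patterns, which is where the memory compression and reduced computational cost come from.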


If you use this code in your work, please cite our paper.


Resources

All material is made available under the MIT license. You can use, redistribute, and adapt the material for non-commercial purposes, as long as you give appropriate credit by citing our paper and indicating any changes that you've made.

Reproducibility

To reproduce the experiments reported in the paper, run the corresponding scripts in this repository.
