repo of random sequential attention mechanism experiments, paper reproductions, sandboxing, etc.


velocirabbit/Attention

Sequential Attention mechanisms

Repo of sequential attention mechanisms from various papers, including their implementations and studies of how they behave within various model architectures.

Using and viewing

The implementations themselves are written primarily in PyTorch (v0.3 for now). They're built so they can be easily imported into any model and used by following the documented input/output shapes.
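The repo's actual module names and shape conventions aren't shown here, so as a hypothetical illustration of the kind of input/output shape contract such a module might document, here is a minimal scaled dot-product self-attention sketch in NumPy (function name and `(batch, seq_len, d_model)` convention are assumptions, not the repo's API):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Hypothetical attention layer: q, k, v are (batch, seq_len, d_model) arrays.

    Returns (output, weights), where output has the same shape as q and
    weights is (batch, seq_len, seq_len) with each row summing to 1.
    """
    d = q.shape[-1]
    # Pairwise similarity scores, scaled by sqrt(d) as in dot-product attention
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)   # (batch, seq, seq)
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v, weights

# Example: batch of 2 sequences, length 4, model dim 8
rng = np.random.default_rng(0)
x = rng.normal(size=(2, 4, 8))
out, w = scaled_dot_product_attention(x, x, x)  # self-attention: q = k = v
print(out.shape)  # (2, 4, 8) -- output shape matches the query input
```

A drop-in module following this pattern only needs its expected shapes documented for a caller to wire it into an arbitrary architecture, which is the portability the paragraph above describes.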

The tests and studies of the implementations live in Jupyter notebooks, which can be viewed without PyTorch installed but cannot be run without it.

References
