ResNeSt

PyTorch implementation of ResNeSt: Split-Attention Networks [1].

This implementation was written to deepen my own understanding of the ResNeSt architecture, in particular the radix-major formulation of the bottleneck block.
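
To make the radix-major idea concrete, below is a minimal, self-contained sketch of a Split-Attention block in PyTorch, with cardinality fixed at 1 for brevity. All names, defaults, and layer choices in it are illustrative assumptions, not this repository's actual API.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SplitAttention(nn.Module):
        # Minimal radix-major Split-Attention block (cardinality = 1).
        # Hypothetical sketch; names and defaults are not this repo's API.
        def __init__(self, channels: int, radix: int = 2, reduction: int = 4):
            super().__init__()
            self.radix = radix
            inter_channels = max(channels * radix // reduction, 32)
            # Grouped 3x3 conv produces all `radix` splits at once; with
            # groups=radix, the output channels are laid out radix-major.
            self.conv = nn.Conv2d(channels, channels * radix, kernel_size=3,
                                  padding=1, groups=radix, bias=False)
            self.bn = nn.BatchNorm2d(channels * radix)
            # Two 1x1 convs act as the attention MLP.
            self.fc1 = nn.Conv2d(channels, inter_channels, kernel_size=1)
            self.bn1 = nn.BatchNorm2d(inter_channels)
            self.fc2 = nn.Conv2d(inter_channels, channels * radix, kernel_size=1)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            b, c = x.size(0), x.size(1)
            x = F.relu(self.bn(self.conv(x)))
            # Radix-major view: the radix index is the outermost split axis.
            splits = x.view(b, self.radix, c, *x.shape[2:])
            # Global average pooling over the sum of all splits.
            gap = splits.sum(dim=1).mean(dim=(2, 3), keepdim=True)  # (B, C, 1, 1)
            attn = self.fc2(F.relu(self.bn1(self.fc1(gap))))
            # rSoftMax: with cardinality 1, a plain softmax across radix.
            attn = F.softmax(attn.view(b, self.radix, c), dim=1)
            return (splits * attn.view(b, self.radix, c, 1, 1)).sum(dim=1)

    if __name__ == '__main__':
        block = SplitAttention(channels=64, radix=2)
        out = block(torch.randn(2, 64, 32, 32))
        print(out.shape)  # torch.Size([2, 64, 32, 32])

Because the radix index is the outermost axis in this layout, the splits can be recovered with a single view and the rSoftMax reduces to a softmax over dim=1; the cardinality-major layout in the paper needs an extra transpose to achieve the same grouping.
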

The official implementation: https://github.com/zhanghang1989/ResNeSt

Requirements

  • docker
  • docker-compose

Model

  • Only supports dilation=1 (standard, non-dilated convolutions).

ToDo

  • Evaluate the model

Reference

[1] Hang Zhang, Chongruo Wu, Zhongyue Zhang, Yi Zhu, Zhi Zhang, Haibin Lin, Yue Sun, Tong He, Jonas Mueller, R. Manmatha, Mu Li, and Alexander Smola. ResNeSt: Split-Attention Networks. https://arxiv.org/abs/2004.08955

Author

Sawada Tomoya
