Project author: d-li14

Project description:
PyTorch implementation of Lambda Network and pretrained Lambda-ResNet

Language: Python
Repository: git://github.com/d-li14/lambda.pytorch.git
Created: 2020-11-28T00:25:43Z
Community: https://github.com/d-li14/lambda.pytorch

License:


lambda.pytorch

[NEW!] Check out our latest work involution in CVPR’21 that bridges convolution and self-attention operators.


PyTorch implementation of LambdaNetworks: Modeling long-range Interactions without Attention.

Lambda Networks apply the associative law of matrix multiplication to reverse the computation order of self-attention, achieving linear computational complexity with respect to content interactions.
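The regrouping can be illustrated with a minimal NumPy sketch (not the repository's actual lambda layer, which also handles multi-head structure, normalization, and position interactions): attention-style computation forms an n×n matrix via `(Q @ K.T) @ V`, while the lambda ordering `Q @ (K.T @ V)` first contracts keys with values into a small k×v "lambda", making the cost linear in sequence length n.

```python
import numpy as np

# Illustrative associativity sketch, assuming random Q/K/V and omitting the
# softmax/normalization of real attention for clarity.
n, k, v = 1000, 16, 64            # sequence length, key depth, value depth
rng = np.random.default_rng(0)
Q = rng.standard_normal((n, k))
K = rng.standard_normal((n, k))
V = rng.standard_normal((n, v))

attention_order = (Q @ K.T) @ V   # materializes an n x n matrix: O(n^2) in n
lambda_order = Q @ (K.T @ V)      # materializes a k x v lambda: O(n) in n

# Both orderings give the same result by associativity.
assert np.allclose(attention_order, lambda_order)
```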

Similar techniques have been used previously in A2-Net and CGNL. Check out a collection of self-attention modules in another repository dot-product-attention.

Training Configuration

✓ SGD optimizer, initial learning rate 0.1, momentum 0.9, weight decay 0.0001

✓ epoch 130, batch size 256, 8x Tesla V100 GPUs, LR decay strategy cosine

✓ label smoothing 0.1
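The hyperparameters above might be wired up in PyTorch roughly as follows (a hedged sketch: `model` is a placeholder linear layer standing in for Lambda-ResNet, and the repository's actual training script may organize this differently):

```python
import torch
import torch.nn as nn

# Placeholder model; the repository trains Lambda-ResNet variants instead.
model = nn.Linear(8, 10)

# SGD with the listed hyperparameters: lr 0.1, momentum 0.9, weight decay 1e-4.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=1e-4)

# Cosine LR decay over the full 130-epoch schedule.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=130)

# Cross-entropy with label smoothing 0.1 (built in since PyTorch 1.10).
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)

# One dummy step to show how the pieces fit together per epoch.
inputs, targets = torch.randn(4, 8), torch.randint(0, 10, (4,))
loss = criterion(model(inputs), targets)
loss.backward()
optimizer.step()
scheduler.step()
```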

Pre-trained checkpoints

| Architecture | Parameters | FLOPs | Top-1 / Top-5 Acc. (%) | Download |
| --- | --- | --- | --- | --- |
| Lambda-ResNet-50 | 14.995M | 6.576G | 78.208 / 93.820 | [model](/g/personal/dlibh_connect_ust_hk/EUZkICtpXitIq6PGa6h6m_YBnFXCiCYTSuqoIUqiR33C5A?e=mhgEbC) \| [log](/g/personal/dlibh_connect_ust_hk/EQuZ1itCS2dFpN2MBVepL5YBQe9N-ZUv6y4vNdO5uiVFig?e=dX7Id1) |

Citation

If you find this repository useful in your research, please cite

```bibtex
@InProceedings{Li_2021_CVPR,
    author = {Li, Duo and Hu, Jie and Wang, Changhu and Li, Xiangtai and She, Qi and Zhu, Lei and Zhang, Tong and Chen, Qifeng},
    title = {Involution: Inverting the Inherence of Convolution for Visual Recognition},
    booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month = {June},
    year = {2021}
}
```

```bibtex
@inproceedings{
    bello2021lambdanetworks,
    title = {LambdaNetworks: Modeling long-range Interactions without Attention},
    author = {Irwan Bello},
    booktitle = {International Conference on Learning Representations},
    year = {2021}
}
```