Project author: moskomule

Project description:
Differentiable Data Augmentation Library

Primary language: Jupyter Notebook
Repository: git://github.com/moskomule/dda.git
Created: 2020-02-29T11:25:44Z
Project page: https://github.com/moskomule/dda

License: MIT License



Differentiable Data Augmentation Library

This library is the core of Faster AutoAugment and its descendants. It is research oriented, and its API may change in the near future.

Requirements and Installation

Requirements

  1. Python>=3.8
  2. PyTorch>=1.5.0
  3. torchvision>=0.6
  4. kornia>=0.2

Installation

```shell
pip install -U git+https://github.com/moskomule/dda
```

APIs

dda.functional

Basic operations that are differentiable w.r.t. the magnitude parameter mag. When mag=0, no augmentation is applied; when mag=1 (and mag=-1 where it exists), the severest augmentation is applied. As introduced in Faster AutoAugment, some operations use a straight-through estimator so that they can be differentiated w.r.t. their magnitude parameters.

```python
def operation(img: torch.Tensor,
              mag: Optional[torch.Tensor]) -> torch.Tensor:
    ...
```
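To illustrate the straight-through idea, here is a minimal sketch (not dda's actual code; `ste` and `toy_posterize` are hypothetical names): the forward pass applies a non-differentiable quantization, while the backward pass routes the gradient to `mag` as if the operation were the identity in `mag`.

```python
import torch

def ste(output: torch.Tensor, mag: torch.Tensor) -> torch.Tensor:
    # Straight-through estimator (illustrative): the forward value equals
    # `output` exactly (mag - mag.detach() == 0), but gradients w.r.t. `mag`
    # flow through the identity term.
    return output.detach() + mag - mag.detach()

def toy_posterize(img: torch.Tensor, mag: torch.Tensor) -> torch.Tensor:
    # Hypothetical discrete op: fewer intensity levels as mag grows.
    bits = int((8 - 7 * mag.clamp(0, 1)).round().item())
    levels = 2 ** bits
    quantized = (img * (levels - 1)).round() / (levels - 1)
    return ste(quantized, mag)

img = torch.rand(1, 3, 8, 8)
mag = torch.tensor(0.5, requires_grad=True)
out = toy_posterize(img, mag)
out.mean().backward()
# mag.grad is populated even though quantization itself has zero gradient
```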

dda.pil contains the similar APIs using PIL (not differentiable).

dda.operations

```python
class Operation(nn.Module):
    def __init__(self,
                 initial_magnitude: Optional[float] = None,
                 initial_probability: float = 0.5,
                 magnitude_range: Optional[Tuple[float, float]] = None,
                 probability_range: Optional[Tuple[float, float]] = None,
                 temperature: float = 0.1,
                 flip_magnitude: bool = False,
                 magnitude_scale: float = 1,
                 debug: bool = False):
        ...
```

If magnitude_range=None or probability_range=None, then magnitude or probability, respectively, is registered as a Buffer rather than a Parameter, i.e., it is fixed instead of learnable.
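The Parameter-vs-Buffer distinction can be sketched as follows (an illustrative toy module, not dda's implementation): a buffer travels with the module's state but is invisible to optimizers, whereas a parameter is returned by `parameters()` and gets updated during training.

```python
import torch
from torch import nn
from typing import Optional, Tuple

class ToyOperation(nn.Module):
    # Illustrative sketch of the behavior described above.
    def __init__(self, initial_magnitude: float = 0.5,
                 magnitude_range: Optional[Tuple[float, float]] = None):
        super().__init__()
        mag = torch.tensor(initial_magnitude)
        if magnitude_range is None:
            # Fixed magnitude: stored as a buffer, not optimized.
            self.register_buffer("magnitude", mag)
        else:
            # Learnable magnitude: exposed to optimizers; the range would
            # be used to bound its values during training.
            self.magnitude = nn.Parameter(mag)

fixed = ToyOperation()
learnable = ToyOperation(magnitude_range=(0.0, 1.0))
# fixed has no learnable parameters; learnable has exactly one
```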

magnitude moves in magnitude_scale * magnitude_range.
For example, dda.operations.Rotation has magnitude_range=[0, 1] and magnitude_scale=30, so that magnitude corresponds to between 0 and 30 degrees.

To differentiate w.r.t. the probability parameter, RelaxedBernoulli is used.
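A minimal sketch of this relaxation, using PyTorch's built-in `torch.distributions.RelaxedBernoulli`: a hard Bernoulli sample would block gradients, while the relaxed (concrete) distribution supports reparameterized sampling, so the gate stays differentiable w.r.t. the probability parameter.

```python
import torch
from torch.distributions import RelaxedBernoulli

# Learnable probability of applying an operation (illustrative values).
probability = torch.tensor(0.5, requires_grad=True)
temperature = torch.tensor(0.1)

# rsample() draws a reparameterized "apply the op?" gate in (0, 1);
# it would blend augmented and original images as
#   out = gate * augmented + (1 - gate) * original
gate = RelaxedBernoulli(temperature, probs=probability).rsample()
gate.backward()
# probability.grad is populated, unlike with a hard Bernoulli sample
```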

Examples

Citation

dda (except RandAugment) is developed as a core library of the following research projects.

If you use dda in your academic research, please cite hataya2020a.

```bibtex
@inproceedings{hataya2020a,
    title={{Faster AutoAugment: Learning Augmentation Strategies using Backpropagation}},
    author={Ryuichiro Hataya and Jan Zdenek and Kazuki Yoshizoe and Hideki Nakayama},
    year={2020},
    booktitle={ECCV}
}
...
```