Project author: avivga

Project description: Pytorch re-implementation of "Demystifying Inter-Class Disentanglement", ICLR 2020.
High-level language: Python
Repository: git://github.com/avivga/lord-pytorch.git
Created: 2021-01-12T15:02:04Z
Project community: https://github.com/avivga/lord-pytorch

License: Other

LORD

Demystifying Inter-Class Disentanglement
Aviv Gabbay and Yedid Hoshen
International Conference on Learning Representations (ICLR), 2020.
Pytorch re-implementation (thanks to @dneuhof) [Official tensorflow implementation]

Content transfer between classes

[Figure: content transfer examples on Cars3D, SmallNorb and KTH]
[Figure: content transfer examples on CelebA]

Usage

Dependencies

  • python >= 3.6
  • numpy >= 1.15.4
  • pytorch >= 1.3.0
  • opencv >= 3.4.4
  • dlib >= 19.17.0
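
A minimal way to set these up with pip might look like the following; note that the PyPI package names (in particular opencv-python for opencv, and torch for pytorch) are assumptions about a typical pip environment:

  # install the dependencies listed above (PyPI package names assumed)
  pip install "numpy>=1.15.4" "torch>=1.3.0" "opencv-python>=3.4.4" "dlib>=19.17.0"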

Getting started

Training a model for disentanglement requires several steps.

Preprocessing an image dataset

Preprocessing a local copy of one of the supported datasets can be done as follows:

  lord.py --base-dir <output-root-dir> preprocess \
          --dataset-id {mnist,smallnorb,cars3d,shapes3d,celeba,kth,rafd} \
          --dataset-path <input-dataset-path> \
          --data-name <output-data-filename>
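
For example, a concrete preprocessing run on CelebA might look like the following (the base directory, dataset path and output name are placeholder values):

  # preprocess a local copy of CelebA into /data/lord (paths are placeholders)
  lord.py --base-dir /data/lord preprocess \
          --dataset-id celeba \
          --dataset-path /data/celeba \
          --data-name celeba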

Splitting a preprocessed dataset into train and test sets can be done according to one of two configurations:

  lord.py --base-dir <output-root-dir> split-classes \
          --input-data-name <input-data-filename> \
          --train-data-name <output-train-data-filename> \
          --test-data-name <output-test-data-filename> \
          --num-test-classes <number-of-random-test-classes>

  lord.py --base-dir <output-root-dir> split-samples \
          --input-data-name <input-data-filename> \
          --train-data-name <output-train-data-filename> \
          --test-data-name <output-test-data-filename> \
          --test-split <ratio-of-random-test-samples>
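
For instance, holding out a set of random classes for testing could look like this (the file names and the number of test classes are placeholders):

  # split the preprocessed data by class, holding out 10 random classes for testing
  lord.py --base-dir /data/lord split-classes \
          --input-data-name celeba \
          --train-data-name celeba-train \
          --test-data-name celeba-test \
          --num-test-classes 10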

Training a model

Given a preprocessed train set, training a model with latent optimization (first stage) can be done as follows:

  lord.py --base-dir <output-root-dir> train \
          --data-name <input-preprocessed-data-filename> \
          --model-name <output-model-name>
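
Continuing the placeholder example above (data and model names are illustrative only):

  # first stage: latent optimization on the preprocessed train set
  lord.py --base-dir /data/lord train \
          --data-name celeba-train \
          --model-name lord-celeba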

Training encoders for amortized inference (second stage) can be done as follows:

  lord.py --base-dir <output-root-dir> train-encoders \
          --data-name <input-preprocessed-data-filename> \
          --model-name <input-model-name>
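
Note that the second stage takes the model trained in the first stage as input, so --model-name must match the name given during latent optimization. Continuing the placeholder example:

  # second stage: train encoders for amortized inference on the first-stage model
  lord.py --base-dir /data/lord train-encoders \
          --data-name celeba-train \
          --model-name lord-celeba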

Citing

If you find this project useful for your research, please cite:

  @inproceedings{gabbay2020lord,
    author    = {Aviv Gabbay and Yedid Hoshen},
    title     = {Demystifying Inter-Class Disentanglement},
    booktitle = {International Conference on Learning Representations (ICLR)},
    year      = {2020}
  }