tengfei-wang / hfgi

High-Fidelity GAN Inversion for Image Attribute Editing

  • Public
  • 22.1K runs
  • GitHub
  • Paper

Run time and cost

This model costs approximately $0.0011 to run on Replicate, or about 909 runs per $1, though this varies depending on your inputs. It is also open source, so you can run it on your own computer with Docker.

This model runs on Nvidia T4 GPU hardware. Predictions typically complete within 5 seconds.
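One way to run the hosted model programmatically is through the Replicate Python client. The sketch below is only illustrative: the version hash and the input field names are placeholders, so check the model's API page for the exact input schema.

```python
# Minimal sketch of calling the hosted model with the Replicate Python client.
# Requires the REPLICATE_API_TOKEN environment variable to be set.
# The version hash and input keys below are placeholders, not the real schema.
import replicate

output = replicate.run(
    "tengfei-wang/hfgi:<version>",        # replace <version> with the model version hash
    input={
        "image": open("face.jpg", "rb"),  # assumed name of the image input field
        "editing_type": "age",            # assumed name/value of the attribute field
    },
)
print(output)
```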

Readme

High-Fidelity GAN Inversion for Image Attribute Editing

Update: We released the inference code and the pre-trained model on Oct. 31. The training code is coming soon.

paper | project website | demo video

Introduction

We present a novel high-fidelity GAN inversion framework that enables attribute editing while preserving image-specific details (e.g., background, appearance, and illumination).
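As a rough mental model, inversion-based editing maps an image to a latent code of a pre-trained GAN, shifts that code along an attribute direction, and decodes the result; the point of this framework is that image-specific details (background, appearance, illumination) survive that round trip. The sketch below illustrates only the generic invert-edit-decode pattern with dummy modules; it is not the HFGI architecture, whose actual encoder, generator, and detail-preservation components are in the GitHub repo.

```python
# Conceptual sketch (not the official HFGI code) of the invert -> edit -> decode
# pattern that GAN-inversion editing builds on. All modules and shapes here are
# placeholders standing in for the real encoder and pre-trained GAN generator.
import torch
import torch.nn as nn

LATENT_DIM = 512

class DummyEncoder(nn.Module):
    """Stand-in for the inversion encoder that maps an image to a latent code."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 256 * 256, LATENT_DIM))

    def forward(self, image):
        return self.net(image)

class DummyGenerator(nn.Module):
    """Stand-in for the pre-trained GAN generator that decodes a latent code."""
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(LATENT_DIM, 3 * 256 * 256)

    def forward(self, latent):
        return self.net(latent).view(-1, 3, 256, 256)

def edit_image(image, attribute_direction, degree=1.0):
    """Invert the image, shift its latent code along an attribute direction
    (e.g., age or smile), and decode the edited image."""
    encoder, generator = DummyEncoder(), DummyGenerator()
    latent = encoder(image)                                 # inversion
    edited_latent = latent + degree * attribute_direction   # attribute edit
    return generator(edited_latent)                         # decode

if __name__ == "__main__":
    img = torch.randn(1, 3, 256, 256)        # placeholder input image
    direction = torch.randn(LATENT_DIM)      # placeholder attribute direction
    print(edit_image(img, direction).shape)  # torch.Size([1, 3, 256, 256])
```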

Citation

If you find this work useful for your research, please cite:

@article{wang2021HFGI,
  author  = {Tengfei Wang and Yong Zhang and Yanbo Fan and Jue Wang and Qifeng Chen},
  title   = {High-Fidelity GAN Inversion for Image Attribute Editing},
  journal = {arXiv:2109.06590},
  year    = {2021}
}