
arielreplicate/tres_iqa

No-Reference Image Quality Assessment via Transformers

Run time and cost

Predictions run on Nvidia T4 GPU hardware.
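
The model can also be run programmatically through the Replicate API. Below is a minimal sketch using the official replicate Python client; the input field name ("input_image") and the version placeholder are assumptions, so check the model's API tab for the exact values.

import replicate  # pip install replicate; expects REPLICATE_API_TOKEN in the environment

with open("photo.jpg", "rb") as image_file:
    # The input field name and version hash below are assumptions -- see the model's API tab.
    score = replicate.run(
        "arielreplicate/tres_iqa:<version-hash>",
        input={"input_image": image_file},
    )

print("Predicted quality score:", score)  # lower is better (see "About the demo" below)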

No-Reference Image Quality Assessment via Transformers, Relative Ranking, and Self-Consistency (WACV 2022) Video


About the demo

Note: Lower score is better!
This demo uses the model pretrained on the LIVE dataset, downloaded from Here.
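
As an illustration of the lower-is-better convention, the sketch below scores two versions of the same photo and keeps the one the model prefers. It reuses the assumed "input_image" field and version placeholder from the API example above.

import replicate

def quality_score(path):
    # Assumed input field name and version placeholder, as in the example above.
    # float() assumes the API returns a single numeric score (or a numeric string).
    with open(path, "rb") as f:
        return float(replicate.run(
            "arielreplicate/tres_iqa:<version-hash>",
            input={"input_image": f},
        ))

original = quality_score("photo.png")
compressed = quality_score("photo_q10.jpg")

# Lower score means better perceived quality for the LIVE-trained weights.
better = "original" if original < compressed else "compressed"
print(f"original={original:.2f}  compressed={compressed:.2f}  better: {better}")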

About the model

Image quality assessment (IQA) attempts to use computational models to predict image quality in a manner that is consistent with quality ratings provided by human subjects.
No-Reference Image Quality Assessment (NR-IQA) means assessing image quality without a "clean" reference image to compare against, i.e., predicting a score from a single image input.
This model is an NR-IQA model.
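
To make the distinction concrete, the schematic sketch below (not the paper's code) contrasts the two interface shapes: a full-reference metric also needs the pristine image, while a no-reference model such as this one scores a single image on its own.

import numpy as np

def full_reference_iqa(reference: np.ndarray, distorted: np.ndarray) -> float:
    # FR-IQA needs the clean reference; PSNR is used here purely as a trivial example.
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)

def no_reference_iqa(image: np.ndarray) -> float:
    # NR-IQA predicts a quality score from the single input image alone;
    # a trained model such as TReS stands in for this placeholder.
    raise NotImplementedError("replace with a trained NR-IQA model")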

Acknowledgement

This code borrows parts from HyperIQA and DETR.

Citation

If you find this work useful for your research, please cite our paper:

@InProceedings{golestaneh2021no,
  title={No-Reference Image Quality Assessment via Transformers, Relative Ranking, and Self-Consistency},
  author={Golestaneh, S Alireza and Dadsetan, Saba and Kitani, Kris M},
  booktitle={Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
  pages={3209--3218},
  year={2022}
}

If you have any questions about our work, please do not hesitate to contact isalirezag@gmail.com.