Marigold Intrinsics (IID Lighting v1.1)
This model performs intrinsic image decomposition on a single RGB image using the Marigold IID Lighting pipeline from ETH Zurich. It produces three physically meaningful outputs:
Albedo – surface color without lighting effects
Shading – illumination & geometry-based lighting
Residual – information that cannot be fully decomposed
This is useful for:
Room visualization & material replacement
Relighting workflows
AR/VR scene understanding
Normal map / lighting estimation pipelines
How it Works
This implementation wraps the official model: prs-eth/marigold-iid-lighting-v1-1
When you upload an RGB image, the model runs a short diffusion process and outputs the decomposed intrinsic layers.
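For reference, the call looks roughly like the sketch below. This is an illustration, not the exact wrapper code: it assumes the Hugging Face repo loads through diffusers' generic `DiffusionPipeline`, and the output attribute names may vary between diffusers versions.

```python
# Minimal sketch of the wrapped inference (assumption: the Hugging Face repo
# exposes a Marigold IID pipeline loadable through DiffusionPipeline).
import torch
from diffusers import DiffusionPipeline
from diffusers.utils import load_image

device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32

pipe = DiffusionPipeline.from_pretrained(
    "prs-eth/marigold-iid-lighting-v1-1",
    torch_dtype=dtype,
).to(device)

image = load_image("input.png")  # any RGB image

# A short diffusion process produces the decomposed intrinsic layers.
result = pipe(image, num_inference_steps=4)

# The exact output attributes depend on the pipeline version; inspect `result`
# to locate the albedo / shading / residual predictions before saving them.
print(result)
```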
Input

| Parameter | Type | Default | Description |
|---|---|---|---|
| image | File (PNG/JPG/WebP) | required | Input RGB image |
| num_inference_steps | Integer | 4 | Denoising steps (higher = better quality, slower) |
| return_zip | Boolean | false | If true, all outputs are returned as one ZIP file |

Output
You will receive 3 separate images (or ZIP if selected):
| Filename | Description |
|---|---|
| albedo.png | Pure surface color |
| shading.png | Lighting-only component |
| residual.png | Residual signal |
These files can be brought into Blender/Photoshop for texture editing, or combined for relighting.
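As an illustration of recombining the layers, the sketch below reconstructs an approximation of the input by multiplying albedo with shading and adding the residual. The composition rule and color space are assumptions for illustration; the exact blending used by the model may differ.

```python
# Hypothetical recombination sketch: reconstruction ~ albedo * shading + residual.
# The exact composition rule and color space used by the model may differ.
import numpy as np
from PIL import Image

def to_float(path):
    """Load an 8-bit image and scale it to [0, 1] floats."""
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0

albedo = to_float("albedo.png")
shading = to_float("shading.png")
residual = to_float("residual.png")

# Multiply albedo by shading, add the residual, and clip back to [0, 1].
recombined = np.clip(albedo * shading + residual, 0.0, 1.0)
Image.fromarray((recombined * 255).astype(np.uint8)).save("recombined.png")
```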
Model Details

| Property | Value |
|---|---|
| Framework | PyTorch + Diffusers |
| Device | GPU if available |
| Precision | FP16 on GPU, FP32 on CPU |
| Source Model | Marigold IID Lighting v1.1 |

Example Usage

Basic Inference
Upload any clean, well-lit indoor scene image:
image: input.png
Higher Quality / Slower
num_inference_steps: 15
ZIP Output
return_zip: true
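If you request the ZIP output, the archive can be unpacked with the Python standard library. The archive and member filenames below are assumptions based on the output table above.

```python
# Sketch for unpacking a ZIP result (assumed member names follow the output
# table above: albedo.png, shading.png, residual.png).
import zipfile

with zipfile.ZipFile("outputs.zip") as zf:
    print(zf.namelist())       # e.g. ['albedo.png', 'shading.png', 'residual.png']
    zf.extractall("outputs/")  # write the three layers to a local folder
```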
Tips for Best Results
Indoor scenes with defined surfaces
Avoid extreme motion blur or very noisy images
Higher resolution inputs give more detail
Credits
Research: Marigold: Intrinsics-Guided Diffusion for Inverse Rendering, ETH Zurich – Visual Computing Group
Model Source: Hugging Face – prs-eth/marigold-iid-lighting-v1-1