# DINOv3 (ViT-L/16 Distilled)
This is a deployment of DINOv3 (ViT-L/16 Distilled), developed by Meta FAIR. It is a self-supervised vision transformer trained on the LVD-1689M dataset (1.69 billion images).

This specific model uses the distilled weights: it is distilled from the 7B-parameter teacher model, offering near-teacher performance while being efficient enough to run on standard hardware.
## Model Details
- Architecture: ViT-L/16 (Vision Transformer Large, Patch Size 16)
- Parameters: ~300 Million
- Training Objective: Self-supervised learning (DINOv3)
- Training Data: LVD-1689M
## Intended Use
This model computes dense visual embeddings. It excels at:

- Visual Similarity Search: Finding duplicate or near-duplicate images (e.g., distinguishing specific logos or merchandise tags).
- Image Retrieval: Finding images that look similar based on geometry and texture rather than just text captions.
- Feature Extraction: Generating embeddings for downstream classification or clustering tasks.
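To illustrate how similarity search over such embeddings works (independently of this deployment), here is a minimal NumPy sketch. It assumes the embeddings are L2-normalized row vectors, so the dot product equals cosine similarity; the function name and toy data are illustrative, not part of this model's API.

```python
import numpy as np


def top_k_similar(query: np.ndarray, index: np.ndarray, k: int = 3):
    """Return (row indices, scores) of the k most similar rows.

    Assumes both the query and every index row are L2-normalized,
    so the dot product equals cosine similarity.
    """
    scores = index @ query
    order = np.argsort(-scores)[:k]  # sort descending by similarity
    return order, scores[order]


# Toy index of 5 random 8-dimensional embeddings, normalized row-wise
rng = np.random.default_rng(0)
index = rng.normal(size=(5, 8))
index /= np.linalg.norm(index, axis=1, keepdims=True)

# Querying with item 2 itself: its own row ranks first with score ~1.0
ids, scores = top_k_similar(index[2], index, k=2)
print(ids[0], round(float(scores[0]), 4))
```

In practice, `index` would hold the embeddings returned by this model for your image collection.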
## How to use
This model accepts an image as input and returns the normalized CLS token embedding.
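Because the embedding is L2-normalized, its vector norm should be approximately 1.0. A quick sanity check on a stub vector (the real DINOv3 ViT-L CLS embedding has 1024 dimensions; the 3-dimensional list below is purely illustrative):

```python
import numpy as np

# Stub standing in for the list of floats returned by the model
embedding = [0.6, 0.8, 0.0]

vec = np.asarray(embedding, dtype=np.float64)
print(round(float(np.linalg.norm(vec)), 6))  # a normalized vector has norm 1.0
```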
### Python Example
You can use the output vectors to compute the cosine similarity between two images.
```python
import replicate
import numpy as np
from numpy.linalg import norm


def cosine_similarity(a, b):
    return np.dot(a, b) / (norm(a) * norm(b))


# 1. Get embedding for Image A
output_a = replicate.run(
    "your-username/dinov3-vit-large-distilled:version_id",
    input={"image": open("tag_a.jpg", "rb")},
)

# 2. Get embedding for Image B
output_b = replicate.run(
    "your-username/dinov3-vit-large-distilled:version_id",
    input={"image": open("tag_b.jpg", "rb")},
)

# 3. Compare (convert to arrays in case the API returns plain lists)
score = cosine_similarity(np.asarray(output_a), np.asarray(output_b))
print(f"Similarity Score: {score:.4f}")
```
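To compare many images at once, you can stack the embeddings and compute the full pairwise similarity matrix in one step. A minimal sketch, with 2-dimensional stub vectors standing in for the model's outputs:

```python
import numpy as np

# Stub embeddings standing in for outputs of replicate.run
embeddings = np.asarray([
    [1.0, 0.0],
    [0.0, 1.0],
    [0.7071, 0.7071],
])

# Normalize rows so dot products are cosine similarities
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

# Pairwise similarity matrix: sim[i, j] compares image i to image j
sim = embeddings @ embeddings.T
print(np.round(sim, 3))
```

The diagonal is 1.0 (each image compared to itself); off-diagonal entries give every pairwise score without repeated API calls.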
## Attribution & License

This model is a wrapper around the DINOv3 weights released by Meta.

- Original Authors: Meta Fundamental AI Research (FAIR)
- Repository: facebookresearch/dinov3
- License: Usage of this model is subject to the DINOv3 License Agreement.
- Disclaimer: This is a third-party deployment. Please refer to the official Meta repository for the original source code and research papers.