A ~7B-parameter language model from DeepSeek for state-of-the-art (SOTA) repository-level code completion
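A minimal usage sketch with Hugging Face `transformers`, assuming the checkpoint is `deepseek-ai/deepseek-coder-6.7b-base` (substitute the actual model ID if it differs):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Assumed checkpoint name; swap in the correct model ID if needed.
model_id = "deepseek-ai/deepseek-coder-6.7b-base"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit on a single GPU
    device_map="auto",
)

# Complete a partial function definition.
prompt = "def quicksort(arr):\n    "
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(inputs.input_ids, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```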