Readme
This model doesn't have a readme.
Install Replicate's Python client library:

    pip install replicate

Set the REPLICATE_API_TOKEN environment variable:

    export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
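If you would rather not rely on the environment variable, the token can also be passed directly when constructing a client. This is a minimal sketch assuming the standard replicate Python package; the token string is a placeholder.

    import replicate

    # Construct a client with an explicit API token instead of reading
    # REPLICATE_API_TOKEN from the environment.
    client = replicate.Client(api_token="<paste-your-token-here>")

    # client.run(...) accepts the same arguments as the module-level replicate.run(...)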
Run isaacgv/cog-whisper-v2 using Replicate's API. Check out the model's schema for an overview of inputs and outputs.
    import replicate

    output = replicate.run(
        "isaacgv/cog-whisper-v2:b1dafca59e8c39871f0e84c191cbe1ccb14a28089505a7527a462cc5cab7dc9e",
        input={
            "model": "large-v2",
            "language": "af",
            "translate": False,
            "temperature": 0,
            "transcription": "plain text",
            "suppress_tokens": "-1",
            "word_timestamps": True,
            "logprob_threshold": -1,
            "no_speech_threshold": 0.6,
            "condition_on_previous_text": True,
            "compression_ratio_threshold": 2.4,
            "temperature_increment_on_fallback": 0.2
        }
    )
    print(output)
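The snippet above mirrors the form defaults, but the model also needs an audio input to transcribe. The exact input name and output shape are defined by the model's schema, so the following is a hedged sketch that assumes an "audio" input accepting an open file handle and a JSON-serializable output; check the schema for the real field names.

    import json
    import replicate

    # Assumption: the schema exposes an "audio" input for the file to
    # transcribe; the actual field name and accepted formats may differ.
    with open("speech.wav", "rb") as audio_file:
        output = replicate.run(
            "isaacgv/cog-whisper-v2:b1dafca59e8c39871f0e84c191cbe1ccb14a28089505a7527a462cc5cab7dc9e",
            input={
                "audio": audio_file,   # hypothetical input name
                "model": "large-v2",
                "translate": False,
                "word_timestamps": True,
            },
        )

    # Assumption: the output is JSON-serializable (e.g. a dict with text and
    # word-level timestamps), so it can be pretty-printed or saved to a file.
    print(json.dumps(output, indent=2, default=str))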
To learn more, take a look at the guide on getting started with Python.
This model runs on Nvidia A100 (80GB) GPU hardware. We don't yet have enough runs of this model to provide performance information.