ttsds/fishspeech_1_5
The Fish Speech V1.5 model.
Prediction
ttsds/fishspeech_1_5:f81057e21ad025b00703b8a2f63283d108829b7512f85c4c723c3edcc125f1bc
ID: dqh5dvhtqxrm80cmnrdsxtxqwg
Status: Succeeded
Source: Web
Hardware: L40S
Input
- text: With tenure, Suzie'd have all the more leisure for yachting, but her publications are no good.
- text_reference: and keeping eternity before the eyes, though much
- speaker_reference: https://replicate.delivery/pbxt/MNFXdPaUPOwYCZjZM4azsymbzE2TCV2WJXfGpeV2DrFWaSq8/example_en.wav (audio clip, playable on the original page)
{
  "text": "With tenure, Suzie'd have all the more leisure for yachting, but her publications are no good.",
  "text_reference": "and keeping eternity before the eyes, though much",
  "speaker_reference": "https://replicate.delivery/pbxt/MNFXdPaUPOwYCZjZM4azsymbzE2TCV2WJXfGpeV2DrFWaSq8/example_en.wav"
}
Install Replicate’s Node.js client library:

npm install replicate

Import and set up the client:

import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run ttsds/fishspeech_1_5 using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
import { writeFile } from "node:fs/promises";

const output = await replicate.run(
  "ttsds/fishspeech_1_5:f81057e21ad025b00703b8a2f63283d108829b7512f85c4c723c3edcc125f1bc",
  {
    input: {
      text: "With tenure, Suzie'd have all the more leisure for yachting, but her publications are no good.",
      text_reference: "and keeping eternity before the eyes, though much",
      speaker_reference: "https://replicate.delivery/pbxt/MNFXdPaUPOwYCZjZM4azsymbzE2TCV2WJXfGpeV2DrFWaSq8/example_en.wav"
    }
  }
);

// To access the file URL:
console.log(output.url()); //=> "http://example.com"

// To write the generated audio to disk:
await writeFile("generated.wav", output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Import the client:

import replicate
Run ttsds/fishspeech_1_5 using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "ttsds/fishspeech_1_5:f81057e21ad025b00703b8a2f63283d108829b7512f85c4c723c3edcc125f1bc",
    input={
        "text": "With tenure, Suzie'd have all the more leisure for yachting, but her publications are no good.",
        "text_reference": "and keeping eternity before the eyes, though much",
        "speaker_reference": "https://replicate.delivery/pbxt/MNFXdPaUPOwYCZjZM4azsymbzE2TCV2WJXfGpeV2DrFWaSq8/example_en.wav"
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
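The value returned by replicate.run() is the generated audio; depending on the client version it may be a plain delivery URL string or a file-like object with a .read() method. A minimal sketch for saving it to disk, assuming one of those two shapes (the save_output helper is mine, not part of the client library):

```python
import urllib.request
from pathlib import Path

def save_output(output, path="generated.wav"):
    """Persist a prediction result to disk.

    Handles two possible return shapes (an assumption; check your
    client version's docs): a delivery URL string, or a file-like
    object exposing .read().
    """
    if isinstance(output, str):
        # A URL string: download the file from the delivery CDN.
        urllib.request.urlretrieve(output, path)
    else:
        # A file-like object: read its bytes and write them out.
        Path(path).write_bytes(output.read())
    return path
```

Called as save_output(output) after the replicate.run() call above, this leaves the synthesized speech in generated.wav either way.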
Run ttsds/fishspeech_1_5 using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "f81057e21ad025b00703b8a2f63283d108829b7512f85c4c723c3edcc125f1bc",
    "input": {
      "text": "With tenure, Suzie\'d have all the more leisure for yachting, but her publications are no good.",
      "text_reference": "and keeping eternity before the eyes, though much",
      "speaker_reference": "https://replicate.delivery/pbxt/MNFXdPaUPOwYCZjZM4azsymbzE2TCV2WJXfGpeV2DrFWaSq8/example_en.wav"
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
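With the Prefer: wait header the request blocks until the prediction finishes (or the wait times out). Without it, the response comes back immediately and you poll the prediction's urls.get endpoint until the status field reaches a terminal value. A rough sketch of that loop (the helper names and the 2-second interval are my choices, not part of the API):

```python
import json
import time
import urllib.request

# Statuses after which a prediction will not change again.
TERMINAL_STATUSES = {"succeeded", "failed", "canceled"}

def is_terminal(status):
    """True once a prediction has finished, for better or worse."""
    return status in TERMINAL_STATUSES

def poll_prediction(get_url, token, interval=2.0):
    """Fetch `get_url` repeatedly until the prediction is terminal."""
    request = urllib.request.Request(
        get_url, headers={"Authorization": f"Bearer {token}"}
    )
    while True:
        with urllib.request.urlopen(request) as response:
            prediction = json.load(response)
        if is_terminal(prediction["status"]):
            return prediction
        time.sleep(interval)
```

For the prediction shown on this page, get_url would be the urls.get value from the create response, e.g. https://api.replicate.com/v1/predictions/dqh5dvhtqxrm80cmnrdsxtxqwg.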
Output
(audio player for the generated speech on the original page)
{
  "completed_at": "2025-01-28T18:11:58.474117Z",
  "created_at": "2025-01-28T18:10:07.679000Z",
  "data_removed": false,
  "error": null,
  "id": "dqh5dvhtqxrm80cmnrdsxtxqwg",
  "input": {
    "text": "With tenure, Suzie'd have all the more leisure for yachting, but her publications are no good.",
    "text_reference": "and keeping eternity before the eyes, though much",
    "speaker_reference": "https://replicate.delivery/pbxt/MNFXdPaUPOwYCZjZM4azsymbzE2TCV2WJXfGpeV2DrFWaSq8/example_en.wav"
  },
  "logs": "2025-01-28 18:11:53.629 | INFO | tools.llama.generate:generate_long:789 - Encoded text: With tenure, Suzie'd have all the more leisure for yachting, but her publications are no good.\n2025-01-28 18:11:53.630 | INFO | tools.llama.generate:generate_long:807 - Generating sentence 1/1 of sample 1/1\n(tqdm progress-bar output and a torch sdp_kernel deprecation warning trimmed)\n2025-01-28 18:11:57.940 | INFO | tools.llama.generate:generate_long:861 - Generated 149 tokens in 4.31 seconds, 34.57 tokens/sec\n2025-01-28 18:11:57.940 | INFO | tools.llama.generate:generate_long:864 - Bandwidth achieved: 22.06 GB/s\n2025-01-28 18:11:57.940 | INFO | tools.llama.generate:generate_long:869 - GPU Memory used: 2.02 GB",
  "metrics": {
    "predict_time": 5.177791408,
    "total_time": 110.795117
  },
  "output": "https://replicate.delivery/xezq/XjposOtGbJrtOhGl7IWW9BrMUdatfe0Te7VWRd5HbsCcfzmQB/generated.wav",
  "started_at": "2025-01-28T18:11:53.296326Z",
  "status": "succeeded",
  "urls": {
    "stream": "https://stream.replicate.com/v1/files/bsvm-gx5knbbafqr7m2nlgopphhswg6ocq7gtldvgbkqooqxfykeggp2a",
    "get": "https://api.replicate.com/v1/predictions/dqh5dvhtqxrm80cmnrdsxtxqwg",
    "cancel": "https://api.replicate.com/v1/predictions/dqh5dvhtqxrm80cmnrdsxtxqwg/cancel"
  },
  "version": "f81057e21ad025b00703b8a2f63283d108829b7512f85c4c723c3edcc125f1bc"
}
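In this response, metrics.predict_time is simply the gap between started_at and completed_at: about 5.18 s of the 110.8 s total_time, with the remainder spent on queueing and model setup. That relationship can be checked directly from the timestamps:

```python
from datetime import datetime

def predict_seconds(started_at, completed_at):
    """Elapsed seconds between two Replicate ISO-8601 timestamps."""
    fmt = "%Y-%m-%dT%H:%M:%S.%fZ"
    delta = (
        datetime.strptime(completed_at, fmt)
        - datetime.strptime(started_at, fmt)
    )
    return delta.total_seconds()

elapsed = predict_seconds(
    "2025-01-28T18:11:53.296326Z", "2025-01-28T18:11:58.474117Z"
)
print(round(elapsed, 6))  # matches the reported predict_time of ~5.177791
```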
Want to make some of these yourself?
Run this model