ictnlp/llama-omni
Seamless Speech Interaction with Large Language Models
Prediction
Model: ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351
ID: 7pzb1229gxrgg0cj3g5bzdg0dw
Status: Succeeded
Source: Web
Hardware: A40 (Large)
Created by: @chenxwh
Input
- top_p: 0
- prompt: Please directly answer the questions in the user's speech
- input_audio: https://replicate.delivery/pbxt/LfbWz5nAdlqDatmo2feweGHjcVyJHdQhqZYRNHqfJ7EyKxXa/helpful_base_1.wav
- temperature: 0
- max_new_tokens: 256
{
  "top_p": 0,
  "prompt": "Please directly answer the questions in the user's speech",
  "input_audio": "https://replicate.delivery/pbxt/LfbWz5nAdlqDatmo2feweGHjcVyJHdQhqZYRNHqfJ7EyKxXa/helpful_base_1.wav",
  "temperature": 0,
  "max_new_tokens": 256
}
Install Replicate’s Node.js client library:
npm install replicate
Import and set up the client:
import Replicate from "replicate";
const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
  {
    input: {
      top_p: 0,
      prompt: "Please directly answer the questions in the user's speech",
      input_audio: "https://replicate.delivery/pbxt/LfbWz5nAdlqDatmo2feweGHjcVyJHdQhqZYRNHqfJ7EyKxXa/helpful_base_1.wav",
      temperature: 0,
      max_new_tokens: 256
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:
pip install replicate
Import the client:
import replicate
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
    input={
        "top_p": 0,
        "prompt": "Please directly answer the questions in the user's speech",
        "input_audio": "https://replicate.delivery/pbxt/LfbWz5nAdlqDatmo2feweGHjcVyJHdQhqZYRNHqfJ7EyKxXa/helpful_base_1.wav",
        "temperature": 0,
        "max_new_tokens": 256
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
    "input": {
      "top_p": 0,
      "prompt": "Please directly answer the questions in the user\'s speech",
      "input_audio": "https://replicate.delivery/pbxt/LfbWz5nAdlqDatmo2feweGHjcVyJHdQhqZYRNHqfJ7EyKxXa/helpful_base_1.wav",
      "temperature": 0,
      "max_new_tokens": 256
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
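The same request can be issued from Python without the client library. This sketch mirrors the curl example above, including the `Prefer: wait` header that asks the API to block until the prediction finishes; the actual network call is left commented out so the request construction can be inspected on its own:

```python
import json
import os
import urllib.request

API_URL = "https://api.replicate.com/v1/predictions"
VERSION = "36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351"

def build_request(input_audio: str, prompt: str) -> urllib.request.Request:
    """Assemble the same POST the curl example sends."""
    payload = {
        "version": VERSION,
        "input": {
            "top_p": 0,
            "prompt": prompt,
            "input_audio": input_audio,
            "temperature": 0,
            "max_new_tokens": 256,
        },
    }
    headers = {
        "Authorization": f"Bearer {os.environ.get('REPLICATE_API_TOKEN', '')}",
        "Content-Type": "application/json",
        "Prefer": "wait",  # hold the connection open until the prediction completes
    }
    return urllib.request.Request(
        API_URL, data=json.dumps(payload).encode(), headers=headers, method="POST"
    )

req = build_request(
    "https://replicate.delivery/pbxt/LfbWz5nAdlqDatmo2feweGHjcVyJHdQhqZYRNHqfJ7EyKxXa/helpful_base_1.wav",
    "Please directly answer the questions in the user's speech",
)
# To actually run it (needs a valid REPLICATE_API_TOKEN and network access):
# with urllib.request.urlopen(req) as resp:
#     prediction = json.load(resp)
```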
Output
text: The origin of US state names is varied, but most were named by European explorers and settlers. Many were named after Native American tribes, Spanish and Mexican cities, or royal figures. Some states were also named after natural features, like rivers or mountains.
audio: https://replicate.delivery/pbxt/A1reTPbzlkQiCKrfbuUyI78O1wWDkNPnQPkL9HGjUN7wklfmA/out.wav
{ "completed_at": "2024-09-22T23:24:34.030323Z", "created_at": "2024-09-22T23:21:29.223000Z", "data_removed": false, "error": null, "id": "7pzb1229gxrgg0cj3g5bzdg0dw", "input": { "top_p": 0, "prompt": "Please directly answer the questions in the user's speech", "input_audio": "https://replicate.delivery/pbxt/LfbWz5nAdlqDatmo2feweGHjcVyJHdQhqZYRNHqfJ7EyKxXa/helpful_base_1.wav", "temperature": 0, "max_new_tokens": 256 }, "logs": "/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/generation/configuration_utils.py:567: UserWarning: `do_sample` is set to `False`. However, `temperature` is set to `0.0` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `temperature`.\nwarnings.warn(\nThe attention layers in this model are transitioning from computing the RoPE embeddings internally through `position_ids` (2D tensor with the indexes of the tokens), to using externally computed `position_embeddings` (Tuple of tensors, containing cos and sin). 
In v4.45 `position_ids` will be removed and `position_embeddings` will be mandatory.\nThe origin of US state names is varied, but most were named by European explorers and settlers. Many were named after Native American tribes, Spanish and Mexican cities, or royal figures. Some states were also named after natural features, like rivers or mountains.\noutput_units: 202 393 946 215 406 538 187 594 908 246 466 503 523 705 11 283 488 620 352 931 932 148 258 436 139 340 483 384 879 70 32 835 683 67 589 702 576 822 89 194 664 506 29 116 281 428 822 89 194 627 545 711 510 169 237 865 641 124 243 526 384 249 466 405 53 664 555 208 417 755 237 193 128 665 547 833 368 945 29 73 324 789 6 908 380 828 835 67 940 118 243 935 101 741 663 575 116 281 428 822 89 194 627 208 944 833 368 837 81 664 258 436 573 391 24 870 188 485 841 488 620 352 487 219 522 589 126 712 593 592 103 466 663 969 198 711 510 362 684 136 912 519 589 26 204 280 576 6 879 185 794 788 402 663 969 198 711 510 297 265 675 237 415 772 497 63 991 162 73 172 871 877 384 879 179 961 207 428 950 321 948 86 787 935 101 741 663 575 116 281 428 822 89 194 627 915 208 944 878 423 27 907 430 70 390 595 600 702 788 663 575 116 281 428 822 89 194 664 539 794 680 910 161 998 885 148 878 565 734 498 172 871 877 384 466 503 487 319 501 137 161 488 352 143 38 777 728 227 647 655 764 837 81 194 664 506 545 711 510 297 265 675 237 307 128 665 780 519 589 126 323 576 384 430 179 961 428 333 432 431 531 362 488 352 136 889 871 384 969 219 522 866 586 314 333 319 990 501 137 333 352 915 912 519 26 204 280 314 333 523 793 50 534 321 948 198 711 510 297 265 675 237 187 594 461 969 156 824 592 103 483 81 327 635 205 521 382 390 479 330 776 333 350 836 74 377 969 377 198 711 510 297 265 675 237 415 772 497 63 780 519 289 26 204 280 668 167 104 896 627 912 519 589 702 874 576 822 89 194 664 506 545 85 510 243 101 663 538 187 594 310 346 540 295 76 614 380 116 281 428 822 89 194 627 915 208 944 907 430 70 958 595 315 702 788 663 575 116 
281 62 384 761 430 70 185 477 728 716 205 521 382 390 479 330 776 485 948 86 539 557 477 74 663 969 198 711 510 265 675 237 307 665 991 781 645 837 81 664 219 107 29 156 824 442 333 998 885 692 154 559 663 969 198 711 510 362 461 969 498 889 338 359 6 761 907 597 816 274 794 75 788 15 377 832 758 545 85 510 297 265 675 237 415 772 497\n<class 'str'>", "metrics": { "predict_time": 18.496706166, "total_time": 184.807323 }, "output": { "text": "The origin of US state names is varied, but most were named by European explorers and settlers. Many were named after Native American tribes, Spanish and Mexican cities, or royal figures. Some states were also named after natural features, like rivers or mountains.", "audio": "https://replicate.delivery/pbxt/A1reTPbzlkQiCKrfbuUyI78O1wWDkNPnQPkL9HGjUN7wklfmA/out.wav" }, "started_at": "2024-09-22T23:24:15.533617Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/7pzb1229gxrgg0cj3g5bzdg0dw", "cancel": "https://api.replicate.com/v1/predictions/7pzb1229gxrgg0cj3g5bzdg0dw/cancel" }, "version": "36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351" }
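The prediction JSON is also a quick way to see where the wall-clock time went: `predict_time` was about 18.5 s while `total_time` was about 184.8 s, so most of the delay happened between `created_at` and `started_at` (queueing and cold start). A small sketch of pulling that apart from the timestamps in the JSON above:

```python
from datetime import datetime

def parse_ts(ts: str) -> datetime:
    # Replicate timestamps look like "2024-09-22T23:21:29.223000Z";
    # swap the trailing Z for an explicit UTC offset for fromisoformat().
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

pred = {  # copied from the prediction JSON above
    "created_at": "2024-09-22T23:21:29.223000Z",
    "started_at": "2024-09-22T23:24:15.533617Z",
    "completed_at": "2024-09-22T23:24:34.030323Z",
}

queue_s = (parse_ts(pred["started_at"]) - parse_ts(pred["created_at"])).total_seconds()
run_s = (parse_ts(pred["completed_at"]) - parse_ts(pred["started_at"])).total_seconds()
# run_s matches the reported predict_time of ~18.5 s; queue_s is ~166 s.
```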
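The `output_units` field in the logs is the model's discrete speech-unit stream, the integer sequence LLaMA-Omni's unit-based vocoder converts into the waveform, emitted as one space-separated string. A minimal parsing sketch, using a short excerpt of the first prediction's stream (variable names are illustrative):

```python
# First 16 tokens of the output_units string from the logs above.
raw_units = "202 393 946 215 406 538 187 594 908 246 466 503 523 705 11 283"

# Each token is an integer unit ID; the full stream for this prediction
# runs to several hundred units, roughly one per short frame of audio.
units = [int(tok) for tok in raw_units.split()]
```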
Prediction
Model: ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351
ID: f2kxbr66tsrgg0cj3g8t30dcyc
Status: Succeeded
Source: Web
Hardware: A40 (Large)
Input
- top_p: 0
- prompt: Please directly answer the questions in the user's speech
- input_audio: https://replicate.delivery/pbxt/LfbejSnvb4F4huGzzV2mZSGDGKXV5JC5axoH7iPkpeLtyz7g/helpful_base_5.wav
- temperature: 0
- max_new_tokens: 256
{
  "top_p": 0,
  "prompt": "Please directly answer the questions in the user's speech",
  "input_audio": "https://replicate.delivery/pbxt/LfbejSnvb4F4huGzzV2mZSGDGKXV5JC5axoH7iPkpeLtyz7g/helpful_base_5.wav",
  "temperature": 0,
  "max_new_tokens": 256
}
Install Replicate’s Node.js client library:
npm install replicate
Import and set up the client:
import Replicate from "replicate";
const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
  {
    input: {
      top_p: 0,
      prompt: "Please directly answer the questions in the user's speech",
      input_audio: "https://replicate.delivery/pbxt/LfbejSnvb4F4huGzzV2mZSGDGKXV5JC5axoH7iPkpeLtyz7g/helpful_base_5.wav",
      temperature: 0,
      max_new_tokens: 256
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:
pip install replicate
Import the client:
import replicate
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
    input={
        "top_p": 0,
        "prompt": "Please directly answer the questions in the user's speech",
        "input_audio": "https://replicate.delivery/pbxt/LfbejSnvb4F4huGzzV2mZSGDGKXV5JC5axoH7iPkpeLtyz7g/helpful_base_5.wav",
        "temperature": 0,
        "max_new_tokens": 256
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
    "input": {
      "top_p": 0,
      "prompt": "Please directly answer the questions in the user\'s speech",
      "input_audio": "https://replicate.delivery/pbxt/LfbejSnvb4F4huGzzV2mZSGDGKXV5JC5axoH7iPkpeLtyz7g/helpful_base_5.wav",
      "temperature": 0,
      "max_new_tokens": 256
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
Output
text: To dice without slivering your finger, place the vegetable on a stable cutting board, hold the knife at a forty-five degree angle, and make gentle, rocking motions to chop the vegetable into small cubes. Keep your fingers curled under and out of the way.
audio: https://replicate.delivery/pbxt/vPRD6GIO2oobChMaTd1po1BQKTB1WK4f0ULquPnewvL0plfmA/out.wav
{ "completed_at": "2024-09-22T23:29:57.269611Z", "created_at": "2024-09-22T23:29:40.054000Z", "data_removed": false, "error": null, "id": "f2kxbr66tsrgg0cj3g8t30dcyc", "input": { "top_p": 0, "prompt": "Please directly answer the questions in the user's speech", "input_audio": "https://replicate.delivery/pbxt/LfbejSnvb4F4huGzzV2mZSGDGKXV5JC5axoH7iPkpeLtyz7g/helpful_base_5.wav", "temperature": 0, "max_new_tokens": 256 }, "logs": "/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\nTo dice without slivering your finger, place the vegetable on a stable cutting board, hold the knife at a forty-five degree angle, and make gentle, rocking motions to chop the vegetable into small cubes. 
Keep your fingers curled under and out of the way.\noutput_units: 79 868 220 196 721 549 238 462 104 837 81 664 32 835 67 510 935 271 523 196 921 238 761 597 506 29 85 85 589 26 126 593 645 453 466 398 212 455 258 436 663 969 390 479 330 776 333 212 455 350 836 663 969 524 726 44 605 63 665 213 260 712 593 822 89 194 664 32 835 67 940 884 393 946 734 692 526 559 384 523 705 431 884 702 788 59 790 716 205 521 267 538 187 594 493 361 931 565 734 742 98 519 589 600 702 15 822 89 194 885 790 716 205 521 867 45 914 445 469 167 104 70 185 794 398 212 455 143 290 978 592 103 969 660 555 208 944 755 44 605 63 665 991 821 908 693 521 555 29 202 393 946 734 575 116 281 62 462 104 837 81 664 835 67 66 423 173 945 944 565 734 390 479 330 435 592 103 660 351 794 680 910 321 948 86 390 479 330 776 167 655 837 81 885 148 29 721 250 161 487 319 263 416 426 56 485 321 948 633 406 384 879 488 816 325 350 836 716 205 521 916 44 115 193 111 63 665 644 254 823 175 684 136 143 889 172 871 877 822 89 664 219 107 522 705 11 576 384 879 443 93 274 794 75 788 716 205 521 524 44 902 752 63 665 991 162 156 824 655 104 246 70 185 501 398 212 455 889 324 789 6 908 380 828 817 146 283 832 758 545 711 510 297 884 79 868 220 105 914 531 668 167 104 404 876 29 202 393 946 734 692 526 559 879 702 788 790 716 205 521 267 25 771 46 812 222 274 79 799 220 742 519 589 337 126 324 789 592 103 521 382 45 914 445 137 436 139 340 846 611 545 85 510 297 265 675 237 415 772 497 63 662 914 445 476 534 485 974 86 539 876 233 258 436 635 663 969 390 479 330 776 6 333 212 455 350 836 663 198 711 510 337 850 914 119 607 663 693 205 521 555 208 944 878 167 650 816 325 801 549 663 969 44 823 175 684 136 944 27 761 907 597 660 944 366 148 202 393 946 734 787 935 101 741 822 89 377 458 584 902 736 341 661 497\n<class 'str'>", "metrics": { "predict_time": 17.116064754, "total_time": 17.215611 }, "output": { "text": "To dice without slivering your finger, place the vegetable on a stable cutting board, hold the knife at 
a forty-five degree angle, and make gentle, rocking motions to chop the vegetable into small cubes. Keep your fingers curled under and out of the way.", "audio": "https://replicate.delivery/pbxt/vPRD6GIO2oobChMaTd1po1BQKTB1WK4f0ULquPnewvL0plfmA/out.wav" }, "started_at": "2024-09-22T23:29:40.153546Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/f2kxbr66tsrgg0cj3g8t30dcyc", "cancel": "https://api.replicate.com/v1/predictions/f2kxbr66tsrgg0cj3g8t30dcyc/cancel" }, "version": "36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351" }
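Comparing `metrics` across the first two predictions shows the effect of a warm container: the first request carried roughly 166 s of queue and boot overhead, while this one has `total_time` nearly equal to `predict_time`. A quick check using the values from the two JSON blobs above:

```python
# metrics copied from the two prediction JSONs above
cold = {"predict_time": 18.496706166, "total_time": 184.807323}  # first prediction
warm = {"predict_time": 17.116064754, "total_time": 17.215611}   # this prediction

def overhead(metrics: dict) -> float:
    """Time spent outside the model itself (queueing, container start)."""
    return metrics["total_time"] - metrics["predict_time"]

# The first call spent ~166 s before inference began;
# the warm call's overhead is well under a second.
```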
Prediction
Model: ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351
ID: h37rz8yhxsrgj0cj3ga8njq4gm
Status: Succeeded
Source: Web
Hardware: A40 (Large)
Input
- top_p: 0
- prompt: Please directly answer the questions in the user's speech
- input_audio: https://replicate.delivery/pbxt/LfbhsklJg0LQqTpm3AIM0YQqP5p30J4BDTJMq4HFumAVUW3U/vicuna_4.wav
- temperature: 0
- max_new_tokens: 256
{
  "top_p": 0,
  "prompt": "Please directly answer the questions in the user's speech",
  "input_audio": "https://replicate.delivery/pbxt/LfbhsklJg0LQqTpm3AIM0YQqP5p30J4BDTJMq4HFumAVUW3U/vicuna_4.wav",
  "temperature": 0,
  "max_new_tokens": 256
}
Install Replicate’s Node.js client library:
npm install replicate
Import and set up the client:
import Replicate from "replicate";
const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
  {
    input: {
      top_p: 0,
      prompt: "Please directly answer the questions in the user's speech",
      input_audio: "https://replicate.delivery/pbxt/LfbhsklJg0LQqTpm3AIM0YQqP5p30J4BDTJMq4HFumAVUW3U/vicuna_4.wav",
      temperature: 0,
      max_new_tokens: 256
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:
pip install replicate
Import the client:
import replicate
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
    input={
        "top_p": 0,
        "prompt": "Please directly answer the questions in the user's speech",
        "input_audio": "https://replicate.delivery/pbxt/LfbhsklJg0LQqTpm3AIM0YQqP5p30J4BDTJMq4HFumAVUW3U/vicuna_4.wav",
        "temperature": 0,
        "max_new_tokens": 256
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
    "input": {
      "top_p": 0,
      "prompt": "Please directly answer the questions in the user\'s speech",
      "input_audio": "https://replicate.delivery/pbxt/LfbhsklJg0LQqTpm3AIM0YQqP5p30J4BDTJMq4HFumAVUW3U/vicuna_4.wav",
      "temperature": 0,
      "max_new_tokens": 256
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
Output
text: To increase productivity while working from home, create a dedicated workspace, set a schedule, minimize distractions, and take regular breaks. Stay focused and prioritize tasks to achieve more in less time.
audio: https://replicate.delivery/pbxt/jXGGf9uias32ISB9e7sCxSZQXgHYmdcXjLMXPe4DSkT2ZLfNB/out.wav
{ "completed_at": "2024-09-22T23:33:16.222087Z", "created_at": "2024-09-22T23:32:59.502000Z", "data_removed": false, "error": null, "id": "h37rz8yhxsrgj0cj3ga8njq4gm", "input": { "top_p": 0, "prompt": "Please directly answer the questions in the user's speech", "input_audio": "https://replicate.delivery/pbxt/LfbhsklJg0LQqTpm3AIM0YQqP5p30J4BDTJMq4HFumAVUW3U/vicuna_4.wav", "temperature": 0, "max_new_tokens": 256 }, "logs": "/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\nTo increase productivity while working from home, create a dedicated workspace, set a schedule, minimize distractions, and take regular breaks. 
Stay focused and prioritize tasks to achieve more in less time.\noutput_units: 79 868 220 483 328 409 914 468 678 485 948 32 835 67 337 243 850 973 288 750 104 901 921 503 487 219 952 471 737 333 234 523 50 53 321 458 787 935 101 741 837 81 693 521 787 935 101 741 969 660 351 501 398 212 455 143 390 515 647 366 896 627 470 821 908 380 896 627 168 343 44 115 121 111 128 914 119 678 485 113 327 822 89 194 664 506 944 423 565 734 196 537 721 250 384 879 166 250 161 487 319 501 137 822 89 194 664 599 161 523 555 233 787 935 101 741 969 934 219 107 85 519 589 337 126 323 576 822 89 194 664 835 67 297 265 675 237 307 128 665 780 519 26 204 280 576 384 879 945 944 878 423 565 742 98 519 589 126 137 576 384 879 523 196 166 705 11 74 716 205 521 524 44 115 752 111 128 665 991 162 73 172 871 333 179 961 428 754 748 872 336 359 655 837 81 198 711 124 884 250 333 432 742 170 589 600 728 647 167 761 430 70 219 727 146 283 832 758 711 510 265 675 237 823 175 684 136 38 105 244 583 576 822 89 194 664 990 107 233 156 824 442 879 487 350 836 74 228 259 453 402 663 969 870 290 978 647 822 89 194 664 990 107 545 85 297 265 675 237 415 772 497 780 589 702 576 822 89 194 664 958 390 479 330 776 435 908 380 382 268 501 860 137 333 32 835 683 67 940 118 64 944 175 684 136 481 973 288 796 750 837 81 327 635 894 764 466 969 523 185 403 75 583 874 167 655 837 81 198 835 683 67 940 884 244 583 576 761 907 430 70 835 67 940 118 107 545 85 297 884 79 868 220 483 734 793 105 326 531 976 534 485 321 948 86 885 692 148 243 889 324 826 592 103 969 25 771 46 812 222 915 781 645 384 879 70 835 67 940 884 244 583 874 167 655 764 837 81 377 627 385 584 902 415 497\n<class 'str'>", "metrics": { "predict_time": 16.621842295, "total_time": 16.720087 }, "output": { "text": "To increase productivity while working from home, create a dedicated workspace, set a schedule, minimize distractions, and take regular breaks. 
Stay focused and prioritize tasks to achieve more in less time.", "audio": "https://replicate.delivery/pbxt/jXGGf9uias32ISB9e7sCxSZQXgHYmdcXjLMXPe4DSkT2ZLfNB/out.wav" }, "started_at": "2024-09-22T23:32:59.600245Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/h37rz8yhxsrgj0cj3ga8njq4gm", "cancel": "https://api.replicate.com/v1/predictions/h37rz8yhxsrgj0cj3ga8njq4gm/cancel" }, "version": "36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351" }
Prediction
ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351
ID: 7kxhnhz2v5rgm0cj3gar5s5qc8
Status: Succeeded
Source: Web
Hardware: A40 (Large)
Input
- top_p
- 0.75
- prompt
- Please directly answer the questions in the user's speech
- input_audio
- https://replicate.delivery/pbxt/LfbiyaVvY3JawrMfmfIHCYB1NFrOmYRth8MHJfENjZ4Q19wQ/vicuna_2.wav
- temperature
- 0.5
- max_new_tokens
- 256
{ "top_p": 0.75, "prompt": "Please directly answer the questions in the user's speech", "input_audio": "https://replicate.delivery/pbxt/LfbiyaVvY3JawrMfmfIHCYB1NFrOmYRth8MHJfENjZ4Q19wQ/vicuna_2.wav", "temperature": 0.5, "max_new_tokens": 256 }
Install Replicate’s Node.js client library:npm install replicate
Import and set up the client:
import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
  {
    input: {
      top_p: 0.75,
      prompt: "Please directly answer the questions in the user's speech",
      input_audio: "https://replicate.delivery/pbxt/LfbiyaVvY3JawrMfmfIHCYB1NFrOmYRth8MHJfENjZ4Q19wQ/vicuna_2.wav",
      temperature: 0.5,
      max_new_tokens: 256
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:pip install replicate
Import the client:import replicate
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "ictnlp/llama-omni:36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
    input={
        "top_p": 0.75,
        "prompt": "Please directly answer the questions in the user's speech",
        "input_audio": "https://replicate.delivery/pbxt/LfbiyaVvY3JawrMfmfIHCYB1NFrOmYRth8MHJfENjZ4Q19wQ/vicuna_2.wav",
        "temperature": 0.5,
        "max_new_tokens": 256
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
Run ictnlp/llama-omni using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351",
    "input": {
      "top_p": 0.75,
      "prompt": "Please directly answer the questions in the user\'s speech",
      "input_audio": "https://replicate.delivery/pbxt/LfbiyaVvY3JawrMfmfIHCYB1NFrOmYRth8MHJfENjZ4Q19wQ/vicuna_2.wav",
      "temperature": 0.5,
      "max_new_tokens": 256
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
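When a prediction is created over the raw HTTP API without `Prefer: wait`, the response JSON includes `urls.get` and `urls.cancel` endpoints for checking on it. A minimal polling sketch using only the Python standard library (it assumes `REPLICATE_API_TOKEN` is set in the environment; the function names are illustrative, not part of Replicate's client):

```python
import json
import os
import time
import urllib.request

# Statuses after which a prediction will no longer change
TERMINAL_STATUSES = {"succeeded", "failed", "canceled"}


def is_terminal(status: str) -> bool:
    """Return True once a prediction has reached a final status."""
    return status in TERMINAL_STATUSES


def poll_prediction(get_url: str, interval: float = 1.0) -> dict:
    """Fetch `urls.get` repeatedly until the prediction finishes."""
    headers = {"Authorization": "Bearer " + os.environ["REPLICATE_API_TOKEN"]}
    while True:
        req = urllib.request.Request(get_url, headers=headers)
        with urllib.request.urlopen(req) as resp:
            prediction = json.load(resp)
        if is_terminal(prediction["status"]):
            return prediction
        time.sleep(interval)
```

The official `replicate` clients handle this loop for you; the sketch is only meant to show what the `urls` block in the prediction records below is for.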
Output
text
To deal with stress, try deep breathing exercises, meditation, and physical activity like yoga or a brisk walk. Also, prioritize tasks, set realistic goals, and take regular breaks to help manage stress.
audio
https://replicate.delivery/pbxt/ZoR9ogxyh8ZUCdeFQxJpnSJrsX2DoSAaCaKf50BDCN0AulfmA/out.wav
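The `audio` field of a succeeded prediction is a plain HTTPS URL to the generated WAV file, so it can be fetched with the standard library alone. A small sketch (the helper names and the choice to reuse the URL's last path segment as the local filename are my own, not part of the API):

```python
import posixpath
import urllib.request
from urllib.parse import urlparse


def audio_filename(url: str) -> str:
    """Derive a local filename from the last path segment of the audio URL."""
    return posixpath.basename(urlparse(url).path)


def download_audio(url: str) -> str:
    """Fetch the generated speech into the current directory; return the path."""
    dest = audio_filename(url)
    urllib.request.urlretrieve(url, dest)  # network call: download the WAV
    return dest
```

For the output above, `download_audio(...)` would save the file as `out.wav`.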
{ "completed_at": "2024-09-22T23:34:25.923573Z", "created_at": "2024-09-22T23:34:09.369000Z", "data_removed": false, "error": null, "id": "7kxhnhz2v5rgm0cj3gar5s5qc8", "input": { "top_p": 0.75, "prompt": "Please directly answer the questions in the user's speech", "input_audio": "https://replicate.delivery/pbxt/LfbiyaVvY3JawrMfmfIHCYB1NFrOmYRth8MHJfENjZ4Q19wQ/vicuna_2.wav", "temperature": 0.5, "max_new_tokens": 256 }, "logs": "/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\nTraceback (most recent call last):\nFile \"<string>\", line 1, in <module>\nFile \"/root/.pyenv/versions/3.10.15/lib/python3.10/multiprocessing/spawn.py\", line 116, in spawn_main\nexitcode = _main(fd, parent_sentinel)\nFile \"/root/.pyenv/versions/3.10.15/lib/python3.10/multiprocessing/spawn.py\", line 126, in _main\nself = reduction.pickle.load(from_parent)\n_pickle.UnpicklingError: pickle data was truncated\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.\nwarnings.warn(\n/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. 
Use `HF_HOME` instead.\nwarnings.warn(\nTo deal with stress, try deep breathing exercises, meditation, and physical activity like yoga or a brisk walk. Also, prioritize tasks, set realistic goals, and take regular breaks to help manage stress.\noutput_units: 79 868 196 721 250 485 948 284 327 635 205 521 787 935 271 333 523 918 85 519 589 600 702 728 647 576 384 879 70 835 67 297 265 675 237 63 665 662 777 227 647 655 837 81 664 555 537 721 250 485 948 86 539 251 876 561 213 973 288 824 56 485 321 948 86 166 398 212 455 406 25 423 384 879 219 522 586 362 59 432 924 866 586 668 167 655 837 81 198 347 376 955 53 198 711 510 297 265 675 237 307 128 665 991 73 172 871 877 384 879 196 166 503 161 523 793 403 794 244 583 15 822 89 194 664 817 146 283 352 385 343 44 902 752 63 644 254 823 175 684 136 233 233 390 479 422 330 776 333 873 347 975 59 319 263 501 716 205 521 453 384 761 879 70 219 952 315 471 737 910 333 234 161 523 50 910 321 948 86 781 645 655 837 81 664 990 107 233 258 436 635 803 791 894 382 350 836 494 87 513 296 714 187 594 461 969 565 734 870 290 978 824 333 32 835 683 67 940 337 126 107 233 535 935 101 387 741 908 246 764 611 916 506 107 417 755 193 415 772 497 63 644 254 594 310 540 295 76 614 803 791 52 524 44 63 665 662 213 973 288 796 750 104 837 81 327 635 579 764 466 969 663 523 185 403 75 583 874 167 655 837 81 198 711 683 940 884 244 583 576 761 907 430 70 835 67 940 118 545 85 297 265 675 237 63 665 780 519 26 204 280 576 384 879 945 185 233 156 824 442 56 284 327 635 259 303 333 432 32 683 940 118 702 15 333 487 990 29 561 537 416 836 908 693 205 521 711 510 297 675 237 823 175 684 136 38 105 244 583 576 822 89 664 990 107 233 156 824 384 879 487 350 836 74 228 259 453 402 663 969 870 251 290 978 426 647 822 89 194 664 990 107 545 85 297 884 79 868 220 470 821 576 693 268 876 29 73 889 338 359 877 384 761 907 430 933 179 961 428 333 523 555 705 519 589 600 702 728 647 576 384 879 377 70 835 67 297 265 675 415 772 497\n<class 'str'>", 
"metrics": { "predict_time": 16.456674207, "total_time": 16.554573 }, "output": { "text": "To deal with stress, try deep breathing exercises, meditation, and physical activity like yoga or a brisk walk. Also, prioritize tasks, set realistic goals, and take regular breaks to help manage stress.", "audio": "https://replicate.delivery/pbxt/ZoR9ogxyh8ZUCdeFQxJpnSJrsX2DoSAaCaKf50BDCN0AulfmA/out.wav" }, "started_at": "2024-09-22T23:34:09.466898Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/7kxhnhz2v5rgm0cj3gar5s5qc8", "cancel": "https://api.replicate.com/v1/predictions/7kxhnhz2v5rgm0cj3gar5s5qc8/cancel" }, "version": "36c9bcf70a56f40d9a27445c30c769308b18180296749f86ec9b682baf7ad351" }