Replies: 2 comments
- for example:
- Could anyone help?
- I do not want to download the model to a local path. How do I set --model to an online model name when benchmarking a remote server?
python3 ~/vllm/benchmarks/benchmark_serving.py --backend vllm \
  --model ~/deepseek-R1 --port 8000 \
  --dataset-name random \
  --random-input 1234 \
  --random-output 2345 \
  --random-range-ratio 0.8 \
  --dataset-path ~/ShareGPT_V3_unfiltered_cleaned_split.json \
  --max-concurrency 16 \
  --num-prompts 64
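Concretely, is something like the sketch below enough? This assumes --model can take a Hugging Face Hub ID (e.g. deepseek-ai/DeepSeek-R1) directly, and that the benchmark client only pulls the tokenizer from the Hub while the model weights stay on the serving side; I have not verified either point.

python3 ~/vllm/benchmarks/benchmark_serving.py --backend vllm \
  --model deepseek-ai/DeepSeek-R1 --port 8000 \
  --dataset-name random \
  --random-input 1234 \
  --random-output 2345 \
  --random-range-ratio 0.8 \
  --dataset-path ~/ShareGPT_V3_unfiltered_cleaned_split.json \
  --max-concurrency 16 \
  --num-prompts 64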