vllm error #15
Hi! Here's how I initialized the Python environment (copied from Qwen2-VL's official repo):
Could you try these and see whether they solve your issue?
Also, feel free to reach out to us if you encounter any further problems.
I also had some trouble trying to get the grounding code working. In particular, I ran into this bug: `assert "factor" in rope_scaling`. I ended up:

- creating an environment with Python 3.12: `conda create -n ugroundv1 python=3.12 -y`
- setting up a Python-only build as suggested in the vllm docs: `git clone https://github.com/vllm-project/vllm.git`
- only after that, pip installing `transformers`, `accelerate`, and `qwen-vl-utils[decord]`

Since I have vllm in my filesystem, I replaced it. I also had to manually set `max_model_len` to 5000 when calling `LLM(...)`, as my machine (an A5000 with 24 GB) was too small for the full input size!
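For reference, the steps described above can be sketched as a setup script. This is a sketch under the assumptions stated in the comment; the `VLLM_USE_PRECOMPILED=1` editable install is the Python-only build flow from the vLLM docs, and exact package versions may differ on your machine:

```shell
# Create an isolated environment with Python 3.12, as suggested above
conda create -n ugroundv1 python=3.12 -y
conda activate ugroundv1

# Python-only build of vLLM (no CUDA compilation), per the vLLM docs
git clone https://github.com/vllm-project/vllm.git
cd vllm
VLLM_USE_PRECOMPILED=1 pip install -e .

# Only after that, install the remaining dependencies
pip install transformers accelerate "qwen-vl-utils[decord]"
```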
Hello, thank you for your reply. The problem has been solved!
The following error occurred when running grounding/uground_qwen2vl.py:
```
/python3.10/site-packages/vllm/model_executor/layers/rotary_embedding.py", line 1003, in get_rope
[rank0]:     raise ValueError(f"Unknown RoPE scaling type {scaling_type}")
[rank0]: ValueError: Unknown RoPE scaling type default
```
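One workaround that has been reported for this family of `rope_scaling` mismatches is to edit the checkpoint's `config.json` so the `rope_scaling` entry uses the type string the installed vLLM expects. Whether this applies here depends on your exact transformers/vLLM versions; the helper below is a hypothetical sketch that renames a `"default"` type to `"mrope"` in an in-memory config dict, not an official fix:

```python
import json


def patch_rope_scaling(config: dict) -> dict:
    """Rename a 'default' RoPE scaling type to 'mrope' in a Qwen2-VL-style
    config dict. This mirrors a user-reported workaround; the key your
    installed vLLM actually expects may differ by version."""
    rope = config.get("rope_scaling")
    if rope and rope.get("type") == "default":
        rope["type"] = "mrope"  # assumption: this vLLM build understands 'mrope'
    return config


# Example on an in-memory config; a real run would read and rewrite
# the model's config.json on disk instead.
cfg = {"rope_scaling": {"type": "default", "mrope_section": [16, 24, 24]}}
print(json.dumps(patch_rope_scaling(cfg)["rope_scaling"]))
```

After editing `config.json`, a restart of `vllm serve` is needed for the change to take effect. Upgrading vLLM and transformers to versions that ship Qwen2-VL support is the cleaner fix when possible.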
The same error occurred when running `vllm serve osunlp/UGround-V1-7B --api-key token-abc123 --dtype float16`.
Can you provide a possible solution? Thank you!