After deploying the model with vLLM, the response is only "!"

#1 by richbinbin - opened

After deploying the service with vLLM, the model only returns "!".
[screenshots attached]

OpenGuardrails org

Thanks for the report! Could you please share more details about your deployment setup?
Many users have successfully deployed this with vLLM, so with a bit more information we should be able to help identify what’s going wrong in your case.
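In the meantime, here is a minimal sketch of the kind of request we'd expect to work against a vLLM OpenAI-compatible server. The served model id, port, and prompt below are placeholders, not your actual configuration, so adjust them to match what you deployed. If even this minimal request returns only "!", the issue is more likely on the serving side (e.g. dtype, quantization, or chat template settings) than in the client code.

```python
# Minimal sketch of querying a vLLM OpenAI-compatible endpoint.
# Assumptions: server started with something like
#   vllm serve OpenGuardrails/OpenGuardrails-Text --port 8000
# where the model id and port are placeholders for your actual setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # default vLLM OpenAI-compatible endpoint
    api_key="EMPTY",                      # vLLM accepts any key unless one is configured
)

response = client.chat.completions.create(
    model="OpenGuardrails/OpenGuardrails-Text",  # placeholder: use the exact model id you served
    messages=[{"role": "user", "content": "Hello, can you respond with a full sentence?"}],
    max_tokens=64,
)

# Print the raw text so truncated or degenerate outputs (like a single "!") are visible.
print(repr(response.choices[0].message.content))
```

If you can share the exact `vllm serve` command (including dtype/quantization flags), the vLLM version, and the request payload you used, that would help us reproduce the problem.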
