
[Bug] DeepSeek R1 model used via OpenRouter does not display the reasoning process #5860

Closed
zhufeizzz opened this issue Feb 7, 2025 · 8 comments
Labels: 🐛 Bug Something isn't working | 缺陷

Comments

@zhufeizzz

📦 Deployment environment

Docker

📌 Software version

v1.51.13

💻 System environment

Windows

🌐 Browser

Edge

🐛 Problem description

Is the reasoning process only displayed when DeepSeek is selected as the Provider? That makes the feature nearly useless, since most people now buy third-party API services.

Image

📷 Reproduction steps

No response

🚦 Expected results

No response

📝 Additional information

No response

@zhufeizzz zhufeizzz added the 🐛 Bug Something isn't working | 缺陷 label Feb 7, 2025
@github-project-automation github-project-automation bot moved this to Roadmap - Chat 1.x in Lobe Chat Routine Feb 7, 2025

@lobehubbot
Member

👀 @zhufeizzz

Thank you for raising an issue. We will investigate into the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.


dosubot bot commented Feb 7, 2025

I found a similar open issue regarding the DeepSeek R1 model not displaying the reasoning content when using OpenRouter. You can view the issue here [1]. Unfortunately, no solution has been found in the comments of that issue.

The expected behavior for the DeepSeek R1 model is to handle the 'content' and 'reasoning_content' fields separately from the streaming chunks, ensuring compatibility with the response structure that includes these fields [2].
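
For context, here is a minimal sketch (not LobeChat's actual implementation) of how such streaming deltas might be split into reasoning and answer text. The field names `reasoning_content` and `reasoning` are assumptions based on the DeepSeek-style and OpenRouter-style response shapes discussed in this thread:

```ts
// Minimal sketch: separating reasoning tokens from answer tokens in
// OpenAI-compatible streaming deltas. Field names are assumptions, not
// taken from LobeChat's codebase.
interface StreamDelta {
  content?: string | null;
  reasoning_content?: string | null; // DeepSeek-style reasoning field
  reasoning?: string | null;         // OpenRouter-style reasoning field
}

type DeltaPart = { kind: 'reasoning' | 'content'; text: string };

function splitDelta(delta: StreamDelta): DeltaPart | null {
  const reasoning = delta.reasoning_content ?? delta.reasoning;
  if (reasoning) return { kind: 'reasoning', text: reasoning };
  if (delta.content) return { kind: 'content', text: delta.content };
  return null; // e.g. role-only or empty keep-alive chunks
}
```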


@arvinxx
Contributor

arvinxx commented Feb 7, 2025

OpenRouter itself does not return the thinking process in its responses; this has nothing to do with LobeChat.

@arvinxx arvinxx closed this as completed Feb 7, 2025
@github-project-automation github-project-automation bot moved this from Roadmap - Chat 1.x to Done in Lobe Chat Routine Feb 7, 2025
@lobehubbot
Member

@zhufeizzz

This issue is closed. If you have any questions, you can comment and reply.


@aquanow-jeffen

> OpenRouter itself does not return the thinking process in its responses; this has nothing to do with LobeChat.

Hasn't OpenRouter already provided an API that returns reasoning tokens? https://openrouter.ai/announcements/reasoning-tokens-for-thinking-models
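
For reference, a minimal sketch of requesting reasoning tokens from OpenRouter along the lines of that announcement. The `include_reasoning` flag and the `reasoning` field on the message are assumptions taken from the announcement and may have changed, so verify against OpenRouter's current API reference before relying on them:

```ts
// Hypothetical sketch of an OpenRouter chat completion request that opts in
// to reasoning tokens. Flag and field names follow the linked announcement
// and are not guaranteed to match the current API.
const response = await fetch('https://openrouter.ai/api/v1/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'deepseek/deepseek-r1',
    messages: [{ role: 'user', content: 'Why is the sky blue?' }],
    include_reasoning: true, // assumption: opt-in flag from the announcement
  }),
});

const data = await response.json();
console.log(data.choices[0].message.reasoning); // reasoning tokens, if returned
console.log(data.choices[0].message.content);   // final answer
```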

@deephbz
Contributor

deephbz commented Feb 9, 2025

@arvinxx OpenRouter supports reasoning outputs now. This should be solved by #5903; I've tested it.
