Bug
In `packages/openai-adapters/src/vercelStreamConverter.ts` (lines 88-92), the `reasoning-delta` case maps to `content` instead of `reasoning_content`:

```ts
case "reasoning-delta":
  return chatChunk({
    content: part.text, // Should be reasoning_content
    model,
  });
```

This causes reasoning output to be mixed into assistant content for any provider using the Vercel streaming adapter, breaking the separation that `fromChatCompletionChunk()` in `core/llm/openaiTypeConverters.ts` relies on to route `reasoning_content` → `{ role: "thinking" }` messages.
Impact
Models served through Vercel-compatible adapters will have their reasoning content appear as regular assistant text. The GUI's thinking/content separation won't work for these providers.
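For illustration, the downstream separation that breaks can be modeled roughly like this; the types and `routeDelta` function are assumptions sketching the behavior of `fromChatCompletionChunk()`, not the actual `core/llm` code:

```typescript
type Delta = { content?: string; reasoning_content?: string };
type Message = { role: "assistant" | "thinking"; content: string };

// Rough model of the consumer-side routing: reasoning_content
// becomes a "thinking" message, plain content stays "assistant".
function routeDelta(delta: Delta): Message[] {
  const out: Message[] = [];
  if (delta.reasoning_content) {
    out.push({ role: "thinking", content: delta.reasoning_content });
  }
  if (delta.content) {
    out.push({ role: "assistant", content: delta.content });
  }
  return out;
}

// With the converter bug, reasoning text arrives under `content`,
// so it is emitted as assistant text and the GUI never sees a
// "thinking" message to render separately.
```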
Context
Found while tracing the `reasoning_content` handling path through the codebase to clarify ownership of #10783 and #10785. Those issues are upstream (LM Studio's `content` field contamination when its Harmony parser's phase tracking fails — lmstudio-ai/lmstudio-bug-tracker#1589, lmstudio-ai/lmstudio-bug-tracker#1592). Continue's OpenAI adapter layer correctly maps what it receives. This Vercel converter bug is a separate issue on a different adapter path.