Vercel stream converter maps reasoning-delta to content instead of reasoning_content #11069

@shanevcantwell

Description

Bug

In packages/openai-adapters/src/vercelStreamConverter.ts lines 88-92, the reasoning-delta case maps to content instead of reasoning_content:

```ts
case "reasoning-delta":
  return chatChunk({
    content: part.text,  // should be reasoning_content
    model,
  });
```
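A minimal sketch of the corrected mapping. `chatChunk` and the part/chunk shapes below are simplified stand-ins for illustration, not the real adapter types:

```typescript
// Simplified stand-ins for the Vercel stream part and OpenAI-style delta shapes.
type StreamPart =
  | { type: "text-delta"; text: string }
  | { type: "reasoning-delta"; text: string };

interface Delta {
  content?: string;
  reasoning_content?: string;
}

// Hypothetical stand-in for the adapter's chatChunk helper: wraps a delta
// into an OpenAI-style streaming chunk.
function chatChunk(args: {
  content?: string;
  reasoning_content?: string;
  model: string;
}) {
  const delta: Delta = {};
  if (args.content !== undefined) delta.content = args.content;
  if (args.reasoning_content !== undefined)
    delta.reasoning_content = args.reasoning_content;
  return { model: args.model, choices: [{ delta }] };
}

function convert(part: StreamPart, model: string) {
  switch (part.type) {
    case "text-delta":
      return chatChunk({ content: part.text, model });
    case "reasoning-delta":
      // Fix: route reasoning text to reasoning_content so downstream
      // consumers can keep thinking output separate from assistant content.
      return chatChunk({ reasoning_content: part.text, model });
  }
}
```

With this change, `reasoning-delta` parts no longer leak into the assistant `content` field.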

This causes reasoning output to be mixed into assistant content for any provider using the Vercel streaming adapter, breaking the separation that fromChatCompletionChunk() in core/llm/openaiTypeConverters.ts relies on to route reasoning_content into { role: "thinking" } messages.
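For context, a hedged sketch of the downstream routing described above; the real fromChatCompletionChunk() is more involved, and the names here are simplified assumptions:

```typescript
// Simplified delta shape; the real chunk type carries more fields.
interface ChunkDelta {
  content?: string;
  reasoning_content?: string;
}

// Sketch of the separation the converter bug breaks: reasoning_content
// becomes a "thinking" message, content stays an "assistant" message.
function routeDelta(
  delta: ChunkDelta,
): { role: "assistant" | "thinking"; text: string } | null {
  if (delta.reasoning_content !== undefined) {
    return { role: "thinking", text: delta.reasoning_content };
  }
  if (delta.content !== undefined) {
    return { role: "assistant", text: delta.content };
  }
  return null;
}
```

When the Vercel converter puts reasoning text into `content`, this routing can only ever produce `assistant` messages, so the GUI never sees a `thinking` role.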

Impact

Models served through Vercel-compatible adapters will have their reasoning content appear as regular assistant text. The GUI's thinking/content separation won't work for these providers.

Context

Found while tracing the reasoning_content handling path through the codebase to clarify ownership of #10783 and #10785. Those issues are upstream (LM Studio's content field contamination when its Harmony parser's phase tracking fails — lmstudio-ai/lmstudio-bug-tracker#1589, lmstudio-ai/lmstudio-bug-tracker#1592). Continue's OpenAI adapter layer correctly maps what it receives. This Vercel converter bug is a separate issue on a different adapter path.

Metadata

Labels: area:integration, kind:bug
