
feat: add Vercel AI Runner protocol implementation (AIC-2388)#1339

Draft
jsonbailey wants to merge 1 commit into jb/aic-2388/js-langchain-runner from jb/aic-2388/js-vercel-runner

Conversation

@jsonbailey
Contributor

Summary

Adds Runner-protocol classes for the Vercel AI provider, completing the JS Runner migration for the three provider packages. New classes:

  • VercelModelRunner — handles chat completions; run(messages, outputType?) returns RunnerResult { content, metrics, raw, parsed? }. Uses Vercel's generateText for chat and generateObject for structured output. Preserves the v4/v5 token field handling.
  • VercelRunnerFactory — exposes createModel(config). No agent runner is provided because the Vercel AI SDK is a thin model layer rather than an agent framework.
  • convertMessagesToVercel, mapProviderName, mapUsageDataToLDTokenUsage, getAIMetricsFromResponse, getAIMetricsFromStream — helper functions. getAIMetricsFromStream is preserved as-is for the streaming use case.
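The Runner contract described above can be sketched as follows. The field names (content, metrics, raw, parsed) come from this summary; the LDMessage and token-usage shapes, and the stub implementation, are illustrative assumptions rather than the actual SDK types:

```typescript
// Hypothetical sketch of the Runner protocol shapes named in this PR.
// LDMessage/LDTokenUsage are simplified stand-ins for illustration.
interface LDMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface LDTokenUsage {
  total: number;
  input: number;
  output: number;
}

interface RunnerResult<T = unknown> {
  content: string; // assistant text (e.g. from generateText)
  metrics: { usage?: LDTokenUsage };
  raw: unknown; // underlying provider response, passed through untouched
  parsed?: T; // present only when an outputType/schema was supplied
}

interface ModelRunner {
  run<T = unknown>(messages: LDMessage[], outputType?: object): Promise<RunnerResult<T>>;
}

// A stub runner, just to show the call shape:
const stubRunner: ModelRunner = {
  async run(messages) {
    const last = messages[messages.length - 1];
    return {
      content: `echo: ${last.content}`,
      metrics: { usage: { total: 2, input: 1, output: 1 } },
      raw: null,
    };
  },
};
```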

mapProvider is renamed to mapProviderName on both the helper module and the VercelProvider class; the old name is preserved as a @deprecated alias so existing callers continue to work.
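The rename-with-deprecated-alias pattern looks roughly like this. The mapping table below is invented for illustration; only the alias mechanism reflects what the PR describes:

```typescript
// Sketch of renaming a helper while keeping the old name as a
// deprecated alias so existing call sites keep compiling.
// The provider-name table here is illustrative, not the actual mapping.
export function mapProviderName(provider: string): string {
  const aliases: Record<string, string> = {
    'azure-openai': 'openai',
    'amazon-bedrock': 'bedrock',
  };
  const key = provider.toLowerCase();
  return aliases[key] ?? key;
}

/** @deprecated Use {@link mapProviderName} instead. */
export const mapProvider = mapProviderName;
```

Editors that understand JSDoc will strike through `mapProvider` at call sites, nudging callers toward the new name without breaking them.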

The legacy VercelProvider class is preserved so AIProviderFactory (also @deprecated) keeps working until the managed layer fully migrates to Runner.run().

Stacked on jb/aic-2388/js-langchain-runner (JS PR 8).

Test plan

  • yarn workspace @launchdarkly/server-sdk-ai-vercel test — 54 tests pass (existing 34 + 20 new)
  • yarn workspace @launchdarkly/server-sdk-ai-vercel lint — clean
  • yarn workspace @launchdarkly/server-sdk-ai-vercel run build — clean

🤖 Generated with Claude Code

Adds VercelModelRunner and VercelRunnerFactory implementing the Runner
protocol introduced in JS PR 6. The runner returns RunnerResult instead
of the legacy ChatResponse / StructuredResponse and preserves the v4/v5
token field handling already present in VercelProvider.
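The v4/v5 token field handling likely amounts to normalizing two usage shapes: AI SDK v4 reported promptTokens/completionTokens, while v5 renamed these to inputTokens/outputTokens. A rough reconstruction, not the actual implementation:

```typescript
// Illustrative sketch: normalize usage reported by either Vercel AI SDK v4
// (promptTokens/completionTokens) or v5 (inputTokens/outputTokens) into a
// single LD-style shape. Types simplified for the sketch.
interface LDTokenUsage {
  total: number;
  input: number;
  output: number;
}

type VercelUsage = {
  totalTokens?: number;
  promptTokens?: number; // v4
  completionTokens?: number; // v4
  inputTokens?: number; // v5
  outputTokens?: number; // v5
};

function mapUsageDataToLDTokenUsage(usage: VercelUsage): LDTokenUsage {
  const input = usage.inputTokens ?? usage.promptTokens ?? 0;
  const output = usage.outputTokens ?? usage.completionTokens ?? 0;
  return {
    input,
    output,
    total: usage.totalTokens ?? input + output,
  };
}
```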

No agent runner is provided because the Vercel AI SDK is a thin model
layer rather than an agent framework — the existing
getAIMetricsFromStream helper covers the streaming use case and is
exported alongside the new Runner classes.

mapProvider is renamed to mapProviderName on both the helper module and
the VercelProvider class; the old mapProvider name is kept as a
deprecated alias. convertMessagesToVercel is added for parity with the
other provider helpers.
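Since LD config messages are plain { role, content } records, convertMessagesToVercel is presumably close to a direct field mapping onto the SDK's message shape. A hypothetical sketch with simplified stand-in types:

```typescript
// Hypothetical sketch of convertMessagesToVercel. Both types here are
// simplified stand-ins for the real LD and Vercel AI SDK message types.
type LDConfigMessage = {
  role: 'system' | 'user' | 'assistant';
  content: string;
};

type VercelMessage = {
  role: 'system' | 'user' | 'assistant';
  content: string;
};

function convertMessagesToVercel(messages: LDConfigMessage[]): VercelMessage[] {
  return messages.map(({ role, content }) => ({ role, content }));
}
```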

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
