Conversation

@Loule95450 (Contributor) commented Nov 17, 2025

Summary

This PR adds first-class support for embedding models in the OpenRouter provider, compatible with AI SDK v5 and v4. It introduces the OpenRouterEmbeddingModel class for generating embeddings, along with schema validation, API integration, and comprehensive tests. The provider now exposes textEmbeddingModel (v5) and a deprecated embedding alias (v4) for parity with other AI SDK providers.

This implementation enables semantic search, RAG pipelines, and vector-native features using OpenRouter's embedding API, with support for custom routing preferences and user tracking.

Changes

  • src/embedding/: New directory with index.ts implementing the OpenRouterEmbeddingModel class, schemas.ts for response validation using Zod (a rough sketch of the validated shape follows this list), and index.test.ts with Vitest tests covering instantiation, single/batch embeddings, custom settings, and edge cases (e.g., missing usage data).
  • src/facade.ts and src/provider.ts: Extended to expose textEmbeddingModel and embedding methods, creating embedding model instances with configurable settings.
  • src/types/openrouter-embedding-settings.ts: New types for embedding-specific settings, including provider routing (e.g., order, fallbacks) and user ID.
  • src/types/index.ts: Re-exports embedding settings types for easy access.
  • README.md: Updated documentation with usage examples for v5/v4, batch embeddings, and a list of supported models.
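
For orientation, here is a rough sketch of what the response validation in schemas.ts might look like. The schema and field names below are illustrative assumptions based on OpenRouter's OpenAI-compatible embeddings format, not the exact code in this PR:

import { z } from 'zod';

// Illustrative guess at the validated response shape; see src/embedding/schemas.ts
// in this PR for the actual definition.
const embeddingResponseSchema = z.object({
  data: z.array(
    z.object({
      embedding: z.array(z.number()),
      index: z.number(),
    }),
  ),
  model: z.string(),
  // Usage data can be absent, which the tests cover explicitly.
  usage: z
    .object({
      prompt_tokens: z.number(),
      total_tokens: z.number(),
    })
    .optional(),
});

type EmbeddingResponse = z.infer<typeof embeddingResponseSchema>;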

Usage Examples

For AI SDK v5

import { embed } from 'ai';
import { openrouter } from '@openrouter/ai-sdk-provider';

const { embedding } = await embed({
  model: openrouter.textEmbeddingModel('openai/text-embedding-3-small'),
  value: 'sunny day at the beach',
});

For batch embeddings:

import { embedMany } from 'ai';
import { openrouter } from '@openrouter/ai-sdk-provider';

const { embeddings } = await embedMany({
  model: openrouter.textEmbeddingModel('openai/text-embedding-3-small'),
  values: [
    'sunny day at the beach',
    'rainy day in the city',
    'snowy mountain peak',
  ],
});
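
As a rough illustration of the semantic-search use case mentioned in the summary, the batch result above can be ranked against a query embedding. The ranking logic below is only a sketch (not part of this PR); cosineSimilarity is a helper exported by the AI SDK:

import { cosineSimilarity, embed, embedMany } from 'ai';
import { openrouter } from '@openrouter/ai-sdk-provider';

const documents = [
  'sunny day at the beach',
  'rainy day in the city',
  'snowy mountain peak',
];

// Embed the documents once (embeddings come back in the same order as values).
const { embeddings } = await embedMany({
  model: openrouter.textEmbeddingModel('openai/text-embedding-3-small'),
  values: documents,
});

// Embed the query and rank documents by cosine similarity.
const { embedding: queryEmbedding } = await embed({
  model: openrouter.textEmbeddingModel('openai/text-embedding-3-small'),
  value: 'warm weather by the sea',
});

const ranked = documents
  .map((text, i) => ({ text, score: cosineSimilarity(queryEmbedding, embeddings[i]) }))
  .sort((a, b) => b.score - a.score);

console.log(ranked[0].text); // most similar document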

For AI SDK v4 (Deprecated)

import { embed } from 'ai';
import { openrouter } from '@openrouter/ai-sdk-provider';

const { embedding } = await embed({
  model: openrouter.embedding('openai/text-embedding-3-small'),
  value: 'sunny day at the beach',
});

Custom settings example (e.g., provider routing):

const model = openrouter.textEmbeddingModel('openai/text-embedding-3-small', {
  user: 'test-user-123',
  provider: {
    order: ['openai'],
    allow_fallbacks: false,
  },
});
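
For reference, the settings object accepted as the second argument corresponds roughly to the new types in src/types/openrouter-embedding-settings.ts. The interface below is an illustrative approximation of their shape, not the exact definition from this PR:

// Approximate shape of the embedding settings; the real types live in
// src/types/openrouter-embedding-settings.ts and may include more options.
interface OpenRouterEmbeddingSettings {
  /** Stable end-user identifier forwarded to OpenRouter for tracking. */
  user?: string;
  /** OpenRouter provider routing preferences. */
  provider?: {
    /** Preferred provider order, e.g. ['openai']. */
    order?: string[];
    /** Whether OpenRouter may fall back to other providers. */
    allow_fallbacks?: boolean;
  };
}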

Tests

  • Added 253 lines of tests in src/embedding/index.test.ts covering:
    • Provider method exposure (textEmbeddingModel, embedding).
    • Model instantiation and configuration.
    • Single and batch embedding generation.
    • Custom request settings (e.g., user, provider options).
    • Handling responses without usage data.
  • Uses mocked fetch responses to simulate API behavior (illustrative sketch below).
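
To give a sense of the mocking approach, here is an illustrative Vitest sketch, not the actual code from src/embedding/index.test.ts; it assumes createOpenRouter accepts a custom fetch option, as other AI SDK providers do:

import { describe, expect, it, vi } from 'vitest';
import { embed } from 'ai';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

describe('OpenRouter embeddings (illustrative sketch)', () => {
  it('returns a single embedding from a mocked fetch response', async () => {
    // Mock fetch returning an OpenAI-compatible embeddings payload.
    const mockFetch = vi.fn(async () =>
      new Response(
        JSON.stringify({
          data: [{ embedding: [0.1, 0.2, 0.3], index: 0 }],
          model: 'openai/text-embedding-3-small',
          usage: { prompt_tokens: 5, total_tokens: 5 },
        }),
        { status: 200, headers: { 'Content-Type': 'application/json' } },
      ),
    );

    const openrouter = createOpenRouter({ apiKey: 'test-key', fetch: mockFetch });

    const { embedding } = await embed({
      model: openrouter.textEmbeddingModel('openai/text-embedding-3-small'),
      value: 'sunny day at the beach',
    });

    expect(embedding).toHaveLength(3);
    expect(mockFetch).toHaveBeenCalledTimes(1);
  });
});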

References

  • Introduces the OpenRouterEmbeddingModel class for embedding generation, including schema validation and API integration, with comprehensive tests for model instantiation, embedding functionality, custom settings, and response handling.
  • Adds text embedding model support to the provider surface: a new textEmbeddingModel method and a deprecated embedding alias for backward compatibility, exposed and implemented in both the facade and the provider.
  • Introduces new type definitions for OpenRouter embedding settings in src/types/openrouter-embedding-settings.ts, re-exported from index.ts; these cover provider routing preferences and user identification.
  • Expands the README with details on using embedding models with OpenRouter, including usage examples for AI SDK v5 and v4, batch embeddings, and a list of supported embedding models.
@Loule95450 (Contributor, Author) commented:

This PR is currently awaiting review/merge from the repository owners.

In the meantime – quick workaround

If you need the feature right now, you can use the branch from my fork:

# Clone my fork and checkout the feature branch
git clone https://github.com/Loule95450/ai-sdk-provider.git
cd ai-sdk-provider
git checkout feat/add-embed-support

# Install & build
pnpm install
pnpm build

# Then install it locally in your project
cd /path/to/your/project
npm i /path/to/ai-sdk-provider   # path where you cloned and built the fork above

@envoy1084 commented:

Hey any updates on when this should be live?

@subtleGradient (Contributor) commented:

Also, we're working on https://github.com/OpenRouterTeam/ai-sdk-provider/tree/next, which should be available on the alpha channel soon. We'll have to bring embeddings in there too.

@subtleGradient merged commit e31db39 into OpenRouterTeam:main on Dec 6, 2025
2 checks passed