13 changes: 9 additions & 4 deletions docs/content/docs/features/ai/backend-integration.mdx
@@ -11,12 +11,16 @@ The most common (and recommended) setup to integrate BlockNote AI with an LLM is
## Default setup (Vercel AI SDK)

The example below closely follows the [basic example from the Vercel AI SDK](https://ai-sdk.dev/docs/ai-sdk-ui/chatbot#example) for Next.js.
The only difference is that we're retrieving the BlockNote tools from the request body and using the `toolDefinitionsToToolSet` function to convert them to AI SDK tools. The LLM will now be able to invoke these tools to make modifications to the BlockNote document as requested by the user. The tool calls are forwarded to the client application where they're handled automatically by the AI Extension.
The main differences are that we retrieve the BlockNote tools from the request body and use the `toolDefinitionsToToolSet` function to convert them to AI SDK tools, and that we forward the serialized document state (selection, cursor, block IDs) that BlockNote adds to every user message by calling `injectDocumentStateMessages`. The LLM will now be able to invoke these tools to modify the BlockNote document as requested by the user. The tool calls are forwarded to the client application, where they're handled automatically by the AI Extension.

```ts app/api/chat/route.ts
import { openai } from "@ai-sdk/openai";
import { convertToModelMessages, streamText } from "ai";
import { toolDefinitionsToToolSet } from "@blocknote/xl-ai";
import {
aiDocumentFormats,
injectDocumentStateMessages,
toolDefinitionsToToolSet,
} from "@blocknote/xl-ai/server";

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;
@@ -26,7 +30,8 @@ export async function POST(req: Request) {

const result = streamText({
model: openai("gpt-4.1"), // see https://ai-sdk.dev/docs/foundations/providers-and-models
messages: convertToModelMessages(messages),
system: aiDocumentFormats.html.systemPrompt,
messages: convertToModelMessages(injectDocumentStateMessages(messages)),
tools: toolDefinitionsToToolSet(toolDefinitions),
toolChoice: "required",
});
@@ -103,4 +108,4 @@ You can connect BlockNote AI features with more advanced AI pipelines. You can i
with BlockNote AI, [get in touch](/about).
</Callout>

- By default, BlockNote AI composes the LLM request (messages) based on the user's prompt and passes these to your backend. See [this example](https://github.com/TypeCellOS/BlockNote/blob/main/examples/09-ai/07-server-promptbuilder/src/App.tsx) for an example where composing the LLM request (prompt building) is delegated to the server.
- By default, BlockNote AI sends the entire LLM chat history to the backend. See [the server persistence example](https://github.com/TypeCellOS/BlockNote/tree/main/examples/09-ai/07-server-persistence) for a pattern where the backend stores the chat history and the client sends only the latest message.
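
The history-merging step of that server-persistence pattern can be sketched with a plain helper. The types and names below are illustrative, not part of the BlockNote or AI SDK APIs:

```typescript
// Hypothetical sketch: the backend stores the chat history and the client
// sends only the latest message. StoredMessage is a stand-in type, not the
// real BlockNote or AI SDK message type.
type StoredMessage = { id: string; role: "user" | "assistant"; text: string };

function mergeWithHistory(
  history: StoredMessage[],
  latest: StoredMessage,
): StoredMessage[] {
  // If the client retries a message with the same id, replace the stored
  // copy; otherwise append the new message to the history.
  const rest = history.filter((m) => m.id !== latest.id);
  return [...rest, latest];
}
```

The merged array would then be converted to model messages and passed to `streamText`, exactly as in the default route above.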
15 changes: 13 additions & 2 deletions docs/content/docs/features/ai/getting-started.mdx
@@ -85,7 +85,11 @@ This example follows the [basic example from the AI SDK](https://ai-sdk.dev/docs
```ts app/api/chat/route.ts
import { openai } from "@ai-sdk/openai";
import { convertToModelMessages, streamText } from "ai";
import { toolDefinitionsToToolSet } from "@blocknote/xl-ai";
import {
aiDocumentFormats,
injectDocumentStateMessages,
toolDefinitionsToToolSet,
} from "@blocknote/xl-ai/server";

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;
@@ -95,7 +99,8 @@ export async function POST(req: Request) {

const result = streamText({
model: openai("gpt-4.1"), // see https://ai-sdk.dev/docs/foundations/providers-and-models
messages: convertToModelMessages(messages),
system: aiDocumentFormats.html.systemPrompt,
messages: convertToModelMessages(injectDocumentStateMessages(messages)),
tools: toolDefinitionsToToolSet(toolDefinitions),
toolChoice: "required",
});
@@ -104,6 +109,12 @@
}
```

This follows the regular `streamText` pattern of the AI SDK, with three exceptions:

- The BlockNote document state is extracted from the message metadata and injected into the messages using `injectDocumentStateMessages`.
- The BlockNote client-side tool definitions are extracted from the request body and passed to the LLM using `toolDefinitionsToToolSet`.
- The system prompt is set to the default BlockNote system prompt (`aiDocumentFormats.html.systemPrompt`). You can override or extend this prompt, but make sure your modified system prompt still explains to the LLM how to modify the BlockNote document.
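
If you extend the system prompt, a simple approach is to append your own rules after the default. A minimal sketch (the prompt string below is a stand-in, not the actual contents of `aiDocumentFormats.html.systemPrompt`):

```typescript
// Stand-in for aiDocumentFormats.html.systemPrompt; the real default prompt
// explains the document format and editing tools to the LLM.
const defaultSystemPrompt =
  "You can modify the user's BlockNote document using the provided tools.";

// Append app-specific rules while keeping the default editing instructions,
// so the LLM still knows how to modify the document.
const systemPrompt = [
  defaultSystemPrompt,
  "Always respond in a formal tone.",
].join("\n\n");
```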

See [Backend integrations](/docs/features/ai/backend-integration) for more information on how to integrate BlockNote AI with your backend.

# Full Example
118 changes: 57 additions & 61 deletions docs/content/docs/features/ai/reference.mdx
@@ -30,6 +14,14 @@ type AIRequestHelpers = {
*/
transport?: ChatTransport<UIMessage>;

/**
* Use `chatProvider` to customize how the AI SDK Chat instance is created,
* for example when you want to reuse an existing Chat instance from the rest of your application.
*
* @note You cannot use `chatProvider` and `transport` together.
*/
chatProvider?: () => Chat<UIMessage>;

/**
* Customize which stream tools are available to the LLM.
*/
@@ -43,12 +51,11 @@ type AIRequestHelpers = {
chatRequestOptions?: ChatRequestOptions;

/**
* Responsible for submitting a BlockNote AIRequest to the AI SDK.
* Use to transform the messages sent to the LLM.
* Build the serializable document state that will be forwarded to the backend.
*
* @default defaultAIRequestSender(aiDocumentFormats.html.defaultPromptBuilder, aiDocumentFormats.html.defaultPromptInputDataBuilder)
* @default aiDocumentFormats.html.defaultDocumentStateBuilder
*/
aiRequestSender?: AIRequestSender;
documentStateBuilder?: DocumentStateBuilder<any>;
};
```

@@ -126,7 +133,7 @@ class AIExtension {
}
```

### `InvokeAIOptions`
### `invokeAI`

Requests to an LLM are made by calling `invokeAI` on the `AIExtension` object. This takes an `InvokeAIOptions` object as an argument.

@@ -146,6 +153,8 @@ type InvokeAIOptions = {
} & AIRequestHelpers; // Optionally override helpers per request
```

Because `InvokeAIOptions` extends `AIRequestHelpers`, you can override these options on a per-call basis without changing the global extension configuration.
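
The merge behaves like an ordinary shallow spread: per-call options win over the extension-level defaults, field by field. A self-contained illustration using a stand-in type (the real `AIRequestHelpers` comes from `@blocknote/xl-ai`):

```typescript
// Illustrative stand-in for AIRequestHelpers, not the real type.
type Helpers = { toolChoice?: "auto" | "required"; maxRetries?: number };

// Per-call options override the extension-level defaults, field by field,
// the same way invokeAI merges its options with the extension configuration.
function resolveHelpers(defaults: Helpers, perCall: Helpers): Helpers {
  return { ...defaults, ...perCall };
}
```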

## `getStreamToolsProvider`

When an LLM is called, it needs to interpret the document and invoke operations to modify it. Use a format's `getStreamToolsProvider` to obtain the tools the LLM may call while editing. In most cases, use `aiDocumentFormats.html.getStreamToolsProvider(...)`.
@@ -167,85 +176,72 @@ type getStreamToolsProvider = (
) => StreamToolsProvider;
```

## `AIRequest` and `AIRequestSender` (advanced)
## Document state builders (advanced)

The AIRequest models a single AI operation against the editor (prompt, selection, tools). The AIRequestSender is responsible for submitting that request to the AI SDK layer.
When BlockNote AI sends a request, it also forwards a serialized snapshot of the editor. The LLM uses this document state to understand the document, the cursor position, and the active selection. The `DocumentStateBuilder` type defines how that snapshot is produced:

```typescript
type DocumentStateBuilder<T> = (
aiRequest: Omit<AIRequest, "documentState">,
) => Promise<
| {
selection: false;
blocks: BlocksWithCursor<T>[];
isEmptyDocument: boolean;
}
| {
selection: true;
selectedBlocks: { id: string; block: T }[];
blocks: { block: T }[];
isEmptyDocument: boolean;
}
>;
```

By default, `aiDocumentFormats.html.defaultDocumentStateBuilder` is used.
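
A custom builder simply returns one of the two shapes above. A minimal sketch of the no-selection variant, using local stand-in types rather than the real `@blocknote/xl-ai` types:

```typescript
// Local stand-in for a serialized block, for illustration only.
type SimpleBlock = { id: string; html: string };

// Builds the "no selection" variant of the document state: every block is
// serialized, the cursor block is marked, and empty documents are flagged.
function buildSimpleDocumentState(
  blocks: SimpleBlock[],
  cursorBlockId?: string,
): {
  selection: false;
  blocks: { block: string; cursor?: boolean }[];
  isEmptyDocument: boolean;
} {
  return {
    selection: false,
    blocks: blocks.map((b) => ({
      block: b.html,
      cursor: b.id === cursorBlockId,
    })),
    isEmptyDocument: blocks.every((b) => b.html.trim() === ""),
  };
}
```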

## `AIRequest` (advanced)

`buildAIRequest` returns everything BlockNote AI needs to execute an AI call:

```typescript
type AIRequest = {
editor: BlockNoteEditor;
chat: Chat<UIMessage>;
userPrompt: string;
selectedBlocks?: Block[];
emptyCursorBlockToDelete?: string;
streamTools: StreamTool<any>[];
};

type AIRequestSender = {
sendAIRequest: (
aiRequest: AIRequest,
options: ChatRequestOptions,
) => Promise<void>;
documentState: DocumentState<any>;
onStart: () => void;
};
```

The default `AIRequestSender` used is `defaultAIRequestSender(aiDocumentFormats.html.defaultPromptBuilder, aiDocumentFormats.html.defaultPromptInputDataBuilder)`. It takes an AIRequest and the default prompt builder (see below) to construct the updated messages array and submits this to the AI SDK.

## PromptBuilder (advanced)
## `sendMessageWithAIRequest` (advanced)

A `PromptBuilder` allows you to fine-tune the messages sent to the LLM. A `PromptBuilder` mutates the AI SDK `UIMessage[]` in place based on the user prompt and document-specific input data. Input data is produced by a paired `PromptInputDataBuilder`.

We recommend forking the [default PromptBuilder](https://github.com/TypeCellOS/BlockNote/blob/main/packages/xl-ai/src/api/formats/html-blocks/defaultHTMLPromptBuilder.ts) as a starting point.
Use `sendMessageWithAIRequest` when you need to manually call the LLM without updating the state of the BlockNote AI menu.
For example, you could use this when you want to submit LLM requests from a different context, such as a chat window.
`sendMessageWithAIRequest` is similar to `chat.sendMessage`, but it attaches the `documentState` to the outgoing message metadata, configures tool streaming, and forwards the tool definitions (JSON Schemas) to your backend.

```typescript
// Mutates the messages based on format-specific input data
export type PromptBuilder<E> = (
messages: UIMessage[],
inputData: E,
) => Promise<void>;

// Builds the input data passed to the PromptBuilder from a BlockNote AIRequest
export type PromptInputDataBuilder<E> = (aiRequest: AIRequest) => Promise<E>;

// Create an AIRequestSender from your custom builders.
// This lets you plug your PromptBuilder into the request pipeline used by invokeAI/executeAIRequest.
function defaultAIRequestSender<E>(
promptBuilder: PromptBuilder<E>,
promptInputDataBuilder: PromptInputDataBuilder<E>,
): AIRequestSender;
async function sendMessageWithAIRequest(
chat: Chat<UIMessage>,
aiRequest: AIRequest,
message?: Parameters<Chat<UIMessage>["sendMessage"]>[0],
options?: Parameters<Chat<UIMessage>["sendMessage"]>[1],
): Promise<Result<void>>;
```
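
The metadata step can be pictured with a small standalone helper, a simplification of what `sendMessageWithAIRequest` does internally (the message shape is illustrative, not the AI SDK's `UIMessage`):

```typescript
// Illustrative message shape; the real one is the AI SDK's UIMessage.
type OutgoingMessage = { text: string; metadata?: Record<string, unknown> };

// Attach the serialized document state to the outgoing message metadata,
// preserving any metadata that is already present.
function attachDocumentState(
  message: OutgoingMessage,
  documentState: unknown,
): OutgoingMessage {
  return {
    ...message,
    metadata: { ...message.metadata, documentState },
  };
}
```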

## Lower-level functions (advanced)

The `invokeAI` function automatically passes the default options set in the `AIExtension` to the LLM request. It also handles the LLM response and updates the state of the AI menu accordingly.
## `buildAIRequest` (advanced)

For advanced use cases, you can also directly use the lower-level `buildAIRequest` and `executeAIRequest` functions to issue an LLM request directly.

### `buildAIRequest`

Use buildAIRequest to assemble an AIRequest from editor state and configuration.
Use `buildAIRequest` to assemble an `AIRequest` from editor state if you are bypassing `invokeAI` and calling `sendMessageWithAIRequest` directly.

```typescript
function buildAIRequest(opts: {
async function buildAIRequest(opts: {
editor: BlockNoteEditor;
chat: Chat<UIMessage>;
userPrompt: string;
useSelection?: boolean;
deleteEmptyCursorBlock?: boolean;
streamToolsProvider?: StreamToolsProvider<any, any>;
documentStateBuilder?: DocumentStateBuilder<any>;
onBlockUpdated?: (blockId: string) => void;
}): AIRequest;
```

### `executeAIRequest`

Use executeAIRequest to send it with an AIRequestSender and process streaming tool calls.

```typescript
function executeAIRequest(opts: {
aiRequest: AIRequest;
sender: AIRequestSender;
chatRequestOptions?: ChatRequestOptions;
onStart?: () => void;
}): Promise<void>;
}): Promise<AIRequest>;
```
2 changes: 1 addition & 1 deletion docs/package.json
@@ -71,7 +71,7 @@
"@vercel/analytics": "^1.5.0",
"@vercel/og": "^0.6.8",
"@y-sweet/react": "^0.6.3",
"ai": "^5.0.45",
"ai": "^5.0.102",
"babel-plugin-react-compiler": "19.1.0-rc.2",
"better-auth": "^1.3.27",
"better-sqlite3": "^11.10.0",
2 changes: 1 addition & 1 deletion examples/09-ai/01-minimal/.bnexample.json
@@ -6,7 +6,7 @@
"dependencies": {
"@blocknote/xl-ai": "latest",
"@mantine/core": "^8.3.4",
"ai": "^5.0.45",
"ai": "^5.0.102",
"zustand": "^5.0.3"
}
}
2 changes: 1 addition & 1 deletion examples/09-ai/01-minimal/package.json
@@ -22,7 +22,7 @@
"react": "^19.2.0",
"react-dom": "^19.2.0",
"@blocknote/xl-ai": "latest",
"ai": "^5.0.45",
"ai": "^5.0.102",
"zustand": "^5.0.3"
},
"devDependencies": {
2 changes: 1 addition & 1 deletion examples/09-ai/02-playground/.bnexample.json
@@ -6,7 +6,7 @@
"dependencies": {
"@blocknote/xl-ai": "latest",
"@mantine/core": "^8.3.4",
"ai": "^5.0.45",
"ai": "^5.0.102",
"zustand": "^5.0.3"
}
}
2 changes: 1 addition & 1 deletion examples/09-ai/02-playground/package.json
@@ -22,7 +22,7 @@
"react": "^19.2.0",
"react-dom": "^19.2.0",
"@blocknote/xl-ai": "latest",
"ai": "^5.0.45",
"ai": "^5.0.102",
"zustand": "^5.0.3"
},
"devDependencies": {
2 changes: 1 addition & 1 deletion examples/09-ai/03-custom-ai-menu-items/.bnexample.json
@@ -6,7 +6,7 @@
"dependencies": {
"@blocknote/xl-ai": "latest",
"@mantine/core": "^8.3.4",
"ai": "^5.0.45",
"ai": "^5.0.102",
"react-icons": "^5.2.1",
"zustand": "^5.0.3"
}
2 changes: 1 addition & 1 deletion examples/09-ai/03-custom-ai-menu-items/package.json
@@ -22,7 +22,7 @@
"react": "^19.2.0",
"react-dom": "^19.2.0",
"@blocknote/xl-ai": "latest",
"ai": "^5.0.45",
"ai": "^5.0.102",
"react-icons": "^5.2.1",
"zustand": "^5.0.3"
},
2 changes: 1 addition & 1 deletion examples/09-ai/04-with-collaboration/.bnexample.json
@@ -6,7 +6,7 @@
"dependencies": {
"@blocknote/xl-ai": "latest",
"@mantine/core": "^8.3.4",
"ai": "^5.0.45",
"ai": "^5.0.102",
"y-partykit": "^0.0.25",
"yjs": "^13.6.27",
"zustand": "^5.0.3"
2 changes: 1 addition & 1 deletion examples/09-ai/04-with-collaboration/package.json
@@ -22,7 +22,7 @@
"react": "^19.2.0",
"react-dom": "^19.2.0",
"@blocknote/xl-ai": "latest",
"ai": "^5.0.45",
"ai": "^5.0.102",
"y-partykit": "^0.0.25",
"yjs": "^13.6.27",
"zustand": "^5.0.3"
2 changes: 1 addition & 1 deletion examples/09-ai/05-manual-execution/.bnexample.json
@@ -6,7 +6,7 @@
"dependencies": {
"@blocknote/xl-ai": "latest",
"@mantine/core": "^8.3.4",
"ai": "^5.0.45",
"ai": "^5.0.102",
"y-partykit": "^0.0.25",
"yjs": "^13.6.27",
"zustand": "^5.0.3"
2 changes: 1 addition & 1 deletion examples/09-ai/05-manual-execution/package.json
@@ -22,7 +22,7 @@
"react": "^19.2.0",
"react-dom": "^19.2.0",
"@blocknote/xl-ai": "latest",
"ai": "^5.0.45",
"ai": "^5.0.102",
"y-partykit": "^0.0.25",
"yjs": "^13.6.27",
"zustand": "^5.0.3"
2 changes: 1 addition & 1 deletion examples/09-ai/06-client-side-transport/.bnexample.json
@@ -7,7 +7,7 @@
"@ai-sdk/groq": "^2.0.16",
"@blocknote/xl-ai": "latest",
"@mantine/core": "^8.3.4",
"ai": "^5.0.45",
"ai": "^5.0.102",
"zustand": "^5.0.3"
}
}
2 changes: 1 addition & 1 deletion examples/09-ai/06-client-side-transport/package.json
@@ -23,7 +23,7 @@
"react-dom": "^19.2.0",
"@ai-sdk/groq": "^2.0.16",
"@blocknote/xl-ai": "latest",
"ai": "^5.0.45",
"ai": "^5.0.102",
"zustand": "^5.0.3"
},
"devDependencies": {
@@ -6,7 +6,7 @@
"dependencies": {
"@blocknote/xl-ai": "latest",
"@mantine/core": "^8.3.4",
"ai": "^5.0.45",
"ai": "^5.0.102",
"zustand": "^5.0.3"
}
}
@@ -1,5 +1,5 @@
# AI Integration with server LLM execution + promptbuilder
# AI Integration with server LLM message persistence

This example shows how to set up AI integration while handling the LLM calls (in this case, using the Vercel AI SDK) on your server, using a custom executor.

Prompt building is done on the server as well
Instead of sending all messages, the chat history is kept server-side and only the latest message is submitted.
@@ -2,7 +2,7 @@
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>AI Integration with server LLM execution + promptbuilder</title>
<title>AI Integration with server LLM message persistence</title>
<script>
<!-- AUTO-GENERATED FILE, DO NOT EDIT DIRECTLY -->
</script>