3 changes: 1 addition & 2 deletions .github/workflows/ci.yml
@@ -4,13 +4,12 @@ on:
push:
branches: [ "main" ]
pull_request:
branches: [ "main" ]
branches: [ "main", "staging" ]

jobs:
test-net6:
runs-on: ubuntu-latest
container: mcr.microsoft.com/dotnet/sdk:6.0

steps:
- name: Checkout
uses: actions/checkout@v4
95 changes: 86 additions & 9 deletions README.md
@@ -12,15 +12,6 @@ A non-official DashScope (Bailian) service SDK maintained by Cnblogs.

## Quick Start

### Using `Microsoft.Extensions.AI` Interface

Install NuGet package `Cnblogs.DashScope.AI`
```csharp
var client = new DashScopeClient("your-api-key").AsChatClient("qwen-max");
var completion = await client.CompleteAsync("hello");
Console.WriteLine(completion)
```

### Console Application

Install NuGet package `Cnblogs.DashScope.Sdk`
@@ -90,7 +81,93 @@ public class YourService(IDashScopeClient client)
}
}
```
### Using `Microsoft.Extensions.AI` Interface

Install NuGet package `Cnblogs.DashScope.AI`

```csharp
var client = new DashScopeClient("your-api-key").AsChatClient("qwen-max");
var completion = await client.GetResponseAsync("hello");
Console.WriteLine(completion.Text);
```

#### Fallback to raw messages

If you need input data or parameters that `Microsoft.Extensions.AI` does not support, you can invoke the underlying SDK directly by passing a raw `TextChatMessage` or `MultimodalMessage` via `RawRepresentation`.

Similarly, unsupported parameters can be passed directly by setting the `raw` key in `AdditionalProperties`.

Example (using `qwen-doc-turbo`):

```csharp
var messages = new List<TextChatMessage>()
{
TextChatMessage.DocUrl(
"从这两份产品手册中,提取所有产品信息,并整理成一个标准的JSON数组。每个对象需要包含:model(产品的型号)、name(产品的名称)、price(价格(去除货币符号和逗号))",
[
"https://help-static-aliyun-doc.aliyuncs.com/file-manage-files/zh-CN/20251107/jockge/%E7%A4%BA%E4%BE%8B%E4%BA%A7%E5%93%81%E6%89%8B%E5%86%8CA.docx",
"https://help-static-aliyun-doc.aliyuncs.com/file-manage-files/zh-CN/20251107/ztwxzr/%E7%A4%BA%E4%BE%8B%E4%BA%A7%E5%93%81%E6%89%8B%E5%86%8CB.docx"
])
};
var parameters = new TextGenerationParameters()
{
ResultFormat = "message", IncrementalOutput = true,
};

var response = client
.AsChatClient("qwen-doc-turbo")
.GetStreamingResponseAsync(
messages.Select(x => new ChatMessage() { RawRepresentation = x }),
new ChatOptions()
{
AdditionalProperties = new AdditionalPropertiesDictionary() { { "raw", parameters } }
});
await foreach (var chunk in response)
{
Console.Write(chunk.Text);
}
```

Similarly, you can retrieve the raw response from the `RawRepresentation` of the messages returned by the model.

Example (getting the image token count when calling `qwen3-vl-plus`):

```csharp
var response = client
.AsChatClient("qwen3-vl-plus")
.GetStreamingResponseAsync(
new List<ChatMessage>()
{
new(
ChatRole.User,
new List<AIContent>()
{
new UriContent(
"https://help-static-aliyun-doc.aliyuncs.com/file-manage-files/zh-CN/20241022/emyrja/dog_and_girl.jpeg",
MediaTypeNames.Image.Jpeg),
new UriContent(
"https://dashscope.oss-cn-beijing.aliyuncs.com/images/tiger.png",
MediaTypeNames.Image.Jpeg),
new TextContent("这些图展现了什么内容?")
})
},
new ChatOptions());
var lastChunk = (ChatResponseUpdate?)null;
await foreach (var chunk in response)
{
Console.Write(chunk.Text);
lastChunk = chunk;
}

Console.WriteLine();

// Access underlying raw response
var raw = lastChunk?.RawRepresentation as ModelResponse<MultimodalOutput, MultimodalTokenUsage>;
Console.WriteLine($"Image token usage: {raw?.Usage?.ImageTokens}");
```
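
A minimal non-streaming sketch (not part of this PR; it assumes the adapter also populates `RawRepresentation` on the `ChatResponse` returned by `GetResponseAsync`):

```csharp
var response = await client
    .AsChatClient("qwen3-vl-plus")
    .GetResponseAsync(
        new List<ChatMessage>()
        {
            new(
                ChatRole.User,
                new List<AIContent>()
                {
                    new UriContent(
                        "https://help-static-aliyun-doc.aliyuncs.com/file-manage-files/zh-CN/20241022/emyrja/dog_and_girl.jpeg",
                        MediaTypeNames.Image.Jpeg),
                    new TextContent("这张图展现了什么内容?")
                })
        });
Console.WriteLine(response.Text);

// Assumption: the raw response uses the same ModelResponse<,> type as in the streaming example above.
if (response.RawRepresentation is ModelResponse<MultimodalOutput, MultimodalTokenUsage> rawResponse)
{
    Console.WriteLine($"Image token usage: {rawResponse.Usage?.ImageTokens}");
}
```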

## Supported APIs

- [Text Generation](#text-generation) - QWen3, DeepSeek, etc. Supports reasoning/tool calling/web search/translation scenarios
- [Conversation](#conversation)
- [Thinking Models](#thinking-models)
103 changes: 92 additions & 11 deletions README.zh-Hans.md
@@ -12,16 +12,6 @@

## Quick Start

### Using `Microsoft.Extensions.AI` Interface

Install NuGet package `Cnblogs.DashScope.AI`

```csharp
var client = new DashScopeClient("your-api-key").AsChatClient("qwen-max");
var completion = await client.CompleteAsync("hello");
Console.WriteLine(completion)
```

### Console Application

Install NuGet package `Cnblogs.DashScope.Sdk`.
@@ -88,6 +78,93 @@ public class YourService(IDashScopeClient client)
}
```

### Using `Microsoft.Extensions.AI` Interface

Install NuGet package `Cnblogs.DashScope.AI`

```csharp
var client = new DashScopeClient("your-api-key").AsChatClient("qwen-max");
var completion = await client.GetResponseAsync("hello");
Console.WriteLine(completion.Text);
```

#### Calling the underlying SDK directly

If you need input data or parameters that `Microsoft.Extensions.AI` does not support, you can invoke the underlying SDK directly by passing a raw `TextChatMessage` or `MultimodalMessage` via `RawRepresentation`.

Similarly, unsupported parameters can be passed directly by setting the `raw` key in `AdditionalProperties`.

Example (using `qwen-doc-turbo`):

```csharp
var messages = new List<TextChatMessage>()
{
TextChatMessage.DocUrl(
"从这两份产品手册中,提取所有产品信息,并整理成一个标准的JSON数组。每个对象需要包含:model(产品的型号)、name(产品的名称)、price(价格(去除货币符号和逗号))",
[
"https://help-static-aliyun-doc.aliyuncs.com/file-manage-files/zh-CN/20251107/jockge/%E7%A4%BA%E4%BE%8B%E4%BA%A7%E5%93%81%E6%89%8B%E5%86%8CA.docx",
"https://help-static-aliyun-doc.aliyuncs.com/file-manage-files/zh-CN/20251107/ztwxzr/%E7%A4%BA%E4%BE%8B%E4%BA%A7%E5%93%81%E6%89%8B%E5%86%8CB.docx"
])
};
var parameters = new TextGenerationParameters()
{
ResultFormat = "message", IncrementalOutput = true,
};

var response = client
.AsChatClient("qwen-doc-turbo")
.GetStreamingResponseAsync(
messages.Select(x => new ChatMessage() { RawRepresentation = x }),
new ChatOptions()
{
AdditionalProperties = new AdditionalPropertiesDictionary() { { "raw", parameters } }
});
await foreach (var chunk in response)
{
Console.Write(chunk.Text);
}
```

Similarly, you can retrieve the raw response from the `RawRepresentation` of the messages returned by the model.

Example (getting the image token count when calling `qwen3-vl-plus`):

```csharp
var response = client
.AsChatClient("qwen3-vl-plus")
.GetStreamingResponseAsync(
new List<ChatMessage>()
{
new(
ChatRole.User,
new List<AIContent>()
{
new UriContent(
"https://help-static-aliyun-doc.aliyuncs.com/file-manage-files/zh-CN/20241022/emyrja/dog_and_girl.jpeg",
MediaTypeNames.Image.Jpeg),
new UriContent(
"https://dashscope.oss-cn-beijing.aliyuncs.com/images/tiger.png",
MediaTypeNames.Image.Jpeg),
new TextContent("这些图展现了什么内容?")
})
},
new ChatOptions());
var lastChunk = (ChatResponseUpdate?)null;
await foreach (var chunk in response)
{
Console.Write(chunk.Text);
lastChunk = chunk;
}

Console.WriteLine();

// Access the underlying raw response
var raw = lastChunk?.RawRepresentation as ModelResponse<MultimodalOutput, MultimodalTokenUsage>;
Console.WriteLine($"Image token usage: {raw?.Usage?.ImageTokens}");
```
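
A minimal non-streaming sketch (not part of this PR; it assumes the adapter also populates `RawRepresentation` on the `ChatResponse` returned by `GetResponseAsync`):

```csharp
var response = await client
    .AsChatClient("qwen3-vl-plus")
    .GetResponseAsync(
        new List<ChatMessage>()
        {
            new(
                ChatRole.User,
                new List<AIContent>()
                {
                    new UriContent(
                        "https://help-static-aliyun-doc.aliyuncs.com/file-manage-files/zh-CN/20241022/emyrja/dog_and_girl.jpeg",
                        MediaTypeNames.Image.Jpeg),
                    new TextContent("这张图展现了什么内容?")
                })
        });
Console.WriteLine(response.Text);

// Assumption: the raw response uses the same ModelResponse<,> type as in the streaming example above.
if (response.RawRepresentation is ModelResponse<MultimodalOutput, MultimodalTokenUsage> rawResponse)
{
    Console.WriteLine($"Image token usage: {rawResponse.Usage?.ImageTokens}");
}
```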

## Supported APIs

- [Text Generation](#文本生成) - QWen3, DeepSeek, etc. Supports reasoning/tool calling/web search/translation scenarios
@@ -97,6 +174,10 @@ public class YourService(IDashScopeClient client)
- [Tool Calling](#工具调用)
- [Prefix Completion](#前缀续写)
- [Long Context (Qwen-Long)](#长上下文(Qwen-Long))
- [Translation (Qwen-MT)](#翻译能力(Qwen-MT))
- [Role Play (Qwen-Character)](#角色扮演(Qwen-Character))
- [Data Mining (Qwen-doc-turbo)](#数据挖掘(Qwen-doc-turbo))
- [Deep Research (Qwen-Deep-Research)](#深入研究(Qwen-Deep-Research))
- [Multimodal](#多模态) - QWen-VL, QVQ, etc. Supports reasoning/visual understanding/OCR/audio understanding scenarios
- [Visual Understanding/Reasoning](#视觉理解/推理) - Image/video input and understanding, with reasoning mode support
- [Text Extraction](#文字提取) - OCR tasks: reading tables, documents, formulas, etc.
@@ -2670,7 +2751,7 @@ var completion = client.GetMultimodalGenerationAsync(

Example:

![Tilted image](sample/Cnblogs.DashScope.Sample/tilted.png)
![Web page](sample/Cnblogs.DashScope.Sample/webpage.jpg)

```csharp
Console.WriteLine("Text:");
98 changes: 98 additions & 0 deletions sample/Cnblogs.DashScope.Sample/MsExtensionsAI/RawInputExample.cs
@@ -0,0 +1,98 @@
using Cnblogs.DashScope.Core;
using Microsoft.Extensions.AI;

namespace Cnblogs.DashScope.Sample.MsExtensionsAI;

public class RawInputExample : MsExtensionsAiSample
{
/// <inheritdoc />
public override string Description => "Chat with raw message and parameter input";

/// <inheritdoc />
public override async Task RunAsync(IDashScopeClient client)
{
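// Build the raw DashScope messages directly; the document URLs are passed via TextChatMessage.DocUrl.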
var messages = new List<TextChatMessage>()
{
TextChatMessage.DocUrl(
"从这两份产品手册中,提取所有产品信息,并整理成一个标准的JSON数组。每个对象需要包含:model(产品的型号)、name(产品的名称)、price(价格(去除货币符号和逗号))",
[
"https://help-static-aliyun-doc.aliyuncs.com/file-manage-files/zh-CN/20251107/jockge/%E7%A4%BA%E4%BE%8B%E4%BA%A7%E5%93%81%E6%89%8B%E5%86%8CA.docx",
"https://help-static-aliyun-doc.aliyuncs.com/file-manage-files/zh-CN/20251107/ztwxzr/%E7%A4%BA%E4%BE%8B%E4%BA%A7%E5%93%81%E6%89%8B%E5%86%8CB.docx"
])
};
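// Raw generation parameters that are not exposed through ChatOptions.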
var parameters = new TextGenerationParameters()
{
ResultFormat = "message", IncrementalOutput = true,
};

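// Wrap each raw message via RawRepresentation and pass the raw parameters through the "raw" key in AdditionalProperties.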
var response = client
.AsChatClient("qwen-doc-turbo")
.GetStreamingResponseAsync(
messages.Select(x => new ChatMessage() { RawRepresentation = x }),
new ChatOptions()
{
AdditionalProperties = new AdditionalPropertiesDictionary() { { "raw", parameters } }
});
await foreach (var chunk in response)
{
Console.Write(chunk.Text);
}
}
}

/*
```json
[
{
"model": "PRO-100",
"name": "智能打印机",
"price": "8999"
},
{
"model": "PRO-200",
"name": "智能扫描仪",
"price": "12999"
},
{
"model": "PRO-300",
"name": "智能会议系统",
"price": "25999"
},
{
"model": "PRO-400",
"name": "智能考勤机",
"price": "6999"
},
{
"model": "PRO-500",
"name": "智能文件柜",
"price": "15999"
},
{
"model": "SEC-100",
"name": "智能监控摄像头",
"price": "3999"
},
{
"model": "SEC-200",
"name": "智能门禁系统",
"price": "15999"
},
{
"model": "SEC-300",
"name": "智能报警系统",
"price": "28999"
},
{
"model": "SEC-400",
"name": "智能访客系统",
"price": "9999"
},
{
"model": "SEC-500",
"name": "智能停车管理",
"price": "22999"
}
]
```
*/