Large Language Model API interface. This is a simple API interface for large language models
which run on [Ollama](https://github.com/ollama/ollama/blob/main/docs/api.md),
[Anthropic](https://docs.anthropic.com/en/api/getting-started) and [Mistral](https://docs.mistral.ai/)
(OpenAI might be added later).

The module includes the ability to utilize:

* Maintaining a session of messages
* Tool calling support, including using your own tools (aka Tool plugins)
* Creating embedding vectors from text
* Streaming responses
* Multi-modal support (aka images and attachments)

There is a command-line tool included in the module which can be used to interact with the API.
If you have Docker installed, you can use the following command to run the tool without
installation:

```bash
# Display help
docker run ghcr.io/mutablelogic/go-llm:latest --help

# Interact with Claude to retrieve news headlines, assuming
# you have an API key for Anthropic and NewsAPI
docker run \
  --interactive -e ANTHROPIC_API_KEY -e NEWSAPI_KEY \
  ghcr.io/mutablelogic/go-llm:latest \
  chat claude-3-5-haiku-20241022 --prompt "What is the latest news?"
```

See below for more information on how to use the command-line tool (or how to install it
if you have a `go` compiler).

## Programmatic Usage

To create an
[Ollama](https://pkg.go.dev/github.com/mutablelogic/go-llm/pkg/ollama)
agent,

```go
import (
	"github.com/mutablelogic/go-llm/pkg/ollama"
)

func main() {
	// Create a new agent - replace the URL with the one to your Ollama instance
	agent, err := ollama.New("https://ollama.com/api/v1/")
	if err != nil {
		panic(err)
	}
	// ...
}
```

To create an
[Anthropic](https://pkg.go.dev/github.com/mutablelogic/go-llm/pkg/anthropic)
agent with an API key stored as an environment variable,

```go
import (
	"os"

	"github.com/mutablelogic/go-llm/pkg/anthropic"
)

func main() {
	// Create a new agent
	agent, err := anthropic.New(os.Getenv("ANTHROPIC_API_KEY"))
	if err != nil {
		panic(err)
	}
	// ...
}
```

To create a
[Mistral](https://pkg.go.dev/github.com/mutablelogic/go-llm/pkg/mistral)
agent with an API key stored as an environment variable,

```go
import (
	"os"

	"github.com/mutablelogic/go-llm/pkg/mistral"
)

func main() {
	// Create a new agent
	agent, err := mistral.New(os.Getenv("MISTRAL_API_KEY"))
	if err != nil {
		panic(err)
	}
	// ...
}
```

You can append options to the agent creation to set the client/server communication options,
such as user agent strings, timeouts, debugging, rate limiting, adding custom headers, etc.
See [here](https://pkg.go.dev/github.com/mutablelogic/go-client#readme-basic-usage) for more information.

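For example, here is a sketch of appending a couple of these options when constructing an agent. The `OptTimeout` and `OptTrace` option names are taken from the go-client package linked above and are assumptions here, so check that package's documentation for the exact set available.

```go
import (
	"os"
	"time"

	client "github.com/mutablelogic/go-client"
	"github.com/mutablelogic/go-llm/pkg/anthropic"
)

func main() {
	// Create a new agent with a request timeout and request tracing enabled
	// (option names assumed from the go-client package)
	agent, err := anthropic.New(
		os.Getenv("ANTHROPIC_API_KEY"),
		client.OptTimeout(30*time.Second),
		client.OptTrace(os.Stderr, false),
	)
	if err != nil {
		panic(err)
	}
	// ...
}
```
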
There is also an _aggregated_ agent which can be used to interact with multiple providers at once.
This is useful if you want to use models from different providers simultaneously.

```go
import (
	"os"

	"github.com/mutablelogic/go-llm/pkg/agent"
)

func main() {
	// Create a new agent which aggregates multiple providers
	agent, err := agent.New(
		agent.WithAnthropic(os.Getenv("ANTHROPIC_API_KEY")),
		agent.WithMistral(os.Getenv("MISTRAL_API_KEY")),
		agent.WithOllama(os.Getenv("OLLAMA_URL")),
	)
	if err != nil {
		panic(err)
	}
	// ...
}
```

### Chat Sessions

You create a **chat session** with a model as follows,

```go
func session(ctx context.Context, agent llm.Agent) error {
	// ...
}
```

The `Context` object will continue to store the current session and options, and will
ensure the session is maintained across multiple calls.

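As a rough sketch of how a session loop might be driven, using only the `Context` and `FromUser` methods documented in the interfaces below (obtaining the `llm.Model` from an agent, and reading back the reply, are left out here; the import path for the `llm` package is assumed to be the module root):

```go
import (
	"context"

	llm "github.com/mutablelogic/go-llm"
)

// chat runs a multi-turn conversation against a model. Each call to FromUser
// adds a turn to the same session context, so later prompts can refer back
// to earlier ones.
func chat(ctx context.Context, model llm.Model, prompts []string) error {
	// Session-wide options could be passed to Context here
	session := model.Context()
	for _, prompt := range prompts {
		if err := session.FromUser(ctx, prompt); err != nil {
			return err
		}
	}
	return nil
}
```
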
### Embedding Generation

TODO
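
This section is still to be written, but the `Embedding` method on the `Model` interface below already indicates the shape of the call. As a minimal sketch (assuming, as above, that the `llm` package is the module root and that you have obtained a model which supports embeddings):

```go
import (
	"context"
	"fmt"

	llm "github.com/mutablelogic/go-llm"
)

// embed creates an embedding vector for a piece of text using the model's
// Embedding method and reports its dimensionality.
func embed(ctx context.Context, model llm.Model) error {
	vector, err := model.Embedding(ctx, "the quick brown fox jumps over the lazy dog")
	if err != nil {
		return err
	}
	fmt.Println("embedding with", len(vector), "dimensions")
	return nil
}
```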

The model and context interfaces are as follows,

```go
type Model interface {
	// Set session-wide options
	Context(...Opt) Context

	// Add attachments (images, PDFs) to a user prompt for completion
	UserPrompt(string, ...Opt) Context

	// Create an embedding vector with embedding options
	Embedding(context.Context, string, ...Opt) ([]float64, error)
}

type Context interface {
	// Add single-use options when calling the model, which override
	// session options. You can attach files to a user prompt.
	FromUser(context.Context, string, ...Opt) error
}
```