
slate

One UI for communicating with all your LLMs. Our UI, your LLM.

  • No more complex Docker setups just to talk to Ollama running locally or remotely.
  • Bring your own API tokens and pay only for what you use.
  • Completely free to use.
  • Everything's stored locally (see the sketch below).
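
A minimal sketch of what "stored locally" can look like in practice, assuming user-provided API tokens are kept in the browser's localStorage; the storage key and function names below are hypothetical and not taken from slate's code.

// Hypothetical sketch: keeping a user-provided API token on-device only.
// "slate:openai-api-key" is an illustrative key name, not slate's actual schema.
const API_KEY_STORAGE_KEY = "slate:openai-api-key";

export function saveApiKey(token: string): void {
  // localStorage persists the token in this browser; nothing is sent to a backend.
  localStorage.setItem(API_KEY_STORAGE_KEY, token);
}

export function loadApiKey(): string | null {
  return localStorage.getItem(API_KEY_STORAGE_KEY);
}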

Development

Running locally

bun install
bun dev

Roadmap / Feature Set

  • Chat
    • Basic text chat - #18
    • Rendering response as markdown - #20
    • File upload for additional context
  • Support for local models
    • With Ollama - #15
    • With LM Studio
    • Custom config
  • Support for third-party models using API keys (see the config sketch after this list)
    • OpenAI
    • Anthropic
    • Gemini
    • Deepseek
  • Support for self-hosted models
    • With Ollama
    • With LM Studio
    • Custom config
  • Documentation
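
To make the roadmap concrete, here is a purely hypothetical sketch of a single provider configuration shape that could cover local (Ollama, LM Studio), self-hosted, and API-key providers; the type and field names are illustrative assumptions, not slate's implementation.

// Hypothetical sketch of a unified provider config backing the roadmap above.
// All names here are illustrative, not slate's actual code.
type ProviderKind =
  | "ollama"
  | "lmstudio"
  | "openai"
  | "anthropic"
  | "gemini"
  | "deepseek"
  | "custom";

interface ProviderConfig {
  kind: ProviderKind;
  baseUrl: string;   // e.g. "http://localhost:11434" for a local Ollama server
  apiKey?: string;   // only needed for hosted, key-based providers
  model: string;     // e.g. "llama3.1" or "gpt-4o-mini"
}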

Docs

This will be replaced by dedicated documentation later.

How to set up Ollama locally

  • Set up Ollama by following the documentation at https://ollama.com/download
  • Choose the model or models you want to run from https://ollama.com/search and pull them with ollama pull <MODEL_NAME>
  • Start Ollama locally with one of the following commands (OLLAMA_ORIGINS tells Ollama to accept cross-origin requests from the slate web app)
    • For Unix systems: OLLAMA_ORIGINS=https://slatechat.vercel.app ollama serve
    • For PowerShell: $env:OLLAMA_ORIGINS="https://slatechat.vercel.app"; ollama serve
  • Go to https://slatechat.vercel.app/chat and start a new local chat; your models should be listed in the dropdown (a sketch of the request behind that dropdown follows below)
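
The dropdown can only be populated if the page is allowed to reach your local Ollama server, which is why OLLAMA_ORIGINS must include the site's origin. Below is a minimal sketch of such a request, assuming Ollama's default address and its /api/tags endpoint for listing pulled models; it is illustrative, not slate's exact code.

// Minimal sketch: list the models available on a local Ollama server.
// Assumes Ollama's default address; not slate's actual implementation.
const OLLAMA_URL = "http://localhost:11434";

async function listLocalModels(): Promise<string[]> {
  // GET /api/tags returns the models previously pulled with `ollama pull`.
  // Ollama accepts this cross-origin request only when OLLAMA_ORIGINS includes this page's origin.
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  if (!res.ok) throw new Error(`Ollama responded with ${res.status}`);
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name);
}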

(Screenshot: locally pulled models listed in the chat model dropdown.)

If you're using the Brave browser or any other ad blocker, disable blocking for this site; otherwise the connection to your local Ollama server may be blocked and no models will appear.

About

Open-source chat UI for local, self-hosted, and cloud LLMs.
