v1.0.0 – First Public Release
Copilot OpenAI API (Go Edition)
A high-performance, production-ready Go proxy that exposes GitHub Copilot’s chat and embeddings APIs as an OpenAI-compatible service, with experimental Anthropic/Claude compatibility.
🚀 Features
- `/v1/chat/completions` – OpenAI-compatible chat completions (with streaming); see the sketch after this list
- `/v1/embeddings` – Embeddings endpoint
- `/v1/messages` – Anthropic/Claude compatibility (experimental, PRs welcome!)
- `/v1/models` – Lists all available models and their capabilities
- Secure authentication (Bearer token)
- Automatic Copilot token management and refresh
- CORS support
- Cross-platform: Linux, macOS, Windows (amd64 & arm64)
- Easy configuration via `.env` or environment variables
- YO LICENSE – see LICENSE
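
As a quick illustration of the OpenAI-compatible surface, here is a minimal sketch of a non-streaming request to `/v1/chat/completions` using only Go's standard library. The host, the default port `9191`, the model ID, and the use of `COPILOT_TOKEN` as the client bearer token are assumptions based on the defaults described in this release; adjust them to match your setup.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"os"
)

func main() {
	// Request body in the OpenAI chat-completions format.
	body, err := json.Marshal(map[string]any{
		"model": "gpt-4o", // assumption: use any ID returned by /v1/models
		"messages": []map[string]string{
			{"role": "user", "content": "Hello from the Go proxy!"},
		},
	})
	if err != nil {
		panic(err)
	}

	// Assumption: the proxy is listening locally on the default port 9191.
	req, err := http.NewRequest("POST", "http://localhost:9191/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	// Assumption: the client bearer token is the same value as COPILOT_TOKEN.
	req.Header.Set("Authorization", "Bearer "+os.Getenv("COPILOT_TOKEN"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```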
🏗️ Binaries
Download the binary for your platform:
- `go-copilot-api-linux-amd64`
- `go-copilot-api-linux-arm64`
- `go-copilot-api-darwin-amd64`
- `go-copilot-api-darwin-arm64`
- `go-copilot-api-windows-amd64.exe`
- `go-copilot-api-windows-arm64.exe`
🛠️ Usage
See the README for full setup and usage instructions.
- Configure your `.env` with at least `COPILOT_TOKEN` and (optionally) `DEFAULT_MODEL` and `COPILOT_SERVER_PORT`.
- Start the server and point your OpenAI-compatible or Anthropic-compatible client at it.
- Use `/v1/models` to discover available model IDs (see the sketch after this list).
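
To see which model IDs your instance actually exposes, a sketch like the following queries `/v1/models`. The base URL, the bearer token, and the OpenAI-style `{"data":[{"id":...}]}` response shape are assumptions; check the README if your setup differs.

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Assumption: proxy on localhost:9191, authenticated with COPILOT_TOKEN.
	req, err := http.NewRequest("GET", "http://localhost:9191/v1/models", nil)
	if err != nil {
		panic(err)
	}
	req.Header.Set("Authorization", "Bearer "+os.Getenv("COPILOT_TOKEN"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Assumption: the response follows the OpenAI models-list shape.
	var models struct {
		Data []struct {
			ID string `json:"id"`
		} `json:"data"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&models); err != nil {
		panic(err)
	}
	for _, m := range models.Data {
		fmt.Println(m.ID)
	}
}
```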
⚠️ Notes
- Anthropic/Claude compatibility is untested. If you use Claude Code or Anthropic clients and encounter issues, we would appreciate any PRs or feedback to help improve support!
- For model selection, see `/v1/models` or the README.
- The default port is `9191`. Set `COPILOT_SERVER_PORT` in your `.env` to change it (see the sketch after this list).
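
If you change `COPILOT_SERVER_PORT`, clients need to target the new port as well. The snippet below is a purely illustrative sketch of deriving the base URL on the client side, falling back to the documented default of `9191`; reusing the server's variable name in client code is just a convention assumed here.

```go
package main

import (
	"fmt"
	"os"
)

// baseURL builds the proxy URL from COPILOT_SERVER_PORT, defaulting to 9191.
func baseURL() string {
	port := os.Getenv("COPILOT_SERVER_PORT")
	if port == "" {
		port = "9191" // documented default port
	}
	return fmt.Sprintf("http://localhost:%s", port)
}

func main() {
	fmt.Println(baseURL()) // e.g. http://localhost:9191
}
```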
📄 License
YO LICENSE – see LICENSE
Copyright (C) 2025 Travis Peacock
Thank you for trying Copilot OpenAI API (Go Edition)! If you like it, star the repo, open issues, or send PRs. :) And remember: if you like the software, give me a "Yo".