An open-source alternative to GitHub Copilot that runs locally.
If you haven't already, please pick one of the following platforms to run the LLM of your choice locally on your system.
Please note that you need to configure LLMs for the code completion and chat features separately. Some popular LLMs that we recommend are listed below. Please pick the model size (i.e. 1.3b, 7b, 13b, or 34b) based on your hardware capabilities.
| Code Completion | Chat | Links |
|---|---|---|
| deepseek-coder:{1.3b or 6.7b or 33b}-base | deepseek-coder:{1.3b or 6.7b or 33b}-instruct | Ollama Tags, Home |
| codellama:{7b or 13b or 34b}-code | codellama:{7b or 13b or 34b}-instruct | Ollama Tags, Home |
| | mistral:{7b}-instruct | Ollama Tags, Home |
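If you go with Ollama, the models from the table above can be downloaded ahead of time with `ollama pull`. As a sketch (the tags below are examples taken from the table; pick a size that fits your hardware):

```shell
# Pull one base model for code completion and one instruct model for chat.
# Tags are examples from the table above; substitute the size you need.
ollama pull deepseek-coder:6.7b-base
ollama pull deepseek-coder:6.7b-instruct
```

Run `ollama list` afterwards to confirm both tags are available locally.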
You can also pick a model by evaluating your local LLMs using Benchllama.
You can install the Privy extension from the Visual Studio Code Marketplace or from the Open VSX Registry.
Please set the following options in the settings for the Privy extension.
- `privy.provider` (required): Pick the platform that is being used for running LLMs locally. There is support for using OpenAI, but this will affect the privacy aspects of the solution. The default is `Ollama`.
- `privy.providerUrl` (required): The URL of the platform that is being used for running LLMs locally. The default is `http://localhost:11434`.
- `privy.model`: Select the model you want to use. If you want to use a model that is not listed, select `custom` and configure `privy.customModel` accordingly.

| Shortcut | Description |
|---|---|
| Alt + (for Windows/Linux) or Cmd + (for Mac) | Trigger inline code completion |
| Ctrl + Alt + c (for Windows/Linux) or Ctrl + Cmd + c (for Mac) | Start Chat |
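For illustration, the extension options described above could be set in VS Code's `settings.json` roughly like this (the values are placeholders assuming the Ollama defaults; the `deepseek-coder` tag is just one of the recommended models):

```json
{
  "privy.provider": "Ollama",
  "privy.providerUrl": "http://localhost:11434",
  "privy.model": "custom",
  "privy.customModel": "deepseek-coder:6.7b-instruct"
}
```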
Understanding these options and shortcuts will help you get the most out of Privy.
Lars Grammel, Iain Majer, Nicolas Carlo, RatoGBM, Lionel Okpeicha, MercerK, Lundeen.Bryan, DucoG, sbstn87, Manuel, alessandro-newzoo, Void&Null, WittyDingo, Eva, AlexeyLavrentev, linshu123, Michael Adams, restlessronin
Read our contributing guide to learn about our development process, how to propose bug fixes and improvements, and how to build and test your changes.
To help you get your feet wet and become familiar with our contribution process, we have a list of good first issues that contains things with a relatively limited scope. This is a great place to get started!