# ollama-copilot

Proxy that lets you use Ollama as a local AI code assistant, in the same way as GitHub Copilot.
## Install

Ensure Ollama is installed:

```sh
curl -fsSL https://ollama.com/install.sh | sh
```

Or follow the manual install instructions.
To use the default model expected by ollama-copilot:

```sh
ollama pull codellama:7b-code
```
Then install ollama-copilot itself:

```sh
sudo make USER=$(whoami) install
```
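Once installed, the proxy exposes two local endpoints, which the editor configurations in this README point at: port 11435 for the HTTP proxy and port 11437 for the Copilot API override. A minimal smoke test, assuming the service is running and `curl` is available:

```sh
#!/bin/sh
# Hypothetical smoke test for ollama-copilot. The port numbers come from
# the editor configuration in this README; adjust if you changed them.
check_port() {
  # Succeeds if anything answers on localhost:$1 within 2 seconds.
  if curl -s -o /dev/null --connect-timeout 2 "http://localhost:$1"; then
    echo "port $1: reachable"
  else
    echo "port $1: not reachable (is ollama-copilot running?)"
  fi
}

check_port 11435  # proxy endpoint used by http.proxy / g:copilot_proxy
check_port 11437  # Copilot API override endpoint
```

If either port reports as not reachable, start the service before configuring your editor.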
### Neovim

- Install copilot.vim
- Configure the proxy variables in your vim config:

```vim
let g:copilot_proxy = 'http://localhost:11435'
let g:copilot_proxy_strict_ssl = v:false
```
### VSCode

- Install the GitHub Copilot extension
- Sign in or sign up to GitHub
- Open the settings config and insert:

```json
{
    "github.copilot.advanced": {
        "debug.overrideProxyUrl": "http://localhost:11437"
    },
    "http.proxy": "http://localhost:11435",
    "http.proxyStrictSSL": false
}
```
## Roadmap

- Enable completions API usage; fill in the middle.
- Enable flexible model configuration (currently only codellama:7b-code is supported).
- Create self-installing functionality.
- Windows setup.
- Documentation on how to use.
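The fill-in-the-middle item above refers to CodeLlama's infilling prompt format, in which the text before the cursor and the text after the cursor are marked with special tokens and the model generates the missing middle. A sketch of how such a prompt could be assembled (the `<PRE>`/`<SUF>`/`<MID>` layout assumes CodeLlama's infilling template; the snippet only builds the string, it does not call the model):

```sh
#!/bin/sh
# Illustrative only: build a CodeLlama-style fill-in-the-middle prompt.
# prefix = code before the cursor, suffix = code after the cursor;
# the model is expected to generate the part that replaces <MID>.
prefix='def add(a, b):'
suffix='    return result'
fim_prompt="<PRE> ${prefix} <SUF>${suffix} <MID>"
echo "$fim_prompt"
```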