JamsMendez/ollama-copilot
Ollama Copilot

A proxy that lets you use Ollama as a code-completion assistant, similar to GitHub Copilot.

Installation

Ollama

Ensure Ollama is installed:

curl -fsSL https://ollama.com/install.sh | sh

Or follow the manual installation instructions.
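If the install succeeded, the ollama CLI should be on your PATH. A quick sanity check (the fallback message is just for illustration):

```shell
# Print the installed version, or a notice if the binary is missing
command -v ollama >/dev/null && ollama --version || echo "ollama not found on PATH"
```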

Models

To use the default model expected by ollama-copilot:

ollama pull codellama:7b-code
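As a quick smoke test, you can ask the pulled model for a completion through Ollama's standard /api/generate endpoint. This assumes the Ollama server is running on its default port 11434; the prompt is just an example:

```shell
# Request a single (non-streamed) completion from codellama:7b-code;
# prints a notice if the Ollama server is not reachable
curl -s http://localhost:11434/api/generate \
  -d '{"model": "codellama:7b-code", "prompt": "def add(a, b):", "stream": false}' \
  || echo "Ollama server not reachable on :11434"
```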

Makefile

Install ollama-copilot using the provided Makefile:

sudo make USER=$(whoami) install
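Once installed and running, the proxy should be listening locally. The ports below (11435 and 11437) are the ones used in the IDE configuration later in this README, so this is just a reachability check:

```shell
# Report whether anything answers on each proxy port
for port in 11435 11437; do
  if curl -s -o /dev/null "http://localhost:$port"; then
    echo "port $port: reachable"
  else
    echo "port $port: not reachable"
  fi
done
```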

Configure IDE

Neovim

  1. Install copilot.vim
  2. Configure the proxy variables:
let g:copilot_proxy = 'http://localhost:11435'
let g:copilot_proxy_strict_ssl = v:false

VS Code

  1. Install the GitHub Copilot extension
  2. Sign in or sign up with your GitHub account
  3. Open the settings JSON file and insert:
{
    "github.copilot.advanced": {
        "debug.overrideProxyUrl": "http://localhost:11437"
    },
    "http.proxy": "http://localhost:11435",
    "http.proxyStrictSSL": false
}

Roadmap

  • Enable completions API usage; fill-in-the-middle support.
  • Enable flexible model configuration (currently only codellama:7b-code is supported).
  • Create self-installing functionality.
  • Windows setup.
  • Documentation on how to use.
