Proxy that allows you to use Ollama as a code-completion assistant, in the style of GitHub Copilot.
## Installation

Ensure ollama is installed:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Or follow the manual install.
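Optionally, confirm the install worked and the Ollama server is reachable; a quick sketch, assuming Ollama's default port 11434:

```bash
# The CLI should report a version...
ollama --version
# ...and the server should answer on its default port.
curl -s http://localhost:11434/api/tags
```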
To use the default model expected by ollama-copilot:

```bash
ollama pull codellama:code
```

Then install ollama-copilot itself with Go:

```bash
go install github.com/bernardo-bruning/ollama-copilot@latest
```
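If you want to sanity-check these steps, a quick sketch (assuming Go's default `GOPATH` layout):

```bash
# The pulled model should show up in the local model list...
ollama list
# ...and the freshly installed binary should be under Go's bin directory.
ls "$(go env GOPATH)/bin/ollama-copilot"
```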
Ensure your `$PATH` includes `$HOME/go/bin` or `$GOPATH/bin`. For example, in `~/.bashrc` or `~/.zshrc`:

```bash
export PATH="$HOME/go/bin:$GOPATH/bin:$PATH"
```

## Running

```bash
ollama-copilot
```

Or, if you are hosting Ollama in a container or elsewhere:
```bash
OLLAMA_HOST="http://192.168.133.7:11434" ollama-copilot
```
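The editor configurations below point at the proxy on localhost ports 11435 and 11437. As a quick check that it is listening, a sketch assuming a Linux host with `ss` from iproute2:

```bash
# Both ports referenced in the editor configs below should appear as LISTEN sockets.
ss -tln | grep -E ':(11435|11437)'
```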
## Configure IDE

### Neovim

- Install copilot.vim
- Configure variables:

```vim
let g:copilot_proxy = 'http://localhost:11435'
let g:copilot_proxy_strict_ssl = v:false
```
### VS Code

- Install the Copilot extension
- Sign in or sign up on GitHub
- Open the settings config and insert:
```json
{
    "github.copilot.advanced": {
        "debug.overrideProxyUrl": "http://localhost:11437"
    },
    "http.proxy": "http://localhost:11435",
    "http.proxyStrictSSL": false
}
```

### Zed (experimental)

- Open settings (ctrl + ,)
- Set up edit completion proxying:
```json
{
    "features": {
        "edit_prediction_provider": "copilot"
    },
    "show_completions_on_input": true,
    "edit_predictions": {
        "copilot": {
            "proxy": "http://localhost:11435",
            "proxy_no_verify": true
        }
    }
}
```

### Emacs
- Install copilot-emacs
- Configure the proxy
```elisp
(use-package copilot
  :straight (:host github :repo "copilot-emacs/copilot.el" :files ("*.el")) ;; if you don't use straight.el, install it another way
  :ensure t
  ;; :hook (prog-mode . copilot-mode)
  :bind (("C-<tab>" . copilot-accept-completion))
  :config
  (setq copilot-network-proxy '(:host "127.0.0.1" :port 11435 :rejectUnauthorized :json-false)))
```

## Roadmap

- Enable completions APIs usage; fill in the middle.
- Enable a flexible model configuration (currently only codellama:code is supported).
- Create self-installing functionality.
- Windows setup
- Documentation on how to use it.
