Conversation

@bocklucas

Feature: Adding Ollama support

Screencast.from.2024-10-14.01.07.35.PM.webm

NOTE: I don't have an OpenAI account, so I can't verify that these changes didn't break the OpenAI path; someone with an OpenAI account should confirm it still works as expected.

Description

  • Added support for Ollama via additional env variables
  • Updated README to reflect the changes
  • Added sample env files for Ollama and OpenAI
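A minimal sketch of what the Ollama env file might look like. OLLAMA_BASE_URL is the variable named in this PR; the OLLAMA_MODEL name below is an assumption for illustration — use whatever variable name and model `.env-ollama.example` actually defines:

```shell
# Hypothetical .env contents for the Ollama setup (variable names other
# than OLLAMA_BASE_URL are examples, not confirmed by this PR).
# Keep the /v1 suffix — it selects Ollama's OpenAI-compatible API.
OLLAMA_BASE_URL=http://localhost:11434/v1
# Must be a vision-capable model you have already pulled with `ollama pull`.
OLLAMA_MODEL=llava
```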

QA

  1. Copy .env-ollama.example to .env. Update OLLAMA_BASE_URL to point to your Ollama instance (keep the /v1 at the end), and set the model to a vision model you have pulled.
  2. Run npm run dev, go to http://localhost:3000, and draw something.
  3. Click on Make Real and verify you get a result from Ollama.
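One easy thing to get wrong in step 1 is dropping the /v1 suffix from OLLAMA_BASE_URL. A small shell sanity check (the URL below is an example; substitute your own instance):

```shell
# Verify the configured base URL keeps the required /v1 suffix.
# Example value — replace with the URL of your own Ollama instance.
OLLAMA_BASE_URL="http://localhost:11434/v1"

case "$OLLAMA_BASE_URL" in
  */v1) echo "OK: base URL ends in /v1" ;;
  *)    echo "WARNING: base URL should end in /v1" ;;
esac
```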

@bocklucas
Author

CC @SawyerHood

@orkutmuratyilmaz

@bocklucas thank you for the PR :)

@bocklucas
Author

CC @shtepcell @0x-sen @kawana77b
