🚀 The feature, motivation and pitch
Ollama already supports Qwen2.5-VL (https://ollama.com/library/qwen2.5vl) and can load https://huggingface.co/mradermacher/olmOCR-7B-0825-GGUF, so it should be possible to run olmOCR on GPUs with less VRAM, but I can't figure out how to connect olmocr to an Ollama server. Documentation for this in the official docs would be welcome. It could become an official Ollama vision model too :) I tried the OCR demo on the site and it was stunningly good!
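For what it's worth, here is a minimal sketch of one way this might work, since Ollama exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`. This is not an official olmocr integration; it assumes the GGUF build has been imported into Ollama under a hypothetical tag `olmocr-7b`, and simply sends one page image through the standard `openai` client:

```python
# Sketch: send one page image to a local Ollama server via its
# OpenAI-compatible API. Assumes the olmOCR GGUF model has been
# imported into Ollama under the hypothetical tag "olmocr-7b".
import base64

from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # any non-empty string; Ollama does not check it
)

# Encode a rendered page image as a base64 data URL.
with open("page.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("ascii")

response = client.chat.completions.create(
    model="olmocr-7b",  # hypothetical tag for the imported GGUF model
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Transcribe the text on this page."},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/png;base64,{image_b64}"},
                },
            ],
        }
    ],
)
print(response.choices[0].message.content)
```

This only covers a raw chat-completion call; it does not reproduce olmocr's own prompting or PDF-rendering pipeline, which is why official documentation for pointing olmocr itself at an Ollama server would still be needed.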
Alternatives
No response
Additional context
No response