Replies: 3 comments
-
Hi @passoion! I'm Dosu and I'm helping the ragflow team. RAGFlow can't add Ollama models offline because it relies on the Ollama service to provide models via its HTTP API. If a model wasn't pulled (downloaded) before you disconnected from the internet, RAGFlow can't access or add it. To use RAGFlow and Ollama fully offline, you need to pull every model you want while still online, and keep the Ollama service running and reachable from RAGFlow.
If you see "No data" in the UI, it usually means RAGFlow can't reach the Ollama service or that the requested model isn't available locally. Double-check that the models are pulled and that the service is running and accessible on your LAN. For more details and step-by-step instructions, see the official deploy-local-LLM guide and FAQ.
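To check the "service running and reachable" half of this, you can query Ollama's HTTP API directly. Below is a minimal sketch, assuming the standard Ollama port 11434 and the `/api/tags` endpoint (which lists locally pulled models); adjust the host if Ollama runs on another machine on your LAN:

```python
import json
import urllib.request
from urllib.error import URLError

# Assumption: Ollama's default API address. Change the host if the
# service runs on another machine on the LAN.
OLLAMA_URL = "http://localhost:11434"


def parse_tags_response(body: str) -> list[str]:
    """Extract model names from an Ollama /api/tags JSON response."""
    data = json.loads(body)
    return [m["name"] for m in data.get("models", [])]


def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the locally pulled models, or [] if the service is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            return parse_tags_response(resp.read().decode("utf-8"))
    except (URLError, OSError):
        return []


if __name__ == "__main__":
    models = list_local_models()
    if models:
        print("Locally pulled models:", models)
    else:
        print("No models found -- is the Ollama service running and reachable?")
```

If this prints an empty result while offline, the models were never pulled (or the service is down), which matches the "No data" symptom described above.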
-
In order to allow all users on the local area network to access RAGFlow, I modified the port settings.
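For reference, exposing a Dockerized service to the whole LAN usually comes down to the `ports` mapping in `docker-compose.yml`. The snippet below is a generic illustration only; the service name and port numbers are placeholders, not RAGFlow's actual defaults, so check the project's own `docker-compose.yml` and `.env` for the real values:

```yaml
services:
  ragflow:                 # placeholder service name
    ports:
      - "0.0.0.0:80:80"    # bind to all interfaces so LAN clients can reach it
```

Binding to `0.0.0.0` (the Docker default when no host IP is given) makes the port reachable from other machines; binding to `127.0.0.1:80:80` instead would restrict access to the host itself.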
-
I have solved it.
-
I already deployed RAGFlow using "$ docker compose -f docker-compose.yml up -d" while connected to the internet, and the system runs just fine; I was able to add the Ollama model without any issues.
Ultimately, though, I want to use RAGFlow on a local network without any internet connection. After I disconnected from the internet and logged back into RAGFlow, I found that I could no longer add Ollama models. How can I fix this issue?
