lobehub/lobe-chat Discussions
🧠 LLM Usage Discussions
Find LLM usage cases with LobeHub.
- `temperature` and `top_p` cannot both be specified for this model. Please use only one.
  Labels: Model provider related · os:linux · deployment:server (Server-side database mode) · priority:medium (Important issues to be addressed soon) · feature:settings (Settings and configuration issues) · platform:web (Web platform) · device:pc (PC/Desktop) · browser
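The discussion title above describes providers that reject requests setting both sampling parameters at once. A minimal client-side workaround sketch, assuming an OpenAI-style request payload; the `sanitize_sampling_params` helper is hypothetical and not part of LobeHub's actual code:

```python
def sanitize_sampling_params(payload: dict, keep: str = "temperature") -> dict:
    """Return a copy of an OpenAI-style request payload with at most one
    of `temperature` / `top_p` set. `keep` names the parameter to retain.

    Hypothetical helper: some model providers reject requests that
    specify both parameters, so we drop the other one before sending.
    """
    conflicting = {"temperature", "top_p"}
    if keep not in conflicting:
        raise ValueError(f"keep must be one of {conflicting}")
    if conflicting <= payload.keys():
        # Both are present: remove the parameter we are not keeping.
        drop = (conflicting - {keep}).pop()
        payload = {k: v for k, v in payload.items() if k != drop}
    return payload


request = {"model": "some-model", "temperature": 0.7, "top_p": 0.9, "messages": []}
cleaned = sanitize_sampling_params(request, keep="temperature")
print(cleaned)
# -> {'model': 'some-model', 'temperature': 0.7, 'messages': []}
```

Keeping `temperature` and dropping `top_p` (or the reverse) is an arbitrary choice here; which one to retain depends on which parameter the user actually configured.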
- Chat with ollama/mistral-7b behind litellm returns strange answer
  Labels: provider:ollama (Ollama provider and models)