
feat: Add default base URL for Ollama #158


Open · hcmlinj wants to merge 4 commits into main

Conversation


@hcmlinj commented Mar 14, 2025

What type of PR is this?

According to Ollama's API docs, we should be able to default the base URL to http://localhost:11434 when none is configured.
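For context, the fallback this PR proposes can be sketched roughly as below. This is illustrative only: the helper name resolveBaseURL is made up, and envconfig refers to Ollama's github.com/ollama/ollama/envconfig package, whose Host() reads OLLAMA_HOST and falls back to Ollama's default local address on port 11434.

```go
package ollama

import "github.com/ollama/ollama/envconfig"

// resolveBaseURL is an illustrative sketch of the proposed fallback:
// an explicitly configured BaseURL wins; otherwise Ollama's own
// envconfig.Host() resolves the host from OLLAMA_HOST, defaulting to
// the local Ollama address on port 11434.
func resolveBaseURL(configured string) string {
	if configured != "" {
		return configured
	}
	return envconfig.Host().String()
}
```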

Check the PR title.

  • This PR title matches the format: <type>(optional scope): <description>
  • The description of this PR title is user-oriented and clear enough for others to understand.
  • If the current PR requires user awareness at the usage level, attach the PR that updates the user documentation. User docs repo

(Optional) Translate the PR title into Chinese.

(Optional) More detailed description for this PR (en: English / zh: Chinese).

en:
zh(optional):

(Optional) Which issue(s) this PR fixes:

(optional) The PR that updates user documentation:


CLAassistant commented Mar 14, 2025

CLA assistant check
All committers have signed the CLA.

@hcmlinj requested a review from kuhahalong on March 17, 2025 10:04
@hcmlinj requested a review from kuhahalong on March 17, 2025 11:40
```diff
@@ -37,8 +38,8 @@ var CallbackMetricsExtraKey = "ollama_metrics"

 // ChatModelConfig stores configuration options specific to Ollama
 type ChatModelConfig struct {
-	BaseURL string        `json:"base_url"`
 	Timeout time.Duration `json:"timeout"` // request timeout for http client
+	BaseURL string        `json:"base_url"` // optional; if empty, the host can be configured via the OLLAMA_HOST environment variable, otherwise it defaults to http://localhost:11434
```
Contributor:
Ollama provides an initialization function, api.ClientFromEnvironment(), which retrieves the host from environment variables. However, the component still has to be initialized before use either way, so the additional support for reading the host from environment variables doesn't seem to offer much convenience.

Author (@hcmlinj):

If you are using the default Ollama host or have already set the OLLAMA_HOST environment variable, there is no need to specify BaseURL when using ChatModelConfig. You only need to set BaseURL if you are using a non-default Ollama host and have not set the OLLAMA_HOST environment variable.
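To illustrate that point, here is a minimal usage sketch. It assumes the component lives at github.com/cloudwego/eino-ext/components/model/ollama with a NewChatModel constructor and a Model field on ChatModelConfig; with this change, BaseURL can simply be left empty for a local or OLLAMA_HOST-configured Ollama instance.

```go
package main

import (
	"context"
	"log"

	"github.com/cloudwego/eino-ext/components/model/ollama"
)

func main() {
	ctx := context.Background()

	// BaseURL is omitted: with this change the component falls back to
	// OLLAMA_HOST, or to the default local Ollama address if that is unset.
	cm, err := ollama.NewChatModel(ctx, &ollama.ChatModelConfig{
		Model: "llama3", // model name is only an example
	})
	if err != nil {
		log.Fatal(err)
	}
	_ = cm // ready to use
}
```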

Contributor:

My opinion is that we could call api.ClientFromEnvironment() to create the client when base_url is empty, instead of manually reading envconfig.Host() to set base_url.
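A rough sketch of that alternative (the helper name newOllamaClient is made up): use api.ClientFromEnvironment() when no base_url is configured, and build the client from the explicit URL otherwise.

```go
package ollama

import (
	"net/http"
	"net/url"

	"github.com/ollama/ollama/api"
)

// newOllamaClient sketches the suggestion above: when base_url is empty,
// api.ClientFromEnvironment() resolves the host (honoring OLLAMA_HOST and
// Ollama's default); otherwise the client is built from the explicit URL.
func newOllamaClient(baseURL string, httpClient *http.Client) (*api.Client, error) {
	if baseURL == "" {
		return api.ClientFromEnvironment()
	}
	u, err := url.Parse(baseURL)
	if err != nil {
		return nil, err
	}
	return api.NewClient(u, httpClient), nil
}
```

One trade-off to weigh: as far as I can tell, ClientFromEnvironment builds the client with http.DefaultClient, so a custom Timeout or HTTP client from ChatModelConfig would have to be applied separately on that path.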
