:latest tag should not be assumed for non-OCI artefacts #1534
Conversation
Reviewer's Guide

In the CLI's model-listing routine, the code now retains the `:latest` tag for Ollama and OCI URIs by wrapping the suffix removal in a guard that only applies to other schemes (after the existing Hugging Face URL normalization).

Flow diagram for model URI normalization and `:latest` tag handling:

```mermaid
flowchart TD
    A[Start: model URI] --> B{model starts with 'huggingface://'?}
    B -- Yes --> C[Replace 'huggingface://' with 'hf://']
    B -- No --> D{model starts with 'ollama://' or 'oci://'?}
    C --> D
    D -- Yes --> E[Keep ':latest' tag]
    D -- No --> F[Remove ':latest' suffix]
    E --> G[Continue processing]
    F --> G
```
@engelmi PTAL
Summary of Changes
Hello @ericcurtin, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request fixes an issue where the `:latest` tag was being incorrectly removed from model identifiers that are not OCI or Ollama specific. The change updates the model string processing to preserve the `:latest` tag for OCI and Ollama schemes while continuing to remove it for others, such as local files.

Highlights

- Conditional Suffix Removal: Modified the logic in `_list_models_from_store` to conditionally remove the `:latest` suffix from model identifiers.
- Scheme-Specific Handling: The `:latest` suffix is now only removed if the model identifier does not start with `ollama://` or `oci://`. This means it will still be removed for schemes like `file://` or `hf://`.
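The guarded suffix removal described in the highlights above can be sketched roughly as follows. This is a minimal illustration, not the actual `_list_models_from_store` from `ramalama/cli.py`; the helper name `normalize_model_name` is hypothetical:

```python
def normalize_model_name(model: str) -> str:
    """Normalize a model URI for display (hypothetical helper).

    Mirrors the logic described in the PR: the long Hugging Face
    scheme is shortened, and ':latest' is stripped only for schemes
    where the tag is not meaningful.
    """
    # Normalize the long Hugging Face scheme to the short alias.
    if model.startswith("huggingface://"):
        model = model.replace("huggingface://", "hf://", 1)

    # Keep ':latest' for Ollama and OCI references, where the tag is
    # significant; strip it everywhere else (e.g. file://, hf://).
    if not model.startswith(("ollama://", "oci://")):
        model = model.removesuffix(":latest")

    return model
```

With this guard, `oci://quay.io/foo/bar:latest` keeps its tag, while `file:///srv/model.gguf:latest` is shortened to `file:///srv/model.gguf`. Note `str.removesuffix` requires Python 3.9+.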
Code Review
The pull request modifies the `_list_models_from_store` function in `ramalama/cli.py` to avoid removing the `:latest` suffix from model references that use the `ollama://` or `oci://` schemes. This prevents the tool from incorrectly assuming the `:latest` tag for non-OCI artifacts.
Hey @ericcurtin - I've reviewed your changes - here's some feedback:

- If `file://` URIs should also preserve the `:latest` suffix, include `file://` in the exclusion list here.
- You can simplify the negative check by using `if not model.startswith(("ollama://", "oci://"))` instead of two separate `startswith` calls.
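The tuple form of `str.startswith` suggested in the review is equivalent to chaining the two prefix checks with `or`; a quick demonstration (the sample model string is made up):

```python
model = "ollama://smollm:latest"  # hypothetical example reference

# Two separate checks, as in the original patch.
verbose = model.startswith("ollama://") or model.startswith("oci://")

# Equivalent tuple form: str.startswith accepts a tuple of prefixes
# and returns True if any one of them matches.
compact = model.startswith(("ollama://", "oci://"))

assert verbose == compact
```

The tuple form scales better if more schemes (e.g. `file://`) ever need to join the exclusion list.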
Force-pushed from 5831bd4 to 0f9073d
LGTM

Yeah... Will get to them

LGTM
Force-pushed from 0f9073d to a345bc8
I see people showing things like:

file://srv/llm/modles/unsloth/Qwen3-235B-A22B-GGUF/UD_Q2_K_XL/Qwen3-235B-A22B-UD-Q2_K_XL-00001-of-00002.gguf:latest  1 month ago  46.42 GB
file://srv/llm/modles/unsloth/Qwen3-235B-A22B-GGUF/UD_Q2_K_XL/Qwen3-235B-A22B-UD-Q2_K_XL-00002-of-00002.gguf:latest  1 month ago  35.55 GB
file://srv/llm/modles/unsloth/Qwen3-235B-A22B-GGUF/Q8_0/Qwen3-235B-A22B-Q8_0-00001-of-00006.gguf:latest  1 week ago  46.44 GB
file://srv/llm/modles/unsloth/Qwen3-235B-A22B-GGUF/Q8_0/Qwen3-235B-A22B-Q8_0-00002-of-00006.gguf:latest  1 week ago  46.0 GB
file://srv/llm/modles/unsloth/Qwen3-235B-A22B-GGUF/Q8_0/Qwen3-235B-A22B-Q8_0-00003-of-00006.gguf:latest  1 week ago  45.93 GB
file://srv/llm/modles/unsloth/Qwen3-235B-A22B-GGUF/Q8_0/Qwen3-235B-A22B-Q8_0-00004-of-00006.gguf:latest  1 week ago  46.0 GB
file://srv/llm/modles/unsloth/Qwen3-235B-A22B-GGUF/Q8_0/Qwen3-235B-A22B-Q8_0-00005-of-00006.gguf:latest  1 week ago  46.0 GB
file://srv/llm/modles/unsloth/Qwen3-235B-A22B-GGUF/Q8_0/Qwen3-235B-A22B-Q8_0-00006-of-00006.gguf:latest  1 week ago  2.39 GB

Signed-off-by: Eric Curtin <[email protected]>
Force-pushed from a345bc8 to 8ff0cd3
LGTM
Summary by Sourcery

Enhancements:

- Preserve the `:latest` tag for `ollama://` and `oci://` model references when listing models, while continuing to strip it for other schemes.