Add --all option to ramalama ls #1528
Conversation
Relates to: containers#1278

By default, ramalama ls should not display partially downloaded AI models. To let users view all models, a new --all option for the ls command has been introduced.

Signed-off-by: Michael Engel <[email protected]>
Reviewer's Guide

Introduces a new --all flag to the ls/list command so that partially downloaded models are excluded from the listing by default and only shown on request.
| Change | Files |
| --- | --- |
| Extended CLI to support an --all flag | ramalama/cli.py |
| Adjusted listing logic to filter and label partial downloads | ramalama/cli.py |
| Simplified partial detection in ModelStore | ramalama/model_store.py |
| Updated manpage with --all option and example | docs/ramalama-list.1.md |
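For illustration, here is a minimal, self-contained sketch of what the CLI side of such a change could look like. Everything in it — the Entry class, build_parser, list_models, and the "(partial)" label — is an assumption for this sketch, not the actual ramalama code:

```python
# Illustrative sketch only: the real ramalama code differs in names and structure.
import argparse
from dataclasses import dataclass


@dataclass
class Entry:
    name: str
    modified: str
    size: str
    is_partial: bool  # True while a download has not finished


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="ramalama")
    sub = parser.add_subparsers(dest="command")
    ls = sub.add_parser("ls", aliases=["list"], help="list models")
    # New flag: include partially downloaded models in the listing.
    ls.add_argument("--all", action="store_true",
                    help="include partially downloaded AI models")
    return parser


def list_models(entries: list[Entry], show_all: bool) -> list[tuple[str, str, str]]:
    rows = []
    for entry in entries:
        if entry.is_partial and not show_all:
            continue  # partial downloads are hidden by default
        name = entry.name + (" (partial)" if entry.is_partial else "")
        rows.append((name, entry.modified, entry.size))
    return rows


if __name__ == "__main__":
    args = build_parser().parse_args(["ls", "--all"])
    store = [
        Entry("hf://ggml-org/gemma-3-4b-it-GGUF", "2 minutes ago", "3.11 GB", False),
        Entry("hf://example/incomplete-model-GGUF", "1 minute ago", "0.4 GB", True),
    ]
    for row in list_models(store, args.all):
        print("{:<55} {:<15} {}".format(*row))
```

With this shape, ramalama ls would skip incomplete entries by default, while ramalama ls --all would print them with a marker; the exact label and column layout in the real patch may differ.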
Another weird thing about …

I'll have a look 👀

I couldn't reproduce the issue @ericcurtin

    ramalama pull hf://ggml-org/gemma-3-4b-it-GGUF

A subsequent ls looked like this:

    $ ramalama ls
    hf://ggml-org/gemma-3-4b-it-GGUF                                                  2 minutes ago    3.11 GB
    url://huggingface.co/mradermacher/SmolLM-135M-GGUF/SmolLM-135M.IQ3_M.gguf:main    1 month ago      86.03 MB

Looking at your output, it seems you have a different LLM model. Could you run …

I'll clean up my model store and ignore it for now @engelmi, there have been a lot of changes recently. Who knows at this point...

Here's the output with --all: …

Fair point. Please let me know if the issue persists :) Note: Based on the …
Summary by Sourcery
Add an --all flag to ramalama ls/list to show partially downloaded models (excluded by default), annotate them in the output, and update documentation and model-store logic accordingly.
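The model-store side of the change is only named above, not shown in this thread. As a rough idea of what a simplified partial-download check can look like (the function name, file layout, and criteria below are assumptions, not the actual ramalama/model_store.py code):

```python
# Hypothetical partial-download check; names and layout are assumptions only.
from pathlib import Path


def is_partially_downloaded(blob_dir: Path, expected_blobs: list[str]) -> bool:
    """Treat a model as partial if any expected blob is missing or empty."""
    for name in expected_blobs:
        blob = blob_dir / name
        if not blob.exists() or blob.stat().st_size == 0:
            return True
    return False
```

A boolean like this is all the listing code needs in order to decide whether to hide an entry by default or label it as partial when --all is given.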