
Adds a user configuration setting to disable gpu prompting #1632


Merged

rhatdan merged 1 commit into containers:main from feat/user-prompt-configs on Jul 3, 2025

Conversation

ieaves
Collaborator

@ieaves ieaves commented Jun 30, 2025

This PR provides two pieces of functionality:

  1. The ability to nest configuration settings as structured objects
  2. A user configuration field to disable the no-GPU prompt. This was inspired by issue #1618 (Tests can't be run on OSX for users using the applehv engine)

If we like this path, I will modify the tests to disable the gpu prompting with a fixture.

Summary by Sourcery

Enable structured nested configurations and add a user-level option to bypass the missing-GPU prompt

New Features:

  • Support nested configuration settings as structured dataclass objects
  • Introduce a user.no_missing_gpu_prompt setting to disable the missing-GPU prompt

Enhancements:

  • Integrate build_subconfigs in LayeredMixin to instantiate nested config dataclasses
  • Modify confirm_no_gpu to respect the new no_missing_gpu_prompt user setting
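
For illustration, a minimal sketch of that early return (the function signature and prompt wording here are assumptions; the real confirm_no_gpu in ramalama/common.py may differ):

from ramalama.config import CONFIG


def confirm_no_gpu(name: str) -> bool:
    # Honor the user-level opt-out before doing any interactive prompting.
    if CONFIG.user.no_missing_gpu_prompt:
        return True
    answer = input(f"Warning! No GPU detected for {name}. Continue using the CPU? [y/N] ")
    return answer.strip().lower() in {"y", "yes"}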

Contributor

sourcery-ai bot commented Jun 30, 2025

Reviewer's Guide

Extends the layered configuration system to support nested dataclass-based subconfig objects and adds a new user-level flag to disable GPU prompts, wiring it through the config initialization and prompt flow.

Sequence diagram for disabling GPU prompt via user config

sequenceDiagram
    participant User
    participant Application
    participant Config
    participant confirm_no_gpu
    User->>Application: Start operation requiring GPU
    Application->>Config: Access CONFIG.user.no_missing_gpu_prompt
    Config-->>Application: Return True (if set)
    Application->>confirm_no_gpu: Call function
    confirm_no_gpu->>Config: Check CONFIG.user.no_missing_gpu_prompt
    alt no_missing_gpu_prompt is True
        confirm_no_gpu-->>Application: Return True (skip prompt)
    else no_missing_gpu_prompt is False
        confirm_no_gpu->>User: Prompt for confirmation
        User-->>confirm_no_gpu: Respond
        confirm_no_gpu-->>Application: Return result
    end

Class diagram for nested configuration and UserConfig addition

classDiagram
    class LayeredMixin {
        +__init__(*layers)
        +is_set(name)
    }
    class BaseConfig {
        +container: bool
        +engine: str
        +store: str
        +model: str
        +transport: str
        +ocr: bool
        +default_image: str
        +user: UserConfig
        +__post_init__()
    }
    class UserConfig {
        +no_missing_gpu_prompt: bool
    }
    BaseConfig --> UserConfig : has-a
    LayeredMixin <|-- BaseConfig

Class diagram for build_subconfigs utility function

classDiagram
    class build_subconfigs {
        +build_subconfigs(values: dict, obj) dict
    }

File-Level Changes

Enable nested config objects by instantiating dataclass subconfigs from nested dict values (ramalama/layered_config.py)
  • Import is_dataclass in layered_config
  • Implement build_subconfigs to recurse into dict fields matching dataclass annotations
  • Wrap the merged config dict with build_subconfigs before super().__init__ in LayeredMixin

Introduce a new UserConfig dataclass and embed it into BaseConfig for user-specific settings (ramalama/config.py)
  • Define UserConfig with a no_missing_gpu_prompt flag
  • Add a user: UserConfig field with default_factory to BaseConfig
  • Add a sample Config instantiation overriding user.no_missing_gpu_prompt

Respect the new no_missing_gpu_prompt flag in the GPU confirmation flow (ramalama/common.py)
  • Import CONFIG in confirm_no_gpu
  • Return early if CONFIG.user.no_missing_gpu_prompt is true before prompting
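
As a rough illustration, the nested shape described above could look like this (only user and no_missing_gpu_prompt come from this PR; the other fields and defaults are placeholders, not the actual BaseConfig definition):

from dataclasses import dataclass, field


@dataclass
class UserConfig:
    no_missing_gpu_prompt: bool = False


@dataclass
class BaseConfig:
    container: bool = True   # placeholder default
    engine: str = "podman"   # placeholder default
    user: UserConfig = field(default_factory=UserConfig)

With this in place, a nested dict such as {"user": {"no_missing_gpu_prompt": True}} read from a config layer can be turned into a typed UserConfig instance by build_subconfigs.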


Contributor

@sourcery-ai sourcery-ai bot left a comment


Hey @ieaves - I've reviewed your changes and they look great!

Prompt for AI Agents
Please address the comments from this code review:
## Individual Comments

### Comment 1
<location> `ramalama/layered_config.py:20` </location>
<code_context>
+    """Facilitates nesting configs by instantiating the child typed object from a dict"""
+    dtypes: dict[str, Type] = obj.__annotations__
+    for k, v in values.items():
+        if isinstance(v, dict) and (subconfig_type := dtypes.get(k)) and is_dataclass(subconfig_type):
+            values[k] = subconfig_type(**build_subconfigs(v, dtypes[k]))
+    return values
+
</code_context>

<issue_to_address>
Modifying the input dictionary in-place may have unintended side effects.

Mutating 'values' directly can cause issues if it's used elsewhere. Recommend returning a new dictionary to prevent side effects.
</issue_to_address>


Comment on lines +20 to +36
if isinstance(v, dict) and (subconfig_type := dtypes.get(k)) and is_dataclass(subconfig_type):
    values[k] = subconfig_type(**build_subconfigs(v, dtypes[k]))
Contributor


issue (bug_risk): Modifying the input dictionary in-place may have unintended side effects.

Mutating 'values' directly can cause issues if it's used elsewhere. Recommend returning a new dictionary to prevent side effects.
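
A minimal sketch of the non-mutating variant being suggested, based on the names in the diff excerpt above (not the code that was merged):

from dataclasses import is_dataclass
from typing import Any, Type


def build_subconfigs(values: dict[str, Any], obj: Type) -> dict[str, Any]:
    """Return a new dict in which nested dicts matching dataclass-typed
    fields of obj are replaced by instances of those dataclasses."""
    dtypes: dict[str, Type] = obj.__annotations__
    result: dict[str, Any] = {}
    for k, v in values.items():
        subconfig_type = dtypes.get(k)
        if isinstance(v, dict) and subconfig_type and is_dataclass(subconfig_type):
            # Recurse first so deeply nested configs are built bottom-up,
            # leaving the caller's dict untouched.
            result[k] = subconfig_type(**build_subconfigs(v, subconfig_type))
        else:
            result[k] = v
    return result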

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Summary of Changes

Hello @ieaves, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request enhances the configuration system by allowing nested configurations and introduces a user-configurable setting to disable the GPU prompt. This change aims to provide more flexibility in configuring the application and addresses a specific user request.

Highlights

  • Configuration Nesting: Introduces the ability to nest configuration settings as structured objects using dataclasses.
  • Disable GPU Prompt: Adds a user configuration option to disable the 'no GPU found' prompt, addressing issue #1618.

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a feature to disable the GPU-missing prompt, enabled by adding support for nested configuration objects. The implementation is clean and effective.

@ieaves
Collaborator Author

ieaves commented Jul 1, 2025

This morning I thought this was going to be an easy fix, but once I started running unit tests I realized the issue was more complicated than it first appeared because of a circular import between common and config. To break the cyclical dependency I needed to pull apple_vm out of the get_default_engine logic used in the config. That is a good idea in any case, since otherwise we end up calling it multiple times, and each call involves a subprocess call to the podman CLI.

The equivalent logic for OSX has been moved to the initialize_environment code under the root __init__.py.

Next up, I needed a place to store the new prompt-disabling configuration setting. I went back and forth on setting up a new local cache for user preferences using something like shelve, but anticipated some hesitation around splitting the config and introducing a second file that has to be managed. Instead, I landed on leaving it in the default config under a nested user attribute. This required some additional machinery so that we can instantiate and type nested config objects.

At this point I realized that modifying the config file in a testing environment is complicated by the multiple config file locations we traverse, and that, for the purposes of the original issue, a better resolution would be to allow these settings to be defined through environment variables. To do that, I implemented a pattern for resolving arbitrary config values from environment variables, where the config value

[ramalama.subkey]
key = value

corresponds to the environment variable RAMALAMA_SUBKEY__KEY=value or equivalently RAMALAMA__SUBKEY__KEY=value.

Finally, I added runtime type validation logic for these environment variables.

With that machinery in place, I added changes in conftest to preload specific environment variables for testing purposes and then restore the user's environment once testing completes.
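
For illustration, the mapping from environment variables to nested config keys described above could be implemented roughly like this (a sketch only, not the actual resolution code in this PR):

import os
from typing import Any


def env_overrides(prefix: str = "RAMALAMA") -> dict[str, Any]:
    """Collect RAMALAMA_SUBKEY__KEY / RAMALAMA__SUBKEY__KEY variables into a
    nested dict like {"subkey": {"key": "value"}}."""
    overrides: dict[str, Any] = {}
    for name, value in os.environ.items():
        if not name.startswith(prefix + "_"):
            continue
        # Drop the prefix (tolerating one or two underscores after it), then
        # split the remainder on "__" to obtain the nested key path.
        remainder = name[len(prefix):].lstrip("_")
        path = [part.lower() for part in remainder.split("__") if part]
        if not path:
            continue
        node = overrides
        for part in path[:-1]:
            node = node.setdefault(part, {})
        node[path[-1]] = value
    return overrides

For example, RAMALAMA_USER__NO_MISSING_GPU_PROMPT=true would yield {"user": {"no_missing_gpu_prompt": "true"}}, with the string value then coerced to a bool by the runtime type validation mentioned above.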

@ieaves ieaves force-pushed the feat/user-prompt-configs branch 2 times, most recently from ed2d303 to 45fce73 on July 2, 2025 at 21:50
Signed-off-by: Ian Eaves <[email protected]>
@ieaves ieaves force-pushed the feat/user-prompt-configs branch from 45fce73 to 27fa390 on July 2, 2025 at 21:59
def coerce_to_bool(value: Any) -> bool:
    if isinstance(value, bool):
        return value
    elif isinstance(value, str):
Member

Nit: No need for elif, if works fine.

val = value.strip().lower()
if val in {"true", "1", "yes", "y"}:
    return True
elif val in {"false", "0", "no", "n"}:
Member

Nit: No need for elif, if works fine.
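
A possible final shape of the helper with both nits applied; the error branch at the end is an assumption, since the diff excerpt above is truncated:

from typing import Any


def coerce_to_bool(value: Any) -> bool:
    if isinstance(value, bool):
        return value
    if isinstance(value, str):
        val = value.strip().lower()
        if val in {"true", "1", "yes", "y"}:
            return True
        if val in {"false", "0", "no", "n"}:
            return False
    # The excerpt is truncated here; raising on unrecognized input is an assumption.
    raise ValueError(f"Cannot coerce {value!r} to bool")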

@rhatdan
Member

rhatdan commented Jul 3, 2025

LGTM

@rhatdan rhatdan merged commit ae38e3f into containers:main Jul 3, 2025
17 of 25 checks passed