
Conversation

@ericcurtin (Member) commented Jul 9, 2025

This was recently removed:

    if getattr(self.args, "model", False):
        data["model"] = self.args.model
it is required

Summary by Sourcery

Restore conditional addition of the model field in chat request payloads based on the presence of the --model argument.

Bug Fixes:

  • Reintroduce the model field in request data only when self.args.model is set instead of always using self.args.MODEL.

Enhancements:

  • Switch to using getattr(self.args, "model", False) for safe attribute access.
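
A minimal sketch of the guarded payload construction described above, assuming an argparse-style args object; the function name, the "stream" field, and the example model name are illustrative stand-ins rather than the actual ramalama/chat.py code:

    from types import SimpleNamespace

    def make_request_data(args, conversation_history):
        # Base payload for the chat completion request.
        data = {"messages": conversation_history, "stream": True}
        # Add "model" only when --model was supplied; getattr with a False
        # default keeps this safe even if the attribute is missing entirely.
        if getattr(args, "model", False):
            data["model"] = args.model
        return data

    # With --model set the field is included; without it, it is omitted.
    print(make_request_data(SimpleNamespace(model="smollm:135m"), []))
    print(make_request_data(SimpleNamespace(), []))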

@sourcery-ai bot (Contributor) commented Jul 9, 2025

Reviewer's Guide

Reintroduced conditional 'model' parameter in request payload using getattr, replacing the previous unconditional uppercase MODEL assignment.

Class diagram for updated request data construction in chat module

classDiagram
    class Chat {
        - args
        - conversation_history
        + _make_request_data()
    }
    Chat : _make_request_data() now conditionally adds 'model' to data if self.args.model exists

File-Level Changes

Change: Restore conditional inclusion of the 'model' field in request data
Details:
  • Add getattr(self.args, 'model', False) guard
  • Assign lowercase self.args.model instead of self.args.MODEL
  • Indent model assignment under the new conditional
Files: ramalama/chat.py
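
Read together, those three details amount to roughly the following shape of change; this is a hedged reconstruction using a hypothetical Args stand-in, not the exact diff applied to ramalama/chat.py:

    class Args:
        """Hypothetical stand-in for the parsed CLI arguments."""
        model = "tinyllama"  # illustrative --model value

    args = Args()
    data = {}

    # Old form (unconditional, uppercase attribute), per the bullets above:
    #   data["model"] = args.MODEL
    # New form: guard with getattr and use the lowercase attribute, with the
    # assignment indented under the new conditional.
    if getattr(args, "model", False):
        data["model"] = args.model

    print(data)  # -> {'model': 'tinyllama'}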

Tips and commands

Interacting with Sourcery

  • Trigger a new review: Comment @sourcery-ai review on the pull request.
  • Continue discussions: Reply directly to Sourcery's review comments.
  • Generate a GitHub issue from a review comment: Ask Sourcery to create an
    issue from a review comment by replying to it. You can also reply to a
    review comment with @sourcery-ai issue to create an issue from it.
  • Generate a pull request title: Write @sourcery-ai anywhere in the pull
    request title to generate a title at any time. You can also comment
    @sourcery-ai title on the pull request to (re-)generate the title at any time.
  • Generate a pull request summary: Write @sourcery-ai summary anywhere in
    the pull request body to generate a PR summary at any time exactly where you
    want it. You can also comment @sourcery-ai summary on the pull request to
    (re-)generate the summary at any time.
  • Generate reviewer's guide: Comment @sourcery-ai guide on the pull
    request to (re-)generate the reviewer's guide at any time.
  • Resolve all Sourcery comments: Comment @sourcery-ai resolve on the
    pull request to resolve all Sourcery comments. Useful if you've already
    addressed all the comments and don't want to see them anymore.
  • Dismiss all Sourcery reviews: Comment @sourcery-ai dismiss on the pull
    request to dismiss all existing Sourcery reviews. Especially useful if you
    want to start fresh with a new review - don't forget to comment
    @sourcery-ai review to trigger a new review!

Customizing Your Experience

Access your dashboard to:

  • Enable or disable review features such as the Sourcery-generated pull request
    summary, the reviewer's guide, and others.
  • Change the review language.
  • Add, remove or edit custom review instructions.
  • Adjust other review settings.


@ericcurtin (Member, Author) commented:

@kush-gupt PTAL

@gemini-code-assist bot (Contributor) left a comment


Summary of Changes

Hello @ericcurtin, I'm Gemini Code Assist[1]! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses a bug in the chat functionality by reintroducing a necessary conditional check for the model parameter. This ensures that the model is included in API requests only when it is available, preventing potential errors or misconfigurations that could arise from the previous unconditional assignment. It also corrects a case sensitivity issue in the attribute name.

Highlights

  • Bugfix for Model Parameter Handling: Reinstated the conditional assignment of the model parameter within the _make_request_data function in ramalama/chat.py. This restores logic that ensures the model is only added to the request data if self.args.model exists, addressing a recent regression in which this check was removed.
  • Attribute Name Correction: Corrected the attribute name used for retrieving the model from self.args.MODEL (uppercase) to self.args.model (lowercase), aligning with standard Python conventions and fixing a potential typo.

Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e. commenting on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.

  • Code Review (/gemini review): Performs a code review for the current pull request in its current state.
  • Pull Request Summary (/gemini summary): Provides a summary of the current pull request in its current state.
  • Comment (@gemini-code-assist): Responds in comments when explicitly tagged, both in issue comments and review comments.
  • Help (/gemini help): Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

@sourcery-ai bot (Contributor) left a comment


Hey @ericcurtin - I've reviewed your changes and they look great!


Sourcery is free for open source - if you like our reviews please consider sharing them ✨
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.

ramalama/chat.py Outdated
Comment on lines 135 to 137
     if not (hasattr(self.args, 'runtime') and self.args.runtime == "mlx"):
-        data["model"] = self.args.MODEL
+        if getattr(self.args, "model", False):
+            data["model"] = self.args.model

suggestion (code-quality): Merge nested if conditions (merge-nested-ifs)

Suggested change:

-    if not (hasattr(self.args, 'runtime') and self.args.runtime == "mlx"):
-        data["model"] = self.args.MODEL
-        if getattr(self.args, "model", False):
-            data["model"] = self.args.model
+    if not (hasattr(self.args, 'runtime') and self.args.runtime == "mlx") and getattr(self.args, "model", False):
+        data["model"] = self.args.model


Explanation: Too much nesting can make code difficult to understand, and this is especially
true in Python, where there are no brackets to help out with the delineation of
different nesting levels.

Reading deeply nested code is confusing, since you have to keep track of which
conditions relate to which levels. We therefore strive to reduce nesting where
possible, and the situation where two if conditions can be combined using
and is an easy win.

@gemini-code-assist bot (Contributor) left a comment

Code Review

The pull request reintroduces a previously removed block of code that sets the 'model' attribute in the request data for the chat API. A suggestion has been made to add a check to ensure the model attribute has a valid value before assignment.

ramalama/chat.py Outdated
Comment on lines 136 to 137
    if getattr(self.args, "model", False):
        data["model"] = self.args.model

Severity: medium

Consider checking if self.args.model is not None or empty before assigning it to data["model"] to prevent potential errors if the value is missing or invalid.

Suggested change:

-    if getattr(self.args, "model", False):
-        data["model"] = self.args.model
+    if getattr(self.args, "model", False) and self.args.model:
+        data["model"] = self.args.model

@kush-gupt (Contributor) commented Jul 9, 2025

This fixes the model error indeed!

For chatting with mlx, a wrong model name won't show any errors on the chat side. It just repeats the API request indefinitely, and the only indication is in the mlx server process logs:

    127.0.0.1 - - [09/Jul/2025 12:06:27] "POST /v1/chat/completions HTTP/1.1" 404 -
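
For context, a hedged sketch of one way a chat client could surface that 404 on the chat side instead of silently retrying; this is illustrative only and not how ramalama/chat.py actually issues requests (the endpoint URL and payload are placeholders):

    import json
    import urllib.error
    import urllib.request

    def post_chat(url, payload):
        # Send a single chat completion request and fail loudly on an HTTP error.
        req = urllib.request.Request(
            url,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        try:
            with urllib.request.urlopen(req) as resp:
                return json.loads(resp.read())
        except urllib.error.HTTPError as exc:
            # A wrong model name shows up here as e.g. a 404, so the user sees
            # the failure in the chat client, not only in the server logs.
            raise SystemExit(f"chat request failed: HTTP {exc.code} from {url}")

    # Example (placeholder endpoint):
    # post_chat("http://127.0.0.1:8080/v1/chat/completions",
    #           {"model": "default_model", "messages": []})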

@kush-gupt (Contributor) commented:

An agnostic model name that works is "default_model"; nothing else worked with my Continue integration.

@ericcurtin (Member, Author) commented Jul 9, 2025

On second thought, I think we should just remove this bit and go back to the original code prior to mlx:

    not (hasattr(self.args, 'runtime') and self.args.runtime == "mlx")

If the user passes some bogus model, that's user error; we shouldn't mask that.

Commit message (pushed to this pull request):

    This was recently removed:

        +            if getattr(self.args, "model", False):
        +                data["model"] = self.args.model

    it is required

    Signed-off-by: Eric Curtin <[email protected]>
@rhatdan (Member) commented Jul 9, 2025

LGTM

@rhatdan merged commit 5fd3015 into main on Jul 9, 2025 (28 of 38 checks passed).
@ericcurtin deleted the bugfix-for-chat branch on July 9, 2025 at 20:06.
@ieaves mentioned this pull request on Jul 9, 2025.