Allow std input #1606
Conversation
Reviewer's Guide

Reintroduce support for reading prompts from standard input when piped, merging piped data with CLI arguments and preserving existing behavior for direct arguments.

Sequence diagram for handling prompt input from CLI arguments and stdin:

```mermaid
sequenceDiagram
    actor User
    participant CLI as Command Line Interface
    participant ChatHandler
    User->>CLI: Provide ARGS and/or pipe stdin
    CLI->>ChatHandler: Initialize with args
    ChatHandler->>ChatHandler: Check if ARGS present
    alt ARGS present
        ChatHandler->>ChatHandler: Set prompt from ARGS
    end
    ChatHandler->>ChatHandler: Check if stdin is piped
    alt stdin is piped
        ChatHandler->>ChatHandler: Read stdin
        ChatHandler->>ChatHandler: Append stdin to prompt (if ARGS)
    end
    alt prompt exists
        ChatHandler->>ChatHandler: Call self.default(prompt)
        ChatHandler->>ChatHandler: Call self.kills()
        ChatHandler->>CLI: Return True
    else
        ChatHandler->>CLI: Return False
    end
```
Class diagram for updated prompt handling in ChatHandler:

```mermaid
classDiagram
    class ChatHandler {
        - args
        - url
        + handle_args()
        + default(prompt)
        + kills()
    }
    ChatHandler : handle_args() now merges ARGS and stdin for prompt
```
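The flow in the diagrams above can be sketched in Python. This is a minimal illustration using the method names from the diagrams (`handle_args`, `default`, `kills`); the real ramalama code may differ, and the stdin stream is injected as a constructor parameter here for testability, where the real implementation would check `sys.stdin.isatty()` to detect a pipe:

```python
class ChatHandler:
    """Minimal sketch of the prompt-merging flow from the diagrams above.

    Not the real ramalama class: stdin is injected for testability, and
    the real code would check sys.stdin.isatty() to detect piped input.
    """

    def __init__(self, args, stdin=None):
        self.args = args          # positional CLI arguments (the ARGS)
        self.stdin = stdin        # file-like object when input is piped, else None
        self.last_prompt = None

    def handle_args(self):
        prompt = ""
        if self.args:                       # ARGS present
            prompt = " ".join(self.args)
        if self.stdin is not None:          # stdin is piped
            data = self.stdin.read()
            # Two newlines separate the ARGS prompt from the piped content
            prompt = prompt + "\n\n" + data if prompt else data
        if prompt:
            self.default(prompt)
            self.kills()
            return True
        return False

    def default(self, prompt):
        self.last_prompt = prompt           # stand-in for sending the prompt

    def kills(self):
        pass                                # cleanup hook shown in the diagram
```

With both ARGS and piped input, `handle_args` produces `"<args>\n\n<stdin>"` and returns `True`; with neither, it returns `False` so the caller can fall back to interactive mode.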
Summary of Changes
Hello @ericcurtin, I'm Gemini Code Assist[^1]! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!
This pull request reintroduces a previously existing feature that allows the `ramalama` tool to process data piped through standard input. This enhances the tool's flexibility, enabling workflows where content from other sources can be seamlessly fed into `ramalama` for processing, optionally alongside a user-defined prompt from command-line arguments.
Highlights

- Standard Input Reinstatement: I've reinstated the ability for the `ramalama` tool to accept input via standard input (stdin). This allows users to pipe content from files or other commands directly into `ramalama`.
- Prompt Concatenation: The implementation now intelligently combines command-line arguments with standard input. If both are provided, the standard input content is appended to the command-line arguments, separated by two newlines, forming a single comprehensive prompt.
Using Gemini Code Assist
The full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips.
Invoking Gemini
You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e. a comment on the pull request page) using either `/gemini <command>` or `@gemini-code-assist <command>`. Below is a summary of the supported commands.
| Feature | Command | Description |
|---|---|---|
| Code Review | `/gemini review` | Performs a code review for the current pull request in its current state. |
| Pull Request Summary | `/gemini summary` | Provides a summary of the current pull request in its current state. |
| Comment | `@gemini-code-assist` | Responds in comments when explicitly tagged, both in issue comments and review comments. |
| Help | `/gemini help` | Displays a list of available commands. |
Customization
To customize your Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a `.gemini/` folder in the base of the repository. Detailed instructions can be found here.
Limitations & Feedback
Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.
You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.
Footnotes

[^1]: Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution. ↩
Force-pushed from b41b4ef to 87da2b0
This might help @ieaves
Hey @ericcurtin - I've reviewed your changes and they look great!
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.
Code Review
This pull request re-implements the feature to allow standard input to be used as a prompt for the `ramalama chat` command. The implementation is straightforward. A suggestion has been made to refactor the implementation slightly to make it more robust against empty piped input and to improve its overall clarity.
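The reviewer's robustness point could look something like the following sketch (a hypothetical helper, not the code actually merged; the name `build_prompt` and its signature are assumptions): stripping the piped text and merging it only when non-empty avoids sending a prompt that is just whitespace:

```python
import sys


def build_prompt(args, stdin=sys.stdin):
    """Hypothetical helper: merge CLI args with piped stdin into one prompt."""
    parts = []
    if args:
        parts.append(" ".join(args))
    if not stdin.isatty():             # stdin is piped or redirected, not a terminal
        piped = stdin.read().strip()
        if piped:                      # ignore empty or whitespace-only piped input
            parts.append(piped)
    return "\n\n".join(parts) or None  # None signals: no prompt, go interactive
```

Returning `None` when neither source yields text gives the caller a single, unambiguous signal to fall back to interactive mode.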
We used to have this feature, got dropped recently accidentally, can do things like: `cat text_file_with_prompt.txt | ramalama run smollm:135m` or `cat some_doc | ramalama run smollm:135m Explain this document:`

Signed-off-by: Eric Curtin <[email protected]>
Force-pushed from 87da2b0 to aab36b0
LGTM
We used to have this feature; it was accidentally dropped recently. It enables things like:

```shell
cat text_file_with_prompt.txt | ramalama run smollm:135m
```

or

```shell
cat some_doc | ramalama run smollm:135m Explain this document:
```
Summary by Sourcery
Allow reading from standard input for prompts, merging it with any provided CLI arguments and passing the combined text to the chat handler.