Conversation

DavidKoleczek
Collaborator

Create a new generic wrapper for LLMs that can easily be extended to other providers.

@DavidKoleczek DavidKoleczek requested a review from Copilot January 25, 2025 16:50
@DavidKoleczek DavidKoleczek changed the title David koleczek/generic llm Generic LLM Interface Jan 25, 2025
@Copilot Copilot AI left a comment

Copilot reviewed 53 out of 67 changed files in this pull request and generated 1 comment.

Files not reviewed (14)
  • .vscode/settings.json: Language not supported
  • notebooks/local_llm/10_ollama_deepseek_coder_v2.ipynb: Evaluated as low risk
  • src/not_again_ai/llm/gh_models/azure_ai_client.py: Evaluated as low risk
  • notebooks/local_llm/20_phi-3-vision.ipynb: Evaluated as low risk
  • notebooks/llm/01_openai_chat_completion.ipynb: Evaluated as low risk
  • notebooks/llm/02_ollama_intro.ipynb: Evaluated as low risk
  • notebooks/llm/10_gpt-4-v.ipynb: Evaluated as low risk
  • notebooks/local_llm/01_common_chat_completion.ipynb: Evaluated as low risk
  • notebooks/local_llm/02_ollama_intro.ipynb: Evaluated as low risk
  • README.md: Evaluated as low risk
  • .github/workflows/python.yml: Evaluated as low risk
  • pyproject.toml: Evaluated as low risk
  • src/not_again_ai/llm/chat_completion/__init__.py: Evaluated as low risk
  • src/not_again_ai/llm/chat_completion/interface.py: Evaluated as low risk
Comments suppressed due to low confidence (6)

src/not_again_ai/llm/chat_completion/providers/openai_api.py:70

  • The code converts tool_call['function']['arguments'] to a string, but it does not handle cases where arguments might already be a string. This could lead to double stringification.
tool_call['function']['arguments'] = str(tool_call['function']['arguments'])
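One way to address this comment is to check the runtime type before serializing. The helper below is an illustrative sketch, not code from the PR; `normalize_arguments` is a hypothetical name, and it uses `json.dumps` rather than `str` so dict arguments become valid JSON.

```python
import json


def normalize_arguments(arguments: object) -> str:
    """Return tool-call arguments as a string without double-stringifying.

    If the provider already returned `arguments` as a JSON string, keep it
    as-is; otherwise serialize the object exactly once.
    """
    if isinstance(arguments, str):
        return arguments
    return json.dumps(arguments)
```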

src/not_again_ai/llm/chat_completion/providers/openai_api.py:125

  • The code attempts to load message.get('content', '{}') as JSON, but it does not handle cases where content is not a valid JSON string.
json_message = json.loads(message.get('content', '{}'))

src/not_again_ai/llm/chat_completion/providers/openai_api.py:220

  • The openai_client function has multiple branches based on api_type, api_key, and other parameters. Ensure that all branches are covered by tests.
def openai_client(api_type: Literal['openai', 'azure_openai'] = 'openai',

src/not_again_ai/llm/chat_completion/providers/ollama_api.py:67

  • The parameter 'json_mode' is used inconsistently. Explicitly check for True or False to avoid unexpected behavior.
json_mode = kwargs.get("json_mode", None)

src/not_again_ai/llm/chat_completion/types.py:109

  • The stop field can be either a string or a list of strings, which could lead to ambiguity. Consider using a single type for consistency.
stop: str | list[str] | None = Field(default=None)

src/not_again_ai/llm/chat_completion/types.py:126

  • The json_message field is defined as dict[str, Any] | None. Consider using a more specific type if possible.
json_message: dict[str, Any] | None = Field(default=None)

@DavidKoleczek DavidKoleczek merged commit a776606 into main Jan 25, 2025
9 checks passed