
Commit deb3730

fix cluttered api referances docs (#2069)
1 parent 69b698e commit deb3730

12 files changed: +464 -337 lines changed

autogen/agentchat/assistant_agent.py

Lines changed: 15 additions & 15 deletions
@@ -19,10 +19,10 @@ class AssistantAgent(ConversableAgent):
 
     AssistantAgent is a subclass of ConversableAgent configured with a default system message.
     The default system message is designed to solve a task with LLM,
-    including suggesting python code blocks and debugging.
-    `human_input_mode` is default to "NEVER"
-    and `code_execution_config` is default to False.
-    This agent doesn't execute code by default, and expects the user to execute the code.
+    including suggesting python code blocks and debugging. \n
+    `human_input_mode` is default to "NEVER" \n
+    and `code_execution_config` is default to False. \n
+    This agent doesn't execute code by default, and expects the user to execute the code. \n
     """
 
     DEFAULT_SYSTEM_MESSAGE = """You are a helpful AI assistant.

@@ -52,20 +52,20 @@ def __init__(
     **kwargs: Any,
 ):
     """Args:
-    name (str): agent name.
-    system_message (str): system message for the ChatCompletion inference.
+    - name (str): agent name. \n
+    - system_message (str): system message for the ChatCompletion inference. \n
     Please override this attribute if you want to reprogram the agent.
-    llm_config (dict or False or None): llm inference configuration.
-    Please refer to [OpenAIWrapper.create](https://docs.ag2.ai/latest/docs/api-reference/autogen/OpenAIWrapper/#autogen.OpenAIWrapper.create)
-    for available options.
-    is_termination_msg (function): a function that takes a message in the form of a dictionary
+    - llm_config (dict or False or None): llm inference configuration. \n
+    Please refer to [OpenAIWrapper.create](https://docs.ag2.ai/latest/docs/api-reference/autogen/OpenAIWrapper/#autogen.OpenAIWrapper.create) \n
+    for available options. \n
+    - is_termination_msg (function): a function that takes a message in the form of a dictionary
     and returns a boolean value indicating if this received message is a termination message.
-    The dict can contain the following keys: "content", "role", "name", "function_call".
-    max_consecutive_auto_reply (int): the maximum number of consecutive auto replies.
+    The dict can contain the following keys: "content", "role", "name", "function_call". \n
+    - max_consecutive_auto_reply (int): the maximum number of consecutive auto replies.
     default to None (no limit provided, class attribute MAX_CONSECUTIVE_AUTO_REPLY will be used as the limit in this case).
-    The limit only plays a role when human_input_mode is not "ALWAYS".
-    **kwargs (dict): Please refer to other kwargs in
-    [ConversableAgent](https://docs.ag2.ai/latest/docs/api-reference/autogen/ConversableAgent).
+    The limit only plays a role when human_input_mode is not "ALWAYS". \n
+    - **kwargs (dict): Please refer to other kwargs in
+    [ConversableAgent](https://docs.ag2.ai/latest/docs/api-reference/autogen/ConversableAgent). \n
     """
     super().__init__(
         name,
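
For orientation, a minimal usage sketch of the constructor these Args document. The model name, API-key handling, task text, and the TERMINATE convention are illustrative choices, not part of this diff.

```python
import os

from autogen import AssistantAgent, UserProxyAgent

# llm_config follows the OpenAIWrapper.create options referenced above;
# the model and key source here are placeholders.
llm_config = {"config_list": [{"model": "gpt-4o", "api_key": os.environ.get("OPENAI_API_KEY")}]}

assistant = AssistantAgent(
    name="assistant",                # agent name
    llm_config=llm_config,           # llm inference configuration
    max_consecutive_auto_reply=10,   # cap on consecutive auto replies
    # the termination check receives the message dict described in the docstring
    is_termination_msg=lambda msg: "TERMINATE" in (msg.get("content") or ""),
)

# The assistant suggests code but does not run it; pair it with an executor agent.
user_proxy = UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)
user_proxy.initiate_chat(assistant, message="Write and run a script that prints the first 10 primes.")
```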

autogen/agentchat/chat.py

Lines changed: 27 additions & 28 deletions
@@ -154,37 +154,36 @@ def initiate_chats(chat_queue: list[dict[str, Any]]) -> list[ChatResult]:
     """Initiate a list of chats.
 
     Args:
-    chat_queue (List[Dict]): A list of dictionaries containing the information about the chats.
-
+    chat_queue (List[Dict]): A list of dictionaries containing the information about the chats.\n
     Each dictionary should contain the input arguments for
-    [`ConversableAgent.initiate_chat`](../ConversableAgent#initiate-chat).
-    For example:
-    - `"sender"` - the sender agent.
-    - `"recipient"` - the recipient agent.
-    - `"clear_history"` (bool) - whether to clear the chat history with the agent.
-    Default is True.
-    - `"silent"` (bool or None) - (Experimental) whether to print the messages in this
-    conversation. Default is False.
-    - `"cache"` (Cache or None) - the cache client to use for this conversation.
-    Default is None.
-    - `"max_turns"` (int or None) - maximum number of turns for the chat. If None, the chat
-    will continue until a termination condition is met. Default is None.
-    - `"summary_method"` (str or callable) - a string or callable specifying the method to get
-    a summary from the chat. Default is DEFAULT_summary_method, i.e., "last_msg".
-    - `"summary_args"` (dict) - a dictionary of arguments to be passed to the summary_method.
-    Default is {}.
-    - `"message"` (str, callable or None) - if None, input() will be called to get the
-    initial message.
-    - `**context` - additional context information to be passed to the chat.
-    - `"carryover"` - It can be used to specify the carryover information to be passed
-    to this chat. If provided, we will combine this carryover with the "message" content when
-    generating the initial chat message in `generate_init_message`.
-    - `"finished_chat_indexes_to_exclude_from_carryover"` - It can be used by specifying a list of indexes of the finished_chats list,
-    from which to exclude the summaries for carryover. If 'finished_chat_indexes_to_exclude_from_carryover' is not provided or an empty list,
-    then summary from all the finished chats will be taken.
+    [`ConversableAgent.initiate_chat`](../ConversableAgent#initiate-chat).\n
+    For example:\n
+    - `"sender"` - the sender agent.\n
+    - `"recipient"` - the recipient agent.\n
+    - `"clear_history"` (bool) - whether to clear the chat history with the agent.\n
+    Default is True.\n
+    - `"silent"` (bool or None) - (Experimental) whether to print the messages in this\n
+    conversation. Default is False.\n
+    - `"cache"` (Cache or None) - the cache client to use for this conversation.\n
+    Default is None.\n
+    - `"max_turns"` (int or None) - maximum number of turns for the chat. If None, the chat\n
+    will continue until a termination condition is met. Default is None.\n
+    - `"summary_method"` (str or callable) - a string or callable specifying the method to get\n
+    a summary from the chat. Default is DEFAULT_summary_method, i.e., "last_msg".\n
+    - `"summary_args"` (dict) - a dictionary of arguments to be passed to the summary_method.\n
+    Default is {}.\n
+    - `"message"` (str, callable or None) - if None, input() will be called to get the\n
+    initial message.\n
+    - `**context` - additional context information to be passed to the chat.\n
+    - `"carryover"` - It can be used to specify the carryover information to be passed\n
+    to this chat. If provided, we will combine this carryover with the "message" content when\n
+    generating the initial chat message in `generate_init_message`.\n
+    - `"finished_chat_indexes_to_exclude_from_carryover"` - It can be used by specifying a list of indexes of the finished_chats list,\n
+    from which to exclude the summaries for carryover. If 'finished_chat_indexes_to_exclude_from_carryover' is not provided or an empty list,\n
+    then summary from all the finished chats will be taken.\n
 
     Returns:
-    (list): a list of ChatResult objects corresponding to the finished chats in the chat_queue.
+    (list): a list of ChatResult objects corresponding to the finished chats in the chat_queue.\n
     """
     consolidate_chat_info(chat_queue)
     _validate_recipients(chat_queue)
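
A sketch of the `chat_queue` shape this docstring describes: two sequential chats whose agents, messages, and carryover text are made up for illustration.

```python
from autogen import AssistantAgent, UserProxyAgent
from autogen.agentchat import initiate_chats

llm_config = {"config_list": [{"model": "gpt-4o"}]}  # placeholder model config

user = UserProxyAgent(name="user", human_input_mode="NEVER", code_execution_config=False)
researcher = AssistantAgent(name="researcher", llm_config=llm_config)
writer = AssistantAgent(name="writer", llm_config=llm_config)

chat_queue = [
    {
        "sender": user,
        "recipient": researcher,
        "message": "List three facts about solar energy.",
        "max_turns": 2,
        "summary_method": "last_msg",
    },
    {
        "sender": user,
        "recipient": writer,
        "message": "Turn the collected facts into one short paragraph.",
        "max_turns": 1,
        # extra context combined into the initial message, per the docstring
        "carryover": "Audience: a general-interest newsletter.",
    },
]

results = initiate_chats(chat_queue)  # one ChatResult per finished chat
print(results[-1].summary)
```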

autogen/agentchat/contrib/captainagent/captainagent.py

Lines changed: 20 additions & 19 deletions
@@ -149,25 +149,26 @@ def __init__(
     description: str | None = DEFAULT_DESCRIPTION,
     **kwargs: Any,
 ):
-    """Args:
-    name (str): agent name.
-    system_message (str): system message for the ChatCompletion inference.
-    Please override this attribute if you want to reprogram the agent.
-    llm_config (LLMConfig or dict or False): llm inference configuration.
-    Please refer to [OpenAIWrapper.create](https://docs.ag2.ai/latest/docs/api-reference/autogen/OpenAIWrapper/#autogen.OpenAIWrapper.create) for available options.
-    is_termination_msg (function): a function that takes a message in the form of a dictionary
-    and returns a boolean value indicating if this received message is a termination message.
-    The dict can contain the following keys: "content", "role", "name", "function_call".
-    max_consecutive_auto_reply (int): the maximum number of consecutive auto replies.
-    default to None (no limit provided, class attribute MAX_CONSECUTIVE_AUTO_REPLY will be used as the limit in this case).
-    The limit only plays a role when human_input_mode is not "ALWAYS".
-    agent_lib (str): the path or a JSON file of the agent library for retrieving the nested chat instantiated by CaptainAgent.
-    tool_lib (str): the path to the tool library for retrieving the tools used in the nested chat instantiated by CaptainAgent.
-    nested_config (dict): the configuration for the nested chat instantiated by CaptainAgent.
-    A full list of keys and their functionalities can be found in [docs](https://docs.ag2.ai/latest/docs/user-guide/reference-agents/captainagent).
-    agent_config_save_path (str): the path to save the generated or retrieved agent configuration.
-    **kwargs (dict): Please refer to other kwargs in
-    [ConversableAgent](https://github.com/ag2ai/ag2/blob/main/autogen/agentchat/conversable_agent.py#L74).
+    """
+    Args:\n
+    name (str): agent name.\n
+    system_message (str): system message for the ChatCompletion inference.\n
+    Please override this attribute if you want to reprogram the agent.\n
+    llm_config (LLMConfig or dict or False): llm inference configuration.\n
+    Please refer to [OpenAIWrapper.create](https://docs.ag2.ai/latest/docs/api-reference/autogen/OpenAIWrapper/#autogen.OpenAIWrapper.create) for available options.\n
+    is_termination_msg (function): a function that takes a message in the form of a dictionary\n
+    and returns a boolean value indicating if this received message is a termination message.\n
+    The dict can contain the following keys: "content", "role", "name", "function_call".\n
+    max_consecutive_auto_reply (int): the maximum number of consecutive auto replies.\n
+    default to None (no limit provided, class attribute MAX_CONSECUTIVE_AUTO_REPLY will be used as the limit in this case).\n
+    The limit only plays a role when human_input_mode is not "ALWAYS".\n
+    agent_lib (str): the path or a JSON file of the agent library for retrieving the nested chat instantiated by CaptainAgent.\n
+    tool_lib (str): the path to the tool library for retrieving the tools used in the nested chat instantiated by CaptainAgent.\n
+    nested_config (dict): the configuration for the nested chat instantiated by CaptainAgent.\n
+    A full list of keys and their functionalities can be found in [docs](https://docs.ag2.ai/latest/docs/user-guide/reference-agents/captainagent).\n
+    agent_config_save_path (str): the path to save the generated or retrieved agent configuration.\n
+    **kwargs (dict): Please refer to other kwargs in\n
+    [ConversableAgent](https://github.com/ag2ai/ag2/blob/main/autogen/agentchat/conversable_agent.py#L74).\n
     """
     super().__init__(
         name,
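
A hedged construction sketch for the arguments listed above. The library names, working directory, and task text are placeholders; see the CaptainAgent user guide linked in the docstring for real values.

```python
from autogen import UserProxyAgent
from autogen.agentchat.contrib.captainagent import CaptainAgent

llm_config = {"config_list": [{"model": "gpt-4o"}]}  # placeholder model config

captain = CaptainAgent(
    name="captain",
    llm_config=llm_config,
    code_execution_config={"work_dir": "groupchat", "use_docker": False},
    agent_lib="captainagent_expert_library.json",  # placeholder agent library path
    tool_lib="default",                            # placeholder tool library
    agent_config_save_path=None,                   # do not persist generated agent configs
)

user_proxy = UserProxyAgent(name="captain_user_proxy", human_input_mode="NEVER", code_execution_config=False)
result = user_proxy.initiate_chat(
    captain,
    message="Find recent papers on LLM agents and summarize the main themes.",
    max_turns=1,
)
```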

autogen/agentchat/contrib/graph_rag/graph_rag_capability.py

Lines changed: 4 additions & 4 deletions
@@ -14,10 +14,10 @@
 class GraphRagCapability(AgentCapability):
     """A graph-based RAG capability uses a graph query engine to give a conversable agent the graph-based RAG ability.
 
-    An agent class with graph-based RAG capability could
-    1. create a graph in the underlying database with input documents.
-    2. retrieved relevant information based on messages received by the agent.
-    3. generate answers from retrieved information and send messages back.
+    An agent class with graph-based RAG capability could:\n
+    1. create a graph in the underlying database with input documents.\n
+    2. retrieved relevant information based on messages received by the agent.\n
+    3. generate answers from retrieved information and send messages back.\n
 
     For example,
     ```python
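
The docstring's own python example is cut off in this hunk. As a hedged stand-in, here is the three-step flow using the FalkorDB-backed capability from the same package; the graph name, host, port, and document path are placeholders.

```python
from autogen import ConversableAgent
from autogen.agentchat.contrib.graph_rag.document import Document, DocumentType
from autogen.agentchat.contrib.graph_rag.falkor_graph_query_engine import FalkorGraphQueryEngine
from autogen.agentchat.contrib.graph_rag.falkor_graph_rag_capability import FalkorGraphRagCapability

# 1. create a graph in the underlying database from input documents
query_engine = FalkorGraphQueryEngine(name="docs_graph", host="127.0.0.1", port=6379)
query_engine.init_db(input_doc=[Document(doctype=DocumentType.TEXT, path_or_url="./data/intro.txt")])

# 2.-3. the capability retrieves relevant graph context for incoming messages
#       and generates the reply that is sent back
rag_agent = ConversableAgent(name="graph_rag_agent", human_input_mode="NEVER", llm_config=False)
FalkorGraphRagCapability(query_engine).add_to_agent(rag_agent)
```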

autogen/agentchat/contrib/graph_rag/neo4j_graph_query_engine.py

Lines changed: 18 additions & 17 deletions
@@ -28,23 +28,24 @@
 
 @require_optional_import("llama_index", "neo4j")
 class Neo4jGraphQueryEngine:
-    """This class serves as a wrapper for a property graph query engine backed by LlamaIndex and Neo4j,
-    facilitating the creating, connecting, updating, and querying of LlamaIndex property graphs.
-
-    It builds a property graph Index from input documents,
-    storing and retrieving data from the property graph in the Neo4j database.
-
-    It extracts triplets, i.e., [entity] -> [relationship] -> [entity] sets,
-    from the input documents using llamIndex extractors.
-
-    Users can provide custom entities, relationships, and schema to guide the extraction process.
-
-    If strict is True, the engine will extract triplets following the schema
-    of allowed relationships for each entity specified in the schema.
-
-    It also leverages LlamaIndex's chat engine which has a conversation history internally to provide context-aware responses.
-
-    For usage, please refer to example notebook/agentchat_graph_rag_neo4j.ipynb
+    """
+    This class serves as a wrapper for a property graph query engine backed by LlamaIndex and Neo4j,\n
+    facilitating the creating, connecting, updating, and querying of LlamaIndex property graphs.\n
+    \n
+    It builds a property graph Index from input documents,\n
+    storing and retrieving data from the property graph in the Neo4j database.\n
+    \n
+    It extracts triplets, i.e., [entity] -> [relationship] -> [entity] sets,\n
+    from the input documents using llamIndex extractors.\n
+    \n
+    Users can provide custom entities, relationships, and schema to guide the extraction process.\n
+    \n
+    If strict is True, the engine will extract triplets following the schema\n
+    of allowed relationships for each entity specified in the schema.\n
+    \n
+    It also leverages LlamaIndex's chat engine which has a conversation history internally to provide context-aware responses.\n
+    \n
+    For usage, please refer to example notebook/agentchat_graph_rag_neo4j.ipynb\n
     """
 
     def __init__(  # type: ignore[no-any-unimported]
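
A construction sketch of the engine described above. The connection settings, document path, and question are placeholders, and the optional llm, embedding, and schema arguments are left at their defaults; the referenced notebook shows a complete setup.

```python
from autogen.agentchat.contrib.graph_rag.document import Document, DocumentType
from autogen.agentchat.contrib.graph_rag.neo4j_graph_query_engine import Neo4jGraphQueryEngine

query_engine = Neo4jGraphQueryEngine(
    host="bolt://localhost",  # Neo4j connection details (placeholders)
    port=7687,
    username="neo4j",
    password="neo4j",
    database="neo4j",
)

# Build the property graph index: triplets are extracted from the input documents.
query_engine.init_db(input_doc=[Document(doctype=DocumentType.TEXT, path_or_url="./data/employee_handbook.txt")])

# Query through the LlamaIndex chat engine; its conversation history keeps answers context-aware.
result = query_engine.query("Who approves employee travel requests?")
print(result.answer)
```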

autogen/agentchat/contrib/rag/query_engine.py

Lines changed: 11 additions & 11 deletions
@@ -29,18 +29,18 @@ def init_db(
     """Initialize the database with the input documents or records.
 
     This method initializes database with the input documents or records.
-    Usually, it takes the following steps:
-    1. connecting to a database.
-    2. insert records
-    3. build indexes etc.
+    Usually, it takes the following steps:\n
+    1. connecting to a database.\n
+    2. insert records.\n
+    3. build indexes etc.\n
 
-    Args:
-    new_doc_dir (Optional[Union[Path, str]]): A directory containing documents to be ingested.
-    new_doc_paths_or_urls (Optional[Sequence[Union[Path, str]]]): A list of paths or URLs to documents to be ingested.
-    *args: Any additional arguments
-    **kwargs: Any additional keyword arguments
-    Returns:
-    bool: True if initialization is successful, False otherwise
+    Args:\n
+    new_doc_dir (Optional[Union[Path, str]]): A directory containing documents to be ingested.\n
+    new_doc_paths_or_urls (Optional[Sequence[Union[Path, str]]]): A list of paths or URLs to documents to be ingested.\n
+    *args: Any additional arguments\n
+    **kwargs: Any additional keyword arguments\n
+    Returns:\n
+    bool: True if initialization is successful, False otherwise\n
     """
     ...
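
Since `init_db` here is a protocol method, a toy in-memory engine helps show the connect, insert, and index steps it documents. This class is purely illustrative and is not shipped with the library.

```python
from pathlib import Path
from typing import Optional, Sequence, Union


class InMemoryQueryEngine:
    """Illustrative engine satisfying the init_db contract sketched above."""

    def init_db(
        self,
        new_doc_dir: Optional[Union[Path, str]] = None,
        new_doc_paths_or_urls: Optional[Sequence[Union[Path, str]]] = None,
        *args,
        **kwargs,
    ) -> bool:
        try:
            # 1. "connect" to a database (here: an in-memory store)
            self._docs = {}
            # 2. insert records from the directory and/or the explicit paths
            paths = list(new_doc_paths_or_urls or [])
            if new_doc_dir is not None:
                paths.extend(sorted(Path(new_doc_dir).glob("*.txt")))
            for p in paths:
                self._docs[str(p)] = Path(p).read_text(encoding="utf-8")
            # 3. build an index (here: a trivial keyword -> document map)
            self._index = {word: name for name, text in self._docs.items() for word in text.split()}
            return True
        except OSError:
            return False


engine = InMemoryQueryEngine()
print(engine.init_db(new_doc_paths_or_urls=["./docs/notes.txt"]))  # True if the file was ingested
```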