doc/source/rllib/rllib-examples.rst
+13 -3 (13 additions & 3 deletions)
@@ -134,13 +134,21 @@ Connectors
 This type of filtering can improve learning stability in environments with highly variable state magnitudes
 by scaling observations to a normalized range.
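The mean-std filtering described above can be sketched in pure Python. This is a hedged illustration of the general technique (Welford's running statistics), not RLlib's actual connector implementation; the class name is hypothetical.

```python
import math


class RunningMeanStd:
    """Track a running mean/std (Welford's algorithm) and normalize values.

    Hypothetical sketch of mean-std observation filtering, shown for a
    scalar observation; RLlib applies the same idea per dimension.
    """

    def __init__(self):
        self.count = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the running mean

    def update(self, x: float) -> None:
        self.count += 1
        delta = x - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (x - self.mean)

    @property
    def std(self) -> float:
        # Fall back to 1.0 before we have two samples, and to a small
        # epsilon if all samples were identical, to avoid division by zero.
        if self.count < 2:
            return 1.0
        return math.sqrt(self.m2 / (self.count - 1)) or 1e-8

    def normalize(self, x: float) -> float:
        return (x - self.mean) / self.std


filt = RunningMeanStd()
for obs in [10.0, 12.0, 8.0, 11.0]:
    filt.update(obs)
print(round(filt.mean, 2))  # 10.25
```

Feeding each raw observation through `update` and then `normalize` keeps the values the policy sees in a roughly unit-scale range regardless of the environment's native state magnitudes.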
-- `Multi-agent connector mapping global observations to different per-agent/policy observations <https://github.com/ray-project/ray/blob/master/rllib/examples/connectors/multi_agent_with_different_observation_spaces.py>`__:
-  A connector example showing how to map from a global, multi-agent observation space to n individual, per-agent, per-module observation spaces.
+  A connector alters the CartPole-v1 environment observations from the Markovian 4-tuple (x-pos,
+  angular-pos, x-velocity, angular-velocity) to a non-Markovian, simpler 2-tuple (only
+  x-pos and angular-pos). The resulting problem can only be solved through a
+  memory/stateful model, for example an LSTM.
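The added lines describe dropping the two velocity components from CartPole's observation. A minimal sketch of such an observation transform, written as a plain function (the function name is hypothetical, not RLlib's connector API):

```python
def drop_velocity_components(obs):
    """Map CartPole's Markovian 4-tuple (x-pos, angular-pos, x-velocity,
    angular-velocity) to the non-Markovian 2-tuple (x-pos, angular-pos).

    Without the velocity terms, a feed-forward policy can't infer motion
    from a single frame, so the agent needs a stateful model such as an
    LSTM to recover that information across timesteps.
    """
    x_pos, angular_pos, _x_vel, _angular_vel = obs
    return (x_pos, angular_pos)


print(drop_velocity_components((0.1, 0.02, -1.5, 0.3)))  # (0.1, 0.02)
```

In an actual RLlib setup, this transform would run inside an env-to-module connector so every observation is reduced before it reaches the (stateful) RLModule.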


 Curiosity
 +++++++++
@@ -308,8 +316,10 @@ Multi-agent RL
 a hand-coded random policy while another agent trains with PPO. This example highlights integrating static and dynamic policies,
 suitable for environments with a mix of fixed-strategy and adaptive agents.

-- `Different spaces for agents <https://github.com/ray-project/ray/blob/master/rllib/examples/multi_agent/different_spaces_for_agents.py>`__:
+- `Different observation- and action spaces for different agents <https://github.com/ray-project/ray/blob/master/rllib/examples/multi_agent/different_spaces_for_agents.py>`__:
 Configures agents with differing observation and action spaces within the same environment, showcasing RLlib's support for heterogeneous agents with varying space requirements in a single multi-agent environment.
+ Another example, which also makes use of connectors, and that covers the same topic, agents having different spaces, can be found
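The idea of heterogeneous per-agent spaces can be sketched without RLlib at all. This is a hypothetical toy environment (not RLlib's `MultiAgentEnv` API) that uses plain shape/size descriptors in place of gymnasium spaces:

```python
class ToyMultiAgentEnv:
    """Hypothetical sketch: each agent has its own observation and action
    space. agent_0 sees a 4-dim vector and picks from 2 discrete actions;
    agent_1 sees a 2-dim vector and picks from 3 discrete actions.
    """

    def __init__(self):
        self.observation_shapes = {"agent_0": (4,), "agent_1": (2,)}
        self.num_actions = {"agent_0": 2, "agent_1": 3}

    def reset(self):
        # Return one observation per agent, matching that agent's own space.
        return {
            agent: [0.0] * shape[0]
            for agent, shape in self.observation_shapes.items()
        }

    def step(self, actions):
        # Validate each agent's action against that agent's own action space.
        for agent, action in actions.items():
            assert 0 <= action < self.num_actions[agent], agent
        obs = self.reset()
        rewards = {agent: 1.0 for agent in actions}
        return obs, rewards


env = ToyMultiAgentEnv()
obs = env.reset()
print(len(obs["agent_0"]), len(obs["agent_1"]))  # 4 2
```

A multi-agent algorithm then only needs a per-agent (or per-policy) mapping from agent ID to the right network input/output sizes, which is exactly what the linked example configures in RLlib.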