
Adds an initial StableBaselines3 RL environment as an example #2667

Draft · wants to merge 4 commits into base: arjo/feat/server_reset_public_api
Conversation

arjo129 (Contributor) commented Nov 7, 2024
A lot of things are not working. In particular, when `ResetAll` is called, `EnableVelocityChecks` does not trigger the physics system to populate the velocity components. This is a blocker for the current example.

Signed-off-by: Arjo Chakravarty <arjoc@intrinsic.ai>
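
For context, a minimal sketch (not the PR's actual code) of the Gymnasium-style wrapper this example implies is shown below. The `bridge` object and its `reset_all()`, `apply_force()`, `step()`, and `cart_pole_state()` methods are hypothetical placeholders for whatever Gazebo transport calls the real environment uses; the blocker described above would surface in `reset()`, where velocity components come back unpopulated after a `ResetAll` request.

```python
import numpy as np
import gymnasium as gym
from gymnasium import spaces


class CartPoleGzEnv(gym.Env):
    """Cart-pole balancing environment backed by a Gazebo simulation (sketch)."""

    def __init__(self, bridge):
        self._bridge = bridge  # hypothetical wrapper around the simulation transport
        # Observation: cart position, cart velocity, pole angle, pole angular velocity.
        high = np.array([4.8, np.inf, 0.42, np.inf], dtype=np.float32)
        self.observation_space = spaces.Box(-high, high, dtype=np.float32)
        self.action_space = spaces.Discrete(2)  # push cart left or right

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self._bridge.reset_all()   # sends the ResetAll request
        obs = self._observe()      # velocity components may be missing here (the bug)
        return obs, {}

    def step(self, action):
        self._bridge.apply_force(-10.0 if action == 0 else 10.0)
        self._bridge.step()        # advance the simulation one iteration
        obs = self._observe()
        terminated = abs(obs[2]) > 0.21  # pole fell past roughly 12 degrees
        return obs, 1.0, terminated, False, {}

    def _observe(self):
        return np.array(self._bridge.cart_pole_state(), dtype=np.float32)
```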
arjo129 (Contributor, Author) commented Nov 12, 2024
So the above code should be able to train an RL model even on a potato. Currently, I've got the algorithm to successfully balance a cart pole. However, there are some open issues that will block this from being merged; my main concern is that I've hacked together an API for running the GUI client.

[video: RL_with_gazebo_simple_example]
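
A minimal sketch of training against such an environment with Stable-Baselines3, assuming the hypothetical `CartPoleGzEnv` from the earlier sketch; the algorithm (PPO) and hyperparameters are illustrative, not necessarily the ones used in this PR.

```python
from stable_baselines3 import PPO

env = CartPoleGzEnv(bridge=make_bridge())  # make_bridge() is a placeholder
model = PPO("MlpPolicy", env, verbose=1)
model.learn(total_timesteps=100_000)       # small enough to run on modest hardware
model.save("cart_pole_gz_ppo")
```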
