Toy Cart-Pole

Repository for the Cart-Pole problem.

Example screen

Implemented Classic Controllers:

  • PID
  • Pole-Placement in State-Space
  • LQR (a gain-computation sketch follows this list)
  • MPC
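
For the LQR entry, here is a minimal sketch of how a feedback gain can be computed for a linearized cart-pole, assuming scipy is available. The point-mass pole model and the numerical parameters are illustrative only, not taken from this repository's environment or controller code:

import numpy as np
from scipy.linalg import solve_continuous_are

# Linearized cart-pole dynamics about the upright equilibrium.
# State: x = [cart position, cart velocity, pole angle, pole angular velocity]
# Illustrative parameters: cart mass, pole mass, pole length, gravity.
m_c, m_p, l, g = 1.0, 0.1, 0.5, 9.8

A = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, -m_p * g / m_c, 0.0],
    [0.0, 0.0, 0.0, 1.0],
    [0.0, 0.0, (m_c + m_p) * g / (m_c * l), 0.0],
])
B = np.array([[0.0], [1.0 / m_c], [0.0], [-1.0 / (m_c * l)]])

Q = np.diag([1.0, 1.0, 10.0, 1.0])  # weight the pole angle most heavily
R = np.array([[0.1]])

# Solve the continuous-time algebraic Riccati equation and form K = R^-1 B^T P.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# State-feedback law: force u = -K @ x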

To Do:

  • Implement RL agent (discrete and continuous actions)

To run:

  1. Create a venv
  2. Clone the OpenAI Gym repo into the project directory (Toy-CartPole/)
  3. Copy the cartpole_cont.py file into the cloned repo at /gym/gym/envs/classic_control/cartpole_cont.py
  4. Add the following registration to /gym/gym/envs/__init__.py (Gym expects environment ids to end in a version suffix such as -v0):
register(
    id="CartPole-cont",
    entry_point="gym.envs.classic_control.cartpole_cont:CartPoleEnv",
    max_episode_steps=500,
    reward_threshold=475.0,
)

  5. Run pip install -e . inside /gym/
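
Once the editable install finishes, a quick smoke test can confirm the environment is registered. This assumes the id from step 4 and the classic Gym API (reset returning an observation, step returning a 4-tuple):

import gym

env = gym.make("CartPole-cont-v0")  # id registered in step 4
obs = env.reset()
for _ in range(200):
    action = env.action_space.sample()          # random continuous force
    obs, reward, done, info = env.step(action)
    env.render()
    if done:
        obs = env.reset()
env.close()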