Joint_trajectory_controller should not generate extra waypoints #669

Closed
lmendyk opened this issue Jun 14, 2023 · 9 comments

@lmendyk commented Jun 14, 2023

The Joint_trajectory_controller seems to generate extra waypoints instead of just executing the defined trajectory. This looks like redundant functionality that overlaps with solutions like MoveIt, which should be responsible for generating smooth trajectories with potentially many waypoints to smooth the movement.

Background:
I'm still learning robotics and I'm fighting with jerky movements of the turtlebot3 with Open Manipulator, see ROBOTIS-GIT/turtlebot3_manipulation#57.
The hardware implements only a position command interface. So I decided to "start from basics" and, instead of having MoveIt 2 generate the trajectory, I created the trajectory manually, consisting of only a start and an end point per task, hoping to get a smooth movement performed by the hardware based on its speed and acceleration profile mechanism:

ros2 action send_goal /arm_controller/follow_joint_trajectory control_msgs/action/FollowJointTrajectory \
"{trajectory: {joint_names: [joint1, joint2, joint3, joint4], points: [
{positions: [0.0, 0.0, 0.0, 0.0], time_from_start: {sec: 0, nanosec: 0} },
{positions: [1.57, 0.78, -0.78, 0.0], time_from_start: {sec: 1, nanosec: 0} }
]} }"

But what I observed was that for this trajectory (just a start and an end point) the movement was very jerky instead of the one continuous movement I expected, given that the hardware should simply be forwarded the end position according to the docs ("For command interfaces position, the desired positions are simply forwarded to the joints").

When I added debugging to TurtleBot3ManipulationSystemHardware::write(), I discovered that the Joint_trajectory_controller had issued extra waypoints (4 waypoints, counting the start one), which caused the robot arm to brake and restart at these additional waypoints. So instead of smoothing, it added jerks to the arm's movements.

To my (still limited) understanding, this behavior is strange, as the controller should just execute the trajectory and should not try to "smooth" the movement by adding extra waypoints. As I understand it, that type of logic should be reserved for a higher-level system like MoveIt.

When I changed the trajectory to just one point:

ros2 action send_goal /arm_controller/follow_joint_trajectory control_msgs/action/FollowJointTrajectory \
"{trajectory: {joint_names: [joint1, joint2, joint3, joint4], points: [
{positions: [1.57, 0.78, -0.78, 0.0], time_from_start: {sec: 0, nanosec: 0} }
]} }"

The movement was smooth, as the end position was simply forwarded to the hardware, which was then responsible for performing the smooth movement according to its profile mechanism.

Am I missing something, or is this behavior of the controller the proper one?

BTW, what is the proper way of getting "smooth" movements for robot arms that provide only a position command interface? I wonder if, as an alternative to forcing single-step trajectories as described above, I could do something that seems to contradict common sense: lower the max speed of the joint motors, making them slow enough that a movement can't be completed before the next waypoint is generated by the controller, so that the override mechanism would ensure movement smoothness for a multi-point trajectory.

I would appreciate any comments and suggestions.

lmendyk added the bug label Jun 14, 2023
bmagyar added the question label and removed the bug label Jun 14, 2023

@lmendyk commented Jun 14, 2023

I started with these docs, but they haven't helped me (maybe I didn't understand something).

But in the meantime I looked into the source code, and it occurred to me that the "interpolation_method" parameter may be responsible for this. It defaults to spline interpolation, so hopefully I can switch this behavior off by changing it to "none".
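
Something like this in the controller configuration should do it, if I understand the parameter correctly (a sketch only; I'm guessing at the exact file layout, and the valid values may differ between releases):

arm_controller:
  ros__parameters:
    joints: [joint1, joint2, joint3, joint4]
    command_interfaces: [position]
    # switch off the controller-side interpolation (the default is spline-based)
    interpolation_method: none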

Still, I don't understand why this functionality is implemented in this controller, as I would expect it to be MoveIt's job. Since it is turned on by default, the parameter is not set explicitly in the config file of turtlebot3_manipulation, so nothing there gave me a clue to check its meaning.

The second link you provided is about updating an ongoing trajectory, which is not my case; I don't send a new trajectory before the previous one has ended.

@gavanderhoorn commented Jun 14, 2023

@lmendyk: interpolation as performed by the controller is essentially a 'service' to the action client.

It's very often the case that the hardware controlled by the controller requires denser trajectories (both in time and space) than the trajectory representation (in ROS, but I believe this holds in general) could efficiently represent (or encode).

Interpolation is the go-to strategy when you want to reconstruct a signal at a higher sampling rate than the input provides. In the case of the JointTrajectoryController, it uses spline interpolation of various orders (2nd, 3rd, etc.) by default (although if you only provide positions, it's not going to be able to do a very good job). This spline is then sampled at the configured time resolution.

In essence, this allows it to send smooth motion (as you describe it) to the hardware interface at a very high resolution (again: in time and space) even if the input trajectory is (very) sparse.
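
As an illustration (a sketch based on your original goal; the velocities field is part of trajectory_msgs/JointTrajectoryPoint): giving the interpolator velocities to work with, e.g. zero velocities at the start and end points, lets it construct a smooth ramp-up and ramp-down instead of guessing from positions alone:

ros2 action send_goal /arm_controller/follow_joint_trajectory control_msgs/action/FollowJointTrajectory \
"{trajectory: {joint_names: [joint1, joint2, joint3, joint4], points: [
{positions: [0.0, 0.0, 0.0, 0.0], velocities: [0.0, 0.0, 0.0, 0.0], time_from_start: {sec: 0, nanosec: 0} },
{positions: [1.57, 0.78, -0.78, 0.0], velocities: [0.0, 0.0, 0.0, 0.0], time_from_start: {sec: 1, nanosec: 0} }
]} }"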

If you have servos which implement their own interpolation, this may not be needed. But in many cases signal continuity is a requirement, and without interpolation that would require action clients like MoveIt to send JointTrajectorys with thousands, if not hundreds of thousands, of points to the controller. That would be very inefficient. It would also require reconfiguration for every system that requires a different resolution.

Technically, you could set things up as you describe: let whoever (or whatever) comes up with the trajectory worry about spatial and temporal resolution, and have the controller not interpolate. With the parameter you found, that's possible.

In reality, however, interpolation is very useful -- the default MoveIt configuration, for instance, typically generates about 20 to 40 points for a simple trajectory -- which is why it's enabled by default.


Having written all of that, I don't believe you can conclude that the "Joint_trajectory_controller should not generate extra waypoints". First of all, it doesn't really generate waypoints: the original trajectory is still there. It's just used as input to an interpolation algorithm, which is there both for efficiency and for flexibility reasons.

@lmendyk commented Jun 14, 2023

@gavanderhoorn - thank you for this detailed explanation.

It's because I'm not an experienced person. I started learning by trying to understand what MoveIt 2 does, and reading about planning pipelines with plugins that optimize the initial trajectory generated by a planner misled me into thinking that this is the system responsible for the final smoothness of the trajectory.

So maybe a suggestion: add a sentence or two to the documentation :) saying that the controller by default does lower-level trajectory tuning (in the sense of what is sent to the actuators: "virtual" points :) ). And maybe a clue that it may not be needed for hardware which provides only a position command interface and thus probably performs speed and acceleration profiling based on the sparse trajectory.

Once again, thank you for your explanation.

@gavanderhoorn commented Jun 14, 2023

So maybe a suggestion: add a sentence or two to the documentation :) saying that the controller by default does lower-level trajectory tuning

That would perhaps make for an excellent pull request. You are currently in the best position to formulate your thoughts about this in a way newcomers would find understandable.

(but please don't use the word tuning, that's not what it's doing)

And maybe a clue that it may not be needed for hardware which provides only a position command interface and thus probably performs speed and acceleration profiling based on the sparse trajectory.

Again: that's not a conclusion you can draw / a statement you can make (without adding more ifs).

I work with position-only interfaces daily, and those require 125 to 250 Hz sampling rates with continuity requirements up to (and including) jerk, and they do not do any velocity or acceleration calculations themselves.

This is exactly why the JointTrajectoryController allows you to configure it, as you found. There is no single use case or configuration that could be taken as "the default". It's always going to depend on (local) specifics.

@lmendyk commented Jun 15, 2023

Maybe the problem is deeper than I thought. I checked, and the configuration prepared by the robot vendor seems to have an unrealistic update rate of 100 Hz for a system based on a Raspberry Pi running Ubuntu without an RT kernel. So maybe instead of switching off the interpolation mechanism I should get the update rate right.
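
For reference, the rate is set in the controller manager configuration, roughly like this (a sketch; the actual turtlebot3_manipulation file may be organized differently):

controller_manager:
  ros__parameters:
    # 100 Hz as shipped; a lower rate may be more realistic
    # on a Raspberry Pi without an RT kernel
    update_rate: 100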

@christophfroehlich commented

I agree that the joint_trajectory_controller could have more detailed documentation. Feel free to make a suggestion via a PR, where we can discuss this further.

If you want to shape the trajectory yourself (e.g., using Ruckig or other methods like S-shaped/7-step jerk-limited trajectory generators), or want the motor controller to do this job: just use a forward controller instead of the JTC.
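
A minimal sketch of such a setup (the controller name here is illustrative; see the forward_command_controller documentation for the exact parameters):

controller_manager:
  ros__parameters:
    arm_forward_controller:
      type: forward_command_controller/ForwardCommandController

arm_forward_controller:
  ros__parameters:
    joints: [joint1, joint2, joint3, joint4]
    # positions are forwarded as-is; any smoothing is up to
    # the sender or the motor controller
    interface_name: position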

@lmendyk commented Jul 18, 2023

OK, it took me a while to get close to a satisfying result, unfortunately in a trial-and-error manner.

Finally, I returned to my original problem, which was getting smooth movements generated by the MoveIt stack.
The key was to get a joint_limits.yaml with proper velocity and acceleration limits and have it properly installed, which was not the case in the original code. On top of that, I needed to apply a workaround for the bug "pilz_cartesian_limits.yaml overwrites joint_limits.yaml when using moveit_configs_builder.py" (ros-planning/moveit2#1691).
Thanks to that, the trajectory generated by the planning pipeline could be processed by the Joint_trajectory_controller's interpolation mechanism in such a way that the final movement was performed smoothly by turtlebot3_manipulation. Of course, it is smooth with velocity scaling 1.0; scaling 0.1 resulted in a more fine-grained trajectory with steps "too slow" for the motors driven by the position-only command interface (with the speed and acceleration used by the hardware being hardcoded).
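
For illustration, the relevant joint_limits.yaml entries look roughly like this (the values here are placeholders; the real ones must match the hardware):

joint_limits:
  joint1:
    has_velocity_limits: true
    max_velocity: 1.5        # rad/s, placeholder value
    has_acceleration_limits: true
    max_acceleration: 1.0    # rad/s^2, placeholder value
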
To my understanding of what I observed, the end-to-end processing looks like this:

  • First, the trajectory generated by MoveIt is composed of steps with a 100 ms dt, and the position changes are calculated based on the max acceleration and max velocity limits.
  • This trajectory is then further processed by the Joint_trajectory_controller doing the interpolation, whose dt is determined by the rate defined for the controller. I tested up to 50 Hz (20 ms), as higher rates are problematic for the turtlebot3 hardware.

@gavanderhoorn @christophfroehlich - thank you for your comments. As to your suggestion that I make a PR for better documentation: I'm afraid I still don't know enough to prepare a valuable PR without causing "irritation" 😊 through my ignorance. In fact, I have gone further than initially anticipated. My ambition was just to use high-level tools like MoveIt without getting into the details of the controllers, and to concentrate on my main domain of interest: 5G-based MEC, which could host "brain" applications like MoveIt to control the robots (with the low-level controllers running on the robot hardware) in close to real time via a 5G URLLC network. But I find this subject interesting, so I would appreciate your hints on where I can find good materials explaining the basics. Especially if there are best practices which define:

  • When it is better to use a more "basic" hardware controller (PWM-based) and do more in software with ros2_control, versus having more intelligence in the hardware controller (e.g., a controller with a position command interface implemented in the motor microcontrollers).
  • How to properly set up the hierarchy of controllers so they collaborate rather than overlap. By this I mean the top-down hierarchy: MoveIt -> ros2_control -> hardware controller (embedded, microcontroller-based).

@christophfroehlich commented

@lmendyk your questions are rather general and not well suited for this issue; please ask your questions on https://robotics.stackexchange.com

I'll close this one; feel free to reopen if you have questions about your original problem.
There was an update in the docs regarding the JTC; maybe this helps here too.
