
Add pre-commit hooks and BSD 3-Clause license (#51)
* Add pre-commit and license

* Exclude xacro

* Remove duplicate license
amalnanavati authored Dec 14, 2024
1 parent 9323099 commit 12c5456
Showing 96 changed files with 480 additions and 156 deletions.
1 change: 1 addition & 0 deletions .codespell-ignore
@@ -0,0 +1 @@
ser
2 changes: 2 additions & 0 deletions .github/LICENSE_HEADER.md
@@ -0,0 +1,2 @@
Copyright (c) 2024, Personal Robotics Laboratory
License: BSD 3-Clause. See LICENSE.md file in root directory.
16 changes: 16 additions & 0 deletions .github/PULL_REQUEST_TEMPLATE.md
@@ -0,0 +1,16 @@
# Description

\[TODO: describe, in detail, what issue this PR addresses and how it addresses it. Link to relevant GitHub Issues.\]

# Testing procedure

\[TODO: describe, in detail, how you tested this. The procedure must be detailed enough for the reviewer(s) to recreate it.\]

# Before opening a pull request

- \[ \] `pre-commit run --all-files`
- \[ \] Run your code through [pylint](https://pylint.readthedocs.io/en/latest/). `pylint --recursive=y --rcfile=.pylintrc .`. All warnings but `fixme` must be addressed.

# Before Merging

- \[ \] `Squash & Merge`
44 changes: 44 additions & 0 deletions .github/workflows/pre-commit.yaml
@@ -0,0 +1,44 @@
# Copyright (c) 2024, Personal Robotics Laboratory
# License: BSD 3-Clause. See LICENSE.md file in root directory.

name: pre-commit

on:
push:
branches:
- main
- master
- ros2-devel
pull_request:
workflow_dispatch:

concurrency:
group: "${{ github.workflow }} @ ${{ github.event.pull_request.head.label || github.head_ref || github.ref }}"
cancel-in-progress: true

env:
PYTHON_VERSION: "3.10"

jobs:
pre-commit:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: ${{ env.PYTHON_VERSION }}

## Run pre-commit and try to apply fixes
- name: Run pre-commit
uses: pre-commit/action@v3.0.1
- name: Apply fixes from pre-commit
uses: pre-commit-ci/lite-action@v1.0.2
if: always()
# - name: Install dependencies
# run: |
# python -m pip install --upgrade pip
# pip install --force-reinstall pylint==3.1.0
# pip install overrides
# - name: Analysing the code with pylint
# run: |
# pylint --recursive=y --rcfile=.pylintrc . --disable fixme --disable import-error
69 changes: 69 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,69 @@
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.5.0
hooks:
- id: check-added-large-files
- id: check-case-conflict
- id: check-executables-have-shebangs
- id: check-merge-conflict
- id: check-shebang-scripts-are-executable
- id: check-symlinks
- id: check-xml
- id: check-yaml
- id: debug-statements
- id: destroyed-symlinks
- id: detect-private-key
- id: end-of-file-fixer
- id: mixed-line-ending
- id: requirements-txt-fixer
- id: trailing-whitespace

- repo: https://github.com/Lucas-C/pre-commit-hooks
rev: v1.5.0
hooks:
- id: insert-license
args:
- --license-file
- .github/LICENSE_HEADER.md
- --use-current-year
types_or: [python, yaml]

- repo: https://github.com/Lucas-C/pre-commit-hooks
rev: v1.5.0
hooks:
- id: insert-license
args:
- --license-file
- .github/LICENSE_HEADER.md
- --use-current-year
- --comment-style
- "<!--| |-->"
types_or: [xml]
exclude: ".*package\\.xml$|.*\\.xacro$"

- repo: https://github.com/psf/black
rev: 23.1.0
hooks:
- id: black

- repo: https://github.com/lovesegfault/beautysh
rev: v6.2.1
hooks:
- id: beautysh

- repo: https://github.com/executablebooks/mdformat
rev: 0.7.16
hooks:
- id: mdformat
args:
- --number

- repo: https://github.com/codespell-project/codespell
rev: v2.2.4
hooks:
- id: codespell
args:
- --ignore-words=.codespell-ignore
1 change: 0 additions & 1 deletion .pylintrc
@@ -643,4 +643,3 @@ init-import=no
# List of qualified module names which can have objects that can redefine
# builtins.
redefining-builtins-modules=six.moves,past.builtins,future.builtins,builtins,io

2 changes: 1 addition & 1 deletion LICENSE → LICENSE.md
@@ -1,6 +1,6 @@
BSD 3-Clause License

Copyright (c) 2023, Personal Robotics Laboratory
Copyright (c) 2024, Personal Robotics Laboratory

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
24 changes: 15 additions & 9 deletions README.md
@@ -1,42 +1,48 @@
# ada_ros2

ROS2 Hardware Interface and Description for the ADA Robot

## Setup

See the [`ada_feeding` top-level README for setup instructions](https://github.com/personalrobotics/ada_feeding/blob/ros2-devel/README.md).

## Running ADA MoveIt

### RVIZ

1. Run `ros2 launch ada_moveit demo.launch.py sim:=mock` command from your ROS2 workspace.
2. See here for a [brief guide to using RVIZ to interact with MoveIt](https://moveit.picknik.ai/humble/doc/tutorials/quickstart_in_rviz/quickstart_in_rviz_tutorial.html).

### Real

1. Run `ros2 launch ada_moveit demo.launch.py` command from your ROS2 workspace. If running for feeding specifically (i.e., where the watchdog dying kills the controllers) run `ros2 launch ada_moveit demo_feeding.launch.py`. Make sure the watchdog is running before you launch this node.

### MoveIt Servo

MoveIt Servo allows [real-time arm servoing](https://moveit.picknik.ai/humble/doc/examples/realtime_servo/realtime_servo_tutorial.html) in cartesian space by sending twist commands to the end effector.
MoveIt Servo allows [real-time arm servoing](https://moveit.picknik.ai/humble/doc/examples/realtime_servo/realtime_servo_tutorial.html) in cartesian space by sending twist commands to the end effector.

To use Servo with keyboard teleop:

1. Launch the force-torque sensor:
1. Sim: `ros2 run ada_feeding dummy_ft_sensor.py`
2. Real: `ros2 run forque_sensor_hardware forque_sensor_hardware`
1. Sim: `ros2 run ada_feeding dummy_ft_sensor.py`
2. Real: `ros2 run forque_sensor_hardware forque_sensor_hardware`
2. Launch MoveIt:
1. Sim:`ros2 launch ada_moveit demo.launch.py sim:=mock`
2. Real: `ros2 launch ada_moveit demo.launch.py`
1. Sim:`ros2 launch ada_moveit demo.launch.py sim:=mock`
2. Real: `ros2 launch ada_moveit demo.launch.py`
3. Re-tare the F/T sensor: `ros2 service call /wireless_ft/set_bias std_srvs/srv/SetBool "{data: true}"`
4. Enable MoveIt Servo:
1. Switch Controllers: `ros2 service call /controller_manager/switch_controller controller_manager_msgs/srv/SwitchController "{activate_controllers: [\"jaco_arm_servo_controller\"], deactivate_controllers: [\"jaco_arm_controller\"], start_controllers: [], stop_controllers: [], strictness: 0, start_asap: false, activate_asap: false, timeout: {sec: 0, nanosec: 0}}"`
2. Toggle Servo On: `ros2 service call /servo_node/start_servo std_srvs/srv/Trigger "{}"`
1. Switch Controllers: `ros2 service call /controller_manager/switch_controller controller_manager_msgs/srv/SwitchController "{activate_controllers: [\"jaco_arm_servo_controller\"], deactivate_controllers: [\"jaco_arm_controller\"], start_controllers: [], stop_controllers: [], strictness: 0, start_asap: false, activate_asap: false, timeout: {sec: 0, nanosec: 0}}"`
2. Toggle Servo On: `ros2 service call /servo_node/start_servo std_srvs/srv/Trigger "{}"`
5. Run the keyboard teleop script: `ros2 run ada_moveit ada_keyboard_teleop.py`
6. Follow the on-screen instructions to teleoperate the robot. Note that although cartesian control avoids obstacles in the planning scene, joint control does not.
7. Toggle Servo Off: `ros2 service call /servo_node/stop_servo std_srvs/srv/Trigger "{}"`

To create your own Servo client:

1. Follow steps 1-4 above.
2. Have your client publish Twist commands to `/servo_node/delta_twist_cmds`. Note the following:
1. For reliable cartesian control when sending angular velocities on the real robot and `lovelace`, ensure the angular velocity is <= 0.3 rad/s in magnitude. Greater angular velocities might change the end effector's position in addition to its orientation. We believe this is because of latencies with MoveIt Servo getting the robot's joint states via the joint state publisher.
2. Be sure to send 0-velocity Twist messages at the end to stop the robot.
1. For reliable cartesian control when sending angular velocities on the real robot and `lovelace`, ensure the angular velocity is \<= 0.3 rad/s in magnitude. Greater angular velocities might change the end effector's position in addition to its orientation. We believe this is because of latencies with MoveIt Servo getting the robot's joint states via the joint state publisher.
2. Be sure to send 0-velocity Twist messages at the end to stop the robot.
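The \<= 0.3 rad/s guidance above is something a custom Servo client can enforce before publishing. A minimal sketch of that clamping step (the function name and structure are illustrative, not part of this commit):

```python
import math

# Cap angular speed at 0.3 rad/s, per the reliability note above.
MAX_ANGULAR_SPEED = 0.3  # rad/s


def clamp_angular(wx: float, wy: float, wz: float,
                  max_speed: float = MAX_ANGULAR_SPEED) -> tuple:
    """Scale an angular velocity vector so its magnitude is <= max_speed."""
    mag = math.sqrt(wx * wx + wy * wy + wz * wz)
    if mag <= max_speed or mag == 0.0:
        return (wx, wy, wz)
    scale = max_speed / mag
    return (wx * scale, wy * scale, wz * scale)
```

A real client would copy the clamped values into the angular part of a `geometry_msgs` twist message, publish it to `/servo_node/delta_twist_cmds`, and finish by publishing a zero twist so the robot stops.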

## Camera Calibration

13 changes: 8 additions & 5 deletions ada_calibrate_camera/README.md
@@ -1,20 +1,23 @@
# ada_camera_calibration

This file contains nodes to do the following:

1. Calibrate ADA's eye-in-hand system's extrinsics (run every time the eye-in-hand system changes);
2. Publish the transform between ADA's end-effector and the camera (run every time the robot is used with perception).

## Calibrating the Camera's Extrinsics

1. Be in the `src` directory of your workspace.
2. `python3 src/ada_ros2/ada_calibrate_camera/calibrate_camera_start.py`
3. `screen -r calibrate`
1. (See here for relevant [`screen`` commands](https://gist.github.com/jctosta/af918e1618682638aa82))
1. (See here for relevant [`screen` commands](https://gist.github.com/jctosta/af918e1618682638aa82))
4. Follow the instructions on-screen. (Motions are expected to take ~6 mins and collect up to 30 samples)
5. Once it is done, verify it:
1. Re-build your workspace.
2. Run `ros2 launch ada_moveit demo.launch.py sim:=mock`
3. In RVIZ, add an axis for `camera_color_optical_frame`
4. Verify it looks like the frame is correct.
1. Re-build your workspace.
2. Run `ros2 launch ada_moveit demo.launch.py sim:=mock`
3. In RVIZ, add an axis for `camera_color_optical_frame`
4. Verify it looks like the frame is correct.

## Publishing the Transform

1. `ros2 launch ada_calibrate_camera publish_camera_extrinsics_launch.xml` (this is done by default in the `ada_moveit` launchfile).
2 changes: 2 additions & 0 deletions ada_calibrate_camera/ada_calibrate_camera/__init__.py
@@ -0,0 +1,2 @@
# Copyright (c) 2024, Personal Robotics Laboratory
# License: BSD 3-Clause. See LICENSE.md file in root directory.
@@ -1,4 +1,7 @@
#!/usr/bin/env python3
# Copyright (c) 2024, Personal Robotics Laboratory
# License: BSD 3-Clause. See LICENSE.md file in root directory.

"""
This module defines a node that calibrates the camera using a charucoboard.
"""
15 changes: 9 additions & 6 deletions ada_calibrate_camera/ada_calibrate_camera/camera_calibration.py
@@ -1,3 +1,6 @@
# Copyright (c) 2024, Personal Robotics Laboratory
# License: BSD 3-Clause. See LICENSE.md file in root directory.

"""
This module defines the CameraCalibration class, which allows users to add samples
consisting of an RGB image, a gripper2base transform, and a target2cam transform.
@@ -201,7 +204,7 @@ def save_sample(
The target2cam translation vector.
"""
# pylint: disable=too-many-arguments
# Necessay to get all the data for this sample.
# Necessary to get all the data for this sample.
cv2.imwrite(os.path.join(data_dir, f"{sample_i}_rgb_img.png"), rgb_img)
np.savez_compressed(
os.path.join(data_dir, f"{sample_i}_sample.npz"),
@@ -351,7 +354,7 @@ def compute_calibration(
The calibration method that got the least error.
"""
# pylint: disable=too-many-locals, too-many-branches, too-many-statements, too-many-arguments
# The comaprison to the reference data requires many arguments.
# The comparison to the reference data requires many arguments.

if len(self.Rs_gripper2base) < 3:
print("Need at least 3 samples to calibrate the camera.", flush=True)
@@ -417,7 +420,7 @@ def compute_calibration(
)
continue

# Convert to a homogenous transform
# Convert to a homogeneous transform
T_cam2gripper = np.eye(4)
T_cam2gripper[:3, :3] = R_cam2gripper
T_cam2gripper[:3, 3] = t_cam2gripper.reshape((3,))
@@ -427,7 +430,7 @@
ts_target2base = []
# pylint: disable=consider-using-enumerate
for i in range(len(self.Rs_target2cam)):
# Get the homogenous transform from the gripper to the base
# Get the homogeneous transform from the gripper to the base
T_gripper2base = np.eye(4)
if self.Rs_gripper2base[i].shape == (3,):
T_gripper2base[:3, :3] = R.from_rotvec(
@@ -437,7 +440,7 @@
T_gripper2base[:3, :3] = self.Rs_gripper2base[i]
T_gripper2base[:3, 3] = self.ts_gripper2base[i]

# Get the homogenous transform from the target to the camera
# Get the homogeneous transform from the target to the camera
T_target2cam = np.eye(4)
if self.Rs_target2cam[i].shape == (3,):
T_target2cam[:3, :3] = R.from_rotvec(
@@ -447,7 +450,7 @@
T_target2cam[:3, :3] = self.Rs_target2cam[i]
T_target2cam[:3, 3] = self.ts_target2cam[i]

# Compute the homogenous transform from the target to the base
# Compute the homogeneous transform from the target to the base
T_target2base = T_gripper2base @ T_cam2gripper @ T_target2cam

# Extract the rotation and translation
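The hand-eye step in this hunk chains homogeneous transforms as `T_target2base = T_gripper2base @ T_cam2gripper @ T_target2cam`. A standalone numpy sketch of that composition, using toy identity-rotation transforms rather than real calibration output:

```python
import numpy as np


def make_transform(rot: np.ndarray, trans: np.ndarray) -> np.ndarray:
    """Pack a 3x3 rotation and a 3-vector translation into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = rot
    T[:3, 3] = trans
    return T


# With identity rotations, composing transforms just sums the translations.
T_gripper2base = make_transform(np.eye(3), np.array([0.0, 0.0, 0.5]))
T_cam2gripper = make_transform(np.eye(3), np.array([0.0, 0.1, 0.0]))
T_target2cam = make_transform(np.eye(3), np.array([0.2, 0.0, 0.0]))

# Same composition as in compute_calibration.
T_target2base = T_gripper2base @ T_cam2gripper @ T_target2cam
```

This is the consistency check the calibration relies on: a good `T_cam2gripper` makes `T_target2base` (nearly) the same across all samples, since the board does not move relative to the base.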
3 changes: 3 additions & 0 deletions ada_calibrate_camera/ada_calibrate_camera/charuco_detector.py
@@ -1,3 +1,6 @@
# Copyright (c) 2024, Personal Robotics Laboratory
# License: BSD 3-Clause. See LICENSE.md file in root directory.

"""
This module defines the CharucoDetector class, which is used to detect a charucoboard in an image.
"""
15 changes: 9 additions & 6 deletions ada_calibrate_camera/ada_calibrate_camera/helpers.py
@@ -1,3 +1,6 @@
# Copyright (c) 2024, Personal Robotics Laboratory
# License: BSD 3-Clause. See LICENSE.md file in root directory.

"""
This module contains helper functions for calibrating the camera.
"""
@@ -50,7 +53,7 @@ def pose_to_rot_trans(

def pose_to_matrix(pose: Pose) -> np.ndarray:
"""
Convert a Pose message to a homogenous transformation matrix.
Convert a Pose message to a homogeneous transformation matrix.
Parameters
----------
@@ -60,7 +63,7 @@ def pose_to_matrix(pose: Pose) -> np.ndarray:
Returns
-------
np.ndarray
The homogenous transformation matrix.
The homogeneous transformation matrix.
"""
M = np.eye(4)
M[:3, :3] = R.from_quat(ros2_numpy.numpify(pose.orientation)).as_matrix()
@@ -70,7 +73,7 @@ def pose_to_matrix(pose: Pose) -> np.ndarray:

def transform_to_matrix(transform: Transform) -> np.ndarray:
"""
Convert a Transform message to a homogenous transformation matrix.
Convert a Transform message to a homogeneous transformation matrix.
Parameters
----------
@@ -80,7 +83,7 @@ def transform_to_matrix(transform: Transform) -> np.ndarray:
Returns
-------
np.ndarray
The homogenous transformation matrix.
The homogeneous transformation matrix.
"""
M = np.eye(4)
M[:3, :3] = R.from_quat(ros2_numpy.numpify(transform.rotation)).as_matrix()
@@ -90,12 +93,12 @@ def transform_to_matrix(transform: Transform) -> np.ndarray:

def matrix_to_pose(M: np.ndarray) -> Pose:
"""
Convert a homogenous transformation matrix to a Pose message.
Convert a homogeneous transformation matrix to a Pose message.
Parameters
----------
M : np.ndarray
The homogenous transformation matrix.
The homogeneous transformation matrix.
Returns
-------
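`pose_to_matrix` above delegates the quaternion-to-rotation step to scipy's `R.from_quat`; the same step can be written directly in numpy. A sketch assuming an `(x, y, z, w)` quaternion convention (these helper names are illustrative, not from this repo):

```python
import numpy as np


def quat_to_matrix(x: float, y: float, z: float, w: float) -> np.ndarray:
    """Convert a unit quaternion (x, y, z, w) to a 3x3 rotation matrix."""
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - z * w), 2 * (x * z + y * w)],
        [2 * (x * y + z * w), 1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
        [2 * (x * z - y * w), 2 * (y * z + x * w), 1 - 2 * (x * x + y * y)],
    ])


def pose_to_matrix_sketch(position, quaternion) -> np.ndarray:
    """Embed a rotation and translation into a 4x4 homogeneous transform."""
    M = np.eye(4)
    M[:3, :3] = quat_to_matrix(*quaternion)
    M[:3, 3] = position
    return M
```

For example, the identity quaternion `(0, 0, 0, 1)` yields the identity rotation, so the resulting matrix is pure translation.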
@@ -1,4 +1,7 @@
# !/usr/bin/env python3
#!/usr/bin/env python3
# Copyright (c) 2024, Personal Robotics Laboratory
# License: BSD 3-Clause. See LICENSE.md file in root directory.

"""
This module contains a rclpy ROS2 node that takes in parameters for the translation
and rotation (quaternion) as well as the frame_id's and publishes the requested transform.
