In robotics, combining mechanical engineering, electronics, and software leads to adaptable, capable machines. Our project for the MEAM5100 Mechatronics System Design course is an example of this combination: we built a mobile robot that can navigate and interact with its environment in several ways. The robot rides on mecanum wheels, which let it move smoothly in any direction, making it agile and precise.
Our main goals were to make the robot follow walls closely, track a beacon accurately, and push a model police car. Achieving these goals required integrating the robot's mechanical structure, electrical systems, and software in a way that demonstrates our technical skill and creativity.
Frame Design: The chassis is laser-cut from acrylic, providing a sturdy yet lightweight foundation.
Wheels: Mecanum wheels are used, allowing omnidirectional motion.
Motor Mounts: Custom-designed and 3D printed motor mounts provide stability.
Design Considerations: A modular approach with strategic placement of components for optimized space utilization and functionality.
Power Source: The robot is powered by a LiPo battery, providing sufficient current for motors and sensors.
Voltage Regulation: Voltage regulators are used to step down the battery voltage to suitable levels for different components like microcontrollers and sensors.
Motor Drivers: TB6612FNG motor drivers are used for controlling the mecanum wheels. Motor drivers are interfaced with the ESP32 S2 microcontroller. PWM signals control the speed, and digital signals determine the direction.
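As a sketch of the PWM-plus-direction interface described above, the helper below maps a signed speed to TB6612FNG pin states, assuming 8-bit PWM. The struct and function names are hypothetical, not the project's actual code:

```cpp
#include <cstdint>
#include <cstdlib>

// Hypothetical command for one TB6612FNG channel: two direction pins
// (IN1/IN2) plus an 8-bit duty cycle written to the PWM input.
struct MotorCommand {
    bool in1;
    bool in2;
    uint8_t duty;  // 0-255 PWM duty
};

// Map a signed speed in [-255, 255] to TB6612FNG pin states:
// IN1 high / IN2 low drives forward, the reverse pattern drives
// backward, and both low lets the motor coast.
MotorCommand commandFromSpeed(int speed) {
    MotorCommand cmd;
    if (speed > 0)      { cmd.in1 = true;  cmd.in2 = false; }
    else if (speed < 0) { cmd.in1 = false; cmd.in2 = true;  }
    else                { cmd.in1 = false; cmd.in2 = false; }  // coast
    int duty = std::abs(speed);
    cmd.duty = static_cast<uint8_t>(duty > 255 ? 255 : duty);
    return cmd;
}
```

On the robot, the returned pin states would be written with the ESP32's GPIO and LEDC PWM peripherals.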
WiFi Module: Integrated into the ESP32 S2 for wireless communication, crucial for receiving data from the HTC Vive system.
Serial Communication: Used for debugging and inter-module communication between the ESP32 and ATMega32u4 controllers.
Primary Controller - ESP32 S2: Chosen for its processing power, WiFi capabilities, and ample I/O options. Handles high-level tasks like motor control, sensor data processing, and communication.
Secondary Controller - ATMega32u4: Dedicated to specific tasks like IR beacon tracking, providing efficient task segregation and parallel processing.
Modular Code Structure: The firmware is organized into modules for motor control, sensor readings, communication, and decision-making algorithms.
Interrupts and Timers: Used for precise motor control, sensor data acquisition, and maintaining synchrony with the Vive system.
Sensor Fusion:
a. Ultrasonic Sensors: Utilized for distance measurement in wall-following. They are interfaced with the microcontroller using digital I/O pins.
b. IR Sensors: Phototransistors and operational amplifiers form the core of the IR detection circuit. This circuit is tuned to detect specific IR frequencies for beacon detection.
c. Vive Positioning: The Vive detection circuit serves as a positioning device, localizing the robot with respect to the world frame.
Data from the ultrasonic sensors, IR sensors, and Vive system are combined for accurate environmental perception and decision-making.
Feedback Loops: Employing PID control algorithms for precise motor control, wall-following, and beacon tracking.
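A minimal sketch of the PID loop mentioned above; the struct layout and the gains in the test are illustrative, not the tuned values used on the robot:

```cpp
// Minimal PID controller: output = kp*e + ki*∫e dt + kd*de/dt.
struct PID {
    float kp, ki, kd;
    float integral = 0.0f;
    float prevError = 0.0f;

    // error = setpoint - measurement; dt is the loop period in seconds.
    float update(float error, float dt) {
        integral += error * dt;                       // accumulate I term
        float derivative = (error - prevError) / dt;  // finite-difference D term
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};
```

The same structure serves wall-following (error = distance offset from the wall) and beacon tracking (error = heading offset toward the beacon).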
For this, a frequency-detection circuit on the secondary ATMega32U4 controller detects beacons at 550 Hz and 23 Hz. The robot sweeps a designated area, moving right, forward, and left, and periodically checks the phototransistor states to locate the beacon. When both phototransistors read high (beacon detected), the robot moves forward; otherwise it rotates counterclockwise. This loop lets the robot autonomously follow the beacon's signal and adjust its movement accordingly.
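The sweep-and-check logic above reduces to a small decision function. This is a sketch; the enum and function name are hypothetical:

```cpp
enum class BeaconAction { MoveForward, RotateCCW };

// Both phototransistors high -> beacon is ahead, drive toward it;
// otherwise rotate counterclockwise and keep scanning.
BeaconAction decideBeaconAction(bool leftHigh, bool rightHigh) {
    return (leftHigh && rightHigh) ? BeaconAction::MoveForward
                                   : BeaconAction::RotateCCW;
}
```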
The software uses the Vive trackers for positioning and PID control for movement and rotation. It establishes a WiFi connection and exchanges UDP packets carrying positional information. The robot sweeps in the Y direction based on Vive location data until it aligns with the police-car location received over UDP, then moves forward, pushing the police car.
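The sweep-then-push behavior can be sketched as a one-line decision on the Vive Y-coordinates; the names and the tolerance parameter here are assumptions for illustration:

```cpp
#include <cmath>

enum class PushAction { SweepY, PushForward };

// Sweep along Y until the robot's Vive Y-coordinate matches the
// police-car Y (received over UDP) within a tolerance, then push.
PushAction decidePushAction(float robotY, float carY, float tol) {
    return (std::fabs(robotY - carY) <= tol) ? PushAction::PushForward
                                             : PushAction::SweepY;
}
```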
The wall-following algorithm uses two ultrasonic sensors. followWall() computes the wall-following error with PID control from the target wall distance and the sensor readings; calculateMecanumWheelSpeeds() converts that error into individual speeds for the four mecanum wheels.
Process Flow: The loop function continuously reads distances from the ultrasonic sensors, adjusts motor speeds to maintain the desired distance from the wall, and triggers a rotational movement if needed.
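The wheel-speed calculation follows the standard mecanum mixing formula; this is a sketch consistent with the description above, not the project's exact calculateMecanumWheelSpeeds() code, and the sign conventions are an assumption (here wz > 0 rotates the robot clockwise):

```cpp
#include <array>

// Standard mecanum mixing: vy drives forward, vx strafes right,
// wz rotates. Order: {frontLeft, frontRight, rearLeft, rearRight}.
std::array<float, 4> mecanumWheelSpeeds(float vx, float vy, float wz) {
    return { vy + vx + wz,   // front-left
             vy - vx - wz,   // front-right
             vy - vx + wz,   // rear-left
             vy + vx - wz }; // rear-right
}
```

During wall-following, the PID output would feed vx (lateral correction toward or away from the wall) while vy holds a constant cruise speed.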
This project is licensed under the MIT License. For more details, refer to the LICENSE file.