Lab 12: Path Planning & Execution
Logic
For lab 12, my original plan was to have my robot localize at each waypoint, then use the belief data to calculate the yaw and distance to the next point and navigate through the world. However, given the time it takes for the robot to do a full rotation and the limited accuracy of the localization, I decided it wasn't worth the overhead.
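Roughly, the yaw-and-distance computation I had in mind looks like this (a simplified sketch with illustrative names, not my actual localization code):

```cpp
#include <cmath>

// Sketch: given a believed pose (x, y in mm, yaw in degrees) and the next
// waypoint, compute the drive distance and the turn needed to face it.
// Struct and function names are illustrative.
struct Pose { double x, y, yaw_deg; };

const double kPi = 3.14159265358979323846;

// Straight-line distance from the pose to the waypoint (same units as x/y).
double distanceTo(const Pose& p, double wx, double wy) {
    return std::hypot(wx - p.x, wy - p.y);
}

// Yaw change needed to face the waypoint, normalized to [-180, 180).
double turnTo(const Pose& p, double wx, double wy) {
    double target = std::atan2(wy - p.y, wx - p.x) * 180.0 / kPi;
    double diff = target - p.yaw_deg;
    while (diff >= 180.0) diff -= 360.0;
    while (diff < -180.0) diff += 360.0;
    return diff;
}
```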
Next, I attempted to use ToF sensor measurements with extrapolation for position control across all segments. My orientation PID is quite accurate thanks to the DMP (and to Stephan Wagner for the best implementation tutorial), but because of the awkward angles and distances to the walls at the first three points, the ToF readings were too unreliable, making accurate position control nearly impossible.
As a result, though not ideal, I opted for fully open-loop control for the first three segments, using fixed drive times determined through trial and error to get the robot from one point to the next.
For the final four segments, which were situated near a straight wall, ToF readings were much more stable. I was able to apply position PID control as originally intended.
Below are the functions start_path_execution() and handle_path_execution() that are called when the command EXECUTE_PATH is sent. First, the angular difference between the current and next waypoint is calculated and assigned to GOAL_ANGLE for orientation control. Then, resetyaw() is called to clear any residual data on the DMP and account for an offset if present, and the PID_ANGLE_ON flag is set to true, initiating orientation PID to rotate the robot to the desired heading. Once the turn is complete, the code checks whether the segment is open-loop or closed-loop. For open-loop segments, the robot drives forward for a fixed duration. For closed-loop segments, the PID_ON flag is set to true, triggering position PID control to move the robot toward the next waypoint based on distance feedback. This process is repeated for each segment until the final waypoint is reached.
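Putting the pieces together, the per-segment flow above can be sketched roughly like this (a simplified, desktop-testable version; the waypoint coordinates and segment values are placeholders, and the actual motor and yaw-reset calls are stubbed out as comments):

```cpp
#include <cmath>

// Sketch of my path-execution logic. Array contents are placeholder
// values, not my actual map coordinates or tuned control values.
const int NUM_POINTS = 4;
double waypoints[NUM_POINTS][2] = {          // mm, relative to origin
    {0, 0}, {300, 0}, {300, 300}, {0, 300}};
int  control[NUM_POINTS - 1]  = {1500, 1200, 250};  // ms (open) or mm (closed)
bool openloop[NUM_POINTS - 1] = {true, true, false};

double GOAL_ANGLE = 0;
bool PID_ANGLE_ON = false, PID_ON = false;
int  segment = 0;

// Heading (degrees) from waypoint i to waypoint i+1.
double segmentHeading(int i) {
    double dx = waypoints[i + 1][0] - waypoints[i][0];
    double dy = waypoints[i + 1][1] - waypoints[i][1];
    return std::atan2(dy, dx) * 180.0 / 3.14159265358979323846;
}

// Called at the start of each segment: aim the orientation PID.
void start_segment() {
    GOAL_ANGLE = segmentHeading(segment);
    // resetyaw();               // clear DMP yaw (omitted in this sketch)
    PID_ANGLE_ON = true;         // turn in place toward GOAL_ANGLE
}

// Called once the turn completes: pick open- or closed-loop driving.
void after_turn() {
    PID_ANGLE_ON = false;
    if (openloop[segment]) {
        // driveForward(control[segment]);  // timed drive (omitted)
    } else {
        PID_ON = true;           // position PID toward control[segment] mm
    }
}
```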
Code for PID_ON and PID_ANGLE_ON can be found in the writeup for Lab 6: Orientation PID.
In an attempt to make my code a bit more concise, I defined three arrays: waypoints[] stores all the path points relative to the origin in millimeters; control[] holds the control value for each segment, representing the drive time (in ms) for the first three open-loop segments and the target distance from the wall (in mm) for the remaining closed-loop segments using position PID; and openloop[] is a boolean array indicating whether each segment uses timed open-loop control or distance-based closed-loop PID control.

Testing
Here is a video of a valid but failed attempt.
From this run, it looks like my logic is working fine. The first three segments, which use timed open-loop control, turn correctly and head in the right direction, but the robot doesn't drive forward long enough to reach the waypoints. This is mostly a tuning issue: the delay values need some tweaking, and the robot's behavior can be quite inconsistent depending on how charged the battery is.
That said, the robot still ends up near the right spot. The last few segments use position PID with the ToF sensor, which helps the robot correct itself. Since the walls there are straight, the ToF readings are more reliable, so the robot can fix the earlier errors as long as it is in the correct orientation, which shows my orientation PID is doing its job.
Below is a video of the robot hitting the first three waypoints. After the timed open-loop portion, the robot makes an unusually wide turn before going all over the place.
After multiple test runs, I discovered the issue was caused by having tape only on the left wheels. I chose to adjust the calibration factor to compensate for the imbalance rather than removing the tape, because the tape helps the robot turn.
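The calibration factor is just a scalar applied to one side's PWM so the robot drives straight despite the asymmetric grip. Something along these lines (the 0.85 value is made up for illustration; mine was tuned by trial and error):

```cpp
#include <algorithm>

// Sketch: compensate for the extra grip of the taped left wheels by
// scaling the left motor command. CALIB_FACTOR of 0.85 is illustrative
// only; the real value came from trial and error.
const double CALIB_FACTOR = 0.85;

struct MotorCmd { int left, right; };

// Compute left/right PWM values for driving straight, clamped to 8-bit range.
MotorCmd driveStraight(int basePwm) {
    int left  = std::min(255, static_cast<int>(basePwm * CALIB_FACTOR));
    int right = std::min(255, basePwm);
    return {left, right};
}
```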
Here is a video of my best run, hitting 6 out of the 9 waypoints quite accurately!
There was a lot of overshoot with my linear PID, but this was the best I could get it to work before my robot started going out of control 🥲. Because of the instability of the system, I also implemented hard stops during the intermediate position PID segments to prevent the robot from getting stuck or oscillating excessively. This ended up causing the robot to turn after it crashed into the wall while navigating between waypoints 7 and 8.
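The hard stop was essentially a timeout on each position-PID segment: if the PID hasn't converged within some window, stop and move on to the next segment. A sketch with assumed names:

```cpp
#include <cstdint>

// Sketch: time-limited position-PID segment. If a segment runs longer
// than its budget, force it to end so an oscillating or stuck robot
// doesn't sit in the loop forever. Names are illustrative.
struct SegmentTimer {
    uint32_t startMs = 0, maxMs = 0;
    void begin(uint32_t now, uint32_t maxDuration) {
        startMs = now;
        maxMs = maxDuration;
    }
    bool expired(uint32_t now) const { return now - startMs >= maxMs; }
};

// Returns true while the segment should keep running position PID.
bool runSegment(const SegmentTimer& t, bool atTarget, uint32_t now) {
    if (atTarget) return false;        // converged: stop normally
    if (t.expired(now)) return false;  // hard stop: give up on this segment
    return true;
}
```

The downside, as seen in the video, is that a hard stop fires even when the robot is physically blocked, which is how the post-crash turn near waypoints 7 and 8 happened.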
🤖yay🤖
Thank you all for a great semester! Big thanks to Farrell for the engaging lectures, and to the TAs for all the support and open hours. This was one of the most time-consuming classes I took at Cornell (especially since I wasn't taking it as a requirement, just for funsies), but I'm really glad I did! I genuinely enjoyed working on both the hardware and software sides of the robot. Very rewarding to see everything I've learned across multiple classes come together.