SLAM Racer team


Developers

Team Lead and Designer

Programmer

Yikai Huang

Programmer

Featured post

SLAM Team Story at Circuit Launch Competition 8/14/21

Harrison (left) and Dev (right) at the DIY Robocars race held at Circuit Launch in Oakland.

Today was our second race of the year at Circuit Launch representing UC San Diego! We had two teams from UCSD make an appearance, the Jetracer team and Team 2 (creative name, I know right?), with three members making the trip from San Diego to Oakland. The race was held at Circuit Launch and hosted by our good friend Chris Anderson, the founder of the DIY Robocars community. These races put two cars head-to-head, driving autonomously on an indoor racetrack. The purpose is to bring people interested in autonomous technologies together to have fun, and hopefully to motivate them to get their own projects going.

The Jetracer team consisted of our co-founder Andrew Britten and Angus Yick. Using the Jetracer framework and taking a deep learning approach, they finished in third place out of all the racers! Congratulations to Andrew and Angus!

Reflections: 

Team 2 consisted of Harrison Lew and myself. It was our first time visiting Oakland to race, so we were both nervous and excited. We had spent the past month learning about the navigation stack in ROS, and the two days before the race at a hackathon with the Autoware Foundation, a nonprofit organization that develops and open-sources autonomous vehicle software. Our cars used the F1/10th platform developed by researchers at the University of Pennsylvania together with the ROS navigation stack, which covers perception, mapping, localization, and planning.

Harrison had worked on developing a low-cost car built around a camera, an RPLiDAR, a VESC 4.2, and an NVIDIA Jetson Nano. The goal of the car was to create an affordable entry point for people to get involved with robocars. In addition, he improved the physical design of the cars by lowering the shocks and the overall assembly, which lowered the center of gravity and reduced the risk of flipping while driving.

My car carried a SICK TiM571 series LiDAR, a Razor 9DOF IMU, and an Intel RealSense D455 camera, with a VESC 6.6 motor controller and an NVIDIA Jetson Xavier for compute.

Our approach to navigating the racetrack autonomously relied on mapping the track, localizing to know the car's current position on it, and path planning and control with the help of the ROS Navigation Stack. I installed and configured ROS drivers to interface with each sensor and piece of hardware on my car, including the LiDAR, IMU, Intel RealSense camera, and VESC 6.6. I also configured a Logitech F710 joystick with a teleoperation node that converted joystick inputs into motor commands, letting us drive the car manually.

For the autonomy itself, I used laser_scan_matcher, a ROS package that estimates the car's 2D pose from the change between consecutive laser scans, and gmapping, a ROS package that builds a map of the environment using SLAM. We also used a waypoint logger script while driving around the track to record waypoints, which we could pass as input to move_base, a node in the ROS Navigation Stack that uses a global and a local planner to output command velocities to a base controller. Along the way, however, we ran into issues with the accuracy of our odometry and with setting up move_base. While we were not able to get the car running autonomously, we learned a lot and wanted to document the challenges we faced so we can be successful in future races!
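To make the waypoint-to-move_base step concrete, below is a minimal sketch of how logged waypoints can be replayed as move_base goals over actionlib. This is illustrative rather than our exact scripts: the file name waypoints.csv, its x,y,yaw-per-row format, the map frame, and the node name are all assumptions.

```python
#!/usr/bin/env python
# Sketch: replay logged waypoints as move_base goals (assumed CSV of "x,y,yaw").
import csv

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from tf.transformations import quaternion_from_euler


def make_goal(x, y, yaw):
    """Build a MoveBaseGoal in the map frame from a logged waypoint."""
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    qx, qy, qz, qw = quaternion_from_euler(0.0, 0.0, yaw)
    goal.target_pose.pose.orientation.x = qx
    goal.target_pose.pose.orientation.y = qy
    goal.target_pose.pose.orientation.z = qz
    goal.target_pose.pose.orientation.w = qw
    return goal


if __name__ == "__main__":
    rospy.init_node("waypoint_follower")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    with open("waypoints.csv") as f:
        waypoints = [(float(x), float(y), float(yaw)) for x, y, yaw in csv.reader(f)]

    # Send each waypoint in order; move_base's global and local planners handle
    # the actual path and the command velocities to the base controller.
    for x, y, yaw in waypoints:
        client.send_goal(make_goal(x, y, yaw))
        client.wait_for_result()
```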

Challenges:

1. Height of the LiDAR w.r.t. the black rubber walls. 

The inner part of the track is lined with black rubber walls, which we originally planned to use for mapping and localization of the car. However, when I drove the car around the track and visualized the map in RViz, the black rubber walls were not visible. We concluded the LiDAR was mounted too high on the car, resulting in the laser beams passing over the walls. This reduced our ability to accurately localize the car around the track, leaving us to rely on odometry. 

2. Noise in odometry readings. 

With the laser beams not hitting the walls, odometry became critical to accurately localizing the car. However, when driving the car (slowly) from the starting line to collect odometry data, we observed a significant deviation in the car's 2D pose after completing one lap. We tried three different sources of odometry: wheel odometry using the encoders in our brushless motor, pose estimation from laser_scan_matcher, and our IMU. All three sources showed a significant gap between the starting and ending position of the car.
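A quick way to quantify this kind of drift is to compare the first and last odometry poses over a lap. The snippet below is a hypothetical diagnostic along those lines, not a script from our repo; the /odom topic name is an assumption, and the same check can be pointed at whichever odometry source is being evaluated.

```python
#!/usr/bin/env python
# Hypothetical lap-drift check: record the first /odom pose, drive one lap
# back to the start line, then Ctrl-C to print how far the odometry has drifted.
import math

import rospy
from nav_msgs.msg import Odometry

start = None
latest = None


def on_odom(msg):
    global start, latest
    p = msg.pose.pose.position
    if start is None:
        start = p      # remember where the lap began
    latest = p         # keep updating the most recent pose


if __name__ == "__main__":
    rospy.init_node("lap_drift_check")
    rospy.Subscriber("/odom", Odometry, on_odom)
    rospy.spin()  # drive one lap while this runs, then shut the node down
    if start is not None and latest is not None:
        drift = math.hypot(latest.x - start.x, latest.y - start.y)
        print("Start-to-end drift after one lap: %.2f m" % drift)
```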

What we would do differently next time:  

1. Test early, test often. 

The LiDAR problem could have been avoided by measuring the height of the mounted LiDAR against the height of the black rubber walls, a quick check of whether the approach was viable before race day. A scan-based version of the same sanity check is sketched below.
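As a complement to the physical measurement (and not something we actually ran), a small node could report how many laser returns land within the expected wall distance; a LiDAR mounted above the walls would show up as mostly out-of-range beams. The /scan topic name and the 3.0 m wall distance are assumptions.

```python
#!/usr/bin/env python
# Hypothetical pre-race sanity check: what fraction of LiDAR beams actually
# return, and how many land within the expected distance of the inner walls?
import rospy
from sensor_msgs.msg import LaserScan

WALL_MAX_RANGE_M = 3.0  # assumed distance to the inner walls; tune per track


def on_scan(scan):
    if not scan.ranges:
        return
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    near = [r for r in valid if r < WALL_MAX_RANGE_M]
    rospy.loginfo_throttle(
        1.0,
        "%.0f%% of beams return, %.0f%% within %.1f m",
        100.0 * len(valid) / len(scan.ranges),
        100.0 * len(near) / len(scan.ranges),
        WALL_MAX_RANGE_M,
    )


if __name__ == "__main__":
    rospy.init_node("scan_coverage_check")
    rospy.Subscriber("/scan", LaserScan, on_scan)
    rospy.spin()
```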

2. Use an extended Kalman filter to fuse different sensor readings for odometry.

This would reduce the noise and improve the accuracy of our car's localization; a sketch of the idea is below.
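To make that concrete, here is a stripped-down sketch of the fusion idea on a planar [x, y, theta] state: predict with wheel-odometry velocities, then correct the heading with the IMU's yaw. All noise values are made up, and in practice we would more likely configure the robot_localization package's EKF node than roll our own filter.

```python
# Minimal planar EKF sketch (illustrative only; noise values are assumptions).
import numpy as np


class PlanarEKF:
    def __init__(self):
        self.x = np.zeros(3)                  # state: [x, y, theta]
        self.P = np.eye(3) * 0.1              # state covariance
        self.Q = np.diag([0.02, 0.02, 0.01])  # process noise (assumed)
        self.R_yaw = np.array([[0.05]])       # IMU yaw measurement noise (assumed)

    def predict(self, v, omega, dt):
        """Propagate the state with wheel-odometry speed v and yaw rate omega."""
        theta = self.x[2]
        self.x += np.array([v * np.cos(theta) * dt,
                            v * np.sin(theta) * dt,
                            omega * dt])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                      [0.0, 1.0,  v * np.cos(theta) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def correct_yaw(self, yaw_measured):
        """Fuse an absolute heading measurement from the IMU."""
        H = np.array([[0.0, 0.0, 1.0]])
        innovation = np.array([self._wrap(yaw_measured - self.x[2])])
        S = H @ self.P @ H.T + self.R_yaw
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x += (K @ innovation).ravel()
        self.P = (np.eye(3) - K @ H) @ self.P

    @staticmethod
    def _wrap(angle):
        """Wrap an angle to [-pi, pi) so heading errors stay small."""
        return (angle + np.pi) % (2 * np.pi) - np.pi
```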

Overall, I’m grateful to have gotten the opportunity to travel to Oakland to participate in the two-day hackathon and the race, and to meet people in the community! I feel more confident in my programming abilities, I improved my understanding of ROS and the navigation stack, and I feel inspired after seeing the different methods our competitors used to navigate autonomously. It’s been a pleasure, and both Harrison and I are excited to return to the track in November!
