by Abdelrahman Barakat
Follow the Leader
As autonomy advances in vehicular control and navigation, fully autonomous driving is quickly becoming a reality. In the past few years, we have seen several significant steps, some of the most famous being the Google car and the Volvo highway train. This project was inspired by Volvo's model, in which several follower vehicles are connected wirelessly to a leading vehicle driven by a professional driver for part of the journey. The purpose of this mode of driving is to let the drivers of the follower cars take a break during long journeys. Vehicles with the system installed can join or leave the convoy at any point in the journey.
Autonomous driving technology is one of the most anticipated futuristic technologies and could dominate the roads in the near future. The technology needed to realize the dream of an efficient autonomous driving system involves sensing, thinking, and acting. In the Volvo model, sensing is done internally within the leading truck and is translated into wireless signals, which are decoded by the follower vehicles and analyzed so they can safely control each follower. An advantage of internal sensing is that it can be more predictable, since the components being sensed within the truck do not change.
The analysis is done by an algorithm that performs all the calculations and computes the reaction to be executed by the follower cars.
To create a system designed to copy the moves of the leading vehicle, there are several possible approaches. One is to communicate wirelessly between the vehicles. Another is to use sensors, such as ultrasonic sensors and cameras, to track the exterior motion of the leader. I approached the problem by dividing it into two main parts: direction and speed. As the leader changes its direction or speed, the follower should react in the same way. I used external sensors: a camera and an ultrasonic sensor. The camera tracked the change in direction, while the ultrasonic sensor was used to keep a constant distance from the back of the leader.
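The two-part division above can be sketched as a simple control loop: one channel reads the camera and computes steering, the other reads the ultrasonic sensor and computes speed. This is a minimal illustration, not the actual NXT program; all function names are assumptions, and on the real robot the sensor reads would poll the NXTcam and the ultrasonic sensor directly.

```python
# Sketch of the follower's two-channel control loop (hypothetical names).
# The sensor readers and the two compute functions are passed in so the
# loop body stays independent of the actual hardware.

def follower_step(read_blob_x, read_distance_cm,
                  compute_steering, compute_speed):
    """Run one control iteration: steer from the camera reading,
    throttle from the ultrasonic distance. Returns both commands."""
    steering = compute_steering(read_blob_x())  # direction channel
    speed = compute_speed(read_distance_cm())   # speed channel
    return steering, speed
```

On the robot this would run continuously, re-reading both sensors on every pass so the follower reacts as soon as the leader turns or brakes.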
The camera I used (NXTcam) relies on a technique that identifies "blobs" of a particular color and reports their position in the image frame. I used this information to steer the front wheels right and left depending on the relative horizontal position of a brightly colored orange object at the back of the leader vehicle. The ultrasonic sensor was attached to the front wheels so that it turns toward the object and avoids losing track of the leader. The distance to the back of the leader truck is measured and the speed adjusted accordingly: the closer the follower gets, the slower it drives. All the analysis was done by the follower robot. The leader was a remote-controlled robot with a trailer attached. The Lego construction was designed to look similar to normal vehicles, as seen in the images.
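The two mappings described above can be written as simple proportional rules. The sketch below is an illustration only: the frame width, target distance, and gain are assumed values, not taken from the original program or the NXTcam API.

```python
# Assumed constants for illustration; the real values would be tuned
# on the robot.
FRAME_WIDTH = 176        # assumed NXTcam frame width in pixels
TARGET_DISTANCE_CM = 30  # assumed following distance
MAX_SPEED = 100          # motor power as a percentage

def steering_from_blob(blob_center_x: float) -> float:
    """Map the blob's horizontal position to a steering value in [-1, 1].

    0 means straight ahead; negative steers left, positive steers right.
    """
    half = FRAME_WIDTH / 2
    return (blob_center_x - half) / half

def speed_from_distance(distance_cm: float) -> float:
    """Proportional speed: the closer the follower, the slower it drives.

    At or inside the target distance the follower stops entirely.
    """
    if distance_cm <= TARGET_DISTANCE_CM:
        return 0.0
    gain = 4.0  # assumed gain: percent of speed per cm of distance error
    return min(MAX_SPEED, gain * (distance_cm - TARGET_DISTANCE_CM))
```

With a blob centered in the frame the steering output is zero, and the speed ramps down smoothly to a full stop as the follower closes on the leader.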
After several weeks of hard work, I finally completed the project. The follower adjusts its steering so that it always tracks the leader, and it keeps a safe distance from the leading robot and avoids colliding with it.
Some of the problems I faced included dealing with variation in room illumination during different times of the day and understanding the syntax used to program the camera.
An autonomous driving system like this one could be implemented in real life, though with several constraints, including operation during daylight only. A better approach could be wireless communication, for example a Bluetooth connection between the NXT bricks. However, using the camera and the ultrasonic sensor makes the system closer to how human beings drive a car: by vision. Rebuilding this project with a better camera and connection is definitely worth trying.
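The wireless alternative mentioned above would need an agreed message format between leader and follower. As a hypothetical illustration, the leader could pack its current speed and steering into a tiny fixed-size message and broadcast it; the message layout and function names here are assumptions, not any NXT or Bluetooth API.

```python
import struct

# Hypothetical command message for a leader-to-follower link:
# two signed bytes, speed percent (-100..100) and steering percent.
MSG_FORMAT = "<bb"

def encode_command(speed: int, steering: int) -> bytes:
    """Pack a (speed, steering) command as the leader would send it."""
    return struct.pack(MSG_FORMAT, speed, steering)

def decode_command(payload: bytes) -> tuple:
    """Unpack a received command on the follower side."""
    speed, steering = struct.unpack(MSG_FORMAT, payload)
    return speed, steering
```

A two-byte message like this would keep the link's latency low, which matters when the follower must brake as soon as the leader does.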