
Autonomous Lane-Keeping RC Car

Self-driving RC car on Raspberry Pi 5 with real-time YOLOv11 obstacle avoidance, dual-PID control, and Kalman-filtered lane tracking. 1st place winner in AIRE475 competition at Abu Dhabi University.

1st Place — AIRE475 Competition · Robotics / Embedded AI

  • 15+ FPS on Pi 5
  • YOLOv11n + Kalman Filter
  • Dual-PID Control
  • ±2cm Stop Precision

Overview

Built as part of the AIRE475 course at Abu Dhabi University, this fully autonomous RC car navigates a two-lane track in real time, running entirely on a Raspberry Pi 5. The system uses a dual-stream pipeline: one stream follows lanes using a Bird's Eye View (BEV) transformation, polynomial fitting, and Kalman filtering, while the other detects obstacles with an NCNN-quantized YOLOv11n model. Both streams feed a synchronized dual-PID controller that manages steering and speed simultaneously. The car navigated curved sections, handled dashed lane markings, and stopped within ±2cm of obstacles while sustaining 15+ FPS. A real-time Flask web dashboard enabled live tuning of PID gains during test runs.

The Problem

The challenge: implement advanced autonomous driving (robust lane detection, real-time object tracking, and multi-axis vehicle control) on a resource-constrained Raspberry Pi 5 while sustaining 15+ FPS. The indoor arena added further difficulties: curved track sections, dashed lane markings, and a highly reflective floor that introduced specular noise into the camera feed.

The Solution

A hybrid computer vision pipeline combining classical image processing (BEV transformation, HSV masking, Top-Hat enhancement, Canny edge detection, polynomial fitting) with an NCNN-quantized YOLOv11n deep learning model. Two parallel perception streams feed a dual-PID controller coordinated through a differential-drive mixer. Kalman filters stabilize the noisy polynomial coefficients and let the car coast through brief detection gaps.
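The sliding-window polynomial fit at the heart of the lane pipeline can be sketched in plain NumPy. This is an illustrative reconstruction, not the team's code: the window count, margin, and pixel threshold below are placeholder values, and the input is assumed to be a binary BEV lane mask (the output of the HSV/Top-Hat/Canny stages).

```python
import numpy as np

def fit_lane(binary_bev, n_windows=9, margin=30, min_pixels=20):
    """Fit a quadratic x = a*y^2 + b*y + c to one lane line in a
    binary bird's-eye-view mask using a sliding-window search.
    (Sketch only; parameters are illustrative placeholders.)"""
    h, w = binary_bev.shape
    # Seed the search at the column with the most lane pixels
    # in the bottom half of the image.
    histogram = binary_bev[h // 2:, :].sum(axis=0)
    x_current = int(np.argmax(histogram))
    win_h = h // n_windows
    ys, xs = [], []
    for i in range(n_windows):
        y_lo, y_hi = h - (i + 1) * win_h, h - i * win_h
        x_lo = max(x_current - margin, 0)
        x_hi = min(x_current + margin, w)
        win = binary_bev[y_lo:y_hi, x_lo:x_hi]
        wy, wx = np.nonzero(win)
        if len(wx) >= min_pixels:
            x_current = int(np.mean(wx)) + x_lo  # recenter next window
        ys.append(wy + y_lo)
        xs.append(wx + x_lo)
    ys, xs = np.concatenate(ys), np.concatenate(xs)
    return np.polyfit(ys, xs, 2)  # coefficients [a, b, c]

# Sanity check on a synthetic straight lane at x = 100.
mask = np.zeros((180, 320), dtype=np.uint8)
mask[:, 99:102] = 1
coeffs = fit_lane(mask)
```

For a straight vertical lane the quadratic and linear coefficients come out near zero and the constant term near the lane's column, which is the property the Kalman filter later relies on when smoothing frame-to-frame fits.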

How It Works

1. The Pi Camera captures raw frames in real time on the Raspberry Pi 5.
2. The lane pipeline applies BEV homography, HSV white/yellow masking, Top-Hat enhancement, and Canny edge detection.
3. A sliding-window algorithm locates lane pixels, then fits quadratic polynomials to the left and right lanes independently.
4. A Kalman filter smooths the polynomial coefficients and predicts lane position between frames.
5. The obstacle pipeline runs raw frames through NCNN-quantized YOLOv11n for real-time bounding-box detection.
6. Detected objects are projected into BEV space, and the closest in-lane object is designated the Most Important Object (MIO).
7. The steering PID computes the lateral offset from lane center and outputs a differential motor correction.
8. The distance PID uses the MIO distance to reduce speed and stop the car safely before obstacles.
9. Both PID outputs are mixed into left/right PWM signals that drive the differential motors via GPIO.
10. The Flask dashboard streams the live camera and BEV feeds and allows real-time slider-based PID tuning.

Key Features

  • Dual-stream parallel pipeline: lane following and obstacle avoidance running concurrently
  • Bird's Eye View (BEV) homographic transformation eliminating perspective distortion for accurate lane modeling
  • NCNN-quantized YOLOv11n object detection achieving 15+ FPS on Raspberry Pi 5
  • Kalman-filtered lane polynomial coefficients with predictive 'coasting' through brief occlusions and detection gaps
  • Dual-PID control: steering PID for lateral lane centering, distance PID for longitudinal obstacle avoidance
  • Most Important Object (MIO) identification via BEV ego-lane projection and distance estimation
  • Real-time Flask web dashboard for live Kp/Ki/Kd tuning and camera feed monitoring during runs
  • Stopping precision within ±2cm of detected in-lane obstacles
  • HSV masking + Top-Hat morphological transform for robust lane detection under reflections and variable lighting
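The MIO selection listed above reduces to picking the nearest detection inside the ego lane after BEV projection. The sketch below assumes a simplified detection format of `(x_bev, y_bev)` ground-contact points in metres and fixed lane bounds; the real pipeline projects YOLO bounding boxes through the BEV homography and uses the fitted lane polynomials as bounds.

```python
def select_mio(detections, lane_left_x, lane_right_x):
    """Pick the Most Important Object: the nearest detection whose
    BEV footprint lies inside the ego lane. `detections` is a list
    of (x_bev, y_bev) points in metres (hypothetical format).
    Returns None when the lane ahead is clear."""
    in_lane = [d for d in detections if lane_left_x <= d[0] <= lane_right_x]
    return min(in_lane, key=lambda d: d[1], default=None)

# Three obstacles: one left of the lane, two ahead inside it.
dets = [(-0.8, 1.0), (0.1, 2.5), (-0.05, 1.4)]
mio = select_mio(dets, lane_left_x=-0.3, lane_right_x=0.3)
```

The off-lane object at 1.0 m is ignored even though it is closest overall; only in-lane detections can trigger the distance PID.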

Technology Stack

Python · OpenCV · YOLOv11n · Raspberry Pi 5 · Kalman Filter · PID Control · Flask · NumPy · NCNN

Results

Won 1st place in the AIRE475 (Self-Driving Cars) competition at Abu Dhabi University, supervised by Dr. Sajid Khawaja. The system achieved 15+ FPS inference on Raspberry Pi 5, successfully navigated curved track sections and dashed lane markings, maintained tracking through reflective flooring, and stopped within ±2cm of obstacles. Team: Housein Hassan Kahhoul, Omar Majed Saab, El Sayed Hesham Mowafi.
