IoT Smart Waste Sorter
An IoT system that classifies waste as recyclable or non-recyclable using MobileNetV2 computer vision, then physically sorts it with a servo. Real-time web dashboard, Telegram alerts, and bin fill-level monitoring. 1st place in CEN425.
Overview
Built as part of the CEN425 course at Abu Dhabi University, this project combines edge computing, machine learning, and physical actuation into a fully automated waste-sorting system. A MobileNetV2 classifier, fine-tuned on a custom dataset combined with TrashNet, achieves 98% classification accuracy. When waste is placed on the sorting platform, motion detection triggers image capture; the Raspberry Pi classifies the item and sends a UART command to an Arduino Uno, which tilts a servo motor to sort it into the correct bin. HC-SR04 ultrasonic sensors continuously monitor bin fill levels. A Flask-SocketIO web dashboard streams the live camera feed, bin levels, and classification history over Wi-Fi, and Telegram notifications alert users when a bin exceeds 90% capacity.
The Problem
Traditional waste bins depend on users to sort waste manually, so recyclable materials often end up in landfills. The goal was to build an automated IoT system demonstrating all four pillars of IoT (sensing, edge processing, actuation, and connectivity) in a practical real-world deployment.
The Solution
A Raspberry Pi 5 acts as the IoT gateway, running a MobileNetV2 image classifier. When an object is placed on the sorting platform, background-subtraction motion detection triggers image capture and classification. The result is sent over UART serial to an Arduino edge node that actuates the servo, while bin fill levels read from the HC-SR04 sensors are reported back over the same serial link. Everything surfaces on a real-time Flask-SocketIO dashboard, with Telegram alerts for full bins.
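The Pi-side end of this serial exchange can be sketched with two small helpers. The single-character commands (`R`/`N`) and the `BIN:R:xx:N:yy` report format come from the project description; the helper names are illustrative, not the project's actual API.

```python
# Sketch of the Pi-side serial protocol helpers. Command characters and
# the bin-report format are from the project description; function names
# are illustrative.

def classification_to_command(is_recyclable: bool) -> bytes:
    """Map the classifier's binary decision to the UART command byte."""
    return b"R" if is_recyclable else b"N"

def parse_bin_report(line: str):
    """Parse an Arduino report like 'BIN:R:42:N:87' into fill percentages.

    Returns a dict such as {'R': 42, 'N': 87}, or None for lines that
    are not bin reports.
    """
    parts = line.strip().split(":")
    if len(parts) != 5 or parts[0] != "BIN":
        return None
    try:
        return {parts[1]: int(parts[2]), parts[3]: int(parts[4])}
    except ValueError:
        return None
```

On the Pi these would wrap a pyserial connection opened at 9600 baud (e.g. `serial.Serial("/dev/ttyACM0", 9600)`; the device path is an assumption and depends on how the Arduino enumerates).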
How It Works
1. The user places a waste item on the tilting sorting platform.
2. The Raspberry Pi camera continuously monitors for motion via background subtraction (5000-changed-pixel threshold).
3. After a 0.5-second settling delay, a fresh frame is captured and preprocessed (224×224, ImageNet normalization).
4. The TorchScript MobileNetV2 model classifies the item; the sigmoid output determines recyclable vs. non-recyclable.
5. The result is sent to the Arduino as a single character (`R` or `N`) over UART serial at 9600 baud.
6. The Arduino tilts the MG995 servo to 135° (recyclable bin) or 45° (non-recyclable bin) for 250 ms, then returns it to 90°.
7. HC-SR04 sensors report bin fill levels every 2 seconds in the format `BIN:R:xx:N:yy` back to the Raspberry Pi.
8. Flask-SocketIO pushes real-time updates to the browser dashboard with no page refresh needed.
9. A Telegram notification is sent via the Bot API if either bin's fill level exceeds 90%.
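The motion-trigger and preprocessing steps above can be sketched with plain NumPy frame differencing. The 5000-changed-pixel trigger and the 224×224 ImageNet normalization come from the project description; the per-pixel delta threshold (25) and the function names are assumptions for this sketch (the real pipeline may use an OpenCV background subtractor instead).

```python
import numpy as np

MOTION_PIXEL_COUNT = 5000  # from the project description
PIXEL_DELTA = 25           # assumed per-pixel grayscale change to count as "different"

def motion_detected(background: np.ndarray, frame: np.ndarray) -> bool:
    """Return True when enough pixels differ from the background model."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return int((diff > PIXEL_DELTA).sum()) > MOTION_PIXEL_COUNT

# Standard ImageNet statistics used by MobileNetV2.
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406])
IMAGENET_STD = np.array([0.229, 0.224, 0.225])

def preprocess(rgb224: np.ndarray) -> np.ndarray:
    """Turn a 224x224x3 uint8 RGB frame into an ImageNet-normalized CHW tensor."""
    x = rgb224.astype(np.float32) / 255.0
    x = (x - IMAGENET_MEAN) / IMAGENET_STD
    return x.transpose(2, 0, 1)  # HWC -> CHW, as TorchScript models expect
```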
Key Features
- MobileNetV2 transfer learning classifier achieving 98% validation accuracy on recyclable vs. non-recyclable waste
- Motion-triggered pipeline with background subtraction, end-to-end latency under 2 seconds
- UART serial communication (9600 baud) between Raspberry Pi gateway and Arduino edge node
- MG995 servo motor sorts items to 135° (recyclable) or 45° (non-recyclable) with smooth return to center
- Two HC-SR04 ultrasonic sensors monitoring bin fill levels to within ±2 cm
- Flask-SocketIO real-time web dashboard with MJPEG camera stream (~10 FPS) accessible over Wi-Fi
- Telegram Bot API push notifications when either bin exceeds 90% capacity
- WS2812B LED strip visual feedback: green for recyclable, red for non-recyclable, dimmed by fill level
- SQLite database logging every classification with timestamp, confidence score, and user correction flag
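The 90%-capacity alert in the features above amounts to a threshold check, plus a little de-duplication so a bin hovering near the threshold does not spam the chat. A minimal stdlib-only sketch; the debouncing scheme, function names, and use of a raw HTTP POST (rather than a Telegram client library) are assumptions:

```python
import json
import urllib.request

ALERT_THRESHOLD = 90  # percent full, per the project description

def bins_needing_alert(levels: dict, already_alerted: set) -> list:
    """Return bins newly over threshold; mutates already_alerted to debounce."""
    due = []
    for bin_id, pct in levels.items():
        if pct >= ALERT_THRESHOLD and bin_id not in already_alerted:
            due.append(bin_id)
            already_alerted.add(bin_id)
        elif pct < ALERT_THRESHOLD:
            already_alerted.discard(bin_id)  # re-arm once the bin is emptied
    return due

def send_telegram_alert(token: str, chat_id: str, text: str) -> None:
    """POST a message to the Telegram Bot API sendMessage endpoint."""
    url = f"https://api.telegram.org/bot{token}/sendMessage"
    data = json.dumps({"chat_id": chat_id, "text": text}).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)  # fire-and-forget; add error handling in production
```

Called from the serial-read loop with each parsed bin report, `bins_needing_alert` yields one alert per fill event instead of one per 2-second reading.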
Technology Stack
Results
Won 1st place in the CEN425 (IoT 2) competition at Abu Dhabi University, supervised by Dr. Mohamed Fadul. The system achieved 98% classification accuracy, end-to-end sorting latency under 2 seconds, ultrasonic ranging accuracy within ±2 cm, and dashboard update latency under 1 second. UART communication was fully reliable, with no errors observed in testing. Team: El Sayed Hesham Mowafi, Ahmad Atef Abushapap.