Lai, Eddy Thin Jun (2024) End-to-end object detection with transformers. Final Year Project, UTAR.
Abstract
In the past decade, the You Only Look Once (YOLO) series has become the most prevalent framework for object detection owing to its superior accuracy and speed. However, with the advent of transformer-based architectures, there has been a paradigm shift in the development of real-time detector models. This thesis investigates the performance of YOLOv8 and Real-Time DEtection TRansformer (RT-DETR) variants on urban zone aerial object detection tasks. Specifically, five models, namely YOLOv8n, YOLOv8s, YOLOv8m, RT-DETR-r18, and RT-DETR-r50, are trained using an expensive graphics processing unit (GPU) and subsequently executed on a central processing unit (CPU), which is more relevant for power-constrained drone applications. Experimental results reveal that RT-DETR-r50 stands out with the highest mean average precision 50-95 (mAP 50-95) of 0.598, whereas YOLOv8n achieves the fastest inference speed of 30.4 frames per second (FPS). These benefits come at the expense of slow speed (1.7 FPS) and poor accuracy (mAP 50-95 of 0.440), respectively. In this sense, YOLOv8s emerges as the most promising model owing to its ability to strike the best tradeoff between accuracy (mAP 50-95 of 0.529) and speed (11.4 FPS).
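The following is a minimal sketch of the kind of benchmark described above, assuming the Ultralytics Python API (which provides both YOLOv8 and RT-DETR models): training and validation on a GPU to obtain mAP 50-95, then timing CPU inference to estimate FPS. The dataset YAML, epoch count, sample-frame folder, and RT-DETR weight files are illustrative placeholders, not values or assets from the thesis.

```python
# Sketch of a GPU-train / CPU-infer benchmark, assuming the Ultralytics API.
import time
from ultralytics import YOLO, RTDETR

DATA = "aerial_urban.yaml"  # hypothetical dataset config (paths + class names)

# Candidate models: three YOLOv8 variants and two RT-DETR variants.
# The RT-DETR weights below are placeholders; the thesis variants use
# ResNet-18 and ResNet-50 backbones (r18 / r50).
models = {
    "YOLOv8n": YOLO("yolov8n.pt"),
    "YOLOv8s": YOLO("yolov8s.pt"),
    "YOLOv8m": YOLO("yolov8m.pt"),
    "RT-DETR-r18": RTDETR("rtdetr-l.pt"),
    "RT-DETR-r50": RTDETR("rtdetr-x.pt"),
}

for name, model in models.items():
    # Train on the GPU (device 0), then validate to obtain mAP 50-95.
    model.train(data=DATA, epochs=100, imgsz=640, device=0)
    metrics = model.val(data=DATA, device=0)
    map5095 = metrics.box.map  # COCO-style mAP averaged over IoU 0.50:0.95

    # Time inference on the CPU to estimate frames per second (FPS).
    start = time.perf_counter()
    results = model.predict(source="sample_frames/", device="cpu", verbose=False)
    elapsed = time.perf_counter() - start
    fps = len(results) / elapsed if elapsed > 0 else float("inf")

    print(f"{name}: mAP 50-95 = {map5095:.3f}, CPU inference = {fps:.1f} FPS")
```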
| Item Type: | Final Year Project / Dissertation / Thesis (Final Year Project) |
| --- | --- |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science; T Technology > T Technology (General) |
| Divisions: | Lee Kong Chian Faculty of Engineering and Science > Bachelor of Engineering (Honours) Mechatronics Engineering |
| Depositing User: | Sg Long Library |
| Date Deposited: | 09 Jul 2024 15:35 |
| Last Modified: | 09 Jul 2024 15:35 |
| URI: | http://eprints.utar.edu.my/id/eprint/6556 |