Show the item's main metadata

dc.contributor.advisor: Solari, Fabio <1967>
dc.contributor.author: Alessi, Roberta <1996>
dc.contributor.other: Chiara Bartolozzi
dc.contributor.other: Arren Glover
dc.date.accessioned: 2023-11-02T15:20:46Z
dc.date.available: 2023-11-02T15:20:46Z
dc.date.issued: 2023-10-25
dc.identifier.uri: https://unire.unige.it/handle/123456789/6818
dc.description.abstract: The aim of this thesis is to tackle the challenge of a real-time eye-tracking application using event cameras. Thanks to their exceptional properties, including high temporal resolution, high dynamic range, low power consumption, and the fact that they are not affected by motion blur, our goal was to demonstrate how their use can be particularly advantageous in a real-time scenario requiring immediate response times. Moreover, considering the current state of the art, this work introduces the first eye-tracking system based entirely and exclusively on events that runs live and not only in simulation. it_IT
dc.description.abstract: The aim of this thesis was to approach the problem of a real-time eye-tracking application based on an event-driven camera. Thanks to their outstanding properties, such as high temporal resolution, high dynamic range, low power consumption, and being less affected by motion blur compared to traditional cameras, we were able to demonstrate how their use can be especially advantageous in a real-time scenario. With this work, considering the current state of the art, we introduce the first eye-tracking system based only on events that runs in real time. Our system consists of two components: a low-frequency Detection thread, which runs "globally" over the whole image while looking for the eyes with convolutional kernels, and a high-frequency Tracker thread, which starts from the detected eye position and runs in a smaller region of interest (ROI). The two threads can run independently of each other. The experiments were conducted both in real time and on a dataset recorded in-house. To demonstrate the performance of our algorithm, we also ran it on benchmark datasets. Our algorithm proved to be robust against head movements and worked well in real time, even though our error, computed as the MSE between detected positions and ground-truth positions, is slightly higher than that of prior work. This work also presents a detailed ablation study of the algorithm. Finally, the last contribution is the "In-House" dataset, which lasts around 30 seconds, is fully labeled, and is publicly available. This dataset was recorded with a 3rd-generation ATIS event camera. The subject is visible from head to shoulders while performing slight head and eye movements. en_UK
dc.language.iso: en
dc.rights: info:eu-repo/semantics/openAccess
dc.title: Eye Tracking Con Camera Neuromorfica it_IT
dc.title.alternative: Eye Tracking Algorithm Based On An Event-Driven Camera en_UK
dc.type: info:eu-repo/semantics/masterThesis
dc.subject.miur: INF/01 - INFORMATICA
dc.publisher.name: Università degli studi di Genova
dc.date.academicyear: 2022/2023
dc.description.corsolaurea: 10635 - ROBOTICS ENGINEERING
dc.description.area: 9 - INGEGNERIA
dc.description.department: 100023 - DIPARTIMENTO DI INFORMATICA, BIOINGEGNERIA, ROBOTICA E INGEGNERIA DEI SISTEMI


Files in this item


This item appears in the following collections
