Event-Driven Compressive Vision for Multimodal Interaction with Mobile Devices

Coordinator: Sorbonne Université 🇫🇷
Partners: FBK 🇮🇹, Streetlab 🇫🇷, Experis 🇪🇸, IIT 🇮🇹, Prophesee 🇫🇷, Innovati 🇪🇸, Spanish National Research Council 🇪🇸 and Institut de la Vision 🇫🇷

ECOMODE was a research project funded under Horizon 2020, in which we developed an innovative technology that allows visually impaired people and older adults to interact with mobile devices using mid-air gestures and voice commands. The ECOMODE technology integrates a neuromorphic camera inspired by human vision, making it possible to control smartphones and tablets through multimodal interaction, regardless of environmental conditions and background noise.