Wearable cameras, smart glasses, and AR/VR headsets are gaining importance in both research and commercial applications. These devices feature a variety of sensors, including cameras, depth sensors, microphones, IMUs, and GPS. Advances in machine perception enable precise user localization (SLAM), eye tracking, and hand tracking. Together, this data makes it possible to understand user behavior and unlocks new ways of interacting with augmented reality. Egocentric devices may soon be able to automatically recognize a user's actions, surroundings, gestures, and social relationships. Such devices have broad applications in assistive technology, education, fitness, entertainment, gaming, eldercare, robotics, and augmented reality, with the potential for positive societal impact.
Until recently, research in this data-intensive field was hampered by the scarcity of large-scale datasets. The community has since addressed this gap by releasing numerous large-scale datasets covering various aspects of egocentric perception, including HoloAssist, Ego4D, Ego-Exo4D, EPIC-KITCHENS, and HD-EPIC.
The goal of this workshop is to provide an exciting discussion forum for researchers working in this challenging and fast-growing area, and to unlock the potential of data-driven research on these datasets to advance the state of the art.
Details coming soon...
Details coming soon...
| Event | Date |
| --- | --- |
| Challenges Leaderboards Open | TBD |
| Challenges Leaderboards Close | TBD |
| Challenges Technical Reports Deadline (on CMT) | TBD |
| Notification to Challenge Winners | TBD |
| Challenge Reports ArXiv Deadline | TBD |
| Extended Abstract Deadline (on CMT) | TBD |
| Extended Abstract Notification to Authors | TBD |
| Extended Abstracts ArXiv Deadline | TBD |
| Workshop Date | TBD |
Details coming soon...
This workshop follows in the footsteps of the following previous events:
EPIC-Kitchens and Ego4D Past Workshops:
Human Body, Hands, and Activities from Egocentric and Multi-view Cameras Past Workshops:
Project Aria Past Tutorials: