Mobile cameras, smart glasses, and headsets are becoming increasingly important both as research platforms and as off-the-shelf products. They capture the wearer’s interactions with the world and provide a rich set of signals: not only images and videos, but also gaze, audio, geolocation, and IMU data. Combined with head-mounted displays, they also enable new forms of interaction and visualization, including augmented reality. Given this rapid progress, we believe we are only beginning to understand how these technologies and their applications will affect our lives.
These egocentric sensing devices could soon automatically understand what the wearer is doing, manipulating, or attending to. They will be able to recognize the surrounding scene, understand gestures and social relationships, and enhance everyday activities in settings such as the workplace, sports, education, and entertainment. We believe that bringing the EPIC@X workshop series to ICCV 2021 will provide an important forum for this growing community, in line with previous editions (most recently EPIC@ECCV2020 and EPIC@ICCV2019).
EPIC@ICCV2021 accepts papers for publication in the workshop proceedings. Topics of interest to the workshop include, but are not limited to:
Importantly, this upcoming workshop will also see the introduction of a new, massive-scale egocentric video dataset: Ego4D. The Ego4D dataset is the result of a multi-year project involving 12 universities worldwide. It offers thousands of hours of unscripted daily-life activity video captured by more than 500 unique camera wearers (not just grad students!) across 7 countries, spanning hundreds of scenarios (household, outdoor, workplace, leisure, entertainment, etc.). Not only will Ego4D dramatically expand the hours of diverse egocentric video footage freely available to the research community, but it also introduces a host of new benchmark challenges centered on first-person video understanding: querying episodic memory, analyzing hand-object manipulation, forecasting future activity, transcribing audio-visual conversation, and interpreting social interactions.
If you are interested in learning about Egocentric Perception, Interaction and Computing, including future calls for papers, code, datasets, and jobs, subscribe to the mailing list: email@example.com
Instructions to subscribe: