Aims and Scope

Egocentric camera devices are becoming increasingly popular, both as research prototypes and as off-the-shelf products. They can acquire images and videos and collect multimodal data such as gaze information, GPS position, and IMU data. When connected to head-mounted displays, they can also enable new forms of interaction and visualization. Given this rapid progress, we believe we are only at the beginning: these technologies and their applications can have a great impact on our lives.

Indeed, egocentric camera devices will be able to automatically understand what the wearer is doing, acting on, and manipulating, and where his or her attention is directed. On the scientific side, for example, researchers are already taking advantage of such wearable cameras to monitor and analyze the visual experiences of infants' early life. This work has already yielded important findings on how young children actively explore the world, creating experiences and collecting visual data that facilitate their development and early learning of words and visual objects.

Egocentric perception introduces a series of challenging questions for computer vision, since motion, real-time responsiveness, and generally uncontrolled interactions in the wild are frequently required or encountered. Questions such as what to interpret and what to ignore, how to efficiently represent egocentric actions, and how captured information can be turned into useful data for guidance or log summaries become central.

This new EPIC@X series of workshops aims to bring together the various communities relevant to egocentric perception, including Computer Vision, Multimedia, HCI and the Visual Sciences, and is planned to be held at major conferences in these fields. EPIC@ICCV will accept Full Papers for novel work and Extended Abstracts for ongoing or already published work. Both research and application work related to Egocentric Perception, Interaction and Computing is encouraged, including work that can be demonstrated or is at the prototype stage.

Submissions are expected to deal with human-centric perception including, but not limited to:

  • Eyewear devices for egocentric perception and computation
  • Eyewear devices for acquisition and visualization
  • Egocentric vision for object/event recognition
  • Egocentric vision for summarization
  • Egocentric vision for social interaction and human behavior understanding
  • Egocentric vision for children and education
  • Egocentric vision for health
  • Head-mounted eye tracking and gaze estimation
  • Computational visual behaviour analysis
  • Attention modelling and next fixation prediction
  • Eye-based human-computer interaction
  • Human and wearable devices interaction
  • Symbiotic human-machine vision systems
  • Affective computing with wearable devices
  • Interactive AR/VR and Egocentric perception
  • Augmented human performance
  • (Eye-based) daily life and activity monitoring
  • Benchmarking and quantitative evaluation with human subject experiments

EPIC Community

If you are interested in learning about Egocentric Perception, Interaction and Computing, including future calls for papers, code, datasets and jobs, subscribe to the newly introduced mailing list: epic-community@bristol.ac.uk

Instructions to subscribe:

  • send an email to: sympa@sympa.bristol.ac.uk
  • with the subject: subscribe epic-community
  • and a blank message body