Software

Charades-Ego

This is the public implementation of the method and the accompanying dataset described in the CVPR 2018 paper. Charades-Ego is a dataset of 7,860 videos of daily indoor activities, recorded from both third-person and first-person views, with 68,536 temporal annotations for 157 action classes. The dataset enables learning the link between the two perspectives, actor and observer, and thereby addresses one of the biggest bottlenecks facing egocentric vision research: connecting first-person video to the abundant third-person data on the web. We use this data to learn a joint representation of first- and third-person videos with only weak supervision, and show its effectiveness for transferring knowledge from the third-person to the first-person domain. The implementation and the dataset are available on GitHub.
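
For readers who want to work with the temporal annotations, the sketch below shows one way to parse them into per-video lists of (class, start, end) intervals, assuming the release follows the CSV format of the original Charades dataset, where each row has an id column and an actions column of semicolon-separated "class start end" triples. The file name CharadesEgo_v1_train.csv and the exact column names are assumptions to verify against the actual release on GitHub.

```python
import csv
from collections import defaultdict

# Hypothetical file name; check the actual annotation files in the release.
ANNOTATION_CSV = "CharadesEgo_v1_train.csv"


def parse_actions(actions_field):
    """Parse a Charades-style actions field, e.g.
    'c092 11.90 21.20;c147 0.00 12.60', into (class, start, end) tuples."""
    if not actions_field:
        return []
    intervals = []
    for chunk in actions_field.split(";"):
        cls, start, end = chunk.split()
        intervals.append((cls, float(start), float(end)))
    return intervals


def load_annotations(path):
    """Map each video id to its list of temporal action intervals."""
    per_video = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            per_video[row["id"]].extend(parse_actions(row["actions"]))
    return per_video


if __name__ == "__main__":
    annotations = load_annotations(ANNOTATION_CSV)
    total = sum(len(v) for v in annotations.values())
    print(f"{len(annotations)} videos, {total} temporal annotations")
```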
