A curated list of egocentric (first-person) vision and related area resources
-
Updated Oct 14, 2024
A repo for training and fine-tuning hand segmentation models.
[CVPR 2022] Joint hand motion and interaction hotspots prediction from egocentric videos
[NeurIPS 2024] Official code for HourVideo: 1-Hour Video Language Understanding
Action Scene Graphs for Long-Form Understanding of Egocentric Videos (CVPR 2024)
The official repo for our paper "POV-Surgery: A Dataset for Egocentric Hand and Tool Pose Estimation During Surgical Activities" (MICCAI 2023)
Making a long story short: A multi-importance fast-forwarding egocentric videos with the emphasis on relevant objects @ Journal of Visual Communication and Image Representation 53 (2018)
A Weighted Sparse Sampling and Smoothing Frame Transition Approach for Semantic Fast-Forward First-Person Videos @ IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2018
Code for the paper "Differentiable Task Graph Learning: Procedural Activity Representation and Online Mistake Detection from Egocentric Videos" [NeurIPS (spotlight), 2024]
The official code and data for the paper "VidEgoThink: Assessing Egocentric Video Understanding Capabilities for Embodied AI"
A collection of related papers and datasets for research
CVMHAT: Multiple Human Association and Tracking from Egocentric and Complementary Top Views, IEEE TPAMI.