EgoCentricGestureNet

A repository with the model and links to the dataset from Egocentric Gesture Recognition for Head-Mounted AR devices (ISMAR 2018 Adjunct).

Steps to test the model

  • Make sure you have all the dependencies listed below installed.
  • Run the download_dataset.sh script; this downloads the dataset (EgoCentricGestures.tar.gz) to the current directory.
  • Move this file to an appropriate location and untar it. This generates two directories and README.txt files explaining the structure of the dataset and directories.
  • Run the model with 'python ego_gesture_net_test.py --test_dir=<full_path_to_test_directory>'. This should output the ground-truth gesture id and the id recognised by the network. A sketch of the full command sequence is given after this list.
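
For reference, the whole sequence might look like the following. The /path/to/data location is a placeholder, and the script is invoked with sh in case it is not marked executable; the exact directory names inside the archive are described by the bundled README.txt files.

    # from the repository root
    sh download_dataset.sh
    mv EgoCentricGestures.tar.gz /path/to/data/
    tar -xzf /path/to/data/EgoCentricGestures.tar.gz -C /path/to/data/
    python ego_gesture_net_test.py --test_dir=<full_path_to_test_directory>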

Dependencies

  • wget
  • python - 3.5
  • torch - 0.3.1
  • torchvision - 0.2.1
  • Pillow - 5.2.0
  • cuda - 8.0.61
  • nvidia driver - 384.59
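
The repository does not give an install command. As a rough sketch, the Python packages above can be installed with pip; note that torch 0.3.1 is an old release, so its CUDA 8.0 wheel may have to be obtained from the PyTorch "previous versions" instructions rather than from a plain 'pip install torch==0.3.1'.

    pip install torchvision==0.2.1 Pillow==5.2.0
    # torch 0.3.1 (CUDA 8.0 build): follow the previous-versions instructions on pytorch.org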

Links

Citation

  • BibTeX entry

@inproceedings{tejo2018ismar,
  author    = {Tejo Chalasani and Jan Ondrej and Aljosa Smolic},
  title     = {Egocentric Gesture Recognition for Head-Mounted AR devices},
  booktitle = {2018 {IEEE} International Symposium on Mixed and Augmented Reality, {ISMAR} 2018 Adjunct},
  year      = {2018}
}
