
Wavify

Gesture-based presentation helper that uses an ML model to detect and recognize various hand gestures.

Requisites

A functional camera (and your permission to use it).

How It Works

  1. Using the GUI, the user selects the camera mode, the presentation app they are using, and two hand gestures that will act as signals to enable and disable gesture shortcuts, and calibrates the ML model if needed.
  2. Once the chosen presentation app enters presentation mode, the ML model starts watching for the signal hand gestures.
  3. When the chosen "enable shortcuts" hand gesture is detected, the user can trigger presentation keyboard shortcuts with simple hand gestures.
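The toggle logic in the steps above can be sketched as a small state machine. Everything here is an illustrative assumption, not Wavify's actual code: the `GestureController` class, the gesture names, and the shortcut mapping are hypothetical placeholders for whatever the ML model emits and whatever keys the chosen presentation app expects.

```python
# Hypothetical mapping from recognized gesture labels to the key the
# presentation app expects. Names are placeholders, not Wavify's real config.
SHORTCUTS = {"swipe_left": "Left", "swipe_right": "Right", "fist": "Escape"}

class GestureController:
    """Tracks whether gesture shortcuts are enabled and resolves gestures to keys."""

    def __init__(self, enable_gesture="thumbs_up", disable_gesture="thumbs_down"):
        self.enable_gesture = enable_gesture    # signal gesture that turns shortcuts on
        self.disable_gesture = disable_gesture  # signal gesture that turns shortcuts off
        self.enabled = False                    # shortcuts start disabled

    def handle(self, gesture):
        """Return the key to send for this gesture, or None if nothing should happen."""
        if gesture == self.enable_gesture:
            self.enabled = True
            return None
        if gesture == self.disable_gesture:
            self.enabled = False
            return None
        if self.enabled:
            return SHORTCUTS.get(gesture)  # unknown gestures map to None
        return None  # shortcuts disabled: ignore everything else
```

In this sketch, gestures seen before the enable signal are ignored, which matches step 3: shortcuts only fire after the "enable shortcuts" gesture has been detected.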

User settings to implement

  1. Camera mode (selfie or normal)
  2. Presentation app (dropdown menu? preconfigured list?)
  3. "Toggle" hand gestures
  4. Calibration menu
  5. Settings to customize sign mapping
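The settings above could live in a single structure that the GUI reads and writes. This is a minimal sketch under stated assumptions: the field names, defaults, and the `WavifySettings` name are all hypothetical, and the open questions from the list (dropdown vs. preconfigured app list) are left undecided.

```python
from dataclasses import dataclass, field

@dataclass
class WavifySettings:
    """Hypothetical container for the user settings listed above."""
    camera_mode: str = "selfie"            # 1. "selfie" or "normal"
    presentation_app: str = "PowerPoint"   # 2. chosen from a (to-be-decided) app list
    enable_gesture: str = "thumbs_up"      # 3. "toggle on" signal gesture
    disable_gesture: str = "thumbs_down"   # 3. "toggle off" signal gesture
    calibrated: bool = False               # 4. set by the calibration menu
    gesture_map: dict = field(             # 5. customizable sign-to-shortcut mapping
        default_factory=lambda: {"swipe_left": "Left", "swipe_right": "Right"}
    )
```

Keeping the mapping in `gesture_map` means item 5 (customizing which sign triggers which shortcut) reduces to editing one dictionary rather than touching the recognition code.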