Gestures, standard and/or user-defined

  • It's early days for questions like this, but let's start preparing for when we get the Bangle.js.
    The gesture event gives some x,y,z,x,y,z,...,x,y,z info. It's hard to recognise gestures based on that alone.
    AFAIK the Bangle needs tflite and tfname files to recognise gestures, which are then reported via the aiGesture event.

    If I'm on the right track:
    Will there be a standard file set, for example with the gestures from the Kickstarter video?
    How could we create our own gestures, for example drawing a horizontal/vertical circle in the air?
    Will we have something like a gesture library, so we could create special sets of gestures for an app?
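For reference, a minimal sketch of what the two events look like on the watch: the raw gesture event hands you a flat x,y,z,x,y,z,... buffer, while aiGesture reports a recognised name. The `toSamples` helper is just an illustration I've made up, not part of any official API:

```javascript
// Hypothetical helper: turn the flat x,y,z,x,y,z,... buffer from the
// raw `gesture` event into an array of [x, y, z] samples.
function toSamples(flat) {
  var samples = [];
  for (var i = 0; i + 2 < flat.length; i += 3)
    samples.push([flat[i], flat[i + 1], flat[i + 2]]);
  return samples;
}

// On the watch itself you would wire it up something like this
// (commented out so the sketch also runs off-device):
// Bangle.on("gesture", function (xyz) {
//   console.log("raw sample count:", toSamples(xyz).length);
// });
// Bangle.on("aiGesture", function (name) {
//   console.log("recognised gesture:", name);
// });

console.log(toSamples([1, 2, 3, 4, 5, 6])); // two [x,y,z] samples
```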

  • Hi! Yes, you're right :)

    Will there be a standard file set, for example with the gestures from the Kickstarter video?

    Yes - there's currently an app on https://banglejs.com/apps/ called 'Gesture Test' that contains some default files.

    How could we create our own gestures, for example drawing a horizontal/vertical circle in the air?

    There's some info on this here: https://github.com/gfwilliams/workshop-nodeconfeu2019/blob/master/step4.md

    Basically Nearform did a 'Google Colab' which is like an interactive Python file + document that walks you through the process.

    Will we have something like a gesture library, so we could create special sets of gestures for an app?

    That was my hope eventually - mainly to have a way of collecting lots of gestures from different people to train the AI. Right now there are gestures from only two people in there, which makes the AI pretty unreliable.

  • Do you think step counting could somehow be done with this too, or is there a better/easier way?

  • https://banglejs.com/reference#l_Bangle_step :)

    It seems you can just look at the acceleration magnitude and see when it goes above/below 1g by some threshold - that seems to work moderately well, and it's what Bangle.js does anyway.
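As a rough sketch of that idea (illustrative only - not the actual Bangle.js firmware algorithm, and the threshold values are made up):

```javascript
// Threshold-based step detection sketch. Magnitudes are in g; a "step"
// is counted each time the magnitude rises above 1g + thr after having
// previously dropped below 1g - thr. The hysteresis stops small jitter
// around 1g from being counted as many steps.
function countSteps(mags, thr) {
  var steps = 0, below = true;
  for (var i = 0; i < mags.length; i++) {
    if (below && mags[i] > 1 + thr) { steps++; below = false; }
    else if (!below && mags[i] < 1 - thr) below = true;
  }
  return steps;
}

// On the watch, the `accel` event supplies {x, y, z, mag} per sample,
// so you could feed the `mag` values straight into something like this.
console.log(countSteps([1, 1.3, 1, 0.8, 1.25, 0.9], 0.15)); // 2
```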

  • Oh, I'll try the code with my devices to see how good it is. Step counting doesn't work well on many trackers - they often count mere hand movements etc. without any real walking. I've also seen some research papers on the net, so it's not easy to do with good accuracy. So if one were already feeding accelerometer coordinates to a TFLite model for recognising gestures, one more model trained for steps would not add that much(?). I'm not sure how big those models are, how fast they are to process, and how that depends on the complexity of the problem.

  • one more model trained for steps would not add that much

    Hmm, I wouldn't like to bet on it :s To save memory/battery the gesture model is kept in flash, and is only loaded and run when a certain level of movement is detected over a certain period - so for instance running wouldn't trigger it, nor would a sharp tap.
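The "only wake the model on the right kind of movement" gating could be sketched like this - entirely hypothetical, with the function name, band thresholds and the 75% rule all made up for illustration:

```javascript
// Sketch of movement gating: a sharp tap produces one or two big
// spikes, running produces sustained high values, so we only consider
// loading the model when most recent samples sit in a moderate band.
function shouldRunModel(diffs, lo, hi) {
  var inBand = 0;
  for (var i = 0; i < diffs.length; i++)
    if (diffs[i] >= lo && diffs[i] <= hi) inBand++;
  return inBand >= diffs.length * 0.75; // sustained, not a lone spike
}

// Per-sample movement amounts over the last window:
console.log(shouldRunModel([0.3, 0.4, 0.35, 0.02], 0.1, 0.8));  // true  (3/4 in band)
console.log(shouldRunModel([0.02, 0.9, 0.02, 0.02], 0.1, 0.8)); // false (tap-like spike)
```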

  • Oh, I see. Can't it 'load and run' directly from flash?

  • Sadly no, it has to be loaded into RAM (at least at the moment) since the flash is external. It does this in the idle loop though, so it can make use of a chunk of the execution stack and avoid using as much program memory.

Posted by @JumJum
