• Hi, as this is not an app for the app store, I thought I'd post it here to see what people make of it.

    I have put together a purely online gesture learning AI for the BangleJS: https://paulcockrell.github.io/banglejs-tensorflow-example/

    This is not the same as the one that generates a model in Python for you to load onto the Bangle; rather, it simply lets your Bangle send raw gesture data to the browser, which then uses a TensorFlow.js model (stored in the browser's local storage) to predict your gestures.
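
    On the watch side, the idea is roughly this (a sketch assuming the built-in 'gesture' event and BLE UART; the actual app in the repo may differ in detail):

    // Stream raw gesture data from the Bangle over Bluetooth UART.
    Bangle.on('gesture', function (xyz) {
      // xyz is a flat Int8Array of accelerometer readings: x,y,z,x,y,z,...
      Bluetooth.println(xyz.join(','));
    });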

    Since the Espruino world is all about JavaScript, I thought it would be nice to have a pure JS implementation for gesture learning. It's all in the browser, so there's no server setup.

    I don't know exactly what this would be applied to (maybe browser card games, or menus...), but I thought it would be fun.

    Props to the following, which helped a lot with this project:

    1. https://medium.com/@devdevcharlie/play-street-fighter-with-body-movements-using-arduino-and-tensorflow-js-6b0e4734e118
    2. https://www.nearform.com/blog/running-tensorflow-lite-on-nodewatch-bangle-js/


  • So you're saying this actually trains a TensorFlow model in the browser? That's awesome!

    When I try it I get a 'Training Failed' alert though - I'm not sure why :(

    But this could be amazing if it can export a TensorFlow Lite model that can then be uploaded to the watch. Currently, actually getting a model trained for yourself is a bit of a pain, but this would make it easy for everyone to use.

  • Yeah, it's all in the browser!

    Yes, you will get an unhelpful 'training failed' message if you record only one gesture.

    The UI is a bit unintuitive, but these are the steps for a successful training run:

    1. Enter a gesture name and select the sample size (5 is fine if the gesture movements are very different)
    2. Click "Record gesture". The page will scroll down to the feedback section; wait for the feedback bar to turn red (recording mode) and the message to say it's ready.
    3. Perform your gestures (the first one always fails, as the raw data needs to be cleaned)
    4. Once the feedback bar goes green, scroll back up to the top of the page and enter a NEW gesture name, keeping the sample value the same as before.
    5. Perform steps 2-3 again
    6. Now you will have recorded TWO gestures (you can confirm this in the Bangle gesture readings section at the bottom of the page)
    7. Click train, and let the magic happen (the sketch below shows roughly what that step does).
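
    Under the hood, the training step is plain TensorFlow.js; a minimal sketch (with illustrative layer sizes and placeholder data, not the repo's exact code) looks something like this:

    // Minimal in-browser training sketch with TensorFlow.js (run inside an async function).
    // NUM_CLASSES and INPUT_LEN are illustrative values.
    const NUM_CLASSES = 2;       // e.g. the two gestures recorded above
    const INPUT_LEN = 50 * 3;    // 50 accelerometer readings * (x,y,z)
    const model = tf.sequential();
    model.add(tf.layers.dense({inputShape: [INPUT_LEN], units: 32, activation: 'relu'}));
    model.add(tf.layers.dense({units: NUM_CLASSES, activation: 'softmax'}));
    model.compile({optimizer: 'adam', loss: 'categoricalCrossentropy', metrics: ['accuracy']});
    // Placeholder data; in the app, xs/ys come from the recorded gestures.
    const xs = tf.randomNormal([10, INPUT_LEN]);
    const ys = tf.oneHot(tf.randomUniform([10], 0, NUM_CLASSES, 'int32'), NUM_CLASSES);
    await model.fit(xs, ys, {epochs: 50, shuffle: true});
    await model.save('localstorage://gesture-model');  // ends up in browser localStorage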

    I can certainly look at having it generate a TFLite model. I wanted to know if this was something of use to the community before investing much time!

    Let me know if you are successful.

  • Wow, GREAT!!!
    Just tested drawing circles above my head and in front of my stomach.
    Works fine on the first test.
    So, at least one member of the community is interested in that ;-)
    What I would like to have is:

    • an open database which collects gesture samples from other users too
    • collect my own set of gestures for my app and train based on the database
    • generate a TFLite model

    To begin with, for testing etc., we could use my shared server.
    Feel free to contact me directly, if you are interested.

  • Wow - this is cool - thanks for sharing - have to try it!

  • "I can certainly look at having it generate a TFLite model. I wanted to know if this was something of use to the community before investing much time!"

    YES!! It totally is - and will actually make the gesture detection something useful :)

    I'd actually filed a bug for a website to collect gestures at https://github.com/espruino/BangleApps/issues/85 but I had no idea where to start with in-browser training.

    @brainfart-be has already done some nice gesture collection stuff as well but I'm not sure it's saved anywhere yet?

    I guess the whole storing gestures thing fits in with something we're hitting more and more now though - I need to add some way to store per-user data behind a login, but also to have shared data.

    So for example app reviews is a big one for the App Loader, but then modules/files/etc in the Web IDE would be really handy too. I've been a bit busy trying to get the Pucks out the door, but hopefully I'll have time for something soon - I guess for now, if you use localStorage, whatever gets added will fit into that nicely.

  • @JumJum Glad it worked for you!
    @Gordon Cool, yeah I think the gesture detection stuff opens up many doors for cool projects.

    Well, as with the apps, this belongs to the community, so if you missed the repo link on the site, here it is: https://github.com/paulcockrell/bangle-tensorflow-game

    (The 'game' in the repo name is there because I had a PhaserIO game hooked in, but it was too much of a distraction from the point of the repo, so I removed it)

    Please feel free to fork the repo and work on it; I don't know how you will go about coordinating if it's not in the main Bangle repo (for issues and feature requests).

    I cannot guarantee my availability for chipping in on this so please take over and run with it... I'll join in where I can :)

    Cheers.

    Paul C

    Side note: it looks like you would need an endpoint on a server to convert the generated TFJS model to any other model type; it doesn't appear to be part of the TensorFlow.js API, unfortunately.

    The model this generates is in local storage (devtools->application->local storage)
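
    For reference, pulling the model back out for prediction looks roughly like this (assuming it was saved under the key 'gesture-model'; run inside an async function):

    const model = await tf.loadLayersModel('localstorage://gesture-model');
    // gestureData: a flat array of readings, same length/order as at training time
    const prediction = model.predict(tf.tensor2d([gestureData]));
    prediction.argMax(-1).print();  // index of the most likely gesture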

  • Hi!

    I have to admit I haven't advanced on the gesture collection at all due to the lockdown (I try to avoid being on the computer as much as possible).
    (You can find what I have so far here: https://bangle.dimitri-gigot.com/gesture.html)

    We should maybe create a new "official" repository for this project.

    What "blocked" me so far is where and how to store the data.
    At some point I was thinking about creating a Node app using GitHub or Google OAuth and simply storing every entry in Mongo.
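
    One possible shape for that idea (purely illustrative - the endpoint, DB names and wiring are made up here, and the OAuth part is left out for brevity):

    const express = require('express');
    const { MongoClient } = require('mongodb');

    const app = express();
    app.use(express.json());
    const client = new MongoClient('mongodb://localhost:27017');

    // Store each submitted gesture sample as one MongoDB document.
    app.post('/gestures', async (req, res) => {
      // expected body: { user, name, samples: [...] }
      await client.db('bangle').collection('gestures').insertOne(req.body);
      res.sendStatus(201);
    });

    client.connect().then(() => app.listen(3000));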

  • @PaulC @brainfart-be, your work is awesome. I will surely try it!! Thanks for posting.

  • Love this @PaulC!

    @JumJum We ran into a lot of issues trying to train the model with data from more than one person. When we demoed what we were physically doing to the person who built the model, we all realised that e.g. "swipe left" was done completely differently by different people. So you end up with a poor model trying to recognise a wide variety of motions. It's actually much better and more accurate to train it with your specific movements.

  • I think if we had data from 100s of people we could maybe revisit the 'global training', so it'd be nice to have the capability to do it - but yeah, right now, personal training seems the way to go.

    It'd be interesting if we could find the actual TF to TFLite model converter code - it might be we can compile it to JS in some way (emscripten?).

  • @ConorONeill,
    thanks for your hint. I see the problem of understanding what a gesture is.
    @brainfart-be added a chart to his tool, which could be a first step to visualize a gesture.

    I would like to have something like a 3D drawing tool to show gestures, but didn't find anything yet.

    I'm working on my own version based on @PaulC's and @brainfart-be's, mainly to get a first understanding. The recording part, saving to local files (not local storage), is running in a first version.
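
    For anyone curious, one simple way to save recorded samples as local files from the browser is a Blob download (a sketch of the general approach, not necessarily the exact code here):

    // Save recorded samples as a local JSON file via a Blob download.
    function downloadSamples(samples, filename) {
      const blob = new Blob([JSON.stringify(samples)], {type: 'application/json'});
      const a = document.createElement('a');
      a.href = URL.createObjectURL(blob);
      a.download = filename;
      a.click();
      URL.revokeObjectURL(a.href);
    }
    // e.g. downloadSamples(recordedGestures, 'gestures.json');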

  • Thanks @ConorONeill ! It will be interesting to see how this project evolves.

    I look forward to seeing what you build @JumJum, it's all very exciting.

    Just a quick note on all this: I have looked high and low for scripts that convert TFJS-generated models to other model types (especially TFLite, as that's what we require for the Bangle), and I don't see any. I also came across a post from April this year (by a TF contributor, I believe) saying that you can only convert models to TFJS, not the other way around, and they don't plan on introducing that feature.

    I just don't want people to invest too much time only to find out there is a massive dead-end with model conversion.

    Hopefully I am wrong, and someone else's Google-Fu is better than mine and turns up a solution.

  • To give an update about my next steps:

    • recording and saving samples locally works fine
    • training on the samples and storing the resulting model locally works fine. Two files are stored: the first is a JSON file with a lot of description, the second is a binary with the weights (see the note below).
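
    Those two files match what TensorFlow.js produces with its 'downloads://' save handler, e.g. (key name illustrative):

    await model.save('downloads://gesture-model');
    // -> gesture-model.json         (model topology + weight manifest)
    // -> gesture-model.weights.bin  (binary weight data)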

    OK, what's next? The tool from @PaulC takes the model, reads gesture data from the Bangle, and recognizes the gesture in the browser. What we would like is to do both steps, reading gesture data and recognizing the gesture, on the Bangle. For that we need a TFLite model.

    As @paulc mentioned, there is no way to convert a TFJS model to TFLite.
    First I tried to find something like that in TFJS - no success.

    Next I installed tflite_convert for Python.
    It failed on my Ubuntu server, but I got it running on Windows.
    But again no success: tflite_convert expects the model as a pbtxt or pb file.

    Next I found another converter, which converts a TFJS model to a Keras model, which could then be converted to TFLite (?).
    I started tensorflowjs_wizard, and...
    Unfortunately, this converter runs into an error:
    AttributeError: 'list' object has no attribute 'items'
    No idea what that is, but now it's the weekend :-)

  • @PaulC
    The current converter no longer accepts the model file created by your app.
    Digging deeper, the saved model has an array where the converter expects an object.
    Going back to your app: you use tfjs@1.0.0, while the current version is @2.0.0.
    Did you check/try/have experience/whatever with the current version?

  • Switched the scripts to:

    https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@2.0.0/dist/tf.min.js
    https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-vis@1.1.0/dist/tfjs-vis.umd.min.js
    

    Training runs fine with the updated scripts.
    Next I converted the output from JSON format to Keras H5 format:

    tensorflowjs_converter --input_format=tfjs_layers_model --output_format=keras ./model.json ./model.H5
    

    Then I converted the Keras file to a tflite file:

    tflite_convert --output_file c:\temp\bangletf\Model.H5\lite.tflite --keras_model_file c:\temp\bangletf\Model.H5
    

    Now I have a tflite file, but cannot test it - I'm not at home, and that's where my Bangle is :-(

  • That's great news - thanks!

    If you're interested, it's actually possible to run TensorFlow in the emulator: https://www.espruino.com/ide/emulator.html

    You just have to convert it to Base64 first, since the emulator doesn't have a filesystem yet: https://github.com/espruino/Espruino/tree/master/libs/tensorflow#actually-using-it
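
    Following that README, using the model looks roughly like this (arena size and variable names are illustrative):

    var model = atob("...");  // the base64-encoded .tflite file
    var tf = require("tensorflow").create(2048, model);  // 2048-byte arena; adjust to fit
    tf.getInput().set(accelData);  // flat array of accelerometer readings
    tf.invoke();
    print(tf.getOutput());  // one score per gesture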

  • @JumJum Fantastic work! I have updated the versions in my repo and it still works (surprising, as it's a major version bump).

    I have added a download button which will bring down the model on the prediction page.

    I ran through the conversion steps you posted, and they do produce a TFLite model (your Google-Fu is strong!)

    The problem I see now is that the TFLite model it produces is ~14KB, and around 19KB when converted to Base64. I'm not sure if this is too large. I see the Python script does quantization to optimize for size.

    I went through the steps @Gordon linked to, and the emulator runs out of memory; the model used by the gesture app is around 5KB when I checked. So maybe we can do some optimization during the conversion process?

  • @paulc,
    you put me on the right track ;-)
    There is an option for tensorflowjs_converter, --quantization_bytes=1 (or 2).
    Maybe this will help to reduce the size. At least it should reduce the size of the weights file in the H5.
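
    If it does, the earlier converter call would just gain the flag (untested here, and it's not clear the quantization applies on the Keras/H5 path):

    tensorflowjs_converter --input_format=tfjs_layers_model --output_format=keras --quantization_bytes=1 ./model.json ./model.H5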
