How to use quantized TensorFlow Lite files on Bangle? #5302
Posted at 2021-09-24 by Robin

Fri 2021.09.24

Hi Thomas @ThomasVikström - while I have no experience with TensorFlow, I was able to locate a sample code snippet. Until others with more experience respond, there are other samples to follow:

Posted at 2021-09-25 by ThomasVikström

Thx @robin! What I'm not understanding is how to use the tf-lite model files - can/should I save them to Bangle first and invoke them? How?

Posted at 2021-09-25 by ThomasVikström

Probably the below code snippet is part of the solution, but how do I copy the content from the .lite file into …atob("....")… ? The .lite file seems to be a binary file - should I convert it first?
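For reference, the basic pattern for running a model on Espruino looks roughly like this (a minimal sketch: the base64 string, the input values and the 2048-byte arena size are placeholders, not a working model):

```js
var base64Model = "AAAA"; // placeholder - paste the model's base64 string here
var model = atob(base64Model);                      // decode to raw .tflite bytes
var tf = require("tensorflow").create(2048, model); // 2048 = arena size in bytes
tf.getInput().set([0.1, 0.2, 0.3]);                 // fill the input tensor
tf.invoke();                                        // run inference
print(tf.getOutput());                              // raw output tensor (class scores)
```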
Posted at 2021-09-25 by NebbishHacker

Try using the File Converter to convert it to Base64.

Posted at 2021-09-25 by @gfwilliams

Hi! That's great news about the export from Edge Impulse - I didn't realise you could do that!

I think the best solution is actually to upload via the IDE. So go to the IDE, click the Storage icon (4 discs) in the middle of the screen, then 'Upload a file', choose the file and name it '.tfmodel'.

There should really be a proper tutorial (and if Edge Impulse's export really works then I think that'd be a great thing for me to add), but right now the main examples are at https://github.com/espruino/Espruino/blob/master/libs/tensorflow/README.md#actually-using-it and https://nodewatch.dev/software#google-colab-from-workshop (which includes a Google Colab for gesture training, as well as the code to export the .lite file to base64 if you wanted to go that route).
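Once the model is stored as '.tfmodel' (optionally with a comma-separated list of class names in a '.tfnames' file), a minimal sketch of reacting to gestures on the watch, assuming Bangle.js's built-in aiGesture event:

```js
// Minimal sketch: with ".tfmodel" present, Bangle.js runs detected
// gestures through the model and raises an "aiGesture" event.
Bangle.on('aiGesture', function(gesture, weights) {
  // gesture: best-matching name from ".tfnames" (undefined if absent)
  // weights: the raw model outputs
  console.log("Detected gesture:", gesture, weights);
});
```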
Posted at 2021-09-25 by ThomasVikström

@nebbishhacker & @gfwilliams With your guidance I've now been able to try out my very simple model, and it works technically, albeit it almost always shows "right" even when it should show "left". This might very well be because I only had 10 samples of each, which is probably far too small a dataset - I'll look into this.

I had actually tried the Google Colab workshop, but the quantized export always fails with "ValueError: the operator SPACE_TO_BATCH_ND is not supported by TFLite Micro", and as I for now prefer Edge Impulse for its simplicity, I did not dig deep into possible causes of the error message. @gfwilliams

Posted at 2021-09-25 by Robin

Sat 2021.09.25

How about directly to the 'Tutorials' forum heading:

Posted at 2021-09-25 by ThomasVikström

That could be an option, especially if it were possible to paste images (= screen captures) from Espruino Web IDE, Edge Impulse Studio etc.

Posted at 2021-09-25 by Robin

Use the

Posted at 2021-09-25 by ThomasVikström

Well, yes... but then I'd need to store the images at some website and paste the URL into the link box. I can't just paste the image from the clipboard like in e.g. Word.

Posted at 2021-09-27 by @gfwilliams

You can attach images to the posts... To be honest, even if you just emailed me it'd be a great help and I could convert them. It'd just be nice to have a decent tutorial on it.

Posted at 2021-09-27 by ThomasVikström

Sure, I'll write down the steps, take screenshots and provide them one way or another to you.

After some struggles (I'm new to JavaScript), I've since an hour ago finally been able to collect gesture data, train it in Edge Impulse and upload the model to Bangle. Now it works fine, and I have e.g. been able to control a PowerPoint presentation just by twitching my hand. That comes in handy as I'm a teacher :-)

When using CSV files with Edge Impulse, the requirement is one sample per file; the format is like this:
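A hedged illustration of both the file layout and how it can be produced on the watch (the column names and the 80 ms sample spacing are assumptions, not verified Edge Impulse requirements):

```js
// Target CSV layout, one sample per file:
//   timestamp,x,y,z
//   0,-0.02,0.01,0.98
//   80,-0.35,0.12,0.77
// Sketch: log accelerometer readings to a file in Storage.
var file = require("Storage").open("sample1.csv", "w");
file.write("timestamp,x,y,z\n");
var t = 0;
Bangle.on('accel', function(a) {   // a.x/a.y/a.z are in g
  file.write(t + "," + a.x + "," + a.y + "," + a.z + "\n");
  t += 80;                         // default poll interval is ~80ms (12.5Hz)
});
// When the sample is complete, stop recording:
// Bangle.removeAllListeners('accel');
```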
With Bangle.js I've so far created one file per sample, but it quickly gets tedious when I need to download a lot of CSV files from Bangle to my computer. Instead I'd like to create one big file per event (e.g. one file for samples of left twitches, one for right, one for up, ...). But how do I split these files into separate CSV files on my computer? I'll later search for some Python code or similar, but if anyone has something ready-made, I'm grateful.

Posted at 2021-09-27 by ThomasVikström

Ok, to split files with Python, this can be used (found it at Stackoverflow):
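The snippet itself isn't shown here; a minimal equivalent sketch in JavaScript (Node.js) rather than the original Python, assuming the combined file separates samples with repeated header rows and using a hypothetical left.csv as input:

```js
const fs = require("fs");

// Split one big CSV into numbered per-sample files ("left.1.csv", ...).
const lines = fs.readFileSync("left.csv", "utf8").split(/\r?\n/);
const header = lines[0];
let sample = [];
let count = 0;

function flush() {
  if (!sample.length) return;
  count++;
  fs.writeFileSync("left." + count + ".csv",
                   [header].concat(sample).join("\n") + "\n");
  sample = [];
}

for (const line of lines.slice(1)) {
  if (line === header) flush();          // repeated header -> a new sample starts
  else if (line.trim() !== "") sample.push(line);
}
flush();                                 // write the final sample
```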
Posted at 2021-09-29 by ThomasVikström

@gfwilliams, I have now started with the tutorial - I hope it's ok that I use my own website for it? This way I also learn how to add posts there using Wordpress.

Posted at 2021-09-30 by @gfwilliams

Great! Well, it's totally up to you. I think it'd be nice to have something on the Espruino website, but I guess I can always have a page that just links to yours.

Posted at 2021-09-30 by ThomasVikström

Well, I'd also prefer it to be on the Espruino website - I don't have any specific need to have it on my own website. Do note that I'll also cover the Edge Impulse part in the tutorial, so it will not only be about Bangle.

Seems like your website is pretty much done with GitHub. I seem to have a GitHub account from before, so I'll set up a private project and try to create the tutorial there instead. Then I guess I can one way or another share or send the content to you? Don't know yet though how the image embedding/linking works - I'll need images mainly for the Edge Impulse part, so in the end perhaps not too many.

Posted at 2021-10-01 by @gfwilliams

Ok, great! Did you see this already? http://www.espruino.com/Writing+Tutorials There's a note in there about how to handle images too.

Posted at 2021-10-01 by ThomasVikström

Thx for the link, hadn't seen it! Had already started adding a few lines of text and images that I can copy over once I fork the EspruinoDocs repository. Seems to be pretty straightforward to use.

Posted at 2021-10-02 by ThomasVikström

@gfwilliams, have now created the tutorial using GitHub and made a pull request. Whenever you (or others) have time, take a look at it and try it out, and shoot me some comments!

Posted at 2021-10-02 by johan_m_o

@ThomasVikström That looks really good! My Bangle.js v1 is currently dead because I've lost my charging cable, otherwise I would get on testing this straight away. Been wanting to dabble with this kind of stuff since I got the watch, but never had the time to figure it out. That tutorial of yours is gonna be a godsend once I have a watch that's running again.

Posted at 2021-10-02 by ThomasVikström

@johan_m_o Thx for your comments!

Posted at 2021-10-03 by Robin

Sat 2021.10.02
Tutorial at GitHub
Direct link

Posted at 2021-10-04 by @gfwilliams

Thank you - this looks brilliant! I'm a bit busy right now with the KickStarter, but I'll try and give this a go when things calm down. It looks so much easier/quicker than the Python alternative. Published version on the website is here: http://www.espruino.com/Bangle.js+EdgeImpulse

Posted at 2021-10-04 by ThomasVikström

Great, thx!

Posted at 2021-10-05 by Robin

Mon 2021.10.04

Nicely done @ThomasVikström ! > psst! . . . Hey @gfwilliams, methinks you have found your new tutorial writer!! *wink, wink* ;-)

Posted at 2021-10-05 by ThomasVikström

Thx @robin !

Posted at 2022-02-04 by ThomasVikström

Just an update to this for other AI & ML nerds like me. With Bangle.js 2 I'm now able to spell the English alphabet by "drawing" characters in the air (CAPS only + …).

The main change I've made compared to the tutorial I wrote is putting the below code snippet at the beginning of the programs used to collect the gestures and to recognise them. The code sets the sensitivity for starting and ending the gestures. Depending on your use case, you might want to tweak the settings accordingly.
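A sketch of the kind of snippet described, assuming the gesture parameters exposed by Bangle.setOptions(); the values below are illustrative defaults, not the author's tuned settings:

```js
Bangle.setOptions({
  gestureStartThresh: Math.pow(800, 2),  // squared accel difference that starts a gesture
  gestureEndThresh: Math.pow(2000, 2),   // squared accel difference that ends a gesture
  gestureInactiveCount: 4,               // samples kept after the gesture has ended
  gestureMinLength: 10                   // minimum samples before a gesture is reported
});
```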
If you want to connect your Bangle to e.g. a computer and write in any application, the below code is what I uploaded to Bangle. I'm sure it could be optimised and cleaned up (and commented!), but as a proof of concept it works well enough for me. Feel free to suggest improvements though - that way I'll learn JS better myself.
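The uploaded program isn't reproduced here; a proof-of-concept sketch in its spirit, assuming the Espruino ble_hid_keyboard module and the on-watch .tfmodel/.tfnames gesture events (on Bangle.js, BLE HID must also be enabled in Settings):

```js
// Sketch (not the author's original code): act as a Bluetooth HID
// keyboard and type the letter recognised by the on-watch model.
var kb = require("ble_hid_keyboard");
NRF.setServices(undefined, { hid: kb.report }); // advertise as a HID keyboard

Bangle.on('aiGesture', function(gesture) {
  if (!gesture) return;                  // no ".tfnames" -> no class name
  var key = kb.KEY[gesture.toUpperCase()];
  if (key !== undefined) kb.tap(key, 0); // send key press + release
});
```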
Posted at 2022-02-04 by myownself

That is really cool! How about a Bangle on each arm and using semaphore? :D

Posted at 2022-02-04 by ThomasVikström

Thx! Somehow it's not replacing my keyboard either :D

Right before Christmas I actually put a Bangle on my ankle and taught it to recognise if I had shoes or socks on my feet when walking. While the accuracy was not great, it was better than random. I guess by tweaking the gesture sensitivity, using other types of shoes, and collecting lots more data, the accuracy would've been better. And no, I don't need AI to tell me what I'm wearing - HI (Human Intelligence) is good enough for that :)

Posted at 2022-02-04 by Serj

Wow! Looks amazing!

Posted at 2022-02-04 by myownself

Speaking of keyboards, here is an idea for you. A friend and I discussed that a Bangle could be trained as a keylogger using ML and accelerometer data. At least for one hand, although we think touch typing with two hands might even work. What are your thoughts?

Posted at 2022-02-04 by ThomasVikström

Thx @serj !

Posted at 2022-02-04 by ThomasVikström

Not sure I understand - do you mean that Bangle could be trained to recognise what I'm typing on a keyboard? And this could be used later when I'm typing on e.g. a table surface, with Bangle connected to a device without an external keyboard, like a phone? Or perhaps you mean something else...?

Posted at 2022-02-04 by myownself

I must type differently. My wrists hover like I am playing a piano when I type. That is what I meant.

Posted at 2022-02-04 by ThomasVikström

And you use all fingers, not only index fingers? If using all fingers, how would it be possible to know which finger you pressed a key with, even if you are moving your wrists?

Posted at 2022-02-04 by myownself

Yes, I use all my fingers. My assumption is that I use a different amount of force, and that the distance to each key differs based on the position of the key and which finger I am using. Maybe it is ridiculous though - it was just an idea I discussed with a friend over coffee. I haven't actually looked at the accelerometer data when I am typing.

Posted at 2022-02-04 by ThomasVikström

Ok, then I understand your train of thought!

PS Similar solutions can be found by searching for

Posted at 2022-02-07 by @gfwilliams

This looks amazing - thanks for posting up! And this is all still done with Edge Impulse? It's really impressive - I always felt that the Tensorflow stuff was a bit of a steep learning curve and a bit painful to train, and it seems like Edge Impulse has really made massive strides there.

Posted at 2022-02-07 by ThomasVikström

Thx @gfwilliams !

Posted at 2022-10-11 by ThomasVikström

You wanted more :-) While I've not been able (or even tried) to add any Espruino device to what I've been working on, this was published yesterday. In the video and tutorial I'm showing and explaining how to use an EEG headset and ML (Machine Learning) to control a very simplistic Pong game. Next related projects are already under way, so if interested, stay tuned :-)

Posted at 2022-10-11 by Serj

Thank you, Sir - THIS made my day! I still remember the previous video that I associate with AI on the BJS watch - kudos! Also, such works inspire one to invest time and energy in open source!

Project idea: a virtual mouse based on the BJS watch + TensorFlow. The cursor and buttons would be controlled by moving an empty hand on the table or in the air, simulating mouse movement :) Cheers!

Posted at 2022-10-11 by ThomasVikström

Thanks @serj!
I've added the previous video to my YouTube channel - somehow it had escaped me to do it earlier.

Posted at 2022-10-11 by Serj

Great! Interesting details! I hope the students enjoy it 👍 By the way, do you have a practice of recording lectures on video? Or is it prohibited? It would be great to listen to upcoming lectures on AI in combination with technologies such as TensorFlow Lite, the BJS watch, etc. 👍

Posted at 2022-10-11 by ThomasVikström

If the lectures were virtual (they aren't), recording and sharing publicly might be possible as long as the students' names or faces are not visible. But in my case it will be a blend of F2F workshops with some theory in between, so no chance of recording.

Posted at 2022-10-12 by @gfwilliams

That looks great! Thanks again for all your work on the Edge Impulse stuff (it just popped up again on my twitter feed).

Last week I was at Nodeconf and they had a workshop on Tensorflow.js, and it also mentioned https://teachablemachine.withgoogle.com If some of you haven't seen this, give it a go (especially the image version at https://teachablemachine.withgoogle.com/train/image) - it's insane how quickly you can teach it to do useful things with the webcam.

I did some tutorials for a workshop, and very quickly added one on using it and Bangle.js to detect you picking your nose and to buzz a Bangle.js to stop you: https://github.com/gfwilliams/workshop-nodeconfeu2022/blob/main/tensorflow.md I'll bring the other tutorials there onto the main Espruino website soon as well: https://github.com/gfwilliams/workshop-nodeconfeu2022

Posted at 2022-10-12 by ThomasVikström

Thx Gordon, I learned a lot myself in the project!

Posted at 2022-10-17 by ThomasVikström

@serj you might've seen the newest AI-related video on my YouTube channel? I try to keep you entertained :-D Next AI project might involve a Bangle.js, in case I'm able to communicate through Bluetooth with Nicla Sense ME as asked about here. I'm still a complete newbie in this area, so don't know how easy or difficult it is, but as soon as the Nicla arrives I'll start looking into it more.

Posted at 2022-10-18 by Serj

Great! I especially liked the possibility of typing - it turns out you could combine this system with a browser and surf the Internet (e.g. combined with the Vimium plugin)? Then you could dream further and combine it with Espruino and things from the real world - for example, controlling a quadcopter. But that's just a thought =)

By the way, I saw the new video on the forum earlier, because notifications of new messages come to my mail. However, everything is at hand on YouTube - in the future it will be possible to quickly find any video =)

Posted at 2022-10-19 by ThomasVikström
But long before that I'll try to sneak in BJS communicating with an Arduino device through Bluetooth - AI will play a part as well.
Posted at 2021-09-24 by ThomasVikström
I'm trying to wrap my head around TensorFlow Lite on Bangle. I have successfully managed to collect gesture data (twitching my hand left and right) from the watch and have also trained a neural network in Edge Impulse.
From Edge Impulse I can then download float32 and int8 quantized files as well as the full SavedModel files (see their forum: https://forum.edgeimpulse.com/t/using-edge-impulse-with-other-boards/88/9).
I'm attaching the int8 and the SavedModel files for reference.
My question is: how can I deploy these files to Bangle and use them there?
I have backed Bangle v2 and am hoping I can include it in my master's thesis research. The thesis project is to conduct an AI course at university level (during spring 2022); as the main platform I'm planning to use Edge Impulse, and I'm now "hoarding" different edge devices like Bangle, OpenMV Cam, HiMax etc. for collecting different types of data. Bangle v2 seems like an optimal fit for accelerometer data as it can be transferred wirelessly. If I first get something working on Bangle v1 and later on v2, I'll see if my university would be willing to buy 10 to 20 watches for the students to use.