@JumJum Fantastic work! I have updated the versions on my repo and it still works (surprising, as it's a major version bump).
I have added a download button which will bring down the model on the prediction page.
I ran through the conversion steps you posted, and they do produce a TFLite model (your Google-Fu is strong!)
The problem I see now is that the TFLite model it produces is ~14 kB, and around 19 kB once converted to base64. I'm not sure if this is too large. I see the Python script does quantization to optimize for size.
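For what it's worth, the base64 growth is just the expected ~4/3 encoding overhead, so nothing is going wrong in that step. A quick sanity check (the `model.tflite` filename is only a placeholder for whatever the conversion script writes out):

```python
import base64

# Rough check of the base64 overhead: a ~14 kB .tflite grows to roughly
# 14 * 4/3 ≈ 19 kB once encoded, which matches what the download button fetches.
with open("model.tflite", "rb") as f:
    raw = f.read()

encoded = base64.b64encode(raw).decode("ascii")
print(len(raw), "bytes raw ->", len(encoded), "bytes as base64")
```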
I went through the steps @Gordon linked to and the emulator runs out of memory. I also checked the model in the gesture app and it's around 5 kB, so maybe we can do some further optimization during the conversion process?
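In case it helps, here is a minimal sketch of what post-training quantization with `tf.lite.TFLiteConverter` usually looks like. This is not the exact script from the tutorial; `saved_model_dir` and `calibration_samples` are placeholders for whatever the training script actually produces, and the full-integer options are optional extras that often shrink the model further for microcontroller targets:

```python
import tensorflow as tf

# Convert a saved model to TFLite with post-training quantization.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")

# Basic size optimization (weight quantization).
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Optional: full-integer quantization, which typically shrinks the model
# further and suits microcontrollers, but needs a small calibration set.
def representative_data():
    for sample in calibration_samples:  # e.g. a few hundred training inputs
        yield [sample.astype("float32")]

converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)

print(f"TFLite model size: {len(tflite_model)} bytes")
```

If the existing script already sets `Optimize.DEFAULT`, adding the representative dataset and integer-only options would be the next thing I'd try to get closer to the ~5 kB of the gesture app's model.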