I tried to create my own gestures. I found this on GitHub: https://github.com/nearform/nodeconfeu-gesture-models
There is a file create_gesture_model.ipynb, which I got running on Colab.
The first problem was a circular import around AccelerationReader. I got rid of that by changing the directory to nodeconfeu-gesture-models before importing the reader, then changing back.
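A sketch of that workaround as a small helper (the repo path and module names in a real session would be whatever your Colab clone uses; this just shows the chdir-import-chdir-back pattern):

```python
import os

def import_from_dir(repo_dir, do_import):
    """Temporarily change into repo_dir, run do_import(), then change back.

    This sidesteps circular/relative-import problems when a notebook
    imports modules from a cloned repo.
    """
    cwd = os.getcwd()
    os.chdir(repo_dir)
    try:
        return do_import()
    finally:
        os.chdir(cwd)  # always restore the original working directory

# Hypothetical usage in Colab (path and module name are assumptions):
# reader = import_from_dir(
#     "/content/nodeconfeu-gesture-models",
#     lambda: __import__("nodeconfeu_watch"),
# )
```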
The next problem was in converting to TFLite: the ops SPACE_TO_BATCH_ND and BATCH_TO_SPACE_ND were unknown. I added these ops to builtin_operator_version_support in export_tflite.py.
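Roughly, the edit looked like this. The exact shape of the table in the repo's export_tflite.py may differ; the entries and version numbers shown here are assumptions for illustration:

```python
# Hypothetical excerpt of export_tflite.py: a table mapping each TFLite
# builtin op name to the op versions the exporter accepts.
builtin_operator_version_support = {
    "CONV_2D": [1, 2, 3],
    "FULLY_CONNECTED": [1, 2],
    # ... existing entries kept as-is ...
    # Added so the exporter no longer rejects these two ops, which the
    # converter emits for dilated convolutions:
    "SPACE_TO_BATCH_ND": [1, 2],
    "BATCH_TO_SPACE_ND": [1, 2],
}
```

Note that this only makes the exporter accept the ops; the runtime on the device still has to register them, which may be why the error below appears later.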
At least I got it working this way; training etc. ran fine.
Next I created the .tfnames and .tflite files and added a handler for aiGesture.
Now I get the error: Didn't find op for builtin opcode 'SPACE_TO_BATCH_ND'
Any idea what to do, or how to do this better?