Has anyone been successful in detecting specific gestures using the accelerometer on the Puck.js v2?
I know the Bangle.js seems to use TensorFlow Lite to recognize certain gestures if you train a model for them. Does the Puck.js v2 have the ability to load and recognize the same gesture model?
I'm looking for something where I can train a model to recognize simple gestures (forward, left, right, up, down, draw a Z, draw a circle) and then load it onto the Puck.
It looks like the Puck.js v2 has similar specs to the Bangle.js:
Bangle.js
64MHz nRF52832 ARM Cortex-M4
64kB RAM, 512kB on-chip flash, 4MB external flash
Puck.js v2
64MHz nRF52832 ARM Cortex-M4
64kB RAM, 512kB Flash
If that isn't possible, what's the best way to go about this using just the built-in accelerometer? (I've been playing with `require("puckjsv2-accel-movement").on();`)
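In case it helps anyone exploring the non-TensorFlow route: a very rough fallback for simple directional gestures (forward/left/right/up/down) is to buffer accelerometer samples and classify by the dominant acceleration axis. The `dominantDirection` helper below is pure JavaScript and runs anywhere; the Puck-specific wiring in the comments uses `Puck.accelOn()` and the `accel` event from the Puck.js v2 docs, but I haven't verified it on hardware, so treat it as a sketch. It won't distinguish a Z from a circle; those would need a real model or DTW-style template matching.

```javascript
// Classify a buffered gesture window by its dominant acceleration axis.
// Pure helper - no hardware dependencies.
function dominantDirection(samples) {
  // Sum acceleration over the whole gesture window
  var sum = { x: 0, y: 0, z: 0 };
  samples.forEach(function (s) {
    sum.x += s.x; sum.y += s.y; sum.z += s.z;
  });
  // Pick the axis with the largest summed magnitude, report its sign
  var best = ["x", "y", "z"].reduce(function (a, b) {
    return Math.abs(sum[a]) >= Math.abs(sum[b]) ? a : b;
  });
  return (sum[best] >= 0 ? "+" : "-") + best; // e.g. "+x", "-z"
}

// On a Puck.js v2 you might wire it up like this (untested sketch):
// var buf = [];
// Puck.accelOn(26); // accelerometer at 26Hz
// Puck.on('accel', function (d) {
//   buf.push(d.acc);
//   if (buf.length >= 26) {           // roughly a 1-second window
//     console.log(dominantDirection(buf));
//     buf = [];
//   }
// });
```

A fancier version would subtract gravity (high-pass filter) before summing, since at rest the z axis always dominates.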