    Hi everyone,

    I was searching for an overview of the events generated by the different Bangle.js user input methods, but didn't find anything. From my point of view, the multiple input methods are underappreciated, given that they allow for some really cool apps.
    The reference is also fairly terse, so I tested things with some sample code.
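
    In case anyone wants to reproduce this, a minimal logger along the following lines is enough to see which events fire and what arguments they carry (event names as in the reference; run it from the IDE with the watch connected):

      // Subscribe to every event name from the reference and dump whatever
      // arguments it passes, so you can see when each one fires.
      ["accel", "twist", "tap", "step", "gesture", "aiGesture", "faceUp",
       "touch", "swipe", "stroke", "drag", "lock"].forEach(function (name) {
        Bangle.on(name, function () {
          print(name, arguments);
        });
      });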

    So far I gathered the following:

    Accelerometer:

    • accel: Raw accelerometer data at 1.25 Hz/12.5 Hz, depending on whether the watch was moved recently. On my watch:

      (0,0,-1) means watch lying flat face up,
      (0,0,1) watch lying flat face down
      (0,-1,0) watch standing straight (on bottom side)
      (0,1,0) watch standing upside down (on top side)
      (-1,0,0) watch lying on its right side (the one with the button)
      (1,0,0) watch lying on its left side (the side without the button)

    • twist: Watch was twisted up or down (along the z axis)

    • tap: One side of the watch was tapped; the event also reports whether it was a double tap. This is especially nice since it works while the watch is locked, so it is always available for user input without having to waste power waking the touch screen (see the sketch after this list).

    • step: A step was taken

    • gesture: Raw data describing how the watch was moved. I guess this is not very useful, since the raw data is hard to process?

    • aiGesture: TensorFlow interpretation of the gesture data. This sounds really cool; however, I'm not sure what the power consumption would be if it were active during everyday use.

    • faceUp: Watch face is up or no longer up
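
    To show what these motion-side events look like in code, here is a rough sketch of the handlers I ended up with; the field names (dir, double) are how I read the reference, so treat them as approximate:

      // Double tap on either side works even while the watch is locked,
      // so it makes a cheap "always available" input.
      Bangle.on('tap', function (e) {
        // e.dir is the side that was tapped, e.double is true for a double tap
        if (e.double) print("double tap on the " + e.dir + " side");
      });

      // faceUp passes a boolean: true when the face turns up, false when it no longer is
      Bangle.on('faceUp', function (up) {
        print(up ? "face up" : "no longer face up");
      });

      // twist fires when the watch is twisted quickly
      Bangle.on('twist', function () {
        print("twist");
      });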

    Touch screen:

    • touch: Carried over from Bangle.js 1; only reports whether the left or right side of the screen was touched
    • swipe: Low-resolution touch screen gestures (left/right/up/down)
    • stroke: High-resolution touch screen gestures using Unistroke. I used print to output the raw object data, then added them to Bangle.strokes as mentioned in the reference. Works really great so far (see the sketch after this list)!
    • drag: Raw position data while dragging a finger across the touch screen.
    • lock: The touch screen was locked or unlocked.
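
    And roughly the same for the touch screen side. The stroke template at the end uses the Unistroke class mentioned in the reference, and its coordinates are only placeholders, so paste in the values printed from the raw stroke event instead:

      // drag: raw finger positions and deltas while dragging (b is the press state)
      Bangle.on('drag', function (e) {
        print("drag", e.x, e.y, e.dx, e.dy, e.b);
      });

      // swipe: two arguments, roughly -1/0/1 for left-right and up-down
      Bangle.on('swipe', function (lr, ud) {
        print("swipe", lr, ud);
      });

      // stroke: e.xy holds the interleaved x,y data of the stroke,
      // e.stroke is the name of the best match in Bangle.strokes, if any
      Bangle.on('stroke', function (e) {
        print("stroke", e.stroke, e.xy);
      });

      // After printing a few raw strokes, feed them back in as templates.
      // The coordinates below are made up - use your own recordings.
      Bangle.strokes = {
        down: Unistroke.new(new Uint8Array([88, 20, 88, 70, 88, 120, 88, 170]))
      };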

    Did I get anything completely wrong? Am I missing any other useful input methods (apart from Bluetooth/GNSS)?

    Any input would be appreciated!
