user130273

Member since Jun 2021 • Last active Oct 2021
  • 1 conversations
  • 5 comments

Most recent activity

  • in Bangle.js
    Hello @ThomasVikström !
Great to hear that it is not crashing. The structure looks pretty much like mine; the minor differences are:

1. The input x is an Array because my model has more than one input.
2. I set the input with tf.getInput().set(x_array); I think this is better when you have more than one input.
3. Invoking the model is the same.
4. The output is similar: output = tf.getOutput(); instead of selecting only the first element of the output, I select all the output values, because my model has more than one output.
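The steps above can be sketched in Espruino JavaScript. The tensorflow module calls only run on the device, so they are shown in comments; the array handling is plain JS, and all sizes here are illustrative assumptions, not the actual model's shapes:

```javascript
// Illustrative sizes: two model inputs (6 and 3 values), two outputs (2 values each).
// On the Bangle.js you would first create the interpreter:
//   var tf = require("tensorflow").create(2048, model);

// With more than one input, concatenate all inputs into one flat array...
var x1 = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6];
var x2 = [1.0, 2.0, 3.0];
var x = x1.concat(x2);

// ...then hand it to the interpreter and run it:
//   tf.getInput().set(x);
//   tf.invoke();

// With more than one output, read the whole output buffer
// instead of just element 0, then split it per output:
//   var output = tf.getOutput();
//   var out1 = output.slice(0, 2);
//   var out2 = output.slice(2, 4);

console.log(x.length); // 9
```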
  • in Bangle.js
    It works, thank you very much!

  • in Bangle.js
    I managed to load it to the Bangle.js, there were two issues:

1. Apparently base64.b64encode(tflite_model) wasn't working with my base64 version; it was giving a corrupted string. I had to create it with base64.b64encode(tflite_model).decode('utf-8') instead.
2. I had to change the conv1d layers to conv2d layers, otherwise there was a different error.
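A likely explanation for the corrupted string, consistent with the fix described here: in Python, calling str() on the bytes object returned by base64.b64encode() includes the bytes-literal wrapper (b'...') in the result, so the quotes and prefix end up inside the base64 text that atob() later decodes. A small Node.js sketch of the difference (the payload string is illustrative):

```javascript
// In Python, str(base64.b64encode(b"Hello")) gives "b'SGVsbG8='",
// i.e. the b'...' bytes-literal wrapper ends up inside the base64 string.
var wrapped = "b'SGVsbG8='";
var clean = "SGVsbG8="; // what b64encode(...).decode('utf-8') gives

// Decoding the clean string recovers the original bytes:
console.log(Buffer.from(clean, "base64").toString()); // "Hello"

// Decoding the wrapped string yields different, corrupted bytes,
// because the extra characters shift the base64 decoding:
console.log(Buffer.from(wrapped, "base64")); // not "Hello"
```

Corrupted bytes at the very start of the model would also explain the "schema version 0" error mentioned in the earlier post, since the schema version is read from the model header.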

The problem now is that the model does not seem to take any values as input. I trained the model with inputs of shape (1, 12, 3, 1), so I've prepared the data on the Bangle.js in the same format: the input for one prediction is of shape (1, 12, 3, 1), that is, an array of nested arrays.
But when I execute the following commands, the input has been set entirely to NaN.

print(input);
tf.getInput().set(input);
print(tf.getInput());
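One possible cause of the NaN values (an assumption, not confirmed in this thread): tf.getInput() returns a flat Float32Array, and Float32Array.set() coerces each element to a number, so deeply nested sub-arrays become NaN. Flattening the (1, 12, 3, 1) structure first would avoid that. A sketch in plain JS, with the device-only calls in comments:

```javascript
// Recursively flatten a nested array, e.g. shape (1, 12, 3, 1) -> 36 numbers.
function flatten(a) {
  return a.reduce(function (acc, v) {
    return acc.concat(Array.isArray(v) ? flatten(v) : v);
  }, []);
}

// Illustrative nested input of shape (1, 2, 3, 1):
var input = [[[[1], [2], [3]], [[4], [5], [6]]]];
var flat = flatten(input); // [1, 2, 3, 4, 5, 6]

// On the Bangle.js you would then do:
//   tf.getInput().set(flat);
//   tf.invoke();

console.log(flat);
```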
    
  • in Bangle.js
    Hi @Gordon, thank you for your reply.
    My TensorFlow version is the latest one ('2.5.0' from tf.version). I'm attaching the model configuration in a screenshot, and also the information printed when running the model. I don't see anything strange, but I might be missing something?

    I've tried the tool that you recommended, but I couldn't make it work; it keeps giving errors (screenshot also attached).

    I'd appreciate any help. Thanks!

  • in Bangle.js
    I have created a TensorFlow model that expects a (1, 12, 3) sample as input. I converted it to TensorFlow Lite and printed the declaration as in the example from a tutorial in this forum:
    print("var model=atob(\""+str(base64.b64encode(model_no_quant_tflite))+"\");")
    I'm not using quantization because it wasn't reducing the size any further; the model I'm using is 4356 bytes.

    Here is the problem: the command "var tf = require("tensorflow").create(2048, model);" keeps giving the error "Uncaught Error: Model provided is schema version 0 not equal to supported version 3." when I try to load it in the Emulator. (Loading it onto the watch also fails; it even disconnects Bluetooth halfway through the upload, and the upload fails.)

    Do you know what could be the cause of the error?

    Thank you very much.
