Is it possible to control a three.js project?

  • Dear community, I am currently working on an artwork for my next interactive installation and trying out new ways/ideas. So I stumbled upon Puck.js.

    What would be the best way to control a 3D artwork created with three.js using a Puck.js v2?

    I assume this is somehow possible.

    For example:
    The 3D artwork will run on a huge TFT screen or projector and will be created with three.js as a web page running in a Chrome browser on a Windows 10 or Pi-like device. Using the Puck's gyro sensor, one could move around with a Puck in their hand, controlling the rotation of the displayed 3D objects. I suppose I could use the Web Bluetooth API on the web page to collect the sensor data from the Puck?
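
    Something along these lines is what I have in mind (a very rough, untested sketch: it assumes the Puck runs a small script that prints "x,y,z" lines over the standard Nordic UART service, and that a mesh called cube already exists in the scene):

    ```js
    // Standard Nordic UART service UUIDs; the "x,y,z\n" line format is an
    // assumption - the Puck would need a small script printing it.
    const UART_SERVICE = '6e400001-b5a3-f393-e0a9-e50e24dcca9e';
    const UART_TX      = '6e400003-b5a3-f393-e0a9-e50e24dcca9e'; // notifications from the Puck

    let buffer = '';

    async function connectPuck(onTilt) {
      const device = await navigator.bluetooth.requestDevice({
        filters: [{ namePrefix: 'Puck.js' }],
        optionalServices: [UART_SERVICE]
      });
      const server = await device.gatt.connect();
      const service = await server.getPrimaryService(UART_SERVICE);
      const tx = await service.getCharacteristic(UART_TX);
      tx.addEventListener('characteristicvaluechanged', (ev) => {
        buffer += new TextDecoder().decode(ev.target.value);
        let i;
        while ((i = buffer.indexOf('\n')) >= 0) {   // one reading per line
          const [x, y, z] = buffer.slice(0, i).split(',').map(Number);
          buffer = buffer.slice(i + 1);
          if (![x, y, z].some(isNaN)) onTilt(x, y, z);
        }
      });
      await tx.startNotifications();
    }

    // requestDevice() must be triggered by a user gesture, e.g. a button click:
    document.querySelector('#connect').addEventListener('click', () =>
      connectPuck((x, y, z) => {
        cube.rotation.x = x;  // 'cube' stands in for a mesh in the existing scene
        cube.rotation.y = y;
      })
    );
    ```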

    Are my assumptions correct?

  • It looks easy enough. But I have not used the Web Bluetooth API, so I only know how to do it using MQTT and EspruinoHub. Puck.js can advertise sensor data over BLE, EspruinoHub converts it to MQTT messages, and your app connects to the MQTT server using WebSockets (e.g. mqtt.js), receives the messages and processes them.
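
    A rough, untested sketch of both sides, assuming a typical EspruinoHub/mosquitto install (the broker address, WebSocket port and topic layout will differ on your setup):

    ```js
    // --- On the Puck.js (Espruino): advertise scaled accelerometer data ---
    Puck.accelOn(12.5);                        // accelerometer/gyro at 12.5 Hz (Puck.js v2)
    Puck.on('accel', function (a) {
      // A BLE advertisement only fits ~20 bytes of manufacturer data,
      // so scale the 16-bit readings down before packing them as JSON.
      NRF.setAdvertising({}, {
        manufacturer: 0x0590,                  // Espruino's manufacturer ID
        manufacturerData: JSON.stringify({ x: a.acc.x >> 8, y: a.acc.y >> 8 })
      });
    });

    // --- In the browser: receive it via mqtt.js over WebSockets ---
    const client = mqtt.connect('ws://your-hub.local:9001'); // broker URL/port are placeholders
    client.on('connect', () => client.subscribe('/ble/advertise/#'));
    client.on('message', (topic, payload) => {
      // EspruinoHub republishes decoded advertisement fields under per-device
      // subtopics; log them once to see the exact topic names on your install.
      console.log(topic, payload.toString());
    });
    ```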

  • Maybe not the most straightforward answer, but you should be able to get it running based on these two posts.
    The first is a Web Bluetooth app connecting to a gyro device and visualising the data using three.js. It's not an Espruino device, but you should get the idea.
    http://forum.espruino.com/conversations/347308/#comment15271498
    The second is a Web Bluetooth app connecting to an Espruino Bangle.js watch and visualising using tinydash.
    http://forum.espruino.com/conversations/346860/#comment15254612

    So using bits from both examples you should be able to do what you want. I want to make a version of the second demo using three.js, but it's not going to be any time soon.

    Hope that helps.

  • Thanks for the links @kri100s - those look like they'd be perfect.

    @user112895 I'd love to see what you come up with!

  • @SergeP I especially like the MQTT idea, but can this also run on a Windows 10 machine in case I have to use one? As far as I understand it, the conversion to MQTT messages happens not on the Puck.js but afterwards, when the advertised sensor data arrives at the client?

    @kri100s thanks for the links. I suppose I could first do a quick & dirty test with my Samsung Galaxy S8 as well, without ordering a Puck.js?

    Follow-up general question:
    Any idea how to implement good, reliable distance measuring (0.5-4 m, resolution of at least 5 cm steps)? I am currently looking into the "Grove Ultrasonic Sensor", but an all-in-one solution, maybe with the Puck.js and Bluetooth, would be much preferable; I haven't been able to find one so far (all online sources state that Bluetooth distance measuring is quite unreliable).

  • I especially like the MQTT idea, but can this also run on a Windows 10 machine in case I have to use one?

    I do not know. Maybe, but that is a question for somebody who has already tried it. I use a Banana Pi M64 board as a server, with both EspruinoHub and mosquitto (and Bluetooth) running on it. Maybe it can be started on Windows 10, too. And yes, EspruinoHub converts BLE advertisements to MQTT messages. What is good for me is that I can put together a three.js controller this way in 10 minutes.

    Any idea how to implement good, reliable distance measuring

    Maybe it would be interesting for you to measure a smaller distance (0.02-1 m) with better resolution? You could use a VL53L0X in that case. I've used it and found it to be not bad. It can be used at longer distances too, but only under particular conditions, as far as I remember.
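
    If you go that way, reading it from an Espruino board looks roughly like this (untested sketch; the pins are placeholders for wherever SDA/SCL are wired, and I'd double-check the VL53L0X module docs):

    ```js
    // Untested sketch using Espruino's VL53L0X module over I2C;
    // D1/D2 are placeholder pins for your wiring.
    I2C1.setup({ sda: D1, scl: D2 });
    var laser = require("VL53L0X").connect(I2C1, {});
    setInterval(function () {
      // performSingleMeasurement() should return the distance in millimetres
      console.log(laser.performSingleMeasurement().distance + " mm");
    }, 200);
    ```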

  • but can this also run on a Windows 10 machine in case I have to use one?

    Possibly, yes. However, you would have to mess around with Zadig to ensure you had a BLE adaptor that could talk directly to Node.js. At $10 for a Raspberry Pi Zero W (almost the same cost as a BLE dongle), I'd say it's a no-brainer to just use one of those instead.

    Just in case you need more info on distance sensors, there's a bunch of stuff on the Espruino site:

  • Yes, you could test it with your phone first.
    @Gordon suggested laser sensors, which would be my choice too.
    I am wondering what you want to use the ranging for. From your initial project description I suspect you are trying to get not only rotation/inclination but also translation (movement in a line)? If that's true, I think this project might be more challenging. You get into stuff like SLAM and sensor fusion.

  • I like the

    Micro Mini 3.3 V Volt Ultraschall RCW-0001 Ultrasonic Sensor

    It's the same as the HC-SR04, just smaller, and it runs at 3.3 V.
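
    On an Espruino board it should read roughly like this (untested sketch; trigger/echo pins are placeholders for your wiring):

    ```js
    // Untested sketch using Espruino's HC-SR04 module; the RCW-0001 should
    // behave the same. D1 (trigger) and D2 (echo) are placeholder pins.
    var sensor = require("HC-SR04").connect(D1, D2, function (dist) {
      console.log(dist + " cm"); // called once per trigger with the distance in cm
    });
    setInterval(function () {
      sensor.trigger();          // send an ultrasonic pulse ~10 times a second
    }, 100);
    ```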

  • Thank you all for the great/helpful suggestions so far! I will do some basic tests with my phone first, and possibly end up with a Puck.js. @kri100s I was just thinking of expanding the interaction with my three.js artwork by linking the distance (user <> artwork) to the camera zooming in/out - so I guess (hope) this will not be very complicated/challenging.
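
    Roughly what I have in mind for the zoom (a sketch only; the 0.5-4 m range and the existing camera object are assumptions about my scene):

    ```js
    // Map a measured user distance (metres) onto the camera's z position.
    // 'camera' stands in for the scene's existing THREE.PerspectiveCamera.
    function updateZoom(distanceMeters) {
      const clamped = Math.min(Math.max(distanceMeters, 0.5), 4.0);
      const t = (clamped - 0.5) / (4.0 - 0.5); // 0 = closest, 1 = furthest
      camera.position.z = 2 + t * 8;           // e.g. dolly between z = 2 and z = 10
    }
    ```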

  • @kri100s ...as for testing with my phone: I spent the last 3 hrs trying to set up a first connection, but my Samsung Galaxy S8 is not being discovered by the Web Bluetooth API so far. I was not able to find any similar problems described on the web for the same setup. I even tried pairing my phone to my laptop... no use... I tried this: https://googlechrome.github.io/samples/web-bluetooth/device-info.html?allDevices=true and other examples as well...

  • I never tried Web Bluetooth with a phone as the device being connected to. I imagine the phone would have to make itself discoverable. The hardware should be capable of it. I am guessing you need to run some app on the phone or enable a setting.

  • Yeah, I just figured out that unfortunately this won't work "out of the box" (https://stackoverflow.com/questions/55018232/web-bluetooth-cannot-detect-my-mobile-phone-galaxy-note-9)... For me, the solution of running an app on the phone could only work if the app broadcasts the smartphone's gyro data - so far I couldn't find any such app in the app store, and I am not capable of writing one, so I would need to order a Puck.js with no prior testing...

  • Just so I understand it the right way:

    1. Would an ultrasonic sensor cover a certain width/range with multiple people in front of the screen/installation when calculating the distance? What I mean is: if, for example, there are 5 people watching the screen/installation, and they are, let's say, half a meter to one meter apart from each other, some closer to and some further away from the screen/installation, would the ultrasonic sensor fire on the one who is closest?

    2. What maximum width/range can I expect? I don't know what the correct term is, but I don't mean the distance from the sensor, which as far as I have read is a maximum of 2-4 m (depending on the sensor used).

    3. Would the width/range behaviour I expect from the ultrasonic sensor (I expect it, but it might be a misunderstanding of the concept on my side) also apply to a laser sensor, or is the laser sensor something like a light barrier, i.e. working only in a narrow area exactly in front of the sensor?

  • the ultrasonic sensor would fire on the one who is closest?

    Yes, that's right

    What max. width/range can I expect?

    I can't remember for sure but IIRC it's got around a 90 degree FOV and max distance of around 1.5m.

    Some have datasheets which will tell you for sure: https://cdn.sparkfun.com/datasheets/Sensors/Proximity/HCSR04.pdf

    That one says up to 4m, but I'm not 100% sure I trust it - that'll be if you're wearing a sandwich board :)

    is the laser sensor ... working only on a certain width/range exactly in front of the sensor?

    It depends on the sensor again - they usually say in the datasheet. IIRC the laser sensor's beam is a bit narrower (30 degrees?) and maybe doesn't reach as far as the ultrasonic one, though
