• @DrAzzy,

    Ic. Sorry for not better understanding the data structure conveyed by the shared code.

    Based on the above details, I would make an initial request that just delivers the frame and the code for subsequent - serialized - XHR requests, one for each of the 8 time series. The initial request delivers the list with the detail type of each time series. The XHRs have to be serialized: after the first one returns data, that data is put into a received history object, the second one is fired, and so forth. After the last one, you move on with the completely composed received History object and render it graphically, as you already do now.
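    To illustrate, the serialized requests could be chained roughly like this - a sketch only, where fetchSeries is a hypothetical stand-in for the actual XHR call:

```javascript
// Sketch of serializing requests: fire the next one only after the
// previous response has been stored in the history object.
// fetchSeries(type, cb) is a hypothetical stand-in for the real XHR;
// here it just returns dummy data asynchronously.
function fetchSeries(type, cb) {
  setTimeout(function() { cb([1, 2, 3]); }, 0);
}

function loadHistory(types, done) {
  var history = {};
  function next(tdx) {
    if (tdx >= types.length) return done(history); // all series received
    fetchSeries(types[tdx], function(data) {
      history[types[tdx]] = data; // store, then fire the next request
      next(tdx + 1);
    });
  }
  next(0);
}
```

    Only one request is ever in flight, which keeps the memory footprint on the receiving side predictable.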

    Before you go through this hassle, though, I'm not sure whether there isn't a streaming way... which you hint at with .drain() - because I recall that images can be sent, and they are a lot of data...

    Since everything is JS and thus single-threaded - at least logically - you could enforce that the writes of the result are broken up by each of the history types, and also throttle them in the following way:

                    ...
                    .....  
                    writeBegin(res);
    }                
    
    function writeBegin(res) {
      res.writeHead(200,head);
      var types = ["Temp","RH","Pressure","AirQual","Clear","Red","Green","Blue"];
      setTimeout(writeTypedHistory,10,res,0,types);
    }
    
    function writeTypedHistory(res,tdx,types) {
      var type = types[tdx];
      res.write((tdx === 0) ? '{"' : ',"'); // open object on first type, else separate
      res.write(type);
      res.write('":');
      res.write(JSON.stringify(History[type]));
      if (++tdx < types.length) {
        setTimeout(writeTypedHistory,500,res,tdx,types); // throttle: next type after 500ms
      } else {
        res.write('}'); // close object after last type
        writeEnd(res);
      }
    }
    
    function writeEnd(res) {
                    .....
                    ...
    

    Currently, throttling is done with a constant delay... but it could be variable or even driven by available memory.
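    A memory-driven variant could, for example, scale the delay by the amount of free memory - a sketch, where getFreeMem is a hypothetical accessor (on Espruino, process.memory().free could serve):

```javascript
// Sketch: derive the setTimeout delay from available memory instead of a
// constant. getFreeMem is a hypothetical stand-in for a real memory probe;
// the 1000-unit scale is an arbitrary illustration value.
function throttleDelay(getFreeMem, minMs, maxMs) {
  var free = getFreeMem();
  // plenty of free memory -> short delay; little free memory -> long delay
  var scale = Math.max(0, Math.min(1, free / 1000));
  return Math.round(maxMs - scale * (maxMs - minMs));
}
```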

@allObjects started