The problem is actually twofold here... First, because of the way StorageFile works, you can't currently use character code 0xFF in it, which rules out compressed data - but even if you could, you can't just concatenate blobs of compressed data (it's like adding the bytes of two JPEG files together and hoping for an image twice as wide :) ).
In the example I gave above, I write to Storage as a single file, which should work fine. So for your example, do something like this:
var fileName = "log";
var fileNumber = 0;
// work out the next available file number
require("Storage").list(new RegExp("^"+fileName)).forEach(f=>{
var n = parseInt(f.substr(fileName.length));
if (n>=fileNumber) fileNumber=n+1;
});
// here are your records - ideally this is a big array
var records = new Uint8Array(32);
records.set([1,2,3,4,5,6,7,8,9]);
// write the data
require("Storage").write(fileName+fileNumber, require("heatshrink").compress(records));
Also worth noting that the compression gets better the more data there is, so you really need to write bigger chunks to get the best out of it (hence my suggestion of 5k). This might be helpful for some examples of packing data: http://www.espruino.com/Data+Collection#ram-dataview
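As a quick illustration of the packing idea from that link, you can use a DataView to lay out fixed-size binary records before compressing them. The field layout here (a 32-bit timestamp plus a 16-bit reading) is just an assumption for the sketch - use whatever your records actually contain:

```javascript
// Sketch: pack one record (assumed fields: 32-bit timestamp, 16-bit reading)
var buf = new ArrayBuffer(6);
var d = new DataView(buf);
d.setUint32(0, 1234567890); // timestamp (example value)
d.setUint16(4, 512);        // sensor reading (example value)
// the packed bytes can then go into your records array before compression
console.log(new Uint8Array(buf));
```

Packing like this keeps each record a fixed size, which also makes the data easy to parse again on the PC.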
To read the data, all you need to do is iterate through the files decompressing and printing them:
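Something like this should do it (a sketch that runs on the Espruino itself, assuming files named "log0", "log1", and so on as written above):

```javascript
var fileName = "log";
require("Storage").list(new RegExp("^"+fileName)).forEach(function(f) {
  // read the compressed file and decompress it
  var data = require("heatshrink").decompress(require("Storage").read(f));
  // print as base64 so the binary data survives the console
  console.log(btoa(data));
});
```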
In the example above I convert the data to base64 with btoa since it's binary, and when reading the data on the PC you can convert it back with atob. You could also just send the compressed data, and then decompress it on the PC with https://github.com/espruino/EspruinoWebTools#heatshrinkjs - which would obviously improve your download speeds.
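For the base64 route, decoding on the PC side is a one-liner in Node.js. The base64 string here is a made-up example (the encoding of bytes [1,2,3,4]) standing in for a line logged by the device:

```javascript
// Sketch: turn a base64 line printed by the device back into bytes (Node.js)
var line = "AQIDBA=="; // example value: base64 of [1,2,3,4]
var bytes = Buffer.from(line, "base64");
console.log(Array.from(bytes)); // [ 1, 2, 3, 4 ]
```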