Any plans to port zlib?

Posted by @dave_irvine
  • Node has a fairly comprehensive zlib implementation: https://nodejs.org/docs/v0.6.2/api/zlib.html

    I'd love to at least be able to unzip a file on an SD card in place.

  • Not really - I did look at it, but if I recall correctly the memory requirements for compression are actually quite high.

    For decompression you could probably run a JavaScript zlib implementation if needed - not that it'd be that quick!

    Internally Espruino uses heatshrink when compressing/decompressing its code to flash memory - so at some point I would like to expose that to JavaScript.
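
    If that does happen, usage might look something like this (purely a sketch - E.compress / E.decompress are assumed names for a heatshrink wrapper, not an existing API):

    // Sketch only: E.compress / E.decompress are assumed names, and the
    // return types (a flat byte array / string) are assumptions too
    var original = "some reasonably repetitive data to squash";
    var packed = E.compress(original);               // assumed: heatshrink-compressed bytes
    var restored = E.toString(E.decompress(packed)); // assumed: the original bytes, back as a string
    console.log(restored == original);               // true if the round-trip works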

  • Yeah I was hoping to use something like https://github.com/imaya/zlib.js/blob/master/src/inflate_stream.js and pipe the output to a File stream, but it looks like your File.pipe only goes one way - out, not in.

  • I think there are simpler, single-file zlibs that you can just push data into one chunk at a time.

    pipe can go both in and out of files as well, but the implementation may not be totally node-compatible - if the target has a write function it'll get called repeatedly with chunks of data. I'm not sure what you've tried that didn't work, because you're obviously already piping out of the filesystem for your HTTP server.
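
    For example, something like this should work as a pipe destination - a rough sketch (the filename and the byte-counting target are just for illustration):

    // Rough sketch: pipe a file on the SD card into a custom target.
    // Anything with a write(data) method can be the destination -
    // write gets called repeatedly with chunks of data.
    var src = E.openFile("log.txt", "r");   // "log.txt" is just an example name
    var counter = {
      bytes: 0,
      write: function (data) { counter.bytes += data.length; }
    };
    src.pipe(counter, {
      chunkSize: 64,
      end: false,
      complete: function () {
        console.log("piped " + counter.bytes + " bytes");
        src.close();
      }
    });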

  • If you are looking to serve static content, you don't need the Espruino to do the heavy lifting - you can leave that to the much more powerful browser.

    On the SD card, store .gz versions of the files (compressed ahead of time on a PC), and then have the onPageRequest handler serve the compressed content:

    function onPageRequest(req, res) {
      var a = url.parse(req.url, true);
      console.log(a);

      var file = a.pathname;
      if (file == '/') file = '/index.htm';

      var headers = {};
      var f;
      // Try the file exactly as requested first...
      try {
        f = E.openFile('www' + file, "r");
      } catch (e) {
        console.log('no file');
        // ...then fall back to a pre-compressed .gz copy
        try {
          f = E.openFile('www' + file + '.gz', "r");
          headers['Content-Encoding'] = 'gzip';
        } catch (e) {}
      }

      if (f !== undefined) {
        // Pick a Content-Type based on the file extension
        var mime = 'text/html';
        if (file.substr(-2) == 'js') mime = 'application/javascript';
        if (file.substr(-3) == 'css') mime = 'text/css';
        if (file.substr(-3) == 'ico') mime = 'image/vnd.microsoft.icon';
        headers['Content-Type'] = mime;

        res.writeHead(200, headers);
        console.log('started ' + file);
        // Stream the file out in 512 byte chunks, then tidy up
        f.pipe(res, {
          chunkSize: 512,
          end: false,
          complete: function () {
            console.log("Complete");
            f.close();
            res.end();
          }
        });
      } else {
        res.writeHead(404, {
          'Content-Type': 'text/plain'
        });
        res.end("404: Page " + a.pathname + " not found");
      }
    }
    

    This assumes all of the web assets are in www (you could use sub-folders, e.g. www/css).

    As an example of fetching http://192.168.15.13/pure-min.css, the difference is:

    pure-min.css (31 KB): 2 secs to fetch
    pure-min.css.gz: 600 ms to fetch
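
    For completeness, the handler above just gets hooked up to Espruino's http module in the usual way, something like:

    var http = require("http");
    http.createServer(onPageRequest).listen(80);  // network/WiFi setup omitted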

  • That's very cool - thanks for posting it up! I imagine that'll be a huge help to a lot of people, and could be used for serving out of local flash if needed too.

  • Yes, serving up gzipped content is a great way to save some bytes down the wire.

    Unfortunately what I am looking for is fetching a bundle of content from a remote server, extracting that bundle onto the SD card, and then serving those individual files (each gzipped).
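
    The download half of that should be doable already by piping the HTTP response straight into a file on the SD card - a rough, untested sketch (the URL and filename are made up); it's the "extract the bundle" step that still needs a decompressor:

    // Rough sketch: fetch a remote file and stream it onto the SD card.
    // http://example.com/bundle.bin and "bundle.bin" are made-up names.
    require("http").get("http://example.com/bundle.bin", function (res) {
      var f = E.openFile("bundle.bin", "w");
      res.pipe(f, {
        chunkSize: 64,
        end: false,
        complete: function () {
          f.close();
          console.log("download finished");
        }
      });
    });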

  • I've tried a range of options to get this working, but pretty much every JS-only implementation of zlib is just too big to fit onto Espruino.

    If you could find the time to expose heatshrink that would be really helpful.
