The issue is actually the flash memory available to save code to. The STM32F4 chip's flash pages are arranged as 16, 16, 16, 16, 64, 128 and 128 kB. The first is needed for the bootloader, and the interpreter binary needs a contiguous chunk of flash memory to work from - limiting it to 256kB would be a bit tight, so it takes the 64kB page as well. That leaves only pages 2-4 for saved code, which is 48kB.
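For reference, the arithmetic above can be sketched out like this (page sizes as quoted above; which pages go where is an assumption based on the description, not something pulled from the linker script):

```javascript
// Flash page sizes in kB, as listed above for the Pico's STM32F4.
const pages = [16, 16, 16, 16, 64, 128, 128];

// Page 1 (index 0) holds the bootloader. The interpreter needs a
// contiguous run, so it takes the 64kB page plus both 128kB pages.
const bootloader = pages[0];                        // 16kB
const interpreter = pages[4] + pages[5] + pages[6]; // 320kB
const savedCode = pages[1] + pages[2] + pages[3];   // pages 2-4: 48kB

console.log(interpreter, savedCode); // 320 48
```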
You could do a build that uses the 64kB page for saved code as well (lowering the room for the interpreter itself to 256kB), but you'd have to reflash the Pico using DFU (see Advanced Reflashing on the Pico page) because you'd have to replace the bootloader as well.
Making use of more memory in a normal build (or in the future when the firmware gets bigger) is probably going to involve adding some kind of compression when saving to flash. Not impossible, but more work.
Have you tried turning on minification in the Web IDE? It'll make things a bit harder to debug, but it should save you a whole load of space. If not, just moving any comments you had inside functions to above the function would really help.
Thinking about compression though, can anyone think of a very compact compression library that could be included? So far miniz (which seems to be zlib compatible) looks like a good bet.
Obviously if it's going in the interpreter, it could be exposed as a JS module as well.
The entire code of the project is compressed using UglifyJS and loaded from the SD card. I separated most of the code into JS modules. Would it be possible to create a build without the WIZnet driver so that it fits into 256kB, so that I can use more RAM? And how is it possible that on the regular Espruino, which has 32kB RAM, I can use more JsVars with the Bigram build than here?
I am a bit frustrated, because if I don't find a solution to this problem, my project fails.
I'd actually be surprised if you save much memory by loading from SD card, unless you're able to clear the modules out of the cache when you're done with them?
On the original Espruino, RAM is 48kB, but the page sizes are all 1kB - it means that there's much more control over where the split of program + saved code is, and code can be put right at the end of flash memory.
Try downloading this file, and following the instructions for DFU that I'd linked in the last post. The command should look like:
dfu-util -a 0 -s 0x08000000 -D espruino_1v79.19_bigram_pico_1r3.bin
(or you can use ST's Windows GUI version - the important thing is shorting the BOOT0 jumper and pressing the button while booting).
It's the latest build, with 4860 variables in it. It might be possible to get a bit more, but hopefully that'll do you for now.
Thank you very much Gordon!
I already updated the chip and saw that there are now 4860 JsVars usable. To update it I had to remove it from my PCB, and currently I have no solder left to attach it to the PCB again. As soon as I've soldered it back onto the board, I'll test it with my project again and give you feedback on whether everything is working.
GAAAH! I knew it felt claustrophobic, but I thought I was just imagining things, and I've been so busy designing PCBs that I haven't had a chance to work on my next project (which would likely result
It's a pity the firmware has to be in contiguous pages... Is this a technical limit, or something that could be solved in the future?
Could this be done?
Put bootloader in first block
w/e in second
code in the other 16k pages and the 64k page.
Then 256k in the two 128k pages for firmware, and rip out wiznet and cc3k to make room. Or is this still not enough space?
If this is viable - I'll auto-build it nightly like the other builds I do if you tell me what arguments to compile it with... I think as people start doing projects with it, they're going to start bumping into the memory limit, particularly when using the ESP8266, because of the size of the AT and ESP8266WiFi modules (and those users don't need the CC3k/Wiznet support)
What do you mean by "w/e"?
I did try splitting the firmware originally, but I didn't manage to do it with a linker script. If someone knows how to do it I'd be interested - but if I'm honest it's probably a bit late now given the need to update the bootloader with DFU. Most people won't want to do that.
The whole firmware will currently fit in that 256k even with WIZnet (that's what I did with the binary in the build above) - I just think it may end up being tight in the future. I actually put a branch online: https://github.com/espruino/Espruino/tree/Pico_Extra_RAM
These are the changes that are needed: https://github.com/espruino/Espruino/commit/9c063c9b2b5088b3ddac5a3f7146dacabe670c51
But honestly I think compression is the best bet - I imagine I could reliably get 80kB of data into 48kB of flash, especially as it'll never be 100% full (or your program wouldn't run). Then it's just a simple firmware update, there's more flash for other stuff, you get a compression library built in and potentially we could squeeze a bit more into the original Espruino board too.
Any thoughts about suitable compression libraries? I'd probably need something that either wrote data using a callback, or that could decompress in chunks.
LZFX looks nice and small, but I'd have to mess around with it quite drastically in order to get it writing via callbacks.
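To illustrate the callback-writing shape being discussed, here's a minimal sketch in plain JS - not LZFX or miniz, just the simplest possible run-length scheme, with the decompressor handing each output byte to a callback so no big output buffer is needed:

```javascript
// Run-length encode: emit [count, byte] pairs. Works well on
// saved-code images that are mostly 0xFF (erased flash).
function rleCompress(data) {
  const out = [];
  let i = 0;
  while (i < data.length) {
    let run = 1;
    while (i + run < data.length && data[i + run] === data[i] && run < 255) run++;
    out.push(run, data[i]);
    i += run;
  }
  return out;
}

// Decompress in a streaming fashion: each output byte goes through
// `emit`, so the caller could write straight to flash with no buffer.
function rleDecompress(data, emit) {
  for (let i = 0; i < data.length; i += 2) {
    for (let j = 0; j < data[i]; j++) emit(data[i + 1]);
  }
}

// Example: 8 bytes of mostly-erased "flash" shrink to 6 bytes.
const image = [0x42, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x07];
const packed = rleCompress(image); // [1,0x42, 6,0xFF, 1,0x07]
const restored = [];
rleDecompress(packed, b => restored.push(b));
```

A real library would do far better than RLE on code, but the callback interface is the part that matters for writing into flash page by page.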
"w/e" is common abbreviation for "whatever" - As in, if we can't split the f/w up, there'd have been at least one wasted 16k page. I guess two, since you're citing 80kb (I forgot how much was "overhead" for the stack/etc)
Why does the bootloader need to be replaced in order to support using a different page for saved code? Doesn't the firmware tell the bootloader where the data needs to go? Or does Pico_Extra_RAM take the 16k page used by the bootloader, rather than the 64k page + a 16k page?
if we can't split the f/w up, there'd have been at least one wasted 16k page.
True - it's a bit of a waste using a 16k page for firmware anyway. It would have been great for a fake EEPROM though.
Doesn't the firmware tell the bootloader where the data needs to go?
Afraid not. The bootloader starts first (so the board can't be bricked) and then jumps to a fixed location - that location is the problem: it's at the start of the 64kB page. If we could keep the vector table at the start of the 64kB page (placing the saved code before and after it) we could still do it without changing the bootloader - but wow, it's a big hack :)
... it also makes it way more likely you'd need to reflash the firmware if there was a glitch while saving code... In fact it's worse than that, because I think right now the vector table is used straight out of flash, so the second you erased the page the device would lock up.
The new hacked firmware image worked well for me. It seems to work with the new amount of JsVars now. Thank you so much! How can these builds be created when you release new versions?
To make your own builds you'll have to set up your own build system - either using Ubuntu, or a VM with Ubuntu installed.
There are files so that you can use Docker and Vagrant but I haven't personally used those. There are instructions on the build process at https://github.com/espruino/Espruino though
I'd be happy to add these to the list of special builds that I auto-build nightly on drazzy.com - Do I need to do anything special, other than syncing that branch and compiling normally?
Like, if someone posted the git command to sync the branch, and the command to build it, I could get this set up real fast (just adding appropriate bits to text file). But I'm too busy right now to figure out what commands I need to use.
Well, you could probably do:
git clone https://github.com/espruino/Espruino.git
cd Espruino
git checkout Pico_Extra_RAM
git merge master
scripts/create_pico_1v3_image.sh # something like this
No guarantees it'll last for long I'm afraid.
I'd really love to use a JS PC emulator to let people download an image containing GCC that they could use to build their own Espruino image. Not sure how realistic that is at the moment though...
Okay, I'll try to throw that into my build script and hopefully it'll generate them. BTW - this is why I asked. It would have taken me a long time to figure out how to get the files from the branch down; I'm not very good with git still.
I'll put them at http://drazzy.com/e/espruino/PicoRam/index.php (link should be live tomorrow)
Commands do not work, yields this
What is the correct command to use instead of git merge master? It isn't scriptable because it opens an interactive editor.
And what is it going on about a commit for?
Just type :q here to close the editor without making a commit message. Then it should be ready to use.
What happens if you use the --no-commit parameter?
This works to build it, thanks!
git clone https://github.com/espruino/Espruino.git
cd Espruino
git checkout Pico_Extra_RAM
git merge master --no-commit
Unfortunately my autobuild scripts need to be fixed to deal with the new file version format. I hope to get autobuilding working this weekend.
Autobuild is set up and working again.
Sort order on existing nightlies is messed up due to the version change - but that'll sort itself out.
A stopgap for this - what if save() checked how many variables were in use, and refused to save if more than 3040 were in use? That way we could use more at runtime, as long as we didn't try to save more than we could fit. I mean, 4860-3040 is only 1620 JsVars, and if you have a program that requires 3000 JsVars when saved, realistically you're going to need another 500-1500 JsVars when you run it anyway...
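The check itself would be trivial. A sketch of the guard as pure logic (the names here are hypothetical; on a real Pico the `usage` figure would come from process.memory() and `saveFn` would be the actual save()):

```javascript
// Assumed capacity of the 48kB save area, in JsVars, per this thread.
const FLASH_CAPACITY_VARS = 3040;

// Hypothetical guard: only call the real save function if the image
// would actually fit in flash; otherwise report how far over we are.
function guardedSave(mem, saveFn) {
  if (mem.usage > FLASH_CAPACITY_VARS) {
    return { saved: false, overBy: mem.usage - FLASH_CAPACITY_VARS };
  }
  saveFn();
  return { saved: true, overBy: 0 };
}

// With 4860 vars available at runtime but only 3040 savable:
const r = guardedSave({ usage: 4000 }, () => {});
console.log(r.saved, r.overBy); // false 960
```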
Also, crud - builds for PicoRam aren't being created. I think I forgot to have the script delete the old build directory. Maybe it's failing because I'm not getting all the locks off the directory - this byzantine process where you have to change into the git repo's directory to get the branch instead of the trunk keeps tripping me up. It doesn't help that it takes 20-30 minutes to test the script.
Edit: nvm - just no version bump so far.
Yes, it'd be a good solution. It does depend on the variables that are used being right at the start of memory though - if you allocated something in those final 1620 vars then it'd fail.
... but if some kind of compression were added (even if it were only RLE!) it would really help with that.
I read that the Pico has 96kB of RAM, so I would expect to be able to use roughly 6000 JsVars.
What I can see when I call process.memory() is that I can only use 3040 JsVars, which must be 48kB. This is less than what I could use on the regular Espruino with a Bigram build - there I have 3250 JsVars.
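As a sanity check on those numbers - assuming each JsVar takes 16 bytes (an assumption, but it lines up with every figure in this thread):

```javascript
const JSVAR_SIZE = 16; // bytes per JsVar - assumed, but matches the figures here

const savedArea = 3040 * JSVAR_SIZE;    // 48640 bytes, i.e. the ~48kB save area
const bigramPico = 4860 * JSVAR_SIZE;   // 77760 bytes for the special build
const fullRam = 96 * 1024 / JSVAR_SIZE; // 6144 vars if all 96kB were JsVars

console.log(savedArea / 1024, fullRam); // 47.5 6144
```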
This is a major problem for me now. I have a highly complex application which was working with the CC3000 module and the Bigram build on the Espruino, close to the RAM limit. I switched to the Pico for the additional RAM (96kB total) so that I could use a JS-based driver for GPRS. When I now load the driver together with the rest of my application, I run out of memory. Is there a way to use more than 3040 JsVars on the Pico? Maybe with a special build?