-
• #2
Interesting update. This time when I called eraseAll I got this instead. I think I've corrupted the system overall, better start from scratch, brb.
UPDATE: Please ignore this; possibly a red herring from an unclean file system.
[255] ?[255] ?[255] ?[255] ?[255] ?[255]...
^ in function called from system
Uncaught SyntaxError: Got ?[255] expected EOF at line 1 col 1
?[255] ?[255] ?[255] ?[255] ?[255] ?[255]...
(the ?[255] run and the same SyntaxError repeat for several more screens)
^ in function called from system at line 1 col 1
-
• #3
I think point 3 above is resolved after restructuring my code (ref: comment in my other post). Now to see if I get the large file issue again.
-
• #4
I got the 'file too big' error again and unfortunately, since it wasn't handled, it led to the file system getting corrupted again. The only way to recover is
require("Storage").eraseAll()
and start from scratch. I'll put in some defensive code to handle the 'file too big' error and try a file rollover until
require("Storage").getFree()
returns a lowish number.
-
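The rollover idea in the post above could be sketched roughly like this. On the Bangle the storage object would be require("Storage"); here it is injected so the logic can be exercised anywhere. The file-name scheme, the minFree threshold and the catch-all error handling are illustrative assumptions, not from the post.

```javascript
// Sketch: defensive log writer with rollover (assumed API shape, not
// a definitive Espruino implementation).
function makeLogger(storage, prefix, minFree) {
  var index = 0; // current rollover file suffix (assumed naming scheme)
  return function write(line) {
    try {
      storage.open(prefix + index, "a").write(line + "\n");
    } catch (e) {
      // Probably the 'file too big' error: roll over to a fresh file,
      // but only while a sensible amount of flash remains free.
      if (storage.getFree() > minFree) {
        index++;
        storage.open(prefix + index, "a").write(line + "\n");
      }
      // else: drop the sample rather than risk corrupting the FS
    }
  };
}
```

On the watch this would be created as e.g. `makeLogger(require("Storage"), "ftclog", 4096)` and called once per sample.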
• #5
The ?[255] error comes when you had code saved to flash which you were executing from, and then you erase the flash memory while you're still executing that code :) If, after an eraseAll, you do reset() or load(), everything will be fine.
I think I should probably just get rid of the 'My Files' tab now since the IDE handles it in a more sane way (and allows you to download StorageFile files directly). Do you have any objections to that?
In terms of file size, I believe each 'chunk' is about 4k, and you get around 255 chunks - so you can save about one megabyte of data. Right now only the first megabyte of flash is actually exposed to JS apps, so that's really all you can store.
1MB of data is a lot though - it'll take ages to download. Do you really need to store data at that granularity?
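As a quick sanity check on those figures (both the ~4k chunk size and the ~255 chunk count are the post's approximations):

```javascript
// Back-of-the-envelope check of the StorageFile limit described above.
var chunkSize = 4096; // ~4k per chunk, per the post
var maxChunks = 255;  // ~255 chunks, per the post
console.log(chunkSize * maxChunks); // 1044480 bytes, just under 1MB
```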
-
• #6
The ?[255] error comes when you had code saved to flash which you were executing from, and then you erase the flash memory while you're still executing that code :) If, after an eraseAll, you do reset() or load() everything will be fine.
Bingo, that did it... Thanks.
I think I should probably just get rid of the 'My Files' tab now since the IDE handles it in a more sane way (and allows you to download StorageFile files directly). Do you have any objections to that?
Nopes, no issues. I will take efforts to figure out how the IDE handles it though :D (I saw the post, just haven't gotten around to reviewing it yet).
In terms of file size, I believe each 'chunk' is about 4k, and you get around 255 chunks - so you can save about one megabyte of data. Right now only the first megabyte of flash is actually exposed to JS apps, so that's really all you can store.
That's cool. Knowing limits helps work backwards :-). I shall now make efforts towards being more succinct in my data storage attempts. It is a watch after all :-)
1MB of data is a lot though - it'll take ages to download. Do you really need to store data at that granularity?
True. It does take a while (I haven't messed with baud rates, so it's whatever the default speed is) to transfer the maxed-out file. However, it looks like my current inefficient data storage is filling up in about 8-10 hours of HR and step counts. I would love for it to last a day. So my task is cut out :-)
This post is resolved for everything I started with, but one last question if I may: any plans to make the rest of the memory available to apps? If not via StorageFile, maybe a new namespace like Archive or something, so we can move files from the 'active' storage area to an 'Archive' area from time to time? 4Mb will also run out at some point... just good to know :-)
-
• #7
Any plans to make the rest of the memory available to apps?
Yes :) Initially the idea was that I'd leave a big chunk free so that it could be used as needed (eg to store massive amounts of data) but I believe that StorageFile has pretty much negated the need for that now. Accessing the flash directly would always have been a hack, as having 2 apps that used it at once would almost certainly have caused corruption.
Looks like my current inefficient data storage is filling up in about 8-10 hours of HR and Step counts.
I think that's something that could definitely be improved on! 1 megabyte in 10 hours is around 30 bytes every second! That's insanely detailed - like you're storing 30 bytes of data for every heartbeat!
I'm not sure what you're planning on the PC end of things to analyse this, but I'd imagine that storing heart rate and step count data for every minute of the day would be more than enough - and even if you stored that in text form, it'd be maybe 10*60*10 = 6kB, not 1MB :)
-
• #8
I think that's something that could definitely be improved on! 1 megabyte in 10 hours is around 30 bytes every second! That's insanely detailed - like you're storing 30 bytes of data for every heartbeat!
There's the enterprise dev (in me) kicking in again... I was extra cautious to have correct CSV format, so every field in every line was wrapped in an opening quote and a closing quote #facepalm...
- Just getting rid of the quotes from the file reduced it by 100K. I am actually saving all the accel values, so x, y, z, g and delta.
- Changed row headers from HRM to HR and ACCL to AL, another 10K savings.
- My Date/Time format is a mess, I am going to change that up to save some more.
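The quote overhead is easy to see on a single line; the field values below are made up purely for illustration:

```javascript
// Quoting every field costs 2 bytes per field, per line (invented values).
var quoted   = '"AL","1578234000","0.12","-0.98","0.05","1.01","0.02"';
var unquoted = 'AL,1578234000,0.12,-0.98,0.05,1.01,0.02';
console.log(quoted.length - unquoted.length); // 14 bytes saved per 7-field line
```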
The file currently has ~12K lines of data, of which:
- 7364 lines are accelerometer values (saved at every step event)
- 396 are GPS entries (loads of NaN that can be trimmed)
- 4728 are Heart Rate entries
Logging started at about 09:07 AM and stopped at 16:19.
I am saving full resolution of the floating point data I get so yeah each line is pretty detailed.
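One cheap saving on those full-resolution floats is truncating them before writing; the reading below is invented for illustration:

```javascript
// Trimming float precision before logging (made-up accelerometer value).
var g = 1.0023456789;
console.log(String(g).length); // 12 characters as currently logged
console.log(g.toFixed(2));     // "1.00" - 4 characters, likely enough
```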
Looks like I can't really save a full day of heart rate + accelerometer + GPS data in the current format. I'll see what further optimisations I can do or what data I can live without.
There is nothing on the desktop/analysis side yet apart from a prototype to map GPS points on a map using MapBox. But I want a closer look at accelerometer readings and heart rate, and to find correlations like low movement + low heart rate = sleep quality and stuff like that. Also correlations between heart rate and GPS values when cycling etc. Basically mimic as much of the Fitbit functionality as possible.
I'll keep working on optimising data collection until we have access to the rest of the memory via StorageFile :-). Also I'll aim to sync at least twice a day, so it is just a matter of managing entries for 8 hours.
Update:
Looks like it clocked off at about 2:15 pm today... that too at 675KB only... hmm... very curious... I'll keep digging. First I need to fix this need for eraseAll after it has errored out!
I am having a few issues with large files.
"My files" tab seems to crash if it tries to show all the segments, with an error like follows:
The above file ftclog is actually less than an Mb. But Bangle stops writing to it after a certain point and throws a 'file too big' error. Question is, how big a file can I write? Should it be 4Mb minus firmware size? Is that transparent if I use StorageFile, or are there some tricks I am missing?
Update: Removed 3rd point which is unrelated to large files.