Thanks a lot, @fanoush and @Gordon for your replies.
Before I received the responses here, I had tried reading the file using indexOf.
It's a file with one column and 17K rows. I wanted to separate out these 17K values and store them in an array, since I need to analyze them and compute something further.
Since 17K is a very large number, I started by reading only the first 500 values. I could separate out every value from the file based on the carriage return and print all of them with no problem.
However, the moment I try to store these 500 values in an array, I hit the low-memory problem.
It works fine if I store only 100 values in the array, but the computation I need to do on this data requires at least 500 values, because I am looking for a pattern and can't analyze it with only 100.
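For reference, this is roughly the kind of code I am using. It's only a sketch: the file name "data.csv" and the chunk size are placeholders, and I'm assuming the file sits on an SD card readable via E.openFile (if it were in internal Storage the read call would be different).

```javascript
// Sketch of my current approach (file name and chunk size are placeholders).
var f = E.openFile("data.csv", "r");   // assumes the file is on an SD card
var values = [];                       // this array is what runs out of memory
var leftover = "";
var chunk;
// Read the file in small chunks until 500 values are collected or EOF is hit
while (values.length < 500 && (chunk = f.read(128)) !== undefined) {
  var parts = (leftover + chunk).split("\r"); // values separated by carriage return
  leftover = parts.pop();                     // keep any partial value for the next chunk
  for (var i = 0; i < parts.length && values.length < 500; i++) {
    var v = parseFloat(parts[i]);
    if (!isNaN(v)) values.push(v);
  }
}
f.close();
print("Read", values.length, "values");
```

Printing the parsed values as they come in works fine; it's only once they are pushed into the array that memory runs out.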
Is there any way I could achieve this, i.e. store 500 values, or at least 300?
Thanks a lot once again for all your help!