charlie

Member since Jul 2023 • Last active Sep 2023
  • 4 conversations
  • 24 comments

Most recent activity

  • in Bangle.js

    So it's 8x slower than just using an Array, and honestly I'm amazed
    it's that good.

    At first I did just use an array, but I need to limit the size, otherwise the memory is very quickly consumed (think sampling GPS data, but only keeping recent points because you want to reduce noise). You are speed-testing two different things: you are saying that pushing to an array is quicker than pushing to an array and then pruning it, which is obviously true. If you have a quicker way of pushing and then resizing, that would be more constructive.

    On Espruino as far as I can see you could just use Array directly, and
    if you really need to limit the size of the queue you could override
    .push to just call splice after it to limit the length.

    I am literally doing the same thing as splice after push (via shift). Originally I extended Array rather than having an internal array variable, but the Espruino compiler wasn't releasing memory when splice or shift were called.
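
    To be clear about the pattern being discussed, here is a minimal sketch of that shift-on-push approach (BoundedQueue is just an illustrative name, not the actual app class):

    // Illustrative sketch only: push, then drop the oldest entries so the
    // queue never grows past its limit (equivalent to splicing after push).
    class BoundedQueue<T> {
        private items: T[] = [];
        constructor(private itemLimit = 3) {}

        push(item: T): number {
            this.items.push(item);
            while (this.items.length > this.itemLimit) {
                this.items.shift();
            }
            return this.items.length;
        }
    }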

    I mean, you look at isEmpty, it's calling this.any() which calls
    _this.length which is a getter which calls this.internalArray.length. It's just a nightmare - it's like someone wrote the code specifically
    to waste CPU cycles - even on Node.js it's going to suck
    If I want to know whether my array is empty I need to do a check in my code; I still need to call array.length. I could repeat that check every time (which wouldn't speed up the processing), or I can add an additional layer which can be individually unit tested and optimised further in the future.

    Ultimately you might save some CPU time with large chunks of repeated code, but it will stop you from creating good, stable apps which are easy to maintain and extend. Most of the apps in the current app library are not extendable or maintainable. You need to be able to split the code into small segments to unit test; otherwise you are taking a step back to pre object-oriented, pre test-driven development.
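
    To illustrate the unit-testing point, something like this (Jest-style syntax, shown purely as a sketch) can exercise that extra layer on its own:

    // Hypothetical unit test for the isEmpty()/push() layer (illustrative only).
    import { Queue } from "./constructs/queue";

    describe("Queue", () => {
        it("reports empty until something is pushed", () => {
            const q = new Queue<number>(3);
            expect(q.isEmpty()).toBe(true);
            q.push(1);
            expect(q.isEmpty()).toBe(false);
        });

        it("never grows beyond its item limit", () => {
            const q = new Queue<number>(3);
            q.push(1, 2, 3, 4, 5);
            expect(q.length).toBe(3);
        });
    });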

  • in Bangle.js

    Yeah, if I have time I will make a simplified example and set a timeout as you suggest.

  • in Bangle.js

    I have narrowed down what is triggering the low memory (when run from RAM). I have the following class, which basically shifts the queue if an item is pushed and the queue is over the max size:

    export class Queue<T> {
        protected itemLimit: number;
        protected internalArray: Array<T>;

        constructor(itemLimit = 3) {
            this.itemLimit = itemLimit;
            this.internalArray = [];
        }

        // push(...items) appends and then splices from the front so the array
        // never exceeds itemLimit (the full compiled output is shown further down)
    }
    

    If I instantiate 2 or more of these like this:

    this.waypoints = new Queue<IWaypoint>(10);
    this.waterways = new Queue<ILocalisedFeature>(10);
    

    ...the memory bottoms out, which is curious, as even if I comment out the initialisation of the array it still bottoms out.
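
    One way to narrow this down further is to log free memory around each instantiation; process.memory().free reports the number of free JsVars on Espruino, and the logging helper here is just illustrative:

    // Illustrative memory probe around the two instantiations above.
    function logFree(label: string): void {
        console.log(label, process.memory().free);
    }

    logFree("before waypoints");
    this.waypoints = new Queue<IWaypoint>(10);
    logFree("after waypoints");
    this.waterways = new Queue<ILocalisedFeature>(10);
    logFree("after waterways");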

    The actual non-minified compiled TypeScript looks as follows:

    /***/ './src/constructs/queue.ts':
    /***/ function (__unused_webpack_module, exports) {

        Object.defineProperty(exports, "__esModule", ({ value: true }));
        exports.Queue = void 0;
        var Queue = /** @class */ (function () {
            // itemCount = 0;
            function Queue(itemLimit) {
                if (itemLimit === void 0) { itemLimit = 3; }
                var _this = this;
                this.lastN = function (count) {
                    if (count === void 0) { count = 2; }
                    var newAry = [];
                    for (var i = _this.length - 1; i >= Math.max(0, _this.length - count); i--) {
                        newAry.push(_this.internalArray[i]);
                    }
                    return newAry;
                };
                this.any = function () {
                    return _this.length > 0;
                };
                this.lastEntry = function () {
                    try {
                        return _this.internalArray[_this.length - 1];
                    }
                    catch (err) {
                        return null;
                    }
                };
                this.firstEntry = function () {
                    return _this.internalArray[0];
                };
                this.lastMinus = function (numberFromEnd) {
                    if (numberFromEnd === void 0) { numberFromEnd = 0; }
                    return _this.internalArray[_this.length - 1 - numberFromEnd];
                };
                this.clear = function () {
                    _this.internalArray.length = 0;
                };
                this.asArray = function () {
                    return _this.internalArray;
                };
                this.itemLimit = itemLimit;
                this.internalArray = [];
            }
            Object.defineProperty(Queue.prototype, "length", {
                get: function () {
                    return this.internalArray.length;
                },
                enumerable: false,
                configurable: true
            });
            Queue.prototype.isEmpty = function () {
                return !this.any();
            };
            Queue.prototype.push = function () {
                var _a;
                var items = [];
                for (var _i = 0; _i < arguments.length; _i++) {
                    items[_i] = arguments[_i];
                }
                var n = (_a = this.internalArray).push.apply(_a, items);
                if (this.itemLimit != null && this.itemLimit > 0) {
                    this.internalArray.splice(0, this.length - this.itemLimit);
                }
                return this.length;
            };
            return Queue;
        }());
        exports.Queue = Queue;
        //# sourceURL=webpack://ck_nav/./src/constructs/queue.ts?

    /***/
    },
    

    I am wondering if the get overrides (such as below), which result in the Class.prototype / Object.defineProperty stuff, are the cause, but I need to test more.

      get length() {
          return this.internalArray.length;
      }
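
    One way to test that hypothesis (sketch only) would be to swap the getter for a plain method, so the compiler emits no Object.defineProperty call for it:

    // Plain method instead of a getter; callers use q.length() rather than q.length.
    length(): number {
        return this.internalArray.length;
    }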
      
  • in Bangle.js

    loadRoute is doing two things: it's reading the GPX file with Storage.read, and parsing additional JSON files with jsonRead. In both cases I am parsing the data into local constructs. The memory used by both of these functions is correctly released once outside the scope.

    I actually think the memory which is released is memory the Bangle is using when loading up the main JS file on app load.
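
    For context, the reads follow this general pattern, where everything is local to the function scope (sketch only; parseGpx stands in for the actual parsing code):

    // Sketch of the scoping pattern: the raw file text and intermediate data are
    // local, so Espruino can free them once loadRoute() returns.
    function loadRoute(name: string) {
        const gpxText = require("Storage").read(name); // raw GPX file contents
        const route = parseGpx(gpxText);               // parseGpx is illustrative
        return route;                                  // only the parsed constructs survive
    }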

  • in Bangle.js

    My first thought was that the Bangle was keeping files read from storage (which I do in "loadRoute") in memory and that garbage collection for this was somehow delayed. However I checked, and this memory is released as soon as the scope ends.

  • in Bangle.js

    The printMem function was defined within the class instance, which was running for the entire length of the program, so any variables local to that should stay locked in memory; this shouldn't be a factor.

    In your example "this" could have changed, so the two calls could print different results if printMem were defined differently. Arrow functions were introduced to guarantee that "this" stays the same. So:

    const app = new App();
    app.setup() //"this" will be instance of the App
    app.printMem() // will be same as this.printMem inside setup
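
    A minimal sketch of the difference (names are illustrative):

    class App {
        label = "nav";
        // Regular method: "this" depends on how it is called.
        printMemMethod() { console.log(this.label, process.memory().free); }
        // Arrow-function property: "this" is captured from the instance.
        printMemArrow = () => { console.log(this.label, process.memory().free); };
    }

    const app = new App();
    setTimeout(app.printMemArrow, 1000);  // still prints "nav" plus free memory
    setTimeout(app.printMemMethod, 1000); // "this" is no longer the App instance here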
    
  • in Bangle.js

    Setting the garbage collection argument to true or false seems to make no difference (in any of the process.memory() tests I have done).

  • in Bangle.js

    I agree that actually implementing it in practice would be difficult. I could actually do this with TypeScript transformers, using custom decorators at the top of methods (e.g. @themed({themeOptions})), which could inject lines at the top of the method to set theme options and lines at the bottom to reset them when compiling.
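
    A rough sketch of the idea with a runtime decorator (the transformer version would inject the equivalent lines into the compiled method body; applyTheme and restoreTheme are hypothetical helpers, not real Bangle.js APIs):

    // Hypothetical stand-ins for whatever sets and resets the drawing theme.
    function applyTheme(options: object): void { /* set theme options */ }
    function restoreTheme(): void { /* reset to the previous theme */ }

    // @themed({...}) wraps the method so the theme is set on entry and reset on exit.
    function themed(options: object) {
        return function (target: any, key: string, descriptor: PropertyDescriptor) {
            const original = descriptor.value;
            descriptor.value = function (...args: any[]) {
                applyTheme(options);              // injected "at the top of the method"
                try {
                    return original.apply(this, args);
                } finally {
                    restoreTheme();               // injected "at the bottom" to reset
                }
            };
            return descriptor;
        };
    }

    class Compass {
        @themed({ invert: true })
        draw() { /* drawing code */ }
    }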
