• Nope, no magic corruption :) Only different decimal precision.

    Look at the code at lines L57-L60:

    /*57*/ var newDiv = 0xE4C000;
    /*58*/ var msband = newDiv & 0xFFFF0000;
    /*59*/ var midand = newDiv & 0x0000FF00;
    /*60*/ var lsband = newDiv & 0x000000FF;
    

    Going backwards from line 60 to 59 to 58:
    These lines assign 3 variables using the value of a variable called newDiv and some bitwise operations. What's the value of newDiv at that moment in time? It's exactly 0xE4C000, because you just assigned that value at L57.
    Anything that happened before is irrelevant.
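
    As a quick check, these are the values those masks return for the literal (a sketch run under Node, with console.log standing in for print):

    var newDiv = 0xE4C000;
    console.log((newDiv & 0xFFFF0000).toString(16)); // "e40000"
    console.log((newDiv & 0x0000FF00).toString(16)); // "c000"
    console.log((newDiv & 0x000000FF).toString(16)); // "0"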

    To demonstrate it (start with correct.js):

    Add var bla = newDiv to L45
    Add print('bla == 14991360:', bla == 14991360) to L66

    It will print bla == 14991360: false, so the bla variable still holds that almost-14991360-but-not-exactly value.
    Just type bla-14991360 and you get -0. Or type (bla-14991360)*1e9 and you get the same -1.86264514923.
    No corruption here, as far as I can tell, only different fraction handling.
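
    To spell out what that tiny fraction would do if it ever reached a bitwise operator, here is a sketch (run under Node with console.log in place of print; the 2^-29 gap is my reading of the -1.86264514923 above, and the variable names are mine):

    var exact  = 0xE4C000;                    // 14991360
    var almost = 14991360 - Math.pow(2, -29); // the double one step below 14991360

    console.log(almost === exact);            // false
    console.log((almost - exact) * 1e9);      // -1.862645149230957 (the -1.86264514923 above, more digits printed)

    // A bitwise operator first converts its operand to a signed 32-bit integer,
    // truncating the fraction toward zero, so almost becomes 14991359 (0xE4BFFF)
    // while exact stays 0xE4C000, and two of the three masks come out differently:
    console.log((exact  & 0xFFFF0000).toString(16)); // "e40000"
    console.log((almost & 0xFFFF0000).toString(16)); // "e40000"
    console.log((exact  & 0x0000FF00).toString(16)); // "c000"
    console.log((almost & 0x0000FF00).toString(16)); // "bf00"
    console.log((exact  & 0x000000FF).toString(16)); // "0"
    console.log((almost & 0x000000FF).toString(16)); // "ff"

    In correct.js that never happens to the masks, because L57 overwrites newDiv with the exact literal before L58-L60 run.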

  • 'It's exactly 0xE4C000, because you just assigned that value at L57'

    Yes, and now you have proved my point. We are masking bits with identical hex values. In either case the answer should be identical, as we are dealing with bits. That's the reason for performing a mask operation: to detect which bits are set.
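
    (A minimal sketch of that flag-detection idea; the flag names and values are just illustrative:)

    var FLAG_MID = 0x0000FF00;              // hypothetical flag masks
    var FLAG_LSB = 0x000000FF;
    var value = 0xE4C000;
    console.log((value & FLAG_MID) !== 0);  // true  (a bit inside 0x0000FF00 is set)
    console.log((value & FLAG_LSB) !== 0);  // false (no bit inside 0x000000FF is set)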

    'as far as I can tell, only different fraction handling'

    That case is using a floating-point value for input. The example I provided used the integer equivalent of the same hex notation. I left out the parseInt() conversion as the browser tests didn't need it.
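
    (For reference, the kind of conversion I mean; assuming the value arrives as a hex string, which is an illustration rather than the exact code:)

    var asInt = parseInt("E4C000", 16);   // hypothetical input string
    console.log(asInt === 0xE4C000);      // true, both are 14991360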

    We have the Node.js test, the Firefox browser test and my Chrome browser test, which produce identical, correct results in these three examples. In this instance, on one device using a different mechanism, we get different, conflicting results. We shouldn't.

    I'm reading up at mozilla . . .

    https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Bitwise_Operators

    "Bitwise operators treat their operands as a sequence of 32 bits (zeroes and ones), rather than as decimal, hexadecimal, or octal numbers."

    "The operands of all bitwise operators are converted to signed 32-bit integers in two's complement format, except for zero-fill right shift which results in an unsigned 32-bit integer. "

    As these are numbers, it shouldn't matter whether they are written in decimal or in hex notation. The browser tests show this. (A thought: aren't integers stored as floating point under the hood? Hmmm.)
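
    (A quick sketch of both points; nothing here is specific to one engine:)

    // A hex literal and its decimal equivalent are the same Number value;
    // all JavaScript numbers are IEEE 754 doubles, so the notation only matters at parse time.
    console.log(0xE4C000 === 14991360);                               // true
    console.log((0xE4C000 & 0x0000FF00) === (14991360 & 0x0000FF00)); // true, both 0xC000
    // The bitwise operator then converts that double to a signed 32-bit integer, as MDN says.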

    I did notice that the Mozilla page uses decimal integer constants for its flags, rather than hex notation as I did. Could this be a difference between C/C++ and JavaScript?

    Testing that . . .
