Hi,
I’m having an issue with JavaScript’s number system: the following code
console.log(num % 10, (num + 1) % 10);
outputs 0 0 once num exceeds JavaScript’s 53-bit integer precision (e.g. num = 87521618088882544408046480).
I need big-number arithmetic to encode and decode ASCII text. For example, 87521618088882544408046480 is meant to be “Hello World”: it equals 0x48656c6c6f2059e834ab90, i.e. the bytes “\x48\x65\x6c\x6c\x6f\x20…”, which decode to “Hello ” followed by garbage. Numbers seem to become unusable above something like 87521618088882544000000000. Can someone help me?
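A sketch of one way to do this, assuming a modern JS engine with BigInt support (the encode/decode helper names are mine, not from any library): regular Numbers are IEEE 754 doubles and only keep 53 bits of integer precision, so big values get silently rounded; BigInt integers are arbitrary-precision and do not.

// Encode an ASCII string as a big-endian BigInt, one byte per character.
function encode(str) {
  let n = 0n;
  for (const ch of str) {
    n = n * 256n + BigInt(ch.charCodeAt(0));
  }
  return n;
}

// Decode such a BigInt back into the ASCII string.
function decode(n) {
  let s = "";
  while (n > 0n) {
    s = String.fromCharCode(Number(n % 256n)) + s;
    n /= 256n; // BigInt division truncates toward zero
  }
  return s;
}

const n = encode("Hello World");
console.log(n);                       // exact value, no rounding
console.log(n % 10n, (n + 1n) % 10n); // 2n 3n — the % test now works
console.log(decode(n));               // "Hello World"

Note that you cannot mix BigInt and Number in one expression (e.g. n % 10 throws a TypeError); the literals need the n suffix, and you convert explicitly with Number(...) or BigInt(...) at the boundary.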
Thanks for helping!