I have a very large number in JavaScript, represented as a binary string:
var largeNumber = '11010011010110100001010011111010010111011111000010010111000111110011111011111000001100000110000011000001100111010100111010101110100010001011010101110011110000011000001100000110000011001001100000110000011000001100000110000111000011100000110000011000001100000110000011000010101100011001110101101001100110100100000110000011000001100000110001001101011110110010001011010001101011010100011001001110001110010100111011011111010000110001110010101010001111010010000101100001000001100001011000011011111000011110001110111110011111111000100011110110101000101100000110000011000001100000110000011010011101010110101101001111101001010010111101011000011101100110010011001001111101'
When I convert it to decimal using parseInt(largeNumber, 2), it gives me 1.5798770299367407e+199, but when I try to convert that back to binary:
parseInt('1.5798770299367407e+199', 2)
it returns 1, where I expected to see the original binary representation of largeNumber (I assume this is related to how parseInt rounds values). Can you explain this behavior, and how can I convert it back to the original in JavaScript?
Edit: This question came out of an experiment in storing and transmitting large amounts of boolean data. largeNumber is a representation of an array of boolean values like [true, true, false, true, ...] that has to be shared between the client, client workers, and the server.