JavaScript is frequently ridiculed when developers first encounter this seemingly baffling result:
0.1 + 0.2 == 0.30000000000000004
Memes about JavaScript's handling of numbers are widespread, often leading many to believe that this behaviour is unique to the language.
However, this quirk isn't limited to JavaScript; it is a consequence of how most programming languages handle floating-point arithmetic. Equivalent snippets in Java and Go, for instance, produce the same kind of result.
Computers can natively only store integers; they don't understand fractions. (How could they? The only way a computer can do arithmetic is by switching tiny lights on or off, and a light is either on or off; it can't be "half" on!) So computers need some way of representing floating-point numbers, and since that representation is not perfectly accurate, more often than not, 0.1 + 0.2 does not come out exactly equal to 0.3.
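To make this concrete in JavaScript (whose numbers are IEEE 754 double-precision floats), we can print more digits of the value actually stored for 0.1; a minimal sketch:

// The double actually stored for the literal 0.1, printed to 20 decimal places.
console.log((0.1).toFixed(20)); // "0.10000000000000000555"

// Arithmetic happens on these slightly-off values, so the comparison fails.
console.log(0.1 + 0.2 === 0.3); // false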
A fraction terminates in a given base only if every prime factor of its denominator is also a prime factor of that base; every other fraction has a repeating expansion. For example, in base 10, fractions like 1/2, 1/4, 1/5 and 1/10 are represented cleanly because their denominators are built only from 2 and 5, the prime factors of 10. Fractions like 1/3, 1/6 and 1/7, however, all have recurring decimals.
Similarly, in the binary system only fractions whose denominators are powers of 2, such as 1/2, 1/4 and 1/8, terminate; everything else, including 1/10 (0.1) and 1/5 (0.2), repeats forever. Because a computer can only keep a finite number of bits, these repeating expansions get cut off and rounded. Arithmetic is then performed on the rounded values, and when the binary result is converted back to a human-readable base-10 representation, the leftover rounding error shows up. This is what leads to results that are only approximately correct.
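As a rough sketch, you can watch this repeating pattern appear by generating the binary digits of 0.1 through repeated doubling (at each step the integer part, 0 or 1, is the next digit):

// Generate the first 20 binary digits of 0.1 by repeated doubling.
let fraction = 0.1;
let digits = "";
for (let i = 0; i < 20; i++) {
  fraction *= 2;
  if (fraction >= 1) {
    digits += "1";
    fraction -= 1;
  } else {
    digits += "0";
  }
}
console.log("0.1 in binary ≈ 0." + digits); // 0.00011001100110011001...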
Now that we've established that this problem is not exclusive to JavaScript, let's explore how floating-point numbers are represented and processed under the hood. To do that, we first need to understand the IEEE 754 floating-point standard.
The IEEE 754 standard is a widely used specification for representing and performing arithmetic on floating-point numbers in computer systems. It was created to guarantee consistent behaviour of floating-point arithmetic across computing platforms, and most programming languages and hardware implementations (CPUs, GPUs, etc.) adhere to it.
This is how a number is denoted in IEEE 754 format: (-1)^s × M × 2^E. Here s is the sign bit (0 for positive, 1 for negative), M is the mantissa (it holds the significant digits of the number) and E is the exponent, which determines the scale of the number.
You would not be able to find any integer values for M and E that can exactly represent numbers like 0.1, 0.2 or 0.3 in this format. We can only pick values for M and E that give the closest result.
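For instance, assuming the 24-bit mantissa of single precision, the closest you can get to 0.1 is M = 13421773 with E = -27. A small sketch to check that claim:

// The closest single-precision approximation of 0.1, expressed as M × 2^E
// with an integer mantissa M of at most 24 bits.
const M = 13421773;
const E = -27;
console.log(M * 2 ** E);         // 0.10000000149011612, close to 0.1 but not equal
console.log(M * 2 ** E === 0.1); // false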
Here is a tool you could use to determine the IEEE 754 notations of decimal numbers: https://www.h-schmidt.net/FloatConverter/IEEE754.html
IEEE 754 notation of 0.25: 0 01111101 00000000000000000000000
IEEE 754 notations of 0.1 and 0.2 respectively: 0 01111011 10011001100110011001101 and 0 01111100 10011001100110011001101
Please note that the error due to conversion in the case of 0.25 was 0, while 0.1 and 0.2 could only be stored with small, non-zero errors.
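You can also read these bit patterns without leaving JavaScript; a minimal sketch that reinterprets the 32-bit pattern through a DataView (the helper name float32Bits is just illustrative):

// Return the 32-bit (single-precision) pattern a number is stored as.
function float32Bits(x) {
  const view = new DataView(new ArrayBuffer(4));
  view.setFloat32(0, x); // round x to single precision and store it
  return view.getUint32(0).toString(2).padStart(32, "0");
}

console.log(float32Bits(0.25)); // 00111110100000000000000000000000 (exact)
console.log(float32Bits(0.1));  // 00111101110011001100110011001101 (rounded)
console.log(Math.fround(0.1));  // 0.10000000149011612, the value actually stored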
IEEE 754 defines the following formats for representing floating-point numbers:
Single-precision (32-bit): 1 bit for sign, 8 bits for exponent, 23 bits for mantissa
Double-precision (64-bit): 1 bit for sign, 11 bits for exponent, 52 bits for mantissa
For the sake of simplicity, let us consider the single-precision format that uses 32 bits.
The 32 bit representation of 0.1 is:
0 01111011 10011001100110011001101
Here the first bit represents the sign (0 which means positive in this case), the next 8 bits (01111011) represent the exponent and the final 23 bits (10011001100110011001101) represent the mantissa.
This is not an exact representation of 0.1; the value actually stored is 0.100000001490116119384765625.
Similarly, the 32 bit representation of 0.2 is:
0 01111100 10011001100110011001101
This is not an exact representation either; the value actually stored is 0.20000000298023223876953125.
When these two values are added and the result is rounded back to single precision, we get:
0 01111101 00110011001100110011010
which is ≈ 0.30000001192092896 in decimal representation, close to but not exactly 0.3. (JavaScript actually uses the double-precision 64-bit format, where the same rounding plays out with smaller errors; that is why 0.1 + 0.2 evaluates to 0.30000000000000004 rather than the value shown above.)
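To reproduce this single-precision walkthrough in JavaScript itself, Math.fround (which rounds a number to the nearest 32-bit float) can be applied after every step; a small sketch:

// Mimic single-precision arithmetic by rounding to 32 bits after each step.
const a = Math.fround(0.1);     // ≈ 0.10000000149011612
const b = Math.fround(0.2);     // ≈ 0.20000000298023224
const sum = Math.fround(a + b); // ≈ 0.30000001192092896
console.log(sum === 0.3);       // false, the stored result is not exactly 0.3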
In conclusion, the seemingly perplexing result of 0.1 + 0.2 not yielding 0.3 is not an anomaly specific to JavaScript, but a consequence of the limitations of floating-point arithmetic across programming languages. The roots of this behaviour lie in the binary representation of numbers, which inherently leads to precision errors when handling certain fractions.