Can (a == 1 && a == 2 && a == 3) Ever Be True in JavaScript?
This perplexing interview question challenges one's understanding of JavaScript's equality operator (==). The question asks whether it is possible to make the expression (a == 1 && a == 2 && a == 3) evaluate to true.
At first glance, it seems impossible, as a cannot equal three distinct values simultaneously. However, a clever trick exploits the malleable nature of JavaScript's == operator.
In JavaScript, == performs type coercion, which allows for unexpected value comparisons. By defining an object with a custom toString() or valueOf() method, one can control the primitive value produced each time the object is compared to a number.
For instance, consider the following code:
<code class="javascript">const a = { i: 1, toString: function () { return a.i++; } }; if(a == 1 && a == 2 && a == 3) { console.log('Hello World!'); }</code>
Here, the object a has a property i initialized to 1. Its toString() method returns the current value of i and then increments it (a.i++ is a post-increment), so the object produces a different primitive on each comparison.
When a is first compared to 1, toString() returns 1 and then increments i to 2, so the first check passes. In the second comparison, toString() returns 2 and increments i to 3; in the third, it returns 3 and increments i to 4. Each comparison therefore sees exactly the value it expects, and the whole expression evaluates to true.
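The same trick also works with valueOf(), which the == operator consults before toString() when converting an ordinary object to a primitive. Here is a minimal sketch of that variant, assuming the same illustrative property name i and greeting string:

<code class="javascript">
const a = {
  i: 1,
  valueOf: function () {
    // Return the current value, then increment it for the next comparison.
    return this.i++;
  }
};

// valueOf() yields 1, 2 and 3 in turn, so every check passes.
if (a == 1 && a == 2 && a == 3) {
  console.log('Hello World!');
}
</code>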
Thus, by taking advantage of JavaScript's loose equality operator and defining an object with a custom toString() or valueOf() method, it is indeed possible to make the expression (a == 1 && a == 2 && a == 3) evaluate to true.