I just saw that we have unit tests in a project that test JSONassert itself because of its inconsistent behaviour. I will refrain from explaining why unit tests (especially those covering edge cases) are important for getting correct and consistent code. Instead I will just report the problems I found here.
One thing is that NaN seems to be allowed sometimes but rejected in other cases. I know that at least a `new JSONArray` with NaN inside it works, and JSONAssert even thinks the arrays could be equal, even though `NaN != NaN`. It should always throw an exception, as it already does sometimes.
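A minimal sketch of the kind of check I mean (the class name is made up; whether NaN can get into the array at all depends on the org.json version and on which `put` overload is used, since `put(double)` validates non-finite values but `put(Object)` may not):

```java
import org.json.JSONArray;
import org.skyscreamer.jsonassert.JSONAssert;
import org.skyscreamer.jsonassert.JSONCompareMode;

public class NaNComparisonSketch {
    public static void main(String[] args) throws Exception {
        // put(Object) may skip org.json's finiteness check, so NaN can slip into the array
        JSONArray expected = new JSONArray();
        expected.put((Object) Double.NaN);
        JSONArray actual = new JSONArray();
        actual.put((Object) Double.NaN);

        // Since NaN != NaN, one would expect this either to fail or to throw,
        // yet the comparison can report the arrays as equal.
        JSONAssert.assertEquals(expected, actual, JSONCompareMode.STRICT);
        System.out.println("Arrays containing NaN were considered equal");
    }
}
```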
Another problem is with numbers that start with 0. JSON doesn't allow them, but JSONassert sometimes interprets the value as octal and sometimes just ignores the leading 0. Instead it should reject any number that starts with 0 unless it actually is 0 ("Unexpected number in JSON at position ...").
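A small probe one could run to see which of the two behaviours a given version picks (class name and labels are just for illustration; the octal and ignored-zero interpretations are the two outcomes described above):

```java
import org.skyscreamer.jsonassert.JSONAssert;
import org.skyscreamer.jsonassert.JSONCompareMode;

public class LeadingZeroSketch {
    public static void main(String[] args) {
        // Strict JSON would reject "010" outright. Instead, try both plausible
        // coercions to see which one the bundled parser actually applies.
        probe("[010]", "[10]", "leading zero ignored");
        probe("[010]", "[8]",  "leading zero treated as octal");
    }

    private static void probe(String actual, String expected, String label) {
        try {
            JSONAssert.assertEquals(expected, actual, JSONCompareMode.STRICT);
            System.out.println(actual + " matched " + expected + " -> " + label);
        } catch (AssertionError | Exception e) {
            System.out.println(actual + " did not match " + expected);
        }
    }
}
```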
It's also weird that integers and fractions are treated differently. The reason is probably that the JSON spec has separate grammar rules for the two, but in the end everything is just a "number". So `1` is the same as `1.0`, and therefore `[1]` should be equal to `[1.0]`.
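As a sketch, these are the comparisons that should pass under that reading; whether they actually do is exactly where JSONassert gets inconsistent (class name is made up):

```java
import org.skyscreamer.jsonassert.JSONAssert;
import org.skyscreamer.jsonassert.JSONCompareMode;

public class IntegerVersusFractionSketch {
    public static void main(String[] args) {
        // Per the JSON grammar both literals are just a "number",
        // so the expectation is that both comparisons pass.
        probe("[1]", "[1.0]");
        probe("{\"foo\": 1}", "{\"foo\": 1.0}");
    }

    private static void probe(String expected, String actual) {
        try {
            JSONAssert.assertEquals(expected, actual, JSONCompareMode.STRICT);
            System.out.println(expected + " == " + actual);
        } catch (AssertionError | Exception e) {
            System.out.println(expected + " != " + actual);
        }
    }
}
```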
But somehow JSONassert is inconsistent when it comes to comparing numbers. There are no true integers in ECMAScript, so why does it even use different Java types for them? It should just store them all as Double, or possibly as BigDecimal. I didn't get into comparing numbers that are too close together to distinguish with 64-bit floating point; for those one would use strings anyway, and I'd expect them to be equal if the numbers are equal. But are these two JSON expressions equal: `{foo: -0.0}` and `{foo: 0.0}`?
I would say they are. JSONassert disagrees. But as soon as the same numbers are inside an array, they suddenly are equal. This inconsistency is clearly not correct.
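A sketch of that inconsistency (class name made up; per this report the two probes should behave the same way, yet only the array case reports the values as equal):

```java
import org.skyscreamer.jsonassert.JSONAssert;
import org.skyscreamer.jsonassert.JSONCompareMode;

public class SignedZeroSketch {
    public static void main(String[] args) {
        // Reported behaviour: -0.0 vs 0.0 fails inside an object
        // but passes once the same values sit inside an array.
        probe("{\"foo\": -0.0}", "{\"foo\": 0.0}", "object");
        probe("[-0.0]", "[0.0]", "array");
    }

    private static void probe(String expected, String actual, String label) {
        try {
            JSONAssert.assertEquals(expected, actual, JSONCompareMode.STRICT);
            System.out.println(label + ": considered equal");
        } catch (AssertionError | Exception e) {
            System.out.println(label + ": considered different");
        }
    }
}
```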