I'm unsure how the current evaluator performs here. We should write some granular tests.
Currently, when we come across a function call, we evaluate all the arguments and pass them along to another evaluator selected by the function/intrinsic name. That means `boz` would get evaluated without context, which defaults to `INTEGER(4)`. Then the `INT` intrinsic is evaluated, meaning the `INTEGER(4)` is coerced to an `INTEGER(2)`. Is this different from coercing directly to an `INTEGER(2)` in the first place?
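To make the question concrete, here is a small sketch (hypothetical helper names, not the project's code) comparing the two routes, modelling integers as two's-complement bit patterns and coercion between kinds as truncation to the target width. Under those assumptions the routes agree for narrow targets but diverge for a BOZ wider than 32 bits:

```python
def as_kind(bits: int, kind: int) -> int:
    """Interpret a raw bit pattern as a signed integer of `kind` bytes."""
    width = 8 * kind
    bits &= (1 << width) - 1              # truncate to the kind's width
    sign = 1 << (width - 1)
    return bits - (1 << width) if bits & sign else bits

def int_via_default(boz_bits: int, kind: int) -> int:
    """Evaluate the BOZ context-free as INTEGER(4), then coerce."""
    return as_kind(as_kind(boz_bits, 4), kind)

def int_direct(boz_bits: int, kind: int) -> int:
    """Interpret the BOZ's bit pattern directly at the target kind."""
    return as_kind(boz_bits, kind)

# Narrow target: truncating to 32 bits first leaves the low 16 bits
# untouched, so both routes give the same answer for z'FFFF'.
assert int_via_default(0xFFFF, 2) == int_direct(0xFFFF, 2) == -1

# Wide target: going through INTEGER(4) first discards the high bits
# of a BOZ wider than 32 bits, so INT(z'100000000', 8) differs.
assert int_direct(0x1_0000_0000, 8) == 0x1_0000_0000
assert int_via_default(0x1_0000_0000, 8) == 0
```

So with truncating coercion, the answer is "same result for kinds at or below the default, different result above it"; granular tests should cover both regimes.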
This would be resolved easily by adding a BOZ constructor to the scalar values. I've shied away from that because it feels weird.
Adding to the above: The problem with adding an explicit BOZ constructor to the scalar values is that they would no longer be safely representable in memory. BOZs are a purely compile-time construct.
I think certain intrinsics need special handling where they unwrap and inspect the next layer. So for `INT`, instead of evaluating all arguments immediately, we pass them along as their syntactic constructs. We handle `ExpVal [...] ValBoz{}` via a special case, like what is currently done in fortran-vars. The general case would then evaluate the arguments. (This would be a purely internal change -- I just did the easy thing initially and got rid of AST types ASAP.)
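The dispatch change described above can be sketched roughly as follows (hypothetical names throughout; the real code is Haskell with constructors like `ExpVal`/`ValBoz`, this is just the shape of the special case): the `INT` handler receives its argument still in syntactic form, checks for a BOZ node before evaluating, and only falls back to evaluate-then-coerce for non-BOZ arguments.

```python
from dataclasses import dataclass

@dataclass
class Boz:
    """Syntactic BOZ literal: just a bit pattern, no type of its own."""
    bits: int

@dataclass
class IntLit:
    """An ordinary integer literal expression."""
    value: int

def as_kind(bits: int, kind: int) -> int:
    """Interpret a bit pattern as a signed integer of `kind` bytes."""
    width = 8 * kind
    bits &= (1 << width) - 1
    return bits - (1 << width) if bits >> (width - 1) else bits

def evaluate(expr) -> int:
    """General evaluator: a BOZ with no context defaults to INTEGER(4)."""
    if isinstance(expr, Boz):
        return as_kind(expr.bits, 4)
    if isinstance(expr, IntLit):
        return expr.value
    raise TypeError(f"unhandled expression: {expr!r}")

def eval_int_intrinsic(arg_expr, kind: int) -> int:
    """INT(a, kind): special-case a syntactic BOZ argument, interpreting
    its bit pattern directly at `kind`; otherwise evaluate then coerce."""
    if isinstance(arg_expr, Boz):
        return as_kind(arg_expr.bits, kind)   # bottom-level: no context-free step
    return as_kind(evaluate(arg_expr), kind)  # general case

# The BOZ keeps its full 33-bit pattern when the target kind is wide enough.
assert eval_int_intrinsic(Boz(0x1_0000_0000), 8) == 0x1_0000_0000
# Non-BOZ arguments go through the ordinary evaluate-then-coerce path.
assert eval_int_intrinsic(IntLit(300), 1) == 44
```

The point of the design is that only the intrinsics that give BOZs meaning (here just `INT`) look at syntax; everything else still sees fully evaluated values, so the change stays internal.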
I think one has to view `INT(boz, kind)` as a bottom-level expression that can't be broken down any further, because `boz` by itself doesn't mean much (it requires context to interpret).