28509, "bradcray", "Should numeric literals be type-agnostic and different from 'param's in this way?", "2026-03-07T02:10:12Z"
In this Discourse thread, and particularly the sub-thread between these comments, @damianmoz points out cases in which Chapel's current handling of integer literals can lead to confusing results for a user by upcasting low-bit-width values to higher bit-widths. Though the current behavior is documented and rationalizable, the argument is that it makes certain cases surprising, requiring extra work (via explicit downcasts) to preserve a given low-bit-width computation.
I believe that a big part of how we reached this point is that the language has, since its inception, treated param values and anonymous numerical literals identically. As a result, since we require x to have a type in a declaration like param x = 42; (specifically int(64)), we also think of 42 as having a fixed bit-width and essentially being identical to x in behavior. I believe that, historically, we thought of params as being a way of naming and computing on literals, elevating them to first-class concepts.
Damian's perspective, most clearly voiced in Integer Promotion Weirdness - #25 by damianmoz is that the two concepts ought to be distinct from one another, such that an integer literal would have a generic "any width or signedness that is sufficient to store the value" type, which we might write as integral for shorthand in this issue, whereas once it is named and put into a param, it has a well-defined width. Thus, in param x = 42;, x would have type int(64) (the default int width barring any other information), whereas 42 would have the type integral. Similarly, in param y = myInt32 + 42;, 42 would adapt to myInt32's size and signedness, causing y to be inferred to be int(32).
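Sketching the hypothetical semantics described above (`integral` is shorthand coined in this issue, not current Chapel; `myInt32` is a stand-in name from the text):

```chapel
// Under the proposal, an anonymous literal has the generic type
// 'integral' until it is named or combined with a typed value:
param x = 42;                 // named: x gets the default width, int(64)

param myInt32: int(32) = 7;   // an explicitly 32-bit param
param y = myInt32 + 42;       // 42 adapts to int(32), so y: int(32)
```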
It is not obvious to me (or to Michael, I believe, who also participated in the conversation above) what the repercussions of this change would be. One key difference between Chapel and C (the language used as the basis for comparison in the Discourse thread) is that Chapel does not support implicit downcasts of const/var types from (say) int(64) to int(32) or int(8) due to the potential for data loss. Instead, an explicit cast must be used. params (of default type?) are an exception to this rule because their values are known at compile time, permitting the compiler to determine whether a given downcast is safe. C# might be an interesting point of comparison here, being C-based yet with safety as a bigger concern (noting that we have used C# as a reference in previous design discussions around these features).
Given this difference between Chapel and C, it's difficult to know what the repercussions of a change like this would be without implementing it and seeing the impact on the testing system. In the past, when we have made such changes to overload resolution, the result has been something like a game of whack-a-mole, where some cases start working better but others become more surprising. Michael alludes to this a bit in comments like this one about sin(1).
Despite that uncertainty, distinguishing params from literals in this way would be intriguing to pursue, to see whether some of the cases that are currently surprising can be made less so without introducing new surprises elsewhere in the language. I'm filing this issue to capture the question and the interest in pursuing it, time and resources permitting.
If we were to pursue this, it likely makes sense to do so within the context of the Dyno compiler, due to its more principled implementation. That suggests exploring it once Dyno is able to compile codes sufficient to cover current tests about overload resolution, coercions, and params so that we can understand the impact with end-to-end compiles and invocations of the test suite.