Why are executables so large?

Hello: I have noticed that all Chapel executables (Linux Mint, 1 i7 cpu, 4 physical cores) are quite large (~3.8MB or even larger). I tried to find compiler switches to change that, but couldn’t find any. Is there any reason for these large executables? Is there a way to generate small ones (like C does)? Thanks,


Hi Nelson —

Good question! I think the short answer is that this isn’t something we’ve focused on optimizing, since it hasn’t been a priority or a problem for most users to date.

For a longer answer I suspect that it’s a combination of:

(a) even the simplest Chapel “hello, world” program tends to involve a lot of boilerplate and runtime code related to Chapel’s unique features (locales, parallelism, synchronization), memory allocation, tasking, communication, etc. Chapel programs tend to exercise these features as part of their bootstrap and teardown process even when the user code doesn’t require them. We could probably do a better job of compiling in only the minimal set of parts that a given program needs, but since our priorities are less about making simple programs small and more about making complex programs capable and easy to write, it hasn’t been a focus (so far).

(b) the back-end C code that the Chapel compiler generates tends to be a bit verbose, as you can see if you capture it using the --savec flag. I’m not certain how much this affects the final binary size (since the C compiler optimizes it), but I wouldn’t be surprised if it had a non-negligible impact. A few years ago, we did some gcov experiments with our generated code which suggested that we may generate more code than we need, and this is something we’ve intended to return to, but haven’t had the chance to yet. It’s a strong possibility for this year, though, as we’ll be looking more at compilation time, and generated code size is one of the biggest factors in our compile times.
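In case it helps, here’s a minimal sketch of capturing that generated C code, assuming `chpl` is on your PATH and you have a `hello.chpl` source file (the `genc` directory name is just my own choice for this example):

```shell
# Save the generated C code to a directory instead of discarding it.
# 'genc' is an arbitrary directory name chosen for this example.
chpl --savec genc hello.chpl

# Inspect the generated sources that the back-end C compiler consumes.
ls genc
```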

I did my own imprecise tests on my Mac, using a developer build of the sources and clang as my compiler, and found that a vanilla hello world program resulted in a 1.6M binary by default and 1.5M when using --fast. On linux64, using something more like a release build and gcc, I saw numbers more like yours: 5M for a default compile and 4.4M when using --fast. I’m not sure which of those differences accounts for the difference in size.
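If you’d like to reproduce these rough measurements yourself, something along these lines should work (a sketch, assuming `chpl` is on your PATH; exact sizes will vary with platform, back-end compiler, and Chapel version):

```shell
# Create a minimal Chapel "hello, world" program.
cat > hello.chpl <<'EOF'
writeln("Hello, world!");
EOF

# Default compile vs. an optimized compile with --fast.
chpl -o hello hello.chpl
chpl --fast -o hello_fast hello.chpl

# Compare the resulting binary sizes (human-readable).
ls -lh hello hello_fast
```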

If this is something that’s problematic for you, rather than just a matter of curiosity, definitely let us know.


Thanks a lot, Brad. It is not a big problem to have big executables. Because I have been experimenting with a lot of small (code-wise) Chapel programs, they take up some space in the directory, but a radical find . -executable -delete (dangerous; it will wipe out anything executable…) cleans things up :slight_smile: Regards,