Issue Building 1.29.0 compiler

Hello!

I am trying to build the Chapel version 1.29.0 compiler on my Raspberry Pi 3A+. The only config variables I have changed from the defaults are CHPL_ATOMICS=locks, CHPL_COMM=gasnet, GASNET_SPAWNFN=S, GASNET_MASTERIP, and GASNET_SSH_SERVERS. When I execute sudo make, the process hangs on the line: g++ -c -MMD -MP -O3 -I. -I../../../compiler ... ending in -o ../../../compiler/../build/compiler/linux32/gnu/arm71/hostmem-jmalloc/llvm-none/11/san-none/frontend/lib/framework/error-classes-list.o error-classes-list.cpp
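
(For context, the non-default configuration amounts to something like the following before running make; the IP address and host list below are placeholders rather than my actual values.)

    export CHPL_ATOMICS=locks
    export CHPL_COMM=gasnet
    export GASNET_SPAWNFN=S
    export GASNET_MASTERIP=192.168.1.10              # placeholder address
    export GASNET_SSH_SERVERS="pi-node1 pi-node2"    # placeholder host list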

Has this been reported previously? Does anyone know why the build process hangs consistently here? I have tried deleting the tarball, then redownloading, reconfiguring, and remaking, and the issue persists.

I haven't heard of problems building this particular file. One thing stands out in your report, though: if you are running sudo make, don't. Reserve sudo for sudo make install only. Run make as a regular user and then, if needed, use sudo for the installation step.

At some point in the past we saw a strange problem that was solved by not using sudo make. I wouldn't bet that's the issue here, but it's worth a try...
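
In other words, something like this (a minimal sketch, assuming you're building in the extracted source tree and actually want a system-wide install):

    cd chapel-1.29.0
    make                 # build the compiler and runtime as a regular user
    sudo make install    # only the install step needs root privileges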

Raspberry Pi 3A+

This device only has 512 MB of physical RAM from what I can see. I am not sure we know the minimum recommended RAM to build Chapel at the moment. If you continue to have problems after removing sudo from your make command, it may be that you are running out of physical memory. You might try setting up virtual memory (swap) in that case.

For the record, I was able to compile and run multilocale Chapel with 1.28.0. Is the RAM requirement so different between versions?

I would expect the compilation memory requirement to grow as we add new code and features, and we likely wouldn't notice a modest increase. AFAIK, all of our testing systems have at least 16 GB of memory.

I tried wiping my chapel-1.29.0/ directory and building with a plain make command, but the build failed in the same way: it hangs on error-classes-list.cpp for a long time, and then
g++: fatal error: Killed signal terminated program cc1plus
compilation terminated.
make[3]: ... Error 1
make[2]: ... framework.makedir] Error 2
make[1]: *** [Makefile:84: compiler] Error 2
make: *** [Makefile:59: comprt] Error 2

It would be a shame if 1.28.0 were the last Raspberry Pi-compatible release of the language.

edit: I put some ellipses in the error output to paraphrase.

That error message almost certainly means the system was very low on memory and the OOM killer killed the compilation process as the biggest offender. A potential option here would be to add swap space. You might also be able to cross-compile for RPi from another machine with more memory, then transfer over the resulting executable.
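
If you want to try the swap route, one common way to add a swap file on Raspberry Pi OS (or any similar Linux) is something along these lines; the 2 GB size is just an example and assumes you have that much free space on the SD card:

    sudo fallocate -l 2G /swapfile    # create a 2 GB file to use as swap
    sudo chmod 600 /swapfile
    sudo mkswap /swapfile
    sudo swapon /swapfile             # enable it for the current boot
    free -h                           # verify the extra swap shows up

Keep in mind that swapping to an SD card is slow, so the build may take a long time even if it now succeeds.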


Hi @vlzambr2 -- I agree with @annarift, but I wanted to add one thing. Running out of memory while compiling is not a problem unique to Chapel -- see Does Anyone Have any Compiling Tips - Raspberry Pi Forums for some discussion specifically about Raspberry Pi and compilation memory usage. The solutions suggested there were to add swap space or to cross-compile.


Catching up on this thread belatedly...

While I'm reluctant to get into the business of ensuring that each Chapel compiler file can be compiled within a specific memory envelope, I'm curious whether anyone (particularly @daniel.fedorin and @annarift) has insight into what makes error-classes-list.cpp so memory-intensive, given that it's a fairly new file and not particularly large in terms of lines of code (though I realize a lot can be hidden by #includes and template instantiation). Specifically, I'd feel best if we either concluded "yes, it's reasonable that this file takes a lot of memory to compile because xyz" or found something we could change to reduce its memory requirements. I'm also curious whether we expect the memory requirements to grow as we add more error messages (since I think we're still toward the start of that process?)

[only vaguely related, but in case anyone knows: in the new cmake world, what's the trick for seeing the g++ command used to compile a given file, if I wanted to run it manually to monitor its memory usage on my laptop?]

Thanks,
-Brad

If I understand what you're wanting, I think you can export CMAKE_VERBOSE_MAKEFILE=1 to see the whole g++ command for each file.

edit: or, on a per-run basis: make -e CMAKE_VERBOSE_MAKEFILE=1
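
Once the verbose output is on, one way to capture the command and measure its peak memory is roughly the following (a sketch assuming a Linux shell and GNU time; the "..." stands for the long list of flags copied from the build log, not anything specific):

    export CMAKE_VERBOSE_MAKEFILE=1
    make 2>&1 | tee build.log    # the full g++ invocations appear in build.log
    # paste the invocation for the file of interest under GNU time:
    /usr/bin/time -v g++ -c ... error-classes-list.cpp -o /tmp/error-classes-list.o
    # "Maximum resident set size" in the report is the peak memory used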


One other stray thought here: I wonder how much of the memory required to compile this file is due to having optimizations on when building the compiler. @vlzambr2, if you were interested in trying that, I'd be intrigued to know whether removing -O3, or replacing it with -O1 or -O0, changes the behavior at all.
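
For example (a hypothetical sketch: take the g++ line from the verbose build output, swap out the optimization flag, and compare the peak memory each variant reports):

    # same flags as the captured command, but with -O3 replaced by -O0
    /usr/bin/time -v g++ -c ... -O0 ... error-classes-list.cpp -o /tmp/error-classes-list.o
    # repeat with -O1 and -O3 and compare the "Maximum resident set size" lines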

-Brad

Arkouda team has had issues with 1.29.0 compiler

@kaydoh — I was not aware of this. Can you share details? Is the problem similar to the one reported in the OP above (out of memory while compiling)?

Thanks,
-Brad

Hi @vlzambr2 (and anyone else affected by the issue)!

We've confirmed that compiling error-classes-list.cpp consumes a lot of memory[1], and agree that it's undesirable. Two PRs have been merged to address the situation:

  1. A PR that splits the problematic file into two (Split error classes list into two files by DanilaFe · Pull Request #21362 · chapel-lang/chapel · GitHub)
  2. A PR that switches the mechanism used for reporting error messages (Switch the error system away from using queries towards `owned` pointers. by DanilaFe · Pull Request #21369 · chapel-lang/chapel · GitHub).

Together, these should be enough to reduce memory usage back to Raspberry Pi levels. Please let us know if you're still experiencing the issue building from main.
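
For anyone who wants to try the fix before the next release, the usual steps for building from a checkout of main look roughly like this (a sketch; see the Chapel documentation for the authoritative instructions, and avoid parallel make on a low-memory machine):

    git clone https://github.com/chapel-lang/chapel.git
    cd chapel
    source util/setchplenv.bash    # sets CHPL_HOME and related variables
    make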

Thanks for reporting!


  1. In a rather unscientific "watch htop while you compile" test, we observed compiling error-classes-list.cpp to require some 1.2 GB of memory, far more than a Raspberry Pi could be expected to provide.

Hi folks!
I pulled from main last night and tried to build the 1.30.0 compiler, and I hit another memory error (this is with half a GB of RAM). It seems similar to last time, except that it hangs on a different file.