PMIx (2.0.1) #466
I have prepared an upgrade of pmix to 2.0.1 in combination with the Open MPI upgrade to 3.0.0. I also tried to build slurm against pmix 2.0.1, but it fails. slurm correctly detects pmix 2.0.1:

checking for pmix installation... /opt/ohpc/pub/libs/pmix/2.0.1/

But I get errors during compilation:
I checked with a developer who is more in the know on this, and the suggestion is to stick with an older variant of PMIx when configuring it standalone with SLURM. Apparently, SLURM does not yet have support for PMIx v2.x - that is targeted for a Nov release. So, I have downgraded our pmix build to v1.2.3 for now; v1.2.4 is supposed to be out soon, so we can likely go with that for our Nov release. With this build, the companion SLURM build went ok with pmix enabled (can't comment on functionality yet).
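For anyone reproducing the standalone setup, the slurm build is essentially a matter of pointing configure at the pmix install prefix. A minimal sketch, assuming the OpenHPC-style prefix discussed above (the actual spec file may pass additional flags):

```sh
# Sketch only: point slurm's configure at the standalone PMIx install.
# The prefix below is an assumption based on this thread; adjust to taste.
PMIX=/opt/ohpc/pub/libs/pmix
./configure --with-pmix=${PMIX}
make && make install
```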
@koomie thanks for merging and the downgrade to make it work with slurm. I also see that you removed the version number from the directory path. Curious why you removed the version?
The rationale behind that was a desire to be able to change the pmix installation independent of the MPI stacks (and resource managers). Since this is really more of an administrative package that is accessed by packages outside of Lmod (e.g. slurm), it is helpful if there is a constant path to the install. I would not expect a desire to have multiple PMIx installations co-existing, which is the motivating factor behind the versioned paths for all of the development tools/libraries accessed by developers. I do like having it installed transparently into a non-default path (e.g. /opt/ohpc/pub) like you have it, although that is likely going to require adding some rpath flags for some of the MPI stacks, or we might consider dropping some ohpc-specific files into

I'm traveling the next two days, but will keep poking at it and we can iterate.
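As a concrete illustration of the rpath point (a sketch only; the flags that actually end up in the OpenHPC specs may differ), an MPI stack such as Open MPI could be built against the non-versioned prefix roughly like this:

```sh
# Hypothetical example: build an MPI stack against the standalone,
# non-versioned PMIx install and embed its library path so binaries
# resolve libpmix at runtime without Lmod or LD_LIBRARY_PATH.
PMIX=/opt/ohpc/pub/libs/pmix
./configure --with-pmix=${PMIX} \
            LDFLAGS="-L${PMIX}/lib -Wl,-rpath,${PMIX}/lib"
make -j && make install
```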
Update: slurm + this standalone pmix + mpich encountered a change in behavior over previous builds in that execution of a singleton failed (e.g. just running an MPI binary outside of slurm). Thanks to support from @rhc54, there is a new patch to make this work (openpmix/openpmix#537) that we are now applying.
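For reference, the singleton case can be exercised with nothing more than a direct launch of an MPI binary, no srun or mpiexec involved. A minimal check, assuming an MPICH build wired to the standalone pmix:

```sh
# Trivial singleton check: compile an MPI hello-world and run it directly,
# outside of slurm, which is the code path that failed before the patch.
cat > hello.c <<'EOF'
#include <mpi.h>
#include <stdio.h>
int main(int argc, char **argv) {
    int rank;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    printf("singleton launch ok, rank %d\n", rank);
    MPI_Finalize();
    return 0;
}
EOF
mpicc hello.c -o hello
./hello    # direct (singleton) execution, no resource manager
```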
Added a PMIx-based CI job, which is now passing.
https://pmix.github.io/pmix/