[MacPorts] #43381: openmpi-default @1.7.5_1+gcc48 does not work with hwloc @1.9_0
#43381: openmpi-default @1.7.5_1+gcc48 does not work with hwloc @1.9_0
------------------------+------------------------------------
 Reporter:  dstrubbe@…  |      Owner:  macports-tickets@…
     Type:  defect      |     Status:  new
 Priority:  Normal      |  Milestone:
Component:  ports       |    Version:  2.2.1
 Keywords:              |       Port:  openmpi-default, hwloc
------------------------+------------------------------------
 I get the error below for a simple test program. It does work if I
 downgrade to hwloc @1.8.1_0 and rebuild openmpi, though. (Activating
 hwloc @1.9_0 again without rebuilding openmpi brings the error back.)
 I have OS X 10.8.5 and Xcode 5.1. It worked fine on my other computer,
 with openmpi-default @1.7.5_1+gcc45, OS X 10.6.8, and Xcode 3.2.6.

 {{{
 $ mpif90-openmpi-mp test_new.f90
 $ mpiexec-openmpi-mp -n 1 ./a.out
 [[38689,1],0] ORTE_ERROR_LOG: Error in file /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_science_openmpi/openmpi-default/work/openmpi-1.7.5/orte/util/nidmap.c at line 106
 [[38689,1],0] ORTE_ERROR_LOG: Error in file /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_science_openmpi/openmpi-default/work/openmpi-1.7.5/orte/mca/ess/env/ess_env_module.c at line 154
 --------------------------------------------------------------------------
 It looks like orte_init failed for some reason; your parallel process is
 likely to abort.  There are many reasons that a parallel process can
 fail during orte_init; some of which are due to configuration or
 environment problems.  This failure appears to be an internal failure;
 here's some additional information (which may only be relevant to an
 Open MPI developer):

   orte_util_nidmap_init failed
   --> Returned value Error (-1) instead of ORTE_SUCCESS
 --------------------------------------------------------------------------
 --------------------------------------------------------------------------
 It looks like orte_init failed for some reason; your parallel process is
 likely to abort.  There are many reasons that a parallel process can
 fail during orte_init; some of which are due to configuration or
 environment problems.  This failure appears to be an internal failure;
 here's some additional information (which may only be relevant to an
 Open MPI developer):

   orte_ess_init failed
   --> Returned value Error (-1) instead of ORTE_SUCCESS
 --------------------------------------------------------------------------
 --------------------------------------------------------------------------
 It looks like MPI_INIT failed for some reason; your parallel process is
 likely to abort.  There are many reasons that a parallel process can
 fail during MPI_INIT; some of which are due to configuration or
 environment problems.  This failure appears to be an internal failure;
 here's some additional information (which may only be relev-------------------------------------------------------
 Primary job terminated normally, but 1 process returned
 a non-zero exit code.. Per user-direction, the job has been aborted.
 -------------------------------------------------------
 *** An error occurred in MPI_Init
 *** on a NULL communicator
 *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
 ***    and potentially your MPI job)
 Local abort before MPI_INIT completed successfully; not able to aggregate
 error messages, and not able to guarantee that all other processes were
 killed!
 --------------------------------------------------------------------------
 mpiexec-openmpi-mp detected that one or more processes exited with
 non-zero status, thus causing the job to be terminated. The first
 process to do so was:

   Process name: [[38689,1],0]
   Exit code:    1
 --------------------------------------------------------------------------
 }}}

 A couple of threads describing a similar situation on Fedora suggest it
 is related to a conflict between different versions of Open MPI, which
 inspired me to try swapping the hwloc version:

 http://www.open-mpi.org/community/lists/users/2013/07/22346.php
 https://lists.fedoraproject.org/pipermail/users/2013-July/438349.html

--
Ticket URL: <https://trac.macports.org/ticket/43381>
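For reference, the downgrade-and-rebuild workaround described in the report could look roughly like this. It is a sketch only: it assumes hwloc @1.8.1_0 is still installed (just deactivated) and that a plain openmpi-default install is being rebuilt; adjust the port names and versions to match the local installation.

{{{
# Reactivate the older hwloc that openmpi was originally built against
# (assumes hwloc @1.8.1_0 is still present as an inactive port).
sudo port deactivate hwloc @1.9_0
sudo port activate hwloc @1.8.1_0

# Rebuild openmpi-default against the now-active hwloc.
# -n keeps MacPorts from touching dependencies; --force rebuilds even
# though the port's version has not changed.
sudo port -n upgrade --force openmpi-default
}}}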
#43381: openmpi-default @1.7.5_1+gcc48 does not work with hwloc @1.9_0
------------------------------------+--------------------
  Reporter:  dstrubbe@…             |      Owner:  sean@…
      Type:  defect                 |     Status:  new
  Priority:  Normal                 |  Milestone:
 Component:  ports                  |    Version:  2.2.1
Resolution:                         |   Keywords:
      Port:  openmpi-default hwloc  |
------------------------------------+--------------------
Changes (by macsforever2000@…):

 * owner:  macports-tickets@… => sean@…
 * cc: sean@… (removed)
 * port:  openmpi-default, hwloc => openmpi-default hwloc

--
Ticket URL: <https://trac.macports.org/ticket/43381#comment:1>
#43381: openmpi-default @1.7.5_1+gcc48 does not work with hwloc @1.9_0
------------------------------------+--------------------
  Reporter:  dstrubbe@…             |      Owner:  sean@…
      Type:  defect                 |     Status:  new
  Priority:  Normal                 |  Milestone:
 Component:  ports                  |    Version:  2.2.1
Resolution:                         |   Keywords:
      Port:  openmpi-default hwloc  |
------------------------------------+--------------------
Comment (by sean@…):

 Thanks for the report, I'll look into it.

--
Ticket URL: <https://trac.macports.org/ticket/43381#comment:2>
#43381: openmpi-default @1.7.5_1+gcc48 does not work with hwloc @1.9_0
------------------------------------+--------------------
  Reporter:  dstrubbe@…             |      Owner:  sean@…
      Type:  defect                 |     Status:  closed
  Priority:  Normal                 |  Milestone:
 Component:  ports                  |    Version:  2.2.1
Resolution:  fixed                  |   Keywords:
      Port:  openmpi-default hwloc  |
------------------------------------+--------------------
Changes (by sean@…):

 * status:  new => closed
 * resolution:   => fixed

Comment:

 Fixed in r122954. Sorry for the delay!

--
Ticket URL: <https://trac.macports.org/ticket/43381#comment:4>
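Once r122954 has reached the ports tree, picking up the fix should just be the normal MacPorts update cycle. These are standard commands; exactly which ports get rebuilt depends on what is installed locally.

{{{
sudo port selfupdate
sudo port upgrade openmpi-default hwloc
}}}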