Error during PartKway

This is the only error that I get from a job that dies during a call to ParMETIS_V3_PartKway:

[bigfrog0717:25027] *** An error occurred in MPI_Irecv
[bigfrog0717:25027] *** on communicator MPI COMMUNICATOR 3 DUP FROM 0
[bigfrog0717:25027] *** MPI_ERR_RANK: invalid rank
[bigfrog0717:25027] *** MPI_ERRORS_ARE_FATAL (goodbye)

Any ideas?!

Thanks,
Craig Tanis

RE: Same error with mpi

I'm working on a parallel finite element library, and I want to build a graph from the degrees of freedom.
I create a randomly distributed parallel graph, pass it to the ParMETIS_V3_PartKway function, and I get this error:

[JFoulon.local:5189] *** An error occurred in MPI_Wait
[JFoulon.local:5189] *** on communicator MPI COMMUNICATOR 3 DUP FROM 0
[JFoulon.local:5189] *** MPI_ERR_TRUNCATE: message truncated
[JFoulon.local:5189] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)

The same graph works in sequential without any problems, so I don't think the problem is the graph itself. It may be my random repartition of the vertices across processors before calling ParMETIS.

Could you help me solve this problem?
Thank you.
Jérémie Foulon.

RE: Is it possible to email me

Is it possible to email me the mesh in a METIS-compatible format, along with the calling parameters that you are using?

RE: RE:

Which format do you want?
It's easy for me to send the graph in CSR format (distributed or not, as you prefer).
Do you want the coordinates associated with the vertices of the graph?

RE: I do not need coordinates as

I do not need coordinates, since you are not using them when you call ParMETIS.
As for the format, please send it in METIS' input graph format (i.e., what gpmetis reads).
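For reference, the METIS graph format is plain text: the first line gives the number of vertices and the number of edges (each undirected edge counted once), and line i+1 lists the neighbors of vertex i using 1-based indices. A 3-vertex triangle, for example, looks like this:

```
3 3
2 3
1 3
1 2
```

Note that each edge appears twice in the adjacency lines (once per endpoint) but only once in the edge count on the first line.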

RE: Same problem with ParMETIS_V3_PartKway

I'm using the same function, ParMETIS_V3_PartKway, to partition my graph of degrees of freedom for a finite element program.
I get the same error when I use a bigger graph: the first case has 3267 vertices and the second 12675.
I build both graphs from the same geometry, just with two different levels of refinement, so I assume the graphs are correct: switching between the meshes only changes a string in a configuration file.
I set an option value of 100 to print some debugging information, and I get:

call ParMetis
[ 12675 261513 6337 6338] [50] [ 0.000] [ 0.000]
[Jeremie-Foulons-MacBook-Pro.local:5542] *** An error occurred in MPI_Wait
[Jeremie-Foulons-MacBook-Pro.local:5542] *** on communicator MPI COMMUNICATOR 3 DUP FROM 0
[Jeremie-Foulons-MacBook-Pro.local:5542] *** MPI_ERR_TRUNCATE: message truncated
[Jeremie-Foulons-MacBook-Pro.local:5542] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)

For these tests, I use two processors.

I also run the same program sequentially without any problems.

Thank you for your help.
Jérémie Foulon

RE: Craig, Check to see if your

Craig,

Check to see if your input graph is correct, especially whether for each edge (u,v) the reverse edge (v,u) is also present. You may want to write the graph out to a file and run serial METIS on it to see if it works.