Problem with new Kway function prototype

Hi,

I'm here because the METIS_PartGraphKway function prototype has changed in Metis 5 and I'm lost, particularly with the ncon parameter.

I have written:
int ncon = 0, nbparts = 10;
ierr = METIS_PartGraphKway((idx_t*)matCSR.M, (idx_t*)&ncon, (idx_t*)matCSR.rowPtr,
                           (idx_t*)matCSR.colInd, (idx_t*)matCSR.val,
                           NULL, NULL, (idx_t*)&nbparts, NULL, NULL, NULL,
                           (idx_t*)&objval, parts);
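
For reference, here is the Metis 5 prototype as I read it in metis.h (copying it here in case I misread it):

int METIS_PartGraphKway(idx_t *nvtxs, idx_t *ncon, idx_t *xadj, idx_t *adjncy,
                        idx_t *vwgt, idx_t *vsize, idx_t *adjwgt, idx_t *nparts,
                        real_t *tpwgts, real_t *ubvec, idx_t *options,
                        idx_t *objval, idx_t *part);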

And during execution I get:


Current memory used: 384 bytes
Maximum memory used: 384 bytes
***Memory allocation failed for SetupCtrl: maxvwgt. Requested size: 343597383680 bytes

And with gdb, I get:

(gdb) where
#0 0x00007ffff7898b7b in raise (sig=)
at ../nptl/sysdeps/unix/sysv/linux/pt-raise.c:42
#1 0x00007ffff6a0070f in gk_errexit () from /usr/lib/libmetis.so
#2 0x00007ffff6a138a0 in gk_malloc () from /usr/lib/libmetis.so
#3 0x00007ffff6a3eed4 in libmetis__ismalloc ()
from /usr/lib/libmetis.so
#4 0x00007ffff6a2c75b in libmetis__SetupCtrl ()
from /usr/lib/libmetis.so
#5 0x00007ffff6a2dc6d in METIS_PartGraphKway ()
from /usr/lib/libmetis.so

So I do not understand why it cannot allocate memory with ncon=1.
I have followed the specification given in the documentation:
ncon: The number of balancing constraints. It should be at least 1.

Therefore, I think I do not understand this parameter. Can you help me?

Sebastien

RE: I'm on ubuntu 12.04 64

I'm on Ubuntu 12.04, 64-bit, so I have changed #define IDXTYPEWIDTH 32 to #define IDXTYPEWIDTH 64 in the metis.h file.
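
If it helps, here is a quick check I compile to make sure my program and the rebuilt library agree on the width of idx_t (just a sketch):

#include <stdio.h>
#include <metis.h>

int main(void)
{
    /* with IDXTYPEWIDTH 64 this should print 8 */
    printf("sizeof(idx_t) = %zu\n", sizeof(idx_t));
    return 0;
}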

RE: In that case, ncon should be

In that case, ncon should be defined as idx_t because int is 32 bits, so when you take &ncon things get weird...
and ncon should be 1.
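
Something like this, just a sketch (assuming the rest of your data is already idx_t):

idx_t ncon = 1;      /* number of balancing constraints, at least 1    */
idx_t nbparts = 10;  /* number of partitions                           */
idx_t objval;        /* set by Metis to the edge-cut (or comm. volume) */

and then pass &ncon, &nbparts, and &objval directly, without the (idx_t*) casts.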

RE: My post has disappeared

My post has disappeared xD

Exactly, you're right! It's perfect! No problem with malloc now!!

But now, I have another problem with ncon:

bin$ ./test_petsc_solver
Input Error: Incorrect ncon.

[0]PETSC ERROR: main() line 64 in testing/test_petsc_solver.c
application called MPI_Abort(MPI_COMM_WORLD, -2) - process 0
[unset]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -2) - process 0
seb@ubuntu-seb:~/workspace/code-remi/bin$ ./test_petsc_solver
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 6, Mon Feb 11 12:26:34 CST 2013
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./test_petsc_solver on a arch-linu named ubuntu-seb by seb Mon Apr 8 15:09:10 2013
[0]PETSC ERROR: Libraries linked from /home/seb/Téléchargements/petsc-3.3-p6/arch-linux2-c-debug/lib
[0]PETSC ERROR: Configure run at Mon Apr 8 12:00:20 2013
[0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack --download-mpich
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
[unset]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0

Thanks for your help!

RE: Exactly! You're right! I have

Exactly! You're right! I have changed it like this:

idx_t ncon = 1, nbparts = 10; // the 0 value before was just for testing
ierr = METIS_PartGraphKway((idx_t*)matCSR.M, &ncon, (idx_t*)matCSR.rowPtr,
                           (idx_t*)matCSR.colInd, (idx_t*)matCSR.val,
                           NULL, NULL, &nbparts, NULL, NULL, NULL,
                           (idx_t*)&objval, parts);

and now, no problem with malloc ;)
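
(Side note: the -2 in the aborts below is just the Metis return code that gets forwarded to MPI_Abort; roughly like this, as a sketch rather than the exact code:)

/* status codes come from metis.h */
if (ierr != METIS_OK) {
    /* METIS_ERROR_INPUT == -2, METIS_ERROR_MEMORY == -3, METIS_ERROR == -4 */
    MPI_Abort(MPI_COMM_WORLD, ierr);
}
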
But let me submit my next problem, which seems related:

/bin$ ./test_petsc_solver

Input Error: Incorrect ncon.

[0]PETSC ERROR: main() line 64 in testing/test_petsc_solver.c
application called MPI_Abort(MPI_COMM_WORLD, -2) - process 0
[unset]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -2) - process 0
seb@ubuntu-seb:~/workspace/code-remi/bin$ ./test_petsc_solver
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 6, Mon Feb 11 12:26:34 CST 2013
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./test_petsc_solver on a arch-linu named ubuntu-seb by seb Mon Apr 8 15:09:10 2013
[0]PETSC ERROR: Libraries linked from /home/seb/Téléchargements/petsc-3.3-p6/arch-linux2-c-debug/lib
[0]PETSC ERROR: Configure run at Mon Apr 8 12:00:20 2013
[0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack --download-mpich
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
[unset]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0

Thanks a lot for your help!

RE: Hard to tell what is going

Hard to tell what is going on, but just as a sanity check: are you sure that all the arrays/scalars you are passing to Metis are of the proper datatype (i.e., idx_t)? That is, all of the elements of matCSR and the part vector?
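
For example, something along these lines, just a sketch (I am guessing at your matCSR layout, and the helper is only for illustration):

#include <stdlib.h>
#include <metis.h>

/* everything handed to Metis stored as idx_t, so no casts are needed */
typedef struct {
    idx_t  M;        /* number of vertices (rows)                     */
    idx_t *rowPtr;   /* xadj:   M+1 entries                           */
    idx_t *colInd;   /* adjncy: rowPtr[M] entries                     */
    idx_t *val;      /* weights, if any; must also be idx_t for Metis */
} csr_t;

/* the part vector handed to Metis must be idx_t as well */
static idx_t *alloc_parts(const csr_t *m)
{
    return malloc((size_t)m->M * sizeof(idx_t));   /* one entry per vertex */
}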

george

RE: Thanks a lot!! It works!

Thanks a lot!! It works!

Yesterday I was not online much ^^ but now everything is OK with your help!

Have a nice day,
Sebastien

RE: A few questions: 1. Is idx_t

A few questions:

1. Is idx_t 32 or 64 bits?
2. ncon should be 1.

george