Specify initial partitioning?

Hi,

I'm currently using METIS to partition a graph representing a parallel workload for a distributed computation. The partitioning is done to re-balance a load which already has an initial distribution.

The resulting partitions are usually only minor adjustments of the initial partition, yet the partition indices assigned to the vertices usually end up completely different, which forces a lot of data to be copied back and forth between the nodes.

Is there any way to specify an initial partition instead of having METIS compute one itself? I'm guessing that if I supply the current partitioning as an initial guess, the resulting assignment of vertices to partitions would not differ too drastically.

Many thanks,
Pedro

RE: Metis does not have this

Metis does not have this capability at this point. However, the parallel version of Metis (ParMetis) does, as part of its adaptive repartitioning routines.
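Since METIS itself cannot take the old partition into account, one cheap workaround for the index-shuffling problem is to relabel the new partition indices after the fact so that they agree with the old assignment as much as possible. A minimal greedy sketch (the function name and the k <= 64 limit are my own assumptions, not anything from METIS):

```c
#include <string.h>

/* Relabel new_part (labels 0..k-1) in place so that each new label is mapped
   to the old label it overlaps most, greedily, largest overlap first.
   O(n + k^3) with a k*k overlap table; fine for small k. */
void relabel_partition(int n, int k, const int *old_part, int *new_part) {
    int overlap[64][64];                    /* sketch assumes k <= 64 */
    int map[64], used_old[64], used_new[64];
    memset(overlap, 0, sizeof(overlap));
    memset(used_old, 0, sizeof(used_old));
    memset(used_new, 0, sizeof(used_new));

    /* Count how many vertices each (new label, old label) pair shares. */
    for (int i = 0; i < n; i++)
        overlap[new_part[i]][old_part[i]]++;

    /* Greedily pair each unused new label with its best unused old label. */
    for (int step = 0; step < k; step++) {
        int best = -1, bi = 0, bj = 0;
        for (int i = 0; i < k; i++) {
            if (used_new[i]) continue;
            for (int j = 0; j < k; j++) {
                if (used_old[j]) continue;
                if (overlap[i][j] > best) { best = overlap[i][j]; bi = i; bj = j; }
            }
        }
        map[bi] = bj;
        used_new[bi] = 1;
        used_old[bj] = 1;
    }

    /* Rewrite the new partition vector with the matched old labels. */
    for (int i = 0; i < n; i++)
        new_part[i] = map[new_part[i]];
}
```

For example, with old = {0,0,1,1,2,2} and a METIS result new = {2,2,0,0,1,1} (same cut, permuted labels), the relabeling makes the new vector identical to the old one, so no data needs to move at all.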

RE: RE: How difficult would it be to implement?

Hi George,

How difficult would it be to implement this feature, i.e. as a METIS_OPTION_IPTYPE?

I haven't looked at the METIS internals yet, but if you think it would take only moderate effort, and you could give me a few pointers on where to look for what, it would be worth my while to try this myself.

I want to avoid using ParMETIS, as it is a bit of an overkill for my application (small graph, potentially large number of nodes), and it would seem to require re-labelling the nodes so that they are contiguous per MPI process. Or is ParMETIS implemented in a way that also allows it to be called from a single node?

Cheers,
Pedro

RE: Implementing in Metis will

Implementing this in Metis would require modifying the coarsening so that it only contracts vertices that belong to the same partition, and then propagating the partitioning vector down to the coarser graphs. Once you do that, the initial partitioning will be the same as the inherited partitioning vector, and all the work will be done during refinement, which remains unchanged. Conceptually simple, but the changes are involved.
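The restricted contraction step described above could be sketched as a heavy-edge matching pass that only pairs vertices from the same partition; this is a sketch under assumed conventions (CSR arrays xadj/adjncy/adjwgt, a hypothetical helper name), not actual METIS internals:

```c
/* One pass of heavy-edge matching restricted to same-partition vertices.
   Graph is in CSR form: neighbors of v are adjncy[xadj[v]..xadj[v+1]-1],
   with edge weights in adjwgt. Returns the number of matched pairs. */
int restricted_match(int n, const int *xadj, const int *adjncy,
                     const int *adjwgt, const int *part, int *match) {
    for (int v = 0; v < n; v++) match[v] = -1;
    int nmatched = 0;
    for (int v = 0; v < n; v++) {
        if (match[v] != -1) continue;
        int best = -1, bestw = -1;
        for (int e = xadj[v]; e < xadj[v + 1]; e++) {
            int u = adjncy[e];
            /* Only consider unmatched neighbors in the SAME partition,
               preferring the heaviest connecting edge. */
            if (match[u] == -1 && part[u] == part[v] && adjwgt[e] > bestw) {
                best = u;
                bestw = adjwgt[e];
            }
        }
        if (best != -1) {
            match[v] = best;
            match[best] = v;
            nmatched++;
        } else {
            match[v] = v;   /* no eligible neighbor: keep as a singleton */
        }
    }
    return nmatched;
}
```

On a 4-cycle 0-1-2-3-0 with part = {0,0,1,1}, this matches {0,1} and {2,3} and never contracts across the cut. Propagating the partition vector down is then well defined: with a coarsening map cmap, setting cpart[cmap[v]] = part[v] is consistent because matched vertices always share a partition.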

You can use ParMetis with 2 MPI processes and use it to compute a k-way partitioning (k >> 2). The MPI processes can simply run on the same node. Also, in ParMetis you can use the RefineKway routine...