Terminology and using weights

Hello,

I am using ParMetis to partition a mesh used in a Direct Simulation Monte Carlo code. The code uses finite element terminology for the mesh, notably elements and nodes (or vertices). As I understand it, ParMetis (and Metis) uses a graph-partitioning paradigm and corresponding terminology. I believe, in particular, that the term 'vertex' in ParMetis can refer to different entities depending on the situation, and should not be confused with the finite element term 'vertex'. Specifically, I believe that in the description of the parameter 'part' in ParMETIS_V3_PartMeshKway ('Upon successful completion the partition vector of the locally-stored vertices is written to this array.'), 'vertices' does not mean vertices in the finite element sense, but actually refers to elements. I would appreciate confirmation of this, and further clarification if possible - perhaps with reference to other documents describing the graph-partitioning paradigm.

Having re-partitioned the mesh with ParMetis, the calculation performs particle collisions, resulting in an imbalance in particle counts per partition. I believe re-partitioning using weights corresponding to the particle counts per 'element' (does the term 'element' here refer to the finite element context?), supplied via the elmwgt parameter with wgtflag indicating that weights are present, may address this imbalance. However, I cannot see how these weights should be set. If, for example, there are 4 partitions with total particle counts of 110, 130, 115, 125 (and for simplicity it is assumed all particle counts per element on a given partition are identical), what values should be assigned to the entries of the elmwgt array? Should the total particle counts be assigned to all elements of each partition, or should the values perhaps be based on reciprocals of the particle counts? Are there limits on the size of the weights? Particle counts are likely to be six-figure integers.

I suspect the answers to these questions may be trivial to somebody familiar with ParMetis, but they are not obvious to me from the documentation. Any help, and references to further explanations and examples, would be much appreciated.

Thanks

Mike Pettipher

RE: Here are some answers: In

Here are some answers:

In ParMETIS_V3_PartMeshKway the term "vertices" corresponds to FE elements. ParMetis constructs the dual graph of the mesh (each element becomes a vertex in the graph) and then partitions that graph; hence vertex => element.
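
To make the correspondence concrete, here is a minimal sketch of the call (my own illustration, not taken from the manual). It assumes a ParMETIS 4.x build where idx_t and real_t come from parmetis.h (substitute idxtype/float for 3.x), 0-based numbering, and a tetrahedral mesh where adjacent elements share a face (ncommonnodes = 3). On return, part[i] names the partition of local element i.

#include <stdlib.h>
#include <mpi.h>
#include <parmetis.h>

/* Minimal sketch: partition a distributed FE mesh so that part[i]
   gives the target partition of LOCAL ELEMENT i.  The mesh arrays
   elmdist/eptr/eind are as described in the ParMETIS manual. */
idx_t *partition_elements(idx_t *elmdist, idx_t *eptr, idx_t *eind,
                          idx_t nparts, MPI_Comm comm)
{
    int rank;
    MPI_Comm_rank(comm, &rank);

    idx_t nlocal = elmdist[rank + 1] - elmdist[rank];  /* local elements */
    idx_t *part  = malloc(nlocal * sizeof(idx_t));

    idx_t wgtflag = 0;             /* no weights (yet)                 */
    idx_t numflag = 0;             /* C-style 0-based numbering        */
    idx_t ncon = 1;                /* one balance constraint           */
    idx_t ncommonnodes = 3;        /* tets: adjacent = shared face     */
    idx_t options[3] = {0, 0, 0};  /* use defaults                     */
    idx_t edgecut;

    real_t *tpwgts = malloc(ncon * nparts * sizeof(real_t));
    for (idx_t p = 0; p < nparts; p++)
        tpwgts[p] = (real_t)1.0 / nparts;  /* equal target part sizes */
    real_t ubvec[1] = {1.05};              /* 5% imbalance tolerance  */

    ParMETIS_V3_PartMeshKway(elmdist, eptr, eind, NULL, &wgtflag,
                             &numflag, &ncon, &ncommonnodes, &nparts,
                             tpwgts, ubvec, options, &edgecut, part, &comm);

    free(tpwgts);
    return part;  /* part[i] = partition of local element i */
}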

As for adaptive repartitioning with weights: first you need to call ParMETIS_V3_Mesh2Dual to get the dual graph of the mesh. Then you can use this graph in the ParMETIS_V3_AdaptiveRepart routine. The weight for each vertex in partition x will be set to #particles-in-partition-x / #elements-in-partition-x. Of course, since the weights are integer quantities, you may need to round them up/down. To deal with small fractional parts, you can always multiply the weights by 10 prior to rounding.
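
As a sketch of that recipe, under the same assumptions as the earlier sketch (ncon = 1; npart_particles and npart_elements are hypothetical names for the per-partition totals, assumed available on every process; the factor of 10 is the scaling mentioned above):

#include <stdlib.h>
#include <mpi.h>
#include <parmetis.h>

/* Sketch: build the dual graph, weight each dual vertex (= element) by
   particles-per-element of its current partition, then adaptively
   repartition.  part[] holds the current partition of each local
   element on input and the new one on output. */
void repartition_with_weights(idx_t *elmdist, idx_t *eptr, idx_t *eind,
                              idx_t nparts, idx_t *part,
                              const long *npart_particles,
                              const long *npart_elements, MPI_Comm comm)
{
    int rank;
    MPI_Comm_rank(comm, &rank);
    idx_t nlocal = elmdist[rank + 1] - elmdist[rank];

    /* 1. Dual graph of the mesh; ParMETIS allocates xadj/adjncy. */
    idx_t numflag = 0, ncommonnodes = 3;
    idx_t *xadj, *adjncy;
    ParMETIS_V3_Mesh2Dual(elmdist, eptr, eind, &numflag, &ncommonnodes,
                          &xadj, &adjncy, &comm);

    /* 2. Integer vertex weights: 10 * particles/elements of the
          vertex's current partition, rounded to nearest. */
    idx_t *vwgt = malloc(nlocal * sizeof(idx_t));
    for (idx_t i = 0; i < nlocal; i++) {
        idx_t p = part[i];
        double w = 10.0 * (double)npart_particles[p] / npart_elements[p];
        vwgt[i] = (idx_t)(w + 0.5);
    }

    /* 3. Adaptive repartition of the dual graph.  elmdist doubles as
          vtxdist because the dual-graph vertices ARE the elements. */
    idx_t wgtflag = 2;   /* vertex weights only */
    idx_t ncon = 1, edgecut;
    idx_t options[4] = {0, 0, 0, 0};
    real_t *tpwgts = malloc(nparts * sizeof(real_t));
    for (idx_t p = 0; p < nparts; p++)
        tpwgts[p] = (real_t)1.0 / nparts;
    real_t ubvec[1] = {1.05};
    real_t ipc2redist = 1000.0;  /* communication vs. redistribution */

    ParMETIS_V3_AdaptiveRepart(elmdist, xadj, adjncy, vwgt, NULL, NULL,
                               &wgtflag, &numflag, &ncon, &nparts,
                               tpwgts, ubvec, &ipc2redist, options,
                               part, &comm);

    free(vwgt);
    free(tpwgts);
    /* xadj/adjncy were allocated by the library; release them with the
       allocator your ParMETIS build documents (free() in older builds). */
}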

RE: Using weights

Thanks for your reply.

Re using weights: you are suggesting different routines for adaptive repartitioning. Does this mean I should not re-use ParMETIS_V3_PartMeshKway with weights? If so, why not? What is the purpose of the weights in this routine, if not to partition with weighted elements, whether the weights are known before first use or not?

Thank You

Mike Pettipher

RE: You can certainly use

You can certainly use ParMETIS_V3_PartMeshKway with weights. However, when people talk about repartitioning they typically also want to ensure that the new partitioning is not significantly different from the old one. The adaptive routines in ParMetis optimize for that as well, whereas the ParMETIS_V3_PartMeshKway routine does not.

RE: Using weights

Thanks for your reply.

I am trying to use weights to improve the load balancing of the partitioning, and would appreciate some advice on what sort of improvement might be possible.

The system has 120000 elements (in the FE context) and starts with one million particles. The initial basic, parallel partitioning given to ParMetis is based on simply dividing the elements equally over the number of partitions – initially 4 partitions, i.e. 30000 elements per partition.

Particles are not distributed until after ParMetis has been called. The initial distribution of particles is not uniform – it takes into account the location of the elements. On 4 partitions, the particle count per partition varies from about 214000 to 285000, a range of about 70000. The particle count per element varies from 0 to about 150.

After running for some time, the particles should converge towards the elements with the most activity, potentially leading to a greater imbalance in the number of particles per partition. Perhaps surprisingly, after 300 steps, the range in particle count per partition drops from 70000 to 22000. Assuming this better reflects the ultimate particle distribution, I thought that providing updated particle counts per element as weights (via elmwgt) should result in a better balanced partitioning, in terms of total particle counts per partition.

Thus elmwgt[i] is set to the particle count per element (between 0 and 150), and wgtflag to 2 (in ParMETIS_V3_PartMeshKway), using the initial basic partitioning with equal numbers of elements per partition.
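
For reference, here is how I have set this up (a sketch; it reuses the declarations from the PartMeshKway sketch earlier in the thread, and particles_per_elm is an illustrative name for my per-element particle counts):

/* Sketch: supply per-element particle counts as element weights.
   nlocal, elmdist, eptr, eind, numflag, ncon, ncommonnodes, nparts,
   tpwgts, ubvec, options, edgecut, part and comm are as in the
   earlier sketch. */
idx_t *elmwgt = malloc(nlocal * sizeof(idx_t));
for (idx_t i = 0; i < nlocal; i++)
    elmwgt[i] = (idx_t)particles_per_elm[i];  /* 0..150 in this case */

idx_t wgtflag = 2;  /* element (vertex) weights are supplied */

ParMETIS_V3_PartMeshKway(elmdist, eptr, eind, elmwgt, &wgtflag,
                         &numflag, &ncon, &ncommonnodes, &nparts,
                         tpwgts, ubvec, options, &edgecut, part, &comm);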

However, if ParMetis is called with these weights or multiples of them, the range in particle counts per partition increases from 22000. Strangely, if the actual particle count is used as the weight, the range increases to 23000, while if the weight is scaled up by a factor of 5, it increases to 25000. I assumed a scale factor would not make any difference to the partitioning.

Should using weights in this way have the effect I am looking for – a reduction in the range of total particle count per partition? If not what effect should be expected, and is there a way to achieve what I am looking for?

Thank You for your help.

Michael Pettipher

RE: Michael, In principle by

Michael,

In principle, putting weights on the elements should improve the balance, and the result should not be affected by the scaling, unless the sum of the weights overflows a signed 32-bit int. Now, in terms of balance, if I understand it correctly, the initial max difference was 70K and after putting in the weights it gets down to 22K-23K, which is better. Right?
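
A quick way to test for that overflow (a sketch; assumes ncon = 1 and a 32-bit idx_t build of ParMETIS):

#include <limits.h>
#include <mpi.h>
#include <parmetis.h>

/* Sketch: returns nonzero if the global sum of the element weights
   would overflow a signed 32-bit integer, in which case the weights
   should be rescaled before calling ParMETIS. */
int weights_would_overflow(const idx_t *elmwgt, idx_t nlocal, MPI_Comm comm)
{
    long long local = 0, total = 0;
    for (idx_t i = 0; i < nlocal; i++)
        local += elmwgt[i];
    MPI_Allreduce(&local, &total, 1, MPI_LONG_LONG, MPI_SUM, comm);
    return total > INT_MAX;
}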

george

RE: Using weights

Dear George,

Thanks for your reply.

I agree the particle balancing is better after re-partitioning, but the re-partitioning occurs only after running for at least 300 steps, by which time the particle balancing has already improved significantly. As the particle re-distribution takes into account the particle distribution at the end of the previous run, it is not clear to what extent the improvement in particle balancing after re-partitioning results from the re-partitioning itself, from the improvement during the previous run, or from a combination of both. (Ignore this if it doesn't make sense!) Unless I can untangle these two factors and establish otherwise, I will accept your conclusion that using weights is doing what is intended.

Many Thanks

Mike Pettipher
