partdmesh stops giving equally-sized domains

I was testing METIS 5.0 and METIS 4.0.

In two unrelated programs that both use partdmesh for partitioning, the cell counts are no longer uniform. With METIS 3.0 I used to get a "Balance" of 1.0; with METIS 4.0 and 5.0 I get a "Balance" of 3.0. I followed the code and the change appears to be in the function METIS_PartGraphKway.

Well, I used tracing tools to compare the load balance before (3.0) and after (4.0/5.0) the change, and there is indeed load imbalance, sorry! Anyway, the manual says (section 5.3 and section 3, "What is new") that this function was changed to reduce the communication volume. Is there any way to request the old behavior, as in METIS 3.0?

If the partitioning is done using, for example, METIS_PartGraphRecursive, the result is the expected one. What exactly has changed in METIS_PartGraphKway?

RE: This is something that is on my "todo" list

This is something that is on my "todo" list for 5.0. Hopefully I'll get back to that code early next month.

RE: I read my post again, and I

I read my post again, and I have to admit that maybe I was unable to express myself clearly. I am really sorry about that :)

To add more detail on the subject:

The load imbalance comes from the fact that, using partdmesh in METIS 4.0, I get a difference in the cell counts: some subdomains have many more cells than others. I read that METIS_PartGraphKway was modified; I cite: "A new refinement algorithm has been added that also minimizes the connectivity of the subdomains. This new algorithm has been made the default option" (User's Guide, page 7).

Well, METIS_PartGraphKway is a wrapper around METIS_WPartGraphKway; from there I cannot follow the code any further (I would need to reread all my graph-coloring material all over again).
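For reference, as far as I can tell from the METIS 4.0 source, the wrapper does little more than build a uniform array of target partition weights and pass everything on to METIS_WPartGraphKway, where the refinement is actually done. A minimal sketch of that relationship (not the actual library code; the function name PartGraphKway_sketch is mine):

    /* Sketch of the wrapper relationship as I read it in the METIS 4.0 source:
     * METIS_PartGraphKway essentially builds uniform target partition weights
     * (tpwgts) and hands everything to METIS_WPartGraphKway. */
    #include <stdlib.h>
    #include <metis.h>

    void PartGraphKway_sketch(int *nvtxs, idxtype *xadj, idxtype *adjncy,
                              idxtype *vwgt, idxtype *adjwgt, int *wgtflag,
                              int *numflag, int *nparts, int *options,
                              int *edgecut, idxtype *part)
    {
        int i;
        float *tpwgts = malloc(*nparts * sizeof(float));

        for (i = 0; i < *nparts; i++)
            tpwgts[i] = 1.0f / *nparts;   /* equal target size for every subdomain */

        METIS_WPartGraphKway(nvtxs, xadj, adjncy, vwgt, adjwgt, wgtflag,
                             numflag, nparts, tpwgts, options, edgecut, part);

        free(tpwgts);
    }

So the balance target itself is still uniform, which is consistent with the idea that the difference comes from the new refinement algorithm quoted above.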

However, this "imbalance" coming from METIS_PartGraphKway seems to be a feature rather than anything else. I traced three different applications (two fluid-dynamics codes and a geophysical one); all three use unstructured grids of tetrahedra (I would need to generate grids of other element shapes, but there is not much time :( ). I tried grids of different sizes, and equal cell counts seem to matter more than communication (which, on the other hand, is relatively fast).

In one example with ~600,000 cells and 32 subdomains, the difference was up to 4,000 cells per subdomain; in terms of load imbalance that is roughly 15%.
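To make that figure concrete, something along these lines can be used to measure it (just a sketch, assuming idxtype is int; epart is the element partition array returned by METIS_PartMeshDual, ne the number of cells, nparts the number of subdomains): count the cells per subdomain and divide the largest count by the ideal size ne/nparts. A 15% imbalance corresponds to a factor of about 1.15.

    #include <stdlib.h>

    /* Ratio of the largest subdomain to the ideal size ne/nparts;
     * 1.0 means perfectly balanced cell counts. */
    double cell_imbalance(int ne, int nparts, const int *epart)
    {
        int *count = calloc(nparts, sizeof(int));
        int i, max = 0;

        for (i = 0; i < ne; i++)
            count[epart[i]]++;            /* cells per subdomain */
        for (i = 0; i < nparts; i++)
            if (count[i] > max)
                max = count[i];

        free(count);
        return (double)max * nparts / ne;
    }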

What I did was simply wrap partdmesh, METIS_PartMeshDual and pmetis to use METIS_PartGraphRecursive instead of METIS_PartGraphKway, and the load balance was the expected one (the same number of cells in all subdomains); a rough sketch is below.
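In case it is useful to somebody, this is roughly what the wrapper ends up doing (a sketch against the METIS 4.0 C API as I understand it; the function name PartMeshDualRecursive is mine, idxtype is assumed to be int, and etype 2 means tetrahedra): build the dual graph with METIS_MeshToDual and partition it with METIS_PartGraphRecursive instead of METIS_PartGraphKway.

    #include <stdlib.h>
    #include <metis.h>

    /* Sketch: partition the dual graph of a tetrahedral mesh with recursive
     * bisection instead of k-way partitioning (METIS 4.0 API assumed). */
    void PartMeshDualRecursive(int ne, int nn, idxtype *elmnts,
                               int nparts, int *edgecut, idxtype *epart)
    {
        int etype = 2;                    /* 2 = tetrahedra in METIS 4.0 */
        int numflag = 0, wgtflag = 0, options[5] = {0};

        /* a tetrahedron has at most 4 face neighbours, so 4*ne bounds adjncy */
        idxtype *xadj   = malloc((ne + 1) * sizeof(idxtype));
        idxtype *adjncy = malloc(4 * (size_t)ne * sizeof(idxtype));

        METIS_MeshToDual(&ne, &nn, elmnts, &etype, &numflag, xadj, adjncy);

        /* recursive bisection keeps the subdomain cell counts essentially equal */
        METIS_PartGraphRecursive(&ne, xadj, adjncy, NULL, NULL, &wgtflag,
                                 &numflag, &nparts, options, edgecut, epart);

        free(xadj);
        free(adjncy);
    }

With this in place all subdomains come out with (almost) the same number of cells, presumably at the price of a somewhat larger edge cut, but as I said communication is relatively fast in these applications.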

Since METIS and pmetis are among the most widely used partitioning algorithms/libraries, I wanted to point out what I saw. Unfortunately my knowledge of graph theory is really vague, and thus I can't go any further.

I want to mention that, for example, Zoltan uses METIS for dynamic load balancing, and since this has to be done at run time the effect would be more noticeable (by measuring time or tracing the application, but few people do that). Moreover, there are many other codes that use METIS, and maybe they haven't noticed this new feature and still call METIS_PartGraphKway as they did in METIS 3.0.

Well, thanks for your previous answer; I just wanted to mention this.

PS: It was difficult to read the images; maybe I am going blind. Can you make them a little bit clearer? :)