The fully constrained model below (with BCs on 16 700 of its ~37 000 nodes) crashes with a system out-of-memory error while applying the BCs, even though there was at least 50 GB of free RAM when the model started solving. With fewer BCs, the model solves just fine, and both versions also work with CCX. Is there a limit in Mecway, or is this due to some setting on my system? The same thing also happens with versions 6 and 7.
Comments
I'm having trouble replicating your results. I tried both fixed support and frictionless support on 36 000 nodes of a 116 000-node model (attached), and it used about 1-2 GB extra during the "Applying constraints" stage.
Could it be there's a 0 missing from your numbers?
Are you using the 32-bit version? That could cause this kind of problem, since a 32-bit process can only address a few GB no matter how much RAM is installed. Help -> About shows whether it's 32-bit or 64-bit.
Your model works fine on my machine, and I am using the 64-bit version.
I have attached the model. It's about 5 times larger than yours and uses parabolic elements; maybe that has some effect.
I've reproduced it with your model. It appears to be caused by memory fragmentation, and I've now fixed it for the next version, so thanks for bringing it up. The trigger is the combination of a large number of nodes with a large number of constraints.
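For anyone wondering how an out-of-memory error can happen with 50 GB of RAM free: fragmentation means the free memory exists only as scattered holes, so a single large contiguous request can still fail. Here's a toy C sketch of the classic pattern (not Mecway's actual code; the block sizes and counts are made up):

```c
#include <stdio.h>
#include <stdlib.h>

#define N     1000
#define BLOCK (1 << 20)           /* 1 MiB per block */

int main(void) {
    static char *blocks[N];

    /* Fill the heap with many medium-sized blocks (~1 GiB total). */
    for (int i = 0; i < N; i++) {
        blocks[i] = malloc(BLOCK);
        if (!blocks[i]) { puts("initial allocation failed"); return 1; }
    }

    /* Free every other block: half the memory is now free again,
       but only as 1 MiB holes scattered through the address space. */
    for (int i = 0; i < N; i += 2) {
        free(blocks[i]);
        blocks[i] = NULL;
    }

    /* One request larger than any single hole can now fail in a tight
       (e.g. 32-bit) address space, even though ~500 MiB is free in
       total. On 64-bit it will usually succeed; the point is the pattern. */
    char *big = malloc((size_t)N / 2 * BLOCK);
    printf("large allocation %s\n", big ? "succeeded" : "failed (fragmentation)");

    free(big);
    for (int i = 1; i < N; i += 2) free(blocks[i]);
    return 0;
}
```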
To work around it for now, you might have to reduce the number of nodes in the mesh. It looks quite inefficient because the element density is uniform. You could increase the Max. element size and Size grading in Meshing parameters; if that leaves some parts too coarse, increase the Min. number of elements per curve/edge or add local refinements. A rough estimate of how much this can save is sketched below.
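As a back-of-envelope illustration of why increasing Max. element size helps (the numbers here are assumptions, not measurements from the actual model): the element and node count of a 3D mesh scales roughly with the inverse cube of the characteristic element size, so even a modest size increase cuts the node count sharply.

```c
/* Rough mesh-size estimate -- illustrative only. */
#include <stdio.h>
#include <math.h>

int main(void) {
    double nodes_now = 37000.0;  /* node count from the original post */
    double factor    = 1.5;      /* hypothetical increase in element size */

    /* 3D element (and hence node) count scales roughly with 1/h^3. */
    printf("~%.0f nodes after re-meshing with %.1fx larger elements\n",
           nodes_now / pow(factor, 3.0), factor);
    return 0;
}
```

With these assumed numbers, making the elements 1.5x larger would drop the mesh from ~37 000 to roughly 11 000 nodes.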