Allocate failure

Hello,

After installing more memory in my laptop, I ran a nonlinear static 3D analysis to verify that this time the solver would not exit due to lack of memory. The internal solver was OK, but CCX was not: it produced no result. The reason seems to be an "allocate failure". Any suggestions?

The computer always had at least 10 GB of free memory available.

I deleted the mesh to upload a light file. To obtain the mesh I used: Generate mesh --> Refine x2 on all the generated elements, which gives 251,617 nodes.

Comments

  • This number of nodes looks like it's about at the limit for the default CCX build, which I think is restricted to 16 GB, so you might be over the limit with only 10 GB free.

    I can't reproduce your model because the STEP file isn't included in the .liml file, but I tried a cube of hex8 elements with these node counts:
    202,800 nodes - no ALLOCATE failure message, but it froze for at least half an hour. Not sure if it was solving or not.
    126,750 nodes - seems to solve.

    A cube is almost the worst-case geometry for memory use, so a real model could probably be bigger than that.

    It might be worth asking about that error message in the CCX forum https://groups.yahoo.com/neo/groups/CALCULIX/conversations/messages . I've never seen it before but someone there probably knows.
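As a rough sanity check on whether a given node count can plausibly hit a memory limit, here is a back-of-envelope sketch. All the constants in it (3 DOFs per node, ~80 nonzeros per stiffness-matrix row, ~10x factorization fill-in) are assumptions for illustration, not CCX's actual figures, and real usage depends heavily on mesh connectivity:

```python
def estimate_memory_gb(num_nodes, nnz_per_row=80, fill_in=10):
    """Crude direct-solver memory estimate for a 3D solid model.

    Assumptions (not CCX's real internals): 3 displacement DOFs per
    node, ~80 nonzeros per matrix row, sparse storage of 12 bytes per
    entry (8-byte value + 4-byte index), and ~10x fill-in during
    factorization.
    """
    dofs = 3 * num_nodes                      # ux, uy, uz per node
    bytes_per_entry = 12
    matrix_bytes = dofs * nnz_per_row * bytes_per_entry
    return matrix_bytes * fill_in / 1e9

for n in (126750, 202800, 251617):
    print(f"{n:>7} nodes: ~{estimate_memory_gb(n):.1f} GB")
```

By this crude estimate, ~250,000 nodes already lands in the several-GB range, so it's at least plausible that a single allocation request failed even with some memory free.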
  • I think I wasn't clear about the memory. I meant that during solving, the memory used by Mecway grows, and at the peak of its memory use the computer still had 10 GB free.

    OK, perhaps I will try that CalculiX group. Thanks.

  • Oh. That does sound wrong, unless you have ~30 GB or more of RAM, in which case it's reasonable to have 10 GB unused by both CCX and the internal solver.

    I'm running some benchmarks and will post graphs tomorrow showing my observations of memory use for various model sizes, so you can see if your situation fits the trend.
  • No, 10 GB free is not wrong. As I said, I increased the memory of my work computer; it now has 24 GB, so the memory figures look normal: with a problem of around 230,000 nodes, the peak used by the nonlinear CCX solver was 9 GB, and many GB were still free. I was just pointing out that the "allocate failure" should not be caused by a shortage of memory.
  • Here are some tests. Connectivity of the mesh has a huge effect on memory requirements so I used two models at opposite extremes - a cube and a 1-element-thick plate. Real models would usually be somewhere between the two.

    Your 250,000 nodes sounds like it could have been too much if it's a very potato-shaped object. It would be interesting if you could upload the STEP file to try it with.

    All tests used unstructured meshes with tet10 elements and the Nonlinear Static 3D analysis type with 2 iterations (1 slow solve), run on the CCX 2.11 that's included in Mecway 9. Memory use is from Task Manager. The computer has 16 GB of RAM.
  • I uploaded the STEP file. Thank you for sharing those test results.

    I don't understand: why do you say that "250,000 nodes sounds like it could have been too much if..."? In all of these simulations I had plenty of memory available, according to the Windows resource monitor.

    Anyway, I repeated the tests, as I got the allocate failure with CCX in Static 3D too. This is becoming a problem, because I would like to use CCX (at least for linear Static) to avoid the incompatibility of constraints that occurred with the internal solver.

    These were the results, using the STEP file (from the simulation in my first message) and reducing the maximum and minimum size of the tet elements:
    — Static 3D - CCX - 37,900 nodes: no problem
    — Static 3D - CCX - 135,000 nodes: no problem
    — Static 3D - CCX - 205,600 nodes: no problem
    — Static 3D - CCX - 399,000 nodes: Allocate failure. I still had 5 GB of free memory.

    I attached the simulation using a hex mesh extruded from a DXF sketch. It is a somewhat simpler model than the previous one. I "refined 2x" a portion of the elements:
    — Static 3D - CCX - 102,400 nodes: no problem
    — Static 3D - CCX - 256,000 nodes: Allocate failure. I still had 3.5 GB of free memory.


    I still need to contact that Yahoo group about CalculiX.


  • Hi Guillermo, just to check: using CCX 2.14 I was able to solve your last model easily.
  • edited October 2018
    Thanks Sergio.
    So was I. That's why I wrote "no problem" for the 102,400-node case. I got the failure during solving after refining a set of elements of the component, resulting in roughly 256,000 nodes.
  • 290,000 nodes solved without issues.

    This does not seem to be a limitation of CCX, at least not of ccx_2.14_MT.
    PC use is intensive, but Windows manages memory properly through the hard drive (paging) if needed.
    I only have 8 GB.
    Try CCX 2.14; the link is in the forum.
  • It looks like the problem was solved by using the latest version of CCX, 2.14. I was using 2.13. The allocate failure also occurred with two older versions that I tried.

    Thanks
  • edited November 2018
    I am still facing the allocation problem even after using CCX 2.14_MT. Link to my .liml file:
    https://we.tl/t-jCEzVFD5PW
  • edited November 2018
    I found that it ran a few iterations then exited without a solution on 2.11 single-threaded.

    I guess you'd have to reduce the mesh size. A major area is the interior, where it probably doesn't need to be nearly as fine as it is. You could also replace the uniform-thickness parts with wedge and hex elements extruded from their inner surfaces. That should be more efficient than tets. Even better, you could extrude them from the surfaces of their underlying parts and get rid of all the bonded contacts.
  • Hey @Victor, it worked when I reduced the Young's modulus of the cap on the tooth. Is there a reason for this behaviour? And how can I define a body as rigid?
  • edited November 2018
    I don't know how changing material properties would help with an allocate failure, but maybe that somehow keeps it under the limit, or maybe it was by chance?

    There's no special rigid body feature without defining it through CCX cards, but you could use a very high Young's modulus with a linear material instead of a hyperelastic one.
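If you do want the CCX-card route, CalculiX has a *RIGID BODY keyword that ties a node or element set to a reference node. A minimal sketch (the set name CAP and the node numbers are made up for illustration, not taken from the model in this thread):

```
** Hypothetical deck fragment: make element set CAP behave rigidly.
** CAP, 9999 and 9998 are illustrative names/numbers only.
*RIGID BODY, ELSET=CAP, REF NODE=9999, ROT NODE=9998
```

Loads and boundary conditions then go on the reference node. The stiff-linear-material trick avoids cards entirely, but a very large stiffness contrast can worsen the conditioning of the stiffness matrix.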