Hello,
After installing more memory in my laptop, I ran a Nonlinear Static 3D analysis to verify that this time the solver would not exit due to lack of memory. The internal solver was OK, but CCX was not: it produced no results. The reason appears to be an "allocate failure". Any suggestions?
The computer always had at least 10 GB of free memory available.
I deleted the mesh to upload a lighter file. To reproduce the mesh I worked with: Generate mesh --> Refine x2 on all the generated elements. This gives 251,617 nodes.
Comments
I can't reproduce your model because the STEP file isn't included in the .liml file, but I tried it with a cube of hex8 elements at these node counts:
202,800 nodes - no ALLOCATE failure message, but it froze for at least half an hour. Not sure whether it was solving or not.
126,750 nodes - seems to solve.
A cube is almost the worst-case geometry for memory use, so a real model could probably be bigger than that.
It might be worth asking about that error message in the CCX forum https://groups.yahoo.com/neo/groups/CALCULIX/conversations/messages . I've never seen it before, but someone there probably knows.
OK, perhaps I will try that CalculiX group. Thanks.
I'm running some benchmarks and will post graphs tomorrow showing my observations of memory use for various model sizes, so you can see if your situation fits the trend.
Your 250,000 nodes sounds like it could have been too much if it's a very potato-shaped object. It would be interesting if you could upload the STEP file so I can try it.
All unstructured meshes with tet10 elements and the Nonlinear Static 3D analysis type with 2 iterations (1 slow solve). CCX 2.11 as included in Mecway 9. Memory use is from Task Manager. The computer has 16 GB of RAM.
I do not understand: why do you say that "250,000 nodes sounds like it could have been too much if..."? In all of these simulations I had plenty of memory available, according to the Windows resource monitor.
Anyway, I repeated the tests, as I got the allocate failure with CCX in Static 3D too. This is becoming a problem, because I would like to use CCX (at least for linear Static) to avoid the constraint incompatibility that occurred with the internal solver.
These were the results, using the STEP file (from the simulation in my first message). I progressively reduced the maximum and minimum size of the tetra elements:
— Static 3D - CCX - 37,900 nodes: no problem
— Static 3D - CCX - 135,000 nodes: no problem
— Static 3D - CCX - 205,600 nodes: no problem
— Static 3D - CCX - 399,000 nodes: Allocate failure. I still had 5 GB of free memory.
I attached a simulation using a hex mesh extruded from a DXF sketch. It is a bit simpler than the previous model. I "refined x2" a portion of the elements:
— Static 3D - CCX - 102,400 nodes: no problem
— Static 3D - CCX - 256,000 nodes: Allocate failure. I still had 3.5 GB of free memory.
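For scale, each node of a 3D solid mesh carries three translational degrees of freedom, so the cases above correspond to the following system sizes. A rough back-of-the-envelope sketch (the factor of 3 is the standard DOF count for solid elements, not a number CCX reports):

```python
# Back-of-the-envelope DOF counts for the node counts reported above.
# 3 translational DOFs per node is standard for 3D solid (tet/hex) elements.
cases = {
    "tet mesh, allocate failure": 399_000,
    "hex mesh, allocate failure": 256_000,
    "hex mesh, solved ok": 102_400,
}

for name, nodes in cases.items():
    dofs = 3 * nodes  # size of the global equation system
    print(f"{name}: {nodes:,} nodes -> {dofs:,} equations")
```

So the failing hex case is roughly a 768,000-equation system; actual solver memory depends heavily on the matrix bandwidth, which is why a compact "potato-shaped" body is close to the worst case.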
I still have to contact that Yahoo group about CalculiX.
So was I. For that reason I wrote "no problem" for the 102,400-node case. I got the failure during solving after refining a set of the component's elements, resulting in roughly 256,000 nodes.
This does not seem to be a limitation of CCX, at least not of ccx_2.14_MT.
Intensive PC use, but Windows manages the memory properly, paging to the hard drive if needed.
I only have 8 GB.
Try CCX version 2.14. The link is in the forum.
Thanks
https://we.tl/t-jCEzVFD5PW
I guess you'd have to reduce the mesh size. A major contributor is the interior, where the mesh probably doesn't need to be nearly as fine as it is. You could also replace the uniform-thickness parts with wedge and hex elements extruded from their inner surfaces. That should be more efficient than tets. Even better, you could extrude them from the surfaces of their underlying parts and get rid of all the bonded contacts.
There's no special rigid body feature without defining it through CCX cards, but you could use a very high Young's modulus with a linear material instead of a hyperelastic one.
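If you do want a true rigid body through CCX cards, a minimal sketch using the *RIGID BODY keyword might look like this (the element set name Erigid and node numbers 10001/10002 are placeholders for illustration; check the CalculiX manual for the exact options your version supports):

```
** Sketch only: assumes an element set named Erigid already exists in the deck
** and that 10001/10002 are unused node numbers for the reference/rotation nodes.
*RIGID BODY, ELSET=Erigid, REF NODE=10001, ROT NODE=10002
** Loads and boundary conditions are then applied to the reference node, e.g.:
*BOUNDARY
10001, 1, 3, 0.0
```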