I have been running a study on a trailer coupling ball.
For the first run I used the default mesh density (not a draft-quality mesh).
Results: Deformation is as expected.
Max stress (node) = 729 N/mm^2
Max stress (element) = 582 N/mm^2
Max displacement = 1.605 mm
So I set a mesh control to refine the mesh in the area of interest, and I increased the mesh density in that area until the difference between the node and element stress values was within 10%.
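For reference, this is roughly the convergence check I was applying between runs (a minimal sketch in Python; the only inputs are the first-run stresses above and the 10% threshold I picked, nothing solver-specific):

```python
def node_element_gap(node_stress, element_stress):
    """Relative difference between the max nodal and max elemental stress,
    expressed as a fraction of the elemental value."""
    return abs(node_stress - element_stress) / element_stress

# First-run values from the default mesh (N/mm^2)
gap = node_element_gap(729.0, 582.0)
print(f"Node/element stress gap: {gap:.1%}")  # ~25%, well above the 10% target

# Conceptually, the local mesh control is tightened until the gap
# drops below the chosen threshold.
TARGET = 0.10
print("Converged" if gap <= TARGET else "Refine mesh further")
```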
Results:
1. Displacement (decreased from the first run)
2. Node stress (increased)
3. Element stress (increased)
Can anybody explain how the stress can have increased while the displacement decreased? This trend appeared gradually in the results as I refined the mesh.
Also, the calculated max stress is roughly 40% higher than the fatigue limit, but the component has passed a 2 million cycle physical test at this loading.
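To put rough numbers on that (a back-of-the-envelope sketch only; the fatigue limit here is just back-calculated from the "roughly 40% higher" figure, and I've used the first-run nodal peak as a stand-in since I haven't posted the refined values):

```python
# Back-of-the-envelope check: if the calculated peak stress is ~40% above the
# fatigue limit, the implied endurance limit is peak / 1.4.
peak_stress = 729.0                        # first-run max nodal stress, N/mm^2 (placeholder)
implied_fatigue_limit = peak_stress / 1.4
print(f"Implied fatigue limit: {implied_fatigue_limit:.0f} N/mm^2")   # ~521 N/mm^2

utilisation = peak_stress / implied_fatigue_limit
print(f"Utilisation vs. fatigue limit: {utilisation:.2f}")            # ~1.4 on paper
```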
Any help would be greatly appreciated.
Can you post the model itself so that I can take a look at it? The only thing I can think of off the top of my head is that there might be a highly distorted element in that area (due to the CAD geometry or something else) that is causing wildly inaccurate results.
That being said, it's very odd that your displacement decreased while your stress increased, and that the displacement dropped as you increased the number of DOF in the model; it's typically the other way around.
Regarding the discrepancy between your FEA model and your physical test results, are you sure that the model setup (loads, constraints, material properties, analysis type, etc.) closely matches the physical test?