I have an application for a threaded rod separating two plates that are sheared relative to one another. The image below shows a test jig in which the threaded rod is simulated by a smooth rod "bonded" (as opposed to actually threaded) into equal-diameter holes in the two blocks:
Here's a blow-up (for a particular mesh-control density) of the bottom joint, where the maximum stress exceeds the 20,000 psi yield strength of the steel:
I observe that the high-stress region decreases in area and increases in magnitude, apparently without limit, as mesh density is increased near the right-angle corner. I'm not trained as an engineer, but I understand that this behavior is typical of "re-entrant edges," where the theoretical stress can be unbounded. I have also modeled this configuration with fillets and observed that the stress converges nicely. Unfortunately, my application does not permit a fillet or other stress-relieving feature around the actual joints.
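For what it's worth, the non-convergence you describe matches the classical corner-singularity result: near a 90° re-entrant corner (270° interior angle) the elastic stress scales roughly like r^(λ−1) with λ ≈ 0.545, so the sampled peak grows without bound as the elements shrink. Here's a small illustrative sketch of that scaling (the coefficient is arbitrary; this is not your model's data):

```python
# Illustrative only: Williams-type corner singularity. For a 270-degree
# interior angle (90-degree re-entrant corner), the leading eigenvalue is
# lambda ~ 0.545, so stress ~ K * r**(lambda - 1) near the corner.
LAMBDA = 0.545  # singularity exponent (approximate, plane elasticity)

def peak_stress(element_size, k=1.0):
    """Stress sampled roughly one element away from the corner (arbitrary units)."""
    return k * element_size ** (LAMBDA - 1.0)

# Refining the mesh just moves the sample point closer to r = 0,
# so the reported peak keeps climbing:
for h in (0.1, 0.01, 0.001):
    print(f"element size {h:g} -> peak stress {peak_stress(h):.1f}")
```

That's why the fillet converges and the sharp corner doesn't: the fillet removes the singular exponent entirely.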
So my questions are:
1) What is the most realistic way to simulate such a threaded joint?
2) How can one establish a practical failure criterion in order to choose bolt diameter, etc.? -- John Willett
First - I'd model the rod for such a study with the root (minor) diameter of the thread. That way the result is directly comparable to a hand calculation.
Second - If your FEA shows stress higher than the yield value, it just means some plastic (permanent) deformation would be expected in that area. If it goes over the UTS for the material, then it would break IRL.
Failure criteria depend heavily on the application. How safe do you want it to be?
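One common way to frame that question is a factor of safety against yield and ultimate. A minimal sketch, where the 20,000 psi yield is from the post but the 60,000 psi UTS and the 17,800 psi working stress are assumed placeholders:

```python
# Assumed/illustrative numbers except yield_psi, which is from the post:
yield_psi = 20000.0    # yield strength (from the post)
uts_psi = 60000.0      # ultimate tensile strength (assumed)
working_psi = 17800.0  # nominal working stress from a hand calc (assumed)

fos_yield = yield_psi / working_psi  # margin before permanent set
fos_uts = uts_psi / working_psi      # margin before fracture
print(f"FoS vs yield = {fos_yield:.2f}, FoS vs UTS = {fos_uts:.2f}")
```

Whether a FoS of ~1.1 on yield is acceptable depends entirely on the application - structural codes often demand 2 or more, which is the "how safe do you want it to be" part.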