I'm performing a study on a bolt in which a 10-pound tensile load is applied to the bolt's threads while the bolt's head is fixed to ground. The simulation includes a thermal load of -30F applied to the bolt (i.e., I'm simulating a -30F environment; the thermal load was applied to the entire bolt body, not just the external faces). I have SW Simulation (included with SW Premium), but not Simulation Professional or Simulation Premium. I've run a baseline with no thermal load and compared the results of the two studies. I have a reasonably fine mesh (see image below).
The baseline result gives me a max stress of 6,399 psi, distributed as seen in the images below.
The -30F version gives me a max stress of 102,000 psi, distributed as seen in the images below.
Does anyone know why I'm getting such drastically different answers, and why the stress distribution is so radically different? (It maxes out where the threads start in the baseline, but at the fixed face in the "cold" version.) Even if I look at the stress in the cold version at the area where the threads start, I get a value of 19,000 psi, which is still much higher than the 6,399 psi I got at that same location in the baseline study. Any idea what might be going on here?
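For what it's worth, here's a quick hand check of the thermal stress a fully constrained bar would see from that temperature change, using sigma = E * alpha * dT. The steel properties and the 70F reference (stress-free) temperature are my assumptions, not values from the study:

```python
# Sanity check: thermal stress in a fully constrained bar, sigma = E * alpha * dT.
# Material values are assumed typical carbon steel; 70F is an assumed
# zero-strain reference temperature (the SW default may differ).
E = 29.0e6        # Young's modulus, psi (assumed)
alpha = 6.5e-6    # coefficient of thermal expansion, 1/degF (assumed)
T_ref = 70.0      # assumed stress-free reference temperature, degF
T_env = -30.0     # applied thermal load, degF
dT = T_ref - T_env  # 100 degF of cooling
sigma = E * alpha * dT
print(f"thermal stress if fully constrained: {sigma:,.0f} psi")
# -> thermal stress if fully constrained: 18,850 psi
```

That back-of-envelope number lands right around the 19,000 psi I'm seeing at the thread area, if that helps anyone diagnose this.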