I'm trying to determine internal stresses produced by a series of press fits onto a shaft. Please refer to the image so you can follow along.
Part (C) is a shaft; its centerline would be the bottom of the image. All parts can be considered rings. Please excuse the crude graphic; IP regulations prohibit my posting a screenshot of the actual assembly.
First part (A) is mounted to part (B).
Contact set (I) is an interference fit (press fit).
This causes the inner diameter of (B) to collapse slightly.
The inner diameter of the (A)/(B) assembly is then machined to compensate for this collapse.
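For sizing that compensating machining pass, a quick hand-check of the bore collapse is possible outside SW with the classic Lamé thick-cylinder equations. This is only a sketch; all radii, the interference, and the material values below are placeholder assumptions, not my actual dimensions.

```python
# Hedged sketch: estimate how much part B's bore closes under the press
# fit at contact set I. Lamé thick-cylinder theory, plane stress, both
# parts the same material. All numbers are placeholders.
E, nu = 200e9, 0.29               # steel, assumed

a, rc, b = 0.015, 0.025, 0.040    # B bore, contact I radius, A OD (m, assumed)
delta_r  = 10e-6                  # radial interference at I (m, assumed)

# Contact pressure at I from the interference:
outer = (b**2 + rc**2) / (b**2 - rc**2) + nu   # outer ring (A) compliance term
inner = (rc**2 + a**2) / (rc**2 - a**2) - nu   # inner ring (B) compliance term
p = delta_r * E / (rc * (outer + inner))

# Radial collapse of B's free bore (external pressure p on a ring a..rc):
bore_collapse = 2.0 * p * rc**2 * a / (E * (rc**2 - a**2))
print(f"contact pressure ~ {p/1e6:.1f} MPa, bore closes ~ {bore_collapse*1e6:.2f} um on radius")
```

With these placeholder numbers the collapse comes out in the single-digit-micron range, which is the order of material I machine back out.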
The (A)/(B) assembly is mounted onto the shaft (C).
Contact set (II) is a tight slip fit (clearance fit).
Then part (D) is mounted onto the (A)/(B)/(C) assembly.
Contact set (III) is an interference fit sized heavily enough that contact set (II) becomes an interference fit as well.
If I run an analysis on parts A, B, and C, SolidWorks shows an interference fit at contact set (II) because it does not account for the machining step that removed the excess internal material. This inflates the stresses once the entire assembly is analyzed. The assembly will also spin at high speed (~20,000 RPM), so I need to know accurately whether the interference fits will be maintained; a false interference fit invalidates the results.
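As a sanity check on whether a fit survives the spin, the closed-form rotating-disk solution can estimate how much interference is lost at ~20k RPM, independent of the SW model. Again a sketch only: the geometry, interference, and material properties below are placeholder assumptions, and it assumes plane stress and identical materials.

```python
# Hedged sketch: does an interference fit stay closed at ~20k RPM?
# Lame contact pressure + plane-stress rotating-annulus growth.
import math

E, nu, rho = 200e9, 0.29, 7850.0   # steel, assumed

def lame_pressure(delta_r, a, rc, b):
    """Contact pressure from radial interference delta_r at radius rc.
    Inner ring spans a..rc, outer ring spans rc..b, same material."""
    outer = (b**2 + rc**2) / (b**2 - rc**2) + nu
    inner = (rc**2 + a**2) / (rc**2 - a**2) - nu
    return delta_r * E / (rc * (outer + inner))

def spin_growth(r, a, b, omega):
    """Radial displacement at radius r of a free rotating annulus a..b
    (classic plane-stress rotating-disk stresses, u = r(st - nu*sr)/E)."""
    k  = (3.0 + nu) / 8.0 * rho * omega**2
    sr = k * (a**2 + b**2 - (a * b / r)**2 - r**2)
    st = k * (a**2 + b**2 + (a * b / r)**2) - (1.0 + 3.0 * nu) / 8.0 * rho * omega**2 * r**2
    return r * (st - nu * sr) / E

# Placeholder geometry for one fitted pair (radii in metres):
a, rc, b = 0.010, 0.020, 0.035
delta_r  = 15e-6                       # radial interference, assumed
omega    = 20000 * 2 * math.pi / 60    # ~20k RPM in rad/s

p0 = lame_pressure(delta_r, a, rc, b)
# The fit opens by the difference in free centrifugal growth at the interface:
loss = spin_growth(rc, rc, b, omega) - spin_growth(rc, a, rc, omega)
print(f"static pressure ~ {p0/1e6:.1f} MPa")
print(f"interference lost at speed ~ {loss*1e6:.2f} um of {delta_r*1e6:.0f} um")
```

If the lost interference approaches the as-built interference, the fit opens at speed, which is exactly why a phantom interference at contact set (II) in the model would corrupt this judgment.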
My question is this: Is there any way to have SW factor in that there has been material removed after the initial deflection?
Simply constraining that face produces higher stresses than would exist in reality.
Please do not suggest design changes as there are other factors requiring this construction.
Thanks in advance