I am trying to find the maximum temperature rise in an IC (a TEC driver, the ADN8834) when it is supplied with the theoretical input power in watts. In my simulation I have included the PCB layer details for heat dissipation through the PCB, fan characteristics for airflow, the thermal resistance of the IC package from the datasheet, perforated plates, and radiation from the IC surface.
The electronics team indicated that the peak power consumption of the IC is 2.61 W, so I modelled 2.61 W applied to the junction node of the two-resistor component. When I ran the simulation, the resulting temperature was very high, around 250 degrees Celsius.
Since the supplied power was the peak power, we decided to derate it to 50% of peak (about 1.3 W) and simulated again. But the temperature was still around 180 degrees Celsius.
Meanwhile, the electronics team built a test circuit and measured the actual temperature rise in the IC. Surprisingly, the temperature rise was merely 2 degrees Celsius above ambient. I have no clue why this happened.
I have a few questions:
1) In the actual scenario, if the IC is supplied with 2 W of power, will the entire 2 W be converted into heat, or should I consider that the IC is, say, 80-90% efficient, so that only the remaining 10-20% of the total supplied power is dissipated as heat?
2) Why is the temperature rise on the actual board as low as 2 degrees Celsius, while the hand calculation for the same scenario gives around 120 degrees Celsius with an ambient of 25 degrees Celsius?
Using the formula:
Tj = Ta + (power * thermal resistance)
with a junction-to-ambient resistance of 37 degrees Celsius/watt (the junction-to-case resistance is 1.65 degrees Celsius/watt):
Tj = 25 + (2.6 * 37) = 121.2 degrees Celsius
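For reference, the hand calculation above can be sketched in a few lines of Python, with an optional "dissipated fraction" parameter to explore question 1 (the fraction values below are hypothetical assumptions, not measured figures):

```python
# Junction temperature estimate: Tj = Ta + P_dissipated * theta_JA
# theta_JA and the input power are from this post; the dissipation
# fraction is an assumed what-if value, not datasheet data.
T_AMBIENT = 25.0   # ambient temperature, degrees Celsius
THETA_JA = 37.0    # junction-to-ambient thermal resistance, C/W
P_INPUT = 2.6      # peak electrical input power, W

def junction_temp(p_input_w, dissipated_fraction=1.0):
    """Estimate junction temperature, assuming only a fraction of
    the input power is converted to heat inside the package."""
    p_heat = p_input_w * dissipated_fraction
    return T_AMBIENT + p_heat * THETA_JA

# Worst case: all input power dissipated as heat (the hand calc).
print(round(junction_temp(P_INPUT), 2))        # 121.2
# What-if: only 10% of the input power heats the IC itself.
print(round(junction_temp(P_INPUT, 0.10), 2))  # 34.62
```

The gap between 121.2 and 34.62 degrees Celsius shows how strongly the assumed dissipated fraction dominates the result, which may be part of the discrepancy with the measured board.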
I am stuck here. I want to understand the reason for this discrepancy in temperature rise. Please help me out in solving this.