tectactoe
Hello,
I am trying to do an analysis of an Xbox (funny, I know), and I'm going to try to redesign the cooling system with different fans, heat sinks, and so on.
To perform my before/after analysis, however, I will need to know some things about the GPU, since that is the main heat source.
First of all, if the GPU is rated at 175 W, can that be taken as the amount of heat it dissipates: 175 J per second? Or is the rating usually different from the amount of heat it actually dissipates? (Unfortunately, I have no idea how G/CPUs work.)
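For the unit part, at least, I believe this is just the definition of the watt; whether the rated power actually equals the dissipated heat is the part I'm unsure about:

$$1\ \mathrm{W} \equiv 1\ \frac{\mathrm{J}}{\mathrm{s}} \quad\Longrightarrow\quad P_{\text{rated}} = 175\ \mathrm{W} = 175\ \frac{\mathrm{J}}{\mathrm{s}}$$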
Here is a picture of the label on the GPU:
Is there a way to tell how much heat/energy this will dissipate? Are the ratings an accurate measure of this?
Thank you!