Technological feat aside:

Revolutionary heat dissipating coating effectively reduces temperatures by more than 10%

78.5C -> 70C: (78.5 - 70) / 78.5 = 0.108 = 10.8%, right?!

Well, not really. The Celsius zero point is arbitrary, so ratios of Celsius readings don't mean anything physically. The same temperatures expressed in Kelvin would be:

351.65K -> 343.15K: (351.65 - 343.15) / 351.65 = 0.024 = 2.4% (???)

That's why you shouldn't express temperature changes as percentages. A more entertaining version: https://www.youtube.com/watch?v=vhkYcO1VxOk&t=374s
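
To see how much that percentage depends on the scale, here's a minimal Python sketch using the numbers from the headline, converted to Kelvin and Fahrenheit:

    def percent_change(before, after):
        # Relative change -- only physically meaningful on a scale with a true zero.
        return (before - after) / before * 100

    t_hot_c, t_cool_c = 78.5, 70.0

    # The same 8.5-degree drop, expressed on three scales:
    print(f"Celsius:    {percent_change(t_hot_c, t_cool_c):.1f}%")                        # ~10.8%
    print(f"Kelvin:     {percent_change(t_hot_c + 273.15, t_cool_c + 273.15):.1f}%")      # ~2.4%
    print(f"Fahrenheit: {percent_change(t_hot_c * 9/5 + 32, t_cool_c * 9/5 + 32):.1f}%")  # ~8.8%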

  • conciselyverbose@sh.itjust.works

    GamersNexus presents their temperature testing in terms of difference from room temperature, so this is probably how they’d do this comparison.

    I’m not sure they’d see a reason to cover ram temperature unless it was approaching actual risk of harm or enabled higher clocks, though. Comparing cases or CPU coolers by temperature makes sense. Comparing GPUs when they’re using the same chip and cooling performance is a big part of the difference between models? Sure. But RAM? Who cares.