• fuckwit_mcbumcrumble@lemmy.dbzer0.com · 6 months ago

    I’d kill for a single-CCD 16-core X3D part. The 7950X3D is tempting with its V-cache CCD and its high-clock-speed CCD, but since not every game or program knows how to use them properly, you end up with hit-or-miss performance.

      • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 6 months ago

        My biggest concern, from what I’ve seen, is that the workaround AMD uses to steer programs onto one set of cores versus the other wasn’t exactly great last I looked, and it can cause issues when a game tries to move off one CCD onto the other. That said, I haven’t looked into this since the CPU first came out, so hopefully things are better now.
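        If the automatic steering ever misbehaves, you can also pin a game to the cache CCD yourself. Here’s a minimal Linux sketch of the idea; the core numbering is an assumption, so confirm which cores sit on the V-cache die with lscpu --extended first:

        ```python
        import os

        # Rough sketch, Linux-only: pin this process (pid 0 = self) to the
        # cores of the V-cache CCD so the scheduler can't bounce it between
        # dies. The 0-7 range is an assumption; check your topology first.
        CACHE_CCD_CORES = set(range(0, 8))
        os.sched_setaffinity(0, CACHE_CCD_CORES)
        ```

        On Windows the same idea is usually done through Task Manager’s affinity setting or a tool like Process Lasso.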

        How sensitive are you to micro stutters in a game? That was the biggest reason I got the 5800X3D in the first place, but now that I have a better GPU I can tell that thing struggles. And from what I remember, most of the issues you’d have moving from CCD to CCD were micro stutters rather than normal frame rate dips or just lower average frame rates.

        • I only really notice stutters in heavily modded Minecraft, where it’s clearly linked to the garbage collector. In more demanding games I don’t really notice any stuttering, or at least none that I can’t easily link to something triggering in the game.

          Sure, perhaps I have slightly lower average FPS compared to a 7800X3D, but I also use this PC for productivity, and there the extra oomph really does help. Besides, 97% of a framerate that’s already way above what my 144Hz monitors can display is still more than my monitors can show. I don’t think the performance difference is really noticeable outside of certain benchmarks, or unless you try really hard to see it.

          It’s considerably faster than a 5800X3D, though.

    • 30p87@feddit.de · 6 months ago

      I’m also wondering why there’s even a difference in FPS between higher-class CPUs. Shouldn’t the GPU be the bottleneck, especially at 4K high settings?

      • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 6 months ago

        1% and 0.1% lows will almost always be CPU-bound as the game streams more in, assuming VRAM isn’t what’s limiting you. Games are pretty CPU-intensive these days, since the PS5 and Xbox no longer have potato CPUs. At 120+ fps I regularly see over 50% CPU usage in most games, and that’s with nothing running in the background. In the real world you have a ton of background tasks, YouTube videos, Discord, etc. eating your CPU.
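        For anyone unfamiliar with the metric: a 1% low is basically the average of your slowest frames, so a single hitch tanks it while barely moving the average FPS. A rough sketch of the math with made-up frametimes (capture tools like CapFrameX and PresentMon differ in the exact method):

        ```python
        # Toy example: derive average FPS and "percentile low" FPS from a
        # list of frametimes in milliseconds. The idea: average the slowest
        # pct% of frames and convert back to FPS.
        def average_fps(frametimes_ms):
            return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

        def percentile_low_fps(frametimes_ms, pct=1.0):
            worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
            n = max(1, int(len(worst) * pct / 100))      # frame count in the worst pct%
            return 1000.0 / (sum(worst[:n]) / n)

        # 299 smooth 7 ms frames (~144 fps) plus one 50 ms hitch:
        frames = [7.0] * 299 + [50.0]
        print(average_fps(frames))         # ~140 fps: the hitch barely dents the average
        print(percentile_low_fps(frames))  # ~47 fps: the hitch dominates the 1% low
        ```

        That’s why a CCD hop or an asset-load stutter shows up in the lows long before it moves the average.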

        Also, the 4090 is an absolute beast. Honestly, my 5800X3D holds it back pretty often.