• 15 Posts
  • 1.76K Comments
Joined 1 year ago
Cake day: March 22nd, 2024





  • brucethemoose@lemmy.world to 196@lemmy.blahaj.zone · Camp Rule
    2 points · edited 24 hours ago

    What about ‘edge-enhancing’ NNs like NNEDI3? Or GANs that absolutely ‘paint in’ inferred details from their training data? How big does the model have to be before it counts as ‘generative’?

    What about a deinterlacer network that’s been trained on other interlaced footage?

    My point is that there’s an infinitely fine gradient, through time, between good old MS Paint/bilinear upscaling and ChatGPT (or locally runnable txt2img diffusion models). Even now, there’s an array of modern ML-based ‘editors’ that are questionably generative, and most people probably don’t even know they’re working in the background.
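
    To make that continuum concrete: here is a minimal, purely illustrative Python sketch contrasting a parameter-free bilinear resize with a toy learned upscaler. The names and the tiny network are invented for the example, the weights are untrained, and it isn’t NNEDI3 or any real model.

    ```python
    # Illustrative only: a classical resample has zero learned parameters,
    # while even a tiny conv net "decides" every output pixel from weights
    # that were fit to other images; that is the continuum described above.
    from PIL import Image
    import torch
    import torch.nn as nn

    def bilinear_upscale(img: Image.Image, factor: int = 2) -> Image.Image:
        # Classical interpolation: purely local averaging, no training involved.
        w, h = img.size
        return img.resize((w * factor, h * factor), Image.BILINEAR)

    class ToyUpscaler(nn.Module):
        # A made-up minimal "learned" upscaler: two convolutions plus a pixel
        # shuffle. Untrained it outputs garbage; trained on image pairs it
        # would start inferring ("painting in") detail absent from the input.
        def __init__(self, factor: int = 2):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(32, 3 * factor * factor, kernel_size=3, padding=1),
                nn.PixelShuffle(factor),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.body(x)

    if __name__ == "__main__":
        img = Image.new("RGB", (64, 64), "gray")   # stand-in input image
        print(bilinear_upscale(img).size)          # (128, 128), deterministic
        x = torch.rand(1, 3, 64, 64)               # stand-in tensor input
        print(tuple(ToyUpscaler()(x).shape))       # (1, 3, 128, 128), weight-dependent
    ```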


  • brucethemoose@lemmy.world to 196@lemmy.blahaj.zone · Camp Rule
    3 up · 1 down · edited 24 hours ago

    > that’s a weird hill to die on, to be honest.

    Welcome to Lemmy (and Reddit).

    Makes me wonder how many memes are “tainted” with old-school ML from before generative AI was common vernacular: edge enhancement, machine translation and such.

    A lot? What’s the threshold before it’s considered bad?








  • My last Android phone was a Razer Phone 2, SD845 circa 2018. Basically stock Android 9.

    And it was smooth as butter. It had a 120 Hz screen while my iPhone 16 is stuck at 60, and I can feel it. And it flew through some heavy web apps I use, while the iPhone chugs and jumps around, even though the iPhone’s newer SoC should objectively blow away even modern Android devices.

    It wasn’t always this way; iOS used to be (subjectively) so much faster that it’s not even funny, at least back when I had an iPhone 6S(?). Maybe there was an inflection point? Or maybe it’s only the case with “close to stock” Android stuff that isn’t loaded with bloat.