Wandering Adventure Party

@StarkRG You assume generative AI is never trained solely on licensed and public domain works, which are therefore not stolen.

#1 Chris Trottier (last edited by atomicpoet@atomicpoet.org)
@StarkRG You assume generative AI is never trained solely on licensed and public domain works, which are therefore not stolen.

You also assume generative AI is used solely in data centres and cannot be used on modest hardware, no more power hungry than a gaming PC.

But in fact, both assumptions are wrong. It is more than feasible to train off the public domain and run these models on a PC with no more than 12GB of VRAM.

#2 starkrg@myside-yourside.net
@atomicpoet Yes, it *can* be trained with ethically sourced material, but it *isn't*. And it doesn't matter what hardware is used, it'll require similar amounts of energy. If you use low-powered hardware, it'll use less power (energy per unit time), but take longer.
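A back-of-the-envelope sketch of that energy point, with purely illustrative wattages and runtimes rather than measurements of any particular GPU:

```python
# Energy used = average power draw x runtime. A lower-power card that runs
# longer can land in the same ballpark as a high-power card that finishes fast.
def energy_kwh(power_watts: float, hours: float) -> float:
    """Total energy in kilowatt-hours for a given average draw and runtime."""
    return power_watts / 1000 * hours

# Hypothetical numbers for the same workload:
datacenter_gpu = energy_kwh(power_watts=600, hours=1.0)  # 0.60 kWh
modest_gpu = energy_kwh(power_watts=150, hours=4.0)      # 0.60 kWh
print(f"data-centre GPU: {datacenter_gpu:.2f} kWh, modest GPU: {modest_gpu:.2f} kWh")
```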

#3 Chris Trottier (last edited by atomicpoet@atomicpoet.org)
@StarkRG No, it’s not that it *isn’t*. It *is*. Here’s a fairly-trained model:

https://huggingface.co/Mitsua/mitsua-likes

Also, you are assuming ongoing training. However, you can use a pre-trained model on a local machine that requires only short bursts of compute. No retraining. No dataset. Roughly the same energy class as a heavy game or a 3D render.
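To make "short bursts of compute" concrete, here is a minimal sketch of local inference with a pre-trained text-to-image model via Hugging Face's diffusers library. The model ID matches the link above, but the exact loading call depends on that model card (it may need trust_remote_code or a custom pipeline class), so treat this as an illustration of the workflow rather than copy-paste instructions.

```python
# Local inference sketch: load a pre-trained checkpoint once, then each
# generation is a short burst of GPU work. No training loop, no dataset.
import torch
from diffusers import DiffusionPipeline

MODEL_ID = "Mitsua/mitsua-likes"  # or any locally cached checkpoint

pipe = DiffusionPipeline.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,   # half precision helps fit in ~12GB of VRAM
    trust_remote_code=True,      # some checkpoints ship custom pipeline code
)
pipe = pipe.to("cuda")

image = pipe("a watercolor landscape at dusk", num_inference_steps=25).images[0]
image.save("output.png")
```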

#4 Bill, organizer of stuff
@atomicpoet @StarkRG But these types of narrowly trained models (even assuming the "open licensed and public domain" claim is accurate, which it mostly isn't) are also very limited and extremely likely to be wrong, especially in sensitive use cases. Even in this case, where all the model can do is generate a particular style of anime images, "The Mitsua Likes struggles with most modern concepts and has difficulty understanding complex prompts".

#5 Chris Trottier
@wcbdata @StarkRG I gave one example. There are many others. Whether a narrowly trained model is powerful, weak, or boring is orthogonal to the point I was making.

If someone claims a model is public domain or open-licensed and that claim is false, that’s not a vibes debate. That’s an evidentiary question for courts. I’m not an auditor and neither are you.

Capability, usefulness, and correctness are separate axes from sourcing and authorship. Smuggling model quality into an ethics argument doesn’t strengthen it. It just changes the subject.