The New Coffee Room


183,000

General Discussion
12 Posts 5 Posters 96 Views
  • taiwan_girl
    #3

    I agree. I hope they win, but unfortunately, it is already out there.

    Maybe someone who is smarter with computers than I am can answer, but I have heard that if you have a powerful enough consumer computer, you can run your own "ChatGPT" locally.
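    For a rough sense of what "powerful enough" means here: a local model's memory footprint can be estimated from its parameter count and quantization level. The sketch below is only a back-of-the-envelope estimate, and the 1.2× runtime-overhead factor (for caches and buffers) is an assumption, not a measurement.

    ```python
    def model_memory_gb(params_billion: float, bits_per_weight: int,
                        overhead: float = 1.2) -> float:
        """Estimate RAM needed to run a local language model.

        params_billion  -- model size in billions of parameters
        bits_per_weight -- quantization level (16 = half precision,
                           4 = a common quantized format)
        overhead        -- fudge factor for runtime caches/buffers
                           (an assumption, not a measurement)
        """
        bytes_per_weight = bits_per_weight / 8
        return params_billion * bytes_per_weight * overhead

    # Memory estimates for a 7-billion-parameter model at several
    # quantization levels.
    for bits in (16, 8, 4):
        print(f"7B model at {bits}-bit: {model_memory_gb(7, bits):.1f} GB")
    ```

    By this estimate, a 7B-parameter model quantized to 4 bits needs roughly 4 GB of RAM, which is why quantized local models can run on ordinary consumer hardware, while full-precision versions of larger models cannot.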

    • Mik
      #4

      Yes, but they can be enjoined not to use any of it.

      "The intelligent man who is proud of his intelligence is like the condemned man who is proud of his large cell." Simone Weil

      • Rainman
        #5

        Aqua, if you get the time, I (we) would be interested in your take on this. I could only read the first bit without subscribing, but it seems a similar storm is coming with music (if not already taking place). People should be paid for their intellectual property, IMO, so I would like $2.39 for this post, should any AI creature-like bot suck up the complete expression of my unique perspective.

        • Aqua Letifer
          #6

          @Rainman said in 183,000:

          Aqua, if you get the time, I (we) would be interested in your take on this. I could only read the first bit without subscribing, but it seems a similar storm is coming with music (if not already taking place). People should be paid for their intellectual property, IMO, so I would like $2.39 for this post, should any AI creature-like bot suck up the complete expression of my unique perspective.

          😄

          Well it's very, very complicated, because the extent to which you use AI to do the work and how those AI models do what they do varies considerably. And you're right, it's happening everywhere with every type of intellectual property you can think of.

          But let's say we're talking only about AI models illegally using copyrighted work to train themselves.

          As TG said, the cat's already out of the bag. Can't undo it. So the best we can do is put in protections further downstream: if you use AI to generate anything that could be considered intellectual property, you can't copyright it, period.

          That gets tricky, though, because the future of nearly all software programs is going to be to use AI in the background. Did you write your novel using Grammarly? Well, they're diving into AI, too. Does that mean you can't copyright your book now?

          Generally, though, it's a very bad idea to let people make new work that's built on the backs of hundreds of thousands of other people without their consent. We should be doing everything we can to prevent this. But that's going to require a shitload of new legal definitions that I don't think we're prepared to make yet. We're behind the times.

          Please love yourself.

          • Rainman
            #7

            Thanks, Aqua. How about blanket payments? I get $2.39 for my post, you have written 10 books which have been soaked up to the cloud universe, therefore you get $239 per year for your posts. Well OK, let's lower the annual payment from Bill Gates et al to .0002 cents for me, and then the same ratio for you. Then Bill pays us, charges us, supply/demand, yada yada. Simple. Same payment structure that is used now, just broader. And OK (again), I get nothing for a post on a forum, but if your books were on climate, and you posted your thoughts on this forum, you should get a fairly hefty royalty check, under a new category. I'm rambling, think I'll shut up.

            • Aqua Letifer
              #8

              @Rainman said in 183,000:

              Thanks, Aqua. How about blanket payments? I get $2.39 for my post, you have written 10 books which have been soaked up to the cloud universe, therefore you get $239 per year for your posts. Well OK, let's lower the annual payment from Bill Gates et al to .0002 cents for me, and then the same ratio for you. Then Bill pays us, charges us, supply/demand, yada yada. Simple. Same payment structure that is used now, just broader. And OK (again), I get nothing for a post on a forum, but if your books were on climate, and you posted your thoughts on this forum, you should get a fairly hefty royalty check, under a new category. I'm rambling, think I'll shut up.

              Sounds good to me, though. Where's the line to stand in?


              • Mik
                #9

                The cat may be out of the bag, but I think they're going to have to come up with a way to reasonably compensate the owners of the copyrighted material they essentially stole. I think they rolled the dice that they would not get caught, and that alone should heighten the compensation.


                • Axtremus
                  #10

                  @Mik said in 183,000:

                  The cat may be out of the bag, but I think they're going to have to come up with a way to reasonably compensate the owners of the copyrighted material they essentially stole. I think they rolled the dice that they would not get caught, and that alone should heighten the compensation.

                  “They” (the AI developers) could not know how valuable the product would be until they had built it (i.e., could not predict how good the model would turn out until after they trained it). Neither could the copyright holders before they saw the finished product. Neither side knew how valuable or how disruptive the thing would turn out to be until after it had been turned out, so neither side knew how much legal protection they should have invested in beforehand.

                  Now that the thing is built, they fight over who should get how big a share of the massive, unexpected bounty.

                  None of this would be news, and the authors/publishers would not have cared, had Large Language Model Generative AI turned out to be a flop.


                  • Aqua Letifer
                    #11

                    @Axtremus said in 183,000:

                    @Mik said in 183,000:

                    The cat may be out of the bag, but I think they're going to have to come up with a way to reasonably compensate the owners of the copyrighted material they essentially stole. I think they rolled the dice that they would not get caught, and that alone should heighten the compensation.

                    “They” (the AI developers) could not know how valuable the product would be until they had built it (i.e., could not predict how good the model would turn out until after they trained it). Neither could the copyright holders before they saw the finished product. Neither side knew how valuable or how disruptive the thing would turn out to be until after it had been turned out, so neither side knew how much legal protection they should have invested in beforehand.

                    Now that the thing is built, they fight over who should get how big a share of the massive, unexpected bounty.

                    None of this would be news, and the authors/publishers would not have cared, had Large Language Model Generative AI turned out to be a flop.

                    Again, you could not be more wrong about this. Unauthorized use of copyright-protected work gets prosecuted even when the theft is 1:1. That's kind of the point of copyright law.



                      • Axtremus
                        #12

                      @Aqua-Letifer Law provides grounds for prosecution, yet whether to prosecute remains a choice.

                      I have no sympathy for the businesses of the world who profit by infringing the rights of others. Still, it costs large sums of money to prosecute a case in court.

                      Imagine a world where LLM generative AI turns out to be a flop, where ChatGPT spouts gibberish rather than prose. In such a world, the developers of ChatGPT would have exhausted their initial tens of millions in early funding, present a few conference papers, and that would be the end of it. Most authors/publishers would likely never know that their copyrighted works have been used to train an AI large language model, and even if they know, they would see insufficient monetary incentives to sue. (Why spend millions in legal fees to sue a bunch of developers who have exhausted their funding and has no prospect of making more money with their gibberish AI? Just for the principle? There are many researchers and developers using similar datasets without permission who aren't getting sued.)

                      Imagine a world where LLM generative AI turns out to be a flop, where ChatGPT spouts gibberish rather than prose. In such a world, the developers of ChatGPT would have exhausted their initial tens of millions in early funding, presented a few conference papers, and that would have been the end of it. Most authors/publishers would likely never have known that their copyrighted works had been used to train a large language model, and even if they had known, they would have seen insufficient monetary incentive to sue. (Why spend millions in legal fees to sue a bunch of developers who have exhausted their funding and have no prospect of making more money with their gibberish AI? Just for the principle? There are many researchers and developers using similar datasets without permission who aren't getting sued.)

                      Now that the world sees the value of a certain way of doing generative AI, it is right that we seriously consider how to divide the large expected bounties among all contributing parties, authors and publishers included. Let the lawsuits run their course. Let the advocates and the lobbyists make their pleas. And see what public policies emerge from all this.
