A New York Times copyright lawsuit could kill OpenAI
Authors and entertainers are also suing the tech company for damages that could total in the billions.

  • kjPhfeYsEkWyhoxaxjGgRfnj@lemmy.world · 10 months ago (+32/−1)

    I doubt it. It would likely kill any AI companies not backed by giant tech, though.

    Microsoft has armies of lawyers and cash to pay. It would make life a lot harder, but they’d survive

  • makyo@lemmy.world · 10 months ago (+36/−7)

    I always say this when this comes up because I really believe it’s the right solution - any generative AI built with unlicensed and/or public works should then be free for the public to use.

    If they want to charge for access, that’s fine, but they should have to secure the legal rights first. If that’s impossible, they should find profits some other way, like add-ons such as internet-connected AI and so forth.

    • fidodo@lemmy.world · 10 months ago (+8)

      There’s plenty of money to be made providing infrastructure. Lots of companies make a ton of money providing infrastructure for open source projects.

      On another note, why is OpenAI even called “open”?

      • ItsMeSpez@lemmy.world · 10 months ago (+2)

        On another note, why is OpenAI even called “open”?

        It’s because of the implication…

    • Pacmanlives@lemmy.world · 10 months ago (+3)

      Not really how it works these days. Look at Uber and Lime/Bird scooters: they would basically just show up in a city and say, “the hell with the law, we’re starting our business here.” We just call it disruptive technology.

      • makyo@lemmy.world · 10 months ago (+4)

        Unfortunately true, and the long arm of the law, at least in the business world, isn’t really that long. Would love to see some monopoly busting to scare a few of these big companies into shape.

    • poopkins@lemmy.world · 10 months ago (+1)

      What is unlicensed work? Copyrighted content won’t have a license agreement, but that doesn’t mean you can freely infringe on copyright law.

      • makyo@lemmy.world · 10 months ago (+1)

        By unlicensed I mean works that haven’t been licensed, i.e. anything being used without permission or some other legal right.

    • miridius@lemmy.world · 10 months ago (+4/−3)

      Nice idea but how do you propose they pay for the billions of dollars it costs to train and then run said model?

    • asdfasdfasdf@lemmy.world · 10 months ago (+1)

      That goes against the fundamental idea of something being unlicensed, meaning there are no repercussions from using the content.

      I think what you mean already exists: open source licenses. Some open source licenses stipulate that the material is free, can be modified, etc. and you can do whatever you want with it, but only on the condition that whatever you create is under the same open source license.

      • makyo@lemmy.world · 10 months ago (+1)

        Ugh, I see what you mean. No, I mean unlicensed as in ‘they didn’t bother to license copyrighted works’ and public as in ‘stuff they scraped from Reddit, Twitter, etc. without permission from anyone’.

    • canihasaccount@lemmy.world · 10 months ago (+3/−3)

      Would you, after devoting full years of your adult life to the unpaid work of learning the requisite advanced math and computer science needed to develop such a model, like to spend years more of your life to develop a generative AI model without compensation? Within the US, it is legal to use public text for commercial purposes without any need to obtain a permit. Developers of such models deserve to be paid, just like any other workers, and that doesn’t happen unless either we make AI a utility (or something similar) and funnel tax dollars into it or the company charges for the product so it can pay its employees.

      I wholeheartedly agree that AI shouldn’t be trained on copyrighted, private, or any other works outside of the public domain. I think that OpenAI’s use of nonpublic material was illegal and unethical, and that they should be legally obligated to scrap their entire model and train another one from legal material. But developers deserve to be paid for their labor and time, and that requires the company that employs them to make money somehow.

    • dasgoat@lemmy.world · 10 months ago (+5/−9)

      Running AI isn’t free, and AI calculations pollute like a motherfucker

      This isn’t me saying you’re wrong from an ethical or legal standpoint, because on those I agree. It’s just that, on a practical level, considerations have to be made.

      For me, those considerations alone (and a ton of other considerations such as digital slavery, child porn etc) make me just want to pull the plug already.

      AI was fun. It’s a dumb idea for dumb, buzzword-spewing Silicon Valley ghouls. Pull the plug and be done with it.

  • SatanicNotMessianic@lemmy.ml · 10 months ago (+15/−3)

    The NYT has a market cap of about $8B. MSFT has a market cap of about $3T. MSFT could take a controlling interest in the Times for the change it finds in the couch cushions. I’m betting a good chunk of the c-suites of the interested parties have higher personal net worths than the NYT has in market cap.

    I have mixed feelings about how generative models are built and used. I have mixed feelings about IP laws. I think there needs to be a distinction between academic research and for-profit applications. I don’t know how to bring the laws into alignment on all of those things.

    But I do know that the interested parties who are developing generative models for commercial use, in addition to making their models available for academics and non-commercial applications, could well afford to properly compensate companies for their training data.

  • db2@lemmy.world · 10 months ago (+16/−5)

    Oh no, how terrible. What ever will we do without Shenanigans Inc. 🙄

  • Grimy@lemmy.world · 10 months ago (+12/−2)

    This would raise the cost of entry for making a model and nothing more. OpenAI will buy the data if they have to, and so will Google. The money will only go to the owners of the New York Times and its shareholders; none of the journalists who will be let go in the coming years will see a dime.

    We must keep the barrier to entry into the AI game as low as possible, or the only two players will be Microsoft and Google. And as our economy becomes increasingly AI-driven, this will cement their ownership of it.

    Pragmatism or slavery: these are the two options.

  • 800XL@lemmy.world · 10 months ago (+10/−3)

    YES! AI is cool, I guess, but the massive AI circlejerk is so irritating.

    If OpenAI can infringe upon all the copyrighted material on the net then the internet can use everything of theirs all for free too.

    • VonCesaw@lemmy.world · 10 months ago (+3)

      Honestly, I’d rather OpenAI lose this one, and NYT lose later on in a much more embarrassing manner that cuts all the golden parachutes

  • kaitco@lemmy.world · 10 months ago (+2/−4)

    I never thought that the AI-driven apocalypse could be impeded by a simple lawsuit. And, yet, here we are.

  • sugarfree@lemmy.world · 10 months ago (+5/−20)

    We hold ourselves back for no reason. This stuff doesn’t matter; AI is the future, and however we get there is totally fine with me.

    • Zaderade@lemmy.world · 10 months ago (+14/−3)

      AI without proper regulation could be the downfall of humanity. Many pros, but the cons may outweigh them. Opinion.

      • sugarfree@lemmy.world · 10 months ago (+3/−12)

        AI development will not be hamstrung by regulations. If governments want to “regulate” (aka kill) AI, then AI development in their jurisdiction will move elsewhere.