China #1
Best friends with the mods at c/[email protected]

  • 1 Post
  • 37 Comments
Joined 1 year ago
Cake day: June 10th, 2023


  • https://x.com/paulg/status/1796107666265108940

    There is Paul Graham saying exactly the opposite.

    Arguments can be made that Altman is making threats behind the scenes and forcing people to say nice things about him under duress, but 700 people from OpenAI signed a petition to bring him back as CEO, and only 3 wanted him gone. 700 people is a lot to quietly threaten, and none of them have come to Toner’s defense as she continues on her crusade.

    I’m sure Altman has his own list of sins, everyone does, but I really think that he, and OpenAI by association, are constantly under attack in our new era of corporate warfare. You no longer need to sabotage the actual company; you just have to poison the well so that funding and the potential employee base dry up, which is exactly what is happening with the constant stream of attacks against OpenAI. AI is the first quadrillion-dollar industry, and OpenAI is the leader. It’s not just companies that want a piece, it’s entire countries. Look at how the US has to deal with election interference, then look at what goes on against OpenAI. You’ll see the patterns.


  • I’m just going to start reposting my reply to this story every time it’s posted, because it’s the same story being recycled by every tech outlet on the internet.

    This reads as the same shit that was outed during the hullabaloo when Altman was fired, and even then, at the height of the controversy, everyone around him shot it down. Take anything she says with a grain of salt. She’s been critical of the company for a while, tried to stage a coup against the company, and got judo-couped when Altman came back and she got fired. I don’t want to appear dismissive of her or whatever problems she has allegedly faced, but so far she has yet to put forth compelling evidence, and is just beating on the same drum repeatedly. Coincidentally, she is returning to the forefront as the recent drama around the Scarlett Johansson voice model is cooling off, which comes across like an attack to keep pressure on the company, rather than anything with validity, especially when, as I’ve said, she isn’t offering anything new, just the same attacks that failed before.




    This story is being blown so fucking far out of proportion it’s honestly incredible. Just so everyone is on the same page, here is a video timestamped to the voice, and immediately following it you can hear the voice from Her as well.

    https://youtu.be/3BWjomtK-94?si=tDu574b4GySpnPIy&t=42

    They are not similar, other than that they are both female.

    The whole “her” thing that Altman threw up on twitter is just because the goddamned movie was a touchstone for the kind of thing that they are doing. They weren’t cloning the fucking voice. It’s like naming your new iguana Godzilla. It’s not going to destroy Tokyo any time soon, it’s just a cultural reference, you know, like a meme.

    As far as Johansson goes, she is falling prey to this shit just like every other celebrity that has been railing against big bad AI. There are so many sheisty lawyers trying to get their hands on the first big win from an AI suit that they will say anything to get a celebrity to sue, because if their firm wins, they become the anti-AI law firm that all others will seek out in the future. They will print money, but only if something sticks, and so far, nothing has. This will be a case like any other: they take it to court, there is no real basis for anything, and it ends up being all over the news before disappearing like the whole debacle over Sarah Silverman’s book. In three months there will be another case against AI, and again, nothing will stick, because the people putting the bug in people’s ears don’t understand how to use most of the functionality of their cellphones, let alone how generative AI works.







    “Workforce” doesn’t produce innovation, either. It does the labor. AI is great at doing the labor; it excels at mindless, repetitive tasks. AI won’t be replacing the innovators, it will be replacing the desk jockeys that do nothing but update spreadsheets or write code. What I predict we’ll see is the floor dropping out of technical schools that teach the things that AI will be replacing. We are looking at the last generation of code monkeys. People joke about how bad AI is at writing code, but give it the same length of time as a graduate program and see where it is. Hell, GPT-3 was only in beta as of June 2020, just 13 years after the first iPhone, and look how far smartphones have come.

    There won’t be a huge demand for a workforce in 5 years; there will be a huge portion of the population that suddenly won’t have a job. It won’t be like the agricultural or industrial revolution, where it took time to make its way around the world, or where there is some demand for artisanal goods. No one wants artisanal spreadsheets, and we are too global now to not outsource our work to the lowest bidder with the highest thread count. It will happen nearly overnight, and if the world’s governments aren’t prepared, we’ll see an unemployment crisis like never before. We’re still in “Fuck around.” “Find out” is just around the corner, though.


  • Holy wall of unparagraphed word salad,

    Ahh, we are getting into the insult round of tonight’s entertainment. I’ll break this reply down for you.

    Again you are not understanding what is and isn’t an evolutionary process

    It seems our definitions differ slightly, yes.

    You don’t have to be intelligent about it, all you have to do is continue to increase complexity due to an external force and that is it. That’s all that is needed to have an evolutionary force.

    That, and the ability to self-actuate your own evolution. You see, that’s where our definitions of an evolutionary force differ. We didn’t have some greater will forcing us down a path of evolution. There was no force. There was trial and error. The ones that “lived long enough to fuck” survived, the rest didn’t. Reproduction is a fundamental aspect of evolution. Computers can’t reproduce. We have to facilitate that ourselves, through iterating on various aspects of computers. Right now we can fake it with increased processing power, increased memory, and more elegant code, but at the end of the day, without some form of reproductive system that doesn’t rely on us, the computer can’t exceed our grasp. If it could, we’d see true runaway growth, not the steady doubling of Moore’s Law. We can’t make computers do more than what they already do. We can just make them do it faster.

    With computers we don’t have to know what we are doing (to recreate consciousness), we just have to select for better more complex systems (the same way evolution did for humans) which is the inevitable result of progress.

    Yeah, sure, and I can cram a hundred monkeys in a room with a hundred typewriters and come up with a better love story than Twilight, but it’s gonna take time. Not Shakespeare time, but a few weeks at least. That’s the thing, though: the evolution of any system doesn’t happen overnight. We didn’t wake up one day, walk out of our cave, and create TikTok. Evolution is a long process. You forget all of the things that happened before we figured out that our thumbs weren’t solely for sticking up our own asses. There are millions of years that you aren’t accounting for. Billions of attempts to create what we take for granted: consciousness. You say that we don’t have to know what we are doing, and you are right, we don’t, but it’s a crap-shoot with quadrillion-to-one odds.
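    For what it’s worth, the monkeys-and-typewriters odds can be roughed out in a few lines of Python (the 27-key typewriter and the target phrase here are my own hypothetical choices, not anything from the thread):

```python
# Rough sketch of the odds that random keystrokes produce a given text.
# Assumes a hypothetical 27-key typewriter (26 letters plus a space bar).

def random_typing_odds(target: str, alphabet_size: int = 27) -> int:
    """Return N such that a single random attempt succeeds 1 time in N."""
    # Each keystroke independently hits the right key with probability
    # 1/alphabet_size, so one full attempt at the phrase succeeds with
    # probability (1/alphabet_size) ** len(target).
    return alphabet_size ** len(target)

# "to be or not to be" is 18 characters, so the odds are 1 in 27**18,
# which is already far longer than quadrillion-to-one (1 in 10**15).
print(random_typing_odds("to be or not to be"))
```

    Even one short phrase blows past quadrillion-to-one, which is the point being made: blind trial and error needs an astronomical number of attempts.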

    And like the fractalization of coastlines, facts, knowledge and data are completely unlimited, the deeper you look the more there is.

    Again, we can store as much data as we want, it won’t make AI happen. We haven’t spontaneously seen life form in libraries, but they have been storing data in them for thousands of years. Consciousness isn’t data. If that’s all you want, ChatGPT is passing the bar. It still can’t tell me it loves me, and mean it.

    On top of all of this you have the fact that progress has constantly been accelerating in a way that human intelligence is incapable of perceiving accurately.

    Funny, you seem to think that you perceive it pretty well…

    Therefore computer intelligence is vastly going to outpace our own. And very soon too.

    A well thought out conclusion I’m sure is based on all of the facts you failed to present. Bravo.


    You can believe whatever you want, but I don’t think it’s arrogant to say what I did. You are basing your view of humanity on what you think humanity has done, and basing your view of AI on what you think it will do. Those are fundamentally different and not comparable. If you want to talk about the science fiction future of AI, we should talk about the science fiction future of humanity as well. Let’s talk about augmenting ourselves, extending lifespans, and all of the good things that people think we’ll do in the coming centuries.

    If you want to look at humans and say that we haven’t evolved at all in the last 3000 years, then we should look at computers the same way. Computers haven’t “evolved” at all. They still do the same thing they always have. They do a lot more of it, but they don’t do anything “new”. We have found ways to increase the processing power and the storage capacity, but a computer today has the same limits as the one that sent us to the moon. It’s a computer, and it is incapable of original thought. You seem to believe that just because we throw more RAM and processors at it, somehow that will change things, but it doesn’t. It just means we can do the same things, but faster. Eventually we’ll run out of things to process and data to store, but that won’t bring AI any closer to reality. We are climbing the mountain, but you speak like we have already crested. We’ve barely left base camp in the grand scheme of artificial intelligence.


    Computer power doesn’t scale infinitely, unless you mean building a world mind and powering it off of the spinning singularity at the center of the galaxy like a Type 3 civilization, and that’s sci-fi stuff. We still have to worry about bandwidth, power, cooling, coding, and everything else that goes into running a computer. It doesn’t just “scale”. There is a lot that goes into it, and it does have a ceiling. Quantum computing may alleviate some of that, but I’ll hold my applause until we see some useful real-world applications for it.

    Furthermore, we still don’t understand how the mind works yet. There are still secrets to unlock and ways to potentially augment and improve it. AI is great, and I fully support the advancement of technology, but don’t count out humans so quickly. We haven’t even gotten close to human-level intelligence with GOFAI, and maybe we never will.