• 0 Posts
  • 20 Comments
Joined 10 months ago
Cake day: January 31st, 2024

  • Assuming we can get AGI. So far there’s been little proof we’re any closer to getting an AI that can actually apply logic to problems that aren’t popular enough to be spelled out a dozen times in its training dataset. Ya know, the whole perfect scores on well-known and respected college exams, but failing to solve slightly altered riddles for children? Being literally incapable of learning new concepts is a pretty major pitfall if you ask me.

    I’m really sick and tired of this “we just gotta make a machine that can learn and then we can teach it anything” line. It’s nothing new; people have been saying this shit since fucking 1950, when Alan Turing wrote it in a paper. A machine looking at an unholy amount of text and evaluating, based on a new prompt, what the most likely next word is, IS NOT LEARNING!!! I was sick of this dilemma before LLMs were a thing, but now it’s just mind-numbing.

  • I don’t understand what was wrong with the original version that just took 2 AA batteries. Was reaching for the AA charger and swapping cells not awkward enough or something?

    Smart and elegant design would be hiding a battery charger in the iMac itself (maybe even one using something smaller than AA), not expecting you to flip your mouse over and plug it in every time ya leave it. The Nintendo Switch, while a completely different form factor, is a great example of an elegant (you could even say “wireless”) charging solution.

    I’m getting really sick of the Apple aesthetic of sticking-out wires, be it the mouse or the dozen dongles for every portable device they now make. Ugh! Can’t forget the world’s only pen that needs charging, for seemingly no reason.

  • Sounds like the kind of work my analyst does. I guess he’s technically part of the development team, so sure??? Our 3 client mediators are totally taking over. Also, I’m pretty sure we’re the only IT department that even has such a thing. The only other person in our IT branch who mainly does calls and such is the top head of IT; every other IT boss still has a lot of technical work hanging around their neck. So at least at my job, “close to 100%” is an absolute far cry.

    It’s a very similar story at my girlfriend’s workplace. Except they don’t even have analysts.

  • As a developer I have to say OH hell nah. If I had to compare the issue to something more layman-friendly, I’d compare it to Tesla’s self-driving. If I have to watch it the entire time it does its thing, because there’s an almost certain chance it’ll mess something up CATASTROPHICALLY due to the fact that it literally lacks the ability to understand, then I might as well just do it myself. It rarely saves time, and only in dumb cases that should have been automated in other ways a long time ago.

    Not saying it’s not a very handy tool occasionally, just that it can’t come up with solutions to problems on its own, which is like 75% of my work. And it can’t do that due to a fundamental limitation in how learning models work; no amount of training will fix it.