If you ever study biochemistry, it leaves you absolutely in awe. The best engineering we can do is pretty amazing, we have computers and airplanes and all this magic stuff, but the stuff in you is a hundred, a thousand times better made. It’s stunning. Comparatively speaking, it is perfect. And that’s only the stuff we understand. The stuff in your brain, we do not.
In the relatively short amount of time we’ve had with computers we’ve made pretty astounding progress though. If we had had a few million years to improve those silicon brains I think we’d give evolution a run for its money!
Yea, our engineered stuff might be simplistic compared to the brain and biology, but evolution is just a combination of luck, randomness and “unguided” trial and error. There’s no “thought” to evolution and that’s why we end up with all these…weird quirks and flaws LMAO
those quirks are all features, i swear
I remember a quote from Civ along the lines of “if the brain was simple enough for us to understand, our minds would be too simple to understand it.”
There’s a pretty basic informational paradox in a mind comprehending itself: its comprehension of itself then needs further comprehension, and so on. So yeah, only a much more complex mind can fully understand a given mind.
Antivirus protection could be better, though. Oh, and the built in self destruct is kind of a bummer, too.
Fingernails are so annoying
If you ever need to claw your way out of a heap of rubble, you’ll be thankful for them.
Rip them off.
It is a planned obsolescence.
Natural selection is essentially just a massively parallel Monte Carlo optimization algorithm that’s been running for billions of years. It’s so simple yet produces such amazing complexity.
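The “massively parallel Monte Carlo optimization” framing can be sketched in a few lines of toy code. This is just an illustration, not a biology model: fitness is a stand-in for reproductive success, and the genome, population size, and mutation rate are made-up parameters.

```python
import random

def evolve(fitness, genome_len=20, pop_size=50, generations=200,
           mut_rate=0.05, seed=0):
    """Toy 'natural selection': blind variation plus selection, nothing else."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: the fitter half survives.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        # Reproduction with random mutation -- the "unguided trial and error".
        children = [[1 - g if rng.random() < mut_rate else g for g in parent]
                    for parent in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

# Fitness here is simply the count of 1-bits in the genome.
best = evolve(sum)
print(sum(best))  # climbs to (or near) the all-ones genome
```

No step in the loop “knows” what a good genome looks like; random mutation plus differential survival is enough to climb the fitness landscape, which is the whole point of the analogy.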
Give it a few more billion and we’ll finally have an intelligence that’s not hell-bent on destroying itself.
This article estimates that GPT-4 took around 55 GWh of electricity to train. A human needs maybe 2000 kcal (2.3 kWh) a day and lives 75 years, for a lifetime energy consumption of about 63 MWh (roughly 860x less than just training GPT-4).
So not only do shitty “AI” models use >20x the energy of a human to “think,” training them uses the lifetime energy equivalent of hundreds of humans. It’s absolutely absurd how inefficient this technology is.
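The back-of-envelope math above checks out; here it is spelled out, assuming the exact conversion 1 kcal ≈ 1.163 Wh and the 55 GWh training estimate from the cited article:

```python
# Back-of-envelope check of the GPT-4 vs. human lifetime energy comparison.
KCAL_TO_KWH = 1.163e-3  # exact unit conversion: 1 kcal = 1.163 Wh

daily_kwh = 2000 * KCAL_TO_KWH               # ~2.33 kWh per day
lifetime_mwh = daily_kwh * 365 * 75 / 1000   # ~63.7 MWh over 75 years

gpt4_training_mwh = 55_000                   # 55 GWh, per the cited estimate
ratio = gpt4_training_mwh / lifetime_mwh     # roughly 860x

print(f"lifetime: {lifetime_mwh:.1f} MWh, ratio: {ratio:.0f}x")
```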