Funny enough, I think I liked LD because it felt closer in tone to 90s Trek than modern Trek. Sure, those were serious shows, but in between the drama there was some pretty natural-feeling levity based on character interactions, slice-of-life stuff, etc. Stuff that doesn’t take you out of the story, but gives you a breather and makes you care more about the characters when serious stuff does happen. Most of modern Trek feels like Marvel movies: end-of-the-world stakes and melodrama all the time, but constantly undercut with self-aware quips to keep things from ever being too sincere.
LD feels like a return to what I liked about those earlier shows: following the lives of some interesting characters going on adventures in an interesting setting. It’s not perfect, and it’s not what I’d want to entirely replace those old kinds of shows, but it stands in pretty stark contrast to the other new stuff.
Idk. There’s something going on in how humans learn which is probably fundamentally different from current ML models.
Sure, humans learn from observing their environments, but they generally don’t need millions of examples to figure something out. They’ve got some kind of heuristics or other ways of learning that let them understand many things after seeing them just a few times, or even once.
Most of the progress in ML models in recent years has come from the discovery that you can get massive improvements with current architectures by just feeding them more and more data. Essentially brute force. But there’s a limit to that, either because there might be a theoretical point where the gains stop, or the more practical issue of only having so much data and compute to throw at the problem.
There’s almost certainly going to need to be some kind of breakthrough before we’re able to get meaningfully further than we are now, let alone matching human cognition.
At least, that’s how I understand it from the classes I took in grad school. I’m not an expert by any means.