This is an automated archive made by the Lemmit Bot.

The original was posted on /r/singularity by /u/Creative-robot on 2024-10-13 21:30:57+00:00.


From what little I know of it, I think it’s the idea that the brain is constantly predicting the outcomes of whatever situation it’s in, and that comparing those predictions against what actually happens is an integral part of learning and adapting to new information. If something like this were replicated in an AI system, would it then have the ability to learn and adapt on its own, basically making it AGI by default? I’m kind of a layman on the topic, so try not to roast me too much.
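
To show what I mean, here’s a toy sketch of the “predict, compare with reality, update” loop as I understand it. This is just my own rough layman’s interpretation in code, not taken from any actual predictive-coding paper, and the specific model (a simple linear predictor) is only for illustration:

```python
import numpy as np

# Toy sketch of a "predict -> compare -> update" loop.
# The internal model here is just a linear predictor; real predictive-processing
# theories describe hierarchies of predictions, but the core idea is the same:
# the system learns by shrinking its prediction error (its "surprise").

rng = np.random.default_rng(0)
weights = rng.normal(size=3)   # the agent's internal model of the world
learning_rate = 0.05

def predict(observation):
    """Guess the next value from the current observation."""
    return weights @ observation

for step in range(1000):
    observation = rng.normal(size=3)                      # what the world shows now
    actual_next = 2.0 * observation[0] - observation[2]   # hidden rule the world follows

    guess = predict(observation)
    prediction_error = actual_next - guess                # surprise = actual minus expected

    # Learning = nudging the internal model so future surprise is smaller.
    weights += learning_rate * prediction_error * observation
```

The point of the sketch is just that the learning signal comes from the gap between expectation and reality, rather than from someone hand-labeling the right answers.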