This is an automated archive made by the Lemmit Bot.

The original was posted on /r/singularity by /u/cobalt1137 on 2025-04-26 00:36:31+00:00.


After listening to more and more researchers at both leading labs and universities, it seems like they near-unanimously believe that AGI is no longer a question of "if" and that it is actually very imminent. And if we assume that AGI is on the horizon, then this just feels completely necessary. If we have systems that are intellectually as capable as the top percentage of humans on earth, we would immediately want trillions upon trillions of them (both embodied and digital). We are well on track to reach that level of intelligence via research, but we are well off the mark when it comes to supporting this feat from an infrastructure standpoint. The demand for these systems would be essentially infinite.

And this is not even considering the types of systems that AGIs are going to start creating via their own research efforts. I imagine that a force able to work at 50-100x the speed of current researchers would achieve some insane outcomes.

What are your thoughts on all of this?