A major bottleneck is power capacity. It is very difficult to find 50 MW+ (sometimes hundreds) of capacity available at any site. It has to be built out, and that involves a lot of red tape, government contracts, large transformers, contractors, etc. The current backlog on new transformers at that scale is years. Even Google and Microsoft can't build fast enough, so they come to my company for infrastructure, as we already have 400 MW in use and triple that already on contract. Further, Nvidia only makes so many chips a month. You can't install them faster than they make them.
And the single biggest bottleneck is that none of the current AIs “think”.
They. Are. Statistical. Engines.
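For anyone who hasn't seen it spelled out, here's a minimal sketch of what "statistical engine" means in practice: the last step of an LLM's forward pass is exactly this, turning scores into a probability distribution and sampling the next token. The vocabulary and logits below are made up for illustration.

```python
# Minimal sketch of the "statistical engine" claim: generation is a
# weighted random draw over next tokens. Vocab and logits are invented.
import math
import random

vocab = ["the", "cat", "sat", "mat", "on"]
logits = [2.0, 1.0, 0.5, 0.2, 1.5]  # hypothetical model scores

# Softmax: convert raw scores into probabilities that sum to 1.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# Sampling: pick the next token in proportion to its probability.
next_token = random.choices(vocab, weights=probs, k=1)[0]
print(dict(zip(vocab, [round(p, 3) for p in probs])), "->", next_token)
```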
Same
And it’s pretty great at it.
AI's greatest use case is not LLMs; people treat it that way because it's the only form of AI we can relate to.
AI is so much better at many other tasks.
How closely do you need to model a thought before it becomes the real thing?
Need it to not degrade exponentially when AI-generated content is fed back in (toy sketch below).
Need creativity to be more than random deviations from the statistically average result over a mostly stolen dataset taken from actual humans.
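Rough toy version of the degradation worry above: fit a distribution to samples drawn from the previous generation's fit, then repeat. The numbers here are arbitrary, and real model collapse in LLMs is much messier, but the feedback loop is the same shape: the maximum-likelihood spread estimate is biased low, and recursive training compounds the bias.

```python
# Toy model-collapse loop: each generation trains only on the previous
# generation's output, and the distribution's spread drifts and narrows.
import random
import statistics

mean, stdev = 0.0, 1.0  # "generation 0": the original human-made data
for gen in range(1, 31):
    # Train on samples from the previous generation's model.
    samples = [random.gauss(mean, stdev) for _ in range(20)]
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)  # MLE estimate, biased low
    if gen % 5 == 0:
        print(f"gen {gen:2d}: mean={mean:+.3f} stdev={stdev:.3f}")
```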
Humans don't actually think either; we're just electricity jumping to nearby neural connections that formed based on repeated association. Add to that there's no free will, and you start to see how "think" is an immeasurable metric.
Markov chains with extra steps
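For anyone who hasn't played with one, a bigram Markov chain really does generate text the same basic way: look at the current token, pick the next one in proportion to how often it followed in the training text. LLMs condition on far more context through a learned network, hence the "extra steps". The corpus here is a stand-in for illustration.

```python
# Tiny bigram Markov chain text generator.
import random
from collections import defaultdict

corpus = "the cat sat on the mat and the cat ran".split()

# Count which word follows which in the training text.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

# Generate: repeatedly sample the next word given only the current one.
word = "the"
out = [word]
for _ in range(8):
    if word not in follows:
        break  # dead end: no observed successor
    word = random.choice(follows[word])
    out.append(word)
print(" ".join(out))
```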
You’re not going to get an argument from me.
Maybe we are statistical engines too.
When I hear people talk, they're also just repeating the most common sentences they heard elsewhere anyway.
Is this the AI?