DDonde · 31-35, M
I imagine it will jump again before then, similar to the jump we've seen over the last few years. I think one bottleneck right now, though, is the efficiency and cost of "thinking," not just base ability. I predict this will be partially solved by hardware innovations tailored to the specific workload of running machine learning models.
DDonde · 31-35, M
@Ferise1 Okay. So in the past few years we've seen a big jump in the capabilities of two kinds of things the media calls "AI": machine learning models for image generation and large language models. The best reasoning LLM available to me so far is OpenAI's o3, which has solved every problem I've thrown at it. The caveat is that solving problems reliably takes longer and consumes more computing power. I'm predicting that the next "jump" in AI capability won't come from some revolutionary new algorithm at this point, but from specialized hardware that makes models cheaper to run and delivers the same amount of compute in less time.