The real answer is compute power. Right now it's very expensive to run the computations a big LLM requires; some companies are even developing specialized chips to run them more efficiently. On the other hand, you probably don't want your phone's keyboard app burning out its tiny CPU and draining your battery, so it isn't worth throwing anything more than a simple model at the problem.
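To give a sense of what "a simple model" can mean here, the sketch below is a toy bigram next-word predictor: count which word tends to follow which, then suggest the most frequent followers. This is an illustrative assumption about the kind of lightweight model a keyboard might use, not the actual implementation in any real app; the corpus and function names are made up.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count which word follows which word in the training text."""
    model = defaultdict(Counter)
    words = corpus.lower().split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def suggest(model, prev_word, k=3):
    """Return up to k words most frequently seen after prev_word."""
    return [w for w, _ in model[prev_word.lower()].most_common(k)]

# Tiny illustrative corpus standing in for the user's typing history.
corpus = "i am happy . i am here . you are happy . i am happy"
model = train_bigrams(corpus)
print(suggest(model, "am"))  # → ['happy', 'here']
```

A counting model like this needs only a lookup table and a sort, so it runs in microseconds on a phone CPU; that is the kind of cost/quality trade-off the answer is describing.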