Apple silicon has a pretty decent on-board ML subsystem; you can get LLMs to output a respectable number of tokens per second off of it if you have the memory for them. I’m honestly shocked that they haven’t built a little LLM to power Siri