Running LLMs locally with Ollama
As I come to the end of writing the second edition of Learn Ansible (more news on that coming soon), I thought now would be a great time, with a little more free time on my hands, to look at the exciting developments of the last six months. One of the things I have been keeping an eye on is the state of Large Language Models (LLMs for short), especially since the introduction of open-source models such as Llama from Meta and Mistral 7B, which you can run locally. ...
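To give a quick taste of what "running locally" looks like in practice, here is a minimal sketch of querying a local Ollama server from Python via its REST API. It assumes you have already installed Ollama, pulled a model with `ollama pull mistral`, and that the server is listening on its default port of 11434; the prompt text is just an illustrative placeholder.

```python
# Minimal sketch: query a locally running Ollama server over its REST API.
# Assumes Ollama is installed, `ollama pull mistral` has been run, and the
# server is listening on the default http://localhost:11434.
import json
import urllib.request

payload = json.dumps({
    "model": "mistral",  # any model you have pulled locally
    "prompt": "Explain what an LLM is in one sentence.",
    "stream": False,     # return the full response as a single JSON object
}).encode("utf-8")

request = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    body = json.loads(response.read())
    print(body["response"])  # the generated text
```

Everything here happens on your own machine: no API keys, no usage billing, and your prompts never leave your laptop.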