Local LLMs Part 3 – Linux

In parts I and II of this series, we looked at setting up local LLMs on Apple MacOS and Microsoft Windows, respectively. This post digs into using Linux to run an LLM. In several respects Linux is different from both MacOS and...

Local LLMs Part 2 – Microsoft Windows

In part one of this series, we looked at running the Meta LLAMA 2 AI large language model (LLM) directly on Apple Silicon-based computers. This allows a ChatGPT-like AI assistant to run without an Internet connection, but much more importantly to...

Local LLMs Part 1 – Apple MacOS

Running large language models on your local computer can be a safe and cost-effective way to use the latest artificial intelligence tools. This blog post will outline the steps needed for local AI use on Apple Macs with Apple Silicon CPUs.
