Turns out it's both easy and fast to run your own [[Large Language Models]] **locally** on your own hardware.

# Msty

For a more mass-consumer option, I'm testing out https://msty.app/, which gives you a UI for downloading and interacting with LLMs. This is actually **easier** than the already very easy ollama. It also supports [[Retrieval Augmented Generation]], which I'm testing out using these very notes you're reading right now.

# Ollama

The tool [ollama](https://ollama.com/) makes it fast and easy to download local models, run them, and interact with them. Unlike Msty, it has no GUI - so there's no [[UNIX Design Philosophy|captive user interface]], and you can script against it (see the API sketch at the end of this note).

![[Pasted image 20250214121053.png]]

****

# More

## Source

- 20 minutes of experimentation
- https://ollama.com/

## Related

- [[Large Language Models]]
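
## API sketch

To illustrate the no-captive-interface point: ollama serves a local HTTP API (on port 11434 by default), so any script can talk to a downloaded model. A minimal Python sketch, assuming ollama is running locally; the model name `llama3.2` is just an example - substitute whatever you've pulled.

```python
import json
import urllib.request

# ollama listens on localhost:11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"


def ask(prompt: str, model: str = "llama3.2") -> str:
    """Send one prompt to a locally running ollama model and return its reply."""
    payload = json.dumps({
        "model": model,      # example name - use any model you've pulled
        "prompt": prompt,
        "stream": False,     # return a single JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(ask("Explain the UNIX design philosophy in one sentence."))
```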