Local Tool-Chaining: Using Gemma 4 with Ollama and OpenCode
I’ve been spending some time lately experimenting with running LLMs locally to see how close we are to having a truly “private” coding…