Local Tool-Chaining: Using Gemma 4 with Ollama and OpenCode
I’ve been spending some time lately experimenting with running LLMs locally to see how close we are to having a truly “private” coding…
I was recently “gifted” a nice MacBook Pro and decided I wanted to use it to run AI models locally. I’ve become accustomed…
Result: https://mcflurry.jshowers.com/ My new rescue dog, McFlurry, is popular at cafes and other spots around town. I wanted to make a site featuring…