
I Use Claude Code But Not for My Personal Projects — Here’s What I Use Instead


I like Claude Code. But for some of my personal projects, the last thing I want is my code touching a cloud server I don’t control. So I went looking for an open source alternative and found this absolute beast.

It’s called Goose. Honestly surprised it took me this long to find it.

So what is Goose exactly?


Think of it as an AI agent that lives on your machine. Not a chatbot that gives you code suggestions. An actual agent that can create files, edit code, run commands, debug errors, and work through multi-step tasks on its own.

The part that makes it different from most AI coding tools is model flexibility. Goose doesn’t care what LLM you use. Connect it to Claude, GPT-4, Gemini, or Groq; or, if you want everything fully local with zero internet, plug in an Ollama model like GLM-5 or Kimi K2. Your choice, your data.

How it’s different from Claude Code

Claude Code is built around Anthropic’s own models and needs an account to run. Goose has none of that: it’s fully open source and no account is needed. The difference seems small. In practice, for personal projects, it changes everything.

| Feature            | Claude Code | Goose                       |
|--------------------|-------------|-----------------------------|
| Requires account   | Yes         | No                          |
| Fully local option | Partial     | Yes                         |
| Model flexibility  | Limited     | Any LLM or Ollama           |
| Open source        | No          | Yes (Apache 2.0)            |
| Approval gates     | Yes         | Minimal                     |
| Internet required  | Yes         | Only if using cloud models  |
| Desktop app        | Yes         | Yes                         |
| CLI support        | Yes         | Yes                         |
| Cost               | Paid        | Free                        |

Your model, your choice

Goose doesn’t care what model you use. Connect it to Claude, GPT-4, Gemini, or Groq if you want cloud performance, or point it at a local Ollama model if you want everything to stay on your machine. Both work. It just depends on your hardware and what you actually need from the session.
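For the fully local route, the setup is roughly this. A hedged sketch: the model tag is just an example, and the `GOOSE_PROVIDER`/`GOOSE_MODEL` environment variables reflect Goose's documented configuration as of recent versions, so check `goose configure` or the docs if anything has changed.

```shell
# Pull a local model with Ollama (any tag your hardware can handle;
# "qwen2.5-coder" here is only an example).
ollama pull qwen2.5-coder

# Point Goose at the local Ollama provider. These env vars are what
# Goose reads in recent versions -- verify against the current docs.
export GOOSE_PROVIDER=ollama
export GOOSE_MODEL=qwen2.5-coder

# Start a session. With a local provider, nothing leaves the machine.
goose session
```

The same switch works in the other direction: change the provider and model, and the next session talks to a cloud model instead.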

I personally use GLM-5 for most of my personal projects. Is it as good as Claude Opus? No. But it’s good enough for what I’m building, it runs locally, and my code never leaves my machine. That tradeoff works for me.

The part I actually appreciate though is how easy it is to switch. If I hit something complex that needs heavier reasoning I just change the model in settings and keep going. No reinstalling, no reconfiguring, no starting over. Same session, different model.

That kind of flexibility is rare in coding tools. Most lock you in. Goose just gets out of the way.

Getting started is simpler than you think

Goose installs like any normal app: download the binary for your system, open it, connect your model of choice, and you’re running. Use the desktop app if you prefer a visual interface, or the CLI if you like staying in the terminal. Both work the same way.
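For the CLI path, the whole flow is a few commands. A sketch based on the project's published install script; the exact URL and commands may change, so treat the docs as the source of truth.

```shell
# Install the Goose CLI via the project's install script
# (on macOS, a Homebrew formula is also available -- check the docs).
curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh | bash

# Interactive setup: pick a provider (Anthropic, OpenAI, Ollama, ...)
# and a default model, and enter an API key if the provider needs one.
goose configure

# Start an agent session in the current project directory.
goose session
```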

Why I chose Goose for personal projects

Simple reason. I didn’t want my code leaving my system. I just have projects where the idea itself is something I want to keep to myself until it’s ready. That’s it.

Goose gives me that. My code stays on my machine, my ideas stay in my head, and I still get an AI agent that can actually work through problems autonomously.

Everyone has their own reasons for caring about this. Maybe it’s a client project with an NDA. Maybe it’s something you’re building that you don’t want anyone seeing yet. Maybe you just like knowing exactly where your data goes.

Whatever the reason, if you’ve been looking for an open source alternative to Claude Code that actually works, this one is worth trying.
