
Just After Launching Qwen3.5, Qwen’s Core Team Walked Out. Is This the Last Great Qwen Model?


Yesterday I was testing Qwen3.5-4B on my machine, genuinely impressed by what a 4B model was doing with images and reasoning. Then I opened X and saw a five-word post from Junyang Lin, the man who built Qwen from the ground up: “bye my beloved qwen.”

That was it. No explanation, just a goodbye.

Within hours the replies were flooding in. Developers, researchers, and open source contributors were all asking the same thing: what just happened? Then Elon Musk’s comment on Qwen3.5 calling it “impressive intelligence density” surfaced, and Lin replied with a simple “thx elon.” People in the comments started connecting the dots. Was he already gone when he replied? Did he know? Nobody is quite sure what to make of that exchange, but it made the whole thing feel even stranger.

Lin wasn’t alone. Yu Bowen, who led post-training for Qwen, resigned the same day. Hui Binyuan, a core contributor focused on coding, had already left in January. Three of the most important people behind one of the best open source AI model families in the world, gone within months of each other.

I had just tested the model. I had just written about why it was worth your attention. And now the people who built it had walked out.

How Qwen Lost Its Soul

Qwen Team Lead Stepped Down

Junyang Lin wasn’t just a team lead. He had been building Qwen since 2022, turning it from an internal Alibaba project into one of the most downloaded open source model families in the world: over 400 models released, over a billion downloads. That’s not a department head; that’s the person the whole thing was built around.

His resignation came on Wednesday, two days after Qwen3.5 launched: just that five-word post on X, and silence after it.

Yu Bowen left the same day. Hui Binyuan had already quietly resigned in January. Three people who understood Qwen at its deepest level, all gone within two months.

What pushed them out isn’t entirely clear. Lin himself said at a Beijing forum in January that his team was stretched thin, spending most of their resources just meeting delivery demands rather than doing the kind of research that actually moves things forward.

Alibaba CEO Eddie Wu responded with a brief statement thanking Lin and announcing a task force to coordinate future AI model development. Reading between the lines, that sounds like exactly the kind of corporate restructuring that makes passionate researchers leave: separate teams, separate goals, less room to just build something great.

The open source community noticed immediately. Zhipu AI’s CEO was already publicly trying to recruit the departing engineers within hours of Lin’s post.

I tested it right before everything fell apart

The timing is strange to think about. While Lin was probably writing that five-word goodbye, I was running Qwen3.5-4B on my machine, genuinely surprised by what it was doing.

Vision works better than you’d expect from a 4B model. I dropped in images and it described them accurately: screenshots, diagrams, general scenes. It occasionally stumbled on location and landmark identification, giving confident answers that were just wrong. But for everyday image understanding it holds up.

Text and reasoning are where it genuinely impressed me. It thinks before it answers and works through problems rather than guessing. For something running on 16GB of RAM and 6GB of VRAM, that’s not what you expect.
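If you want to try something similar yourself, the simplest route on modest hardware is a quantized local build. A minimal sketch with Ollama, assuming a build of the model is published under this tag (the tag name is my guess; check the Ollama library for the real one):

```shell
# Pull a quantized local build of the model, then chat with it.
# NOTE: the tag "qwen3.5:4b" is an assumption -- verify the actual
# model name in the Ollama library before running.
ollama pull qwen3.5:4b
ollama run qwen3.5:4b "Think step by step: which is larger, 9.11 or 9.9?"
```

Quantized builds are what make the 6GB-VRAM figure plausible; the full-precision weights would need considerably more memory.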

The model is good. That’s what makes this whole situation harder to sit with.

What happens to the Qwen models we already have?

The models already out there aren’t going anywhere. They’re open source, already downloaded over a billion times, and the community will keep building on them regardless of what Alibaba does next.

But that’s where the reassurance ends.

The real concern is what comes next. Qwen’s release pace existed because the people behind it genuinely cared. Those people just left. Whether Alibaba keeps the same momentum, or even keeps future models open source, nobody knows right now. A few months ago nobody expected Lin to leave either.

Use it while you can

Maybe Alibaba figures it out. Maybe the new team surprises everyone. But right now the safest thing you can do is download the Qwen3.5 weights locally and keep them. We don’t know if the next version will be this good or this open.
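If you want to keep a copy, a sketch using the Hugging Face CLI is the straightforward way to mirror a full model repository. The repo id `Qwen/Qwen3.5-4B` is an assumption on my part; confirm the exact name on the Hub first:

```shell
# Download the full model repository (weights, tokenizer, configs)
# to a local folder you control.
# NOTE: the repo id is assumed -- verify it on huggingface.co first.
huggingface-cli download Qwen/Qwen3.5-4B --local-dir ./qwen3.5-4b
```

Once the files are on your disk, no license change or takedown upstream can pull them out from under you.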

What Lin and his team built was rare, a full open source AI stack, genuinely competitive, freely available. That doesn’t come around often. And right now it feels like we’re watching the end of something without quite knowing what comes next.


