Arguably, it’s one of the best open-weight models ever. Groq’s LPUs could make it even greater!
Absolutely, we would love it if the team integrated this model into Groq. It would be a game changer!
please! can’t wait to see it
Yes pleaseeeee!!!
How do I upvote this? We need groq marketing team to look into it!
We need it too!
Gemma 4 would be unlimited power for GroqCloud.
Umm… maybe we can try to get Google’s attention and ask them to help Groq quantize the model for the LPU architecture.
When can we get Gemma 4? It seems like Gemma 4 could replace GPT OSS completely.
Pls add support for Gemma 4
I would agree that this is one of the best, if not the best, open model so far. Astonishing levels of instruction following and composure. I’d implement it on Day 1.
Yes please groq team integrate this model
Yep, Groq would be one of the best cloud providers (if not the best) for this amazing model! It’s gpt-oss-120b on steroids.
I do AI consulting for a number of clients and would immediately recommend switching some tooling to this platform if Gemma 4 were available. IMO it’s the first usable open-weight agentic model that is small enough that it would make sense for you guys to serve on your cloud platform. A lightning-fast agent like this would be a real step change. Hoping this is already in the works and would be excited to see it!
here’s hoping they add this model and the latest qwen models
It’s been a little confusing, to say the least, to see that no new models have been added in such a long time.
pls gemma 4 e2b, e4b
