Hey — I’ve been building a full AI assistant PWA called Cyanix AI powered entirely by Groq and wanted to share it here.
**What it is:**
Mobile-first progressive web app with persistent memory, streaming chat, TTS, file attachments, and web search. Built solo, with the static frontend on GitHub Pages and Supabase edge functions as the backend.
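For anyone curious how a static GitHub Pages frontend can talk to Groq safely: the usual pattern is to proxy through an edge function so the API key never ships to the browser. A minimal sketch of the request the PWA might build (field names follow the OpenAI-compatible schema Groq exposes; the helper, history, and endpoint URL are my assumptions, not Cyanix's actual code):

```typescript
// Hypothetical: shape of one chat turn in the conversation history.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build an OpenAI-compatible chat-completions payload. The PWA POSTs
// this to a Supabase edge function, which attaches the Groq key
// server-side and forwards it upstream.
function buildChatRequest(messages: ChatMessage[], model = "compound-beta") {
  return {
    model,
    messages,
    stream: true, // stream tokens back for the instant-feel UX
  };
}

// Usage from the browser (endpoint URL is illustrative):
// fetch("https://<project>.functions.supabase.co/chat", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildChatRequest(history)),
// });
```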
**How Groq fits in:**
- `compound-beta` for main chat + built-in search
- `llama-3.2-11b-vision-preview` for image understanding
- `llama-3.1-8b-instant` for background tasks
- `whisper-large-v3-turbo` for voice input
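The per-task model split above can be sketched as a small router. The model ids are the ones listed in the post; the task names and the helper itself are illustrative, not Cyanix's actual code:

```typescript
// Hypothetical task router: map each assistant task to the model
// the post assigns it, so call sites never hard-code model ids.
type Task = "chat" | "vision" | "background" | "transcribe";

const MODEL_FOR_TASK: Record<Task, string> = {
  chat: "compound-beta",                  // main chat + built-in search
  vision: "llama-3.2-11b-vision-preview", // image understanding
  background: "llama-3.1-8b-instant",     // cheap/fast background tasks
  transcribe: "whisper-large-v3-turbo",   // voice input
};

function pickModel(task: Task): string {
  return MODEL_FOR_TASK[task];
}
```

Keeping this in one place also makes it painless to swap a model id when a preview model gets deprecated.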
The streaming speed on mobile is what sold me — feels instant in a way that actually matters for daily use.
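For context, the streaming path on the client is mostly just parsing server-sent events. A minimal sketch of extracting text deltas from one SSE chunk, assuming the OpenAI-compatible wire format Groq uses (`data: {...}` lines with a `[DONE]` sentinel); a real client would also buffer JSON split across network chunks:

```typescript
// Parse one SSE chunk from an OpenAI-compatible streaming endpoint
// and return the concatenated text deltas it contains.
function parseSseChunk(chunk: string): string {
  let text = "";
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue; // skip blanks/comments
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break;            // end-of-stream sentinel
    try {
      const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
      if (typeof delta === "string") text += delta;
    } catch {
      // partial JSON across chunk boundaries; buffer it in a real client
    }
  }
  return text;
}
```

Appending each delta to the DOM as it arrives is what makes it feel instant on mobile.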
**Live:** Cyanix AI
Still actively building. Anyone found good patterns for managing context efficiently with compound models?
— Sarano