Built a full AI assistant PWA on Groq — memory, streaming, TTS, vision, all edge-deployed

Hey — I’ve been building a full AI assistant PWA called Cyanix AI powered entirely by Groq and wanted to share it here.

**What it is:**

Mobile-first progressive web app with persistent memory, streaming chat, TTS, file attachments, and web search. Built solo on GitHub Pages + Supabase edge functions.
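The Supabase edge functions are mostly a thin proxy so the Groq API key never ships to the client. A rough sketch of the request-building side (a hypothetical helper, not the app's actual code, assuming Groq's OpenAI-compatible endpoint; `compound-beta` is the chat model I use):

```typescript
// Hypothetical helper an edge function might use to forward a chat
// request to Groq while keeping the API key server-side.
interface ChatMessage { role: string; content: string; }

function buildGroqRequest(apiKey: string, messages: ChatMessage[], model = "compound-beta") {
  return {
    url: "https://api.groq.com/openai/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${apiKey}`,   // key stays on the server
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model, messages, stream: true }),
    },
  };
}
```

The edge function then just does `fetch(url, init)` and pipes the streamed body back to the browser.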

**How Groq fits in:**

- `compound-beta` for main chat + built-in search

- `llama-3.2-11b-vision-preview` for image understanding

- `llama-3.1-8b-instant` for background tasks

- `whisper-large-v3-turbo` for voice input

The streaming speed on mobile is what sold me — feels instant in a way that actually matters for daily use.
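That speed only shows if the UI paints tokens as they arrive. A minimal sketch of the chunk parsing (assuming Groq's OpenAI-compatible `data: ...` / `data: [DONE]` SSE format; `extractDeltas` is a hypothetical name, not code from the app):

```typescript
// Parse one SSE buffer from an OpenAI-compatible streaming endpoint
// into the text deltas it contains.
function extractDeltas(sseBuffer: string): string[] {
  const deltas: string[] = [];
  for (const line of sseBuffer.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue;   // skip blank lines / comments
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break;              // end-of-stream sentinel
    const chunk = JSON.parse(payload);
    const text = chunk.choices?.[0]?.delta?.content;
    if (typeof text === "string") deltas.push(text);
  }
  return deltas;
}
```

In practice you'd feed this from a `ReadableStream` reader and append each delta to the visible message as it lands.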

**Live:** Cyanix AI

Still actively building. Anyone found good patterns for managing context efficiently with compound models?
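To make the question concrete, here's the sort of naive trimming I mean (a sketch only, using a rough ~4 chars/token estimate instead of a real tokenizer):

```typescript
interface Msg { role: "system" | "user" | "assistant"; content: string; }

// Crude token estimate: ~4 characters per token. A real tokenizer
// would be more accurate, but this is cheap and good enough to sketch.
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

// Pin the system prompt and drop the oldest turns until the
// conversation fits within the token budget.
function trimHistory(messages: Msg[], budget: number): Msg[] {
  const system = messages.filter(m => m.role === "system");
  const turns = messages.filter(m => m.role !== "system");
  let used = system.reduce((n, m) => n + estimateTokens(m.content), 0);
  const kept: Msg[] = [];
  for (let i = turns.length - 1; i >= 0; i--) {   // walk newest-first
    const cost = estimateTokens(turns[i].content);
    if (used + cost > budget) break;
    used += cost;
    kept.unshift(turns[i]);
  }
  return [...system, ...kept];
}
```

Curious whether people summarize the dropped turns instead of discarding them outright.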

— Sarano

**schnep1** (March 22) replied:

nice so far, keep it going.

found these bugs:

- JS error: `loadNotifications is not defined` (:0) shows in red on the top bar
- settings not working
- always answers in German
- often get "no response detected" in red text in chat (always when web search was on)

Thanks for spotting those bugs! I'll start on fixes immediately.
