WACI: WhatsApp Calm Inbox

WACI (WhatsApp Calm Inbox) is an open-source mobile app that connects to WhatsApp as a linked device — entirely on your phone, with no backend server — and uses a pluggable local or cloud LLM to surface only the messages that actually need your attention. It replaces the anxiety of hundreds of unread group messages with a calm, AI-triaged priority inbox. All message processing happens on-device; your conversations never leave your phone.

Description


A calm, AI-powered priority inbox for WhatsApp. Open-source, on-device, no server.


The Problem

WhatsApp is where real life happens in India — family groups, alumni networks, professional communities, local neighbourhoods. But the same app that carries your most important messages also buries them under hundreds of forwarded memes, good-morning images, and group chatter.

Most busy people don't abandon these groups. They just stop opening them. And in the silence, they miss things that matter: a job opportunity mentioned in passing, a request from a colleague, a time-sensitive update from a community they care about.

This isn't a spam problem — it's a signal-to-noise problem. The groups are worth being in. The messages worth reading are just impossible to find without spending 30 minutes skimming.


What WACI Does

WACI connects to WhatsApp using the official multi-device linked-device protocol (the same mechanism as WhatsApp Web and WhatsApp Desktop). It registers itself as a companion device on your phone.

When you open the app, it:

  1. Connects to WhatsApp and fetches all messages received since your last session

  2. Triages each message against your personal filters using a local or cloud LLM

  3. Surfaces only the messages that match what you care about — action items, opportunities, important updates, mentions, or whatever you define

  4. Disconnects — no persistent connection, no background drain, no notifications unless you want them

The result is a calm inbox: open it when you're ready, see only what matters, close it and get back to work.
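The four-step session above can be sketched as a single loop. This is an illustrative sketch, not the actual wabridge API — `Message`, `Triager`, and `runSession` are hypothetical names, and the injected `fetch` stands in for the real whatsmeow sync:

```go
package main

import (
	"fmt"
	"strings"
)

// Message is a simplified incoming WhatsApp message.
type Message struct {
	Chat, Sender, Text string
}

// Match records a message that passed one of the user's filters.
type Match struct {
	Msg    Message
	Filter string
	Reason string
}

// Triager evaluates one message against one natural-language filter.
// In WACI this is backed by an LLM; any implementation works here.
type Triager interface {
	Evaluate(msg Message, filter string) (relevant bool, reason string)
}

// runSession models the connect → fetch → triage → disconnect cycle.
// The caller tears the connection down after this returns, so there is
// no persistent connection and no background drain.
func runSession(fetch func() []Message, t Triager, filters []string) []Match {
	var matches []Match
	for _, m := range fetch() { // steps 1–2: everything since last session
		for _, f := range filters { // step 3: check each active filter
			if ok, reason := t.Evaluate(m, f); ok {
				matches = append(matches, Match{Msg: m, Filter: f, Reason: reason})
			}
		}
	}
	return matches // step 4: caller disconnects
}

// demoTriager is a stand-in that flags messages containing a request word.
type demoTriager struct{}

func (demoTriager) Evaluate(m Message, f string) (bool, string) {
	if strings.Contains(m.Text, "please") {
		return true, "someone is asking for something"
	}
	return false, ""
}

func main() {
	fetch := func() []Message {
		return []Message{
			{Chat: "Alumni", Text: "please review the draft by Friday"},
			{Chat: "Alumni", Text: "good morning everyone"},
		}
	}
	for _, m := range runSession(fetch, demoTriager{}, []string{"Action Items"}) {
		fmt.Printf("[%s] %s — %s\n", m.Filter, m.Msg.Text, m.Reason)
	}
}
```

The LLM-backed triager slots in behind the same interface, which is what keeps the core loop provider-agnostic.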


Architecture

WACI is built around a fully on-device architecture. There is no central server. Your WhatsApp messages never leave your phone.

Your Phone
├── React Native App (UI layer)
└── Go Bridge (wabridge)
    ├── whatsmeow — implements the official WA multi-device protocol
    ├── SQLite — session keys + filter match results
    └── LLM client — calls local Ollama or any OpenAI-compatible API

WhatsApp's servers ← the only connection leaving the device (via whatsmeow)

Key components

wabridge — Go package compiled to native code via gomobile. This is the engine: it wraps whatsmeow (the same battle-tested library used by Beeper) to speak WhatsApp's binary protocol over WebSocket with full end-to-end encryption. The package is compiled to a native .aar (Android) and .xcframework (iOS) using gomobile bind and loaded as a React Native native module.
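A gomobile-bound package can only export a restricted type set (integers, strings, bools, []byte, and structs used through pointers), so composite results typically cross the bridge as JSON strings. The sketch below shows that shape of API surface; all names are illustrative, not the actual wabridge exports, and in the real build this would live in its own package passed to `gomobile bind`:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Bridge is the object a Kotlin/Swift native module would hold.
type Bridge struct {
	paired bool
}

// NewBridge is called once from the native module at app start.
func NewBridge(dbPath string) *Bridge { return &Bridge{} }

// PairQRCode returns the payload the UI renders as a linking QR code.
func (b *Bridge) PairQRCode() string {
	b.paired = true // stand-in for the real whatsmeow pairing flow
	return "waci-demo-qr"
}

// SyncResult is serialized to JSON because gomobile cannot marshal
// arbitrary structs or slices of structs across the bridge.
type SyncResult struct {
	Matches int    `json:"matches"`
	Error   string `json:"error,omitempty"`
}

// SyncJSON runs one fetch-and-triage session and reports the result as JSON.
func (b *Bridge) SyncJSON() string {
	res := SyncResult{}
	if !b.paired {
		res.Error = "not paired"
	}
	out, _ := json.Marshal(res)
	return string(out)
}

func main() {
	b := NewBridge("/data/waci.db")
	fmt.Println(b.SyncJSON()) // {"matches":0,"error":"not paired"}
	b.PairQRCode()
	fmt.Println(b.SyncJSON()) // {"matches":0}
}
```

The JSON-over-string convention is a common workaround for gomobile's type restrictions and keeps the Kotlin/Swift side down to a thin parse step.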

React Native (Expo bare workflow). The UI: priority inbox feed, filter management, and the WhatsApp pairing screen. It connects to the Go bridge via typed native modules.

Pluggable LLM triage. WACI supports any OpenAI-compatible LLM endpoint. The FOSS-native path runs Ollama on the same device or local network with open-weight models (Llama 3, Mistral, Gemma). Cloud APIs (Claude, GPT-4o, Gemini) are supported as optional providers for users who prefer them; the user supplies their own API key, and WACI has no hosted AI backend.
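Because Ollama serves the OpenAI-compatible `/v1/chat/completions` endpoint (by default under `http://localhost:11434/v1`), one request builder can target every provider. This is a sketch under stated assumptions — the prompt wording and function name are illustrative, not WACI's actual client:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// chatRequest is the body shape of the OpenAI-compatible
// /v1/chat/completions endpoint, which Ollama also serves.
type chatRequest struct {
	Model    string    `json:"model"`
	Messages []chatMsg `json:"messages"`
}

type chatMsg struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// newTriageRequest builds a triage call against any OpenAI-compatible base
// URL. apiKey may be empty for a local Ollama instance.
func newTriageRequest(baseURL, apiKey, model, filter, message string) (*http.Request, error) {
	body, err := json.Marshal(chatRequest{
		Model: model,
		Messages: []chatMsg{
			{Role: "system", Content: `Decide whether the message matches the filter. Reply with JSON: {"relevant": bool, "reason": string, "confidence": float}.`},
			{Role: "user", Content: "Filter: " + filter + "\nMessage: " + message},
		},
	})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest("POST", baseURL+"/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	if apiKey != "" {
		req.Header.Set("Authorization", "Bearer "+apiKey)
	}
	return req, nil
}

func main() {
	req, err := newTriageRequest("http://localhost:11434", "", "llama3",
		"Action Items", "Can you send the report today?")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL) // POST http://localhost:11434/v1/chat/completions
}
```

Switching providers is then a matter of changing the base URL, model name, and key — the request shape stays constant.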


FOSS Alignment


| Concern | Answer |
|---|---|
| WhatsApp connection | Uses whatsmeow, an open-source Go library implementing the official multi-device protocol. No screen scraping, no Puppeteer. |
| LLM dependency | Pluggable. Primary FOSS path: Ollama + open-weight models running locally. Cloud APIs are optional and user-supplied. |
| Data storage | Local SQLite on-device only. No cloud database, no telemetry. |
| License | GPL-3.0 |
| Server infrastructure | None. Zero. The app is the entire system. |

The only non-FOSS dependency is WhatsApp's own servers, which is unavoidable for any WhatsApp client — the same position occupied by every open-source WhatsApp bridge in existence.


LLM Attribution & AI-Assisted Development

This project was built with significant assistance from Claude (Anthropic) for architecture design, code generation, and implementation planning. All AI-generated code has been reviewed and understood by the author. The architecture decisions, product direction, tradeoff reasoning, and final implementation responsibility are entirely the author's own.

Specifically: Claude Code was used to scaffold boilerplate and generate implementation plans; Claude (via claude.ai) was used to reason through the whatsmeow vs mautrix-whatsapp tradeoff, the gomobile bridge design, and the on-device vs cloud architecture decision; and OpenClaw (driving the Claude Sonnet 4.6 and GPT Codex 5.1 models under the username alokit-bot) was used to write much of the code. These conversations shaped the architecture materially and are part of the honest story of how this was built.

The on-device LLM triage component was designed so the app does not depend on any single AI provider — the same principle applied to the development process itself.


Filters — How Triage Works

Users define filters in natural language. Three are seeded by default:

  • Action Items — "Messages where someone is asking me to do something or expecting a response from me"

  • Opportunities — "Interesting opportunities, invitations, or offers I might want to follow up on"

  • Important Updates — "Time-sensitive announcements or changes I need to know about"

Each incoming message is evaluated against each active filter. The LLM returns { relevant: bool, reason: string, confidence: float }. Matches are stored locally and displayed in the inbox with the reason surfaced to the user.
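Parsing that result takes a little defensiveness in practice: small open-weight models often wrap the JSON in prose or code fences. A minimal sketch, assuming a hypothetical `parseFilterResult` helper (not WACI's actual parser), that slices out the object before unmarshalling:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// FilterResult mirrors the { relevant, reason, confidence } object the
// triage prompt asks the model to emit.
type FilterResult struct {
	Relevant   bool    `json:"relevant"`
	Reason     string  `json:"reason"`
	Confidence float64 `json:"confidence"`
}

// parseFilterResult extracts a FilterResult from raw model output by
// slicing from the first '{' to the last '}' before unmarshalling, which
// tolerates surrounding prose and markdown fences.
func parseFilterResult(raw string) (FilterResult, error) {
	var r FilterResult
	start, end := strings.Index(raw, "{"), strings.LastIndex(raw, "}")
	if start < 0 || end <= start {
		return r, fmt.Errorf("no JSON object in model output")
	}
	err := json.Unmarshal([]byte(raw[start:end+1]), &r)
	return r, err
}

func main() {
	raw := "Sure! Here is my verdict:\n```json\n{\"relevant\": true, \"reason\": \"expects a reply from you\", \"confidence\": 0.92}\n```"
	r, err := parseFilterResult(raw)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%v %q %.2f\n", r.Relevant, r.Reason, r.Confidence) // true "expects a reply from you" 0.92
}
```

The `reason` field is what the inbox surfaces next to each match, so the user always sees why a message was promoted.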


What's Built in March 2026

The entire mobile app was built in March 2026:

  • packages/wabridge — Go package: whatsmeow integration, SQLite store, pluggable LLM triage client

  • gomobile build pipeline for Android (.aar) and iOS (.xcframework)

  • Android native module (Kotlin) bridging Go ↔ React Native

  • iOS native module (Swift) bridging Go ↔ React Native

  • React Native app: pairing screen, priority inbox, filter management

  • Ollama integration as the default FOSS LLM backend

  • End-to-end working demo on Android


Running It Yourself

```bash
# 1. Build the Go bridge for Android
./scripts/build-wabridge-android.sh

# 2. Run the app
cd apps/mobile-client
npx expo run:android
```

Requires: Go 1.21+, Android SDK, gomobile. See README for full setup.


Repo

github.com/avikalpg/whatsapp-ai-filter — GPL-3.0. The mobile app lives in apps/mobile-client/ and the Go bridge in packages/wabridge/.


Note: WACI grew out of a parent project, WhatsApp AI Filter, which ran as a server and used your WhatsApp self-chat as its UI.