Koshika

Offline-first Android app that parses blood report PDFs: extracts biomarkers, tracks health trends, and answers questions via on-device AI.

Description

Problem Statement

You get your blood test back: a PDF of abbreviations and clinical jargon you can't read. So you do what everyone does: upload it to ChatGPT. Your hemoglobin, liver enzymes, and thyroid panel, fed into a model you don't control, with training data you can't opt out of. A one-time answer, a permanent loss of control over your data.

The dedicated health apps aren't better. Same cloud upload, same opaque APIs, and now your data is trapped in a proprietary format with no way out.

Koshika rejects all of this. Offline-first, open-source, entirely on-device. Parse your labs, track your biomarkers, ask an AI about your results; nothing leaves your phone. Export as FHIR R4 whenever you want.

Your body made this data. You own it.


Features

- Offline-first mobile experience with no cloud dependency

- Local PDF lab report import and parsing

- Support for multiple Indian lab report formats

- OCR fallback for difficult or image-based PDFs

- Tracking for 63 biomarkers across multiple categories

- Trend charts, reference ranges, and borderline detection

- On-device AI chat grounded in the user’s own lab data

- Safety-aware response pipeline with validation and guardrails

- FHIR R4 export for structured, standards-based health data

- No accounts, no telemetry, and no mandatory internet connection
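To illustrate how borderline detection against reference ranges might work, here is a minimal Dart sketch. The `Biomarker` type, `classify` function, and 10% margin are hypothetical illustrations, not Koshika's actual data model or thresholds:

```dart
/// Sketch of reference-range classification. Hypothetical types and
/// margin; not Koshika's actual implementation.
enum Status { low, borderlineLow, normal, borderlineHigh, high }

class Biomarker {
  final String name;
  final double value;
  final double refLow;
  final double refHigh;
  const Biomarker(this.name, this.value, this.refLow, this.refHigh);
}

/// Flags in-range values within [margin] of a bound as borderline.
Status classify(Biomarker b, {double margin = 0.10}) {
  final band = (b.refHigh - b.refLow) * margin;
  if (b.value < b.refLow) return Status.low;
  if (b.value > b.refHigh) return Status.high;
  if (b.value < b.refLow + band) return Status.borderlineLow;
  if (b.value > b.refHigh - band) return Status.borderlineHigh;
  return Status.normal;
}

void main() {
  // Illustrative hemoglobin reference range: 13.0–17.0 g/dL.
  final hb = Biomarker('Hemoglobin', 13.2, 13.0, 17.0);
  print(classify(hb)); // Status.borderlineLow
}
```

The margin-based approach is one common design: a value that is technically in range but hugging a bound gets surfaced to the user rather than silently marked normal.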


Tech Stack

- Framework: Flutter (Dart)

- Local storage: ObjectBox

- PDF extraction: syncfusion_flutter_pdf

- OCR: Google ML Kit + pdfx

- Charts and visualisation: fl_chart

- On-device LLM runtime: llamadart / llama.cpp with GGUF models

- Embeddings and semantic search: bge-small-en-v1.5 + HNSW vector indexing

- Health data export: fhir_r4
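Grounding the AI chat in lab data implies a retrieval step over the bge-small-en-v1.5 embeddings. A minimal Dart sketch of the underlying similarity ranking, using brute-force cosine similarity where the HNSW index would answer the same query approximately in sub-linear time (all names here are illustrative, not Koshika's API):

```dart
import 'dart:math';

/// Cosine similarity between two embedding vectors of equal length.
double cosine(List<double> a, List<double> b) {
  var dot = 0.0, na = 0.0, nb = 0.0;
  for (var i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (sqrt(na) * sqrt(nb));
}

/// Returns the indices of the [k] vectors in [index] most similar
/// to [query]. An HNSW index approximates this without a full scan.
List<int> topK(List<double> query, List<List<double>> index, int k) {
  final ranked = List<int>.generate(index.length, (i) => i)
    ..sort((i, j) =>
        cosine(query, index[j]).compareTo(cosine(query, index[i])));
  return ranked.take(k).toList();
}

void main() {
  final index = [
    [1.0, 0.0], // e.g. embedding of a hemoglobin entry
    [0.0, 1.0], // e.g. embedding of a thyroid entry
  ];
  print(topK([0.9, 0.1], index, 1)); // [0]
}
```

The retrieved entries would then be fed into the llama.cpp prompt so the model answers from the user's actual results rather than from memory.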


How to Use

# Clone and run
$ git clone https://github.com/priyavratuniyal/koshika.git
$ cd koshika && flutter pub get
$ flutter run

Note: Parsing, trends, and FHIR export work out of the box. AI chat is optional: download a GGUF model (~230 MB–1 GB) from Settings to enable it.
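For a sense of what the FHIR R4 export contains, each lab value maps to an Observation resource. A hand-rolled Dart sketch of the JSON shape is below; the actual export goes through the fhir_r4 package, and the LOINC code shown is one standard code for hemoglobin, used here purely as an example:

```dart
import 'dart:convert';

/// Builds a minimal FHIR R4 Observation for one lab value.
/// Sketch of the wire format only; Koshika's export uses fhir_r4.
Map<String, dynamic> observation(
    String loinc, String display, double value, String unit) {
  return {
    'resourceType': 'Observation',
    'status': 'final',
    'code': {
      'coding': [
        {'system': 'http://loinc.org', 'code': loinc, 'display': display}
      ]
    },
    'valueQuantity': {
      'value': value,
      'unit': unit,
      'system': 'http://unitsofmeasure.org',
    },
  };
}

void main() {
  // LOINC 718-7: Hemoglobin [Mass/volume] in Blood.
  final obs = observation('718-7', 'Hemoglobin', 13.2, 'g/dL');
  print(jsonEncode(obs));
}
```

Because the export is plain FHIR R4, any conformant health system or personal-record app can import it, which is what keeps the data from being trapped in a proprietary format.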