A quantized, offline-first LLM that delivers simplified legal explanations in Tamil and English, optimized for 8GB RAM devices.
Legal documents and the new Bharatiya Nyaya Sanhita (BNS) are complex and often inaccessible to the common citizen or local administration in rural areas with poor internet connectivity. Most existing AI solutions require high-end GPUs or expensive, proprietary APIs (OpenAI/Gemini), creating a "digital divide" in legal literacy.
Nyaya-LLM is an offline-first, open-source legal assistant. It leverages 4-bit quantized Llama-3/Mistral models via Ollama to run on standard 8GB-16GB RAM laptops.
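As a rough sketch of how the app could talk to a locally running Ollama server (Ollama's default endpoint is `http://localhost:11434/api/generate`; the model name and prompt below are illustrative assumptions, not taken from the project code):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(model: str, prompt: str) -> dict:
    """Assemble the JSON body expected by Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_llm(model: str, prompt: str) -> str:
    """Send one prompt to the local Ollama server and return its reply text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a prior `ollama pull llama3` and a running server):
# ask_local_llm("llama3", "Explain Section 303 of the BNS in one sentence.")
```

Because everything goes over localhost, no query ever leaves the machine, which is what makes the privacy-first, offline operation possible.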
- Offline RAG: Locally indexes BNS/BNSS/BSA documents so answers stay grounded in the statutory text, sharply reducing hallucinations.
- Multilingual Support: Interactive interface providing simplified explanations in Tamil and English.
- Low-Resource Optimized: Specifically tuned to run on low-end hardware without needing a dedicated GPU.
- Privacy First: No data leaves the device; all processing is local.
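To illustrate the offline RAG lookup step, here is a deliberately simplified stand-in: the project itself uses ChromaDB with embeddings, but a bag-of-words cosine scorer over placeholder section snippets (the texts below are paraphrases, not the statute) shows the same retrieve-then-answer shape:

```python
import math
from collections import Counter

# Placeholder snippets standing in for locally indexed BNS sections.
SECTIONS = {
    "BNS 303": "theft is the dishonest taking of movable property without consent",
    "BNS 318": "cheating is deceiving a person to deliver property or alter a valuable security",
}

def vectorize(text: str) -> Counter:
    """Bag-of-words term counts; the real index would use vector embeddings."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str) -> str:
    """Return the section whose text best matches the query."""
    qv = vectorize(query)
    return max(SECTIONS, key=lambda s: cosine(qv, vectorize(SECTIONS[s])))

# retrieve("what counts as theft of property")  -> "BNS 303"
```

The retrieved section text is then prepended to the user's question before it is sent to the local model, so the model answers from the statute rather than from memory.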
- AI Engine: Ollama / Ctransformers (GGUF Quantization)
- Framework: Streamlit for the UI
- Logic: Python (LangChain / ChromaDB for local vector storage)
- Data: Publicly available BNS (Bharatiya Nyaya Sanhita) gazette datasets.
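One plausible way to wire the Tamil/English toggle into the prompt is a per-language template; the wording below is an assumption for illustration, not the project's actual prompt:

```python
# Illustrative system-prompt templates; the project's real prompts may differ.
TEMPLATES = {
    "en": "You are a legal assistant. Explain the following in simple English:\n{q}",
    "ta": "You are a legal assistant. Explain the following in simple Tamil:\n{q}",
}

def build_prompt(question: str, lang: str = "en") -> str:
    """Fill the language-specific template; fall back to English for unknown codes."""
    return TEMPLATES.get(lang, TEMPLATES["en"]).format(q=question)
```

The Streamlit UI would pass the user's language choice as `lang`, keeping a single model and retrieval pipeline behind both interfaces.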
Nyaya-LLM is built entirely on open-source libraries. It follows the FOSS Hack mandate by avoiding all proprietary APIs. The project is licensed under the MIT License.