Talk
Intermediate

Build Privacy-Focused Open Source AI Web Applications with Ollama, LangChainJS & TransformersJS

Rejected

Today, most AI applications send data to LLM cloud providers like OpenAI, raising privacy concerns. This talk presents an alternative, privacy-focused way to build AI applications by running open source LLMs locally with Ollama, keeping everything on your own machine and avoiding sending sensitive information to external servers. The talk also highlights LangChain's ability to create versatile AI agents that handle tasks autonomously by generating embeddings for your data. So come learn how you can build the next generation of privacy-focused JavaScript/web applications powered by local LLMs.
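
As a taste of what this looks like in practice, here is a minimal sketch of invoking a local model through LangChainJS. The package name (@langchain/ollama), the llama3 model, and the default port are illustrative assumptions, not necessarily the exact setup used in the talk.

```ts
// Minimal sketch, assuming Ollama is running locally on its default port
// (11434) and the "llama3" model has already been pulled (`ollama pull llama3`).
import { ChatOllama } from "@langchain/ollama";

const model = new ChatOllama({
  baseUrl: "http://localhost:11434", // local Ollama server: prompts never leave the machine
  model: "llama3",
});

const response = await model.invoke(
  "Summarise why local LLM inference helps with data privacy."
);
console.log(response.content);
```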


The talk covers the following topics:


1. Overview of the privacy issues with cloud-based LLMs and the importance of running LLM inference locally.


2. Detailed insights into generating embeddings with tools like Ollama, and a demonstration of how LangChain agents can perform tasks such as document summarisation and API interactions, all while maintaining data privacy in a JavaScript/React application (a short sketch of the embedding step follows this list).


3. Practical use cases for this approach.
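
For topic 2, here is a minimal sketch of generating embeddings locally and running a similarity search with LangChainJS. The import paths, the nomic-embed-text model, and the in-memory vector store are assumptions for illustration, not necessarily the stack shown in the session.

```ts
// Minimal sketch, assuming the "nomic-embed-text" embedding model has been
// pulled into the local Ollama instance.
import { OllamaEmbeddings } from "@langchain/ollama";
import { MemoryVectorStore } from "langchain/vectorstores/memory";

const embeddings = new OllamaEmbeddings({ model: "nomic-embed-text" });

// Embed a couple of documents entirely on-device and keep them in memory.
const store = await MemoryVectorStore.fromTexts(
  ["Invoice #123 for ACME Corp", "Quarterly security audit notes"],
  [{ id: 1 }, { id: 2 }],
  embeddings
);

// Retrieve the most relevant document for a query; no data leaves the machine.
const results = await store.similaritySearch("security report", 1);
console.log(results[0].pageContent);
```

Because both the chat model and the embeddings are served by the local Ollama process, the retrieval step of a LangChain agent or summarisation chain stays entirely on-device.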

FOSS

Shivay Lamba
TensorFlowJS SIG & WG Lead

Approvability: 0 %
Approvals: 0
Rejections: 2
Not Sure: 0
Reviewer #1: Rejected (comment: "Generic Ollama talk")
Reviewer #2: Rejected