Hi HN, I am Jan, CTO and co-founder of Pathway.com.
We’ve built an LLM microservice that answers questions about a corpus of documents and automatically reacts as new docs are added. The single, self-contained service replaces a complex multi-system pipeline that scans for new documents in real time, indexes them into a specialized database, and queries that index to generate answers. Everyone can have their own real-time vector index now.
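To make the scan → index → query loop concrete, here is a toy in-memory sketch of the idea. All names below are illustrative (the real app is built on Pathway's streaming engine and a proper embedding index, not keyword overlap):

```python
# Toy sketch of a "live" index: docs added at any time are queryable immediately.
# Illustrative only -- the real llm-app uses Pathway + embeddings, not token overlap.
from dataclasses import dataclass, field

@dataclass
class LiveIndex:
    # doc_id -> set of lowercase tokens (stand-in for an embedding vector)
    docs: dict = field(default_factory=dict)

    def add(self, doc_id: str, text: str) -> None:
        # "Reacting to additions": indexing happens the moment a doc arrives,
        # with no separate re-index batch job.
        self.docs[doc_id] = set(text.lower().split())

    def query(self, question: str, k: int = 1) -> list:
        # Rank docs by token overlap with the question (stand-in for
        # nearest-neighbor search over embeddings).
        q = set(question.lower().split())
        ranked = sorted(self.docs.items(), key=lambda kv: len(q & kv[1]), reverse=True)
        return [doc_id for doc_id, _ in ranked[:k]]

index = LiveIndex()
index.add("faq.md", "Pathway builds real time data pipelines")
index.add("pricing.md", "The service is free for open source projects")
print(index.query("is it free for open source"))  # -> ['pricing.md']

# A doc added later is visible to the very next query:
index.add("realtime.md", "real time indexing of new documents")
print(index.query("how does real time indexing work"))  # -> ['realtime.md']
```

The retrieved doc would then be passed to the LLM as context for answer generation; the point of the sketch is only that ingestion and querying share one live structure instead of a batch ETL step.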
GitHub: https://github.com/pathwaycom/llm-app
Demo video: https://youtu.be/kcrJSk00duw
I am eager to hear your thoughts and comments! Example pipelines:
- https://github.com/pathwaycom/llm-app/blob/main/llm_app/path... for the simplest contextless app
- https://github.com/pathwaycom/llm-app/blob/main/llm_app/path... for the default app that builds a reactive index of context documents
- https://github.com/pathwaycom/llm-app/blob/main/llm_app/path... for the contextful app reading data from S3
- https://github.com/pathwaycom/llm-app/blob/main/llm_app/path... for the app using locally available models