Does anyone know of a way to do this locally with Ollama? The 'chat with documentation' thing is something I was thinking of a week ago when dealing with hallucinating cloud AI. I think it'd be worth the energy to embed a set of documentation locally to help with development
Yes, LangChain has integrations specifically for connecting to Ollama, and they can be chained with other tooling in the library to pull in your documents, chunk them, and store them for RAG. See here for a good example notebook: