This is an automated archive made by the Lemmit Bot.
The original was posted on /r/opensource by /u/PurpleReign007 on 2024-07-26 19:23:07+00:00.
I want to interact with some proprietary files (e.g. code, business-sensitive documents, personal notes) using an LLM, but I'm not comfortable uploading them to a third-party service, so I was looking for a super simple app I could use to load, manage, and hold conversations with local files.
It felt like there should be a million of these apps (there probably are…?), but for some reason I couldn't find one that seemed stupidly simple to run and maintain - so I built one and open sourced the code. It uses Llama 3 (or Llama 3.1) via Ollama.
- Built with Flask, HTML, CSS, Python, and JavaScript
- Runs Llama 3 (or 3.1) 8B on Ollama
- Can easily swap in Llama 3.1 by changing one line of code
- Everything runs local all the time - nothing ever leaves your device
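To illustrate the one-line model swap, here's a minimal sketch (an assumption about the setup, not the repo's actual code) of how a Flask-style Python backend might talk to a local Ollama server over its standard REST API. The model name sits on a single line, so switching from Llama 3 to 3.1 means changing only that string; the function and variable names here are hypothetical.

```python
import json
import urllib.request

MODEL = "llama3"  # swap to "llama3.1" here - the one-line model change

def build_chat_request(prompt, file_text):
    """Build the JSON payload for Ollama's /api/chat endpoint,
    passing the local file's contents as system context."""
    return {
        "model": MODEL,
        "stream": False,
        "messages": [
            {"role": "system",
             "content": f"Answer using this file as context:\n{file_text}"},
            {"role": "user", "content": prompt},
        ],
    }

def ask(prompt, file_text):
    """POST the payload to the local Ollama server (default port 11434).
    Nothing leaves the machine - the request stays on localhost."""
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps(build_chat_request(prompt, file_text)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

The `ask` call assumes an Ollama server is already running locally (`ollama serve`) with the model pulled.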
Link to the repo below in case anyone is interested in using it or contributing - it's all open source. The folks over in r/ollama liked it, so I figured I'd share.
Like I said, it's super friggin simple - stupidly so. There's lots of room for improvement on the UI and other functionality, but it's up and running and I'm personally finding it useful.
This version supports chatting with one file at a time. I'm working on support for multiple files, and eventually on a connection to my notes (largely in Obsidian, some in txt files) so I can have a private, personalized assistant.