This is an automated archive made by the Lemmit Bot.
The original was posted on /r/selfhosted by /u/Sorry_Transition_599 on 2025-08-14 09:09:53+00:00.
Hey r/selfhosted 👋
I’m one of the maintainers of Meetily, an open-source, privacy-first meeting note taker built to run entirely on your own machine or server.
Unlike cloud tools like Otter, Fireflies, or Jamie, Meetily is a standalone desktop app: it captures audio directly from your system audio stream and microphone.
- No bots or integrations with meeting apps needed.
- Works with any meeting platform (Zoom, Teams, Meet, Discord, etc.) right out of the box.
- Runs fully offline — all processing stays local.
New in v0.0.5
- Stable Docker support (x86_64 + ARM64) for consistent self-hosting.
- Native installers for Windows & macOS (plus Homebrew) with simplified setup.
- Backend optimizations for faster transcription and summarization.
Why this matters for LLM fans
- Works seamlessly with local Ollama-based models like Gemma 3n, Llama, Mistral, and more (see the sketch after this list).
- No API keys required if you run local models.
- Keep full control over your transcripts and summaries — nothing leaves your machine unless you choose.
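For anyone curious how "no API keys" works in practice, here is a minimal, illustrative sketch of sending a transcript to a locally running Ollama model over its REST API. This is not Meetily's internal code; the model name, the prompt, and the default port 11434 are assumptions about your own Ollama setup.

```python
# Illustrative only: summarize a meeting transcript with a local Ollama model.
# Assumes Ollama is running locally on its default port and the model has
# already been pulled (e.g. `ollama pull mistral`).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint
MODEL = "mistral"  # swap in any local model, e.g. "gemma3n" or "llama3"

def summarize(transcript: str) -> str:
    payload = {
        "model": MODEL,
        "prompt": "Summarize this meeting transcript as bullet-point notes:\n\n" + transcript,
        "stream": False,  # ask for one complete JSON response instead of a stream
    }
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(summarize("Alice: Let's ship v0.0.5 on Friday. Bob: I'll finish the Docker docs."))
```

Everything in that round trip stays on localhost, which is the same property Meetily relies on when it points its summarization step at a local model instead of a hosted API.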
📦 Get it here: GitHub – Meetily v0.0.5 Release
I’d love to hear from folks running Ollama setups - especially which models you’re finding best for summarization. Feedback on Docker deployments and cross-platform use cases is also welcome.
(Disclosure: I’m a maintainer and am part of the development team.)