
Lore: Local AI Note Manager with Smart Recall & Private Second Memory


File Information

Name: Lore
Version: v0.1.0
Type: AI-Powered Personal Knowledge & Note Manager
License: MIT License (Open Source)
Platforms: Windows • macOS • Linux
Size: 130 MB (exe) • 158 MB (dmg) • 213 MB (AppImage)
Primary Use: Local AI note capture, smart recall, and todo management
GitHub Repository: Github/Lore

Description

Lore is a lightweight, privacy-first desktop app that lives quietly in your system tray and gives you a pop-up chat interface to capture thoughts the moment they happen. Powered entirely by a local LLM through Ollama and a local vector database through LanceDB, it stores, understands, and retrieves your information without sending a single byte to the cloud.

You can store almost anything: quick notes, decision summaries, URLs, code snippets, bug reproduction steps, and todo items, then retrieve it all later by simply describing what you need in plain language. Lore classifies your input automatically and uses a RAG pipeline to pull the most relevant context before generating an answer.
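To make the recall step concrete, here is a minimal sketch of retrieval-augmented recall. Lore itself uses Ollama embeddings and LanceDB as its vector store; the toy bag-of-words embedding, the plain list of notes, and the function names below are all illustrative stand-ins, not Lore's actual code.

```python
# Minimal retrieval sketch: embed the query, rank stored notes by similarity,
# and return the top matches (a real pipeline would feed these to the LLM
# as context before generating an answer).
from collections import Counter
import math

def embed(text: str) -> Counter:
    """Toy embedding: lowercase bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recall(query: str, notes: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k notes most similar to the query."""
    q = embed(query)
    ranked = sorted(notes, key=lambda n: cosine(q, embed(n)), reverse=True)
    return ranked[:top_k]

notes = [
    "bug repro: app crashes when the tray icon is clicked twice",
    "decision: we will ship the settings panel in v0.2",
    "todo: update the onboarding docs",
]
print(recall("why does the app crash from the tray icon?", notes, top_k=1))
```

The point of the design is that you never search by filename or folder; you describe what you need, and the most semantically similar notes surface on their own.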

If you’re a developer, a knowledge worker, or someone who just wants a smarter way to remember things, Lore is worth a try.

Screenshots

Lore AI Notes Manager Demo

Features of Lore

Quick Capture: Press a global shortcut to instantly pop up the chat bar and store a thought
Smart Recall: Ask questions in plain language and get answers pulled from your stored notes
AI Classification: Automatically classifies input as a thought, question, command, or instruction
Todo Management: Add, list, complete, and organize todos with priority levels and categories
RAG Pipeline: Retrieval-augmented generation finds relevant context before every response
Fully Local: All data and AI processing run on your machine
Custom Instructions: Set persistent behavioral rules for how Lore responds to you
System Tray App: Runs silently in the background, accessible anytime with a keypress
Model Selection: Choose and download your preferred LLM and embedding models via Ollama
Settings Panel: Customize shortcuts, manage models, and control startup behavior
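The AI Classification feature routes each input to the right subsystem before anything is stored. Lore does this with its local LLM; the heuristic sketch below is only a stand-in to show the four categories and the dispatch step, and every rule and name in it is an assumption, not Lore's actual logic.

```python
# Illustrative router: map raw input to one of Lore's four input categories.
# A real implementation would ask the local LLM to classify instead of
# using keyword heuristics like these.
def classify(text: str) -> str:
    t = text.strip().lower()
    if not t:
        return "thought"
    if t.endswith("?") or t.split()[0] in {"what", "when", "where", "who", "why", "how"}:
        return "question"      # route to smart recall / RAG
    if t.startswith(("add todo", "complete todo", "list todos")):
        return "command"       # route to todo management
    if t.startswith(("always", "never", "from now on")):
        return "instruction"   # persist as a custom behavioral rule
    return "thought"           # default: store as a note

print(classify("what did we decide about the settings panel?"))  # question
print(classify("add todo: update onboarding docs"))              # command
print(classify("always answer in bullet points"))                # instruction
print(classify("the tray popup feels snappier in v0.1.0"))       # thought
```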

System Requirements

Operating System: Windows 10+, macOS (Intel & Apple Silicon), Linux (modern distros)
Processor: 64-bit CPU (Apple Silicon supported via arm64 build)
RAM: 8 GB minimum; 16 GB recommended for larger LLMs
Storage: ~500 MB for the app, plus additional space for LLM models
Internet: Not required after the initial model download
Dependencies: Ollama (guided setup during installation)

How to Install Lore?

Windows (.exe)

  1. Download the .exe installer.
  2. Double-click the installer file.
  3. Follow the on-screen setup instructions and choose your directories for LLM models and Ollama.
  4. Launch Lore from the Start Menu or System Tray.
  5. Open Settings → Models and download your preferred embedding model and LLM.

macOS (.dmg)

  1. Download the .dmg file (choose arm64 for Apple Silicon or x64 for Intel).
  2. Open the file to mount it.
  3. Drag the Lore app into your Applications folder.
  4. Launch it from Applications; the Lore icon will appear in your menu bar.
  5. Open Settings → Models and download your preferred embedding model and LLM.

Linux (.AppImage)

  1. Download the .AppImage file.
  2. Right-click the file and make it executable (or run chmod +x on it in a terminal).
  3. Double-click to launch the app.
  4. Open Settings → Models and download your preferred embedding model and LLM.

Global Shortcut: Once installed, press Ctrl+Shift+Space on Windows/Linux or Cmd+Shift+Space on macOS to toggle the Lore popup from anywhere on your desktop.

Download Lore AI Note Manager

Your Private Second Memory, Always One Shortcut Away

Lore rethinks personal note-taking by replacing passive storage with active, AI-powered recall. Instead of searching through folders or scrolling through endless notes, you just describe what you need and Lore finds it. Everything runs locally, everything stays private, and the app stays out of your way until you need it. For anyone who values both productivity and privacy, Lore is a genuinely different kind of knowledge tool.


