
Osaurus: Open-Source macOS AI App for Running Local LLMs Offline


File Info

Name: Osaurus
Version: v0.18.19
Type: Local AI Harness / Offline AI Agent Platform
Developer: Osaurus, Inc.
Size: 40 MB
License: MIT License (Open Source)
Platforms: macOS (Apple Silicon)
File Format: .dmg
AI Providers: OpenAI • Anthropic • Ollama • Gemini • xAI • OpenRouter • LM Studio
GitHub Repository: github/osaurus
Official Site: https://osaurus.ai/

Description

Osaurus is a macOS-native AI harness designed around a simple idea: "Your AI should belong to you."

Instead of locking users into a single AI provider or cloud platform, Osaurus acts as a local control layer that connects your AI models, tools, memory, and workflows. You can switch between local models running directly on Apple Silicon and cloud providers like OpenAI and Anthropic whenever you need extra power.
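In practice, switching providers in an OpenAI-compatible setup often comes down to changing a model string in the same request shape. A minimal sketch, assuming Osaurus exposes an OpenAI-compatible local endpoint (the URL and model names below are illustrative assumptions, not confirmed defaults):

```python
import json

# Assumed local endpoint -- check the Osaurus docs for the real address.
LOCAL_ENDPOINT = "http://127.0.0.1:1337/v1/chat/completions"

def chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Switching between a local and a cloud model is just a different model id:
local_req = chat_payload("qwen2.5-7b-instruct", "Summarize my notes.")
cloud_req = chat_payload("gpt-4o", "Summarize my notes.")

print(json.dumps(local_req, indent=2))
```

The same payload can then be POSTed to the local server or forwarded to a cloud provider, which is what makes a control layer like this practical.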

Written entirely in Swift for Apple Silicon, the app feels like a native macOS utility rather than a cross-platform wrapper. It focuses heavily on long-term context and ownership: agents can remember conversations, manage tools, access files, execute tasks, and even run inside isolated Linux sandboxes, keeping your data on your own machine whenever possible.

Use Cases

  • Local AI assistant on macOS
  • Offline AI workflows with Apple Silicon
  • AI coding assistant with sandboxed execution
  • Long-term memory AI agents
  • Personal knowledge and file assistant
  • MCP server for AI tools and workflows
  • Voice-enabled local AI setup
  • Multi-model AI management


Features of Osaurus

Fully Offline Support: Run local AI models without internet access
Native Swift App: Built entirely in Swift for Apple Silicon
Multi-Model Support: Switch between local and cloud AI providers
AI Agents: Persistent agents with their own memory and tools
Long-Term Memory: Identity, episodic memory, and pinned knowledge layers
Sandbox VM: Run code safely inside isolated Linux containers
MCP Server + Client: Works with Model Context Protocol tools
Plugin System: Extend functionality with native plugins
Voice Features: On-device transcription and voice workflows
Local Model Runtime: MLX-optimized inference for Apple Silicon
Identity System: Cryptographic identity and access keys
Relay System: Expose agents securely over the internet
Automation: Scheduled tasks and workflow watchers
Developer Tools: Monitoring, debugging, and plugin tooling
Open Source: MIT-licensed project

Supported AI Providers

Local Models: Gemma • Qwen • GPT-OSS • Llama • DeepSeek • Apple Foundation Models • Liquid AI LFM
Cloud Providers: OpenAI • Anthropic • Gemini • xAI / Grok • Venice AI • OpenRouter • Ollama • LM Studio

Native Plugins Included

Mail: Email access and automation
Calendar: Scheduling and calendar workflows
Browser: Web browsing and automation
Git: Repository and version control actions
Filesystem: Local file access and management
Search: Local and web search capabilities
Fetch: Data retrieval and API requests
Vision: Image and visual understanding tools
Music: Music-related integrations and controls
XLSX: Spreadsheet handling
PPTX: PowerPoint document support
macOS Automation Tools: Native macOS workflow automation

System Requirements

Operating System: macOS 15.5+
Processor: Apple Silicon
RAM: 16 GB minimum
Recommended RAM: 64 GB+ for larger local models
Optional: macOS Tahoe for sandbox VM support
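The RAM guidance follows from simple arithmetic: a model's weights have to fit in unified memory alongside the OS and the KV cache. A rough back-of-envelope sketch (the 4-bit quantization figure is an assumption typical of MLX-quantized models, not an Osaurus-specific number):

```python
def model_ram_gb(params_billion: float, bits_per_weight: int = 4) -> float:
    """Weight-only memory estimate in decimal GB.

    Real usage is higher: add KV cache, activations, and runtime overhead.
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 7B model at 4-bit needs about 3.5 GB for weights alone, comfortably
# inside the 16 GB minimum; a 70B model needs ~35 GB, which is why 64 GB+
# is recommended for larger local models.
print(round(model_ram_gb(7), 1))
print(round(model_ram_gb(70), 1))
```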

Installation

  1. Download the .dmg file
  2. Open the installer
  3. Drag Osaurus into the Applications folder
  4. Launch the app from Spotlight
  5. Configure local or cloud AI providers
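After setup, one quick sanity check is to ask the app which models are available. Assuming Osaurus exposes an OpenAI-compatible `/v1/models` route (an assumption; the endpoint address and the sample response below are illustrative, and the response is hard-coded so the sketch runs without a live server):

```python
import json
from urllib.request import urlopen  # for querying a live server

MODELS_URL = "http://127.0.0.1:1337/v1/models"  # assumed default address

def installed_model_ids(response_json: str) -> list[str]:
    """Extract model ids from an OpenAI-style /v1/models response."""
    return [m["id"] for m in json.loads(response_json)["data"]]

# Hard-coded sample response; against a real server you would instead do:
#   body = urlopen(MODELS_URL).read().decode()
sample = '{"data": [{"id": "qwen2.5-7b-instruct"}, {"id": "gemma-3-4b"}]}'
print(installed_model_ids(sample))
```

If the list comes back non-empty, the local runtime is installed and serving models.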

Download Osaurus: a native macOS harness for AI agents

More Than Just a Local LLM App

Osaurus is less a chat app and more a personal AI operating layer for macOS.

Install it once, connect the models and tools you want, and over time it grows into a context-aware assistant that remembers tasks, accesses workflows, automates actions, and runs securely on your own hardware.

For users interested in privacy-first AI, local agents, Apple Silicon inference, or building a long-term AI workspace on macOS, Osaurus is one of the more ambitious open-source projects currently emerging in the local AI space.
