DMR News

Osaurus Builds Apple-Focused Local AI Platform For Switching Between Multiple LLMs

By Jolyen

May 17, 2026

As competition among AI model providers intensifies, startups are increasingly focusing on the software layer that manages how users interact with large language models. One company entering that space is Osaurus, an open source AI platform built for Apple devices that allows users to switch between different local and cloud-based AI models while keeping files, tools, and memory stored on their own hardware.

The startup emerged from an earlier desktop AI companion project called Dinoki, which Osaurus co-founder Terence Pae described as an “AI-powered Clippy.”

According to Pae, customer feedback on Dinoki pushed the team toward building a more locally controlled AI system.

Dinoki’s customers had asked him why they should buy the app if they still had to pay for tokens, TechCrunch reported, referring to the usage-based pricing systems used by many AI providers.

“That’s how Osaurus started,” Pae said during an interview with TechCrunch.

Pae previously worked as a software engineer at Tesla and Netflix.

Running AI Models Locally On Apple Hardware

Pae said the original goal behind Osaurus was to create an AI assistant capable of operating locally on a Mac while interacting directly with files, browsers, and system settings.

“You can do pretty much everything on your Mac locally,” Pae said. “Browsing your files, accessing your browser, accessing your system configurations.”

He said this made local AI assistants appealing for users who want more direct control over their systems and data.

The project was built publicly as an open source platform, with new features and bug fixes added over time.

Today, Osaurus can connect to both locally hosted AI models and cloud providers including OpenAI and Anthropic. Users can choose which AI models they want to run while keeping other parts of the AI workflow, including memory, files, and tools, stored locally.

The platform is designed to let users switch between models depending on the task being performed, since different AI systems may perform better in different situations.
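As a rough illustration of that kind of task-based switching (the model names and routing rules below are hypothetical, not Osaurus's actual configuration), a harness might map each task type to a preferred model and fall back to a default:

```python
# Hypothetical task-to-model routing table for a local AI harness.
# Model names are illustrative only, not Osaurus's actual defaults.
ROUTES = {
    "code": "qwen3.6-coder",    # a local model tuned for programming
    "chat": "gemma-4",          # a small, fast local model for conversation
    "long-context": "gpt-oss",  # a larger model for big documents
}

def pick_model(task: str, default: str = "llama") -> str:
    """Return the configured model for a task, falling back to a default."""
    return ROUTES.get(task, default)

print(pick_model("code"))         # qwen3.6-coder
print(pick_model("translation"))  # llama (no route configured)
```

The appeal of the harness layer is that this mapping lives in one place, so users can swap models without changing their files, tools, or stored memory.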

A Consumer-Focused AI Harness

Osaurus functions as what is commonly referred to as a “harness,” a software layer that connects AI models, workflows, and tools through a unified interface.

The concept is similar to platforms such as OpenClaw and Hermes, though Pae said many existing tools primarily target developers comfortable using command-line interfaces.

According to the company, Osaurus focuses more heavily on consumer usability while also addressing security concerns.

The platform runs AI processes inside hardware-isolated virtual sandboxes intended to limit how much access AI systems have to the user’s computer and data.

Osaurus currently supports a range of local AI models including MiniMax M2.5, Gemma 4, Qwen3.6, GPT-OSS, Llama, and DeepSeek V4.

The platform also works with Apple’s on-device foundation models and Liquid AI’s LFM model family, as well as other providers and local runtimes including OpenAI, Anthropic, Gemini, xAI/Grok, Venice AI, OpenRouter, Ollama, and LM Studio.

As a Model Context Protocol, or MCP, server, Osaurus can also provide tool access to compatible clients.
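MCP is built on JSON-RPC 2.0: a client asks a server which tools it offers, then invokes them by name. The sketch below shows a simplified version of that exchange's message shapes; the tool names are hypothetical, and the real MCP schema carries more fields (input schemas, capabilities, content types) than shown here.

```python
# Simplified MCP-style JSON-RPC 2.0 exchange: a client lists a server's tools.
# Tool names are illustrative; see the MCP specification for the full schema.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

response = {
    "jsonrpc": "2.0",
    "id": 1,  # responses are matched to requests by id
    "result": {
        "tools": [
            {"name": "filesystem.read", "description": "Read a local file"},
            {"name": "calendar.list_events", "description": "List upcoming events"},
        ]
    },
}

# A client would pick a tool from the list and call it with "tools/call".
for tool in response["result"]["tools"]:
    print(tool["name"])
```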

The platform includes more than 20 built-in plug-ins covering applications and services such as Mail, Calendar, Browser, Music, Git, Filesystem, XLSX, PPTX, Vision, Search, and Fetch.

More recently, the company added voice functionality to the platform.

Local AI Still Faces Hardware Constraints

Running AI models locally remains resource-intensive compared with cloud-based systems.

Pae said users generally need systems with at least 64GB of RAM to run local models effectively. Larger models, including DeepSeek V4, may require systems with roughly 128GB of RAM.
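Those figures follow from simple arithmetic: a model's weights alone need roughly its parameter count times the bytes stored per parameter, before counting the KV cache and system overhead. A minimal sketch of that estimate (the example sizes are illustrative, not the models named above):

```python
def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Rough memory needed for model weights alone, ignoring KV cache
    and runtime overhead. Returns decimal gigabytes."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 70B-parameter model quantized to 4 bits per weight:
print(weight_memory_gb(70, 4))  # 35.0 GB for weights alone
```

Add the KV cache, the operating system, and other applications, and a machine hosting large models comfortably lands in the 64GB-to-128GB range Pae describes.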

Despite those requirements, Pae believes local AI systems are improving rapidly.

“The intelligence per wattage,” a metric he described as important for local AI performance, “has been going up significantly,” he said.

“Last year, local AI could barely finish sentences, but today it can actually run tools, write code, access your browser, and order stuff from Amazon,” Pae added.

Since launching nearly a year ago, Osaurus has been downloaded more than 112,000 times, according to the company’s website.

The platform competes with local AI tools such as Ollama, Msty, and LM Studio while positioning itself as more accessible for non-technical users.

Future Plans Include Enterprise Deployment

Osaurus founders Terence Pae and Sam Yoo are currently participating in the New York-based startup accelerator Alliance.

The company is also considering enterprise use cases where local AI deployment could help address privacy requirements, particularly in sectors such as healthcare and legal services.

Pae said local AI systems could eventually reduce dependence on large-scale AI data centers by allowing organizations to run models on local hardware such as Apple Mac Studio systems.

“Instead of relying on the cloud, they can actually deploy a Mac Studio on-prem,” Pae said.

“You still have the capabilities of the cloud, but you will not be dependent on a data center to be able to run that AI,” he added.


Featured image credits: Magnific.com
