OpenAI-compatible API server for Apple device native models

This is a server project that exposes OpenAI-compatible APIs for local models on Apple devices, making it easy for developers to run and call a model service locally through the familiar OpenAI-style interface.

The project gety-ai/apple-on-device-openai (Apple On-Device OpenAI API) is a handy tool. Its core functions are:

Project overview

  • This is a macOS application built with SwiftUI. It runs a local server compatible with the OpenAI API, and the backend calls Apple's own on-device Foundation Models (part of Apple Intelligence).
  • Any application or script that already speaks the OpenAI /v1/chat/completions interface can simply point its requests at this local server to get AI chat features. Data never leaves your device, giving you privacy and low latency.
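Pointing existing OpenAI-style code at the local server mostly comes down to swapping the base URL. Here is a minimal sketch that builds such a request with only the standard library; the port number and the model name `"apple-on-device"` are assumptions for illustration — check the app's UI for the actual address and `/v1/models` for the real model identifier:

```python
import json
import urllib.request

# Placeholder base URL — the real host/port is shown in the app's window.
BASE_URL = "http://127.0.0.1:11535/v1"

def build_chat_request(messages, model="apple-on-device", stream=False):
    """Build an OpenAI-style chat completion request aimed at the local server."""
    payload = {"model": model, "messages": messages, "stream": stream}
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request([{"role": "user", "content": "Hello!"}])
print(req.full_url)
# Sending it is then an ordinary HTTP call:
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the request shape matches OpenAI's, existing clients and SDKs typically only need their base URL overridden to use the local server.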

Why is it valuable?

Privacy protection: all computation happens locally, and no user data is sent to external servers.

Feature highlights

  • 🚀 OpenAI-compatible interface: provides OpenAI-style endpoints such as /v1/chat/completions (with streaming support), plus /v1/models for querying the available models.
  • 🖥️ SwiftUI front end: shipping as a graphical macOS app sidesteps the rate limits Apple applies to command-line tools calling the models.
  • 🔧 Build it yourself or download directly: you can grab a precompiled app from Releases, or clone the source and build it with Xcode.
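The streaming variant of /v1/chat/completions emits Server-Sent Events in the same shape as OpenAI's hosted API. A minimal sketch of assembling the streamed text on the client side — the `sample` lines below are made-up example chunks, not output captured from this server:

```python
import json

def collect_stream(sse_lines):
    """Assemble assistant text from OpenAI-style SSE 'data:' lines."""
    parts = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines
        data = line[len("data: "):]
        if data == "[DONE]":  # end-of-stream sentinel
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)

# Hypothetical sample chunks mimicking the OpenAI streaming format.
sample = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hello"}}]}',
    'data: {"choices": [{"delta": {"content": ", world"}}]}',
    "data: [DONE]",
]
print(collect_stream(sample))  # Hello, world
```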

Usage process (simple schematic)

  1. Make sure your Mac is running the macOS 26 beta and that Apple Intelligence is enabled (it must be turned on in Settings).

Discussions in the community

The developer shared it on forums:

“I wrapped Apple’s new on‑device models in an OpenAI‑compatible API”
The post notes that the project is open source (MIT licensed) and emphasizes that clients can keep using OpenAI's calling conventions while no data leaves the device.

Summary: What did this project do?

This project essentially builds a local bridge: it exposes the native AI models on Apple devices through the standard OpenAI API, letting you chat with Apple's on-device AI without changing existing code. It is especially suitable for developers and systems that value privacy, security, and low latency and that already rely on the OpenAI API.

GitHub: https://github.com/gety-ai/apple-on-device-openai

YouTube:

Reference links
