Models
Series Forge 1.0.5 supports on-device generation through the system language model when the device and OS make it available, plus optional online generation with the user’s own provider key. Drafts are reviewed before they are saved into story cards.
The app remains offline-first as a story bible, continuity workspace, and prompt/context organizer. Creative content is not sent to the Series Forge server. If Online Models are enabled in the app, selected prompts and compact story context are sent directly from the device to the provider the user has configured. Gemma 4 and offline image-generation weights are neither bundled nor downloadable until the exact artifact, runtime, license, redistribution terms, checksum, and memory budget are verified.
On-device first
Eligible devices can use the system’s on-device language model. Availability depends on Apple OS and device support and on the local model assets being ready.
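On Apple platforms this kind of check typically goes through the FoundationModels framework, which reports whether the system model is usable right now. A minimal sketch of such a readiness check, assuming the app targets an OS version that ships FoundationModels:

```swift
import FoundationModels

// Report whether the system on-device language model can be used.
// SystemLanguageModel.default is Apple's shared on-device model handle;
// the unavailable reasons cover unsupported hardware, Apple Intelligence
// being disabled, and model assets not yet downloaded.
func onDeviceModelStatus() -> String {
    switch SystemLanguageModel.default.availability {
    case .available:
        return "ready"
    case .unavailable(let reason):
        return "unavailable: \(reason)"
    }
}
```

An app would run a check like this at launch to decide whether to surface on-device generation or fall back to the user’s configured online provider.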
Online models
OpenAI-compatible and Gemini endpoints can be configured on the device. API keys are stored in the Keychain and are never sent to the Series Forge VPS.
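As a sketch of what “stored in the Keychain” means in practice, the provider key can be saved as a device-only generic-password item via the Security framework. The service and account strings below are illustrative placeholders, not Series Forge’s actual identifiers:

```swift
import Foundation
import Security

// Store a provider API key as a generic-password Keychain item.
// Marking it ThisDeviceOnly keeps it out of iCloud Keychain sync.
func saveAPIKey(_ key: String, service: String = "example.provider") -> Bool {
    let base: [CFString: Any] = [
        kSecClass: kSecClassGenericPassword,
        kSecAttrService: service,
        kSecAttrAccount: "api-key",
    ]
    SecItemDelete(base as CFDictionary)  // replace any existing item

    var attrs = base
    attrs[kSecValueData] = Data(key.utf8)
    // Readable only while the device is unlocked, never synced off-device.
    attrs[kSecAttrAccessible] = kSecAttrAccessibleWhenUnlockedThisDeviceOnly
    return SecItemAdd(attrs as CFDictionary, nil) == errSecSuccess
}
```

Requests to the configured endpoint are then made directly from the device, attaching the key from the Keychain at call time rather than persisting it in app preferences or sending it to any intermediate server.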
No bundled weights
No GGUF, Core ML, MLX, LiteRT-LM, safetensors, bin, or compiled model package is shipped in the app bundle.
Local downloads locked
The public manifest returns local model profiles with no download URL, no checksum, and no redistribution permission.
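Concretely, a locked local profile in such a manifest might look like the following hypothetical entry; the field names and values here are illustrative, not the actual manifest schema:

```
{
  "id": "local-model",
  "kind": "local",
  "display_name": "On-device model (locked)",
  "download_url": null,
  "checksum": null,
  "redistribution": "not-permitted",
  "status": "locked"
}
```

With the URL and checksum absent, the client has nothing it can fetch or verify, so local downloads stay disabled until the legal gate below is cleared and the manifest is updated.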
Legal gate
A future local model release must pass license, commercial-use, redistribution, checksum, and notice review before it appears in the app.