A workflow that lives in your account is software you can use. A workflow that lives in a portable file is software you can ship. The .0n file format is the second kind. Here's the spec, the comparison to existing formats, and why portability is the unlock.
The Format in One Block
```json
{
  "name": "LinkedIn Lead Qualifier",
  "version": "1.0.0",
  "vendor": { "id": "...", "name": "Mike Mento", "verified": true },
  "category": "lead-generation",
  "required_services": ["linkedin", "crm"],
  "required_plan": "starter",
  "trigger": { "type": "scheduled", "schedule": "0 */6 * * *" },
  "config_schema": {
    "icp_keywords": { "type": "string[]", "label": "Target ICP keywords", "required": true },
    "crm_pipeline_id": { "type": "string", "label": "CRM pipeline for qualified leads" }
  },
  "steps": [
    {
      "id": "fetch_recent_leads",
      "type": "crm_action",
      "action": "list_contacts",
      "filter": { "tags": ["new"], "limit": 50 }
    },
    {
      "id": "score_each",
      "type": "ai_execute",
      "model": "groq:llama-3.3-70b-versatile",
      "prompt": "Score this LinkedIn profile against ICP: {{config.icp_keywords}}\n\nProfile: {{step.fetch_recent_leads.contacts[]}}\n\nReturn JSON: {score: 0-100, reasoning: string}",
      "iterate_over": "step.fetch_recent_leads.contacts"
    },
    {
      "id": "tag_qualified",
      "type": "crm_action",
      "action": "add_tag",
      "filter": "step.score_each[] WHERE score > 70",
      "tag": "qualified-{{trigger.timestamp.date}}"
    }
  ]
}
```
That's a complete app. Triggers, steps, AI calls, CRM writes — in one file you can drop into any 0nCore-compatible runtime.
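What an installer has to do with that file is mostly structural checking. A minimal sketch in Python, using the field names from the example above — the required-field set and error messages are my assumptions, not the published schema:

```python
import json

# Assumed minimum manifest fields, inferred from the example above;
# the authoritative schema is the one published at 0nmcp.com/0n-standard.
REQUIRED_FIELDS = {"name", "version", "trigger", "steps"}

def load_0n(text: str) -> dict:
    """Parse a .0n file and run basic structural checks before install."""
    manifest = json.loads(text)
    missing = REQUIRED_FIELDS - manifest.keys()
    if missing:
        raise ValueError(f"manifest missing fields: {sorted(missing)}")
    # Step outputs are addressed as step.<id>.*, so ids must be unique.
    ids = [step["id"] for step in manifest["steps"]]
    if len(ids) != len(set(ids)):
        raise ValueError("duplicate step ids")
    return manifest
```

Because the format is plain JSON, this check runs identically in any runtime that chooses to consume .0n files.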
The 9 Step Types
| Type | What it does |
|---|---|
| `ai_execute` | Call any LLM (Groq, Anthropic, OpenAI) with a prompt + structured output |
| `crm_action` | Read/write to leadconnectorhq.com (245 operations supported) |
| `condition` | If/else branching with template-variable evaluation |
| `webhook` | POST to any URL with auth + payload |
| `email` | Send via Resend (template + variables) |
| `slack` | Post to channel with blocks/attachments |
| `wordpress` | 43 WP REST operations (create post, update SEO, etc.) |
| `delay` | Wait N seconds/minutes/hours/days before next step |
| `log` | Write to execution log (debugging + audit) |
Triggers come in three types: `scheduled` (cron expression), `webhook` (fire-and-forget), or `manual` (vendor delivers).
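With a fixed, small set of step types, an executor reduces to a dispatch table keyed on `type`. A sketch of that shape — the handlers and the `seconds`/`message` field names are illustrative stubs, not the reference executor:

```python
from typing import Callable

# Map each .0n step type to a handler. A real runtime would register
# all nine types; two stubs are enough to show the shape.
HANDLERS: dict[str, Callable[[dict, dict], dict]] = {}

def handler(step_type: str):
    def register(fn):
        HANDLERS[step_type] = fn
        return fn
    return register

@handler("log")
def run_log(step: dict, ctx: dict) -> dict:
    ctx.setdefault("log", []).append(step.get("message", ""))
    return {"logged": True}

@handler("delay")
def run_delay(step: dict, ctx: dict) -> dict:
    # A real runtime would sleep or reschedule; here we only record intent.
    return {"waited": step.get("seconds", 0)}

def execute(manifest: dict, ctx: dict) -> dict:
    """Run steps in order, storing each output under step.<id> for templates."""
    for step in manifest["steps"]:
        fn = HANDLERS.get(step["type"])
        if fn is None:
            raise ValueError(f"unknown step type: {step['type']}")
        ctx.setdefault("step", {})[step["id"]] = fn(step, ctx)
    return ctx
```

Storing each step's output under its id is what makes later steps able to reference `step.fetch_recent_leads.contacts` and the like.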
Template Variables
Every step can reference:
- `{{trigger.*}}` — fields from the trigger event
- `{{config.*}}` — user-configured options
- `{{step.<id>.*}}` — output from any prior step
- `{{user.*}}` — user account info (id, email, plan)
- `{{env.*}}` — declared environment vars
The template engine does nothing more than deep-path lookups and array indexing. No code execution, no sandboxing concerns.
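Because the spec is only lookups and indexing, a resolver fits in a few lines. This is my own sketch of the idea, not the reference engine, and it skips extras like the `[]` iteration syntax:

```python
import re

VAR = re.compile(r"\{\{([^}]+)\}\}")

def lookup(path: str, scope: dict):
    """Resolve a dotted path like 'step.fetch_recent_leads.contacts.0'."""
    value = scope
    for part in path.strip().split("."):
        if isinstance(value, list):
            value = value[int(part)]   # array indexing
        else:
            value = value[part]        # deep-path lookup
    return value

def render(template: str, scope: dict) -> str:
    """Substitute every {{path}} in the template. No eval, ever."""
    return VAR.sub(lambda m: str(lookup(m.group(1), scope)), template)
```

A lookup either resolves against the context dict or raises; there is no expression language to escape from, which is the whole point of the "no sandboxing concerns" claim.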
Required Services
The required_services array tells the installer what the user must have connected. If the user doesn't have LinkedIn + CRM connected, install fails with a guided "connect these services first" flow.
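The check itself is a set comparison. A hypothetical sketch (function name and return shape are my own):

```python
def missing_services(manifest: dict, connected: set[str]) -> list[str]:
    """Return the services the user still needs to connect, in manifest order."""
    required = manifest.get("required_services", [])
    return [svc for svc in required if svc not in connected]
```

An empty result means the install proceeds; anything else feeds the guided "connect these services first" flow.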
This is the equivalent of an iPhone app declaring the permissions it needs up front — predictable, declarative, enforceable.
Comparison to Existing Formats
| Format | Strengths | Why .0n is different |
|---|---|---|
| n8n JSON | Visual, large step library | Locked to n8n runtime. No marketplace distribution. |
| Zapier export | Easy to author | Internal format, not portable. No re-import elsewhere. |
| OpenAI Custom GPT JSON | Distributable | Static config, no triggers, no orchestration |
| Make.com blueprint | Comprehensive | Vendor lock, opaque step semantics |
| Cloudflare Workers | Real code, real portability | Requires JavaScript engineering, not declarative |
| .0n | Declarative, portable, marketplace-shippable, AI-native | Smaller step library (intentional) |
Why Portability Matters Now
Three forces converge in 2026:
- AI agents need workflow primitives, not just tools. A tool is a one-shot API call. A workflow is a persistent capability that runs on its own schedule, against the user's own services, with the user's own data.
- Indie AI devs need distribution. Building a workflow takes weeks. Building a landing page + checkout + onboarding takes months. Marketplace distribution flips the ratio.
- Customers need predictability. A workflow you bought from a vendor that disappeared is dead software. A .0n file lives in your account regardless of vendor health.
Portability solves the third problem. Marketplace distribution (built on UCP) solves the second. Standardized format with declarative semantics solves the first.
Open Spec
The full schema is published at 0nmcp.com/0n-standard. Reference parser, installer, and executor live in the onork-app repository.
If you build an AI agent runtime that wants to consume .0n files, the format is open. If you build a workflow you want to sell, the format is the distribution mechanism. The marketplace at 0ncore.com/dashboard/marketplace is the first storefront.