Get Ready for the Apps SDK
Hundreds of millions of people now open a conversational interface every day—to plan trips, learn new skills, compare products, or simply get something done. That shift in daily behavior has quietly rewritten user expectations: answers should arrive inline, actions should complete without context switches, and an "app" should feel like help, not a detour.
OpenAI's new Apps SDK, built on top of the Model Context Protocol (MCP), formalizes this new reality. It lets your capability appear directly inside a conversation—the moment intent is expressed. Your UI can render in-thread, call your systems, return structured data or results, and then disappear until needed again. Websites and mobile apps don't vanish—they become structured data layers, identity providers, and policy engines that feed these conversational surfaces.
The value unit of software has changed. It's no longer a "destination" you visit; it's an intent you resolve. One chat may now compose multiple brands and services into a single outcome. ChatGPT is the first large-scale implementation, but the pattern will spread fast—other assistants will standardize the same in-thread app model, turning intent-native experiences into a cross-platform baseline.
This guide is your map to that landscape. You'll see how discovery and ranking work inside ChatGPT, what to build first (and why it sticks), the MCP building blocks you'll actually ship, design rules for inline UX, the KPIs that now define success, and the traits of teams that consistently get picked. If intent is the new homepage, this is how your brand shows up—and wins—at the moment of need.
Chapter 1 – The Conceptual Shift: From Destinations to Moments
For twenty years, digital strategy meant building places for users to go—websites, mobile apps, and dashboards. Every task began with a detour: open an app, sign in, search, tap through menus, complete the job, exit. It worked when attention was abundant and distribution predictable. Today, attention is fractured, and users expect everything to meet them in context.
Conversational interfaces changed that equation. Users now start with language—"Book a flight to Dubai," "Generate a logo," "Summarize this PDF." Instead of sending them away to a destination, the assistant can perform the task by orchestrating micro-capabilities behind the scenes. The request becomes the router.
This is why traditional growth levers—SEO, App Store ranking, notification funnels—are losing power. The next era favors systems that can respond precisely to user intent in real time. Discovery happens by relevance, not by search placement; retention happens by reliability, not by habit loops. In this model, the AI layer becomes the new operating system of attention.
Think of it as the difference between visiting a restaurant and having a chef who appears the moment you're hungry. The surface stays conversational, but the work behind it becomes modular, composable, and data-driven. Each capability exists to resolve a single verb—book, design, price, explain, calculate—and then hands control back to the user or to another module in the chain.
Research supports this pivot. The global conversational-AI market is projected to exceed $30 billion by 2029, with more than 900 million daily users engaging chat assistants across platforms. That's not hype—it's gravity. Users have already chosen the conversational interface as their default starting point.
For builders, this means success will no longer be measured by pageviews or downloads, but by how often and how confidently the model selects your capability to fulfill an intent. Reliability, clarity of contract, and speed of resolution become your new growth metrics.
Chapter 2 – Infrastructure Behind the Shift: MCP + Apps SDK
The Apps SDK is not just a new feature—it's the architectural hinge between the web and a fully conversational internet. It's powered by the Model Context Protocol (MCP), an open standard that defines how language models talk to tools, data, and interfaces. Together they turn what used to be API integrations into full, conversational capabilities.
MCP acts as the connective tissue. Every server that implements it can advertise tools (functions defined with JSON Schema), respond to `tools/call` requests, and optionally render a live UI inside the chat. Transport is flexible—Server-Sent Events or Streamable HTTP—so the same app works across ChatGPT web and mobile. The model itself orchestrates everything: invoking, parsing, and deciding when to surface you.
A tool declaration is just JSON Schema:

```json
{
  "name": "price_checker",
  "description": "Return live product pricing",
  "input_schema": {
    "type": "object",
    "properties": { "sku": { "type": "string" } },
    "required": ["sku"]
  }
}
```
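When the model decides to use that tool, it sends a plain JSON-RPC message over the transport. A sketch of what the `tools/call` invocation might carry (the `id` and SKU value are invented for illustration):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "price_checker",
    "arguments": { "sku": "SKU-12345" }
  }
}
```

Your server's only job is to validate the arguments against the declared schema and answer with a result the model can parse.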
On top of MCP sits the Apps SDK—OpenAI's official toolkit that simplifies server registration, authentication, and UI delivery. It gives developers a consistent way to:
- Register tools and expose them to the model with metadata that informs discovery and ranking.
- Render inline UIs (cards, carousels, full-screen flows) using the `text/html+skybridge` MIME type.
- Handle user authentication with built-in OAuth 2.1 support.
- Define latency budgets, caching hints, and localization through `_meta` properties.
When you deploy an MCP server through the SDK, ChatGPT can invoke it just as easily as it calls an internal OpenAI tool. The boundary between "OpenAI-built" and "third-party" dissolves. Your app becomes part of the model's native vocabulary—the assistant can reference it, chain it, or call it mid-conversation without breaking flow.
This is why early builders matter. The SDK's discovery and ranking system learns from usage patterns. Apps that deliver low-latency, high-completion results quickly become the model's preferred choices for that domain. The more your tool resolves intents cleanly, the more often it will be automatically suggested or invoked.
The protocol also makes experiences portable. MCP is open—other assistants can adopt it, meaning your same backend can power multiple conversational surfaces. Build once, and your service could appear across ChatGPT, enterprise copilots, and future multimodal agents.
Chapter 3 – Strategic Implications for Brands & Builders
The consequence of this infrastructure shift is strategic, not just technical. Every brand that relies on digital interaction must now decide how it will surface when the user no longer visits a site or opens an app.
In the old world, discovery meant capturing attention—SEO, social, ad funnels, app-store rankings. In the new one, discovery happens through relevance and reliability. The model decides which tool to call based on observed outcomes, latency, and clarity of schema. The more deterministic and accurate your responses, the higher your selection probability.
This transforms the business stack:
- Marketing → Metadata Engineering: success depends on how well your app describes itself to the model.
- UX → Intent Design: users don't browse; they declare. Each intent must map cleanly to a resolvable job.
- Support → Conversation Feedback Loops: every resolved task teaches the model when to choose you again.
Waiting on the sidelines is expensive. Early adopters are already shaping the ranking algorithms through usage signals—latency, completion, and satisfaction markers. Like early SEO pioneers, they'll own durable real estate in the model's decision graph.
For builders, this means reframing success metrics. You no longer measure clicks, sessions, or DAUs; you measure resolved outcomes. Did your capability finish the user's job? Did it do so quickly, clearly, and securely? Those are now the levers that drive organic discovery.
The companies that adapt fastest will rebuild their product roadmaps around intents rather than features. A "feature" is something users hunt for; an "intent" is something they simply express. The winners design capabilities that fit seamlessly into that sentence and deliver instant clarity.
This is the essence of the distribution reset. The web rewarded visibility; conversational ecosystems reward utility. Your growth loop becomes self-reinforcing: better resolutions → more model trust → higher invocation → more data → even better performance.
Chapter 4 – What to Build & Why It Works
The best early Apps are not mini websites—they are micro-capabilities that resolve a single, valuable intent cleanly inside a conversation. You win not by breadth, but by precision: the model keeps calling the tools that consistently complete the job fastest.
If a task already lives on the web, you can probably move it into ChatGPT. Think of your service as a function of intent:
| Category | Typical Intent | Conversation Outcome |
|---|---|---|
| Product Discovery | "Show me running shoes under $150." | Inline cards with filtered SKUs and links. |
| Planning & Decision | "Help me plan a 3-day Tokyo itinerary." | Carousel of suggested plans + booking CTAs. |
| Computation & Tools | "Calculate my monthly payment." | Interactive calculator widget with results summary. |
| Support & Education | "Explain recursion with a quick demo." | Animated teaching widget with follow-up Q&A. |
These patterns share a principle: resolution in-flow. The user never leaves the chat, yet completes the job. The system measures and rewards that frictionless outcome.
Over time, multiple brands will chain together: a budgeting app calls your mortgage calculator, which calls an insurance quote tool—all orchestrated by the model. The connective format that makes this possible is the `structuredContent` payload your app returns.
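To make that concrete, here is a hedged sketch of the three-part response a mortgage-calculator tool might return (all field names beyond `structuredContent`, `content`, and `_meta`, and all values, are illustrative):

```json
{
  "structuredContent": {
    "principal": 400000,
    "annualRate": 5.9,
    "termYears": 30,
    "monthlyPayment": 2372.50
  },
  "content": [
    { "type": "text", "text": "Your estimated monthly payment is $2,372.50." }
  ],
  "_meta": { "calculatedAt": "2025-01-01T00:00:00Z" }
}
```

The model narrates from `content`, downstream tools consume `structuredContent`, and `_meta` carries the bookkeeping neither needs to see.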
Chapter 5 – Engineering & Design Playbook
Building an App for ChatGPT means building an MCP server that declares your capabilities and optionally ships a small UI bundle. You don't need a new tech stack—just a disciplined structure:
- Describe your tools with clear JSON Schema.
- Expose them via a public `/mcp` endpoint.
- Attach an HTML template rendered with `text/html+skybridge`.
- Return three fields in every response: `structuredContent`, `content`, and `_meta`.
```typescript
import express from "express";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/streamableHttp.js";
import { z } from "zod";

const server = new McpServer({ name: "price-checker", version: "1.0.0" });

// Define a simple tool
server.registerTool(
  "check-price",
  {
    title: "Check Product Price",
    inputSchema: { sku: z.string() },
    _meta: { "openai/outputTemplate": "https://api.example.com/templates/price-card" }
  },
  async ({ sku }) => {
    const price = await fetch(`https://api.example.com/prices/${sku}`).then(r => r.json());
    return {
      structuredContent: { sku, price: price.amount, currency: price.currency },
      content: [{ type: "text", text: `The current price is ${price.amount} ${price.currency}.` }],
      _meta: { source: "example-api", checkedAt: new Date().toISOString() }
    };
  }
);

// Expose the server over Streamable HTTP at /mcp (minimal stateless wiring;
// see the MCP SDK docs for session management and SSE variants)
const app = express();
app.use(express.json());
app.post("/mcp", async (req, res) => {
  const transport = new StreamableHTTPServerTransport({ sessionIdGenerator: undefined });
  await server.connect(transport);
  await transport.handleRequest(req, res, req.body);
});
app.listen(8080);
```
This snippet shows the full loop: the model calls `check-price` with a SKU, your server fetches the data, and the response carries both human- and machine-readable outputs. ChatGPT then decides whether to render a card, show text, or compose the result with another tool.
Designing for Conversation
Your UI is not a standalone app—it's a fragment of dialogue. Keep interfaces single-purpose, visually quiet, and responsive to chat context. Use system fonts and platform colors, limit interactive depth to one or two steps, and let ChatGPT handle narration around your component.
- Inline cards — confirmations, summaries, and quick pickers.
- Carousels — comparisons or small collections (3–8 items).
- Fullscreen — complex flows like configuration or checkout.
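For the inline-card case, the template can be very small. As a sketch only—`window.openai.toolOutput` is the bridge the Apps SDK documents for reading the tool's structured output, but the markup and field names here are invented:

```html
<div id="price-card"></div>
<script type="module">
  // Assumption: the Apps SDK exposes the tool's structuredContent
  // to the template via the window.openai bridge.
  const out = window.openai?.toolOutput ?? {};
  document.getElementById("price-card").textContent =
    `${out.sku ?? "?"}: ${out.price ?? "?"} ${out.currency ?? ""}`;
</script>
```

One container, one script, no framework: the assistant supplies the surrounding narration, so the component only needs to show the data.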
Instrument everything. Log latency per invocation, hydration time, and completion rate. Treat these as product metrics, not technical afterthoughts—they directly influence ranking.
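A minimal sketch of that instrumentation, using nothing beyond plain TypeScript (the wrapper and log shape are hypothetical conventions, not part of the SDK):

```typescript
// Hypothetical wrapper that times a tool handler and records the outcome.
type LogEntry = { tool: string; ms: number; ok: boolean };

function withLatencyLog<I, O>(
  tool: string,
  handler: (input: I) => Promise<O>,
  log: (entry: LogEntry) => void
): (input: I) => Promise<O> {
  return async (input: I) => {
    const start = Date.now();
    try {
      const result = await handler(input);
      log({ tool, ms: Date.now() - start, ok: true });
      return result;
    } catch (err) {
      log({ tool, ms: Date.now() - start, ok: false });
      throw err; // surface the failure to the caller after recording it
    }
  };
}
```

Wrap each registered handler once and ship the entries to whatever analytics store you already use; latency and completion rate then fall out of the same log.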
Security and privacy follow standard web rules: use HTTPS, a strict CSP, and OAuth 2.1. Never leak private identifiers in `structuredContent`; keep them in `_meta`. When you localize, respect the `_meta["openai/locale"]` hint and render dates and currency accordingly.
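For the currency part, the platform's built-in `Intl` API is enough. A small sketch, assuming the locale string arrives via the `_meta["openai/locale"]` hint mentioned above:

```typescript
// Format an amount for the user's locale using the standard Intl API.
function formatPrice(amount: number, currency: string, locale = "en-US"): string {
  return new Intl.NumberFormat(locale, { style: "currency", currency }).format(amount);
}
```

Passing `"de-DE"` instead of the default flips the same amount to German number grouping and symbol placement with no extra code.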
The most elegant conversational interfaces stay minimal. Follow these principles and your app feels like a natural extension of the conversation—fast, focused, and invisible until it's exactly what the user needs.
Chapter 6 – Monetisation Models
Utility without capture is philanthropy. Apps inside ChatGPT can't rely on banner clicks or ad impressions—there are none. The Apps SDK is a distribution layer, not a checkout flow. Monetisation therefore hinges on connecting in-thread value to your external revenue systems.
The core question becomes: Who owns the customer? OpenAI owns the conversation; you own the relationship. The winning pattern treats the assistant as your most powerful channel partner— you deliver resolution; it delivers reach.
Emerging Commercial Models
- SaaS Entitlement Play — Authenticate through OAuth 2.1, detect plan tier, and unlock premium features inline. Paying users experience full capability; free users see a guided teaser that converts naturally.
- High-Intent Lead Funnel — Ideal for consultative sectors (finance, real estate, B2B). Your app qualifies leads via calculators or diagnostics, then ends with one CTA: "Book a 15-minute consultation." Every invocation is a pre-qualified prospect.
- Transactional & Affiliate Model — Retail, travel, and marketplaces embed configuration, comparison, and pre-checkout flows in-chat. Final payment can redirect to your site with pre-filled carts and tracking parameters. The assistant becomes your conversion pre-processor.
- Brand & Awareness Utility — Some Apps act purely as brand anchors—free, frictionless, and ubiquitous. They build trust, gather preference data, and secure long-term default status ("Check the weather → calls your app").
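The SaaS entitlement pattern above reduces to a small gate once OAuth has told you the user's plan. A hedged sketch, where the tier names and limits are invented for illustration:

```typescript
// Map an OAuth-derived plan tier to the features the inline UI may unlock.
type Tier = "free" | "pro";

interface Entitlements {
  maxResults: number;      // how many items the inline carousel may show
  premiumWidgets: boolean; // whether full interactive flows are available
}

function entitlementsFor(tier: Tier): Entitlements {
  return tier === "pro"
    ? { maxResults: 25, premiumWidgets: true }
    : { maxResults: 3, premiumWidgets: false };
}
```

The free path still resolves the intent; it just caps depth, which is exactly the "guided teaser" that converts.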
Over time, OpenAI and others will formalise revenue APIs, but early builders shouldn't wait. The current advantage lies in habit formation: become the model's default resolver now, monetise through your existing channels later.
Chapter 7 – Where You'll Win First
Certain industries already think conversationally—they'll convert first because the interface matches their workflow. Anywhere users compare, configure, decide, or request in natural language is fertile ground.
| Sector | Example Intent | Inline Outcome |
|---|---|---|
| Travel & Hospitality | "Find flights to Dubai next Thursday." | Interactive flight cards with booking links. |
| Education & Training | "Teach me basic SQL with practice examples." | Adaptive lesson widget with live quizzes. |
| Finance & Insurance | "Estimate my mortgage payment." | Calculator + CTA to book advisor call. |
| Retail & E-Commerce | "Compare noise-cancelling headphones." | Carousel of products + direct purchase options. |
| Healthcare | "Schedule a follow-up with my doctor." | Secure scheduling + triage guidance. |
| Entertainment & Sports | "Show me tonight's NBA stats." | Live scoreboard + ticketing widget. |
| Home Improvement | "Plan a kitchen renovation budget." | Step-by-step planner with cost estimates. |
These categories share three properties:
- Structured Data — clear inputs/outputs make schemas easy.
- Conversational Tasks — users already express them verbally.
- High Intent — every invocation maps to monetisable action.
Early entrants in these sectors will define their industry schemas—the formats every competitor must match. Once those shapes solidify, the model will prefer known structures, giving schema authors a compounding advantage similar to early search-index dominance.
Chapter 8 – Team Traits & Future Orchestration
The teams that consistently win in this new ecosystem don't treat Apps as marketing stunts or integrations. They treat them as core product interfaces—living systems that evolve by observing, resolving, and learning from real user intent.
Traits of Teams That Win
- Utility Over Messaging: They lead with usefulness. The pitch is embedded in performance.
- Adaptive Experiences: Their tools learn from each invocation—refining schema, copy, and UX by data, not opinion.
- Lean Execution: They ship thin, modular capabilities fast. Perfection takes a back seat to iteration velocity.
- Interoperable Design: They structure data so other tools—and the model—can chain their outputs without friction.
- Obsessive Measurement: They instrument every call, from invocation latency to task completion, treating data as direction.
These teams collapse the traditional gap between engineering, design, and strategy. Conversation design is product design. Schema is UX. Latency is brand perception. The companies that grasp this reality early are the ones whose apps the model will repeatedly call.
The Next Step: Orchestration
Today, each App acts independently. Tomorrow, multiple capabilities—across brands and domains—will cooperate in a single conversation. This is the birth of the orchestrated web: where the assistant conducts a network of services to deliver complete outcomes. One chat might involve five vendors seamlessly chained: data retrieval, analysis, booking, payment, and follow-up.
MCP was designed with this future in mind. It standardizes contracts between capabilities so composition happens naturally. A travel planner app could invoke your pricing tool; your pricing tool could hand its structured output to a booking engine—all without user friction or custom integrations.
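The contract idea is simple to express in types: composition works when one tool's structured output satisfies another tool's input schema. A sketch with invented names:

```typescript
// One tool's structured output...
interface PriceQuote {
  sku: string;
  price: number;
  currency: string;
}

// ...deliberately reused as part of the next tool's input contract.
interface BookingRequest {
  quote: PriceQuote;
  customerEmail: string;
}

function toBookingRequest(quote: PriceQuote, customerEmail: string): BookingRequest {
  return { quote, customerEmail };
}
```

When the shapes line up like this, the model can hand one tool's `structuredContent` to the next without any bespoke glue.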
The long-term opportunity is enormous. When orchestration becomes the norm, brand equity will correlate with invocation reliability. The best app isn't the prettiest—it's the one the model calls first, because it never fails to deliver.
Conclusion – The Bottom Line
Apps inside ChatGPT aren't a novelty—they're the next distribution layer of software. The center of gravity has shifted from destinations to intents. The winners will be the teams who turn a single, high-value customer job into a fast, trustworthy capability that the model keeps choosing.
Treat this as product work, not marketing work. Build for intent, not for eyeballs. Measure resolution, not reach. The companies that internalize those principles now will own the next decade of discovery.
The playbook is clear:
- Pick one sharp intent you can dominate.
- Design a precise contract between input, schema, and result.
- Return structured data + UI in one clean response.
- Instrument everything from selection to resolution.
- Iterate relentlessly until invocation becomes habitual.
Every resolved task strengthens your position in the model's ranking graph. Every fast response earns another call. Over time, you don't just serve users—you become part of the conversation itself.
The market is wide open. Build with precision, respect latency, and let utility lead. You'll earn a permanent slot in the most valuable real estate in software—right inside the conversation.