platform-strategy
npx skills add https://github.com/liqiongyu/lenny_skills_plus --skill platform-strategy
Platform Strategy
Scope
Covers
- Internal platforms (paved roads, shared infrastructure/services) treated as products
- External/hybrid platforms (APIs, extensions, partners) and ecosystem strategy
- Platform lifecycle strategy (when to open vs when to close for control/monetization)
- Platform surface-area design (interfaces, abstractions, governance) to reduce cognitive load for product teams
- AI platform defensibility (context repositories + integrated “toolkit” experiences) when relevant
When to use
- “Create a platform strategy for our developer platform / API.”
- “Turn our internal platform into a product with clear users, metrics, and roadmap.”
- “We want to open our platform to third parties: define incentives, governance, and a rollout.”
- “We’re building an AI platform: what’s the defensible system beyond a single feature?”
When NOT to use
- You don’t have a clear problem/job-to-be-done yet (use problem-definition first).
- You primarily need a product/company strategy and portfolio plan (use ai-product-strategy).
- You’re selecting a vendor/tool rather than defining a platform strategy (use evaluating-new-technology).
- You need an implementation design/architecture doc (use writing-specs-designs after this).
Inputs
Minimum required
- Platform type: internal / external / hybrid (and who “the platform owner” is)
- Primary users/consumers (e.g., developers, data scientists, partners) and their top jobs-to-be-done
- Current state: what exists today (surfaces, APIs, services, docs), and what’s broken/painful
- Business intent: why now, desired outcomes (speed, reliability, revenue, ecosystem leverage), time horizon
- Constraints: security/privacy/compliance, SLOs, regions, budgets, resourcing, dependencies
- Decision context: who decides, what decisions are on the table (open/close, pricing, governance), target date
Missing-info strategy
- Ask up to 5 questions from references/INTAKE.md (3–5 at a time).
- If still missing, proceed with explicit assumptions and present 2–3 strategy options (e.g., internal-only vs partner beta vs public API).
- Do not request secrets or credentials. Require explicit confirmation for any production changes or external outreach.
Outputs (deliverables)
Produce a Platform Strategy Pack (in chat; or as files if requested), in this order:
- Platform Product Charter (users, jobs, non-goals, assumptions, outcomes)
- Platform Surface & Interface Map (capabilities, owners, APIs/SDKs, “paved road” defaults, boundaries)
- Lifecycle Stage & Open/Close Strategy (stage diagnosis, stage-appropriate moves, transition risks)
- Moat & Ecosystem Model (compounding loops, incentives, seeding plan, investment gates)
- Governance & Policy Plan (what’s open/closed, SLAs, deprecation, partner rules, pricing/packaging if relevant)
- Metrics & Operating Model (platform-as-product operating cadence, intake, support, adoption + productivity metrics)
- 12-month Roadmap (milestones, bets, sequencing, dependencies)
- Risks / Open questions / Next steps (always included)
Templates: references/TEMPLATES.md
Workflow (8 steps)
1) Define the platform as a product (users + jobs + outcomes)
- Inputs: Platform context, primary user groups, current pain.
- Actions: Write the platform’s “user promise” and top 3–5 jobs-to-be-done. Add 3–5 non-goals. Choose 2–4 outcome metrics (prefer developer productivity metrics like cycle time).
- Outputs: Draft Platform Product Charter.
- Checks: You can describe value without naming internal components (“We reduce X minutes of toil per deploy”).
2) Diagnose the platform lifecycle stage (and what decisions are truly on the table)
- Inputs: Market/organization conditions, competitive context (if external), timeline.
- Actions: Determine the most likely stage (Step 0–3) and list evidence. Clarify the “open vs close” decision(s) you must make now (not someday).
- Outputs: Lifecycle Stage & Open/Close Strategy (draft).
- Checks: The stage is justified with evidence, not aspiration (“we should be a platform”).
3) Map surface area and define boundaries (reduce decision complexity)
- Inputs: Existing services/APIs, teams, dependency graph, common failures.
- Actions: Inventory platform capabilities; define what becomes a paved road vs optional. Specify boundaries: what the platform owns vs what domain teams own. Draft interface contracts (APIs/SDKs/events) and the “default decisions” the platform makes for others.
- Outputs: Platform Surface & Interface Map.
- Checks: A domain team can build without re-deciding foundational choices (auth, logging, deployment, guardrails).
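The surface map from step 3 can be sketched as structured data. This is a hypothetical illustration only; every capability name, owner, and interface below is invented, and real entries would come from your own inventory:

```python
from dataclasses import dataclass, field

@dataclass
class SurfaceEntry:
    """One row of a Platform Surface & Interface Map (all values illustrative)."""
    capability: str                 # e.g. "deployment"
    owner: str                      # owning platform team
    interface: str                  # API/SDK/event contract consumers code against
    paved_road: bool                # True if this is the supported default path
    default_decisions: list = field(default_factory=list)  # choices pre-made for teams

# A hypothetical inventory for an internal ML platform:
surface_map = [
    SurfaceEntry("deployment", "platform-core", "deploy-cli v2", True,
                 ["base image", "rollout strategy", "logging sink"]),
    SurfaceEntry("feature store", "ml-infra", "features-sdk v1", True,
                 ["auth", "PII tagging"]),
    SurfaceEntry("custom Spark clusters", "ml-infra", "raw Terraform", False),
]

# The step-3 check made concrete: every paved road must pre-decide
# foundational choices, so domain teams never re-litigate them.
paved = [e for e in surface_map if e.paved_road]
assert all(e.default_decisions for e in paved)
```

Keeping the map in a reviewable, machine-checkable form makes the "no re-deciding foundational choices" check enforceable rather than aspirational.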
4) Identify the moat and the compounding loop(s)
- Inputs: Unique assets, distribution, data/context advantages, ecosystem participants.
- Actions: Propose 1–3 moat hypotheses and at least one compounding loop (“if this works, it accelerates”). Define incentives for each participant and a small seeding plan. Define “investment gates” (signals that justify more spend).
- Outputs: Moat & Ecosystem Model.
- Checks: The loop has measurable leading indicators (activation, retained developers, successful integrations).
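The step-4 check (a loop with measurable leading indicators) can be operationalized as a simple trend test. The data and threshold below are invented for illustration; a real gate would use your own cohorts and targets:

```python
# Hypothetical weekly cohorts for an external API platform.
partners_activated = [4, 7, 12, 20]   # completed first successful integration
partners_retained = [4, 6, 11, 18]    # still calling the API 4 weeks later

def loop_compounding(activated, retained, min_retention=0.8):
    """A loop is plausibly compounding if activation keeps growing
    and retention holds above a chosen threshold (0.8 is an assumption)."""
    growing = all(b > a for a, b in zip(activated, activated[1:]))
    retention = retained[-1] / activated[-1]
    return growing and retention >= min_retention

assert loop_compounding(partners_activated, partners_retained)
```

The same function doubles as an investment gate: spend more only while it keeps returning True.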
5) Decide what to open, how to govern it, and how to protect the core
- Inputs: Stage, risks, support capacity, security/compliance requirements.
- Actions: Specify what’s open now vs later, plus governance: access control, quotas, review processes, partner rules, SLAs, deprecation/backwards compatibility, and (if relevant) pricing/packaging. Include an “abuse/quality” plan (observability, enforcement).
- Outputs: Governance & Policy Plan.
- Checks: “Open” surfaces have a sustainability plan (support, docs, incident response, versioning).
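A governance record per opened surface keeps step 5 honest. The policy fields, surfaces, and numbers below are purely hypothetical placeholders, not recommended values:

```python
from dataclasses import dataclass

@dataclass
class OpenSurfacePolicy:
    """Governance for one externally opened surface (illustrative fields)."""
    surface: str
    open_from: str               # "now" or a target quarter
    rate_limit_rps: int          # quota protecting the core
    sla_uptime: float            # e.g. 0.995
    deprecation_notice_days: int # backwards-compatibility promise
    support_channel: str         # who answers partner questions
    versioning: str              # e.g. "semver, N-1 supported"

policies = [
    OpenSurfacePolicy("read-only analytics API", "now", 50, 0.995, 180,
                      "partner-support", "semver, N-1 supported"),
    OpenSurfacePolicy("write/ingest API", "later", 10, 0.99, 180,
                      "partner-support", "semver, N-1 supported"),
]

# The step-5 check: nothing is "open" without a sustainability plan attached.
for p in policies:
    assert p.support_channel and p.versioning and p.deprecation_notice_days >= 90
```

Encoding the open/close decision this way forces each surface to carry its own quotas, SLA, and deprecation promise before launch.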
6) (If AI platform) Build defensibility as a system, not a feature
- Inputs: AI use cases, context sources, data sensitivity, required integrations.
- Actions: Design a “Swiss-army toolkit” system: shared context repository + multiple experiences (autocomplete, chat, agent workflows) with consistent policies. Define permissions, audit logs, eval/monitoring, and human-in-the-loop points.
- Outputs: AI section inside Platform Product Charter + Governance & Policy Plan updates.
- Checks: The plan improves outcomes while containing risk (least privilege, auditable access, measurable quality).
7) Define metrics + operating model (platform-as-product)
- Inputs: Target outcomes, resourcing constraints, stakeholder map.
- Actions: Define a metric stack (north-star + input metrics) and an operating cadence (intake, prioritization, roadmap reviews, documentation, support/on-call, feedback loops). Ensure a PM/owner exists for internal platforms.
- Outputs: Metrics & Operating Model.
- Checks: Metrics tie to user outcomes (not vanity counts like “# of services migrated” alone).
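The metric stack from step 7 can be sketched as a north-star metric plus the input metrics that drive it. All metric names and numbers below are invented for illustration:

```python
# North-star: model deployment cycle time (hours), with input metrics feeding it.
# Every series here is fabricated example data (weekly medians).
metric_stack = {
    "north_star": {"deploy_cycle_time_hours": [336, 180, 96, 48]},
    "inputs": {
        "paved_road_adoption_pct": [20, 45, 70, 85],
        "ci_queue_time_min": [40, 25, 15, 10],
        "tickets_per_deploy": [3.0, 1.5, 0.8, 0.4],
    },
}

def trend_ok(series, lower_is_better=True):
    """Input metrics should move in the direction that drives the north star."""
    return series[-1] < series[0] if lower_is_better else series[-1] > series[0]

assert trend_ok(metric_stack["north_star"]["deploy_cycle_time_hours"])
assert trend_ok(metric_stack["inputs"]["paved_road_adoption_pct"],
                lower_is_better=False)
```

Reviewing inputs against the north star each cadence cycle is what keeps the metrics tied to user outcomes rather than vanity counts.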
8) Sequence the roadmap and quality-gate the pack
- Inputs: All draft artifacts.
- Actions: Create a 12-month roadmap with 3 horizons (Now / Next / Later). Add dependencies, resourcing, and rollback/exit paths. Run references/CHECKLISTS.md and score with references/RUBRIC.md. Always include Risks / Open questions / Next steps.
- Outputs: Final Platform Strategy Pack.
- Checks: A stakeholder can make a decision (owner, date, next actions) and understand trade-offs.
Quality gate (required)
- Use references/CHECKLISTS.md and references/RUBRIC.md.
- Always include: Risks, Open questions, Next steps.
Examples
Example 1 (internal platform): “Use platform-strategy to create a platform strategy for an internal ML platform used by 40 engineers. Goal: cut model deployment cycle time from 2 weeks to 2 days. Constraints: PII present; SOC2; 2 platform engineers; 6-month horizon.”
Expected: platform-as-product charter + paved-road interfaces + productivity metrics + governance for AI data.
Example 2 (external ecosystem): “Use platform-strategy to define an API platform strategy for opening our analytics product to partners. We want 20 high-quality integrations in 12 months without breaking core reliability.”
Expected: stage diagnosis + open/close decisions + incentives + governance/versioning + roadmap.
Boundary example: “We should become a platform like Apple: make us a platform strategy.”
Response: out of scope without specific users/jobs and a plausible compounding loop; ask intake questions and/or start with problem-definition.