
# specs

Generate topic-based specifications from domain requirements and architecture diagrams.

## SKILL.md
---
name: specs
description: Generate topic-based specifications from domain requirements and architecture diagrams.
disable-model-invocation: true
argument-hint: <feature-name>
---

# Vista Specs

Generate detailed topic-based specifications from domain requirements and architecture diagrams. Specs become the authoritative implementation reference consumed by Ralph loops.

## Invocation

```
/vista:specs <feature-name>
```

## Prerequisites

- The feature must exist at `.vista/features/<name>/`
- `domain-requirements.md` must be filled in (run `/vista:plan` first)
- Architecture diagrams should exist in `arch/` (generated by `/vista:plan`)
- TDD diagrams are optional but recommended for heavy-logic components (run `/vista:tdd` first)
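These prerequisites can also be checked mechanically before starting. A minimal pre-flight sketch, assuming the directory layout described in this skill; the function name `check_prerequisites` and the message wording are illustrative:

```python
from pathlib import Path

def check_prerequisites(feature: str, root: str = ".vista/features") -> list:
    """Return a list of issues; an empty list means all prerequisites hold."""
    base = Path(root) / feature
    issues = []
    if not base.is_dir():
        issues.append(f"missing feature directory: {base}")
        return issues
    req = base / "domain-requirements.md"
    if not req.is_file() or not req.read_text().strip():
        issues.append("domain-requirements.md missing or empty (run /vista:plan)")
    if not list((base / "arch").glob("*.mmd")):
        issues.append("no architecture diagrams in arch/ (run /vista:plan)")
    if not list((base / "arch").glob("tdd-*.mmd")):
        # TDD diagrams are optional, so this is a note rather than a failure
        issues.append("note: no TDD diagrams (optional; run /vista:tdd)")
    return issues
```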

## Workflow

### Step 1: Load All Planning Artifacts

1. Read `.vista/features/<name>/domain-requirements.md` — the primary requirements source
2. Read the `arch/_arch.json` manifest
3. Read ALL `arch/*.mmd` files — both architecture and TDD diagrams
4. Parse the Mermaid content to extract: entities, components, flows, states, relationships
5. Check for `## TDD Candidates` and `## Test Expectations` sections in `domain-requirements.md`
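Step 4 can start from something very simple. The sketch below pulls node ids and labels out of common Mermaid flowchart syntax with a regex; it is a stand-in for a real Mermaid parser and only handles the `id[Label]` / `id(Label)` node forms:

```python
import re

# Matches `id[Label]` and `id(Label)` nodes in Mermaid flowcharts.
NODE_RE = re.compile(r'(\w+)\s*[\[\(]([^\]\)]+)[\]\)]')

def extract_nodes(mermaid: str) -> dict:
    """Map node id -> label for simple flowchart syntax."""
    return {m.group(1): m.group(2) for m in NODE_RE.finditer(mermaid)}
```

Cylinder, subgraph, and other shapes would need additional patterns; this is only the common case.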

Present a summary to the user:

```
I loaded the planning artifacts for <feature-name>:

- Domain requirements: [X sections covering Y topics]
- Architecture diagrams: [N diagrams]
  - system-architecture.mmd, user-flow.mmd, data-model.mmd, ...
- TDD diagrams: [M diagrams] (covering components: A, B, C)
- TDD candidates flagged: [list]

Ready to decompose into topic specifications.
```

### Step 2: Decompose into Topics

Break the domain requirements into focused, independent topics.

Reference: `vista/skills/specs/references/spec-template.md`

#### 2a. Identify topics

From the domain requirements and architecture diagrams, identify distinct concerns:

- Each major component or service from the system architecture
- Each data entity group from the data model
- Each user workflow or journey
- Each integration point with external systems

#### 2b. Validate scope

Apply the "one sentence without and" test to each proposed topic:

- "Image upload handles file selection and validates uploaded images" — PASS (related capabilities)
- "User system handles authentication, profiles, and billing" — FAIL (3 separate topics)

#### 2c. Confirm with user

Present the proposed topic list:

```
I propose breaking this feature into [N] specs:

1. **auth-flow** — User authentication and session management
2. **data-sync** — Offline-first data synchronization
3. **payment-processing** — Payment calculation and checkout
4. **notification-system** — User notifications and alerts

Does this decomposition look right? Should I split, merge, or rename any topics?
```

Use `AskUserQuestion` to get confirmation. Typically aim for 3-8 topics per feature.

### Step 3: Generate Specifications

For each confirmed topic, create `specs/{topic}.md` using the spec template.

Reference: `vista/skills/specs/references/spec-template.md`

Each spec must include these sections:

#### Requirement ID Assignment

Before writing specs, derive the feature prefix: take the first letter of each hyphenated word (e.g., `ralph-mcp-tool` -> `RMT`, `user-auth` -> `UA`). Single words use the first 2-3 letters (e.g., `dashboard` -> `DSH`).
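One heuristic that reproduces all three examples (word initials for hyphenated names; first letter plus following consonants, truncated to three characters, for single words) is sketched below. Treat it as an illustration, not the canonical rule:

```python
def derive_prefix(feature: str) -> str:
    """Derive a 2-3 letter requirement-ID prefix from a feature name."""
    words = feature.split("-")
    if len(words) > 1:
        # Hyphenated name: take the initial of each word.
        return "".join(w[0] for w in words).upper()
    word = words[0]
    # Single word: first letter plus following consonants, capped at 3 chars.
    consonants = [c for c in word[1:] if c not in "aeiou"]
    return (word[0] + "".join(consonants))[:3].upper()
```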

Assign sequential IDs to every requirement:

- Functional: `{PREFIX}-F1`, `{PREFIX}-F2`, ...
- Non-Functional: `{PREFIX}-NF1`, `{PREFIX}-NF2`, ...
- Business Rules: `{PREFIX}-BR1`, `{PREFIX}-BR2`, ...

Format each requirement line as: `1. **{PREFIX}-F1:** Requirement text`

See `vista/skills/spec-update/references/conventions.md` for full ID format details.
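For quick validation, the ID format above can be captured in a regex. A sketch, assuming 2-3 letter prefixes as in the examples:

```python
import re

# Prefix of 2-3 capital letters, a category code (F, NF, BR), and a number.
ID_RE = re.compile(r'^[A-Z]{2,3}-(?:F|NF|BR)\d+$')

def is_valid_requirement_id(req_id: str) -> bool:
    return bool(ID_RE.match(req_id))
```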

#### Standard sections (always present)

- Overview — One sentence without "and"
- Changelog — Table with an initial "Created" entry (date, ---, "Created", "Initial spec generation", ---)
- Parent JTBD — Which job to be done this supports
- Scope — In/out of scope
- Requirements — Functional and non-functional, each with a requirement ID
- User Workflows — Actor, trigger, steps, error cases
- Data Model — Entities, fields, relationships
- Integration Points — Dependencies and provides-to
- Technical Considerations — Architecture, patterns, libraries
- Acceptance Criteria — Specific, testable checkpoints
- Requirement Index — Table mapping all IDs to section and status (Active/Deprecated)

#### Diagram cross-references (always present)

Add a `## Related Diagrams` section to every spec:

```markdown
## Related Diagrams

- `arch/system-architecture.mmd` — Shows this component's role in the system
- `arch/data-model.mmd` — Entity definitions relevant to this spec
- `arch/sequence-checkout.mmd` — Interaction flow for the checkout workflow
```

Only reference diagrams actually relevant to the spec's topic.

#### Testing Strategy (for TDD-flagged components)

If the spec covers a component flagged as a TDD candidate, it MUST include a `## Testing Strategy` section derived from the TDD diagrams:

```markdown
## Testing Strategy

**TDD Required:** Yes (flagged during planning)
**TDD Diagrams:** `arch/tdd-pricing-logic.mmd`, `arch/tdd-pricing-decision.mmd`

### Test Cases (from TDD diagrams)

#### Happy Path
1. **Valid order with gold tier** — Input: non-empty items + gold tier → Expected: 15% discount + free shipping (subtotal > $50)
2. **Valid order with standard tier** — Input: non-empty items + standard → Expected: no tier discount, standard shipping

#### Edge Cases
3. **Empty order** — Input: empty items array → Expected: EMPTY_ORDER error
4. **Boundary: exactly $50 subtotal** — Verify shipping threshold behavior
5. **Single item order** — Minimum viable order

#### Error Conditions
6. **Negative item price** — Expected: validation error
7. **Unknown user tier** — Expected: default to standard tier

### Test Data Requirements
- Mock user with each tier level
- Order items covering various price points
- Edge case amounts ($0, $50 exactly, $50.01)
```

For specs covering non-TDD components, omit this section.
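As an illustration of how such cases might become executable tests: the rules below (15% gold discount, free shipping when the subtotal exceeds $50, unknown tiers falling back to standard) come straight from the example expectations, but the `price_order` function and the $5 flat shipping rate are hypothetical:

```python
def price_order(items, tier):
    """Hypothetical pricing function matching the example TDD expectations."""
    if not items:
        raise ValueError("EMPTY_ORDER")
    if any(p < 0 for p in items):
        raise ValueError("NEGATIVE_PRICE")
    if tier not in ("gold", "standard"):
        tier = "standard"                    # unknown tier -> default to standard
    subtotal = sum(items)
    discount = round(subtotal * 0.15, 2) if tier == "gold" else 0.0
    free_shipping = tier == "gold" and subtotal > 50
    shipping = 0.0 if free_shipping else 5.0  # $5 flat rate is an assumption
    return {"subtotal": subtotal, "discount": discount, "shipping": shipping}
```

Each numbered case in the diagram then maps to one assertion against this function (e.g., the boundary case checks that a $50.00 subtotal still pays shipping).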

### Step 4: Validate and Report

#### 4a. Run quality checks

Verify all specs meet quality standards:

- Every spec passes the "one sentence without and" test
- Every functional requirement has at least one acceptance criterion
- Every TDD-flagged component's spec has a Testing Strategy section
- Every spec has a Related Diagrams section with valid references
- No spec mixes unrelated concerns (architecture AND TDD in the same scope)
- All open questions are resolved (no `## Open Questions` with unanswered items)
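Several of these checks can be automated. A minimal linter sketch, assuming the section names from this document; `lint_spec` is a hypothetical helper, and the Open Questions check only flags the section for human review:

```python
REQUIRED_SECTIONS = [
    "## Overview", "## Changelog", "## Requirements",
    "## Acceptance Criteria", "## Related Diagrams", "## Requirement Index",
]

def lint_spec(text: str) -> list:
    """Return a list of issues found in one spec file's markdown text."""
    issues = [f"missing section: {s}" for s in REQUIRED_SECTIONS if s not in text]
    if "## Open Questions" in text:
        issues.append("Open Questions section present; verify all items are resolved")
    return issues
```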

#### 4b. Report to user

Tell the user:

1. Specs generated: [N] spec files in `specs/` covering all identified topics
2. Quality status: All checks passed / [list any issues]
3. Source of truth: Specs + arch diagrams are now the authoritative implementation reference
4. Next steps:
   - Review specs for accuracy and completeness
   - Use `/vista:spec-update <feature> [topic]` to modify specs with proper change tracking
   - Start Ralph loops for implementation:
     - Plan mode first: builds `IMPLEMENTATION_PLAN.md` from specs
     - Build mode: implements code from the plan
   - Use `# Implements {ID}: description` comments in code for traceability
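Those traceability comments can later be harvested mechanically. A sketch, assuming the `# Implements {ID}:` comment form shown above; the regex and function name are illustrative:

```python
import re

# Matches comments like `# Implements RMT-F1: description`.
IMPLEMENTS_RE = re.compile(r'#\s*Implements\s+([A-Z]{2,3}-(?:F|NF|BR)\d+):')

def implemented_ids(source: str) -> set:
    """Collect the requirement IDs referenced in a source file's comments."""
    return set(IMPLEMENTS_RE.findall(source))
```

Diffing this set against each spec's Requirement Index would show which requirements have no implementation comment yet.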

## Output Files

After completion, `.vista/features/<name>/specs/` contains:

```
specs/
├── auth-flow.md              # Topic specification
├── data-sync.md              # Topic specification
├── payment-processing.md     # Topic specification (with Testing Strategy if TDD-flagged)
└── notification-system.md    # Topic specification
```

## References

Reference documents in `vista/skills/specs/references/`:

- `spec-template.md` — Template for topic specifications (with Related Diagrams and Testing Strategy sections)
- `test-planning.md` — Guide for planning test cases during spec generation
- `acceptance-criteria-format.md` — Format guide for acceptance criteria