Module Creation Best Practices
This document is a practical guide for efficiently creating new modules for the AI Instruction Kits.
Original Documents
For detailed content, please refer to the original documents:
Overview
This document summarizes practical insights gained from a large-scale module creation project in January 2025. It particularly focuses on efficient parallel investigation strategies and quality assurance methods.
Key Learning Points
1. Power of Parallel Investigation Strategy
Performance Data
- Modules Created: 41 (35 new + 6 improved)
- Work Time: Approximately 3 hours
- Efficiency: Average of about 4.4 minutes per module (≈180 minutes ÷ 41 modules)
Success Factors
- Ensuring Task Independence
- Make each investigation task completely independent
- Process dependent tasks sequentially
- Appropriate Tool Selection
- Web search: Understanding latest trends
- Literature review: Confirming authoritative sources
- Implementation example collection: Practical code examples
- Utilizing Batch Processing
- Process similar tasks together
- Minimize context switching
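The parallel strategy above can be sketched in a few lines of Python. This is an illustrative skeleton, not the project's actual tooling: `investigate` and the `topics` list are hypothetical placeholders for the per-module research tasks.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical, mutually independent investigation tasks.
topics = ["web_search", "literature_review", "implementation_examples"]

def investigate(topic):
    # Placeholder for one self-contained investigation task.
    # Each task must not depend on another task's output;
    # dependent follow-up work is done sequentially afterwards.
    return f"notes for {topic}"

# Run all independent tasks in parallel (map preserves input order).
with ThreadPoolExecutor() as pool:
    results = list(pool.map(investigate, topics))
```

Batching similar topics into one `pool.map` call is also what keeps context switching to a minimum.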
2. Module Development Process
Phase 1: Planning and Preparation
Checklist:
- Determine category
- Estimate number of required modules
- Plan investigation strategy
- Prepare templates
Phase 2: Parallel Investigation
Investigation items:
- 2024-2025 best practices
- Industry standards and specifications
- Implementation patterns and anti-patterns
- Tools and frameworks
Phase 3: Implementation
Implementation steps:
1. Create metadata (YAML)
2. Write body content (Markdown)
3. Define variables and dependencies
4. Add implementation examples
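As a sketch of step 1, a metadata file might look like the following. The field values and file path are illustrative; the field names and formats follow the validation rules described later in this document.

```yaml
# modules/task_example_module.yaml (illustrative values)
id: "task_example_module"   # must start with the category prefix
name: "Example Module"
version: "1.0.0"
description: "A short description of what this module does"
tags:
  - example
dependencies:
  - another_module          # flat list, not a nested mapping
```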
Phase 3.5: Creating Concise Version
Important principles:
- Create the detailed version first: a complete version grounded in broad, deep research into best practices
- Extract the essence: carry over only the most important concepts from the detailed version
- Size target: 20-30% of the detailed version (for token efficiency)
Creation steps:
1. Confirm detailed version completion
2. Identify core concepts (express in 1-2 sentences)
3. Convert to quick reference in tabular format
4. List essential best practices as bullet points
This order enables creation of a well-founded concise version based on deep understanding.
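The 20-30% size target can be checked mechanically. A minimal sketch, using character count as a rough, tokenizer-independent proxy for tokens (the sample texts are stand-ins, not real module content):

```python
def size_ratio(detailed: str, concise: str) -> float:
    # Character count as a rough proxy for token count.
    return len(concise) / len(detailed)

# Stand-ins for a real detailed/concise module pair.
detailed_text = "x" * 1000
concise_text = "x" * 250

ratio = size_ratio(detailed_text, concise_text)
# Target band from the guideline above: 20-30% of the detailed version.
assert 0.20 <= ratio <= 0.30
```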
Phase 4: Quality Assurance
Quality checks:
- Structural consistency
- Verify implementation examples work
- Variable appropriateness
- Documentation completeness
3. Category-Specific Best Practices
Expertise Modules
- Features: Deep specialized knowledge, emphasis on latest trends
- Required Elements:
- Theoretical background
- Implementation examples (3 or more)
- Compliance with industry standards
- Success metrics
Skills Modules
- Features: Specific implementation techniques
- Required Elements:
- Step-by-step guides
- Error handling
- Performance considerations
- Practical tips
Methods Modules
- Features: Processes and methodologies
- Required Elements:
- Phased approaches
- Deliverables for each phase
- Clear roles and responsibilities
- Implementation examples
Quality Metrics
Quantitative Metrics
coverage:
- Module coverage: 90% or above
- Test coverage: 80% or above
- Documentation completeness: 100%
performance:
- Generation time: Within 5 seconds
- Memory usage: 100MB or less
- Dependency resolution: Automatic
Qualitative Metrics
quality:
- Readability: Clear and concise
- Practicality: Immediately usable
- Maintainability: Easy to update
- Extensibility: Simple to add new features
Recommended Tools and Techniques
Development Tools
- VS Code: Syntax highlighting and preview
- Git: Version control
- Python: Module generation testing
- YAML Linter: Metadata validation
Efficiency Techniques
- Template Utilization
cp templates/module_template.md modules/new_module.md
- Batch Generation Scripts
# Bulk generation of multiple modules
for module in module_list:
    generate_module(module)
- Automated Validation Tools
scripts/validate-modules.sh
Module Validation
Validation Script Overview
The project includes scripts to automatically validate module metadata (YAML files) for correctness.
Usage
# Validate all modules
./scripts/validate-modules.sh
# Example output
Starting module metadata validation...
Language: en
Category: tasks
✅ blog_writing.yaml
✅ project_planning.yaml
✅ thesis_writing.yaml
Validation Items
Required Fields
- id: Module identifier
- name: Module name
- version: Version information
- description: Module description
Format Checks
- Array fields: tags, dependencies, prerequisites must be arrays
- String fields: id, name, description must be strings
- Naming convention: id should start with {category}_ (e.g., task_, skill_)
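These checks can be expressed compactly in Python. This is a minimal illustrative sketch, not the project's actual validator (which is `scripts/validate-modules.sh`); the function name and error messages are assumptions.

```python
REQUIRED_FIELDS = ("id", "name", "version", "description")
ARRAY_FIELDS = ("tags", "dependencies", "prerequisites")
STRING_FIELDS = ("id", "name", "description")

def validate_metadata(meta: dict, prefix: str) -> list:
    """Return a list of error messages for a parsed metadata mapping."""
    errors = []
    # Required fields must be present.
    for field in REQUIRED_FIELDS:
        if field not in meta:
            errors.append(f"missing required field: {field}")
    # Array fields, when present, must be lists.
    for field in ARRAY_FIELDS:
        if field in meta and not isinstance(meta[field], list):
            errors.append(f"{field} must be an array")
    # String fields, when present, must be strings.
    for field in STRING_FIELDS:
        if field in meta and not isinstance(meta[field], str):
            errors.append(f"{field} must be a string")
    # Naming convention: id should start with "{category}_".
    module_id = meta.get("id", "")
    if isinstance(module_id, str) and not module_id.startswith(f"{prefix}_"):
        errors.append(f'id should start with "{prefix}_"')
    return errors
```

For example, `validate_metadata(meta, "task")` returns an empty list for a well-formed tasks-category module and one message per violation otherwise.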
Common Errors and Solutions
1. Dependencies Field Format Error
# ❌ Incorrect
dependencies:
  required:
    - module_name
  optional:
    - another_module

# ✅ Correct
dependencies:
  - module_name
  - another_module
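Existing metadata in the nested form can be flattened mechanically. A minimal sketch over the example values above (the required/optional distinction is simply dropped, since the expected format is a flat list):

```python
# Nested form as it appears in the incorrect example.
nested = {"required": ["module_name"], "optional": ["another_module"]}

# Flatten to the expected flat list, required entries first.
flat = nested.get("required", []) + nested.get("optional", [])
```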
2. ID Naming Convention Mismatch
# ❌ Incorrect (for tasks category)
id: "project_planning"

# ✅ Correct
id: "task_project_planning"
CI/CD Integration
It's recommended to run the validation script locally before creating a PR. In the future, automatic validation will be executed via GitHub Actions.
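When that integration lands, a minimal workflow might look like the following. This is an illustrative sketch, not the project's actual configuration: the workflow name and trigger are assumptions, and only the script path comes from this document.

```yaml
# .github/workflows/validate-modules.yml (illustrative)
name: validate-modules
on: [pull_request]
jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./scripts/validate-modules.sh
```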
Learning Resources
Recommended Materials
Community
- GitHub Issues: Questions and discussions
- Pull Requests: Improvement suggestions
Start Now
- Copy template
- Plan investigation
- Execute parallel investigation
- Implement module
- Run validation script
./scripts/validate-modules.sh
- Fix any errors
- Create pull request
Let's enrich the AI Instruction Kits with efficient module development!