# Learning Hour MCP Product Backlog
## Product Vision
Extend Claude's capabilities to create highly contextual,
team-specific Learning Hour content by integrating with development
workflows and providing adaptive learning intelligence.
## MCP Architecture Context
This is an **MCP server** that provides tools to Claude Desktop and other AI assistants.
Features must:
- Work through conversational interfaces (no UIs or dashboards)
- Provide tools/functions that Claude can invoke
- Return structured data that Claude can interpret and present
- Integrate with external systems to bring context into conversations
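For orientation, here is a minimal sketch of how one of these tools could be registered using the official MCP TypeScript SDK. The tool name comes from Epic 1 below; the parameter shapes and placeholder result are illustrative assumptions, not a final design.
```typescript
// Minimal sketch of an MCP server registering one backlog tool with
// the official TypeScript SDK. Parameter and result shapes are
// illustrative assumptions, not a final design.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "learning-hour-mcp", version: "0.1.0" });

server.tool(
  "analyze_repository", // tool name from Epic 1 below
  {
    repoUrl: z.string().describe("Git URL of the team's repository"),
    codeSmell: z.string().describe("e.g. 'Feature Envy'"),
  },
  async ({ repoUrl, codeSmell }) => {
    // A real implementation would clone and scan repoUrl here; this
    // placeholder shows the structured-data contract Claude sees.
    const findings = [
      { smell: codeSmell, file: "src/UserService.java", lines: "45-67", confidence: 0.85 },
    ];
    return { content: [{ type: "text", text: JSON.stringify(findings) }] };
  },
);

// Claude Desktop launches MCP servers as subprocesses over stdio.
await server.connect(new StdioServerTransport());
```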
## User Personas
### Primary: Technical Coach Sarah
- Uses Claude Desktop with Learning Hour MCP to prepare sessions
- Wants Claude to generate content using her team's actual codebase
- Needs Claude to remember previous sessions and adapt recommendations
- Values contextual, immediately applicable learning materials
### Secondary: Development Team Lead Mike
- Occasionally asks Claude to generate Learning Hour content
- Wants examples that use his team's technology stack and patterns
- Needs quick, relevant content with minimal prep time
- Values integration with team's actual development context
---
## MCP Tools to Build
### Epic 1 Tools:
- `analyze_repository` - Scan codebase for learning opportunities
- `analyze_tech_stack` - Detect team's technology preferences
- `generate_contextual_session` - Create Learning Hour using team's actual code
### Epic 2 Tools:
- `record_session_outcome` - Store completed session context
- `get_learning_recommendations` - Suggest next topics based on history
- `update_team_profile` - Maintain team learning context
### Epic 3 Tools:
- `analyze_recent_prs` - Find learning opportunities in recent pull requests
- `track_learning_application` - Measure whether learning is being applied
- `generate_improvement_metrics` - Show before/after code quality changes
### Epic 4 Tools:
- `create_github_issues` - Turn Learning Hour outcomes into actionable GitHub issues
- `post_to_slack` - Share session summaries and reminders in team chat
### Epic 5 Tools:
- `get_facilitation_guidance` - Provide coaching advice for session facilitation
---
## Epic 1: Team Codebase Integration (HIGHEST IMPACT)
*Transform generic Learning Hours into team-specific sessions using actual codebases*
### Story 1.1: Repository Analysis for Learning Opportunities
```gherkin
Feature: Repository Analysis for Learning Opportunities
As a Technical Coach using Claude
I want Claude to analyze our team's actual codebase
So that Learning Hour examples come from code we maintain daily
Scenario: Finding real code smell examples
Given I ask Claude "analyze our GitHub repo for Feature Envy examples"
When Claude invokes the analyze_repository tool
Then it should scan the codebase for Feature Envy patterns
And return 3-5 real code examples with file paths and line numbers
And include confidence scores and complexity ratings
And suggest which examples work best for different experience levels
And Claude should present these as "Here are Feature Envy examples from your actual codebase..."
Scenario: Generating team-specific refactoring examples
Given Claude found Feature Envy in src/UserService.java:45-67
When I ask "create a Learning Hour using this example"
Then Claude should generate before/after code using the actual methods
And preserve the team's coding style and naming conventions
And reference actual classes and business logic (anonymized)
And show how the refactoring improves this specific codebase
And create session content that says "Let's improve our UserService class"
Scenario: Protecting sensitive business information
Given our codebase contains proprietary algorithms and sensitive data
When Claude analyzes for learning examples
Then it should automatically anonymize business-specific names
And replace sensitive constants with placeholder values
And preserve technical structure while removing domain context
And flag any examples that cannot be safely anonymized
And ask for permission before including potentially sensitive code
```
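A possible shape for the structured data `analyze_repository` returns, mirroring the fields the scenarios above call for; all field names here are illustrative assumptions:
```typescript
// Possible structured result for analyze_repository, mirroring the
// scenarios above; field names are assumptions for illustration.
interface CodeSmellFinding {
  file: string;                 // e.g. "src/UserService.java"
  lines: { start: number; end: number };
  smell: string;                // e.g. "Feature Envy"
  confidence: number;           // 0..1 score from the detector
  complexity: "low" | "medium" | "high";
  suitableFor: ("beginner" | "intermediate" | "advanced")[];
  anonymized: boolean;          // false => ask the coach before including
}

interface RepositoryAnalysis {
  repoUrl: string;
  findings: CodeSmellFinding[]; // 3-5 examples per the scenario
  flaggedAsSensitive: string[]; // files that could not be safely anonymized
}
```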
### Story 1.2: Technology Stack Adaptation
```gherkin
Feature: Technology Stack Adaptation
As a Technical Coach
I want Learning Hour content adapted to our team's technology choices
So that examples use frameworks and patterns we actually work with
Scenario: Detecting team's technology profile
Given I provide Claude with our repository URL
When Claude invokes the analyze_tech_stack tool
Then it should identify primary programming languages
And detect main frameworks (React, Spring Boot, Django, etc.)
And identify testing frameworks and build tools in use
And note architectural patterns (REST APIs, microservices, etc.)
And store this as reusable team context for future sessions
Scenario: Generating stack-specific examples
Given our team uses "Node.js with Express and Jest"
When I ask Claude "create a Clean Code Learning Hour"
Then examples should use Express route handlers and middleware
And JavaScript/TypeScript patterns our team recognizes
And Jest testing examples that match our existing test structure
And Node.js-specific refactoring opportunities
And reference packages we already use (lodash, moment, etc.)
Scenario: Learning path recommendations based on tech stack
Given our tech profile shows "React with Redux and RTK Query"
When I ask Claude "what should we learn next after completing TDD basics?"
Then suggestions should prioritize React-specific testing patterns
And Redux state management testing strategies
And component testing with React Testing Library
And API mocking patterns that work with RTK Query
```
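As a sketch of how `analyze_tech_stack` might detect a Node.js team's profile: read `package.json` and match known frameworks. The framework map is deliberately partial, and other ecosystems (pom.xml, requirements.txt, build.gradle) would need their own probes.
```typescript
import { readFile } from "node:fs/promises";

// Hypothetical detector behind analyze_tech_stack for JS/TS repos:
// infer the stack from package.json dependencies.
const KNOWN_FRAMEWORKS: Record<string, string> = {
  react: "React",
  express: "Express",
  jest: "Jest",
  "@reduxjs/toolkit": "Redux Toolkit",
  "@testing-library/react": "React Testing Library",
};

async function detectNodeStack(repoPath: string): Promise<string[]> {
  const raw = await readFile(`${repoPath}/package.json`, "utf8");
  const pkg = JSON.parse(raw);
  const deps = { ...pkg.dependencies, ...pkg.devDependencies };
  return Object.keys(deps)
    .filter((name) => name in KNOWN_FRAMEWORKS)
    .map((name) => KNOWN_FRAMEWORKS[name]);
}
```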
---
## Epic 2: Adaptive Learning Intelligence
*Enable Claude to remember learning context and provide intelligent recommendations over time*
### Story 2.1: Learning Session History and Context
```gherkin
Feature: Learning Session History and Context
As a Technical Coach using Claude
I want Claude to remember our previous Learning Hours
So that future sessions build logically on past learning
Scenario: Recording completed sessions
Given I tell Claude "we completed the TDD basics session last week"
When Claude invokes the record_session_outcome tool
Then it should store that this team completed TDD basics
And note the date and any specific feedback I provide
And associate this with the team's repository context
And update the team's learning progression profile
Scenario: Getting intelligent next session recommendations
Given our team has completed "TDD Basics" and "Red-Green-Refactor"
When I ask Claude "what should we learn next?"
Then Claude should invoke get_learning_recommendations
And suggest logical follow-up topics like "TDD with Legacy Code"
And explain why each suggestion builds on previous learning
And prioritize topics that address gaps identified in past sessions
And consider our team's technology stack and codebase complexity
Scenario: Adapting content based on team struggles
Given I tell Claude "the team struggled with writing testable code during our TDD session"
When I ask for the next Learning Hour topic
Then Claude should prioritize topics like "Dependency Injection" or "Test Doubles"
And suggest revisiting TDD concepts with simpler examples
And recommend preparatory activities to address the identified struggle
And adjust difficulty levels for future TDD-related content
```
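One way `record_session_outcome` and `get_learning_recommendations` could fit together, sketched under the assumption of a simple topic graph plus struggle-driven remediation. The graph entries and struggle strings are illustrative, not a fixed curriculum:
```typescript
// Sketch of what record_session_outcome might persist and how
// get_learning_recommendations could consume it.
interface SessionOutcome {
  teamId: string;
  topic: string;       // e.g. "TDD Basics"
  date: string;        // ISO date
  struggles: string[]; // e.g. ["writing testable code"]
  feedback?: string;
}

// Simple follow-up graph: topic -> sensible next topics.
const FOLLOW_UPS: Record<string, string[]> = {
  "TDD Basics": ["Red-Green-Refactor", "Test Doubles"],
  "Red-Green-Refactor": ["TDD with Legacy Code", "Outside-In TDD"],
};

function recommendNext(history: SessionOutcome[]): string[] {
  const done = new Set(history.map((s) => s.topic));
  // Struggles take priority: trouble writing testable code points
  // toward Dependency Injection before advancing the TDD track.
  const remediation = history
    .flatMap((s) => s.struggles)
    .includes("writing testable code") ? ["Dependency Injection"] : [];
  const followUps = history
    .flatMap((s) => FOLLOW_UPS[s.topic] ?? [])
    .filter((t) => !done.has(t));
  return [...new Set([...remediation, ...followUps])];
}
```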
### Story 2.2: Team Learning Profile Management
```gherkin
Feature: Team Learning Profile Management
As a Technical Coach
I want Claude to maintain context about each team's learning journey
So that recommendations become more personalized over time
Scenario: Building team learning profiles
Given I'm coaching multiple teams with Claude
When I specify which team I'm asking about
Then Claude should maintain separate learning profiles for each team
And track their completed topics, struggled areas, and skill progression
And remember their technology stack and codebase characteristics
And note their preferred learning styles and session formats
Scenario: Generating experience-appropriate content
Given our team profile shows "completed 8 Learning Hours, strong in testing, weak in refactoring"
When I ask Claude to generate content for "Strategy Pattern"
Then it should create intermediate-level content (not beginner)
And include more refactoring practice since that's a weak area
And reference testing patterns they already know well
And use examples from their technology stack
Scenario: Cross-team learning insights
Given I coach multiple teams using Claude
When I ask "which teams would benefit from a Clean Code session?"
Then Claude should analyze all team profiles
And identify teams that haven't covered Clean Code recently
And suggest which teams might benefit from a combined group session
And recommend different approaches based on each team's experience level
```
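An illustrative shape for the profile `update_team_profile` maintains, along with the kind of cross-team query the last scenario describes (field names are assumptions):
```typescript
// Illustrative per-team profile plus a cross-team query helper.
interface TeamProfile {
  teamId: string;
  completedTopics: string[];
  strengths: string[];     // e.g. ["testing"]
  weakAreas: string[];     // e.g. ["refactoring"]
  techStack: string[];     // from analyze_tech_stack
  sessionsCompleted: number;
}

// "Which teams would benefit from a Clean Code session?"
function teamsNeedingTopic(profiles: TeamProfile[], topic: string): TeamProfile[] {
  return profiles.filter((p) => !p.completedTopics.includes(topic));
}
```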
---
## Epic 3: Advanced Context-Aware Content Generation
*Enhance content quality through better understanding of team context and learning needs*
### Story 3.1: Pull Request and Code Review Integration
```gherkin
Feature: Pull Request and Code Review Integration
As a Technical Coach
I want Claude to analyze recent team pull requests
So that Learning Hours address actual code quality issues the team faces
Scenario: Identifying learning opportunities from recent PRs
Given I ask Claude "what Learning Hour topics would help our team based on recent code reviews?"
When Claude invokes the analyze_recent_prs tool
Then it should scan the last 20 pull requests for common patterns
And identify recurring code review comments about naming, complexity, testing
And suggest Learning Hour topics that address the most frequent issues
And provide specific examples from the PRs (anonymized) to use in sessions
Scenario: Generating improvement-focused sessions
Given recent PRs show frequent comments about "long parameter lists"
When I ask Claude to create a Learning Hour for this issue
Then it should generate content specifically about "Parameter Object" refactoring
And use actual examples from our recent commits (anonymized)
And show before/after using our team's coding style
And explain how this addresses the specific review feedback patterns
Scenario: Tracking improvement over time
Given we completed a Learning Hour on "Method Naming" 4 weeks ago
When I ask Claude "has our code review feedback improved since the naming session?"
Then it should analyze recent PRs compared to pre-session PRs
And identify whether naming-related comments have decreased
And suggest follow-up topics if issues persist
And highlight success stories where the learning was applied
```
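A sketch of how `analyze_recent_prs` might tally recurring review themes using the GitHub API via `@octokit/rest`. The keyword buckets are crude assumptions; a production version might hand the comments to Claude for classification instead:
```typescript
import { Octokit } from "@octokit/rest";

// Tally recurring review-comment themes across the last 20 PRs.
const THEMES: Record<string, RegExp> = {
  naming: /\b(rename|naming|unclear name)\b/i,
  complexity: /\b(too long|complex|nested)\b/i,
  testing: /\b(test|coverage|assert)\b/i,
};

async function tallyReviewThemes(owner: string, repo: string, token: string) {
  const octokit = new Octokit({ auth: token });
  const { data: prs } = await octokit.rest.pulls.list({
    owner, repo, state: "closed", per_page: 20,
  });
  const counts: Record<string, number> = Object.fromEntries(
    Object.keys(THEMES).map((t) => [t, 0]),
  );
  for (const pr of prs) {
    const { data: comments } = await octokit.rest.pulls.listReviewComments({
      owner, repo, pull_number: pr.number,
    });
    for (const c of comments)
      for (const [theme, pattern] of Object.entries(THEMES))
        if (pattern.test(c.body ?? "")) counts[theme]++;
  }
  return counts; // highest counts suggest the next Learning Hour topic
}
```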
### Story 3.2: Difficulty and Experience Adaptation
```gherkin
Feature: Difficulty and Experience Adaptation
As a Technical Coach using Claude
I want content automatically adapted to team experience levels
So that sessions are appropriately challenging without being overwhelming
Scenario: Auto-detecting team experience from codebase
Given Claude analyzes our team's codebase
When it evaluates code complexity, testing patterns, and architecture
Then it should infer whether this is a beginner, intermediate, or advanced team
And note specific strengths (good testing) and weaknesses (complex methods)
And use this context to automatically adjust content difficulty
And recommend appropriate starting points for different topics
Scenario: Generating experience-appropriate examples
Given our team profile shows "advanced in testing, intermediate in design patterns"
When I ask Claude to create a "Strategy Pattern" Learning Hour
Then it should assume familiarity with testing concepts
And focus more time on design pattern implementation
And use testing examples that leverage our advanced testing skills
And provide design pattern exercises at intermediate difficulty
Scenario: Progressive complexity in learning paths
Given we've completed basic TDD and are ready for advanced topics
When I ask Claude "what's next in our TDD journey?"
Then it should suggest "TDD with Legacy Code" or "Outside-In TDD"
And explain how these build on our established TDD foundation
And provide exercises that challenge us beyond basic red-green-refactor
And include techniques for handling complex real-world scenarios
```
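A rough heuristic for the experience inference described above; the metrics and thresholds are illustrative assumptions, not calibrated values:
```typescript
// Infer team experience level from simple codebase metrics.
interface CodebaseMetrics {
  testToCodeRatio: number;       // test LOC / production LOC
  avgMethodLength: number;       // lines per method
  usesDependencyInjection: boolean;
}

type Level = "beginner" | "intermediate" | "advanced";

function inferExperience(m: CodebaseMetrics): Level {
  let score = 0;
  if (m.testToCodeRatio > 0.5) score++;   // meaningful test suite
  if (m.avgMethodLength < 15) score++;    // small, focused methods
  if (m.usesDependencyInjection) score++; // decoupled design
  return score >= 3 ? "advanced" : score === 2 ? "intermediate" : "beginner";
}
```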
---
## Epic 4: Development Workflow Integration
*Connect Learning Hours with daily development practices through tool integrations*
### Story 4.1: GitHub Integration for Learning Context
```gherkin
Feature: GitHub Integration for Learning Context
As a Technical Coach using Claude
I want to connect Learning Hours with our GitHub workflow
So that learning becomes part of our daily development process
Scenario: Creating learning-focused GitHub issues
Given we completed a Learning Hour on "Extract Method" refactoring
When I ask Claude "create GitHub issues for applying this learning"
Then Claude should invoke the create_github_issues tool
And create issues for refactoring opportunities found in our codebase
And tag them with "learning-hour" and "refactoring" labels
And include links to the Learning Hour content and examples
And assign them to appropriate team members based on their experience
Scenario: Tracking learning application in pull requests
Given our team learned about "Single Responsibility Principle" last week
When I ask Claude "has the team applied SRP in recent pull requests?"
Then Claude should analyze recent PRs for SRP improvements
And identify commits that demonstrate applying the learning
And highlight missed opportunities for applying SRP
And suggest gentle reminders or follow-up topics if needed
Scenario: Learning-driven code review suggestions
Given we have a Learning Hour session planned on "Clean Code"
When I ask Claude "what should we focus on in this week's code reviews?"
Then Claude should suggest specific Clean Code principles to watch for
And provide code review templates that reinforce the learning
And suggest how to give constructive feedback that references the Learning Hour
```
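A sketch of what `create_github_issues` could do with findings from `analyze_repository`, using `@octokit/rest`; the finding shape reuses the Epic 1 sketch and the issue wording is illustrative:
```typescript
import { Octokit } from "@octokit/rest";

// Create one labeled issue per refactoring opportunity so
// learning-hour follow-up work is easy to find.
async function createLearningIssues(
  octokit: Octokit,
  owner: string,
  repo: string,
  findings: { file: string; lines: string; smell: string }[],
  sessionLink: string,
) {
  for (const f of findings) {
    await octokit.rest.issues.create({
      owner, repo,
      title: `Refactor ${f.smell} in ${f.file}`,
      body: `Found at ${f.file}:${f.lines}.\nLearning Hour material: ${sessionLink}`,
      labels: ["learning-hour", "refactoring"],
    });
  }
}
```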
### Story 4.2: Slack/Teams Integration for Learning Continuity
```gherkin
Feature: Slack/Teams Integration for Learning Continuity
As a Technical Coach
I want to extend Learning Hour discussions into our team chat
So that learning conversations continue beyond the session
Scenario: Sharing Learning Hour summaries to team channels
Given we completed a Learning Hour on "TDD with Legacy Code"
When I ask Claude "share the key takeaways with the team"
Then Claude should invoke the post_to_slack tool
And create a well-formatted summary of the session
And include the main learning objectives and outcomes
And provide links to relevant resources and examples
And encourage continued discussion in the thread
Scenario: Setting up learning reinforcement reminders
Given our team learned about "Pair Programming" techniques
When I ask Claude "remind the team to try pair programming this week"
Then Claude should schedule reminder messages in our team channel
And suggest specific pairing opportunities based on current work
And provide tips for successful pairing sessions
And ask team members to share their pairing experiences
Scenario: Facilitating learning discussions in chat
Given someone asks a code quality question in Slack
When I mention Claude in the thread
Then Claude should provide relevant Learning Hour references
And suggest which completed sessions address this question
And offer to generate additional examples or explanations
And recommend follow-up Learning Hour topics if appropriate
```
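A sketch of `post_to_slack` using the official `@slack/web-api` client; the channel, token handling, and message format are illustrative assumptions:
```typescript
import { WebClient } from "@slack/web-api";

// Post a session recap to a team channel and invite discussion.
async function postSessionSummary(
  token: string,
  channel: string,
  summary: { topic: string; objectives: string[]; resources: string[] },
) {
  const slack = new WebClient(token);
  await slack.chat.postMessage({
    channel,
    text:
      `*Learning Hour recap: ${summary.topic}*\n` +
      `Objectives: ${summary.objectives.join(", ")}\n` +
      `Resources: ${summary.resources.join(" | ")}\n` +
      `Reply in this thread with questions or takeaways!`,
  });
}
```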
---
## Epic 5: Coaching Intelligence and Best Practices
*Provide Claude with coaching knowledge to help Technical Coaches improve their facilitation skills*
### Story 5.1: Facilitation Guidance and Tips
```gherkin
Feature: Facilitation Guidance and Tips
As a Technical Coach using Claude
I want coaching advice during session preparation
So that I can improve my facilitation skills
Scenario: Getting facilitation tips for specific topics
Given I ask Claude "how should I facilitate a Learning Hour on TDD for a skeptical team?"
When Claude invokes the get_facilitation_guidance tool
Then it should provide specific techniques for handling resistance to TDD
And suggest ways to demonstrate value quickly
And recommend starting with simple, obvious examples
And provide phrases for acknowledging concerns while moving forward
And include timing suggestions for different activities
Scenario: Adapting facilitation style to team dynamics
Given I tell Claude "my team includes 2 senior developers who dominate discussions"
When I ask for facilitation advice
Then Claude should suggest techniques for encouraging broader participation
And provide strategies for managing dominant personalities
And recommend small group activities that give everyone a voice
And suggest ways to leverage senior developers as coaches rather than leaders
Scenario: Handling difficult moments during sessions
Given I ask Claude "what if someone says 'this TDD stuff slows us down'?"
When Claude provides facilitation guidance
Then it should offer several response options
And suggest acknowledging the concern as valid
And provide concrete examples of how TDD speeds up development
And recommend trying a small experiment to demonstrate value
And include follow-up questions to keep the discussion productive
```
### Story 5.2: Learning Effectiveness Measurement
```gherkin
Feature: Learning Effectiveness Measurement
As a Technical Coach
I want Claude to help me measure whether Learning Hours are working
So that I can continuously improve my approach
Scenario: Analyzing code quality improvements
Given we completed Learning Hours on "Clean Code" and "Refactoring" over the past month
When I ask Claude "has our code quality improved since these sessions?"
Then Claude should analyze recent commits compared to pre-session code
And identify improvements in method length, naming, and complexity
And highlight specific examples where learning was applied
And suggest areas that still need attention
Scenario: Tracking team engagement and satisfaction
Given I tell Claude about team feedback from recent Learning Hours
When I ask "how can I improve engagement in future sessions?"
Then Claude should analyze patterns in the feedback
And identify which activities were most/least engaging
And suggest modifications based on team preferences
And recommend topics that build on successful sessions
Scenario: Measuring knowledge retention over time
Given we learned about "SOLID principles" 6 weeks ago
When I ask Claude "do we need to revisit SOLID principles?"
Then Claude should analyze recent code for SOLID principle application
And identify which principles the team remembers and applies
And suggest focused refresher sessions for forgotten concepts
And recommend ways to reinforce learning through code reviews
```
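As one concrete way `generate_improvement_metrics` could quantify this, here is a sketch comparing review-comment theme counts before and after a session (the counts could come from the `analyze_recent_prs` sketch in Epic 3); the report format is an assumption:
```typescript
// Compare per-theme review-comment counts before vs. after a session.
function improvementReport(
  before: Record<string, number>,
  after: Record<string, number>,
): Record<string, string> {
  const report: Record<string, string> = {};
  for (const theme of Object.keys(before)) {
    const prev = before[theme];
    const delta = prev === 0 ? 0 : ((prev - (after[theme] ?? 0)) / prev) * 100;
    report[theme] = delta >= 0
      ? `${delta.toFixed(0)}% fewer ${theme} comments`
      : `${(-delta).toFixed(0)}% more ${theme} comments`;
  }
  return report;
}
```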
---
## Priority Matrix (MCP-Focused)
### HIGHEST IMPACT (Build First)
1. **Repository Analysis & Team Context** (Epic 1)
- Transform generic content into team-specific Learning Hours
- Highest value differentiator for the MCP
   - Directly addresses the "relevance" problem with generic Learning Hours
### HIGH IMPACT (Build Next)
2. **Learning Intelligence & Memory** (Epic 2)
- Enable Claude to remember and build on previous sessions
- Creates adaptive learning pathways
- Makes repeat usage significantly more valuable
3. **Pull Request Integration** (Story 3.1)
- Connect Learning Hours to actual code quality issues
- Provides concrete ROI measurement
- Closes feedback loop between learning and application
### MEDIUM IMPACT (Future Versions)
4. **GitHub Workflow Integration** (Story 4.1)
- Extends learning into daily workflow
- Good for adoption and reinforcement
5. **Facilitation Intelligence** (Epic 5)
- Helps coaches improve over time
- Leverages Claude's knowledge of coaching best practices
### LOWER IMPACT
6. **Communication Tool Integration** (Epic 4.2)
- Convenience features, not core value
- Can be built as separate integrations later
---
## Success Metrics (MCP Context)
### For MCP Adoption
- 80% of generated Learning Hours use team-specific code examples
- Average Claude conversation length triples when MCP context is available
- 90% of coaches report content as "immediately applicable"
### For Learning Effectiveness
- Teams request follow-up sessions 60% more often with contextual content
- Code quality improvements measurable within 30 days
- Learning retention increases when using real codebase examples
### For Technical Coaches
- Preparation time reduced from 2 hours to 30 minutes
- Confidence in content relevance increases significantly
- Ability to demonstrate concrete value to leadership