Computer Science Principles
Plagiarism TDD Plan and Ceremony (Sample)
Overview
Mission: Students test the plagiarism prevention prototype in pairs/groups of 3, providing rapid feedback to drive iterative improvements while learning user testing methods for their own e-learning quests.
Format: Action-packed 25-30 minute testing sessions with immediate feedback capture
- 1 person tests on a phone
- 1 person tests on a computer
- 1 person records results in the Google Form
Links:
- Test Plagiarism Prevention: https://pages.opencodingsociety.com/plagiarism
- Capture feedback: Google Form
Pre-Test Setup (2 minutes)
- Grouping: Form pairs or groups of 3 students
- Tools: Each group gets feedback form + timer
- Mindset: “You’re testing the system, not being tested!”
- Goal: Experience the quest while thinking like designers
Introduction (5-7 minutes)
- Go over cases
- Start on APA quote finding
- Discuss auto-fill
- Show quest progress
- Introduce the feedback form
- Be sure everyone is engaged.
Module Testing & Feedback Collection
Below is an outline of testing ideas, along with the other assets supporting this ceremony.
Google TDD form design: Usability Test Plan
🔍 C1: Case Studies (< 5 minutes)
Action: Navigate through plagiarism case studies, watch content, complete activities
Immediate Feedback Questions:
Build a live feedback form for the following (a form sketch follows this section).
- Best UI Element (Page layout, audio, video, other)
- What worked best? _______
- Why? ________
- Biggest Plagiarism Takeaway/Surprise
- What shocked you most about the consequences?
-
- Least Favorite UI Element (Page layout, audio, video, other)
- What didn’t work? ______
- Why? ________
- Difficulty Ranking
- Hard 0 - 1 - 2 - 3 - 4 - 5 Easy
- Circle your rating
Group Discussion Prompt: “Share one memory from these cases with your partner”
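To prototype the live form before it is built in Google Forms, here is a minimal Python sketch that captures the C1 questions above as a data structure and prints them as a paper-form fallback. The `FormQuestion` class and `print_form` helper are illustrative assumptions for this plan, not part of the Google Forms product.

```python
# Minimal sketch: the C1 feedback questions as data, so the live form
# (or a paper backup) can be generated consistently for every group.
from dataclasses import dataclass, field


@dataclass
class FormQuestion:
    prompt: str                       # question shown to the tester
    kind: str = "short_text"          # "short_text", "choice", or "scale"
    choices: list[str] = field(default_factory=list)


C1_QUESTIONS = [
    FormQuestion("Best UI element", "choice",
                 ["Page layout", "Audio", "Video", "Other"]),
    FormQuestion("What worked best, and why?"),
    FormQuestion("Biggest plagiarism takeaway or surprise"),
    FormQuestion("Least favorite UI element", "choice",
                 ["Page layout", "Audio", "Video", "Other"]),
    FormQuestion("What didn't work, and why?"),
    FormQuestion("Difficulty ranking (0 = hard, 5 = easy)", "scale"),
]


def print_form(questions: list[FormQuestion]) -> None:
    """Print the questions as a numbered paper form for testing day."""
    for i, q in enumerate(questions, start=1):
        print(f"{i}. {q.prompt}")
        if q.kind == "choice":
            print("   Options:", " / ".join(q.choices))
        elif q.kind == "scale":
            print("   Hard 0 - 1 - 2 - 3 - 4 - 5 Easy")


if __name__ == "__main__":
    print_form(C1_QUESTIONS)
```

The same structure extends to the C2-C4 question sets below.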
📚 C2: APA Reference & Citation Training (5-7 minutes)
Action: Use interactive tools to learn APA format, practice with quote finder, build citations
Immediate Feedback Questions:
- Best UI Element (Page layout, Quote Finder, Building your own reference/citation, Test Fill)
- What tool/feature was most helpful? _______
- Why? ________
- Difficulty Assessment
- Was coming up with quotes too hard? Yes / No
- Why? ________
- Was building references/citations too hard? Yes / No
- Why? ________
- Quick Win: What’s one thing you learned that you’ll actually use?
-
✏️ C3: Error Correction Practice (5 minutes)
Action: Identify and fix citation errors in provided examples
Immediate Feedback Questions:
- Overall Experience
- Did you like this page? Yes / No / Sort of
- Best part: ________
- Missing Elements
- Did it feel like something was missing? Yes / No
- If yes, what? ______
- Difficulty Ranking
- Hard 0 - 1 - 2 - 3 - 4 - 5 Easy
- Circle your rating
- Engagement Check: Would you voluntarily do more of these exercises?
-
🛠️ C4: Build Your Own Document (5 minutes)
Action: Use AI-powered tools to create original content with proper citations
Immediate Feedback Questions:
- Favorite Feature
- What did you like best? ____
- Why? ________
- Challenge Identification
- What was hardest? ______
- How could it be easier? ____
- Real-World Application: Will you use tools like this for actual assignments?
-
Overall Quest Experience Review (5 minutes)
🎮 Main Page & Navigation
- Progress Understanding
- Did the status on the main page help you understand how to progress? Yes / No
- What was clear? ______
- What was confusing? ____
- Quest Concept
- Did this help you understand the idea of a quest? Yes / No
- How is this different from regular lessons?
-
🎯 C5 & C6 Preview (Show after C1-C4 feedback)
Action: Demonstrate teacher assessment interface and student certificate system
- Data Accumulation Understanding
- After seeing the teacher and student review, does this help you understand how data accumulates in the quest? Yes / No
- What’s smart about this approach?
-
- What would you change?
-
TDD Action Items Generation
Immediate Wins (Fix Today)
Based on feedback, identify:
- UI elements that consistently frustrate users
- Content that’s too hard/easy across multiple groups
- Missing features that multiple groups request
- Technical bugs or broken interactions
Sprint Improvements (Fix This Week)
Prioritize based on frequency of feedback (a tally sketch follows these lists):
- Navigation improvements
- Content clarity enhancements
- Interactive tool refinements
- Progress indicator adjustments
Future Enhancements (Next Iteration)
Consider for major updates:
- New features multiple groups suggest
- Advanced functionality requests
- Gamification improvements
- Accessibility enhancements
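To make "frequency of feedback" concrete, a small Python sketch like the one below can tally complaints from the exported form responses and bucket them into the three priority levels above. The file name `feedback_responses.csv`, the column names, and the 3/2/1-group thresholds are assumptions about the Google Form export, not its real schema.

```python
# Hedged sketch: count how often each complaint appears across groups,
# then bucket by frequency into Immediate / Sprint / Future.
import csv
from collections import Counter

LEAST_FAVORITE_COL = "Least Favorite UI Element"   # assumed column name
MISSING_COL = "If yes, what was missing?"          # assumed column name


def tally_complaints(path: str) -> Counter:
    """Count how many groups flagged each UI element or missing feature."""
    complaints: Counter = Counter()
    with open(path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            for col in (LEAST_FAVORITE_COL, MISSING_COL):
                answer = row.get(col, "").strip().lower()
                if answer and answer not in ("no", "n/a"):
                    complaints[answer] += 1
    return complaints


if __name__ == "__main__":
    counts = tally_complaints("feedback_responses.csv")
    for item, n in counts.most_common():
        bucket = "Immediate" if n >= 3 else "Sprint" if n == 2 else "Future"
        print(f"{bucket:>9}: {item} (raised by {n} group(s))")
```

The thresholds are a starting point; adjust them to the number of groups in the class.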
Student Learning Objectives
For Quest Builders (Your Students)
After this testing experience, students will:
- Understand User Testing: Experience giving and receiving rapid feedback
- Learn TDD Mindset: See how quick iterations improve products
- Practice Collaboration: Work in small groups to gather diverse perspectives
- Apply to Own Projects: Use similar feedback methods for their own quests
For Quest Improvement (Your Module)
This process will:
- Identify Pain Points: Find specific UI/content issues quickly
- Validate Learning: Confirm educational objectives are met
- Prioritize Fixes: Focus effort on most impactful improvements
- Build Engagement: Ensure quest format actually motivates students
Implementation Checklist
Before Testing Day:
- Print feedback forms for each group
- Set up timer stations (phone apps work great)
- Prepare C5/C6 demo for post-test reveal
- Brief students on “tester mindset” vs “student mindset”
During Testing:
- Keep energy high - this should feel like a game!
- Encourage honest feedback - “help make this better”
- Take photos of completed feedback forms
- Note which groups finish early/struggle with time
After Testing:
- Compile feedback within 24 hours while fresh (see the rating-summary sketch after this checklist)
- Identify top 3 immediate fixes to implement
- Share results with students - show them their impact!
- Create “before/after” comparison for next class
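For the compile-feedback step, a quick summary of the 0-5 difficulty ratings per module helps check whether rankings cluster at an appropriate level (see the success metrics below). This is a sketch under the same assumptions as above: the CSV name and rating column names are placeholders for the real export.

```python
# Sketch: average each module's 0 (hard) - 5 (easy) difficulty rating
# from the exported responses, within 24 hours of the session.
import csv
from statistics import mean, pstdev

RATING_COLUMNS = ["C1 Difficulty", "C3 Difficulty"]   # assumed columns


def summarize_ratings(path: str) -> None:
    """Print mean, spread, and count for each module's difficulty rating."""
    with open(path, newline="", encoding="utf-8") as handle:
        rows = list(csv.DictReader(handle))
    for col in RATING_COLUMNS:
        ratings = [int(r[col]) for r in rows if r.get(col, "").isdigit()]
        if ratings:
            print(f"{col}: mean {mean(ratings):.1f}, "
                  f"spread {pstdev(ratings):.1f}, n={len(ratings)}")
        else:
            print(f"{col}: no ratings recorded")


if __name__ == "__main__":
    summarize_ratings("feedback_responses.csv")
```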
Success Metrics for TDD Approach
Testing Process Success
- Students complete all modules within time limits
- Groups provide specific, actionable feedback
- Students understand connection to their own quest building
- Energy remains high throughout testing session
Content Improvement Success
- Clear patterns emerge in feedback (not random complaints)
- Specific UI elements consistently rated high/low
- Difficulty rankings cluster around appropriate levels
- Students express genuine interest in using final version
Learning Transfer Success
- Students reference this experience when building own quests
- Groups naturally discuss user experience principles
- Students ask to test each other’s prototypes
- Quality of student projects improves based on testing awareness
This TDD approach transforms testing from evaluation into collaboration - students become co-creators of the quest while learning essential UX skills for their own projects!