Section 2: Designing and Assessing Physical Computing Lessons¶
Learning Objectives
By the end of this section, you will be able to:
- Apply the "low floor, high ceiling, wide walls" principle to lesson design
- Create lesson plans that integrate computational thinking with curriculum subjects
- Use formative assessment strategies appropriate for physical computing
- Observe and document evidence of computational thinking in young learners
- Manage practical sessions effectively, including setup, routines, and troubleshooting
- Develop assessment approaches that capture learning without over-testing
2.1 Introduction: A Framework for What Follows¶
Before we dive into specific physical computing tools in the sections ahead—programmable robots, tangible interfaces, microcontrollers, and mechanical logic toys—it is essential to establish a pedagogical foundation. Having access to exciting tools is only the beginning. The real challenge—and opportunity—lies in designing lessons that use these tools effectively to develop children's computational thinking.
This section provides you with frameworks and strategies that you will apply throughout the rest of this course. As you explore each category of tool in subsequent sections, you will be asked to try them in your own classroom. The principles here will guide your lesson planning, help you observe and assess learning, and ensure your practical sessions run smoothly.
2.2 The Low Floor, High Ceiling, Wide Walls Principle¶
Origins of the Metaphor¶
In the 1970s, Seymour Papert—creator of Logo and pioneer of educational computing—articulated a design principle that remains central to good educational technology: tools should have low floors and high ceilings.
- Low floor: Easy to get started; novices can achieve meaningful results quickly
- High ceiling: Supports increasingly sophisticated work as learners develop expertise
Mitchel Resnick, Papert's colleague at MIT and creator of Scratch, extended this metaphor by adding wide walls:
- Wide walls: Supports diverse interests, learning styles, and creative directions
As Resnick (2017) explains: "It’s not enough to provide a single path from a low floor to a high ceiling; it’s important to provide multiple pathways. Why? We want all children to work on projects based on their own personal interests and passions—and because different children have different passions, we need technologies that support many different types of projects, so that all children can work on projects that are personally meaningful to them."
Why Wide Walls Matter¶
Wide walls ensure that physical computing appeals to all children, not just those who fit a particular profile. When walls are wide:
- A child interested in music can create a sound-responsive installation
- A child who loves animals can build a pet-feeding reminder
- A child fascinated by space can design a Mars rover simulation
- A child passionate about sport can create a reaction-time tester
Without wide walls, computing becomes a narrow corridor that excludes children whose interests don't align with stereotypical "tech" activities.
Applying the Principle to Lesson Design¶
| Principle | In Practice |
|---|---|
| Low floor | Start with pre-built examples children can explore and modify; use block-based programming; provide clear, achievable first tasks |
| High ceiling | Offer extension challenges; allow open-ended project work; don't cap what ambitious learners can achieve |
| Wide walls | Connect to multiple curriculum areas; offer choice in project themes; value diverse outcomes |
Example: A micro:bit emotion badge lesson
- Low floor: Children start by displaying a pre-programmed heart icon
- High ceiling: Advanced learners add button controls, animations, or sensor responses
- Wide walls: Children choose their own emotion theme—happy/sad, calm/excited, or something personal to them
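The progression above can be desk-checked before children move to MakeCode blocks or MicroPython on a real micro:bit. This plain-Python sketch models only the selection logic of the badge; the emotion names and the cycling-on-button-press behaviour are illustrative assumptions, not a fixed design.

```python
# Sketch of the emotion badge's selection logic (plain Python, illustrative).
# On a real micro:bit this would be built in MakeCode blocks or MicroPython.

EMOTIONS = ["happy", "sad"]  # wide walls: each child picks their own pair


def badge_display(button_presses):
    """Return the emotion shown after a number of A-button presses.

    Low floor: zero presses shows the pre-programmed starting icon.
    High ceiling: each press cycles to the next emotion, combining
    selection with a variable that tracks state.
    """
    index = button_presses % len(EMOTIONS)
    return EMOTIONS[index]
```

Walking through a table of press-counts and expected icons like this is itself a useful unplugged activity before children touch the hardware.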
2.3 Designing Effective Lessons¶
Starting with Learning Outcomes¶
Effective physical computing lessons begin with clear learning outcomes that specify what children will understand or be able to do. These outcomes should address both:
- Computational thinking skills: The CT concepts and practices being developed
- Curriculum connections: The subject knowledge being reinforced or explored
Weak outcome: "Children will use the micro:bit."
Strong outcome: "Children will use selection (if/then) to create a program that responds differently to different inputs, while exploring how living things respond to environmental stimuli."
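The strong outcome describes a concrete program shape: selection (if/then) producing different responses to different inputs. A minimal sketch in plain Python, modelling a plant responding to light, might look like this; the thresholds and response names are illustrative assumptions, and on a micro:bit the input would come from a light sensor reading rather than a function argument.

```python
# Plain-Python sketch of the "strong outcome": selection (if/then)
# responding differently to different inputs, mirroring how a living
# thing responds to an environmental stimulus. Thresholds are
# illustrative, not a prescribed design.


def plant_response(light_level):
    """Model a plant's response to light.

    light_level: 0 (dark) to 255 (bright), the rough range a
    micro:bit light sensor reports.
    """
    if light_level < 50:
        return "close leaves"   # too dark: conserve energy
    elif light_level < 200:
        return "open leaves"    # comfortable light: photosynthesise
    else:
        return "turn away"      # too bright: avoid scorching
```

The same three-branch structure is what you would look for in children's block-based programs when assessing whether the outcome has been met.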
The Explore-Create-Share Structure¶
Many successful physical computing lessons follow a three-phase structure:
Phase 1: Explore (10-15 minutes)
- Teacher introduces the tool/concept with a brief demonstration
- Children explore a working example: "What does this do? How does it work?"
- Discussion of what they notice and wonder
- Key vocabulary introduced in context
Phase 2: Create (20-30 minutes)
- Children work on structured or open-ended tasks
- Teacher circulates, observing, questioning, and supporting
- Opportunities for differentiation through challenge levels
- Collaboration encouraged (pairs or small groups)
Phase 3: Share (5-10 minutes)
- Children demonstrate their creations to peers
- Discussion of different approaches and solutions
- Reflection on what worked, what was challenging, what they learned
- Connection to broader concepts and future learning
Differentiation Strategies¶
Physical computing naturally supports differentiation because the same tools can be used at multiple levels:
| Strategy | Implementation |
|---|---|
| Graduated challenges | Provide bronze/silver/gold tasks with increasing complexity |
| Scaffolded resources | Offer starter code, worked examples, or step-by-step guides for those who need them |
| Open-ended projects | Allow children to set their own level of ambition |
| Mixed-ability pairs | Partner learners strategically so they can support each other |
| Extension prompts | "Now try..." challenges for those who finish quickly |
| Alternative tools | Some children might use visual programming while others use text-based code |
Common Pitfalls to Avoid¶
Pitfall 1: Technology without purpose
"Move the robot 10cm" teaches nothing meaningful. Instead: "Program the robot to deliver a message across the classroom without bumping into obstacles."
Pitfall 2: Demonstration overload
Long teacher demonstrations while children sit passively are counterproductive. Keep demonstrations brief, then let children explore.
Pitfall 3: One-size-fits-all pacing
In physical computing, children progress at very different rates. Build in flexibility rather than requiring everyone to move at the same pace.
Pitfall 4: Focusing on the tool, not the thinking
The micro:bit is not the learning outcome—computational thinking is. Keep the focus on the ideas, using the tool as a means to explore them.
2.4 Formative Assessment Strategies¶
The Challenge of Assessing CT¶
Computational thinking presents particular assessment challenges:
- It's a process, not just a product—how children think matters as much as what they create
- It involves multiple dimensions: concepts, practices, and perspectives
- It may be demonstrated in diverse ways across different tasks and contexts
- Traditional tests capture only a narrow slice of CT capability
Research confirms these challenges. Studies show that teachers associate CT assessment primarily with informal, formative approaches rather than formal tests (Sherwood et al., 2024). Experienced teachers rely heavily on observation to assess CT (Kalelioglu & Sentance, 2020).
Brennan and Resnick's Framework¶
A useful framework for thinking about CT assessment comes from Brennan and Resnick (2012), who identify three dimensions:
| Dimension | What It Includes | How to Assess |
|---|---|---|
| Computational concepts | Sequences, loops, conditionals, variables, events, parallelism, operators | Analyse what concepts appear in children's programs |
| Computational practices | Experimenting, debugging, reusing, abstracting, modularising | Observe how children work, not just what they produce |
| Computational perspectives | Expressing, connecting, questioning | Listen to how children talk about their work and computing |
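To make the "analyse what concepts appear" approach concrete, here is a tiny annotated program with comments marking where several of Brennan and Resnick's computational concepts show up. It is written in plain Python for illustration; a child's version would typically be in blocks, and a small program like this will not exhibit every concept in the table (events and parallelism, for instance, are absent here).

```python
# A small annotated program for practising concept-spotting.
# Comments mark which computational concept each part demonstrates.


def countdown(start):            # abstraction: the idea "count down" named
    steps = []
    n = start                    # variable: tracks the current number
    while n > 0:                 # loop: repetition until a condition fails
        steps.append(str(n))     # sequence: ordered instructions
        n = n - 1                # operator: arithmetic on the variable
    if start > 0:                # conditional: selection on the input
        steps.append("blast off!")
    return steps
```

Asking children to do the same annotation on their own programs doubles as a self-assessment activity.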
Practical Assessment Strategies¶
Strategy 1: Observation with prompts
Circulate during activities with specific things to look for:
- Does the child break problems into smaller parts? (decomposition)
- Do they notice and use patterns? (pattern recognition)
- Do they test and refine their solutions? (debugging)
- Can they explain their thinking? (abstraction)
Use a simple observation checklist or sticky notes to capture evidence.
Strategy 2: Think-aloud protocols
Ask children to verbalise their thinking as they work:
- "Tell me what you're trying to do."
- "What do you think will happen when you run this?"
- "Something went wrong—how will you figure out what?"
- "Why did you choose to do it that way?"
Strategy 3: Predict-Run-Investigate
Before running a program, ask children to predict what will happen. This reveals their mental model of the system:
- Accurate prediction suggests solid understanding
- Inaccurate prediction reveals misconceptions to address
- The gap between prediction and reality drives learning
Strategy 4: Peer explanation
Children explain their work to a partner:
- "Show your partner what your program does and explain how it works."
- The partner asks one question about something they don't understand.
Ability to explain demonstrates deeper understanding than ability to do.
Strategy 5: Documentation and portfolios
Children document their work through:
- Photographs of physical constructions
- Screenshots or exports of programs
- Written or recorded reflections
- Design journals showing planning and iteration
Portfolios capture growth over time and provide evidence for reporting.
What to Look For: Observable Indicators¶
| CT Skill | Observable Indicators |
|---|---|
| Decomposition | Breaks task into steps; works on one part at a time; identifies sub-problems |
| Pattern recognition | Notices similarities to previous problems; reuses successful strategies; spots repeated elements |
| Abstraction | Ignores irrelevant details; identifies what's essential; explains concepts in own words |
| Algorithmic thinking | Creates ordered sequences; uses precise instructions; considers all cases |
| Debugging | Tests systematically; isolates problems; persists when things go wrong |
| Evaluation | Reflects on solutions; considers improvements; compares approaches |
2.5 Rubrics for Physical Computing¶
Designing Effective Rubrics¶
Rubrics make expectations clear and support consistent assessment. For physical computing, effective rubrics:
- Focus on process as well as product
- Include computational thinking dimensions
- Allow for diverse outcomes (wide walls)
- Use child-friendly language where appropriate
Example Rubric: Programming Task¶
| Criterion | Beginning | Developing | Secure | Exceeding |
|---|---|---|---|---|
| Sequence | Program runs but not in intended order | Program mostly runs in correct sequence | Program runs in correct sequence with clear logic | Program uses efficient, well-organised sequencing |
| Selection | No use of conditionals | Attempts selection but with errors | Uses if/then correctly to respond to input | Uses complex conditions (if/else, multiple conditions) |
| Testing | Does not test before declaring finished | Tests once, makes no changes | Tests and debugs iteratively | Tests systematically, anticipates problems |
| Explanation | Cannot explain what program does | Describes what happens but not why | Explains how program works | Explains design choices and alternatives |
Example Rubric: Design Project¶
| Criterion | Beginning | Developing | Secure | Exceeding |
|---|---|---|---|---|
| Planning | No evidence of planning | Basic plan, not followed | Clear plan, mostly followed | Detailed plan with revisions |
| Problem-solving | Gives up when stuck | Asks for help immediately | Tries strategies before seeking help | Helps others when they are stuck |
| Creativity | Copies example exactly | Makes minor modifications | Adapts significantly to own ideas | Creates original design |
| Collaboration | Works alone or off-task | Participates when prompted | Contributes actively to team | Supports others' contributions |
Self-Assessment and Peer Assessment¶
Children can use simplified rubrics for self and peer assessment:
Traffic light self-assessment:
- 🔴 I found this very difficult and need more help
- 🟡 I could do this with some support
- 🟢 I can do this confidently
Two stars and a wish:
- ⭐ One thing I did well
- ⭐ Another thing I did well
- 💫 One thing I want to improve
2.6 Managing Practical Sessions¶
Before the Lesson¶
Equipment preparation:
- Check all devices are charged/have batteries
- Test that software loads correctly on all machines
- Have spare batteries, cables, and components ready
- Pre-load any starter files children will need
Room setup:
- Arrange tables to support collaboration (pairs or small groups)
- Ensure all children can see demonstration screen
- Clear floor space for robot activities
- Identify storage location for ongoing projects
Routines to establish:
- How to collect and return equipment
- What to do when stuck (see below)
- How to save and organise work
- Tidying and packing away procedures
During the Lesson¶
The "stuck" routine
Establish a clear protocol for when children are stuck. A popular approach is "Three Before Me":
1. Try something yourself (re-read instructions, check connections)
2. Ask a partner
3. Ask another pair
4. Then ask the teacher
This develops resilience and reduces teacher bottlenecks.
Avoiding "helpless hands"
In paired work, ensure both children engage:
- Assign roles (driver/navigator) and swap partway through
- Both children must be able to explain what the pair has done
- Use talking prompts: "Discuss with your partner before you try..."
Managing different paces
Physical computing sessions often see wide variation in progress:
- Have extension challenges ready for fast finishers
- Allow children to help peers once they've completed their task
- Don't hold everyone back for the slowest—provide catch-up support
- Consider "checkpoint" moments where you briefly gather attention
Troubleshooting Common Issues¶
| Problem | Likely Cause | Solution |
|---|---|---|
| Device not connecting | Cable issue or port selection | Try different cable; check correct port selected |
| Program won't download | Browser cache or driver issue | Refresh page; try different browser; restart device |
| Robot not responding | Batteries or pairing | Replace batteries; re-pair device |
| Unexpected behaviour | Logic error in program | Use predict-run-investigate to identify issue |
| "It's not working!" | Unspecified | Ask child to describe expected vs actual behaviour |
End of Lesson¶
Saving work:
- For screen-based: save to designated folder with clear naming
- For physical: photograph builds before dismantling
- For ongoing projects: store securely with name labels
Packing away:
- Build in adequate time (5+ minutes for complex equipment)
- Assign roles: "Table monitors" responsible for complete return
- Check all pieces returned before dismissal
Reflection:
- Brief whole-class reflection on learning, not just activity
- Connect to next lesson: "Next time we'll build on this by..."
2.7 Planning Your Fieldwork¶
As part of TEM5018, you will conduct fieldwork using physical computing tools with primary-age learners. This section provides guidance for planning effective fieldwork activities.
Selecting Your Tools¶
You will use three different tools across your fieldwork. Consider:
| Factor | Questions to Ask |
|---|---|
| Age appropriateness | Is this tool suitable for the age group you're working with? |
| Curriculum fit | How does the tool connect to what children are learning? |
| Practical constraints | Do you have enough devices? Is the technology reliable? |
| Your familiarity | Are you confident enough with the tool to support learners? |
| Diversity | Do your three tools offer different experiences (e.g., robot, microcontroller, unplugged)? |
Planning Considerations¶
Context:
- What year group(s) will you work with?
- What is their prior experience with computing/programming?
- What curriculum topics could your activities connect to?
- What time do you have available?
Logistics:
- What equipment do you have access to?
- What space will you use? (Classroom, hall, outdoor?)
- Will you work with whole class, groups, or individuals?
- What support will be available? (Classroom teacher, teaching assistant?)
Learning focus:
- Which CT skills will you prioritise?
- What curriculum connections will you make?
- How will you differentiate for different learners?
- How will you assess children's learning?
Documentation for Your Reflective Diary¶
Throughout your fieldwork, maintain a reflective diary that captures:
- What you planned and why
- What actually happened during sessions
- Evidence of children's learning (observations, quotes, work samples)
- What worked well and why you think so
- What you would change next time
- Your own professional learning from the experience
Useful documentation methods:
- Photographs (with appropriate permissions)
- Voice recordings of your reflections
- Children's work samples
- Field notes written during or immediately after sessions
- Video clips (if permitted by your setting)
2.8 Reflective Practice¶
What Makes a Successful Physical Computing Lesson?¶
Successful lessons are characterised by:
High engagement: Children are focused, motivated, and reluctant to stop. They talk enthusiastically about what they're doing.
Visible thinking: You can see and hear computational thinking in action—children decomposing problems, testing hypotheses, debugging solutions.
Meaningful learning: Children can articulate what they've learned, not just what they've done. The activity connects to bigger ideas.
Appropriate challenge: Most children experience productive struggle—the task is hard enough to be interesting but achievable with effort.
Inclusive participation: All children are engaged, not just the most confident or interested. Different routes to success are available.
Questions for Reflection¶
After teaching a physical computing lesson, consider:
About the children:
- What evidence did I see of computational thinking?
- Which children thrived? Which struggled? Why?
- What misconceptions emerged?
- What questions did children ask?
About the lesson design:
- Was the task pitched at the right level?
- Was there enough time for meaningful exploration?
- Did the differentiation work?
- What would I change for next time?
About my teaching:
- How effectively did I explain concepts?
- When did I intervene? When did I step back?
- What questions did I ask?
- How did I respond to children who were stuck?
About the tools:
- Did the technology work reliably?
- Was the tool appropriate for this learning?
- What would I do differently with this tool next time?
The Improvement Cycle¶
Effective practitioners engage in continuous improvement:
- Plan with clear learning intentions
- Teach with attention to what's happening
- Reflect on what worked and what didn't
- Adjust based on reflection
- Re-teach with improvements
This cycle applies to individual lessons, to your use of particular tools, and to your overall approach to teaching physical computing.
2.9 Resources¶
Lesson Planning¶
- Teach Computing Curriculum (UK-based, freely accessible): teachcomputing.org — Free lesson plans and resources adaptable for Malta
- Raspberry Pi Foundation: raspberrypi.org/teach — Resources for physical computing
- Barefoot Computing: barefootcomputing.org — CT resources for primary teachers
- Bebras Malta: bebras.computationalthinking.mt — Computational thinking challenges for Maltese schools
Assessment¶
- Computing at School: computingatschool.org.uk — Community resources including assessment guidance
- Teach Computing Assessment Resources: teachcomputing.org/courses — CPD on assessing CT in primary (UK-focused but applicable)
Research¶
- Brennan, K., & Resnick, M. (2012). New frameworks for studying and assessing the development of computational thinking. Proceedings of AERA 2012.
- Resnick, M. (2017). Lifelong Kindergarten: Cultivating Creativity through Projects, Passion, Peers, and Play. MIT Press.
- Kalelioglu, F., & Sentance, S. (2020). Teaching with physical computing in school: The case of the micro:bit. Education and Information Technologies, 25, 2577–2603. DOI: 10.1007/s10639-019-10080-8
2.10 Summary¶
Designing and assessing physical computing lessons requires attention to principles, planning, and practice. The key ideas from this section:
- Low floor, high ceiling, wide walls: Design activities that are accessible to beginners, allow for sophisticated development, and support diverse interests and approaches.
- Start with learning outcomes: Be clear about what CT skills and subject knowledge you want children to develop, then choose tools and activities to serve those outcomes.
- Integrate with curriculum: Physical computing is most powerful when it connects meaningfully to other areas of learning, not as an isolated "computing" activity.
- Emphasise formative assessment: Observation, questioning, and discussion reveal more about children's computational thinking than formal tests. Look for evidence in process as well as product.
- Use rubrics thoughtfully: Well-designed rubrics clarify expectations and support consistent assessment, but should allow for diverse outcomes and focus on thinking, not just technical skills.
- Manage practical sessions carefully: Preparation, clear routines, and troubleshooting strategies make hands-on sessions more effective and less stressful.
- Reflect and improve: Each lesson teaches you something. Capture that learning through reflection and use it to improve future practice.
As you move through the following sections, you will explore specific tools—robots, tangible interfaces, microcontrollers, and mechanical logic toys. After each section, you will be expected to try out one of the tools covered in your own classroom, applying the frameworks from this section to plan, deliver, and reflect on your lessons.
Ready to continue? Head to Section 3: Robots in the Classroom to explore floor robots and how to use them with your learners.