From Motion to Meaning: Detect your nose in Scratch 3
When I first wrote about Scratch and Webcams back in 2019, the excitement was centred on the "New Frontier" of Scratch 3.0. We were exploring how to bring back the classic video sensing features—making cats jump or bubbles burst when we waved our hands. At the time, that was the peak of "Magic" in the classroom: the physical world interacting with the digital one.
Fast forward to today, and the landscape for STEM educators and makers has shifted. We are no longer satisfied with the computer merely knowing that something moved; we want the computer to know what is moving. We have moved from simple motion detection to Artificial Intelligence and Computer Vision.
If you want to re-energize your STEM lab or maker space, it’s time to move beyond the basic Video Sensing blocks and dive into the Face Sensing extension. So here is an example project playing with these ideas.
The Project: The Nose-Tracking Security "Mask"
The goal of this project is to create a digital lock. When no one is present, a "Green Button" sits idle on the screen. But the moment a face is detected, the AI springs into action, switches to a "Red Button," and physically tracks your nose as you move.
Think of it as a biometric "Privacy Shield"—the computer recognises you and immediately places a digital mask over your face.
Step 1: Create the buttons
You will need two costumes for your "Lock" sprite:
Button 1: The standard button from the Scratch library (Green).
Button 2: A modified version with the centre filled in (Red).
Step 2: The Idle State
Start with the camera closed or covered. We want the program to behave like a standard screensaver when no one is around, placing the green button randomly on the screen to show the system is "Ready."
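In scratchblocks notation, the idle loop might look something like this (a minimal sketch; the costume name "Button1" assumes you named your green costume as in Step 1):

```
when green flag clicked
switch costume to [Button1 v]
forever
    go to [random position v]
    wait (1) seconds
end
```

The one-second wait keeps the button from flickering around the stage too quickly while the system sits in its "Ready" state.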
Step 3: Adding the AI Magic
To level up, you need to add the Face Sensing extension (found in the Extensions menu in the bottom-left corner). Once added, you’ll see a brand-new set of blocks that can detect eyes, ears, and—most importantly for us—the nose.
We are going to use logic that continually checks the environment:
If a face is detected: switch to the Red Button (Button 2) and use the "go to [nose]" block.
If no face is detected: switch back to the Green Button and return to the random "idle" movement.
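Putting both branches together, the whole tracker fits in a single loop. This is a sketch in scratchblocks notation using the experimental Face Sensing blocks; the exact block wording (for example, the "face is detected?" boolean and the "go to [nose]" dropdown) may differ slightly in your version of the extension:

```
when green flag clicked
forever
    if <face is detected?> then
        switch costume to [Button2 v]
        go to [nose v]
    else
        switch costume to [Button1 v]
        go to [random position v]
        wait (1) seconds
    end
end
```

Because the "go to [nose]" block runs on every pass through the forever loop, the red button re-centres itself on your nose many times per second, which is what makes the tracking feel so responsive.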
The "Aha!" Moment
What was the "wow" moment for me? It was the precision.
Using such a simple set of blocks, the tracking is incredibly robust. Even when I moved my head quickly or tilted to the side, the AI kept that red button pinned exactly to my nose. In 2019, this would have required complex coding; in 2026, it’s a three-block script. It proves to students that AI isn't "magic"—it's just very clever math that they can control.
Where can we go next?
The possibilities for makers are endless:
The Biometric Password: Create a screen that only hides or reveals a "secret" message once a face is detected.
Physical Integration: Connect a Micro:bit. When the nose is detected, have a physical servo motor move a latch or ring a bell.
More AI: Go further by using Machine Learning for Kids to train the computer to recognise specific faces or objects.
By building this, students aren't just consumers of AI technology—they are the architects of it.
What will you build with Face Sensing? Let me know in the comments!