
Build Your Own Neural Network with micro:bits: An AI Challenge for Makers Young and not so Young

[Image: a micro:bit-based neural network in Tinkercad]



Artificial Intelligence (AI) is everywhere—recommending videos, recognising faces, and even helping cars drive themselves. But what actually powers these systems?

At the heart of many AI systems is something called a neural network. And here’s the exciting part: you don’t need a supercomputer to explore one.

👉 In this challenge, you’ll build a working neural network using micro:bits—and see how AI works from the inside.


🔍 What Is a Neural Network?

A neural network is a system made of connected “neurons” that pass information to each other.

It’s usually organised into layers:

  • Input layer → receives data
  • Hidden layer → processes information
  • Output layer → produces a result

One important detail: the input layer doesn’t actually process anything—it just passes signals forward.
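The flow through these three layers can be sketched in a few lines of Python. The weights here are hypothetical, chosen only for illustration, and the step activation is one common choice among several; note how the input layer does no computation at all:

```python
def step(x):
    # Simple threshold activation: fire (1) if the signal is positive.
    return 1 if x > 0 else 0

def forward(inputs, hidden_weights, output_weights):
    # Input layer: just pass the signals forward, unchanged.
    signals = inputs
    # Hidden layer: each neuron weighs the inputs (w0 is its bias)
    # and applies the activation.
    hidden = [step(w0 + sum(w * s for w, s in zip(ws, signals)))
              for w0, *ws in hidden_weights]
    # Output layer: same idea, but fed by the hidden outputs.
    w0, *ws = output_weights
    return step(w0 + sum(w * h for w, h in zip(ws, hidden)))

# Two inputs, two hidden neurons, one output neuron:
result = forward([1, 0],
                 [[-0.5, 1, 1], [1.5, -1, -1]],  # hidden layer weights
                 [-1.5, 1, 1])                   # output neuron weights
print(result)
```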

If you want a deeper explanation, this post breaks it down clearly:
👉 https://robotsandphysicalcomputing.blogspot.com/2021/02/explaining-tinkercad-microbit-neural.html


⚡ Make It Physical with micro:bits

Instead of just talking about neural networks, we can build one.

In this project:

  • Each micro:bit acts as a neuron
  • Switches act as inputs (on/off signals)
  • micro:bits communicate to pass signals forward

This turns an abstract AI idea into something you can see, touch, and debug.

To understand how a single neuron works using a micro:bit, start here:
👉 https://robotsandphysicalcomputing.blogspot.com/2021/01/tinkercad-and-microbit-to-make-neuron.html
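The logic a single micro:bit "neuron" runs can be sketched in plain Python (the same idea translates to MicroPython on the device). The weight names w0, w1, w2 match the ones used later in this post; the particular values below are an assumption, picked so the neuron behaves like an OR gate:

```python
def neuron(a, b, w0, w1, w2):
    # Weighted sum of the two inputs, plus the bias w0,
    # then a step activation: output 1 (TRUE) or 0 (FALSE).
    total = w0 + w1 * a + w2 * b
    return 1 if total > 0 else 0

# With these weights the neuron acts like an OR gate:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron(a, b, w0=-0.5, w1=1, w2=1))
```

On real hardware the inputs would come from the switches and the output would be sent on via radio or a pin, but the decision rule is exactly this.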


🧩 The Challenge: Build a Neural Network That Thinks

Now for the real challenge—connecting multiple micro:bits to create a simple neural network.

This network solves a classic problem in computing called XOR.

What is XOR?

It’s a rule:

  • TRUE if exactly one input is ON
  • FALSE if both inputs are the same

  Input A | Input B | Output
  --------|---------|-------
     0    |    0    |   0
     1    |    0    |   1
     0    |    1    |   1
     1    |    1    |   0

This is surprisingly tricky—it can’t be solved with just one neuron.

👉 That’s why we need a network.

Follow this build to create your own:
👉 https://robotsandphysicalcomputing.blogspot.com/2021/02/making-neural-network-in-tinkercad-from.html
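One common way to wire such a network (a sketch, not necessarily the exact layout used in the linked build) is a hidden layer with an OR neuron and a NAND neuron, whose outputs feed an AND neuron. Each `neuron` call below is the job one micro:bit would do; the weights are illustrative values that make each neuron act as the named gate:

```python
def neuron(a, b, w0, w1, w2):
    # One micro:bit's job: weighted sum plus bias, then step activation.
    return 1 if w0 + w1 * a + w2 * b > 0 else 0

def xor(a, b):
    # Hidden layer: an OR neuron and a NAND neuron.
    h_or   = neuron(a, b, w0=-0.5, w1=1,  w2=1)
    h_nand = neuron(a, b, w0=1.5,  w1=-1, w2=-1)
    # Output layer: AND of the two hidden outputs.
    return neuron(h_or, h_nand, w0=-1.5, w1=1, w2=1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))
```

Notice that no single set of weights in `neuron` alone can reproduce this table: XOR is not linearly separable, which is exactly why the hidden layer is needed.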


⚙️ How the Network “Learns”

Your network works using three key ideas:

  • Weights (w1, w2) → how strongly inputs affect the neuron
  • Bias (w0) → a threshold that shifts decisions
  • Activation → deciding whether the neuron outputs TRUE or FALSE

By changing weights and bias, you change how the network behaves.

👉 Try this:

  • Adjust a weight
  • Test different inputs
  • Watch the output change

That’s essentially how real AI systems are trained—just on a much bigger scale.
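The "try this" steps above can be played out in a quick sketch (the weight values are illustrative assumptions): the same neuron, with only the bias w0 changed, switches from behaving like OR to behaving like AND.

```python
def neuron(a, b, w0, w1, w2):
    # Weighted sum plus bias, then step activation.
    return 1 if w0 + w1 * a + w2 * b > 0 else 0

inputs = [(0, 0), (1, 0), (0, 1), (1, 1)]

# Setting 1: low threshold -> the neuron acts like OR.
print([neuron(a, b, -0.5, 1, 1) for a, b in inputs])   # [0, 1, 1, 1]

# Setting 2: raise the threshold (more negative bias) -> it acts like AND.
print([neuron(a, b, -1.5, 1, 1) for a, b in inputs])   # [0, 0, 0, 1]
```

Training a real network is just this loop automated: nudge the weights, measure how wrong the outputs are, and repeat.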


🤖 Why This Matters

What you’ve built is a physical model of AI decision-making.

Real neural networks:

  • Recognise speech
  • Detect objects in images
  • Recommend content

They all rely on the same principles:
inputs → weighted decisions → outputs

You’ve just recreated that process using simple hardware.


🔧 Now It’s Your Turn to Tinker

Don’t stop at just making it work—start experimenting:

  • What happens if you change the weights?
  • Can you break the network?
  • Can you redesign it to solve a different problem?

This is where real learning happens—not just following instructions, but playing with the system.


🚀 Your Challenge

You’ve just built a neural network.

Now take it further.

👉 Can you improve it?
👉 Can you make it more reliable?
👉 Can you explain how it works to someone else?

AI isn’t just something you use—it’s something you can build, explore, and question.


💬 Share what you create in the comments—and tell us what you changed, improved, or discovered.

Because the best way to understand AI… is to make it yourself.


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon
