
From Code to Cables: Building a Physical Neural Network with micro:bits


Artificial Intelligence often feels like "magic" happening inside a powerful computer. But at its core, a neural network is just a series of mathematical decisions. In my previous posts, we looked at how to simulate these decisions in Python. Today, we are taking that "brain" out of the computer and building it in the real world using three BBC micro:bits.

In this project, each micro:bit acts as a single neuron. By wiring them together, we create a physical network capable of logic and decision-making.


The Math: How a Neuron "Thinks"

Every neuron in our network follows a simple linear formula to decide whether or not to "fire" (send a signal):

Figure 1 - the maths behind a neuron: Net = w0 + (w1 × x1) + (w2 × x2)

If the result Net >= 0, the neuron fires (Output = 1). If it is less than 0, it stays at Output = 0.
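In plain Python (on a PC, not the micro:bit), that decision rule is a one-liner. This is just a sketch of the formula above; the function name is mine:

```python
def neuron(w, x1, x2):
    """Weighted sum plus bias; fire (1) if the result is >= 0, else 0."""
    net = w[0] + w[1] * x1 + w[2] * x2
    return 1 if net >= 0 else 0

# With bias -1.5 and weights 1, 1 this behaves as an AND gate:
print(neuron([-1.5, 1, 1], 1, 1))  # fires: net = 0.5
print(neuron([-1.5, 1, 1], 1, 0))  # stays quiet: net = -0.5
```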


Step 1: The Logic Gate "Cheat Sheet"

Before we flash the code, we need to decide what we want our neurons to do. By changing the weights and bias, we can turn a micro:bit into different types of logic gates. Use this table to program your "Input Neurons":

Desired Logic | Bias (w0) | Weight 1 (w1) | Weight 2 (w2) | Behavior
AND           | -1.5      | 1             | 1             | Only fires if BOTH inputs are 1.
OR            | -0.5      | 1             | 1             | Fires if AT LEAST one input is 1.
NOT           | 0.5       | -1            | 0             | Inverts the input (1 becomes 0).
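You can sanity-check these rows on a PC before flashing anything. This quick plain-Python sketch applies the same formula to every input combination and prints each gate's truth table (ordered 00, 01, 10, 11):

```python
def fires(w, x1, x2):
    """Same rule as the micro:bit code: fire if bias + weighted inputs >= 0."""
    return 1 if w[0] + w[1] * x1 + w[2] * x2 >= 0 else 0

gates = {"AND": [-1.5, 1, 1], "OR": [-0.5, 1, 1], "NOT": [0.5, -1, 0]}

for name, w in gates.items():
    table = [fires(w, x1, x2) for x1 in (0, 1) for x2 in (0, 1)]
    print(name, table)
```

AND should give [0, 0, 0, 1], OR gives [0, 1, 1, 1], and NOT gives [1, 1, 0, 0] (it ignores the second input).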

Step 2: The Hardware Setup

To build the network, you will need three micro:bits. We will use two as "Input Neurons" and one as the "Output Neuron."

Figure 2 - circuit diagram for a network of neurons to do XOR



The Wiring (Referencing Figure 2):

  • Input Neurons: Connect your sensors (or simple switches) to Pin 0 and Pin 1. Their result will be sent out via Pin 2.

  • Output Neuron: Connect Pin 2 of the first two micro:bits to Pin 0 and Pin 1 of the third micro:bit.

  • The Golden Rule: You must connect the GND (Ground) pins of all three micro:bits together. Without a common ground, the digital signals will be "floating" and the network will behave randomly!


Step 3: The Code (MicroPython)

Flash this code onto your micro:bits. You only need to change the W array (the weights) based on the cheat sheet above.

The Input Neurons

Neuron 1 - with these weights it fires only when the inputs are (0, 1):

from microbit import *

# Bias, weight 1, weight 2
W = [-1, -1, 1]

while True:
    x1 = pin0.read_digital()
    x2 = pin1.read_digital()
    net = W[0] + W[1] * x1 + W[2] * x2
    if net >= 0:
        display.scroll("T")
        pin2.write_digital(1)   # fire: send 1 down the wire on Pin 2
    else:
        display.scroll("F")
        pin2.write_digital(0)   # stay quiet: send 0


Neuron 2 - with these weights it fires only when the inputs are (1, 0):

from microbit import *

# Bias, weight 1, weight 2
W = [-1, 1, -1]

while True:
    x1 = pin0.read_digital()
    x2 = pin1.read_digital()
    net = W[0] + W[1] * x1 + W[2] * x2
    if net >= 0:
        display.scroll("T")
        pin2.write_digital(1)   # fire: send 1 down the wire on Pin 2
    else:
        display.scroll("F")
        pin2.write_digital(0)   # stay quiet: send 0



The Output Neuron

This micro:bit takes the signals from Neuron 1 and Neuron 2 as its inputs and ORs them together, completing the XOR:

from microbit import *

# Bias, weight 1, weight 2 (an OR gate, as in the cheat sheet)
W = [-1, 1, 1]

while True:
    x1 = pin0.read_digital()   # signal from Neuron 1
    x2 = pin1.read_digital()   # signal from Neuron 2
    net = W[0] + W[1] * x1 + W[2] * x2
    if net >= 0:
        display.scroll("T")
        pin2.write_digital(1)
    else:
        display.scroll("F")
        pin2.write_digital(0)
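Before reaching for the crocodile clips, you can simulate the whole three-neuron network on a PC. This plain-Python sketch uses the same weights as the three micro:bits and wires them together in software (the function names are mine, not part of the micro:bit API):

```python
def fires(w, x1, x2):
    """Same rule as each micro:bit: fire if bias + weighted inputs >= 0."""
    return 1 if w[0] + w[1] * x1 + w[2] * x2 >= 0 else 0

def network(x1, x2):
    h1 = fires([-1, -1, 1], x1, x2)   # Neuron 1: fires on (0, 1)
    h2 = fires([-1, 1, -1], x1, x2)   # Neuron 2: fires on (1, 0)
    return fires([-1, 1, 1], h1, h2)  # Output neuron: OR of the two

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", network(x1, x2))
```

The printout is the XOR truth table: the network fires only when exactly one input is 1, which a single neuron cannot do on its own.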

Taking it Further

What we’ve built here is a Pre-trained Inference Network. We did the "learning" on a PC to find the right weights, and then hard-coded them into the micro:bits.

The "Smart Gate" Challenge:

Instead of just showing "T" or "F" on the screen, try connecting a servo motor to the Output Neuron. Can you configure the weights so that the gate only opens when two specific safety sensors (connected to the input neurons) are triggered simultaneously?

The Next Frontier: Machine Learning

This network is currently "static"—it doesn't learn from its mistakes. A great follow-up project would be to write code that allows the micro:bit to adjust its own weights. For example, if you press Button A, the weights increase; if you press Button B, they decrease. This would move us from simple logic gates to a truly learning physical system!
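As a taste of what that "learning on a PC" step can look like, here is a minimal sketch of the classic perceptron update rule, training a single neuron from scratch to behave as an AND gate. The starting weights, learning rate, and epoch count are arbitrary choices of mine, not from this project:

```python
def fires(w, x1, x2):
    """Same firing rule as the micro:bit neurons."""
    return 1 if w[0] + w[1] * x1 + w[2] * x2 >= 0 else 0

# The truth table we want the neuron to learn (AND)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0, 0.0]   # start knowing nothing
lr = 0.5              # learning rate (arbitrary choice)

for epoch in range(20):
    for (x1, x2), target in data:
        error = target - fires(w, x1, x2)  # -1, 0 or +1
        w[0] += lr * error                 # nudge the bias...
        w[1] += lr * error * x1            # ...and each weight towards
        w[2] += lr * error * x2            # the correct answer

print(w)  # the learned bias and weights now implement AND
```

The same nudge-towards-the-answer idea is what the Button A / Button B experiment would put in your hands, one press at a time.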


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon
