
Ramblings about Social Robotics in Schools

Sometimes my job has some major personal pluses (I get to play with robots some of the time). One of these has been the opportunity to introduce people to social robots, and recently I have been lucky enough to do this four times: twice with my own computing students, and twice with groups of primary school children at two events (see below).

Apart from it being something I enjoy doing, the social robots we are starting to see are great, but there is so much more that could be done. Who is going to develop this? Possibly one of these children - why not? It has taken nearly 40 years to get from R2D2 on the screen to some of the social robots being launched now; in another 40 years we might have something as bright as R2D2 (R2D2 was always brighter than C3PO). Why shouldn't one or more of these bright children, or one of the students I teach, be among the ones to contribute to this? They have the enthusiasm, and with the changes to the National Curriculum in the UK they are developing some of the skills and asking the questions. Look at the work being done by the Raspberry Pi Foundation, the CamJam EduKit 3 robot kit, and especially products such as the OhBot (see the bottom of the post for details of this robot), as just a few examples of how this is developing.

Event 1.

At an in-reach STEAM activity day I had the opportunity to show off two NAO robots in action to a group of 8-year-olds, as well as giving a short presentation on social robots (see below). By the way, Red and Smurf are the nicknames of the two robots.

Event 2. 

A talk on Social Robotics (with a little help from a Red friend) to an audience of primary school children as part of Lab_13's Lectures at Wollaston School, Northamptonshire. 

Red performed, walking with three of the children, and the presentation included a discussion of the robots JIBO and Buddy, which are expected to launch this year.

Changes needed to the presentation

One change I would like to make is to bring along an OhBot (a bit like the one in the video below), as well as including the OhBot in the revised presentation slides.


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.
