
NAO, chatbots, teaching and just plain showing off

I managed to do something I have wanted to do for a long time: show off social robots (I have now had three goes at it), thanks to the recent purchase of NAO robots by the University of Northampton.

Session 1
At an in-reach STEAM activity day I had the opportunity to show the robots in action to a group of 8-year-olds, as well as giving a short presentation on social robots.

I would be interested to hear, via the comments, how others are using NAOs.

Sessions 2 and 3
I have managed to include a physical example of social robots in my teaching. The aim of the session was to teach about social AI, which revolves around using social cues and, to a certain extent, natural language (through chatbots) to communicate with machines.

The robots were used as an example of a social robot: something we want to play with, or work with, without having to go through a steep learning curve in how to use it. Students were encouraged to consider why this is, and how anthropomorphisation plays a part (NAO basically has some of the characteristics of a small child). The group spotted the ways it attracts us to it: it responds to voice commands, its looks, its childlike voice, the fact that it is always moving (even slightly when standing) and the way it moves. It is really hard not to talk to it like a child sometimes (but perhaps that is just me).

This session also included the use of chatbots (one example used, ALICE, is shown here) and AIML, the Artificial Intelligence Markup Language (a link to more about AIML is included below). As a bit of background, chatbots (also variously called chatterbots, conversational agents, etc.) are programs that hold a conversation with us through either text or speech. The chatbots were used to show how we can create intelligent-seeming behaviour by, in effect, providing canned responses to questions; and then how we can take this further by using the responses people give while using the chatbot to 'fine-tune' the model.
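To give a concrete sense of how this works: an AIML file is a set of categories, each pairing a user input pattern with a template response. The categories below are a minimal sketch of my own, not taken from ALICE.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<aiml version="1.0.1">
  <!-- Each category pairs a user input pattern with a canned template -->
  <category>
    <pattern>WHAT IS YOUR NAME</pattern>
    <template>My name is NAO. What is yours?</template>
  </category>
  <!-- The * wildcard matches any words; star echoes them back in the reply -->
  <category>
    <pattern>I LIKE *</pattern>
    <template>Why do you like <star/>?</template>
  </category>
</aiml>
```

A pattern-matching engine loads these categories and, for each user input, returns the template of the best-matching pattern; that lookup is essentially all the 'intelligence' a basic chatbot has.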

To read more about NAO robots go to
To read more on AIML go to

Example chatbots

If you would like to create your own chatbot, personally I think one of the easiest ways to start is through
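The same canned-response idea can also be sketched in a few lines of Python (an illustrative toy of my own, not how ALICE or the NAO software actually work), with a dictionary standing in for AIML categories:

```python
# A minimal AIML-style chatbot sketch: each pattern maps to a canned
# response, mimicking an AIML category. Patterns and replies are my own
# illustrative examples.

RULES = {
    "WHAT IS YOUR NAME": "My name is NAO. What is yours?",
    "HELLO": "Hello there!",
}
DEFAULT = "I am not sure I understand."

def respond(user_input: str) -> str:
    """Normalise the input and look up a canned response."""
    key = user_input.strip().upper().rstrip("?!.")
    return RULES.get(key, DEFAULT)

print(respond("Hello"))               # -> Hello there!
print(respond("what is your name?"))  # -> My name is NAO. What is yours?
```

Refining the chatbot then amounts to adding more rules (or, in a fuller system, learning them from the responses people give while chatting).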

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.
