Monday, 9 October 2017

Sphero Mini - fun-sized Sphero

The Sphero Mini has recently been launched and, at the time of writing, seems to be popular; delivery times vary depending on the colour you select. There are some very good reasons for its popularity.

It's a ping-pong-ball-sized robot, much smaller than the normal Spheros and approximately half the price. Though it has slightly fewer features, what it does have is good.

In the box, you get the Sphero Mini, a charging cable, and a packet of mini skittles and cones. There is no charging station for this one; you charge it by taking the outer shell off (it splits into two pieces) and plugging into the socket on the ball inside. It is actually quite interesting to see the insides (the Sphero SPRK+ has a transparent shell to show the innards) and handle the inner ball, trying to work out what it does. The ball is a little slower than the other Spheros, but this is not a problem in my opinion.

There is an app for playing games, and this is fun to play with; it provides control to make the ball move, plus many features I have yet to explore. Alternatively, the Sphero Edu app, running on a tablet, allows programming via blocks or JavaScript. I had to have a go at writing a JavaScript routine.

The routine just makes the robot drive in a square, so not really much of a test, but a start. I especially liked the visual display of the Sphero Mini's route on the tablet's screen.

The price of this little robot is not bad; though it has fewer features than its bigger 'sibling' Spheros, it is still good. I would like to see support for Swift in the future, but it is early days for this kit, and the skittles and cones are a good touch.

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Sunday, 3 September 2017

LittleBits Star Wars Droid and Swift Playgrounds

On 1st September 2017, during 'Force Friday', LittleBits launched their new kit, the Star Wars Droid Inventor Kit, an R2-D2-shaped robot, though you are encouraged to customise it to form your own designs. It comes with a number of tutorials that take you through building a moving head, a proximity sensor so it moves away from you, and many others.

So, confession time: I am not the target audience for this kit, but I have enjoyed playing with it. The tutorials take you through building and dismantling the kit, doing a range of different activities, and in most cases controlling it remotely from an iPad. You can even record your voice and have it played back from the Droid; in my opinion, the wide range of sounds is one of the things that lifts this from being just a nice kit (I will get onto the other one soon). Though it is good fun, I was left with a question: can it be programmed?

This was my first time using a LittleBits kit, so I didn't know what options were available for programming it, and a quick search online was not initially that helpful; it looked like there was no obvious way forward.

Now for the good news: on the LittleBits site there is a relatively easy solution to this, using an iPad. Swift Playgrounds can be used to program it; you need to download the separate playground after the Swift Playgrounds app is installed (if you haven't already got it). After that, it is a case of following the tutorials.

Below is an example of part of the code; I tried to make the Droid dance and make some noises.

Below is the video of the code in action.

The combination of Swift Playgrounds and the Star Wars Droid Inventor Kit works well; perhaps you can't go wrong with R2-D2. I would suggest that the text from the website about linking to Swift Playgrounds should be included in the box, and it would be nice to see some more commands. Apart from these very minor criticisms, the kit and Swift Playgrounds are a nice combination (and the kit even comes with batteries included) and well worth a look across a wide age range.

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Wednesday, 30 August 2017

Robots, fruit and computer coding

Taken from a piece by the University of Northampton Press Team.

A select group of sixth-form pupils has spent the summer working on a series of research projects at the University of Northampton.
Four pupils from Northampton and Kettering schools undertook projects involving robotics, coding and urban orchards after each was awarded a Nuffield Research Placement – a scheme which offers sixth formers the chance to work on university research projects.
David Obreja, from Northampton School for Boys, spent his time at the University researching the amount of fruit-bearing plants and trees in areas of Northampton, and mapping them on a computer.
He said: “We need more fruit-bearing plants and trees in the town to provide food for wildlife, encourage biodiversity and improve the aesthetics of urban areas. I hope my research might encourage residents to plant more species, which would also have a social benefit as people may take it up as a hobby. Growing their own produce will also increase their food security.
“The research might also inspire housing developers to consider the mix of planting on their new developments, by including more fruit-bearing plants, rather than planting that is purely architectural.”
Fellow Northampton School for Boys pupil, Nathaniel Roberts, spent his time at the University making a junkbot – a robot made from waste items, such as a drinks can, powered by a kit.
He then coded the junkbot to make it move and used his experiences to produce an instruction manual for junkbot beginners.
Nathaniel said: “The project has taught me how to manage myself, set aims and work towards completing them, and given me an insight into how to write a technical manual in language that’s informative and easy to understand.”
Michael Welsh, from Kettering Buccleuch Academy, spent the summer programming a robot to learn how to follow a straight line, using sensors.
He said: “It works as a concept, but pretty difficult to get it right in practice, and has meant lots of methodical research. I’d like a career in computer science and this project has been pretty intense and demonstrated things can be a lot more work than you initially assume, but I like a challenge.”
Hiren Mistry, from Caroline Chisholm School in Northampton, learnt how to program a robot before creating a set of tasks for Year 7 pupils to do the same.
He said: “It’s been fairly tricky, with a lot of research and testing, and has given me a real insight into computer science.”
The University’s Associate Professor in Computing and Immersive Technologies, Dr Scott Turner, said: “The Nuffield placements have been incredibly successful. The students were all set very challenging university level projects and have enjoyed a real taste of the work we do here.
“I was very impressed with their commitment and aptitude for their subjects and am sure they will all go on to be a success at university, should they choose to go down that route.”

Related Links to their work.

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Thursday, 24 August 2017

Robot control by a neuron.

This year the Computing team has been fortunate enough to host three Nuffield Research Placement students, all working with Dr Scott Turner.

Michael Welsh
Michael has been working on using a micro:bit-based Bit:Bot from 4tronix to produce a potential teaching tool: an example of artificial neurons being used to control a robot. The aim is for this tool to be used with third-year undergraduates, as part of a module on Artificial Intelligence.

Michael's solution was to use the computer to run and train a single neuron; the robot then sends values from its line sensors back to the program running on the computer and receives control signals in return.

Sounds easy? Not really: in the end, the software running on the computer also had to send and receive the data through a micro:bit (via USB), which in turn uses radio to communicate with the Bit:Bot robot. All the various parts of the solution were implemented in Python by Michael.
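The training side of this approach, a single neuron nudging its weights towards a target output for each line-sensor pattern (the delta rule), can be sketched in isolation. The learning constant, weight range and number of presentations below are illustrative choices, not Michael's actual settings:

```python
import random

# A single neuron with a bias and two line-sensor inputs, trained with the
# delta rule to output roughly 0, 1/3, 2/3 or 1 for the four sensor patterns
# (which the control program then maps to backwards, right, left, forwards).
random.seed(42)
weights = [random.uniform(-0.5, 0.5) for _ in range(3)]  # bias + 2 inputs
LC = 0.05  # learning constant

def output(sensors):
    inputs = [1] + sensors  # a fixed bias input of 1, then the two sensors
    return sum(i * w for i, w in zip(inputs, weights))

# Target outputs for each sensor pattern
targets = {(0, 0): 0.0, (0, 1): 1/3, (1, 0): 2/3, (1, 1): 1.0}

for _ in range(5000):  # repeated presentations of the four patterns
    for sensors, desired in targets.items():
        inputs = [1] + list(sensors)
        error = desired - output(list(sensors))
        for w in range(3):
            weights[w] += LC * inputs[w] * error  # the delta rule update

for sensors, desired in targets.items():
    print(sensors, round(output(list(sensors)), 2))
```

After training, the neuron's output for each pattern sits close to its target, which is exactly what lets the control program pick a drive command by asking which of 0, 1/3, 2/3 or 1 the weighted sum is nearest to.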

An example of the code:

import serial
from math import fabs
import random
import sys
import glob

#TO DO: GUI, implement manual mode. Make manual mode a button?? When in automatic, weight boxes are greyed out. Otherwise, they are able to be typed into.

# Drive commands sent to the micro:bit over serial (assumed protocol: a
# command letter plus a repeat count, relayed by radio to the Bit:Bot).
def forward(n):
    microbit.write(('F' + str(n)).encode('utf-8'))

def backward(n):
    microbit.write(('B' + str(n)).encode('utf-8'))

def turnR(n):
    microbit.write(('R' + str(n)).encode('utf-8'))

def turnL(n):
    microbit.write(('L' + str(n)).encode('utf-8'))

def end():
    microbit.write('E'.encode('utf-8'))

def serial_ports():
    """ Lists serial port names

        :raises EnvironmentError:
            On unsupported or unknown platforms
        :returns:
            A list of the serial ports available on the system
    """
    if sys.platform.startswith('win'):
        ports = ['COM%s' % (i + 1) for i in range(256)]
    elif sys.platform.startswith('linux') or sys.platform.startswith('cygwin'):
        # this excludes your current terminal "/dev/tty"
        ports = glob.glob('/dev/tty[A-Za-z]*')
    elif sys.platform.startswith('darwin'):
        ports = glob.glob('/dev/cu.*')
    else:
        raise EnvironmentError('Unsupported platform')

    result = []
    for port in ports:
        try:
            s = serial.Serial(port)
            s.close()
            if '/dev/cu' in port:
                # on a Mac, only keep USB modem ports (such as the micro:bit)
                if '/dev/cu.usbmodem' in port:
                    result.append(port)
            else:
                result.append(port)
        except (OSError, serial.SerialException):
            pass
    return result

class neuron:

    def __init__(self, ins):
        self.inputs = [1]  # the fixed bias input

        self.weights = []
        for i in range(ins + 1):
            self.weights.append(random.uniform(-0.5, 0.5))  # small random starting weights

        self.LC = 0.005  # learning constant

    def get_output(self, inputs):
        self.inputs = [1] + inputs

        WSum = 0
        for i in range(len(self.inputs)):
            WSum += int(self.inputs[i]) * self.weights[i]

        # choose whichever target output (0, 1/3, 2/3 or 1) the weighted sum is closest to
        sums = sorted([fabs(WSum), fabs(WSum - (1/3)), fabs(WSum - (2/3)), fabs(WSum - 1)])
        if sums[0] == fabs(WSum):  # closest to 0 - move backwards
            return 'B', WSum
        elif sums[0] == fabs(WSum - (1/3)):  # closest to 1/3 - turn right
            return 'R', WSum
        elif sums[0] == fabs(WSum - (2/3)):  # closest to 2/3 - turn left
            return 'L', WSum
        else:  # closest to 1 - move forwards
            return 'F', WSum

    def train(self, inputs, desired):
        result = self.get_output(inputs)[1]
        error = desired - result

        for w in range(len(self.weights)):
            change = self.LC * int(self.inputs[w]) * error
            print('change in weight ' + str(w) + ': ' + str(change))
            self.weights[w] += change

# open the serial port for the micro:bit
for i in serial_ports():
    microbit = serial.Serial(port=i, baudrate=115200)

control = neuron(2)

mode = input('Automatic or Manual? (A/M)\n')
while mode.lower() not in ['a','m']:
    mode = input('Automatic or Manual? (A/M)\n')
mode = mode.lower()
if mode == 'm':
    control.weights[0],control.weights[1],control.weights[2] = int(input('Enter weight 0/bias: ')),int(input('Enter weight 1: ')), int(input('Enter weight 2: '))
while True:
    #microbit.write(input('input: ').encode('utf-8'))
    out = microbit.readline()  # a reading such as b'[0, 1]' from the line sensors
    if not out:
        continue
    out = out.decode()
    num1 = num2 = None
    for i in range(len(out)):
        if str(out[i]) in ['0','1']:
            if i == 0:
                if out[1] == ']':
                    num2 = out[i]
                elif out[1] == ',':
                    num1 = out[i]
            elif i == 5:
                if out[4] == '[':
                    num1 = out[i]
                elif out[4] == ' ':
                    num2 = out[i]
            else:
                if out[i-1] == '[':
                    num1 = out[i]
                elif out[i-1] == ' ':
                    num2 = out[i]
                elif out[i+1] == ']':
                    num2 = out[i]
                elif out[i+1] == ',':
                    num1 = out[i]
    if num1 is None or num2 is None:
        continue  # skip lines that do not contain a sensor reading
    sleft, sright = int(num1), int(num2)
    print([sleft, sright])
    if mode == 'a':
        # train towards the targets: 0 backwards, 1/3 right, 2/3 left, 1 forwards
        if [sleft, sright] == [0, 0]:
            control.train([sleft,sright], 0)
        elif [sleft, sright] == [0, 1]:
            control.train([sleft,sright], 1/3)
        elif [sleft,sright] == [1, 0]:
            control.train([sleft,sright], 2/3)
        else:
            control.train([sleft,sright], 1)
    # drive the robot using the neuron's output for the current sensor reading
    command = control.get_output([sleft, sright])[0]
    if command == 'F':
        forward(1)
    elif command == 'B':
        backward(1)
    elif command == 'L':
        turnL(1)
    elif command == 'R':
        turnR(1)

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Monday, 14 August 2017

Crumbly Toilet Roll Junkbot

Guest blogger: Nathaniel Roberts, Nuffield Research Placement student working at the University of Northampton. The Nuffield Research Placement scheme provides students in the first year of a post-16 course with the opportunity to work with STEM professionals.

Toilet Roll Junkbot
Nathaniel Roberts

Cut a slit in a toilet roll.
Cut another toilet roll in half, then arrange the pieces in a T.
Feed the bottom of the T into the slit, and tape together.

Cut two lines into the edge, and do the same on the opposite side of the circle. Copy this for the other side of the tube.
These geared motors from the Camjam EduKit 3 should fit into those gaps.

Stick a battery pack to the back, and use crocodile clips to wire it all up to a crumble. The battery pack can connect to the + and - on either side at the top of the crumble, and the motors connect to the + and - of their respective sides of the crumble.

Using a Micro USB to USB cable, the Crumble can be plugged into a computer and run code from the Crumble software ( ).
This example code would make the junkbot move forward for a second, turn either left or right, then move forward for another second before stopping.
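The Crumble program itself is built from drag-and-drop blocks rather than text, so the sequence just described corresponds roughly to blocks along these lines (a sketch of the block sequence, not the exact program):

```
program start
motor 1 FORWARD at 100%
motor 2 FORWARD at 100%
wait 1 second
motor 2 STOP              (turn by driving one motor only)
wait 0.5 seconds
motor 2 FORWARD at 100%
wait 1 second
motor 1 STOP
motor 2 STOP
```

Stopping one motor while the other keeps running is what produces the left or right turn, depending on which side is stopped.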

The back of the bot was extremely heavy (with all the batteries). The front wasn't heavy enough to grip most surfaces very well, so the wheels would often spin without the bot moving. Also, the toilet rolls were only just strong enough for the job. After a few uses, especially around the motors, they started bending. Eventually, the motors ended up 'wonky' despite attempts at fixing them.

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Thursday, 10 August 2017

Minecraft controlled Raspberry Pi Robot Arm

Guest blogger: Hiren Mistry, Nuffield Research Placement student working at the University of Northampton. The Nuffield Research Placement scheme provides students in the first year of a post-16 course with the opportunity to work with STEM professionals.

How to use Minecraft to move a USB Robot Arm on a Raspberry Pi
By Hiren Mistry
Currently, I am taking part in a Nuffield Foundation Research Placement at the University of Northampton, researching how to control a CBIS Robot Arm using a Raspberry Pi. This part of the project aims to learn more about using Python in Minecraft and how I can control the Robot Arm using Minecraft. When the player hits a specific block, the Robot Arm will move in a specific way.
Minecraft is an adventure game where players are put in a randomly-generated world and can create their own structures and contraptions out of textured cubes.
Here are the main commands used in Python for Minecraft Pi:
from mcpi.minecraft import Minecraft
Imports the Minecraft module from the mcpi folder

import mcpi.block as block
Imports the Block module

mc = Minecraft.create()
Used for creating a connection, modifying players and blocks and capturing events with the Minecraft world. This command uses the default address and port.

mc.player.getPos()
Returns the current position of the player (as floats)

mc.postToChat("message")
Posts the message in brackets to the Minecraft chat

mc.setBlock(x, y, z, id)
Sets block (id) at position x, y, z

mc.player.setPos(x, y, z)
Moves the player to position x, y, z

blockEvents = mc.events.pollBlockHits()
for blockEvent in blockEvents:
Returns any block hit events since the last time the command was run

blockEvent.pos
Returns the position of the block that was hit

Before developing the main program which controls the Robot Arm, I wrote some basic programs which enabled me to learn more about how Python interacts with Minecraft Pi. One of the programs I created leaves a trail of blocks behind the player. Below is the code with explanations for each line. It works by using a while loop that continually checks whether the player has moved. If the player has moved, then a block is placed. The block number (38) can be changed.

For a list of block numbers and more information, go to

import mcpi.minecraft as minecraft #imports minecraft modules
import mcpi.block as blocksw #imports block modules
import time #imports time module needed for time.sleep()
mc = minecraft.Minecraft.create()#creates connection to mc game
while True:#Starts while loop
    pos = mc.player.getTilePos()#Gets position of player
    time.sleep(0.1)#Sleeps for 0.1 seconds
    newpos = mc.player.getTilePos()#Gets new position of player
    if newpos != pos: #if the position has changed
        mc.setBlock(pos,38)#Set Block 38 down on previous block

Robot Arm Program in Minecraft
Whilst researching Minecraft Pi, I had the concept that the user would press a button in Minecraft and a Minecraft 'Sign Post' would let the user know what that button does. However, due to Minecraft Pi limitations, buttons and signposts (with text) do not exist and therefore cannot be used. As a solution, I produced a model of the Robot Arm in Minecraft (see picture below), which enables the user to move specific parts of the Robot Arm.

The torches are the blocks which will activate the Robot Arm movements (Except for the Grip Open/Close and the Rotate Clockwise/Anticlockwise blocks).
Limitations: as only one block can be hit at once, only one command can be output to the Robot Arm at a time. Therefore, multiple simultaneous commands are not available.
An easier way of storing data, such as the co-ordinates of each block, was to use a dictionary. A dictionary contains keys which are assigned to values. For example, Dictionary = {"Key1":Value1, "Key2":Value2} would store two keys, where "Key1" is assigned to Value1 and "Key2" is assigned to Value2.
A more efficient way of storing the commands and co-ordinates would be to store the commands in another dictionary with the same key name. E.g.
Coordinates = {"Shoulder Up":[77, 67, 1], "Shoulder Down":[74,67,1]}
Commands = {"Shoulder Up":ShoulderUp(), "Shoulder Down":ShoulderDown()}
I could then search through all the keys, and if the coordinates of the block hit equalled the value of a key, the corresponding command would be executed, using code such as:
for k, v in Coordinates:
    if BlockHit == v:
However, this code causes an error (ValueError: too many values to unpack), because iterating over a dictionary directly yields only its keys, which cannot be unpacked into two variables; Coordinates.items() would be needed to yield the key-value pairs. Therefore, I have decided to continue with the simpler code, which uses many IF statements to decide which command to use.
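For completeness, the dictionary lookup idea does work once the iteration yields key-value pairs via .items(). A minimal, self-contained sketch, using the example coordinates from above:

```python
# Look up which command's block was hit, using the coordinates dictionary
# pattern described above (the coordinate values are the example ones).
Coordinates = {"Shoulder Up": [77, 67, 1], "Shoulder Down": [74, 67, 1]}

def command_for(block_hit):
    # .items() yields (key, value) pairs, so unpacking into two names works
    for name, xyz in Coordinates.items():
        if block_hit == xyz:
            return name
    return None  # no command block at these coordinates

print(command_for([77, 67, 1]))  # Shoulder Up
```

The returned name could then index a second dictionary of command functions, replacing the long chain of IF statements.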

Final Program

Below is the code which allows the player to hit blocks which control the Robot Arm.
Once Minecraft is running, press "TAB" and run the Python code and then switch back to the Minecraft Window.
Note: When hitting blocks in Minecraft, the sword has to be used to activate commands.
import mcpi.minecraft as minecraft #Imports Minecraft Module
import mcpi.block as blocksw #Imports Block Module
import time
from Maincommands import * #Imports Commands from file
mc = minecraft.Minecraft.create() # Creates connection to Minecraft
mc.postToChat("Robot Arm Control")#Post message to Minecraft chat
pos = mc.player.getTilePos()#Gets position of player

while True:
    blockHits = mc.events.pollBlockHits()# Gets the block hit events
    if blockHits:# if a block has been hit
        for blockHit in blockHits:# for each block that has been hit
            xyz = [blockHit.pos.x,blockHit.pos.y,blockHit.pos.z]
            Dict = {"Grip Open":[74,12,-70],"Grip Close":[72,12,-70], "Rotate AntiClockwise":[77,12,-63], "Rotate Clockwise":[77,12,-67],"Shoulder Up":[77, 13, -64], "Shoulder Down":[77, 13, -66], "Elbow Up":[77, 16, -64], "Elbow Down":[77, 16, -66], "Wrist Up":[77, 18, -64], "Wrist Down":[77, 18, -66], "Light On":[75, 19, -65], "Light Off":[76, 19, -65]}# Dictionary with Command and coordinates of block
            # Outputs command depending on block hit
            # (the command functions come from Maincommands; names other than
            # ShoulderUp and ShoulderDown are assumed to follow the same pattern)
            if xyz == Dict["Grip Open"]:
                GripOpen()
            elif xyz == Dict["Grip Close"]:
                GripClose()
            elif xyz == Dict["Rotate Clockwise"]:
                RotateClockwise()
            elif xyz == Dict["Rotate AntiClockwise"]:
                RotateAntiClockwise()
            elif xyz == Dict["Shoulder Up"]:
                ShoulderUp()
            elif xyz == Dict["Shoulder Down"]:
                ShoulderDown()
            elif xyz == Dict["Elbow Up"]:
                ElbowUp()
            elif xyz == Dict["Elbow Down"]:
                ElbowDown()
            elif xyz == Dict["Wrist Up"]:
                WristUp()
            elif xyz == Dict["Wrist Down"]:
                WristDown()
            elif xyz == Dict["Light On"]:
                LightOn()
            elif xyz == Dict["Light Off"]:
                LightOff()

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon