Most read posts on Robots and Physical Computing blog in September 2019

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Moving Eyes with an Adafruit Adabox kit


One of the things I enjoy is my AdaBox subscription from Adafruit; receiving the box, and shortly before that finding out what the main item in the box is, is a treat. The latest, at the time of writing, is AdaBox 13 with the MONSTER M4SK: a pair of small screens, based around 240x240 IPS TFT displays, that act as moving eyes in a mask; along with a couple of masks, a speaker, a microphone, lenses and craft materials for decorating a mask - a full list can be found here.




My goal in playing with this was to create a slightly creepy mask where the eyes move - a simple build but fun to do. The Adafruit quick-start guide (available here) provides all the instructions on setting it up and downloading the different sets of eyes (that is really creepy to write). Graphics files for a number of different sets of eyes are already available; a couple of examples are shown below.








One suggestion: when you download the files for the eyes - still creepy - keep a copy elsewhere. In one case I accidentally deleted the configuration file I wanted, so it was handy to have a backup.


Two masks are included in the box; being lazy, I chose the already-decorated silvery mask. Along with the M4SK device, a set of lenses and holders is included. I struggled a bit with fixing the lenses into the holders and attaching them to the device - in the end, I chose to leave them out and stick the M4SK to the mask without them. A video of the mask in action is shown below.





There is much to explore with the MONSTER M4SK, including a number of sensors already on the board: a capacitive touch sensor, buttons, a light sensor, a port for connecting a PDM microphone (included in AdaBox 13) and three other ports. Sound effects can be played through a headphone socket or via a built-in amplifier driving an 8-ohm 1 W speaker (also included in the AdaBox). A LiPoly battery can be connected. A guide to getting going via the Arduino IDE is available here.




Starting with the NVIDIA Jetson Nano




This is the third of a planned occasional series of posts on playing with some of the current AI-specific boards for the Internet of Things (IoT). The series is planned to include experiments with the Google Coral Accelerator adapter and Development Board, as well as the NVIDIA Jetson Nano. In previous posts I started playing with the Coral Accelerator adapter kit and the Coral Development Board.

This post looks at starting with the NVIDIA Jetson Nano Development Kit, which, like the Coral Development Board, is a small computer designed for running combined embedded and neural network applications. The processing power comes from a quad-core 64-bit ARM CPU and a 128-core integrated NVIDIA GPU (for more details see here).

So, before we all get spooked: getting going is relatively easy; basically, follow https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-devkit#intro. If you are able to set up a Raspberry Pi from a downloaded Raspbian image, following these instructions is not that different.





What I wanted to do was attach a camera and grab an image. The board has a MIPI CSI-2 interface, which means it should work with a Pi Camera Module V2; here I am using a Leopard Imaging 145FOV wide-angle camera and ribbon cable because I had one nearby. A great site, and the one I am using here, for how to use the Jetson Nano with a camera is Jetson Nano + Raspberry Pi Camera, which takes you through setting it up and testing it, including the command below.





$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=3820, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=0 ! 'video/x-raw,width=960, height=616' ! nvvidconv ! nvegltransform ! nveglglessink -e
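If you want to grab frames in Python rather than just display them, one common route is OpenCV's GStreamer backend. The helper below is my own sketch (the function name and default values are assumptions, not from the guide linked above): it assembles essentially the same pipeline as the gst-launch-1.0 command, but ends in appsink so OpenCV can read the frames.

```python
# Hypothetical helper: builds a GStreamer pipeline string like the
# gst-launch-1.0 command above, but ending in appsink so that an
# OpenCV VideoCapture (built with GStreamer support) can read frames.
def gstreamer_pipeline(capture_width=3280, capture_height=2464,
                       display_width=960, display_height=616,
                       framerate=21, flip_method=0):
    return (
        "nvarguscamerasrc ! "
        "video/x-raw(memory:NVMM), width=%d, height=%d, "
        "framerate=%d/1, format=NV12 ! "
        "nvvidconv flip-method=%d ! "
        "video/x-raw, width=%d, height=%d, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
        % (capture_width, capture_height, framerate, flip_method,
           display_width, display_height)
    )

# On the Jetson itself you would then do something like:
#   import cv2
#   cap = cv2.VideoCapture(gstreamer_pipeline(), cv2.CAP_GSTREAMER)
#   ok, frame = cap.read()
```

Note the capture resolution here defaults to the Pi Camera V2 sensor's 3280x2464; adjust it to match whichever camera module you are using.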

The image below was grabbed using the Jetson and camera




I found it easier to get going with this board than with the Coral Development Board (though I do like that one as well), and I am looking forward to playing with this board more.



Coral Dev Board and Raspberry Pi

This is the second of a planned occasional series of posts on playing with some of the current AI-specific add-on processors for the Internet of Things (IoT). The series is planned to include experiments with the Google Coral adapter and Development Board, as well as the NVIDIA Jetson Nano. In the previous post I started playing with the Coral Accelerator on a Raspberry Pi: https://robotsandphysicalcomputing.blogspot.com/2019/08/coral-accelerator-on-raspberry-pi.html. The Coral environment is related to Google's earlier AIY Edge Tensor Processing Unit (TPU) range https://aiyprojects.withgoogle.com/edge-tpu/ and is designed to work with TensorFlow Lite.

In this post, the bigger sibling, the Coral Development Board (or Coral Dev Board), is connected to a Raspberry Pi. The Coral Dev Board is a single-board Linux computer in its own right, running a derivative of Debian called Mendel. The Pi (in this case a Raspberry Pi 2 running Raspbian) is used as a terminal and to set up the system in the first place. How to set it up and use it can be found at https://coral.withgoogle.com/docs/dev-board/get-started/# - this is the best bet to follow. It was relatively easy to set up, if you follow the instructions (I made a few mistakes) and juggle having several terminals open. A few issues (probably mainly due to my lack of skill) I had during set-up:


  • I spent a while trying to set up fastboot according to the instructions, but the easiest way is sudo apt-get install fastboot
  • I needed root access to set up the udev rule and driver on the Pi in order to flash the Development Board.

Apart from that it wasn't too bad.

  

The image below was run on the Development Board and processed with a pre-trained machine-learning model to recognise a parrot.




The figure below shows a 76% match to Scarlet Macaw.



I am going to enjoy playing with this a bit more; using the Pi as a terminal, once it is all set up, seems to work.

Another good source that expands on the use of this device is https://medium.com/@aallan/hands-on-with-the-coral-dev-board-adbcc317b6af, giving more detail on the device and Python examples for using the Coral Dev Board.



Top 10 posts on the Robots and Physical Computing Blog in August 2019



Coral Accelerator on a Raspberry Pi

This is the first of a planned occasional series of posts on playing with some of the current AI-specific add-on processors for the Internet of Things (IoT). The series is planned to include experiments with the Google Coral adapter and Development Board, as well as the NVIDIA Jetson Nano.

Why bother? The basic reason is that I love playing with AI and hardware - so it is kind of fun. Another reason is that AI, IoT and edge computing are important and growing technologies, and I want to start getting my head around them a bit.

In this post, I look at starting to use the Coral Accelerator with a Raspberry Pi. The Coral environment is related to Google's earlier AIY Edge Tensor Processing Unit (TPU) range https://aiyprojects.withgoogle.com/edge-tpu/ and is designed to work with TensorFlow Lite.




A good place to start is Google's Get started with the USB Accelerator; pretty much all you need to do to get going is in it, and it also covers the Raspberry Pi. It makes a good point: if you are using Python 3.7 on the Raspberry Pi, be aware that at the time of writing the TensorFlow Lite API only supports up to Python 3.5. Not a problem, but something to be aware of - and Get started with the USB Accelerator offers a solution.

The Coral site has a number of examples you can try out at https://coral.withgoogle.com/examples/. If you try the face-detection example (within the object-detection example) on a Raspberry Pi, you need to install feh to see the images; sudo apt-get install feh sorts this.


Some other good sources:
https://medium.com/@aallan/hands-on-with-the-coral-usb-accelerator-a37fcb323553
https://www.raspberrypi.org/magpi/teachable-machine-coral-usb-accelerator/






Produce a Microbit python neural network 2: Building a Physical Microbit Neural Network

This is the second in a two-post series on building a neural network using micro:bits with MicroPython. In the first post, Python was used to produce a neural network without the micro:bits. In this post, the network shown in figure 1 is developed.

The figure below shows the arrangement of the connections to be built; pin 2 is the output of each neuron. The two micro:bits/neurons on the left of the picture take in the same two inputs; the outputs from these neurons are the two inputs to the output neuron on the right.



figure 1

The micro:bit objects used in figure 1 were produced using the micro:bit Fritzing diagram available at https://github.com/microbit-foundation/dev-docs/issues/36 - thanks to David Whale (@whalleygeek) for this.


The input neurons
Neuron 1:
from microbit import *

# Weights: [bias, w1, w2] - this set makes the neuron fire only for x1=0, x2=1
W = [-1, -1, 1]

while True:
    x1 = pin0.read_digital()
    x2 = pin1.read_digital()
    net = W[0] + W[1]*x1 + W[2]*x2
    if net >= 0:
        display.scroll("T")
        pin2.write_digital(1)
    else:
        display.scroll("F")
        pin2.write_digital(0)


Neuron 2:
from microbit import *

# Weights: [bias, w1, w2] - this set makes the neuron fire only for x1=1, x2=0
W = [-1, 1, -1]

while True:
    x1 = pin0.read_digital()
    x2 = pin1.read_digital()
    net = W[0] + W[1]*x1 + W[2]*x2
    if net >= 0:
        display.scroll("T")
        pin2.write_digital(1)
    else:
        display.scroll("F")
        pin2.write_digital(0)



Output neuron:
Feeding the outputs from Neuron 1 and Neuron 2 in as its inputs.
from microbit import *

# Weights: [bias, w1, w2] - an OR of the two input neurons' outputs
W = [-1, 1, 1]

while True:
    x1 = pin0.read_digital()
    x2 = pin1.read_digital()
    net = W[0] + W[1]*x1 + W[2]*x2
    if net >= 0:
        display.scroll("T")
        pin2.write_digital(1)
    else:
        display.scroll("F")
        pin2.write_digital(0)
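Before wiring up three boards, the combined behaviour of the three sketches above can be desk-checked in ordinary Python (no micro:bit needed). This is my own sketch, not the post's original code; it applies the same weighted-sum rule with each board's weights:

```python
# Desk-check of the three micro:bit neurons in plain Python.
# Each board computes net = W[0] + W[1]*x1 + W[2]*x2 and outputs 1 if net >= 0.
def fire(W, x1, x2):
    return 1 if W[0] + W[1]*x1 + W[2]*x2 >= 0 else 0

def network(x1, x2):
    h1 = fire([-1, -1, 1], x1, x2)   # Neuron 1 (fires for x1=0, x2=1)
    h2 = fire([-1, 1, -1], x1, x2)   # Neuron 2 (fires for x1=1, x2=0)
    return fire([-1, 1, 1], h1, h2)  # output neuron (OR of h1, h2)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", network(x1, x2))  # prints the XOR truth table
```

Running this confirms the wiring gives XOR: the output is 1 only when the two inputs differ.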




All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Produce a Microbit python neural network 1: Without the microbit initially

These two posts are really extensions of a set of posts in response to a question from Carl Simmons (@Activ8Thinking) concerning building a simple micro:bit neuron. In those, Microsoft MakeCode was used.

This two-post series goes through building neurons and neural networks in Python, ending, by the close of the second post, with a Python-based micro:bit neural network.


  • Post 1 (this one): a single neuron and a simple neural network are produced in Python.
  • Post 2 looks at producing a network of neurons, i.e. a neural network, using the ideas from post 1 but with three micro:bits; the aim is to solve a problem a single neuron can't solve - making an Exclusive OR (XOR) gate.



1. Overview and non-microbit neuron
1.1 Overview: 
The characteristics of the system will be:
  • Inputs are going to be binary
  • The weighted sum is bias + W1*input1 + W2*input2
  • If the weighted sum >= 0 then the output is True (T on the LEDs), or '1'
  • If the weighted sum < 0 then the output is False (F on the LEDs), or '0'

1.2 Single Neuron (without the micro:bit)
Let's start without the micro:bit and build a single neuron in Python - a useful exercise in its own right, just to see the mechanism. A class Neuron is produced and all possible combinations of a two-input binary system are fed in.

class Neuron:
    def __init__(self, input1, bias, w1, w2):
        self.input1 = input1
        self.bias = bias
        self.w1=w1
        self.w2=w2
    
    def CalculateOutput (self):
        output1 = 0
        net = self.bias+self.input1[0]*self.w1+self.input1[1]*self.w2
        if net >= 0:
            output1 = 1
        else:
            output1 = 0
        return output1


for x1 in range(2):
    for x2 in range(2):
        neuron1 = Neuron([x1, x2], -1, 1, 1)
        print("x1= "+str(x1)+" x2= "+str(x2)+" Output= "+str(neuron1.CalculateOutput()))

The code above implements a simple single neuron; the weights -1, 1, 1 produce an OR gate, and -2, 1, 1 produce an AND gate.
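As a quick check (a compact sketch of my own, not the post's original class-based code), the same threshold rule with those two weight sets does reproduce the OR and AND truth tables:

```python
# Compact check of the OR and AND weight sets mentioned above.
def neuron(x1, x2, bias, w1, w2):
    # Same rule as the Neuron class: output 1 if the weighted sum is >= 0
    return 1 if bias + w1*x1 + w2*x2 >= 0 else 0

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
print([neuron(a, b, -1, 1, 1) for a, b in inputs])  # OR  -> [0, 1, 1, 1]
print([neuron(a, b, -2, 1, 1) for a, b in inputs])  # AND -> [0, 0, 0, 1]
```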

1.3 A Neural Network
We can extend the code above to produce a neural network by feeding the outputs of one or more neurons in as inputs to other neurons. The code below produces an Exclusive OR (XOR) - essentially, for the two-input case, if the two inputs are different then the output is True. The same inputs go to two neurons with different weights (bias, W1 and W2), and the outputs from these two neurons are the inputs to a third neuron. The code is shown below (the Neuron class hasn't changed):
class Neuron:
    def __init__(self, input1, bias, w1, w2):
        self.input1 = input1
        self.bias = bias
        self.w1=w1
        self.w2=w2
    
    def CalculateOutput (self):
        output1 = 0
        net = self.bias+self.input1[0]*self.w1+self.input1[1]*self.w2
        if net >= 0:
            output1 = 1
        else:
            output1 = 0
        return output1


for x1 in range(2):
    for x2 in range(2):
        neuron1 = Neuron([x1, x2], -1, -1, 1)
        neuron2 = Neuron([x1, x2], -1, 1, -1)
        neuron3 = Neuron([neuron1.CalculateOutput(), neuron2.CalculateOutput()], -1, 1, 1)
        print("x1= "+str(x1)+" x2= "+str(x2)
              + " Output 1= "+str(neuron1.CalculateOutput())
              + " Output 2= "+str(neuron2.CalculateOutput())
              + " Output overall= "+str(neuron3.CalculateOutput()))

        



2. Where next - building a Physical Microbit Neural Network
The figure below shows the arrangement of the connections to be built; pin 2 is the output of each neuron. The two micro:bits/neurons on the left of the picture take in the same two inputs; the outputs from these neurons are the two inputs to the output neuron on the right.



figure 1

The micro:bit objects used in figure 1 were produced using the micro:bit Fritzing diagram available at https://github.com/microbit-foundation/dev-docs/issues/36 - thanks to David Whale (@whalleygeek) for this.









Robots and Physical Computing most popular posts during July 2019



Simple PyGame Zero Apollo Lander #Apollo50th

With all the excitement of the 50th anniversary of the first Moon landing and the world-record attempt Moonhack https://moonhack.com/, I wanted to experiment with PyGame Zero a little more. So I created my own, very simple, Moon-lander game using Python and PyGame Zero.

The lander has to pass through a red rectangle on the surface as it accelerates toward the surface. The Left and Right keys move it sideways, and the Up key gives it a boost. When the lander passes through the red rectangle it makes a noise; if it misses (or after it passes through the rectangle) it resets to a new position and starts again.
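The per-frame movement described above can be sketched in plain Python. This is a minimal sketch of the update logic only - the constants and names are my own assumptions, not the game's actual code (the full code is in the repository linked below):

```python
# Minimal sketch of the lander's per-frame update rule described above.
# GRAVITY, THRUST and SIDE_STEP are hypothetical values, not the game's.
GRAVITY = 0.05    # constant downward acceleration per frame
THRUST = -0.15    # change in vertical speed while the Up key is held
SIDE_STEP = 2     # horizontal pixels moved per frame by Left/Right

def update_lander(x, y, vy, left=False, right=False, up=False):
    """Advance the lander one frame; returns the new (x, y, vy)."""
    vy += GRAVITY        # the lander accelerates toward the surface
    if up:
        vy += THRUST     # the Up key boosts against gravity
    if left:
        x -= SIDE_STEP
    if right:
        x += SIDE_STEP
    return x, y + vy, vy
```

In the actual game this logic would live in PyGame Zero's update() callback, with keyboard.left, keyboard.right and keyboard.up supplying the key states.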

Two images were needed




An image from the game, the code and a link to the repository are shown below. It is a simple game and there is scope to improve it (please feel free to do so - I would love to see it improved).







Repository https://github.com/scottturneruon/moonlander/archive/v1.0.zip


Resources
  1. Welcome to Pygame Zero https://pygame-zero.readthedocs.io/en/stable/
  2. Introduction to Pygame Zero https://pygame-zero.readthedocs.io/en/stable/introduction.html
  3. Built-in Objects - Pygame Zero https://pygame-zero.readthedocs.io/en/stable/builtins.html
  4. Space Asteroids - Pygame Zero http://www.penguintutor.com/projects/docs/space-asteroids-pgzero.pdf
  5. Pygame Zero Invaders https://www.raspberrypi.org/magpi/pygame-zero-invaders/
  6. Pygame Zero: SpaceInvaders II https://www.raspberrypi.org/magpi/pygame-zero-space-invaders-ii/
  7. My first PyGame attempt https://robotsandphysicalcomputing.blogspot.com/2019/07/my-first-pygame-zero-attempt.html
  8. Moon Hacks https://moonhack.com/


Cool book on the Moon Missions with Augmented Reality

