Wednesday, 28 August 2019

Coral Accelerator on a Raspberry Pi

This is the first of a planned occasional series of posts on playing with some of the current AI-specific add-on processors for the Internet of Things (IoT). The plan is to show some experiments with the Google Coral USB Accelerator and Development Board, as well as the NVIDIA Jetson Nano.

Why bother? The basic reason is I love playing with AI and hardware - so it is kind of fun. Another reason is that AI, IoT and edge computing are important and growing technologies, and I want to start getting my head around them a bit.

In this post, I look at starting to use the Coral USB Accelerator with a Raspberry Pi. The Coral environment is related to Google's earlier AIY Edge Tensor Processing Unit (TPU) range https://aiyprojects.withgoogle.com/edge-tpu/ and is designed to work with TensorFlow Lite.




A good place to start is Google's Get started with the USB Accelerator; pretty much all you need to do to get going is in it, and it also covers the Raspberry Pi. It makes a good point: if you are using Python 3.7 on the Raspberry Pi, at the time of writing the TensorFlow Lite API only goes up to Python 3.5. Not a problem, just something to be aware of, and Get started with the USB Accelerator offers a solution.

The Coral site has a number of examples you can try out at https://coral.withgoogle.com/examples/ . If you do try the face detection part of the Object Detection example on a Raspberry Pi, you need to install feh to see the images; sudo apt-get install feh sorts this.
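If you want a feel for what running a model on the accelerator looks like in code, below is a minimal sketch (mine, not taken from the Coral examples) that uses the TensorFlow Lite runtime with the Edge TPU delegate; the model and image file names are placeholders, and the model needs to be one compiled for the Edge TPU.

import numpy as np
from PIL import Image
from tflite_runtime.interpreter import Interpreter, load_delegate

# Load an Edge TPU-compiled model, delegating execution to the accelerator
interpreter = Interpreter(
    model_path="mobilenet_v2_edgetpu.tflite",  # placeholder model file
    experimental_delegates=[load_delegate("libedgetpu.so.1")])
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Resize a test image to the model's expected input size and run it
_, height, width, _ = input_details[0]["shape"]
image = Image.open("test.jpg").resize((width, height))  # placeholder image
interpreter.set_tensor(input_details[0]["index"],
                       np.expand_dims(np.asarray(image, dtype=np.uint8), 0))
interpreter.invoke()

scores = np.squeeze(interpreter.get_tensor(output_details[0]["index"]))
print("Top class index:", int(np.argmax(scores)))

The same script runs on the Pi's CPU alone if you drop the experimental_delegates argument and use an ordinary .tflite model, which is a handy way of seeing the speed difference the accelerator makes.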


Some other good sources:
https://medium.com/@aallan/hands-on-with-the-coral-usb-accelerator-a37fcb323553
https://www.raspberrypi.org/magpi/teachable-machine-coral-usb-accelerator/






All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Sunday, 18 August 2019

Produce a Microbit python neural network 2: Building a Physical Microbit Neural Network

This is the second in a two-post series on building a neural network using micro:bits with MicroPython. In the first post, Python was used to produce a neural network without the micro:bits. In this post the network shown in Figure 1 is developed.

The figure below shows the arrangement of the connections to be built; pin 2 is the output of each neuron. The two micro:bits/neurons on the left of the picture take in the same two inputs; the outputs from these neurons are the two inputs to the output neuron on the right.



Figure 1

The micro:bit objects used in Figure 1 were produced using the micro:bit Fritzing diagram available at https://github.com/microbit-foundation/dev-docs/issues/36 - thanks to David Whale (@whalleygeek) for this.
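One way to wire this up (the individual connections aren't labelled in the figure) is to take pin 2 of each input micro:bit to pin 0 and pin 1 respectively of the output micro:bit, and to link GND across all three boards so the digital signals share a common reference. Which intermediate output goes to which input pin doesn't matter here, as the output neuron weights both of its inputs equally.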


The input neurons
Neuron 1:
from microbit import *

# Weights: [bias, w1, w2] - this neuron fires only when input 2 is high and input 1 is low
W = [-1, -1, 1]

while True:
    x1 = pin0.read_digital()   # first shared binary input
    x2 = pin1.read_digital()   # second shared binary input
    net = W[0] + W[1]*x1 + W[2]*x2
    if net >= 0:
        display.scroll("T")
        pin2.write_digital(1)  # neuron output on pin 2
    else:
        display.scroll("F")
        pin2.write_digital(0)


Neuron 2:
from microbit import *

# Weights: [bias, w1, w2] - this neuron fires only when input 1 is high and input 2 is low
W = [-1, 1, -1]

while True:
    x1 = pin0.read_digital()   # first shared binary input
    x2 = pin1.read_digital()   # second shared binary input
    net = W[0] + W[1]*x1 + W[2]*x2
    if net >= 0:
        display.scroll("T")
        pin2.write_digital(1)  # neuron output on pin 2
    else:
        display.scroll("F")
        pin2.write_digital(0)



Output neuron:
The output neuron takes the outputs from Neuron 1 and Neuron 2 as its two inputs.
from microbit import *

# Weights: [bias, w1, w2] - with both weights 1 this neuron ORs its two inputs
W = [-1, 1, 1]

while True:
    x1 = pin0.read_digital()   # output of Neuron 1
    x2 = pin1.read_digital()   # output of Neuron 2
    net = W[0] + W[1]*x1 + W[2]*x2
    if net >= 0:
        display.scroll("T")
        pin2.write_digital(1)
    else:
        display.scroll("F")
        pin2.write_digital(0)
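Put together, Neuron 1 fires only for the input pattern 01, Neuron 2 only for 10, and the output neuron ORs those two signals, so the three micro:bits reproduce the XOR network built purely in Python in the first post: the output micro:bit shows T only when exactly one of the two shared inputs is high.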



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Produce a Microbit python neural network 1: Without the microbit initially

These two posts are really extensions of a set of posts in response to a question from Carl Simmons (@Activ8Thinking) concerning building a micro:bit simple neuron. In those, Microsoft MakeCode was used.

This two-post series goes through building neurons and neural networks in Python, ending with a Python-based micro:bit neural network by the end of the second post.


  • Post 1 (this one): a single neuron and a simple neural network are produced in Python.
  • Post 2 looks at producing a network of neurons, i.e. a neural network, using the idea from Post 1 but with three micro:bits; the aim is to solve a problem that a single neuron can't, building an Exclusive OR (XOR) gate.



1. Overview and non-microbit neuron
1.1 Overview: 
The characteristics of the system will be:
  • Inputs are going to be binary
  • Weighted sum is bias + w1*input1 + w2*input2
  • If the weighted sum >= 0 then the output is True (T on the LEDs) or '1'
  • If the weighted sum < 0 then the output is False (F on the LEDs) or '0'
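For example, with bias = -1 and w1 = w2 = 1 (the OR-gate weights used below), inputs x1 = 1 and x2 = 0 give a weighted sum of -1 + 1 + 0 = 0, which is >= 0, so the output is True.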

1.2 Single Neuron (without the micro:bit)
Let's start without the micro:bit and build a single neuron in Python - a useful exercise in its own right just to see the mechanism. A class Neuron is produced and all possible combinations of a two-input binary system are fed in.

class Neuron:
    def __init__(self, input1, bias, w1, w2):
        self.input1 = input1   # list holding the two binary inputs [x1, x2]
        self.bias = bias
        self.w1 = w1
        self.w2 = w2

    def CalculateOutput(self):
        # Weighted sum (bias + w1*x1 + w2*x2), thresholded at zero
        output1 = 0
        net = self.bias + self.input1[0]*self.w1 + self.input1[1]*self.w2
        if net >= 0:
            output1 = 1
        else:
            output1 = 0
        return output1


for x1 in range(2):
    for x2 in range(2):
        neuron1 = Neuron([x1, x2], -1, 1, 1)
        print("x1= " + str(x1) + " x2= " + str(x2) +
              " Output= " + str(neuron1.CalculateOutput()))

The code above implements a simple single neuron; the weights -1, 1, 1 produce an OR gate and -2, 1, 1 produce an AND gate.
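As a quick check of the AND case, the same Neuron class can be reused with the -2, 1, 1 weights; only the input pair (1, 1) gives a non-negative weighted sum (-2 + 1 + 1 = 0):

# Uses the Neuron class defined above; the bias of -2 means both inputs must be 1
for x1 in range(2):
    for x2 in range(2):
        neuron_and = Neuron([x1, x2], -2, 1, 1)
        print("x1= " + str(x1) + " x2= " + str(x2) +
              " Output= " + str(neuron_and.CalculateOutput()))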

1.3 A Neural Network
We can extend the code above to produce a neural network by feeding the outputs of one or more neurons in as inputs to other neurons. The code below produces an Exclusive OR (XOR) - essentially, for the two-input case, if the two inputs are different then the output is True. The same inputs go to two neurons that have different weights (bias, w1 and w2), and the outputs from these two neurons are the inputs to a third neuron. The code is shown below (the Neuron class hasn't changed):
class Neuron:
    def __init__(self, input1, bias, w1, w2):
        self.input1 = input1   # list holding the two binary inputs [x1, x2]
        self.bias = bias
        self.w1 = w1
        self.w2 = w2

    def CalculateOutput(self):
        # Weighted sum (bias + w1*x1 + w2*x2), thresholded at zero
        output1 = 0
        net = self.bias + self.input1[0]*self.w1 + self.input1[1]*self.w2
        if net >= 0:
            output1 = 1
        else:
            output1 = 0
        return output1


for x1 in range(2):
    for x2 in range(2):
        neuron1 = Neuron([x1, x2], -1, -1, 1)
        neuron2 = Neuron([x1, x2], -1, 1, -1)
        neuron3 = Neuron([neuron1.CalculateOutput(), neuron2.CalculateOutput()], -1, 1, 1)
        print("x1= " + str(x1) + " x2= " + str(x2) +
              " Output 1= " + str(neuron1.CalculateOutput()) +
              " Output 2= " + str(neuron2.CalculateOutput()) +
              " Output overall= " + str(neuron3.CalculateOutput()))

        



2. Where next - building a Physical Microbit Neural Network
The figure below shows the arrangement of the connections to be built; pin 2 is the output of each neuron. The two micro:bits/neurons on the left of the picture take in the same two inputs; the outputs from these neurons are the two inputs to the output neuron on the right.



Figure 1

The micro:bit objects used in Figure 1 were produced using the micro:bit Fritzing diagram available at https://github.com/microbit-foundation/dev-docs/issues/36 - thanks to David Whale (@whalleygeek) for this.








All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Friday, 2 August 2019

Robots and Physical Computing most popular posts during July 2019



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon
