Friday, 1 November 2019

Top 10 read posts on the Robots and Physical Computing blog during October 2019







All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Monday, 30 September 2019

Most read posts on the Robots and Physical Computing blog in September 2019




All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Sunday, 22 September 2019

Moving Eyes with an Adafruit AdaBox kit


One of the things I enjoy is a subscription to AdaBox from Adafruit; receiving the box, and shortly before that finding out what the main item in the box is, is a treat. The latest, at the time of writing, is AdaBox 13 with the MONSTER M4SK: a set of small screens, based around 240x240 IPS TFT displays, acting as moving eyes in a mask; along with a couple of masks, a speaker, a microphone, lenses and craft materials for decorating a mask - a full list can be found here.




My goal, in playing with this, was to create a slightly creepy mask where the eyes move - a simple build but fun to do. The Adafruit quick-start guide for this (available here) provides all the instructions on setting it up and downloading the different sets of eyes (that is really creepy to write). A set of graphics files is already available for different sets of eyes; a couple of examples are shown below.








One suggestion: when you download the files for the eyes - still creepy - keep a copy elsewhere. In one case I accidentally deleted the configuration file I wanted, so it was handy to have a backup.
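A minimal sketch of what I mean, assuming an eye set downloaded as a folder such as hazel/ containing a config.eye file (the folder and file names here are placeholders based on the Adafruit downloads - adjust them to the set you are using):

$ # Keep an untouched copy of the eye set before editing anything
$ mkdir -p ~/adabox13-backup
$ cp -r hazel/ ~/adabox13-backup/hazel/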


Two masks are included in the box; being lazy, I chose the already-decorated silvery mask. Along with the M4SK device, a set of lenses and holders is included. I struggled a bit with fixing the lenses into the holders and attaching them to the device - in the end, I chose to leave them out and stick the M4SK to the mask without them. A video of the mask in action is shown below.





There is much to explore with the MONSTER M4SK, including a number of sensors already on the board: a capacitive touch sensor, buttons, a light sensor, a port for connecting a PDM microphone (included in AdaBox 13) and three other ports. Sound effects can be played from a headphone socket or via a built-in amp driving an 8-ohm 1W speaker (also included in the AdaBox). A LiPoly battery can be connected. A guide to getting going via the Arduino IDE is available here; a rough sketch of the board-support step is shown below.
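If you prefer the command line to the IDE, the board-support step can be sketched with arduino-cli instead - this is my assumption of the equivalent workflow, not from the Adafruit guide, though the board index URL is Adafruit's published one:

$ # Add Adafruit's board index and install the SAMD cores the M4SK needs
$ arduino-cli core update-index --additional-urls https://adafruit.github.io/arduino-board-index/package_adafruit_index.json
$ arduino-cli core install arduino:samd
$ arduino-cli core install adafruit:samd --additional-urls https://adafruit.github.io/arduino-board-index/package_adafruit_index.json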



Related links






All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Monday, 16 September 2019

Starting with the NVIDIA Jetson Nano




This is the third of a planned occasional series of posts on playing with some of the current AI-specific boards for the Internet of Things (IoT). In the series, it is planned that there will be some experiments with the Google Coral accelerator adapter and the Development Board, as well as the NVIDIA Jetson Nano. In previous posts I started playing with the Coral Accelerator adapter kit and the Coral Development Board.

This post looks at starting with the NVIDIA Jetson Nano Developer Kit, which, like the Coral Development Board, is a small computer designed for running combined embedded and neural-network applications. The processing power comes from a quad-core 64-bit ARM CPU and a 128-core integrated NVIDIA GPU (for more details see here).

So before we all get spooked: getting going is relatively easy - basically, follow https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-devkit#intro. If you are able to set up a Raspberry Pi from a downloaded Raspbian image, following these instructions is not that different.
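For example, writing the downloaded image to a microSD card from Linux looks much like preparing a Raspbian card. A sketch - the zip file name and /dev/sdX are placeholders, so check the device name with lsblk first, as writing to the wrong device will destroy data:

$ # Stream the image out of the zip straight onto the card, then flush
$ unzip -p jetson-nano-sd-card-image.zip | sudo dd of=/dev/sdX bs=1M status=progress
$ sync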





What I wanted to do was attach a camera and grab an image. The board has a MIPI CSI-2 interface, which means it should work with a Pi Camera Module V2; here I am using a Leopard Imaging 145FOV wide-angle camera and ribbon cable because I had one nearby. A great site, and the one I am using here, for how to use the Jetson Nano with a camera is Jetson Nano + Raspberry Pi Camera, which takes you through setting it up and testing it, including the code below.





$ # Preview the CSI camera: capture at the sensor's native 3280x2464,
$ # scale down to 960x616 and display on screen (flip-method rotates
$ # the image if the camera is mounted upside down)
$ gst-launch-1.0 nvarguscamerasrc ! \
    'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1, format=NV12' ! \
    nvvidconv flip-method=0 ! \
    'video/x-raw, width=960, height=616' ! \
    nvvidconv ! nvegltransform ! nveglglessink -e
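To save a single frame as a file rather than just previewing, a similar pipeline can be used - my sketch, not from the guide above, assuming the standard NVIDIA GStreamer plugins shipped with the Jetson image:

$ # Grab one frame from the CSI camera and write it out as a JPEG
$ gst-launch-1.0 nvarguscamerasrc num-buffers=1 ! \
    'video/x-raw(memory:NVMM), width=3280, height=2464, format=NV12' ! \
    nvvidconv ! 'video/x-raw, format=I420' ! \
    nvjpegenc ! filesink location=capture.jpg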

The image below was grabbed using the Jetson and the camera.




I found it easier to get going with this board than with the Coral Development Board (though I do like that as well), and I am looking forward to playing with this board more.


Related Links






All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Sunday, 1 September 2019

Coral Dev Board and Raspberry Pi

This is the second of a planned occasional series of posts on playing with some of the current AI-specific add-on processors for the Internet of Things (IoT). In the series, it is planned that some experiments with the Google Coral adapter and the Development Board, as well as the NVIDIA Jetson Nano, will be shown. In the previous post I started playing with the Coral Accelerator with a Raspberry Pi: https://robotsandphysicalcomputing.blogspot.com/2019/08/coral-accelerator-on-raspberry-pi.html. The Coral environment is related to Google's earlier AIY Edge Tensor Processing Unit (TPU) range https://aiyprojects.withgoogle.com/edge-tpu/ and designed to work with TensorFlow Lite.

In this post, the bigger sibling, the Coral Development Board (Coral Dev Board), is connected to a Raspberry Pi. The Coral Dev Board is a single-board Linux computer in its own right, running a derivative of Debian called Mendel. The Pi (in this case a Raspberry Pi 2 running Raspbian) is being used as a terminal and to set up the system in the first place. How to set it up and use it can be found at https://coral.withgoogle.com/docs/dev-board/get-started/# - this is the best bet to follow. It was relatively easy to set up if you follow the instructions (I made a few mistakes) and juggle having several terminals open. A few issues (probably mainly due to my lack of skill) I had during setting it up, with a sketch of the fixes after the list:


  • I spent a while trying to set up fastboot according to the instructions, but the easiest way is sudo apt-get install fastboot
  • I needed to go into root to set up the udev rule and driver on the Pi to flash the Development Board.

Apart from that it wasn't too bad.
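Roughly, those two fixes on the Pi looked like the following - the udev rule is as I remember it from the Coral getting-started page, so check the current instructions for the exact vendor ID before relying on it:

$ # fastboot from the Raspbian repositories, rather than building it
$ sudo apt-get install fastboot
$ # udev rule so the Dev Board is accessible for flashing without sudo
$ sudo sh -c "echo 'SUBSYSTEM==\"usb\", ATTR{idVendor}==\"0525\", MODE=\"0664\", GROUP=\"plugdev\", TAG+=\"uaccess\"' >> /etc/udev/rules.d/65-edgetpu-board.rules"
$ sudo udevadm control --reload-rules && sudo udevadm trigger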

  

The image below was run on the Development Board and processed with a pre-trained machine-learning model to recognise a parrot.




The figure below shows a 76% match to Scarlet Macaw.
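For context, the classification was run with one of the bundled demo scripts; from memory the invocation looked roughly like this (the script name, flags and file names are from the Coral examples of the time and should be treated as illustrative rather than definitive):

$ # Classify the test image with a MobileNet bird model on the Edge TPU
$ cd /usr/lib/python3/dist-packages/edgetpu/demo
$ python3 classify_image.py \
    --model mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite \
    --label inat_bird_labels.txt \
    --image parrot.jpg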



I am going to enjoy playing with this a bit more; using the Pi as a terminal, once it is all set up, seems to work.
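For reference, "using the Pi as a terminal" here just means a serial console over the USB connection, along the lines of the getting-started guide (the device name /dev/ttyUSB0 is typical but worth confirming with ls /dev/ttyUSB*):

$ # Open a serial console to the Dev Board from the Pi
$ sudo apt-get install screen
$ sudo screen /dev/ttyUSB0 115200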

Another good source that expands on the use of this device is https://medium.com/@aallan/hands-on-with-the-coral-dev-board-adbcc317b6af, giving more detail on the device and Python examples of using the Coral Dev Board.


Related products, for the Development Board and AIY



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Top 10 posts on the Robots and Physical Computing Blog in August 2019




All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Wednesday, 28 August 2019

Coral Accelerator on a Raspberry Pi

This is the first of a planned occasional series of posts on playing with some of the current AI-specific add-on processors for the Internet of Things (IoT). In the series, it is planned that some experiments with the Google Coral adapter and the Development Board, as well as the NVIDIA Jetson Nano, will be shown.

Why bother? The basic reason is that I love playing with AI and hardware - so it is kind of fun. Another reason is that AI, IoT and edge computing are important and growing technologies, and I want to start getting my head around them a bit.

In this post, I look at starting to use the Coral Accelerator with a Raspberry Pi. The Coral environment is related to Google's earlier AIY Edge Tensor Processing Unit (TPU) range https://aiyprojects.withgoogle.com/edge-tpu/ and designed to work with TensorFlow Lite.




A good place to start is Google's Get started with the USB Accelerator - pretty much all you need to do to get going is in it, and it also covers the Raspberry Pi. It makes a good point: if you are using Python 3.7 on the Raspberry Pi, be aware that at the time of writing the TensorFlow Lite API only supports up to Python 3.5. Not a problem, and Get started with the USB Accelerator offers a solution.

The Coral site has a number of examples you can try out at https://coral.withgoogle.com/examples/. If you try the face detection example (within the object detection example) on a Raspberry Pi, you need to install feh to see the images; sudo apt-get install feh sorts this, as sketched below.
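A minimal sketch (the output file name out.jpg is a placeholder for whatever image the demo writes):

$ # feh is a lightweight image viewer for looking at the demo's output
$ sudo apt-get install feh
$ feh out.jpg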


Some other good sources:
https://medium.com/@aallan/hands-on-with-the-coral-usb-accelerator-a37fcb323553
https://www.raspberrypi.org/magpi/teachable-machine-coral-usb-accelerator/





Related products, but for the Development Board and AIY

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon
