Saturday 1 December 2018

Scratch, webcams, cats and explosions

Still enjoying playing with Scratch and the webcam. In this post, I start by improving on the example shown in a previous post, Webcam and Scratch, enhancing the movement of Scratch the Cat by adding an interim step and having it face in the direction of the 'moving finger' (see the code below).

Please go to the code at https://scratch.mit.edu/projects/263334488/ to try it yourself.
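The project itself is built from Scratch blocks, but the movement logic can be sketched in plain JavaScript. Everything below is illustrative, not the project's actual code; the 90/-90 values mirror Scratch's right/left directions, and the function name and step size are made up for the sketch.

```javascript
// Sketch of the block logic: when motion is detected, the cat first turns
// to face the finger, then takes an interim step towards it rather than
// jumping straight to the finger's position.
function stepTowards(catX, fingerX, stepSize) {
  // In Scratch, direction 90 points right and -90 points left.
  const direction = fingerX >= catX ? 90 : -90;
  // Interim step: move at most stepSize, without overshooting the finger.
  const newX =
    catX + Math.sign(fingerX - catX) * Math.min(stepSize, Math.abs(fingerX - catX));
  return { direction, newX };
}
```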


I am going to add one more feature, experimenting with getting objects to react to the motion: in this case, exploding a button, or changing a button into a small ball, by moving a finger onto it. The video motion detected on the sprite is used to trigger this.
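The reaction logic can be sketched in plain JavaScript (the real project uses Scratch's "video motion on sprite" block; the function name and the threshold value here are illustrative assumptions):

```javascript
// Each button sprite checks the video motion detected on itself; above a
// threshold, it reacts (switching costume to an explosion or a small ball).
function buttonReaction(motionOnSprite, threshold = 30) {
  // Scratch reports motion on a sprite as a number; above the threshold
  // we treat it as the finger touching the button.
  return motionOnSprite > threshold ? "explode" : "idle";
}
```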



Please try the code yourself at https://scratch.mit.edu/projects/266837380/

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Sunday 25 November 2018

Getting Adafruit Circuit Playground Express to respond to music

In a previous post, Adafruit Circuit Playground reacting to music (updated with simulator), I played with using vibrations to change the RGB values of the pixels on the Circuit Playground Express (it is too much fun not to). Here I am going to go even simpler, using the sound level directly, again using MakeCode.

The Circuit Playground includes a microphone, so sound levels can be used directly, by using them to vary the RGB inputs and brightness of the pixels (see the code above). You can try the idea out on the simulator below; the circle at the bottom left can be used to simulate varying the sound level.
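The mapping can be sketched in plain JavaScript. This is not the MakeCode blocks themselves: the function name is made up, and the particular channel weightings are illustrative assumptions; the point is simply that one sound-level reading (roughly 0-255 in MakeCode) drives all of RGB and brightness.

```javascript
// Map a single sound-level reading onto pixel colour and brightness.
function soundToPixel(level) {
  // Clamp to the 0-255 range a MakeCode sound level occupies.
  const clamped = Math.max(0, Math.min(255, level));
  return {
    r: clamped,                   // louder = more red
    g: 255 - clamped,             // quieter = more green
    b: Math.floor(clamped / 2),   // a little blue, scaled with loudness
    brightness: clamped,          // pixels dim as the room goes quiet
  };
}
```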

Please feel free to alter the code and share; the code is available at https://makecode.com/_8UPY8oD54bmE


Monday 19 November 2018

Webcam and Scratch

I have to admit I do enjoy playing with Scratch. I heard you can connect a webcam to Scratch and thought there might be quite a lot of set-up. I was wrong; it was very easy, just one block really. So in this very short post, I share my (very simple) code for getting Scratch the Cat to follow my finger left or right.

I am intrigued to see what else can be done.

Code is at https://scratch.mit.edu/projects/263334488/ 


Saturday 10 November 2018

Strictly Cube:Bit

In previous posts I looked at using the 4Tronix Cube:Bit with Python (http://bit.ly/2DcXcei) and the Adafruit Circuit Playground with MakeCode (http://bit.ly/2T0ddcN), both used to make a 'Disco-Light', essentially reacting to vibrations from a speaker and therefore indirectly reacting to music.

In this post, a short experiment combining the Cube:Bit with MakeCode is shown. The first thing to do, once you are in https://makecode.microbit.org/#, is to add the MakeCode package for the Cube released by 4tronix; details on setting this up are at https://4tronix.co.uk/blog/?p=1770.

So the basic (and it is basic) idea in the code is to change one pixel/LED's colour in response to the acceleration in the three axes, with each axis controlling the amount of Red, Green or Blue in the LED's output. The colours are then shifted one pixel along each time and the process repeats. The effect is to run the colours through all the LEDs. The code is shown below:


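The shifting step can be sketched in plain JavaScript (the real code is MakeCode blocks driving the Cube:Bit's pixels; the function name and the scaling of the micro:bit's accelerometer readings to the 0-255 colour range are assumptions made for illustration):

```javascript
// One update: read a colour from the three accelerations, put it on the
// first pixel, and shift every existing colour one pixel along.
function shiftColours(pixels, ax, ay, az) {
  // Squash an acceleration reading (roughly -1023..1023) into 0-255.
  const scale = (v) => Math.min(255, Math.floor(Math.abs(v) / 4));
  // New colour at the front; the last colour drops off the end.
  return [{ r: scale(ax), g: scale(ay), b: scale(az) }, ...pixels.slice(0, -1)];
}
```

Repeating this on each loop iteration runs the colours through all the LEDs, as described above.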
One of the other things I found I quite enjoy doing with it is putting the cube at different angles and seeing what colours are produced.



Friday 26 October 2018

Adafruit Circuit Playground reacting to music (updated with simulator)

This post looks at using the Adafruit Circuit Playground Express to react to music. I used Adafruit MakeCode (https://makecode.adafruit.com/) to experiment with it. 

The code is simple and shown below; it uses the acceleration in the x, y and z axes to select values for RGB on the neopixels - the circuit is placed on a speaker and the vibrations cause the changes in the pixels' RGB values. The 10 pixels are lit up in sequence, but it is so quick it is hard to see a difference between the pixels.

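The idea can be sketched in plain JavaScript (the actual code is MakeCode blocks; the function names and the way each acceleration reading is wrapped into the 0-255 colour range are illustrative assumptions):

```javascript
// The x, y and z accelerations pick the R, G and B channel values.
function vibrationColour(ax, ay, az) {
  const scale = (v) => Math.abs(v) % 256; // keep each channel in 0-255
  return { r: scale(ax), g: scale(ay), b: scale(az) };
}

// One frame: the same colour written to each of the 10 neopixels in turn.
function frame(ax, ay, az) {
  const c = vibrationColour(ax, ay, az);
  return Array.from({ length: 10 }, () => c);
}
```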
The video below shows it in action with music available from the YouTube audio library: “Eternal Structures” by Asher Fulero.


Tuesday 23 October 2018

WebVR 5 Playtime: Augmented Reality to display Videos

In previous posts (post 1, post 2) I have raved over the brilliant combination of Mozilla's A-Frame and Jerome Etienne's fantastic AR code - guess what, this post is no different in that respect.

Here is the problem I was interested in: having a set of markers such that, when each one is viewed, a different video plays.

Glitch is a great editor for this (for example https://glitch.com/~ambitious-hub ); it is easy to use and it can store assets (in this case a video). Add an asset by clicking on the assets folder, then using the add asset button and uploading the file. You will need the URL for the asset, as this is what is included as a source in the HTML files; you get this by clicking on the asset and then copying the URL.

So some initial code using A-Frame and aframe-ar.js is shown below. This is set up to display a video when the HIRO marker is shown to the camera.
<!DOCTYPE html>
<script src="https://aframe.io/releases/0.6.0/aframe.min.js"></script>
<!-- include ar.js for A-Frame -->
<script src="https://jeromeetienne.github.io/AR.js/aframe/build/aframe-ar.js"></script> 

<body>
  <a-scene embedded arjs="sourceType: webcam; detectionMode: mono_and_matrix; matrixCodeType: 3x3;">
    <a-assets>
      <video preload="auto" id="vid" autoplay loop="true" crossorigin webkit-playsinline playsinline controls src="https://cdn.glitch.com/b0bce38d-a212-4ce2-a89f-9c03bd45e85a%2Fdeep_learning_example_(SD_Small_-_WEB_MBL_(H264_900)).mp4?1540209123141"></video>
    </a-assets>
    <a-marker preset="hiro">
      <a-video src="#vid" width="1" height="1" position="0 0.5 0" rotation="-90 0 0"></a-video>
    </a-marker>
    <a-entity camera></a-entity>
  </a-scene>
</body>

There is a problem: in some browsers, including Chrome, the video doesn't play without something to trigger it. A great solution can be found at https://github.com/jeromeetienne/AR.js/issues/400, revolving around adding a button to play the video; a new HTML file, aframe_video.html, needs to be created as per the advice, and a call to it (<script src="aframe_video.html"></script>) plus some extra code available in the advice need to be added to index.html. If you want to see the code I used, please go to https://glitch.com/edit/#!/ambitious-hub; please feel free to remix and reuse; I have built this on top of code developed by others, so I am more than happy for others to use it.

Another great tool (again developed by Jerome Etienne) is the AR-Code Generator, https://jeromeetienne.github.io/AR.js/three.js/examples/arcode.html, into which you put the URL for the index.html file of your AR example, and it generates a Hiro marker and a QR code for the site (an example is shown below). More details can be found at https://medium.com/arjs/ar-code-a-fast-path-to-augmented-reality-60e51be3cbdf


Saturday 13 October 2018

It listens to me - Anki's Vector Robot

The Anki Cozmo is great fun to play with (loads of games to get it to do), nice to program through the Code Lab (https://robotsandphysicalcomputing.blogspot.com/2017/06/program-cozmo.html), and just so cute you want to play with it. But now Anki has released Vector (https://developer.anki.com/blog/news/hey-vector/) with its enhanced camera and built-in microphones, essentially better sensors; but the main feature (so far) has to be the speech recognition.

All requests start with "Hey Vector"; pause until the lights go blue, then follow with a command, for example:

- "my name is", which records your name and then learns your face.
- "play blackjack", which, not surprisingly, plays games of blackjack with you.

There are a load of commands, including getting the time, the temperature, etc. I am partial to getting a fist bump from it. I have only just started exploring it (ok, playing); there are loads of features I have yet to play with - I've just got to let it recharge, it has had a busy time.

It is early days for the features developed for it so far; the SDK for Vector is expected to be released next year, so it doesn't yet have all the activities that the brilliant Cozmo has, but I am sure it will, and more. It might be me, but there is more of a feel of a robopet/companion to it, with the addition of a capacitive sensor on the back to 'pet' it.

So, in summary: great fun to play with, love the voice recognition and the ability to ask it questions, lots to explore with it, and based on its hardware and what Anki have done with Cozmo, I expect there is going to be a wealth of new activities to add to the current ones.

