Thursday 5 November 2020

Free your Augmented Reality


 

Dr Scott Turner 

Director of Computing at Canterbury Christ Church University, Kent, and a Code Club volunteer. 




 

Age Range: 10-14 years 

 

Year Group: 5-9, Code Clubs. 

 

Lesson Type: Web programming 

 

Objective 

How to use free web technologies to create Augmented Reality. 

 

Requirements 

  • A computer capable of accessing the required software, ideally with a webcam 

  • Access to AR.JS Studio (free) https://ar-js-org.github.io/studio/ 

  • Access to webhosting or free webhosting sites such as Glitch.com or GitHub 

  • An image (png, jpeg, gif), or video (mp4)  

  • (optional) mobile device with a camera and internet access. 

 

 

 

Augmented Reality (AR) is becoming increasingly popular, but creating your own often means using proprietary packages such as Blippar, or writing the code yourself. A middle ground is AR.js Studio, which allows web-based AR applications to be developed completely without writing code (if connected to a GitHub account), or with a few minor tweaks otherwise.

 

Activity 1 

We are going to link a marker (in this case a specific image with a black border and the letters AR) to an image using AR.js Studio, producing a webpage that uses a camera connected to a browser to replace the marker on screen with the image. 

 

First, go to https://ar-js-org.github.io/studio/ and pick a marker-based project. 


Download the marker by clicking on the Download marker link and print out a copy; you will need it later. 


Now export the project. If you have a GitHub account, this is the easier route: once the connection between the website and GitHub is set up, it creates a URL containing your project. You can then open that URL in a browser on a mobile device, or on a computer with a webcam, and start waving your printed marker to see it in action. My advice is, if you don't have your own web server, get yourself a GitHub account and choose that option; you just log in to your account, give the project a name, and press Publish. Depending on your internet connection it can take from a few seconds to a minute or so, but it is worth the wait. An example is available at https://scottturneruon.github.io/Testobjectexs5y2/ 

 

If this is not possible, or you have your own space to store webpages, then the download package option is useful. This produces a zipped version of the files, which we need to add to a web server. To see that in action, the free web-hosting site Glitch.com is going to be used to host it; you may have to set up a free account.  

  • The first step is to unzip the files from AR.js Studio. Open Glitch, start a new project, and choose hello-webpage; it will automatically generate a name for the project.  

  • From the unzipped files, copy the contents of index.html and, in Glitch, replace the contents of the index.html file already there with it. 

  • Open the Assets folder in Glitch, click on upload new asset, and select the image from the unzipped files.  

  • Click on the uploaded image and it will give you a long URL; press the copy button.  

  • Go back to index.html, find the section that reads <a-image src="assets/asset.png", and replace assets/asset.png with the URL you copied from the assets folder.  

  • Go back to the unzipped files and upload marker.patt to Glitch in the same way, again copying the URL for this asset. 

  • This time, in the section that starts <a-marker, replace assets/marker.patt in url="assets/marker.patt" with the copied URL. 
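After those two replacements, the relevant part of index.html should look something like the sketch below. The cdn.glitch.com URLs here are placeholders for the ones you copied from the Glitch assets folder, and the exact attributes around them will be whatever AR.js Studio exported:

```html
<!-- Sketch only: the two edited elements inside the exported index.html.
     Both URLs are placeholders for the ones copied from Glitch. -->
<a-marker type="pattern" url="https://cdn.glitch.com/your-project/marker.patt">
  <a-image src="https://cdn.glitch.com/your-project/asset.png"></a-image>
</a-marker>
```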

 

Now we can have fun. Near the top of the screen there is what looks like a pair of sunglasses; this lets us test it out. Give the browser permission to use the camera (if allowed) and wave the marker in front of it. In my case it replaced the marker with an image of a planet.  

 

We can now share it with the world. In Glitch, go to Share, change the tab to Live App, and press Copy. You should now have a new address, for example https://shelled-humdrum-rainbow.glitch.me, that can be opened in a browser, including those on mobile devices. A word of warning: the Glitch approach can sometimes have problems on iOS on some devices. 

 

 

 

 


 

Follow on Activities 


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Saturday 29 August 2020

10 Top posts on Robot and Physical Computing Blog - August 2020










Statistics

Where are people searching from:












Tuesday 25 August 2020

Nuffield Foundation Researcher Placement Project: Further Development of Junkbots



A new blog, https://junkbotactivities.blogspot.com/, by Muhammad Vadia, who has been working on the Nuffield Foundation Researcher Placement Program developing a series of activities for the Junkbots project. These activities focus on using Scratch to develop coding skills by building up a simulation of the junkbot.

These can be found at:












Friday 14 August 2020

Spooky Hot-Cross Buns - Sonic Pi



In a previous post on another blog I discussed a quick play with Sonic Pi (see https://computingnorthampton.blogspot.com/2020/08/quick-play-with-sonic-pi.html). In this post I am really just going to take it a bit further and try some new features.

The goal: a spooky, mournful Hot-Cross Buns (the only bit of music I know the notes for), just so I can play a bit. So let us start with the resources I have found useful alongside Sonic Pi (https://sonic-pi.net/); a really useful webpage is https://newt.phys.unsw.edu.au/jw/notes.html for turning notes into MIDI numbers (60, etc.). 
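The mapping that page tabulates is regular: MIDI 60 is middle C (C4), and each octave spans 12 semitones. A small plain-Ruby sketch of the conversion (a hypothetical helper just to illustrate the numbering; inside Sonic Pi itself you can simply write play :c4):

```ruby
# Semitone offsets of the natural notes within an octave.
SEMITONES = { "c" => 0, "d" => 2, "e" => 4, "f" => 5,
              "g" => 7, "a" => 9, "b" => 11 }

# Convert a name like "c4" or "b3" to its MIDI number.
# MIDI 0 is C-1, so octave n starts at 12 * (n + 1).
def midi_number(name)
  letter = name[0].downcase
  octave = name[1..].to_i
  12 * (octave + 1) + SEMITONES[letter]
end

puts midi_number("c4")  # 60 (middle C)
puts midi_number("a4")  # 69 (concert A, 440 Hz)
```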

By the end of the last post I was adding a techno effect on top of the tune:
use_synth :prophet
with_fx :ixi_techno do
  2.times do
    play chord(:b4, :minor7)
    sleep 0.5
    play chord(:a4, :minor7)
    sleep 0.5
    play chord(:g4, :minor7)
    sleep 0.5
  end
  4.times do
    play chord(:g4, :minor7)
    sleep 0.25
  end
  4.times do
    play chord(:a4, :minor7)
    sleep 0.25
  end
  play chord(:b4, :minor7)
  sleep 0.5
  play chord(:a4, :minor7)
  sleep 0.5
  play chord(:g4, :minor7)
  sleep 0.5
end

Sonic Pi is a cool system. I found out you can get it to pan the sounds from left to right, and played with that, but didn't include it in the end. What I did change was the synth, and I also added samples; after trial and error, I chose ambi_dark_woosh (had to, with that name):

use_synth :tech_saws
sample :ambi_dark_woosh, amp: 0.25
with_fx :ixi_techno do
  2.times do
    play chord(:b3, :minor7)
    sleep 0.5
    play chord(:a3, :minor7)
    sleep 0.5
    play chord(:g3, :minor7)
    sleep 0.5
  end
  sample :ambi_dark_woosh, amp: 0.25
  4.times do
    play chord(:g3, :minor7)
    sleep 0.25
  end
  4.times do
    play chord(:a3, :minor7)
    sleep 0.25
  end
  sample :ambi_dark_woosh, amp: 0.25
  play chord(:b3, :minor7)
  sleep 0.5
  play chord(:a3, :minor7)
  sleep 0.5
  play chord(:g3, :minor7)
  sleep 0.5
end
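For reference, the panning I experimented with looks something like this in Sonic Pi (a sketch: the pan option runs from -1, full left, through 0, centre, to 1, full right):

```ruby
# Sweep a note across the stereo field in Sonic Pi.
play :g3, pan: -1   # full left
sleep 0.5
play :g3, pan: 0    # centre
sleep 0.5
play :g3, pan: 1    # full right
```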

This is a bit of software (you can support it on Patreon) that is just fun and free. I have no musical ability, but I enjoy creating with it.


Friday 31 July 2020

10 most read posts (July 2020) on Robots and Physical Computing Blog


Popular Posts




Saturday 18 July 2020

Speech Recognition in Scratch 3 - turning Hello into Bonjour!

The Raspberry Pi Foundation recently released a programming activity, Alien Language, with support from Dale of Machine Learning for Kids, that is a brilliant use of speech recognition in Scratch 3 to control a sprite in an alien language. Do the activity (it is very much worth doing) and it will make sense! I would also recommend going to the machinelearningforkids.co.uk site anyway; it is full of exciting things to do (for example, loads of activities at https://machinelearningforkids.co.uk/#!/worksheets). Scratch 3 has lots of extensions, accessible through the Extensions button in the Scratch 3 editor (see below), which add fun new blocks to play with.



The critical thing for this post is that Machine Learning for Kids have created a Scratch 3 template with their own extensions within it: https://machinelearningforkids.co.uk/scratch3/. One of these is a Speech to Text extension (see below). You must use this version, not the standard Scratch 3.



My idea was: can I set it to react one way when I say "hello", then, after I say "french", have it respond to "hello" with "Bonjour"? Two other extensions are needed along with the Speech to Text one: one for text to speech, and the Translate extension shown below.



Ok, so to the fun bit. The listen and wait, and when I hear blocks are the key new blocks, and they do what they say. The three sets of code below are the ones I used for this activity.




Thank you to Machine Learning for Kids for creating such a brilliant Scratch extension - this is well worth a play with.




Monday 13 July 2020

Dancing Kitronik's Game Zap - reacts to music

You will be glad to hear this is only a short post.  

In an earlier post, Build a Disco Cube:bit that reacts to music, the vibrations of the music make a cube sitting on a speaker, with the volume pushed to 11 (just to test it, of course), react to the music. The accelerometer values from the micro:bit, in the three axes, are fed back to change the NeoPixels' colours. Simple but good fun.




With some very minor (and I do mean minor) changes it works on Kitronik's Game Zap: eight pixels are altered at a time instead of five, but apart from that nothing more. The Python code is shown below:

from microbit import *
import neopixel

# 64 NeoPixels on the Game Zap display, connected to pin 0
np = neopixel.NeoPixel(pin0, 64)

while True:
    # work through the display eight pixels at a time
    for pxl in range(3, 64, 8):
        # scale the accelerometer readings down to usable colour values
        # (bl is read here but not used in the colours below)
        rd = int(abs(accelerometer.get_x()) / 20)
        gr = int(abs(accelerometer.get_y()) / 20)
        bl = int(abs(accelerometer.get_z()) / 20)
        # colour a band of eight pixels around pxl
        np[pxl] = (rd, gr, 0)
        np[pxl + 1] = (rd, gr, 0)
        np[pxl - 1] = (rd, gr, 0)
        np[pxl + 2] = (rd, gr, 0)
        np[pxl + 3] = (0, gr, rd)
        np[pxl - 2] = (0, gr, rd)
        np[pxl - 3] = (rd, 0, 0)
        np[pxl + 4] = (0, gr, 0)

        np.show()


I was impressed that with a few tweaks it worked! Please feel free to share and copy; if this is useful to you, please say so in the comments.




