Sunday 13 March 2016

Waste as a tool to inspire potential computing students

Originally posted as: http://computingnorthampton.blogspot.co.uk/2012/02/waste-as-tool-to-inspire-potential.html in 2012.

A recent article in the Northampton Herald and Post, "How a university is using waste as tool to inspire students" by Lawrence John, discusses the Junkbots project.

"FUNNY looking robots called junkbots could be the key to encouraging more children across the county to become engineers, computer programmers or scientists.



One force which is driving this idea forward is the University of Northampton.


For the past few years, staff from its science and technology department have been going out to primary and secondary schools to spread the word that science is fun.

By working with schools, the university hopes to show pupils a different side to computing and hopefully raise their interest in what they can achieve." - Lawrence John


For the whole article click here; this takes you to the newspaper site.



To read more about the junkbot project go to: http://junkbots.blogspot.co.uk/

I would be interested in hearing from others who are doing similar things; please feel free to add a comment below.



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Friday 11 March 2016

Virtual Reality Minecraft on a PC

This is an update of an earlier post.


As a little experiment with the Oculus Rift, I wondered if I could view a Minecraft world through the Rift. The answer is yes, and relatively easily (certainly easier than I thought it was going to be). The Oculus Rift used here was from the first developer kit, so there are some latency issues (you move, it moves slightly later).

Download Minecrift from: https://share.oculusvr.com/app/minecrift


Unzip the files and run the installer file.



Figure 1
You should get something similar to Figure 1. All I did then was press OK.

Figure 2
Run the Minecraft launcher and create a new profile. After creating the new profile, edit it to change 'Use version' to release minecrift-1.6.4-b12-nohydra, which can be found in the drop-down menu.

Now save the profile.

Play Minecraft normally, but with the Oculus Rift on (it helps to have two people: one wearing the Oculus Rift, the other pressing keys to control the movement).



Figure 3


Most people who have tried it enjoyed it, but you really can only play for a short while. If you are susceptible to motion sickness, I wouldn't try this.



Related links

http://minecraft-vr.com/

http://www.theriftarcade.com/minecrift/


http://vrwiki.wikispaces.com/Minecrift 


http://www.pcgamer.com/2013/05/10/minecraft-is-the-latest-game-to-get-oculus-rift-support-with-minecrift-mod/


http://riftmod.com/how-to-setup-minecraft-for-oculus-rift/



I would be interested in finding out, via the comments, what others are doing to combine Minecraft with other devices and software.

All views are the author's and do not necessarily reflect the views of any organisations the author is associated with in any way. Nor is this post advocating the use of the approach described above; it is reporting on an experiment.

Friday 4 March 2016

Raspberry Pi Gesture-Controlled Minecraft X-Wing (revisited)


Translated using Google Translate from http://robotsandphysicalcomputing.blogspot.co.uk/2016/01/gesture-controlled-minecraft-x-wing.html I apologise if there are any translation issues.


This post builds on two previous posts and tries to respond to some of the very useful comments from people who have tried this out. I hope it helps.

Overall, the project builds on an earlier project to get a simple X-Wing into Minecraft on a Raspberry Pi. The goal was to get Python to build and move the X-Wing. Details of that project can be found here.


Main revision: this project, like the previous one, is based around Python 3 running on the November release of the Raspbian 'Jessie' OS. Extra libraries may also need to be added to get minecraftstuff (such as ShapeBlock() and MinecraftShape()). Details of how to obtain and install these can be found at


In this post, the Skywriter add-on from Pimoroni is included to allow movements of a hand or finger to make the X-Wing take off, land, go forward, or go backward.


It builds on ideas from the book Adventures in Minecraft on using Python and Minecraft with a Raspberry Pi.


The Skywriter is a Raspberry Pi HAT (see Figure 2) that senses the position of a hand just above the board. In this project it detects flicks of the hand up, down, or across the board to determine the direction of movement.



Before starting, to use the Skywriter you need to run the following in the terminal:
 curl -sSL get.pimoroni.com/skywriter | bash


To start, we simply placed the X-Wing above the player by placing blocks in (roughly) the shape of an X-Wing, based around the MinecraftShape method (see Chapter 8 of Adventures in Minecraft).

• To avoid building on top of the player, the starting position of the X-Wing is set by:
o    finding the player's position;
o    adding 5 to the player's x position;
o    adding 10 to the player's y position (the bit I have to keep reminding myself of is that the y axis is the vertical one);
o    adding 5 to the player's z position;
• Using these values, build the X-Wing out of wool blocks - 0 for white and 14 for the red blocks;
• If a flick starts at the top of the board (or 'north'), this moves the X-Wing down towards the ground;
• If a flick starts at the bottom of the board (or 'south'), this moves the X-Wing vertically upwards;
• If a flick starts at the right of the board (or 'east'), the X-Wing moves backwards horizontally;
• If a flick starts at the left of the board (or 'west'), the X-Wing moves forwards.
 
# Libraries for talking to Minecraft, building shapes, and reading the Skywriter
from mcpi.minecraft import Minecraft
from mcpi import block
import mcpi.minecraftstuff as minecraftstuff
import time
import skywriter
import signal

# Connect to the running Minecraft game and set the X-Wing's start
# position offset from the player, so it is not built on top of them
mc=Minecraft.create()
xPos=mc.player.getTilePos()
xPos.x=xPos.x+5
xPos.y=xPos.y+5
xPos.z=xPos.z+5

# Block-by-block (rough) X-Wing shape, relative to the start position -
# wool blocks, colour 0 for white and 14 for red
xWingBlocks=[
minecraftstuff.ShapeBlock(0,0,0,block.WOOL.id,0),
minecraftstuff.ShapeBlock(-1,0,0,block.WOOL.id,0),
minecraftstuff.ShapeBlock(-2,0,0,block.WOOL.id,14),
minecraftstuff.ShapeBlock(-3,0,0,block.WOOL.id,0),
minecraftstuff.ShapeBlock(1,0,0,block.WOOL.id,0),
minecraftstuff.ShapeBlock(0,1,0,block.WOOL.id,0),
minecraftstuff.ShapeBlock(1,1,0,block.WOOL.id,0),
minecraftstuff.ShapeBlock(2,0,0,block.WOOL.id,0),
minecraftstuff.ShapeBlock(2,1,0,block.WOOL.id,0),
minecraftstuff.ShapeBlock(1,2,-1,block.WOOL.id,14),
minecraftstuff.ShapeBlock(1,2,1,block.WOOL.id,14),
minecraftstuff.ShapeBlock(1,-1,-1,block.WOOL.id,14),
minecraftstuff.ShapeBlock(1,-1,1,block.WOOL.id,14),
minecraftstuff.ShapeBlock(1,3,-2,block.WOOL.id,0),
minecraftstuff.ShapeBlock(1,3,2,block.WOOL.id,0),
minecraftstuff.ShapeBlock(1,-2,-2,block.WOOL.id,0),
minecraftstuff.ShapeBlock(1,-2,2,block.WOOL.id,0)]

# Build the X-Wing in the world as a single movable shape
xWingShape=minecraftstuff.MinecraftShape(mc,xPos,xWingBlocks)

# Called by the Skywriter whenever it detects a flick over the board;
# start is the edge the flick began at, finish where it ended
@skywriter.flick()
def flick(start,finish):
  if start=="south":
    # flick from the bottom of the board: move vertically upwards
    for count in range(1,10):
      time.sleep(0.1)
      xWingShape.moveBy(0,1,0)
  if start=="west":
    # flick from the left of the board: move forwards
    for count in range(1,10):
      time.sleep(0.1)
      xWingShape.moveBy(-1,0,0)
  if start=="east":
    # flick from the right of the board: move backwards
    for count in range(1,10):
      time.sleep(0.1)
      xWingShape.moveBy(1,0,0)
  if start=="north":
    # flick from the top of the board: move down towards the ground
    for count in range(1,10):
      time.sleep(0.1)
      xWingShape.moveBy(0,-1,0)

# Wait here for flick events from the Skywriter
signal.pause()

For more details on Minecraft and Python I suggest going to http://www.stuffaboutcode.com/2013/11/coding-shapes-in-minecraft.html, in particular on how to download the software that implements MinecraftShape. If you use or modify this, please leave a comment - I would love to see what others do.

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Translation was done with Google Translate – sorry if it causes any offence; this was not intentional.

Thursday 3 March 2016

NAO, chatbots, teaching and just plain showing off

I managed to do something I have wanted to do for a long time: show off social robots (and I have now had three goes at it), thanks to the recent purchase of NAO robots by the University of Northampton.

Session 1
At an in-reach STEAM activity day I had the opportunity to show the robots in action to a group of 8-year-olds, as well as giving a short presentation on social robots.

I would be interested to hear how others are using NAOs via comments.



Session 2 and 3
I have managed to include a physical example of social robots in my teaching. The aim of the session was to teach about social AI, which revolves around using social cues and, to a certain extent, natural language (through chatbots) for us to communicate with machines.

The robots were used as an example of a social robot - something we want to play with or work with, without having to go through a steep learning curve on how to use it. Students were encouraged to consider why this is, and that anthropomorphisation plays a part (NAO basically has some of the characteristics of a small child). The fact that it responds to voice commands, its looks, its childlike voice, that it is always moving (even slightly when standing), and the way it moves were all spotted by the group as ways it attracts us to it - it is really hard not to talk to it like a child sometimes (but perhaps that is just me).
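
To give a flavour of how approachable NAO is to program, below is a minimal sketch (not the code used in the session) that makes the robot speak using the NAOqi Python SDK; the IP address is a placeholder for your own robot's address.

from naoqi import ALProxy  # NAOqi Python SDK (runs under Python 2.7)

# Placeholder address - replace with your robot's IP; 9559 is the default NAOqi port
NAO_IP = "192.168.1.10"

# ALTextToSpeech is NAOqi's standard text-to-speech service
tts = ALProxy("ALTextToSpeech", NAO_IP, 9559)
tts.say("Hello, I am NAO")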




This session also included the use of chatbots (one example used, ALICE, is shown here) and AIML, the Artificial Intelligence Markup Language (a link to more about AIML is included below). Just as a bit of background: chatbots (also variously called chatterbots, conversational agents, etc.) are programs that hold a conversation with us through either text or speech. The chatbots were used to show how we can create intelligent-like behaviour by, in effect, providing pre-written responses to questions, and then how we can take this further by using the responses people give while using the chatbot to 'fine-tune' the model.



To read more about NAO robots go to https://www.aldebaran.com/en
To read more on AIML go to http://www.alicebot.org/aiml.html

Example chatbots

If you would like to create your own chatbot, I personally think one of the easiest ways to start is through https://playground.pandorabots.com/en/quickstart/
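
To give a taste of what AIML itself looks like, here is a minimal sketch (not from the session) using the python-aiml package; the pattern, the reply, and the file name are all made-up examples.

import aiml  # python-aiml package (pip install aiml)

# A single AIML 'category' maps a user pattern to a template response
AIML_RULES = """<?xml version="1.0" encoding="UTF-8"?>
<aiml version="1.0.1">
  <category>
    <pattern>HELLO ROBOT</pattern>
    <template>Hello! I am a very simple chatbot.</template>
  </category>
</aiml>
"""

# Write the rules to a file so the kernel can learn them
with open("hello.aiml", "w") as f:
    f.write(AIML_RULES)

kernel = aiml.Kernel()      # the AIML interpreter
kernel.learn("hello.aiml")  # load the rules
print(kernel.respond("hello robot"))  # prints: Hello! I am a very simple chatbot.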


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Thursday 25 February 2016

MakeyMakey at Beavers


I recently had a great time playing with a MakeyMakey board (see an example below, taken from an Amazon site) with some Beaver Scouts.

The basis of the activity was a very simple Scratch program, where the 'space' key (banana number one) played one recording and the 'left' key (better known as banana number two) played another recording.
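
For anyone who wants to try the same idea without Scratch, a rough Python equivalent of that program (using pygame; the sound file names are made up) might look like this:

import pygame

# Rough Python/pygame equivalent of the Scratch program described above;
# the session itself used Scratch, and these .wav file names are placeholders
pygame.init()
screen = pygame.display.set_mode((200, 200))  # a window is needed to receive key events
sound_one = pygame.mixer.Sound("banana_one.wav")  # triggered by 'space' (banana number one)
sound_two = pygame.mixer.Sound("banana_two.wav")  # triggered by 'left' (banana number two)

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.KEYDOWN:
            if event.key == pygame.K_SPACE:
                sound_one.play()
            elif event.key == pygame.K_LEFT:
                sound_two.play()
pygame.quit()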



It started with a few drum sounds, but the session really took off when they started recording their own sounds. Lots of shouts, shrieks and names, but it engaged them… and left me with two very smashed-up bananas.

Details of the MakeyMakey boards can be found at: http://www.makeymakey.com/

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.
