
Posts

Raspberry Pi gesture controlled Minecraft X-Wing

figure 1 This post builds on an earlier project that put a simple X-Wing into Minecraft on a Raspberry Pi (figure 1). The goal was to use Python to build and move the X-Wing; details of that project can be found here. In this post, Pimoroni's Skywriter HAT is added so that movements of a hand can make the X-Wing take off, land, and move forward or backward. It builds on ideas from the book Adventures in Minecraft about programming Minecraft in Python on a Raspberry Pi. The Skywriter is a Raspberry Pi HAT (see figure 2) that senses the position of a hand held just above the board. In this project it detects flicks of the hand up, down, or across the board to determine the direction of motion. Before you start, install the Skywriter software by running the following in a terminal:

curl -sSL get.pimoroni.com/skywriter | bash

To start with we just placed the X-Wing above the player by placing blocks in the shape (roughly) of the X-Wing bas...
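The flick-to-movement idea described above can be sketched in Python. This is a minimal, illustrative sketch, not the original post's code: it assumes Pimoroni's skywriter library (its flick() decorator passes start and finish direction strings) and the mcpi Minecraft API; the flick_to_move mapping and the ship's start position are assumptions.

```python
def flick_to_move(start, finish):
    """Map a Skywriter flick direction to an (x, y, z) offset for the X-Wing.

    Here 'north'/'south' flicks move the ship forward/backward, while
    'east'/'west' are used for take-off (up) and landing (down).
    This mapping is illustrative, not the original post's.
    """
    moves = {
        'north': (0, 0, 1),   # forward
        'south': (0, 0, -1),  # backward
        'east':  (0, 1, 0),   # take off (up)
        'west':  (0, -1, 0),  # land (down)
    }
    return moves.get(finish, (0, 0, 0))


def run():
    """Hardware/game wiring - needs a Pi with a Skywriter HAT attached
    and Minecraft Pi edition running (not executed here)."""
    import signal
    import skywriter
    from mcpi.minecraft import Minecraft

    mc = Minecraft.create()
    pos = mc.player.getTilePos()
    xwing = [pos.x, pos.y + 5, pos.z]  # ship starts above the player

    @skywriter.flick()
    def flick(start, finish):
        dx, dy, dz = flick_to_move(start, finish)
        xwing[0] += dx
        xwing[1] += dy
        xwing[2] += dz
        # redraw the ship at its new position here, e.g. with mc.setBlocks

    signal.pause()  # keep the script alive, waiting for flicks
```

Keeping the direction mapping in its own function means it can be tested without the HAT or Minecraft attached.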

Daleks, cameras, and a mutant rabbit.

A little more detail on my experience of Picademy and some of the code developed (apologies, it is not well polished). Programming LEDs and motors, either through the GPIO or using a HAT (see the images below), is what I enjoy the most. To have a go, you will need the latest version of Raspbian, at the time of writing Jessie ( https://www.raspberrypi.org/downloads/raspbian/ ), and then install the following:

sudo apt-get install python-twython
sudo pip3 install explorerhat
sudo pip3 install gpiozero

The image above shows my attempt at a simple 'Dalek' - essentially a cup and straw, with a wheeled motor inside, controlled in Python from the Pi through an Explorer HAT Pro. It essentially moved in a circle either clockwise (button 1 on the Explorer HAT) or anti-clockwise (button 2).

import explorerhat
from time import sleep
from random import randint

def wheel(channel, event):
    duration = randint(1, 2)
    ...
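A completed version of that truncated handler might look like the sketch below. This is an untested, illustrative sketch assuming Pimoroni's explorerhat library (its touch.pressed() callback receives a channel number and an event string, and motors take speeds from -100 to 100); the spin_speeds helper and the chosen speeds are my assumptions, not the original post's code.

```python
from time import sleep
from random import randint


def spin_speeds(channel):
    """Return (left, right) motor speeds for a spin on the spot:
    button 1 -> clockwise, button 2 -> anti-clockwise, anything else stops."""
    if channel == 1:
        return (100, -100)
    if channel == 2:
        return (-100, 100)
    return (0, 0)


def run():
    """Hardware wiring - needs a Pi with an Explorer HAT Pro attached
    (not executed here)."""
    import explorerhat

    def wheel(channel, event):
        left, right = spin_speeds(channel)
        duration = randint(1, 2)            # spin for 1-2 seconds
        explorerhat.motor.one.speed(left)   # opposite speeds turn the
        explorerhat.motor.two.speed(right)  # 'Dalek' on the spot
        sleep(duration)
        explorerhat.motor.one.stop()
        explorerhat.motor.two.stop()

    explorerhat.touch.pressed(wheel)        # bind all touch pads to wheel
```

Separating the channel-to-speed decision from the hardware calls keeps the spin logic testable on any machine.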

BB-8, Droid I was looking for... - Tynkering

In a recent post, controlling the Sphero BB-8 with the Tickle App was discussed. That is not the only option: the Tynker App can also control it. Tynker is also a graphical drag-and-drop programming tool that can connect to certain 'toys', though the app itself is about developing programming skills. The Sphero BB-8 Droid can be connected to Tynker (or at least this is how I did it) by:

- clicking on the Create button on the opening screen;
- clicking on Blank Template;
- deleting the 'Actor' that is there and clicking on the + button in the top right-hand corner of the screen;
- clicking on Connected Toys and selecting the grey ball;
- on the main screen it should say spherobot with a code button at the side - click on the button.

You should get a screen with some code for changing the colour shown and then moving in a square - you can change this for your own code. Not all the commands, listed down the side, will work with the BB-8 - I re...

BB-8, this is the Droid I was looking for...

Previously I have shown the Sphero BB-8 rolling around the room under its own control. One of the features of the Sphero BB-8 Droid is that it is programmable, either through its own downloadable software or by one of my favourite apps - the Tickle App, which has been discussed in previous posts. This app allows control of an impressive range of devices using the same interface. Examples, some of which have been discussed previously (e.g. the Parrot Minidrone or Dash and Dot), are shown below. The app uses a graphical programming interface, similar to Scratch or Blockly, to produce code. The example here is a very simple one:

- spin twice for one second;
- move roughly in a square;
- if the BB-8 collides (or is bumped), it is set to spin twice for a second.

Ok, not the most sophisticated bit of coding; but it ...

Playing with Aldebaran's NAO - walking and talking.

Ok, I need to read the manual! I managed today to play with the Aldebaran NAO again and was struggling to get it to interact - this is the 'should have read the manual' bit; it was all in there. Mistake number 1: I hadn't set a channel for all the apps, so it was reacting to sounds and movement but not much more. So I set one. Mistake number 2: not understanding the meaning of the changes in the colour of the eyes - when the eyes go blue, NAO is listening. Now it does what I was after: it can be led by the hand using the Follow Me app and reacts to some vocal commands. The video below shows "Red" in action. I would be interested in others' experiences with these robots; if you would like, please add your comments below. All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Aldebaran NAO 'Red' in Teaching

Photo by John Sinclair. I had my first opportunity today to try an Aldebaran NAO robot as a teaching tool in an AI class. The session was an end-of-term activity, summarising what we had done in the AI class so far and taking questions. A question came up around AI and its impact on society - the perfect opportunity to bring in a social robot, especially as a precursor to a session on social robotics next term.

Playing with Aldebaran NAO

This is just a short post. As well as being able to go to Picademy this week ( http://robotsandphysicalcomputing.blogspot.co.uk/2015/12/picademy-7-8th-december-2015.html ), I have been fortunate to be able to borrow an Aldebaran NAO robot ( https://www.aldebaran.com/en ) for the weekend to play with. This is an extremely cool robot: straight out of the box it tracks movement and balances dynamically. Hopefully, more on this in future posts.