
Daleks, cameras, and a mutant rabbit.

A little more detail on my experience of PiCademy and some of the code developed (and I apologise that it is not well developed).

Programming LEDs and motors, either directly through the GPIO or using a HAT (see the images below), is the part I enjoy the most.
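As a minimal sketch of the GPIO side (not code from the course, and assuming an LED and resistor wired to GPIO 18), gpiozero makes blinking an LED only a few lines of Python:

# A minimal sketch: assumes an LED (with a suitable resistor) wired to GPIO 18.
from gpiozero import LED
from time import sleep

led = LED(18)

while True:
    led.on()     # LED on for a second
    sleep(1)
    led.off()    # LED off for a second
    sleep(1)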

To have a go, you will need the following:




The image above shows my attempt at a simple 'Dalek' - essentially a cup and straw with a wheeled motor inside, controlled in Python on the Pi through an Explorer HAT Pro. It moved in a circle either clockwise (button 1 on the Explorer HAT) or anti-clockwise (button 2).


import explorerhat
from time import sleep
from random import randint

# Button 1: drive motor one forwards for a random 1-2 seconds (clockwise circle)
def wheel(channel, event):
    duration = randint(1, 2)
    print(duration)
    explorerhat.motor.one.forward(100)
    sleep(duration)
    explorerhat.motor.one.stop()

# Button 2: drive motor one backwards for a random 1-2 seconds (anti-clockwise circle)
def wheel2(channel, event):
    duration = randint(1, 2)
    print(duration)
    explorerhat.motor.one.backward(100)
    sleep(duration)
    explorerhat.motor.one.stop()

explorerhat.touch.one.pressed(wheel)
explorerhat.touch.two.pressed(wheel2)
# (if run as a standalone script, you may need to keep it alive afterwards,
# for example with signal.pause())

It needs a lot more work, not least a moving head under separate motor control, but it is a start.
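As a very rough sketch of how the head might work (an assumption rather than anything built yet - it supposes a second motor on the Explorer HAT's motor.two channel, with button 3 turning the head), the same pattern extends naturally:

# A sketch only: assumes the head sits on the Explorer HAT's second motor channel.
import explorerhat
from time import sleep

def turn_head(channel, event):
    explorerhat.motor.two.forward(50)    # turn the head at half speed
    sleep(0.5)
    explorerhat.motor.two.stop()

explorerhat.touch.three.pressed(turn_head)   # button 3 turns the head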


Playing with the PiCamera and a button attached to the GPIO, I came up with a simple system in which an image is captured every time the button is pressed. This was based on the activities and worksheets at PiCademy; the extra tweak was providing a different filename each time. Essentially:

  • create a string with most of the filename and path ('/home/pi/Desktop/image');
  • include a count of how many pictures have been taken and convert that to a string (str(count));
  • add the file extension ('.jpg');
  • combine them and use the result as the filename.

      str1 = '/home/pi/Desktop/image' + str(count) + '.jpg'
      camera.capture(str1)


The whole code is shown here.


from time import sleep
from picamera import PiCamera
from gpiozero import Button

camera = PiCamera()
button = Button(17)   # push button wired to GPIO 17
count = 1             # used to give each image a different filename

while True:
    camera.start_preview(alpha=192)   # show a semi-transparent preview
    button.wait_for_press()           # wait for the button press
    str1 = '/home/pi/Desktop/image' + str(count) + '.jpg'
    camera.capture(str1)              # save the image with a numbered filename
    count = count + 1
    camera.stop_preview()


Rise of the Rabbitsapien - a team of us put together a robot topped with a rabbit (no other soft toys were available) and a passive infrared (PIR) sensor in its belly, which carries out a set routine when movement is detected.
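Our routine was more involved, but a simplified sketch of the idea (not the team's exact code - it assumes the PIR sensor is wired to GPIO 4 and stands in a flashing LED on GPIO 18 for the full routine) looks like this with gpiozero:

# A simplified sketch, not the team's exact code: assumes a PIR sensor on GPIO 4
# and an LED on GPIO 18 standing in for the full routine.
from gpiozero import MotionSensor, LED
from time import sleep
from signal import pause

pir = MotionSensor(4)
led = LED(18)

def routine():
    print("Movement detected - running the rabbit's routine")
    for _ in range(3):
        led.on()
        sleep(0.5)
        led.off()
        sleep(0.5)

pir.when_motion = routine   # run the routine whenever movement is detected

pause()                     # keep the script running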




It was also great to come away with so many resources, both physical kit and activities. Thank you to the Pi Foundation for such a good experience.



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

