Tuesday 19 April 2016

Minecraft, jam and history in the making

Taken from: http://www.northampton.ac.uk/news/minecraft-jam-and-history-in-the-making/


History was made on Saturday as the University of Northampton hosted Northamptonshire’s first-ever Raspberry Jam.
Raspberry Jams see those with an interest in the affordable – and tiny – Raspberry Pi computer get together to share knowledge, learn new things and meet other enthusiasts.
More than 30 people of all ages attended the county’s inaugural Jam at Avenue Campus, which was organised by the University’s Associate Professor in Computing and Immersive Technologies, Dr Scott Turner.
He said: “The Jam was a real success, with a wide mixture of people including fairly notable experts; those who have a Pi, but aren’t quite sure what to do with it and complete novices.
“It was great to see people who had some sort of Pi-related query have their questions answered, and others showing off what they have managed to get their Pi to do.
“It really helped to inspire the novices to get more involved in the Raspberry Pi, which will ultimately help them develop their coding skills.”
Computing and Science teacher Steve Foster, from Wollaston School, led a session on the popular Minecraft game, and was ably assisted by five of his Year 10 pupils.
One of the pupils, Ellie, said: “One of the groups had a problem with their coding and I managed to solve it for them. I love the challenges a Raspberry Pi can give you, and when you are able to solve the problem it’s really cool.”
The University is committed to making a positive social impact on the people of Northamptonshire and has set itself four ambitious challenges to meet by 2020.
One of these ‘Changemaker+ Challenges’ is to make Northamptonshire the best county in the UK for children and young people to flourish and learn – something the Raspberry Jam has contributed to.


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Tuesday 12 April 2016

Dancing bot on a Microbit

In an earlier post on using the micro:bit (Playing with the micro:bit emulator - dancing bot) a simple dancing robot image was created using the 5x5 grid.
A dancing bot - a 3x3 box for the body, with two legs.

Thanks to the loan of a micro:bit from Lancaster University I can now experiment with an actual micro:bit.











Experiment 1 - Using the buttons

So the functions for the idea were (a rough MicroPython sketch follows the list):


  • Button A - Move to the left and then back to the starting position;
  • Button B - Move to the right and then back to the starting position;
  • Buttons A+B - Jump up and then back to the starting position;
  • Shake - 'Crouches' and then back to the starting position
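
Roughly the same behaviour can be sketched in MicroPython; the bot image, the shift-based animation and the timings below are my assumptions, not the original program.

# Rough MicroPython sketch of Experiment 1 (assumed image and timings,
# not the original code).
from microbit import *

# 3x3 body on the top three rows, two legs below
BOT = Image("09990:09990:09990:09090:00000")

def move_and_return(frame):
    # show the moved frame briefly, then go back to the starting position
    display.show(frame)
    sleep(300)
    display.show(BOT)

display.show(BOT)
while True:
    if button_a.is_pressed() and button_b.is_pressed():
        move_and_return(BOT.shift_up(1))     # A+B: jump up and back
    elif button_a.is_pressed():
        move_and_return(BOT.shift_left(1))   # A: move left and back
    elif button_b.is_pressed():
        move_and_return(BOT.shift_right(1))  # B: move right and back
    elif accelerometer.was_gesture("shake"):
        move_and_return(BOT.shift_down(1))   # shake: 'crouch' and back
    sleep(50)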



On the micro:bit









Experiment 2 - Adding left and right tilting to it.
So if the micro:bit is tilted to the right the 'bot' moves to the right, and the same for the left.

The tilting operation here is essentially: when the x value from the accelerometer is less than zero, move the 'bot' to the left, and when it is greater than zero, move it to the right.
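
A minimal MicroPython sketch of that tilt behaviour might look like the following (the bot image, the simple zero threshold and the polling rate are assumptions):

# Rough sketch of Experiment 2: move the bot with left/right tilt.
from microbit import *

BOT = Image("09990:09990:09990:09090:00000")

while True:
    x = accelerometer.get_x()   # negative when tilted left, positive when tilted right
    if x < 0:
        display.show(BOT.shift_left(1))
    elif x > 0:
        display.show(BOT.shift_right(1))
    else:
        display.show(BOT)
    sleep(100)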

Video showing it in action





All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Monday 11 April 2016

Pimoroni Flotilla - first play with Python



The Mega Treasure Chest Flotilla set from Pimoroni was a Kickstarter project that got a lot of people interested. It is a nice package - a hub for a collection of devices such as light sensors, a barometer, a temperature sensor, switches, motors and many more, all linked to a Raspberry Pi. The kit is shown in the image to the left.



A Python API exists for this system. Instructions on how to set up the Flotilla to work with Python can be found at http://flotil.la/start/ .

I wanted to play with switching the Rainbow (a set of RGB LEDs) outputs to red, blue or green by pressing 2, 3 or 4 on the Touch sensor, as in the images.





Using the mini-kit example from https://github.com/pimoroni/flotilla-python/blob/master/examples/mini-kit.py as the basis, I produced a simple system that uses the Touch module and its buttons 2, 3 and 4 to change the Rainbow; the code is shown below and was run on a Raspberry Pi 3 using Python 3.


import colorsys
import flotilla
import time

# Ask the Flotilla client for the two modules this example needs
client = flotilla.Client(
    requires={
        'one': flotilla.Rainbow,
        'two': flotilla.Touch
    })

# Carried over from the mini-kit example; not registered or used below
def module_changed(channel, module):
    rainbow = client.first(flotilla.Rainbow)
    if module.is_a(flotilla.Touch):
        if module.one:
            rainbow.set_pixel(0, 255, 0).update()
        else:
            rainbow.set_pixel(0, 0, 0).update()

# Wait until the hub has found the required modules
while not client.ready:
    pass

touch = client.first(flotilla.Touch)
rainbow = client.first(flotilla.Rainbow)
hue = 0
lights_on = True

try:
    while True:
        if touch.one:
            lights_on = not lights_on
        if touch.two:
            rainbow.set_all(255, 0, 0).update()   # button 2: red
        if touch.three:
            rainbow.set_all(0, 255, 0).update()   # button 3: green
        if touch.four:
            rainbow.set_all(0, 0, 255).update()   # button 4: blue

        time.sleep(0.5)

except KeyboardInterrupt:
    client.stop()


The video below shows the system in action.




I look forward to playing with it a bit more and I would love to hear what others are doing with the Flotilla.


Another example of Python and Flotilla in action can be seen at

 http://home.uktechreviews.com/Raspberry/Pi%20blog/files/flotilla1.html 


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Saturday 9 April 2016

Experiences with Raspberry Pi Touch Screen

Well, for once, I am not going to be talking about robots but about my experience of setting up a Raspberry Pi touch screen. This is not a how-to guide (a couple of links to those are included in the post), just my experience of setting one up.

So I bought the Raspberry Pi Touch Screen from Pimoroni, along with the stand/frame for it.

Setting up the LCD frame was simple with the instructions provided, and the link at the end of the instructions provides some further help on setting up the screen: http://learn.pimoroni.com/tutorial/pi-lcd/getting-started-with-raspberry-pi-7-touchscreen-lcd

A tutorial from The Pi Hut (https://thepihut.com/blogs/raspberry-pi-tutorials/45295044-raspberry-pi-7-touch-screen-assembly-guide) was very useful on how to connect the screen to the Pi. The white ribbon cable (provided with the screen) that connects the two needs its blue side facing down towards the LCD (as explained in the Pi Hut tutorial) and facing away from the Pi when connecting to it - this was the bit I was unsure of. Connecting the power via the jumper leads that came with the screen means only a single power supply is needed for both the screen and the Pi. If you follow the Pi Hut tutorial the colours of the jumper leads may need to be changed, but that is not a problem.

I used the Raspbian OS, and when powered up with a suitable power lead (it needs to output enough current to power both devices) both the Pi and the screen came on without any difficulty.

The touchscreen worked well as a mouse and the single power lead makes the system much more compact.


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Thursday 31 March 2016

mBot - cute, fun and Arduino based

I have had an opportunity (i.e. the time) to play with the mBot, a Scratch-programmable robot, using the mBlock software, which appears to be a modified version of Scratch - so it is relatively easy to use. They have added a section of blocks to the standard set, marked Robots, containing blocks for both Arduino and the mBot.

An earlier blog post (mbots - graphical programming and Arduino) discusses some of the basics of the robot. Just for fun I wanted to play with the ultrasonic sensor, getting the robot to react if there is an object in the way: change direction (run away) and change the 'face' on the LED matrix that came with the robot.



The routine (a rough Python sketch follows the list):
  • Loop
    • Show a smiley face (using Port 4 )
    • If the ultrasonic detector senses something close (guessed at a setting of 10)
      • Go backwards quickly
      • Play a tone
      • Show an upside-down smiley face 
      • wait 1 sec
    • Otherwise
      • Move forward
  • End the loop
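
The routine itself is built from mBlock's graphical blocks, but the control flow roughly corresponds to the Python sketch below. Every function in it (distance_cm, drive, show_face, play_tone) is a hypothetical stand-in for the matching block - there is no real mBot Python API being used here - so the stubs just print what they would do.

# Python sketch of the mBlock routine; the helpers are hypothetical
# stand-ins for the graphical blocks, stubbed out so the flow can be run.
import random
import time

THRESHOLD = 10  # the guessed 'close' setting used with the ultrasonic block

def distance_cm():
    # stand-in for the ultrasonic sensor block (random value for the demo)
    return random.randint(0, 50)

def show_face(name):
    print("LED matrix (Port 4):", name)

def drive(speed):
    print("drive at speed", speed)

def play_tone():
    print("beep")

while True:
    show_face("smiley")
    if distance_cm() < THRESHOLD:        # something close in front
        drive(-255)                      # go backwards quickly
        play_tone()                      # play a tone
        show_face("upside-down smiley")  # change the 'face'
        time.sleep(1)                    # wait 1 second
    else:
        drive(100)                       # otherwise move forward
    time.sleep(0.1)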


Download the code to the mBot using the Upload to Arduino button (see below). Here is where you find out whether you have set the system up correctly. In the mBlock editor's pull-down menu choose Connect and select the required connection; I have been using a USB cable, so I needed to select the serial option and then the USB connection. After that, using the Upload to Arduino button did lead to the code downloading.






The video below shows it in action






This is good fun, and very cute. The build quality of the bots (not my building ability) and the metallic construction mean the bots feel substantial. As the software is Scratch-based, I think it will be interesting to try it out with my Code Clubbers - especially as they have been asking to play with more robots.

As always I would be interested to hear from others on their experiences of using this little robot.

Related links
mbots - graphical programming and Arduino






All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Monday 28 March 2016

Playing with the micro:bit Emulator - Dancing bot part 2

In an earlier post on using the micro:bit (Playing with the micro:bit emulator - dancing bot) a simple dancing robot image was created using the 5x5 grid. In this post a modified version uses events to do pretty much the same thing (except the two-button action): a dancing bot - a 3x3 box for the body, with two legs.













So the functions for the idea were:

  • Button A - make it bob up and down;
  • Button B - make it move to the left and right;
  • On shake - make it jump up and down.








All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Playing with the micro:bit Emulator - Dancing bot

The micro:bit (https://www.microbit.co.uk/about) has been in the press quite a bit recently - rightly so. 

At the moment I have not got my hands on an actual one, though someone has generously offered to lend me one to play with (more in later posts on that). Luckily you do not need the device to start playing; the code editor (https://www.microbit.co.uk/create-code) has an emulator built in, so you can start playing straight away. This post reports on my first go.

So I am starting by experimenting with the editor, using Microsoft's Block Editor, which has an interface similar to Blockly and Scratch.





What I built is simple, but that is fine - a dancing bot: a 3x3 box for the body, with two legs. I wanted to play with getting some interaction that uses the buttons (A and B) and shaking. So the functions for the idea were (a rough MicroPython sketch follows the list):


  • Button A - make it bob up and down;
  • Button B - makes it move to the left and right;
  • Shaking it - makes it appear to jump and land (see below);
  • Buttons A+B together make it do one of the moves: up, down, left or right.
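
The program was built in the Block Editor, but a rough MicroPython equivalent is sketched below; the bot image, the frame timings and the use of random.choice for the A+B move are my assumptions, not the original blocks.

# Rough MicroPython equivalent of the Block Editor dancing bot.
from microbit import *
import random

BOT = Image("09990:09990:09990:09090:00000")
MOVES = [BOT.shift_up(1), BOT.shift_down(1), BOT.shift_left(1), BOT.shift_right(1)]

def dance(frames):
    # play the frames, then return to the resting pose
    for frame in frames:
        display.show(frame)
        sleep(200)
    display.show(BOT)

display.show(BOT)
while True:
    if button_a.is_pressed() and button_b.is_pressed():
        dance([random.choice(MOVES)])                   # A+B: one random move
    elif button_a.is_pressed():
        dance([BOT.shift_up(1), BOT.shift_down(1)])     # A: bob up and down
    elif button_b.is_pressed():
        dance([BOT.shift_left(1), BOT.shift_right(1)])  # B: left and right
    elif accelerometer.was_gesture("shake"):
        dance([BOT.shift_up(2), BOT.shift_down(1)])     # shake: jump and land
    sleep(50)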




The video below shows it in action.




The other nice thing is that you can convert it to the Microsoft Touch Develop version (see below) with a click.



I really enjoyed playing with this; the next stage is to try it on a physical micro:bit.

I would be interested to hear from others about what they have been doing with this (including just the emulator).


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Friday 25 March 2016

Ozobot in Code Club

Earlier this week (21st March 2016) the Ozobot Bits were used as an extra activity at a Code Club - they went down very well with the children.

Using OzoBlockly (http://ozoblockly.com/editor) they just played with making a short routine on a PC and downloading it to the Ozobot.





Some very anecdotal observations:
- It would probably be better running this on a tablet rather than a PC. There is nothing wrong with the software, but holding a bot against a screen, even one as light as an Ozobot, gets a bit tiring. If the screen was flat on the desk there wouldn't be a need to hold it - obvious with hindsight.
- Though movement was a big attraction for the children, the flashing light patterns seemed to be, for the groups who worked on them, an even bigger attraction.
- The transition from Scratch to Blockly was relatively smooth.

They all asked whether the next session could be solely about using robots.

As the author of this blog I am clearly positively biased towards physical computing - but the engagement, even from the more reticent children, was quite high with these.




All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Sunday 20 March 2016

Playing with Ozobot Bit and Blockly

Previously I discussed the older Ozobots, which were programmed by colours on paper or made to dance via an app. The newer Ozobot Bit can still be used in these ways, but it can also be programmed using a Blockly-based web interface at http://ozoblockly.com/editor, so it is now programmable on tablets, but also PCs and Macs - essentially anything that can run the webpage.



First stage is the calibration of the 'bot':
1. Hold the power button on the Ozobot until it starts flashing with a white light.
2. Move it onto the white space on the screen (which is about the size of the bottom of the Ozobot); the wheels might start moving, but as you get closer to the space they should stop.
3. The Ozobot should start flashing green - that is OK; continue holding the Ozobot to the screen until it stops flashing green. If it flashes red, start again.





Build your code blocks and, when you are ready, press the power button on the Ozobot. Hold the Ozobot against the white space again and, holding it there, press the load button. The white space will flash different colours, while the Ozobot flashes green as it is being programmed.



Now, to run the routine, press the power button twice. I have added a two-second delay in the code so I can get my hands out of the way before the bots start moving. The video below shows two of them going through this routine twice:
- 2-second delay;
- zigzag a little;
- flash the lights through the rainbow colours;
- move in an arc;
- flash the lights a bit like fireworks going off.



It is relatively easy and fun to program these, but you may have to hold the Ozobot to the screen for a while if you use a Mac or PC. They are cute, and the flashing LEDs offer some interesting effects like the 'fireworks'.


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Saturday 19 March 2016

Ramblings about Social Robotics in Schools

Sometimes what I do as a job can have some major personal pluses (I get to play with robots some of the time). One of these has been the opportunity to introduce people to social robots, and recently I have been lucky enough to manage to do this four times - twice with my own computing students, but also with groups of primary school children at two events (see below).

Apart from it being something I enjoy doing: the social robots we are starting to see are great, but there is so much more that could be done. Who is going to develop this - possibly one of these children? Why not? It has taken nearly 40 years to get from R2D2 on the screen to some of the social robots we are seeing launched now; in another 40 years we might have something as bright as R2D2 (R2D2 was always brighter than C3PO). Why wouldn't one or more of these bright children, or one of the students I teach, be the ones to contribute to this? They have the enthusiasm, and with the changes in the National Curriculum in the UK they are developing some of the skills and asking the questions. Look at the work being done by the Raspberry Pi Foundation, the CamJam EduKit 3 robot kit (http://camjam.me/?page_id=1035) and especially products such as the OhBot (see the bottom of the post for details of this robot) as just a few examples of how this is being developed.



Event 1.

At an in-reach STEAM activity day I had the opportunity to show off two NAO robots in action to a group of 8-year-olds, as well as giving a short presentation on social robots (see below). By the way, Red and Smurf are the nicknames of the two robots.









Event 2. 

A talk on Social Robotics (with a little help from a Red friend) to an audience of primary school children as part of Lab_13's Lectures at Wollaston School, Northamptonshire. 

Red performed, walking with three of the children, and the presentation included discussion of the robots JIBO and Buddy, which are expected this year.



Changes needed to the presentation

A change I would like to make is to bring along an OhBot (a bit like the one in the video below), as well as including the OhBot in the revised presentation slides.

 



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Thursday 17 March 2016

ozobot - cute, fun and colour-mad

I have had these for a while - a couple of Ozobots, small little robots that react to colours on the ground. Below are figures showing the Ozobots working with the OzoGroove app, which allows the bots to perform dance routines; out of the box, they are set up to run around a track with coloured blocks that cause the bots to change what they do.
Figure 1
The interesting thing about these is the way a routine is transferred from the app to the bot - by flashing light. In Figure 1 the two Ozobots are midway through being programmed with a dance routine; Figure 2 shows one of the bots during the dance.


Figure 2
The latest version of these, the Ozobot Bit 2.0, can be programmed via a Blockly-based website, with light being used to transfer the routine. I had a play with one a couple of days ago and liked it so much I ordered a pair from Amazon!

As an aside, when I first saw the light-transfer method it reminded me of some of the early British TV programmes on computing in the 1980s, which used a system where a flashing cursor on the screen was used to send data to be recorded onto cassette or into the computer.

I would be interested in finding out what else has been or can be done with these interesting little robots. If you have any ideas or experience with them please feel free to add comments.



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Sunday 13 March 2016

Blog stats - March 2016

The blog has been going for less than a year (the first post was on 15th July 2015), but I thought some statistics about the blog so far might be of interest.

The Top Ten posts based on page views:


716, 401, 395, 385 (23 Jul 2015), 323 (13 Aug 2015), 318, 301, 300, 299 and 286 page views (the post titles, shown as links in the original, are not preserved here).

So: small robots and the Raspberry Pi - which I was hoping for, as they are my interests as well.



The audience based on page views is an interesting mix, and I am not really sure what to make of it, apart from hoping they are all finding the blog useful.


Pageviews by Countries
United States - 8650
Slovakia - 1664
United Kingdom - 1390
France - 394
Germany - 305
Ireland - 96
Russia - 80
Singapore - 73
Portugal - 72
Sweden - 43


Comments and ideas are welcome; I would love to find out what others are doing.



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.
