Wednesday, 30 December 2015

Daleks, cameras, and a mutant rabbit.

A little more detail on my experience of Picademy and some of the code developed (and I apologise that it is not well developed).

Programming LEDs and motors, either through the GPIO directly or using a HAT (see the images below), is what I enjoy the most.

To have a go, you may need the following:




The image above shows my attempt at a simple 'Dalek' - essentially a cup and straw with a wheeled motor inside, controlled in Python on the Pi through an Explorer HAT Pro. It moved in a circle either clockwise (button 1 on the Explorer HAT) or anti-clockwise (button 2).


import explorerhat
from time import sleep
from random import randint

# Button 1: spin the 'Dalek' one way for a random 1-2 seconds
def wheel(channel, event):
    duration = randint(1, 2)
    print(duration)
    explorerhat.motor.one.forward(100)   # full speed one way
    sleep(duration)
    explorerhat.motor.one.stop()

# Button 2: spin the other way for a random 1-2 seconds
def wheel2(channel, event):
    duration = randint(1, 2)
    print(duration)
    explorerhat.motor.one.backward(100)  # full speed the other way
    sleep(duration)
    explorerhat.motor.one.stop()

# Attach the handlers to touch pads 1 and 2 on the Explorer HAT
explorerhat.touch.one.pressed(wheel)
explorerhat.touch.two.pressed(wheel2)

It needs a lot more work, not least a moving head under separate motor control, but it is a start.


Playing with the PiCamera and a button attached to the GPIO, I came up with a simple system where an image is captured every time the button is pressed. This was based on the activities and worksheets at Picademy; the extra tweak was providing a different filename each time. Essentially:

  • create a string with most of the filename and path ('/home/pi/Desktop/image');
  • include a count of how many pictures have been taken and convert that to a string (str(count));
  • add the file extension ('.jpg');
  • combine them and use the result as the filename.

      str1 = '/home/pi/Desktop/image' + str(count) + '.jpg'
      camera.capture(str1)


The whole code is shown here.


from time import sleep
from picamera import PiCamera
from gpiozero import Button

camera = PiCamera()
button = Button(17)   # push button wired to GPIO 17
count = 1             # used to give each image a different filename

while True:
    camera.start_preview(alpha=192)   # semi-transparent preview
    button.wait_for_press()           # wait here until the button is pressed
    str1 = '/home/pi/Desktop/image' + str(count) + '.jpg'
    camera.capture(str1)              # save the image with a numbered filename
    count = count + 1
    camera.stop_preview()


Rise of Rabbitsapien - a team of us put together a robot with a rabbit on top (no other soft toys were available) and a passive IR (PIR) sensor in its belly, which carries out a set routine when movement is detected.
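As a rough illustration only (an assumption on my part, not the exact code from the day), that kind of PIR trigger could be handled with gpiozero's MotionSensor; the GPIO pin number and the contents of routine() below are hypothetical placeholders.

from gpiozero import MotionSensor
from time import sleep

pir = MotionSensor(4)   # hypothetical wiring: PIR output on GPIO 4

def routine():
    # placeholder for the set routine (lights, sounds, motor movements, etc.)
    print("Movement detected - running the routine")
    sleep(2)

while True:
    pir.wait_for_motion()      # block until the PIR reports movement
    routine()
    pir.wait_for_no_motion()   # let things settle before re-triggering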




It was also great to come away with so many resources, both physical kit and activities. Thank you to the Pi Foundation for such a good experience.



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Saturday, 26 December 2015

BB-8, Droid I was looking for... - Tynkering

In a recent post, controlling the Sphero BB-8 with the Tickle App was discussed.

Tickle is not the only software option, though; the Tynker App can also control it.


Tynker is also a graphical drag-and-drop programming tool that certain 'toys' can be connected to, though the app itself is primarily about developing programming skills.

The Sphero BB-8 Droid can be connected to Tynker (this is how I did it, anyway) by:

      • Clicking on the Create button on the opening screen;
      • Clicking on Blank Template;
      • Deleting the 'Actor' that is there and clicking on the + button in the top right-hand corner of the screen;
      • Clicking on Connected Toys and selecting the grey ball;
      • On the main screen it should now say 'spherobot' with a code button at the side; click on that button;
      • You should get a screen with some example code that changes the colour shown and then moves in a square - you can replace this with your own code.
Not all of the commands listed down the side will work with the BB-8 - I restricted myself to the ones under Common (the star in the list).

The Tynker app is a nice tool anyway, with lots of games-related activities to try; being able to connect and program certain devices is an added benefit.


If you have comments or experiences with Tynker, the Sphero BB-8 or the Tickle app, please add them below.





All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Friday, 25 December 2015

BB-8, this is the Droid I was looking for...




Previously I have shown the Sphero BB-8 rolling around the room under its own control.

One of the features of the Sphero BB-8 Droid is that it is programmable, not only through its own downloadable software but also through one of my favourite apps - the Tickle App, which has been discussed in previous posts. This app allows control of quite an impressive range of devices using the same interface. Examples, some of which have been discussed previously (e.g. the Parrot Minidrone or Dash and Dot), are shown below.

The app uses a graphical programming interface similar to Scratch or Blockly to produce code. The example here is a very simple one:

  • Spin twice for one second;
  • Move roughly in a square;
  • If the BB-8 collides with something (or is bumped), it spins twice for a second.

OK, it is not the most sophisticated bit of coding, but it does demonstrate how simple it is to control this robot with the app.
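For anyone who prefers text to blocks, here is a purely illustrative Python sketch of the same structure; spin(), roll() and on_collision() are hypothetical stand-ins for the Tickle blocks, not a real Sphero API.

from time import sleep

# hypothetical stand-ins for the Tickle blocks - not a real Sphero API
def spin(times, seconds):
    print("spin", times, "times over", seconds, "second(s)")

def roll(heading, seconds):
    print("roll at heading", heading, "for", seconds, "second(s)")

def on_collision():
    # in Tickle this is attached to the collision/bump event
    spin(2, 1)

# main sequence: spin twice, then move roughly in a square
spin(2, 1)
for heading in (0, 90, 180, 270):
    roll(heading, 1)
    sleep(1)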

The Sphero BB-8 Droid is great fun, and with the head appearing to float over the body and face in the direction of movement, it is hard to resist. The video below shows it in action.





 All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Monday, 21 December 2015

Playing with Aldebaran's NAO - walking and talking.

OK, I need to read the manual! I managed to play with the Aldebaran NAO again today and was struggling to get it to interact - this is the 'should have read the manual' bit; it was all in there.


  • Mistake number 1 - I hadn't set a channel for the apps, so it was reacting to sounds and movement but not much more. So I set one.
  • Mistake number 2 - not understanding the meaning of the changes in eye colour: when the eyes go blue, NAO is listening.


Now it does what I was after - being led by the hand using the Follow Me app and reacting to some vocal commands. The video below shows "Red" in action.



I would be interested in others' experiences with these robots; please add your comments below.


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Thursday, 17 December 2015

Aldebaran NAO 'Red' in Teaching

Photo by John Sinclair

I had my first opportunity today to try an Aldebaran NAO robot as a teaching tool in an AI class. The session was an end-of-term activity summarising what we had done in the class so far, plus questions.

A question came up around AI and its impact on society - a perfect opportunity to bring in a social robot, especially as a precursor to the session on social robotics we will include next term.


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Sunday, 13 December 2015

Playing with Aldebaran NAO

This is just a short post. As well as being able to go to Picademy this week (http://robotsandphysicalcomputing.blogspot.co.uk/2015/12/picademy-7-8th-december-2015.html), I have been fortunate to borrow an Aldebaran NAO robot (https://www.aldebaran.com/en) for the weekend to play with.




This is an extremely cool robot: straight out of the box it tracks movement and dynamically balances. Hopefully there will be more on this in future posts.

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Saturday, 12 December 2015

Experience at Raspberry Pi Cademy 7-8th December 2015

I was fortunate to get a place at Picademy (#picademy) this week. It was a fantastic opportunity and great fun, especially as most of it was about Physical computing.





In the screenshot above I was playing with SonicPi (http://sonic-pi.net/), programming music (or trying to create music, in my case). If you haven't had a go, I thoroughly recommend it. It is great that SonicPi is available on the Mac and PC as well.

Playing with connecting Python and Minecraft is very engaging and fun, but programming LEDs and motors, either through the GPIO directly or using a HAT (see the images below), is what I enjoy the most.


The image above shows my attempt at a simple 'Dalek' - essentially a cup and straw with a wheeled motor inside, controlled in Python on the Pi through an Explorer HAT. It moved in a circle either clockwise or anti-clockwise.

Rise of Rabbitsapien - a team of us put together a robot with a rabbit on top (no other soft toys were available) and a passive IR (PIR) sensor in its belly, which carries out a set routine when movement is detected.




It was also great to come away with so many resources, both physical kit and activities. Thank you to the Pi Foundation for such a good experience.




All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Tuesday, 1 December 2015

Controlling junk with LEGO

Up to this point the junkbot building has largely been about building a moving (or drawing) 'bot' driven by vibration - limited control, but fun. A Nuffield-funded bursary student, Hayden Tetley, has been working with staff from the University of Northampton on whether LEGO 8547: Mindstorms NXT 2.0 or Raspberry Pi based solutions can be incorporated with the bot to add some control of the movement (still by vibration).


Idea One 

The idea is to add a LEGO NXT brick to move a junkbot in a similar way. The motor and broken propeller combination in the earlier junkbots is replaced with the NXT brick and a LEGO motor. A good potential feature is that it is a self-contained unit with power and control together, as well as being potentially fairly simple to set up. This is the focus of this post.

Here are some videos showing idea one in action, using LEGO motors, the brick and the software that comes with LEGO 8547: Mindstorms NXT 2.0:





For more information on how this was done go to: http://legojunkbots.weebly.com/uploads/3/7/2/2/37227791/nuffield_nxt_mindstorms.docx or http://legojunkbots.weebly.com/

Idea Two

This takes a similar approach to idea one but keeps the motor and broken propeller combination, controlling the motors via a Raspberry Pi. This is discussed in another post: http://robotsandphysicalcomputing.blogspot.co.uk/2015/07/raspberry-pi-controlled-robot-from-junk.html

Details of the work will be published on the Junkbots Blog (http://junkbots.blogspot.co.uk/) as the project progresses.




All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Monday, 23 November 2015

5 Fascinating Facts about the Booming Robot Market

An analyst report explains why the robot industry is booming. The robot industry is experiencing a boom period that’s not likely to slow anytime soon. Bank of America Merrill Lynch (BAC) released a report this week that said annual global sales of robots reached a record $10.7 billion in 2014. The authors valued the overall market for robotic technologies, which also includes related software and sensors, at $32 billion for the same year. By 2020, the authors expect the robot market to be worth $83 billion.

To read more go to: http://snip.ly/0eeu#http://fortune.com/2015/11/06/five-fascinating-facts-robotics-market




All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Sunday, 22 November 2015

Robot Software

In the previous blog posts in this 'series', "It is a good time....":


  • Post 1 looked at the hardware underpinning some of this positive rise in robots;
  • Post 2 looked at social robots;
  • Post 3 looked at a collection of small robots;
  • Post 4 looked at further examples of small robots.

Robots such as the forthcoming Buddy and JIBO will be based on some established open source and other technologies. Jibo will be based around various technologies including Electron and JavaScript (for more details see: http://blog.jibo.com/2015/07/29/jibo-making-development-readily-accessible-to-all-developers/). Buddy is expected to be developed around tools for Unity3d, Arduino and OpenCV, and to support Python, C++, C#, Java and JavaScript (for more details see http://www.roboticstrends.com/article/customize_your_buddy_companion_robot_with_this_software_development_kit). This post continues with some of the software being used with the smaller robots.

A number of these robots are programmed via Scratch or Scratch-like environments, for example the OhBot (http://ohbot.weebly.com/) or Crumblebot (http://robotsandphysicalcomputing.blogspot.co.uk/2015/07/edge-following-crumblebot.html). Arduino-based systems, discussed in Post 1, form the basis of a relatively large number of robots. Some other approaches are discussed below.



LeJOS
LeJOS (http://www.lejos.org/index.php) is an alternative way to program the LEGO Mindstorms robotic systems, from the oldest RCX to the latest EV3. It allows the robots to be programmed in Java by putting a small Java virtual machine on the controller/brick.

Some examples of it in use or being discussed can be found at:


A related tool that uses LeJOS as one of its underpinning technologies is Enchanting, a Scratch-like way to program LEGO robots based around Mindstorms NXT and EV3. For more details go to: http://enchanting.robotclub.ab.ca/tiki-index.php



Tickle


Tickle (https://tickleapp.com/en-us/) is one of my favourite physical computing programming tools at the moment. It is designed to program quite a range of devices using a 'Blockly-like' graphical programming approach. The Sphero range of robots and some of the Parrot drones are supported.


When I recently bought a Parrot Rolling Spider mini-drone, I used the Tickle App (https://tickleapp.com/en-us/) to control it. This was the first time I had actually programmed something that flies; the fact that you are controlling something able to move in all directions is very engaging.

On the left is an example I used: essentially lift off, repeatedly move forward and turn, and in the end land.

As well as drones, the Sphero robots can be controlled using Tickle (that is how I first came across it). This includes the entertaining and popular Sphero Star Wars BB-8, which is well worth a play if you get an opportunity. Dash and Dot (see http://robotsandphysicalcomputing.blogspot.co.uk/2015/07/cutest-computational-thinking-in-world.html for more details) are also controllable through Tickle.


A number of other devices are also supported, such as Punch Through Design's Arduino-based LightBlue Bean (https://punchthrough.com/bean-teaser), a Bluetooth Low Energy (BLE) microcontroller - I have yet to play with this one though.

I like the Tickle App because of its ease of use, but mainly for the company's ongoing expansion of the range of devices supported.




Feedback
Please add comments with other software choices.



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. The author does not and cannot take responsibility for any harm caused by the software discussed - if you are unsure, do not use the software.

It is a good time to play with robots

In the previous blog posts in this 'series', "It is a good time....":

  • Post 1 looked at the hardware underpinning some of this positive rise in robots;
  • Post 2 looked at social robots;
  • Post 3 looked at a collection of small robots;

This post continues the small robot theme a bit more, looking at some of the other robots I have been fortunate to be able to play with. The opinions come from a personal point of view of playing with them, but comments are very welcome.

Kilobots

The Kilobots (http://www.k-team.com/mobile-robotics-products/kilobot) were designed as relatively low-cost devices specifically for work on swarm/collective intelligence experiments. They were developed at Harvard University as a scalable system for programming groups of robots (now into the thousands) (http://www.eecs.harvard.edu/ssr/projects/progSA/kilobot.html).



Individually these are quite simple units; they move by vibration. The real advantage of the system, in my opinion, is that you can program lots of them in one go - scalability is therefore not that difficult.

The video below is from a colleague who used these during his MSc work on collective intelligence. To read more on this, go to: http://robotsandphysicalcomputing.blogspot.co.uk/2015/07/narinders-swarm-robots.html





Scratch Robot Arm


It is not physical, but CBiS Education have released a free robot arm simulator for Scratch.

Downloadable from their site here, it includes a Scratch project, guidance on Scratch, exercises in using the robot arm simulation and an exercise with teacher's guidance.

CBiS produce a physical version of this, details are available at  http://www.cbinfosystems.com/cardboard2code_module3.aspx




What I like about this, apart from it being free, is that it is Scratch-based and it does simulate physical problems, such as the need to coordinate multiple parts of the arm to achieve a task.


LEGO
Where do I start with these? LEGO have done a great deal to get a lot of people interested in, and provide a route into, robotics - whether through the 1980s-1990s Technics range or when they released Mindstorms (I can see another post coming here).

My interest has been focused for the last ten years or so on their use in teaching problem-solving and Java programming to undergraduates (read more here). The combination of either the earlier RCX or the NXT range with the incredible LeJOS (http://www.lejos.org/) provides an accessible and (sometimes) easy way to link robots and undergraduate programming in Java.

I will expand on these a little more in a post dedicated to LEGO.


Junkbots

Another shameless plug: Junkbots is a project that started close to eight years ago, concerned with linking computing, engineering and environmental science in activities for use in schools. The core idea was to look at waste and how we could combine waste materials and robotics, either to build a 'bot' out of junk or to use robotics (initially based around LEGO) to clear small junk piles (a few light materials - nuts, bolts). To read more on this project go to http://junkbots.blogspot.com/.

The area that quickly became the focus was building 'bots' out of junk (session plan: http://junkbots.blogspot.co.uk/2015/08/junkbot-session-overview.html). This evolved into building one of these controlled via a Raspberry Pi, which is the idea discussed below.

The card chosen to control the motors was the 4Tronix PiRoCon. It fits straight onto the Pi through the GPIO - no extra cables are needed. ScratchGPIO has it as an add-on, which makes programming it even easier (see http://cymplecy.wordpress.com/2013/10/31/pirocon-from-4tronix/). It is quite easy to plug the board directly onto the GPIO connector of the Raspberry Pi (4tronix provide some advice on mounting the board in section 15 of http://4tronix.co.uk/blog/?p=22). The only other change I needed to make was to the jumper settings next to the Vin connector (see http://4tronix.co.uk/blog/?p=41 for the layout), because I wasn't powering the motors through the DC input.

Now for the fun bit: getting the whole thing to draw (see Figure 1 and the video at the end)!

The junkbot itself is made up of a drinks can, three supports (LEGO was used here, but they could equally be straws or sticks), a pen/pencil, and a motor and broken propeller combination to create an unbalanced motor.

With the Raspberry Pi off, the motor's wires are connected to the controller card at the MotorA connections, and the battery is also connected. Turn the Pi on and run ScratchGPIO5plus.


Figure 2
Figure 3
Figure 4

The first task is to make the variables AddOn (which will be used to tell the program we are using the PiRoCon card) and MotorA for the motor (see Figure 3).

In Figure 4 the program can be seen: essentially the left and right keys spin the junkbot clockwise or anticlockwise by setting MotorA to either positive or negative values from 0 to 100. The space bar is used to stop the motor.
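For comparison, and as an assumption on my part rather than the code used here, roughly the same keyboard control could be written in Python with gpiozero's Motor class; the GPIO pin numbers below are hypothetical and depend on how the PiRoCon maps its MotorA output.

from gpiozero import Motor

# hypothetical pins for the MotorA output - check the PiRoCon documentation
# for the GPIO numbers the board actually uses
motor_a = Motor(forward=10, backward=9)

print("Commands: l = spin one way, r = spin the other way, s = stop, q = quit")
while True:
    command = input("> ").strip().lower()
    if command == "l":
        motor_a.forward(1.0)    # full speed, like setting MotorA to 100
    elif command == "r":
        motor_a.backward(1.0)   # full speed the other way (MotorA = -100)
    elif command == "s":
        motor_a.stop()
    elif command == "q":
        motor_a.stop()
        break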

As it moves it draws, because one of the supports is a pen. See the video below to watch it draw a squiggly line - control is still a challenge.

 
The bot was developed by Hayden Tetley and Scott Turner. Hayden's time was paid  for through the Nuffield Research Placements  Scheme (http://www.nuffieldfoundation.org/nuffield-research-placements).

Related Link

 




In the next post in the series I want to look at the software a bit more.




All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Wednesday, 18 November 2015

Impact of research

A recently released Kickstarter project website, http://www.robotixedu.com/phiro.aspx, has quoted research from the University of Northampton. This is an interesting product designed to teach children programming. In essence, programming robots is a good way to develop problem-solving skills.



The publication mentioned can be found at:



  • Robots in problem-solving and programming (Scott J Turner, Gary Hill), In Proceedings of 8th Annual Conference of the Subject Centre for Information and Computer Sciences, Higher Education Academy Information and Computer Sciences Centre, Ulster, pp. 82--85, 2007. [paper]

  • With related example papers:

      • Problems first second and third (Gary Hill, Scott J Turner), In International Journal of Quality Assurance in Engineering and Technology Education (IJQAETE), volume 3, pp. 88--109, 2014. [paper]
      • Robotics within the teaching of problem-solving (Scott J Turner, Gary Hill), In ITALICS, volume 7, pp. 108--119, 2008.[paper]

To read more about the research by the team in the area of robots for developing problem-solving skills go to:

http://compuationalthinking.blogspot.co.uk/2015/03/problem-solving-publications.html


If you'd like to find out more about Computing at the University of Northampton go to: www.computing.northampton.ac.uk. All views and opinions are the author's and do not necessarily reflect those of any organisation they are associated with, nor do they endorse the product.


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Sunday, 15 November 2015

Robotics within the Teaching of Problem-Solving

Robotics within the teaching of Problem-Solving


Volume/Issue: Vol 7, Issue 1

Date: Sunday, 1 June 2008

Journal Name: ITALICS

Author(s)

Scott Turner
Gary Hill

Abstract
This paper considers the experiences of teaching on a module where problem-solving is taught first, then programming. The main tools for the problem-solving part, alongside two problem-solving approaches, are tasks using Mindstorm (LEGO, Denmark) robot kits. This is being done as a foundation step before the syntax of a language (Java) is taught to enable a Graphical User Interface (GUI) emulation of a previous robot problem. Results of student evaluation and feedback will be presented and the use of two simulators will be considered.

Full paper available at: https://www.heacademy.ac.uk/robotics-within-teaching-problem-solving or PDF version https://www.heacademy.ac.uk/sites/default/files/ital.7.1h.pdf

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.
