
Sunday, 26 February 2017

Ubtech Alpha 2 is alive and dancing

Alpha 2 is the next stage on from the more readily available Ubtech Alpha 1S, funded through a recent crowdfunding project. At the moment all I have done is set it up with an iPad and try out a few voice commands. It does dance on the slightly creepy command "Dance for me"; personally, I am not sure about it calling me "master".

So far it looks good and there is plenty to investigate. 





All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon

Saturday, 19 March 2016

Ramblings about Social Robotics in Schools

Sometimes what I do as a job has some major personal pluses (I get to play with robots some of the time). One of these has been the opportunity to introduce people to social robots, and recently I have been lucky enough to manage this four times - twice with my own computing students, but also with groups of primary school children at two events (see below).

Apart from it being something I enjoy doing, the social robots we are starting to see are great, but there is so much more that could be done. Who is going to develop this - possibly one of these children? Why not? It has taken nearly 40 years to get from R2D2 on the screen to some of the social robots being launched now; in another 40 years we might have something as bright as R2D2 (R2D2 was always brighter than C3PO). Why shouldn't one or more of these bright children, or one of the students I teach, be the ones to contribute to this? They have the enthusiasm, and with the changes in the National Curriculum in the UK they are developing some of the skills and asking the questions. Look at the work being done by the Pi Foundation, the CamJam EduKit 3 robot kit (http://camjam.me/?page_id=1035) and especially products such as the OhBot (see the bottom of the post for details of this robot) as just a few examples of how this is developing.



Event 1.

At an in-reach STEAM activity day I had the opportunity to show off two NAO robots in action to a group of 8-year-olds, as well as giving a short presentation on social robots (see below). By the way, Red and Smurf are the nicknames of the two robots.









Event 2. 

A talk on Social Robotics (with a little help from a Red friend) to an audience of primary school children as part of Lab_13's Lectures at Wollaston School, Northamptonshire. 

Red performed, walking with three of the children, and the presentation included discussion of the robots JIBO and Buddy, which are expected this year.



Changes needed to the presentation

A change I would like to make is to bring along an OhBot (a bit like the one in the video below), as well as to include the OhBot in the revised presentation slides.

 



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Thursday, 3 March 2016

NAO, chatbots, teaching and just plain showing off

I managed to do something I have wanted to do for a long time: show off social robots (I have now had three goes at it), thanks to the recent purchase of NAO robots by the University of Northampton.

Session 1
At an in-reach STEAM activity day I had the opportunity to show the robots in action to a group of 8-year-olds, as well as giving a short presentation on social robots.








I would be interested to hear how others are using NAOs via comments.



Session 2 and 3
I have managed to include a physical example of social robots in my teaching. The aim of the session was to teach about social AI: using social cues and, to a certain extent, natural language (through chatbots) for us to communicate with machines.

The robots were used as an example of a social robot: something we want to play or work with without having to go through a steep learning curve on how to use it. Students were encouraged to consider why this is, and how anthropomorphisation plays a part (NAO basically has some of the characteristics of a small child). The fact that it responds to voice commands, its looks, its childlike voice, that it is always moving (even slightly when standing) and the way it moves were all spotted by the group as ways it attracts us to it; it is really hard not to talk to it like a child sometimes (but perhaps that is just me).




This session also included the use of chatbots (one example used, ALICE, is shown here) and AIML, the Artificial Intelligence Markup Language (a link to more about AIML is included below). Just as a bit of background, chatbots (also variously called chatterbots, conversational agents, etc.) are programs that hold a conversation with us through either text or speech. The chatbots were used to show how we can create intelligent-like behaviour by, in effect, providing canned responses to questions, and then how we can take this further by using the responses people give while using the chatbot to 'fine-tune' the model.
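To make the pattern-to-response idea concrete, here is a minimal sketch in Python. It assumes the third-party PyAIML package (installable as python-aiml), which is just one of several ways to run AIML; the two example categories and the file path are made up for illustration, not taken from the session.

```python
# A minimal sketch of the pattern -> response idea behind AIML chatbots,
# assuming the third-party PyAIML package (pip install python-aiml).
import os
import tempfile

import aiml

# Each <category> maps an input <pattern> to a <template> response.
# These two categories are made-up examples for illustration.
AIML_SOURCE = """<?xml version="1.0" encoding="UTF-8"?>
<aiml version="1.0.1">
  <category>
    <pattern>HELLO</pattern>
    <template>Hello! I am a very small chatbot.</template>
  </category>
  <category>
    <pattern>WHAT ARE YOU</pattern>
    <template>I am a demonstration of AIML pattern matching.</template>
  </category>
</aiml>
"""

# Write the AIML to a temporary file so the kernel can load it.
path = os.path.join(tempfile.gettempdir(), "demo.aiml")
with open(path, "w") as f:
    f.write(AIML_SOURCE)

kernel = aiml.Kernel()   # the AIML interpreter
kernel.learn(path)       # load the categories above

# The 'intelligent-like' behaviour is simply matching input against patterns.
print(kernel.respond("hello"))
print(kernel.respond("what are you"))
```

In this simple setting, 'fine-tuning' just means adding or editing categories based on what people actually type at the chatbot.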



To read more about NAO robots go to https://www.aldebaran.com/en
To read more on AIML go to http://www.alicebot.org/aiml.html

Example chatbots

If you would like to create your own chatbot, personally I think one of the easiest ways to start is through https://playground.pandorabots.com/en/quickstart/


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Monday, 21 December 2015

Playing with Aldebaran's NAO - walking and talking.

OK, I need to read the manual! I managed to play with the Aldebaran NAO again today and was struggling to get it to interact - this is the 'should have read the manual' bit; it was all in there.


  • Mistake number 1 - I hadn't set a channel for all the apps, so it was reacting to sounds and movement but not much more. So I set one.
  • Mistake number 2 - not understanding the meaning of the changes in the colour of the eyes; when the eyes go blue, NAO is listening.


Now it does what I was after: being led by the hand using the follow-me app and reacting to some vocal commands. The video below shows "Red" in action.



I would be interested in others' experiences with these robots; if you would like, please add your comments below.


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Thursday, 17 December 2015

Aldebaran NAO 'Red' in Teaching

Photo by John Sinclair

I had my first opportunity today to try an Aldebaran NAO robot as a teaching tool in an AI class. The session was an end-of-term activity summarising what we have covered in the AI class so far, plus questions.

A question came up around AI and its impact on society - a perfect opportunity to bring in a social robot, especially as a precursor to the session on social robotics we will include next term.


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Sunday, 13 December 2015

Playing with Aldebaran NAO

This is just a short post. As well as being able to go to Picademy this week (http://robotsandphysicalcomputing.blogspot.co.uk/2015/12/picademy-7-8th-december-2015.html), I have been fortunate enough to borrow an Aldebaran NAO robot (https://www.aldebaran.com/en) for the weekend to play with.




This is an extremely cool robot: straight out of the box it tracks movement and balances dynamically. Hopefully, more on this in future posts.

All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Sunday, 25 October 2015

It is a good time to play with Social Robots

Social robotics has been a research area in universities for a while, looking into interfaces with robots that are based around our social cues, or modelling social cues to understand neurodiversity such as autism. Great work by companies such as Aldebaran Robotics (https://www.aldebaran.com/en) with their NAO and Pepper robots has raised the profile of social robotics.



People like Cynthia Breazeal are leading on this:



What I find most exciting is that these robots are now coming into the home.





OhBot
At the entry level in terms of price, and very well featured, is the OhBot (http://ohbot.weebly.com/). This is a kit for a robot head with a Scratch-like interface that offers face detection and, in the current version, some speech recognition, controlling several servos to produce facial movement. It has provided hours of fun so far (see the video below). This is a great bit of kit for its price.




Jibo
Jibo has been developed by a company headed by Cynthia Breazeal. It is not yet released (expected at the end of 2015/beginning of 2016), but the videos make it look very interesting. It is a stationary robot that seems to be about providing a social interface to many of the things we do.







Buddy
A robot soon to be released by Blue Frog Robotics (http://www.bluefrogrobotics.com/buddy-your-companion-robot/). This is an incredibly cute robot.




Related links
It is a good time: 1 Introduction


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Thursday, 13 August 2015

OhBot Experiment

As a bit of fun, and an excuse to play with the OhBot, I wondered whether I could get it to produce an introduction to a module when "hello" is said by the user.


Features

  • Move randomly with small movements.
  • When the word "hello" is spoken, start speaking (or appear to).
  • Go to a standard starting point initially.
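I built this with the OhBot's Scratch-like blocks, but for anyone who prefers to read the logic as code, here is a rough Python sketch of the same behaviour. The helper functions (move_servo, heard_hello, speak) are hypothetical placeholders standing in for the OhBot software's blocks, not a real API.

```python
# Rough sketch of the behaviour above, using hypothetical helpers that stand
# in for the OhBot's Scratch-like blocks (not a real OhBot API).
import random
import time

def move_servo(name, position):
    """Hypothetical: move the named servo (e.g. 'head_turn') to a position 0-10."""
    print(f"moving {name} to {position}")

def heard_hello():
    """Hypothetical: return True if the speech recognition has just heard 'hello'."""
    return random.random() < 0.1   # stand-in for real speech recognition

def speak(text):
    """Hypothetical: speak the given text while moving the lips."""
    print(f"OhBot says: {text}")

# 1. Go to a standard starting point initially (roughly centre every servo).
for servo in ("head_turn", "head_nod", "eye_turn", "eye_tilt"):
    move_servo(servo, 5)

INTRO = "Hello, and welcome to the module. Here is what we will be covering..."

# Run the behaviour loop for about 30 seconds.
for _ in range(60):
    # 2. Move randomly with small movements so the head never looks frozen.
    servo = random.choice(("head_turn", "head_nod", "eye_turn"))
    move_servo(servo, 5 + random.randint(-1, 1))

    # 3. When the word 'hello' is heard, deliver the module introduction.
    if heard_hello():
        speak(INTRO)

    time.sleep(0.5)
```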



The video below shows the results.





Related links

Ohbot- social robot
OhBot (http://ohbot.weebly.com/)



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.

Monday, 10 August 2015

Ohbot a social robot

I have just finished building an OhBot (http://ohbot.weebly.com/): a robot face (see the picture to the left - I fixed the crossed eyes later). This cool little kit comes with some very nice software that includes face tracking and a Scratch-like blocks programming language.

One bit of advice: set aside several hours to do this; my experience is that it takes quite a while to build (that might just be me, though). It is worth it - when you see the head, eyes, etc. moving, it is very engaging.


The site has links to all the software needed and some very useful sample programs.

This is a nice, engaging robot that comes with a user-friendly programming language. The finished robot reminds me a bit of Cynthia Breazeal's Kismet robot (http://www.ai.mit.edu/projects/humanoid-robotics-group/kismet/kismet.html) from MIT in the 1990s. So this might also be a good introduction to the area of social robotics and, at only £99 at the time of writing, a relatively inexpensive way into it.






All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.
