Thursday, 27 August 2015
With a bit of time off, I had a chance to play with the Ohbot (see Previous related posts below) a little more. My son and I tried to get it to say hello if there is a face on camera, and otherwise to move around randomly as if it were looking about.
We based the code on the example code that can be found on the OhBot website (http://ohbot.weebly.com/).
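For anyone curious about the general shape of the logic, a rough Python sketch is below. It is not the code we used (ours was built from the Ohbot example code): the face detection here uses OpenCV's Haar cascade, and say() and look_around() are hypothetical stand-ins for the actual Ohbot control calls.

```python
# Rough sketch: say hello when a face is on camera, otherwise look around
# randomly. OpenCV does the face detection; say() and look_around() are
# placeholders for the real Ohbot speech/movement commands.
import random
import time

import cv2

# OpenCV's bundled frontal-face Haar cascade
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def say(text):
    print("Ohbot says:", text)                # placeholder for speech and mouth movement

def look_around():
    print("Small random head/eye movement")   # placeholder for random motor moves

camera = cv2.VideoCapture(0)                  # default webcam

try:
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(grey, scaleFactor=1.3, minNeighbors=5)
        if len(faces) > 0:
            say("Hello")                        # a face is on camera
            time.sleep(2)                       # avoid greeting on every frame
        else:
            look_around()                       # no face: pretend to look around
            time.sleep(random.uniform(0.5, 1.5))
finally:
    camera.release()
```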
The video below shows it in action.
Previous related posts
- OhBot Experiment (13 August 2015)
- Ohbot a social robot (10 August 2015)
All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.
Sunday, 23 August 2015
When is a bug not buggy - CodeBug
When it is a CodeBug - a board with a 5x5 LED matrix and four connectors that can be used as either inputs or outputs.
Programming is through a Blockly interface (as above). Code can be tested using the simulator on the left before downloading it to the CodeBug. The site contains an excellent video showing all the steps (see http://www.codebug.co.uk/gettingstarted/).
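To give a feel for the kind of logic you snap together in the Blockly interface, below is a rough plain-Python analogue of a simple CodeBug-style program. It does not use the real CodeBug API or hardware; the 5x5 display and the button input are just simulated in ordinary Python.

```python
# Plain-Python analogue of a simple CodeBug-style program: a single lit pixel
# walks across the 5x5 display, and while "button A" is pressed (simulated
# here) the whole display lights up. This only mimics the Blockly logic; it
# does not use the real CodeBug hardware or API.
import time

WIDTH, HEIGHT = 5, 5

def blank():
    return [[0] * WIDTH for _ in range(HEIGHT)]

def draw(matrix):
    """Print the 5x5 LED matrix to the terminal (O = on, . = off)."""
    print("\n".join("".join("O" if cell else "." for cell in row) for row in matrix))
    print()

def button_a_pressed(step):
    # Stand-in for reading one of the CodeBug's four input legs; here we just
    # pretend the button is held down on steps 12 to 14.
    return 12 <= step <= 14

for step in range(20):
    matrix = blank()
    if button_a_pressed(step):
        matrix = [[1] * WIDTH for _ in range(HEIGHT)]   # all LEDs on
    else:
        matrix[step % HEIGHT][step % WIDTH] = 1         # walking pixel
    draw(matrix)
    time.sleep(0.2)
```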
At the moment the CodeBug (and the Experimenter Kit, CODEBUG-EXPKIT) is usually programmed online. There is, though, a Scratch version in development by Cymplecy (SimpleSi) (http://simplesi.net/scratchcodebug-beta-testing/) that is a potential option for offline programming. It is currently at the beta-testing stage but is good fun to play with.
What is nice about the CodeBug and, in fact, most of the physical computing devices coming out at the moment, is the developing community. There is a lot of sharing of projects, ideas and solutions to problems through the site (and via Twitter).
I like the CodeBug: it is a small, relatively inexpensive device (or it will be when it comes on the market; most of the ones out there at the moment are in the hands of those who contributed to the Kickstarter funding). The site (http://www.codebug.co.uk/) is full of useful links, examples and ideas.
All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.
Thursday, 20 August 2015
Problems First, Second and Third DOI: 10.4018/ijqaete.2014070104
A paper by Gary Hill and Scott Turner, two members of the Department of Computing and Immersive Technologies, University of Northampton, has recently been published in the International Journal of Quality Assurance in Engineering and Technology Education on problem-solving and programming.
Abstract
This paper considers the need to focus initial programming education on problem-solving, prior to the teaching of programming syntax and software design methodology. The main vehicle for this approach is simple Lego based robots programmed in Java, followed by the programming of a graphical representation/simulation to develop programming skills. Problem solving is not trivial (Beaumont & Fox, 2003) and is an important skill, central to computing and engineering. The paper extends the authors' earlier research on problems first and problem solving (Hill & Turner, 2011) to further emphasise the importance of problem-solving, problem based learning and the benefits of both physical and visual solutions. An approach will be considered, illustrated with a series of problem-solving tasks that increase in complexity at each stage and give the students practice in attempting problem-solving approaches, as well as assisting them to learn from their mistakes. Some of the problems include ambiguities or are purposely ill-defined, to enable the student to resolve these as part of the process. The benefits to students will be discussed including students' statements that this approach, using robots, provides a method to visually and physically see the outcome of a problem. In addition, students report that the method improves their satisfaction with the course. The importance of linking the problem-solving robot activity and the programming assignment, whilst maintaining the visual nature of the problem, will be discussed, together with the comparison of this work with similar work reported by other authors relating to teaching programming using robots (Williams, 2003). In addition, limitations will be discussed relating to the access to the physical robots and the alternative attempts to simulate the robots using three options: Microsoft Robotics Studio (MSRS), Lego Mindstorms and Greenfoot simulators.
- Hill, G. and Turner, S. J. (2014) Problems First, Second and Third. International Journal of Quality Assurance in Engineering and Technology Education (IJQAETE), 3(3), pp. 88-109. ISSN: 2155-496. DOI: 10.4018/ijqaete.2014070104
To read a preview of the paper go to: http://www.igi-global.com/viewtitlesample.aspx?id=117560&ptid=91662&t=Problems%20First,%20Second%20and%20Third
All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.
Monday, 17 August 2015
International partnership pays off for robotics expert Safaa Shwail - defends his PhD thesis
International partnership pays off for robotics expert Safaa Shwail, as he defends his PhD thesis: "Postgraduate student Safaa H Shwail, who studied at both the University of Babylon and University of Northampton, has recently defended his PhD thesis. His work investigated robot pathfinding - how to make robots move at the same time, going to different places, without crashing into each other"
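As a very rough illustration of the kind of problem this involves (and emphatically not the method from the thesis), the Python sketch below plans paths for two robots on a small grid: the first robot is planned with a plain breadth-first search, and the second treats the first robot's position at each time step as an obstacle, so the two never occupy the same cell at the same time.

```python
# Toy multi-robot pathfinding illustration (NOT the method from the thesis):
# plan robot A with breadth-first search over (cell, time), then plan robot B
# while treating A's position at each time step as an obstacle.
# (This toy version ignores swap conflicts and assumes A parks at its goal.)
from collections import deque

GRID_W, GRID_H = 5, 5
MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0), (0, 0)]   # four directions plus wait

def plan(start, goal, reserved, max_t=50):
    """BFS over (cell, time); `reserved` maps time step -> cell held by another robot."""
    queue = deque([(start, 0, [start])])
    seen = {(start, 0)}
    while queue:
        (x, y), t, path = queue.popleft()
        if (x, y) == goal:
            return path
        for dx, dy in MOVES:
            nx, ny, nt = x + dx, y + dy, t + 1
            if not (0 <= nx < GRID_W and 0 <= ny < GRID_H) or nt > max_t:
                continue
            if reserved.get(nt) == (nx, ny) or ((nx, ny), nt) in seen:
                continue
            seen.add(((nx, ny), nt))
            queue.append(((nx, ny), nt, path + [(nx, ny)]))
    return None

path_a = plan((0, 0), (4, 4), reserved={})
reservations = dict(enumerate(path_a))        # time step -> cell occupied by A
path_b = plan((4, 0), (0, 4), reserved=reservations)

print("Robot A:", path_a)
print("Robot B:", path_b)
```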
To read more go to: http://www.northampton.ac.uk/news/international-partnership-pays-off-for-robotics-expert-safaa-shwail-as-he-defends-his-phd-thesis
If you'd like to find out more about Computing at the University of Northampton go to: www.computing.northampton.ac.uk. All views and opinions are the author's and do not necessarily reflect those of any organisation they are associated with.
Thursday, 13 August 2015
OhBot Experiment
As a bit of fun and an excuse to play with the OhBot, I wondered whether I could get it to produce an introduction to a module when the user says hello.
Features
- Move randomly with small movements.
- When the word hello is spoken, start speaking (or appear to).
- Go to a standard starting point initially.
A rough sketch of the logic follows, and the video below shows the results.
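This is not the program used here (that was put together in the Ohbot software), but the Python sketch below gives the general shape of the logic. The speech_recognition package is just one way to listen for the word hello, and reset_position(), small_random_move() and give_introduction() are hypothetical stand-ins for the real Ohbot commands.

```python
# Rough sketch of the experiment's logic (not the actual Ohbot program):
# go to a standard starting position, make small random movements, and when
# the word "hello" is heard, deliver the module introduction. The three
# helper functions are placeholders for the real Ohbot commands.
import time

import speech_recognition as sr

def reset_position():
    print("Moving head and eyes to the standard starting position")

def small_random_move():
    print("Small random head/eye movement")

def give_introduction():
    print("Ohbot: Hello, and welcome to the module...")

recogniser = sr.Recognizer()
reset_position()

while True:
    small_random_move()
    with sr.Microphone() as source:
        audio = recogniser.listen(source, phrase_time_limit=3)
    try:
        heard = recogniser.recognize_google(audio).lower()
    except sr.UnknownValueError:
        continue                      # nothing intelligible heard; keep moving
    if "hello" in heard:
        give_introduction()
        time.sleep(5)                 # pause before listening again
```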
Related links
- Ohbot a social robot (previous post)
- OhBot (http://ohbot.weebly.com/)
All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.
Monday, 10 August 2015
Ohbot a social robot
I have just finished building an OhBot (http://ohbot.weebly.com/), a robot face (see picture to the left; I fixed the cross-eyes later). This cool little kit comes with some very nice software that includes face tracking and a Scratch-like blocks programming language.
One bit of advice: put aside several hours to do this; in my experience it takes quite a while to build (that might just be me, though). It is worth it: when you see the head, eyes and so on moving, it is very engaging.
The site has links to all the software needed and some very useful sample programs.
This is a nice, engaging robot that comes with a user-friendly programming language. The finished robot reminds me a bit of Cynthia Breazeal's Kismet robot (http://www.ai.mit.edu/projects/humanoid-robotics-group/kismet/kismet.html) from MIT in the 1990s. So this might also be a good introduction to the area of social robotics and, at only £99 at the time of writing, a relatively inexpensive way into it.
All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.
Tuesday, 4 August 2015
Playing with Tickle and Mini-drone
Recently I bought a Parrot Rolling Spider MiniDrone and used a beta version of the Tickle app (https://tickleapp.com/en-us/) to control it.
This was the first time I had actually programmed something that flies, and it is quite addictive having something you are controlling that can move in all directions.
On the left is an example of the code used: essentially take off, repeatedly move forward and turn, and finally land.
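In rough pseudo-code terms, the block sequence was something like the sketch below; take_off(), fly_forward(), turn() and land() are hypothetical stand-ins for the Tickle drone blocks, not a real drone API.

```python
# Plain-Python equivalent of the Tickle block program used with the
# mini-drone: take off, repeatedly fly forward and turn, then land.
# The four functions are placeholders for the Tickle blocks, not a real API.
import time

def take_off():
    print("Taking off")

def fly_forward(seconds):
    print(f"Flying forward for {seconds} s")
    time.sleep(seconds)

def turn(degrees):
    print(f"Turning {degrees} degrees")

def land():
    print("Landing")

take_off()
for _ in range(4):        # repeat the forward-and-turn pattern
    fly_forward(1)
    turn(90)
land()
```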
I wish the mini-drone had a little more battery time (I would suggest getting an extra battery). Combining the drone with the Tickle app does add something to the experience, compared with just controlling it directly (though that is fun too).
All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with.