
What does Gen AI 'think' of this blog?


Ok, no surprise that I enjoy playing with Generative AI :-) A few of the recent posts have been about this. Here is the question, though: what does Generative AI pick out from the blog when asked?

Using Gemini, the following two prompts were used:
Prompt 1: Go to the following blog https://robotsandphysicalcomputing.blogspot.com/ read it in and summarises it as a 400 word blog post to go on the blog and urls will not count in the word limit

Prompt 2: Embed urls to make it easier for the reader to access the posts
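The same two-step conversation can be reproduced programmatically. Below is a minimal sketch using the google-generativeai Python library rather than the Gemini web app used in the post; the model name and the exact client workflow here are my assumptions for illustration, not what the post actually used.

```python
# Sketch: sending the post's two summarisation prompts as one chat session.
# ASSUMPTIONS: the google-generativeai client library and the model name
# "gemini-1.5-flash" stand in for the Gemini web app used in the post.
import os

BLOG_URL = "https://robotsandphysicalcomputing.blogspot.com/"

PROMPTS = [
    f"Go to the following blog {BLOG_URL} read it in and summarise it as a "
    "400 word blog post to go on the blog and urls will not count in the word limit",
    "Embed urls to make it easier for the reader to access the posts",
]


def summarise(api_key: str, model_name: str = "gemini-1.5-flash") -> str:
    """Send both prompts in a single chat session and return the final reply."""
    import google.generativeai as genai  # third-party: pip install google-generativeai

    genai.configure(api_key=api_key)
    chat = genai.GenerativeModel(model_name).start_chat()
    reply = ""
    for prompt in PROMPTS:
        reply = chat.send_message(prompt).text  # second prompt refines the first reply
    return reply


if __name__ == "__main__":
    key = os.environ.get("GOOGLE_API_KEY")
    if key:
        print(summarise(key))
    else:
        print("Set GOOGLE_API_KEY to run the summarisation sketch.")
```

Sending the second prompt in the same chat session matters: it asks the model to revise its own previous answer, mirroring the follow-up prompt used in the web interface.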

The result

Exploring the Intersection of Code and Hardware: A Look at "Robots and Physical Computing"

Welcome to a comprehensive overview of Robots and Physical Computing, a dedicated space where the digital realm directly meets the physical world. Authored by Dr. Scott Turner, this blog operates on a delightfully simple yet profound premise: getting computers to interact with their physical environment is incredibly fun. It serves as an ongoing digital notebook that documents personal experiments, educational coding projects, and hands-on building experiences in the ever-expanding field of physical computing.

For educators, hobbyists, and STEM advocates, the blog is an absolute treasure trove of accessible, step-by-step tutorials. The core philosophy here centers on lowering the barrier to entry for coding and electronics. By heavily featuring user-friendly platforms like the micro:bit, TinkerCad, and various smart electronics kits, the blog demystifies the often-intimidating world of hardware design. Whether you are looking to integrate physical computing into a university computing curriculum or just tinkering on your workbench over the weekend, the content is consistently geared toward practical and immediate application.

A quick dive into the most recent posts perfectly highlights the blog’s dynamic range. In early 2025, the focus successfully bridged simulated environments with smart technology. For instance, a recent tutorial explores how to build a TinkerCad microbit traffic light simulator (note: exact URL path based on blog archives). This post beautifully illustrates how free, online CAD systems can be leveraged to prototype electronic circuits and test neural networks before you ever touch a physical wire. Similarly, a review of the Elecfreaks micro:bit Smart Home kit walks readers through creating a voice-activated light solution, demonstrating how simple code and accessible sensors can mimic real-world smart building technology.

Interestingly, the blog also seamlessly ventures beyond traditional hardware. A late 2024 post explores producing AI-generated cartoon strips, proving that the blog's scope continuously evolves alongside modern technology trends. Additionally, practical data applications remain a recurring theme, as seen in the detailed tutorials on setting up remote data logging systems using older V1 micro:bits and local Microbit V1 datalogging to monitor environmental conditions.

Ultimately, Robots and Physical Computing is much more than just a technical guide; it is a celebration of curiosity and maker culture. It encourages readers to pick up a microcontroller, write a few lines of block code, and watch an LED light up or a motor spin. If you are eager to bridge the gap between software and hardware, delving into this blog’s rich archives will undoubtedly spark your next great project.


Not bad as a summary! A little more gushing than I would have written, though.


Next task: summarise this as an image. Still using Gemini (which automatically used Nano Banana), the following prompts were used:
Prompt 1: Use the text produce an image that encapsulates what was summarised and make the image downloadable

Prompt 2: Compress the image so it is suited to the web and allow it to be downloaded
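The compression step in Prompt 2 can also be done locally. Here is a hedged sketch using the Pillow imaging library instead of asking Gemini; the maximum width and JPEG quality values are illustrative assumptions, not settings from the post.

```python
# Sketch: making a generated image web-friendly by resizing and re-encoding it.
# ASSUMPTIONS: Pillow is used in place of Gemini's built-in compression, and
# max_width=1200 / quality=80 are illustrative defaults, not the post's values.
from io import BytesIO


def compress_for_web(image, max_width: int = 1200, quality: int = 80) -> bytes:
    """Shrink a Pillow image to at most max_width pixels wide, return JPEG bytes."""
    from PIL import Image  # third-party: pip install Pillow

    if image.width > max_width:
        ratio = max_width / image.width
        image = image.resize((max_width, int(image.height * ratio)))
    buf = BytesIO()
    # Convert to RGB first, since JPEG cannot store an alpha channel.
    image.convert("RGB").save(buf, format="JPEG", quality=quality, optimize=True)
    return buf.getvalue()


if __name__ == "__main__":
    from PIL import Image

    # Stand-in for the AI-generated image from the post.
    demo = Image.new("RGB", (2400, 1600), "steelblue")
    data = compress_for_web(demo)
    print(f"Compressed to {len(data)} bytes")
```

JPEG at quality 80 is a common trade-off for blog images; WebP (`format="WEBP"`) typically shrinks the file further if the blog platform accepts it.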

The result, in my opinion, was good and can be seen as the image at the top of the post.

This is just a bit of fun and normal service with Physical Computing will be resumed soon.



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon
