
Remote Data Logging with Two Micro:bits (V1)

In a previous post, we used a single micro:bit to log sensor data. That worked well, but it came with a significant limitation: the micro:bit had to stay plugged into the computer the whole time. In many real situations, you want to place a sensor somewhere away from your computer — across a room, outside a window, or just somewhere more useful.

The solution is to split the job between two devices: one micro:bit collects and transmits sensor data wirelessly, while the other receives it and passes it to your computer for logging. This is actually very close to how remote monitoring systems work in the real world.


The Plan

We'll build on the previous data-logging project and turn it into a simple wireless monitoring system. 

Tip: To avoid confusion, program each micro:bit separately. Only plug one in at a time, and use a separate MakeCode window for each.


Micro:bit 1 — The Transmitter

This micro:bit handles the sensing. It reads the light level and temperature, then broadcasts those values wirelessly using the micro:bit's built-in radio. You can read more about how the radio works in the micro:bit documentation.

The key idea is that both micro:bits are set to the same radio group — just a shared number (0–255) that lets them communicate with each other while ignoring other devices.

Once programmed, this micro:bit doesn't need to stay connected to the computer. Attach a battery pack and you can move it wherever you want to take measurements. That's what makes it remote.
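For readers who prefer MakeCode's JavaScript view to blocks, a minimal transmitter might look like the sketch below. The group number (1) and the value labels ("light" and "temp") are arbitrary choices — any group and labels will do, as long as the receiver uses the same ones.

```javascript
// Transmitter sketch (MakeCode JavaScript view)
radio.setGroup(1)  // must match the receiver's group
basic.forever(function () {
    // Send each reading as a labelled name/value pair
    radio.sendValue("light", input.lightLevel())
    radio.sendValue("temp", input.temperature())
})
```

Using sendValue rather than sendNumber means each reading carries its own label, which is what lets the receiver tell light apart from temperature.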


Figure 1: Transmitter
Micro:bit 2 — The Receiver

This one stays plugged into your computer. It listens for incoming radio signals from the transmitter and, when it picks one up, stores the received values in variables. It then writes those values to the MakeCode data viewer, exactly as the single-micro:bit version did.

To make this work, it needs to:

  • Use the same radio group number as the transmitter
  • Identify which value has been received (light or temperature) based on the label sent with each reading
  • Log the value to the serial output for the data viewer to display
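The steps above can be sketched in MakeCode's JavaScript view as follows. The group number (1) is an assumption — it just has to match whatever the transmitter uses.

```javascript
// Receiver sketch (MakeCode JavaScript view)
radio.setGroup(1)  // same group as the transmitter
radio.onReceivedValue(function (name, value) {
    // "name" is the label sent with each reading ("light" or "temp"),
    // so no extra logic is needed to tell the values apart.
    // serial.writeValue logs it for the MakeCode data viewer.
    serial.writeValue(name, value)
})
```

Because the label travels with each value, the receiver stays very short: it simply forwards whatever arrives to the serial output.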


Figure 2: Receiver

Seeing It in Action

Once both micro:bits are running, open the Show Data panel in MakeCode on the receiver's window. You should see graphs updating in real time, just like before.

Temperature changes slowly, so don't be surprised if that line looks fairly flat at first — try holding the transmitter in your hand to warm it up gradually. Light is easier to experiment with: point the transmitter's LED panel toward a bright screen or lamp and you'll see the values respond immediately.

To save your data as a CSV file and open it in a spreadsheet later, use the blue download icon in the data viewer — same as before.

Figure 3: Data logging remotely

Ideas to Take It Further

  • Add a visual indicator on the transmitter. When it's running without the computer, there's no obvious sign it's doing anything. Could you add something to the display to show it's actively sending data?
  • Experiment with sampling rate. Right now the transmitter sends data as fast as it can. Try replacing the forever loop with one that sends data at set intervals — like once per second or every five seconds. This mirrors what we did in the single micro:bit version and can make your graphs much easier to read.
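One possible way to combine both ideas is sketched below — it is just one approach, not the answer. It sends a reading once per second and briefly flashes an icon so you can see the transmitter is alive when it is running on batteries. The icon choice and timings are arbitrary.

```javascript
// Transmitter with a visible heartbeat and a fixed sampling interval
radio.setGroup(1)
basic.forever(function () {
    radio.sendValue("light", input.lightLevel())
    radio.sendValue("temp", input.temperature())
    // Brief flash to show the device is actively sending
    basic.showIcon(IconNames.SmallDiamond)
    basic.pause(100)
    basic.clearScreen()
    // Wait out the rest of a one-second interval
    basic.pause(900)
})
```

Slowing the sampling rate also helps battery life, which matters once the transmitter is unplugged.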


All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon
