
Fast-Track Beginner’s Guide to Building VR and AR in Your Browser 2026


Have you ever looked at a Virtual Reality (VR) headset and thought, "I wish I could build something for that," only to be scared off by the mention of complex game engines, expensive hardware, or advanced programming?

The landscape of digital creation has changed. You no longer need a massive workstation or a background in computer science to build immersive worlds. If you can write a few lines of basic HTML, you have everything you need to become a VR developer. By using A-Frame, an open-source web framework, and the power of modern browsers, your "playtime" can turn into a gateway to the Metaverse.

Why WebVR?

WebVR (and its modern successor, WebXR) is built on a simple philosophy: accessibility. Unlike traditional VR apps that require massive downloads, WebVR experiences are just links. They work on your laptop, your smartphone, and high-end headsets like the Meta Quest or Apple Vision Pro.

In this guide, we have collated a series of experiments—originally shared by Scott Turner—to help you go from a blank screen to a fully interactive augmented world.


🛠 Step 0: Setting Up Your Laboratory

In the early days of this series, we used tools like Mozilla Thimble and Glitch. In 2026, the most beginner-friendly "no-setup" environment is CodePen.io.

  1. Go to CodePen.io and click "Start Coding".

  2. You will see three boxes: HTML, CSS, and JS. We only need the HTML box.

  3. Click the Settings button at the top.

  4. In the HTML tab, look for the "Stuff for <head>" section and paste this script tag: <script src="https://aframe.io/releases/1.7.1/aframe.min.js"></script>

  5. Click Save & Close.

This script is the "engine" that tells your browser how to render 3D shapes, lighting, and movement.
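If you prefer to work outside CodePen, the same setup is just a single local HTML file opened in your browser. Here is a minimal sketch (the filename and title are only suggestions):

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8">
    <title>My First A-Frame Scene</title>
    <!-- The A-Frame "engine" - the same script tag you pasted into CodePen -->
    <script src="https://aframe.io/releases/1.7.1/aframe.min.js"></script>
  </head>
  <body>
    <!-- Your <a-scene> and everything inside it goes here -->
  </body>
</html>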

Example: the CodePen editor set up for this project.




🚀 Path 1: Building a Virtual World (VR)

The first path is purely digital. We are going to create a scene from scratch, set the atmosphere, and add a rotating object. In A-Frame, everything happens inside the <a-scene> tag. Think of this as your stage.

The Copy-and-Paste Template

Paste this into your HTML box:

HTML
<a-scene>
  <a-sky color="#FFCC00"></a-sky>

  <a-text value="Welcome to Playtime!" color="black" position="-1.5 2.5 -3" width="6"></a-text>

  <a-box position="0 1 -5" rotation="0 45 0" color="#4CC3D9"
         animation="property: rotation; to: 0 405 0; loop: true; dur: 5000">
  </a-box>
  
  <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>

  <a-camera position="0 1.6 0"></a-camera>
</a-scene>

Understanding the 3D Space

To move objects, you use the position attribute: position="X Y Z".

  • X (Horizontal): Negative is left, positive is right.

  • Y (Vertical): How high it sits off the floor.

  • Z (Depth): Negative numbers move the object away from you.

In the code above, our box is at 0 1 -5. This means it is centred, one metre high, and five metres in front of the camera.
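To see the three axes in action, here is a small variation on the scene above: three boxes that differ only in their position values (the colours are arbitrary, just to tell them apart):

```html
<a-scene>
  <!-- Left of centre (negative X), one metre up -->
  <a-box position="-2 1 -5" color="red"></a-box>
  <!-- Dead centre, raised higher (larger Y) -->
  <a-box position="0 2.5 -5" color="green"></a-box>
  <!-- Right of centre (positive X), further away (more negative Z) -->
  <a-box position="2 1 -8" color="blue"></a-box>
  <a-camera position="0 1.6 0"></a-camera>
</a-scene>
```

Try nudging each number by one metre at a time and re-running the scene; this is the quickest way to build an intuition for the coordinate system.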


📱 Path 2: Blending Reality (Augmented Reality)

Augmented Reality (AR) is where the real magic happens. Instead of a digital background, we use your webcam to overlay objects onto your desk. To do this, we use AR.js and a "Marker"—a physical image that the computer uses as an anchor.

The AR Template

Clear your HTML box and paste this:

HTML
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8">
    <link rel="stylesheet" href="./style.css">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Pen</title>
    <script src="https://aframe.io/releases/1.7.1/aframe.min.js"></script>
    <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
  </head>
  <body style="margin: 0; overflow: hidden;">
    <a-scene embedded arjs>
      <a-marker preset="hiro">
        <a-sphere position="0 0.5 0" radius="0.5" color="white"
                  src="https://res.cloudinary.com/demo/image/upload/w_1200,q_auto,f_auto/sample.jpg"
                  animation="property: rotation; to: 0 360 0; loop: true; dur: 3000"></a-sphere>
        <a-text value="it worked" color="blue" align="center" position="0 1.2 0"></a-text>
      </a-marker>
      <a-marker preset="kanji">
        <a-box position="0 0.5 0"
               src="https://images.unsplash.com/photo-1615933530038-314d91bb1039?ixid=MXwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHw%3D&ixlib=rb-1.2.1&auto=format&fit=crop&w=1500&q=80"
               material="opacity: 0.7; color: white;"
               animation="property: rotation; to: 360 360 0; loop: true; dur: 3000"></a-box>
        <a-text value="box it" color="red" align="center" position="0 1.2 0"></a-text>
      </a-marker>
      <a-entity camera></a-entity>
    </a-scene>
  </body>
</html>

How to see the AR in action

When you run this code, your browser will ask for camera permission. Once allowed, show the Hiro Marker or Kanji Marker to the camera. You can print the images below, or simply hold your phone up to the webcam with a marker displayed on its screen. If you print the markers, make sure there is a white border around each one, then hold them up in front of the camera.

The "Hiro" Marker (Primary Anchor)

Hiro marker to use with the AR code


The "Kanji" Marker 

kanji marker to use with the AR



🎨 Leveling Up: Adding "Skins" and 360° Video

Once you can move boxes and spheres, the next step is making them look real.

1. Mapping Images (Textures)

In the original series, I experimented with "wrapping" images onto objects. In A-Frame, this is incredibly simple: if you have an image file (such as a logo), you just add its URL to the src attribute: <a-box src="your-image.png" color="white"></a-box>. Here are the links to the textures used in the AR example:

Flowers: https://res.cloudinary.com/demo/image/upload/w_1200,q_auto,f_auto/sample.jpg

Computer on a desk: https://images.unsplash.com/photo-1615933530038-314d91bb1039?ixid=MXwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHw%3D&ixlib=rb-1.2.1&auto=format&fit=crop&w=1500&q=80


Top Tip: If you use a colour other than white, A-Frame will "tint" your image. This is a great way to create different variations of the same texture.
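For example, these two spheres share exactly the same texture but render differently because of the colour tint (the image URL is the flower texture linked above):

```html
<a-scene>
  <!-- White means "no tint": the texture shows its true colours -->
  <a-sphere position="-1 1.5 -4" radius="0.8" color="white"
            src="https://res.cloudinary.com/demo/image/upload/w_1200,q_auto,f_auto/sample.jpg"></a-sphere>
  <!-- The same texture, tinted red -->
  <a-sphere position="1 1.5 -4" radius="0.8" color="red"
            src="https://res.cloudinary.com/demo/image/upload/w_1200,q_auto,f_auto/sample.jpg"></a-sphere>
</a-scene>
```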

2. 360° Video

You can turn your scene into a "virtual cinema" or a teleportation device using 360° video. Instead of a flat background, you use the <a-videosphere> tag. This allows the user to stand inside the video and look around as it plays.
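A minimal videosphere scene looks like the sketch below. Note that "my-360-video.mp4" is a placeholder: swap in the path to any equirectangular 360° video file of your own.

```html
<a-scene>
  <a-assets>
    <!-- Preload the video as an asset so the scene can reference it by id -->
    <video id="skyVideo" src="my-360-video.mp4" autoplay loop muted playsinline></video>
  </a-assets>
  <!-- The video is projected onto the inside of a giant sphere around the viewer -->
  <a-videosphere src="#skyVideo"></a-videosphere>
  <a-camera position="0 1.6 0"></a-camera>
</a-scene>
```

One caveat: most browsers block autoplaying video with sound, which is why the muted attribute is included; if you want audio, you will usually need a click-to-play control.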


💡 Troubleshooting for Beginners

  • Camera won't start: Ensure you are using https:// (CodePen handles this automatically). Browsers block camera access on non-secure sites.

  • The object is missing: Check your Z-coordinate. If the Z is positive (e.g., position="0 1 5"), the object is actually behind you!

  • Animation not working: A-Frame is "parent-child" based. If you want an animation to belong to a box, attach the animation attribute to the box tag itself (or nest the animation element inside it).

  • Fingers in the way: if your fingers cover any part of a marker's black border while you are holding it, the tracking will fail and the AR object will not appear.

Summary

The goal of this "Playtime" series was to prove that VR isn't just for AAA game studios. It’s for teachers, students, and hobbyists. With just a few lines of code, you have built a bridge between the physical and digital worlds.

Now that you have the templates, change the colors, swap the shapes, and see what happens. The best way to learn WebVR is to break it and fix it again.

What if you had more markers for AR? This link might be of interest: https://aframe.io/blog/arjs/#different-type-of-markers-pattern-and-barcode, and there are more markers at https://github.com/artoolkit/artoolkit5/tree/master/doc/patterns


Have fun.



All opinions in this blog are the Author's and should not in any way be seen as reflecting the views of any organisation the Author has any association with. Twitter @scottturneruon
