
Virtual Reality Minecraft on a PC

This is an update of an earlier post.


As a little experiment with the Oculus Rift, I wondered if I could view a Minecraft world through the headset. The answer is yes, and relatively easily (certainly easier than I thought it was going to be). The Oculus Rift used here was from the first developer kit, so there are some latency issues (you move, it moves slightly later).

Download Minecrift from: https://share.oculusvr.com/app/minecrift


Unzip the download and run the installer.



Figure 1
You should get something similar to Figure 1. All I did then was press OK.
Figure 2
Run the Minecraft launcher and create a new profile. Edit the new profile and change Use version to release minecrift-1.6.4-b12-nohydra, which can be found in the drop-down menu.

Now save the profile.
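If you prefer to script this step, the launcher stores its profiles in a launcher_profiles.json file inside the .minecraft folder, and the version can be set there directly. The sketch below is a minimal example only: it assumes the old launcher's JSON format (a "profiles" object whose entries use "lastVersionId" to pick the version), a default Windows install path, and a profile name of "Minecrift", which is my own choice.

import json
from pathlib import Path

# Path to the launcher's profile store; assumes a default Windows install.
# Adjust for your platform (e.g. ~/.minecraft on Linux).
profiles_path = Path.home() / "AppData/Roaming/.minecraft/launcher_profiles.json"

data = json.loads(profiles_path.read_text())

# Add (or overwrite) a profile pointing at the Minecrift version.
# "lastVersionId" is the field the old launcher used to select the version.
data["profiles"]["Minecrift"] = {
    "name": "Minecrift",
    "lastVersionId": "minecrift-1.6.4-b12-nohydra",
}

profiles_path.write_text(json.dumps(data, indent=2))
print("Profile added; restart the launcher to see it.")

Doing it through the launcher's own profile editor, as described above, achieves the same thing and is the safer route.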

Play Minecraft as normal, but with the Oculus Rift on (it helps to have two people: one wearing the Oculus Rift, the other pressing keys to control the movement).



Figure 3


Most people who have tried it enjoyed it, but you can really only play for a short while. If you are susceptible to motion sickness, I wouldn't try this.



Related links

http://minecraft-vr.com/

http://www.theriftarcade.com/minecrift/


http://vrwiki.wikispaces.com/Minecrift 


http://www.pcgamer.com/2013/05/10/minecraft-is-the-latest-game-to-get-oculus-rift-support-with-minecrift-mod/


http://riftmod.com/how-to-setup-minecraft-for-oculus-rift/



I would be interested to hear, via the comments, what others are doing to combine Minecraft with other devices and software.

All views are the author's and do not necessarily reflect the views of any organisations the author is associated with in any way. Nor is this post advocating the use of the approach described above; it is simply reporting on an experiment.
