Mika Tasich

VR is child's play

Unity project coded by a 13yr old.

One of the most fascinating things I discovered about "room scale" VR/AR experiences is the immediacy of user interactions. Unlike other human-computer interfaces, which rely solely on metaphors, learned experience, and evolved behaviour, interaction within virtual worlds is immediate, direct, and absolutely intuitive.

When I talk about this to non-experts, they often reply with something like, "my favorite app has a great, super-intuitive user interface." But they invariably forget what leads to that intuition. Your favorite mobile app with an "intuitive" interface relies on knowledge gained through frustration, trial, and error. It counts on you knowing that three little lines in the top corner will open a menu, or that a small image of a human figure opens your user profile. Not so with VR.

I bet that if I were to code up a VR experience where you have to throw rocks at an incoming saber-toothed tiger, and then place a caveman in it, he wouldn't have any problem figuring out what to do.

This intuitive nature of VR experiences doesn't just affect users. It also dramatically reduces the complexity of the code, to such a degree that even a child could do it.

Back in 2018, when my son was 13, his school gave students an Independent Learning Project (ILP) to complete at home over several weeks, with minimal teacher and parent supervision. As my son was studying computer science, I thought it would be a good idea to encourage him to code up his own VR experience. Not because he was one of those geeky kids who would consider such a thing fun, but because he would consider it a chore, and as with any chore, the easier it is, the more likely it is to get done.

I didn't want to impose. I simply showed him how in Unity, with the help of Valve's absolutely brilliant SteamVR Plugin, creating fairly complex VR interactions is reduced to a simple drag-and-drop process. And it worked! As I expected, he quickly realised that the task wasn't that daunting, and that with relatively little effort, he could achieve something which would allow him to get a good grade for the novelty factor alone.

My son very quickly figured out that it would be absolutely trivial to create an archery game. Drag a bow and arrow into the scene, drop in some prefab targets, and you're done. Literally. The interaction is built in. You grab the bow in your hand, load it with a nearby arrow and shoot, just as you would the real thing. The physics engine takes over, arrows fly, targets are hit. Job done.

I honestly thought he would use that, polish it up a bit, maybe add some scenery and stop there. What I didn't expect, and what really made me proud, probably even more than him completing the whole thing, was the experience he came up with. He said, "Could I make this so that I can feel what it would be like to shoot arrows on different planets?" and blew my mind.

As any parent would, I attribute this brilliant idea to my deeply thought-out plan. I'd like to think that he understood the power VR has to convey previously impossible things and to bring us knowledge in the best way possible: through personal experience.

I knew about the dense atmospheres of Venus and Titan, but I never really "understood" what that means. Not until I shot some virtual arrows there in my son's ILP. Now I know, in a way I will never be able to forget. I've experienced it!
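Part of what makes the idea work is how simply the underlying physics can be expressed. As a rough illustration (this is not my son's actual code, just a sketch using textbook values for surface gravity and atmospheric density, and guessed arrow parameters), here is how far an arrow flies on each world once you account for both gravity and quadratic air drag:

```python
import math

def arrow_range(g, rho, v0=60.0, angle_deg=45.0,
                m=0.025, cd=2.0, area=2.8e-5, dt=1e-3):
    """Euler-integrate an arrow's flight with quadratic drag.

    g    -- surface gravity (m/s^2)
    rho  -- atmospheric density (kg/m^3)
    Returns the horizontal distance when the arrow returns to launch height.
    """
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while True:
        v = math.hypot(vx, vy)
        # Drag acceleration, opposite to velocity: 0.5 * rho * Cd * A * v^2 / m
        k = 0.5 * rho * cd * area * v / m
        vx -= k * vx * dt
        vy -= (g + k * vy) * dt
        x += vx * dt
        y += vy * dt
        if y < 0:
            return x

# Approximate surface gravity (m/s^2) and atmospheric density (kg/m^3)
worlds = {
    "Earth": (9.81, 1.225),
    "Venus": (8.87, 65.0),  # crushingly dense CO2 atmosphere
    "Titan": (1.35, 5.4),   # thick atmosphere, weak gravity
}
for name, (g, rho) in worlds.items():
    print(f"{name}: {arrow_range(g, rho):.0f} m")
```

On Venus the thick atmosphere kills the arrow's speed almost immediately, while on Titan the weak gravity keeps it aloft far longer. In Unity you only need to change a couple of numbers (gravity and drag) per planet; the engine does the rest, which is exactly why the project was feasible for a teenager.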

For me, this shows the true potential of VR in education. It's one thing to read about the atmospheric conditions on different planets, but it's an entirely different level of understanding when you can experience those conditions firsthand, even if only virtually.

I hope you'll enjoy the gameplay video he made, which showcases this innovative educational VR experience. It's a testament to how VR can spark creativity and enhance learning in ways we're only beginning to explore.