Flow Arena wins Best Project at SF VR Hackathon

Last weekend in San Francisco, the Flow founders won the Best Project award at the SF VR Hackathon. The team included Kieran Farr on server-side implementation, Iker Jamardo on the socket.io integration and the more complicated 3D math in ThreeJS, and Jason Marsh on visual design and ThreeJS implementation.

Team: Kieran Farr, Iker Jamardo, Jason Marsh

The project was built entirely on March 11-13.

Project goal: multi, multi, multi

a) Multi-player: using socket.io with a server maintaining state and relaying position and orientation data to each player (see the client sketch after this list).

b) Multi-platform: functioning in web browsers on:

  • all smartphones showing VR (for Google Cardboard),
  • all desktop browsers,
  • an alpha version of a GearVR web app, built on Iker’s prior work, and
  • a web-based app for phone-based augmented reality (not successfully completed).

c) Multi-perspective: different apps for player mode and god-view mode.
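
For the curious, here is a minimal sketch of what the client side of (a) could look like, assuming a single hypothetical 'pose' event carrying an {id, position, quaternion} payload. The event names, fields, and geometry are illustrative, not the hackathon code, and it is written against the current three.js and socket.io-client APIs:

```typescript
import * as THREE from 'three';
import { io } from 'socket.io-client';

// Hypothetical wire format; the real event names and fields differed.
interface PoseUpdate {
  id: string;
  position: number[];    // [x, y, z]
  quaternion: number[];  // [x, y, z, w]
}

const socket = io('http://localhost:3000');   // assumed server address
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 1000);
const remoteAvatars = new Map<string, THREE.Object3D>();

// Emit this player's head pose; call this once per frame from the render loop.
// The server stamps the update with the sender's id before relaying it.
function broadcastPose(): void {
  socket.emit('pose', {
    position: camera.position.toArray(),
    quaternion: camera.quaternion.toArray(),
  });
}

// Apply poses the server relays for every other player, creating avatars lazily.
socket.on('pose', (update: PoseUpdate) => {
  let avatar = remoteAvatars.get(update.id);
  if (!avatar) {
    avatar = new THREE.Mesh(
      new THREE.SphereGeometry(0.3, 16, 16),
      new THREE.MeshNormalMaterial()
    );
    remoteAvatars.set(update.id, avatar);
    scene.add(avatar);
  }
  avatar.position.fromArray(update.position);
  avatar.quaternion.fromArray(update.quaternion);
});

// Remove the avatar of any player who disconnects.
socket.on('leave', ({ id }: { id: string }) => {
  const avatar = remoteAvatars.get(id);
  if (avatar) {
    scene.remove(avatar);
    remoteAvatars.delete(id);
  }
});
```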

 

Flow Arena multi-user avatars

Experience Design

The main point of the project is the orientation of each avatar’s gaze. When a user stares at another avatar, the targeted avatar hears a ‘locked on’ beeping sound and, after 5 seconds, gets ‘killed’ with a crunch sound and transported to another spot in the world.
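
A rough sketch of how a gaze lock like this can be built, assuming a per-frame raycast from the viewer's camera and an accumulating timer; every name here is illustrative rather than the actual hackathon code:

```typescript
import * as THREE from 'three';

const LOCK_SECONDS = 5;                 // time on target before the 'kill'
const raycaster = new THREE.Raycaster();
const forward = new THREE.Vector3();

let lockedTarget: THREE.Object3D | null = null;
let lockTime = 0;

// Call once per frame with the elapsed time in seconds.
function updateGaze(camera: THREE.Camera, avatars: THREE.Object3D[], dt: number): void {
  camera.getWorldDirection(forward);
  raycaster.set(camera.position, forward);
  const hit = raycaster.intersectObjects(avatars, true)[0];
  const target = hit ? hit.object : null;

  if (target && target === lockedTarget) {
    lockTime += dt;                     // still staring: keep the beep going
    if (lockTime >= LOCK_SECONDS) {
      killAvatar(target);               // crunch sound + respawn elsewhere
      lockedTarget = null;
      lockTime = 0;
    }
  } else {
    lockedTarget = target;              // new target (or none): restart timer
    lockTime = 0;
  }
}

function killAvatar(avatar: THREE.Object3D): void {
  // Respawn at a random spot on the arena floor (radius is a guess).
  const angle = Math.random() * Math.PI * 2;
  const radius = 10 * Math.sqrt(Math.random());
  avatar.position.set(Math.cos(angle) * radius, 0, Math.sin(angle) * radius);
}
```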

The gradient spheres, with their cute eyes, matter less than the ‘nose’ that extends out until it touches either the floor or another avatar. Initially a small nose competed with a laser; then I combined the two into some sort of strange Pinocchio. Since the experience is all about gaze direction, the nose enhances the user’s perception of orientation.
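
One way to size such a nose is to raycast along the avatar's look direction and scale a unit-length nose mesh to the hit distance. A sketch under that assumption (the mesh is presumed to be modelled with its length along its local Z axis; none of this is the original code):

```typescript
import * as THREE from 'three';

const noseRay = new THREE.Raycaster();
const noseDir = new THREE.Vector3();
const noseOrigin = new THREE.Vector3();

// Stretch the nose from the avatar's face to the first thing its gaze hits.
function updateNose(
  avatar: THREE.Object3D,
  nose: THREE.Object3D,         // unit-length mesh, length along its local Z axis
  obstacles: THREE.Object3D[]   // the floor plus every other avatar
): void {
  avatar.getWorldDirection(noseDir);   // avatar's +Z look axis; negate for camera convention
  avatar.getWorldPosition(noseOrigin);
  noseRay.set(noseOrigin, noseDir);

  const hit = noseRay.intersectObjects(obstacles, true)[0];
  const length = hit ? hit.distance : 2;   // fall back to a short stub

  nose.scale.set(1, 1, length);
}
```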

The floor is a flattened wire-frame cylinder with an undulating color ripple. The environment is a sphere with an 8-point gradient, plus a point cloud for better spatial reference in VR.
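
For reference, a hedged sketch of how a scene like that can be assembled in current three.js (dimensions and colors are guesses, and the ripple and gradient shading are omitted; the 2016 code used older Geometry classes):

```typescript
import * as THREE from 'three';

// Flattened wire-frame cylinder for the floor.
const floor = new THREE.Mesh(
  new THREE.CylinderGeometry(12, 12, 0.2, 48, 1, true),
  new THREE.MeshBasicMaterial({ color: 0x00ccff, wireframe: true })
);

// Surrounding sphere, rendered from the inside; a real gradient would need
// vertex colors or a shader, which is left out here.
const environment = new THREE.Mesh(
  new THREE.SphereGeometry(60, 32, 32),
  new THREE.MeshBasicMaterial({ color: 0x223344, side: THREE.BackSide })
);

// Random point cloud to give extra parallax cues in VR.
const starCount = 2000;
const positions = new Float32Array(starCount * 3);
for (let i = 0; i < positions.length; i++) {
  positions[i] = (Math.random() - 0.5) * 100;
}
const starGeometry = new THREE.BufferGeometry();
starGeometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
const stars = new THREE.Points(starGeometry, new THREE.PointsMaterial({ size: 0.2 }));

const scene = new THREE.Scene();
scene.add(floor, environment, stars);
```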

Tech implementation

We built everything in ThreeJS, with socket.io, a Node server, and the webVR-polyfill.

NodeJS server
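
A minimal sketch of what such a relay server could look like with today's socket.io API (the hackathon code used an earlier version); the 'pose' and 'leave' event names are the same illustrative ones as in the client sketch above:

```typescript
import { createServer } from 'http';
import { Server } from 'socket.io';

// Last known pose per connected player, so late joiners can be brought up to date.
const poses = new Map<string, { position: number[]; quaternion: number[] }>();

const httpServer = createServer();
const io = new Server(httpServer, { cors: { origin: '*' } });

io.on('connection', (socket) => {
  // Send the newcomer everything the server already knows.
  for (const [id, pose] of poses) {
    socket.emit('pose', { id, ...pose });
  }

  // Relay each pose update to everyone else, stamped with the sender's id.
  socket.on('pose', (pose: { position: number[]; quaternion: number[] }) => {
    poses.set(socket.id, pose);
    socket.broadcast.emit('pose', { id: socket.id, ...pose });
  });

  // Tell the other clients to drop this avatar when the player leaves.
  socket.on('disconnect', () => {
    poses.delete(socket.id);
    socket.broadcast.emit('leave', { id: socket.id });
  });
});

httpServer.listen(3000);
```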

On a flat-screen system, dragging the mouse turns the world. In the god-view version, the user can fly through the scene using keyboard controls.
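
A small sketch of keyboard fly controls for a god-view camera; the key bindings and speed are guesses, not the ones we shipped:

```typescript
import * as THREE from 'three';

const pressed = new Set<string>();
window.addEventListener('keydown', (e) => pressed.add(e.code));
window.addEventListener('keyup', (e) => pressed.delete(e.code));

const FLY_SPEED = 5; // world units per second

// Move the god-view camera along its own axes; call once per frame.
function flyCamera(camera: THREE.PerspectiveCamera, dt: number): void {
  const step = FLY_SPEED * dt;
  if (pressed.has('KeyW')) camera.translateZ(-step); // forward
  if (pressed.has('KeyS')) camera.translateZ(step);  // back
  if (pressed.has('KeyA')) camera.translateX(-step); // strafe left
  if (pressed.has('KeyD')) camera.translateX(step);  // strafe right
  if (pressed.has('KeyR')) camera.translateY(step);  // rise
  if (pressed.has('KeyF')) camera.translateY(-step); // descend
}
```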

In our 10-minute demo to the judges and audience, we managed to get about 25 users simultaneously into (minimal) VR on their phones, connected to our tiny localhost server, which brought the audience into the presentation in a nice way.

Next steps

We’ll clean up this demo and post it live to AWS EC2. A cute eye-blink was prepared but barely implemented, and there is no clear indication of a kill. We’d also love to see the AR version finished to complete the multi-platform goal. A VR scoreboard would be nice.

I’d like to create a master-slave ‘join/release’ mechanism that can apply to the data visualization projects Flow is working on.

As my friend Josh Carpenter tweeted, the project shows us “glimpses of #WebVR empires to come!” Perhaps multi-user VR on the web will lead us to world domination (as a benevolent distributed democracy, of course!).