
Flow Arena wins Best Project at SF VR Hackathon


Last weekend in San Francisco, Flow founders won the Best Project award at the VR Hackathon. The team included Kieran Farr on server-side implementation, Iker Jamardo doing the socket.io implementation and the complicated mathematical 3D work in ThreeJS, and Jason Marsh doing visual design and ThreeJS implementation.

Team: Kieran Farr, Iker Jamardo, Jason Marsh

The project was built entirely on March 11-13.

Project goal: multi, multi, multi

a) Multi-player: using socket.io with a server maintaining state and passing position and orientation data to each player.

b) Multi-platform: functioning in web browsers on:

  • all smartphones showing VR (for Google Cardboard),
  • all desktop browsers,
  • using Iker’s prior work, in an alpha version of a GearVR web-app, and
  • on a web-based app for phone-based Augmented Reality (not successfully completed)

c) Multi-perspective: different apps for player mode and god-view mode.
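The multi-player goal boils down to a small piece of server-side state: each player's latest pose, relayed to everyone else. A minimal sketch of that state store, with the socket.io wiring omitted (all names here are illustrative assumptions, not the actual hackathon code):

```javascript
// Server-side player state: id -> latest pose reported by that client.
const players = new Map();

// Called when a client reports its pose (e.g. on a socket.io 'pose' event).
// pose: { position: [x, y, z], quaternion: [x, y, z, w] }
function updatePose(id, pose) {
  players.set(id, pose);
}

// Payload to send back to one client: every pose except its own,
// so each player can render all the other avatars.
function posesFor(id) {
  const out = {};
  for (const [pid, pose] of players) {
    if (pid !== id) out[pid] = pose;
  }
  return out;
}

// Called when a client disconnects, so its avatar disappears.
function removePlayer(id) {
  players.delete(id);
}
```

In the real app each `'pose'` message would call `updatePose` and the server would broadcast `posesFor(id)` (or simply relay the incoming pose) to the other connected sockets.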

 

Flow Arena multi-user avatars

Experience Design

The main point of the project was the orientation of each avatar’s gaze. When the user stares at another avatar, the targeted avatar experiences a ‘locked on’ beeping sound, and after 5 seconds, gets ‘killed’ with a crunch sound and transported to another spot in the world.
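The lock-on mechanic reduces to a dwell timer: accumulate time while the gaze ray stays on a target, reset when it slips off, and trigger the kill at 5 seconds. A minimal sketch, with hypothetical names (only the 5-second threshold comes from the post):

```javascript
const LOCK_SECONDS = 5; // gaze dwell time before the target is 'killed'

// Per-frame update of one attacker/target pair.
// state: { elapsed, killed }; gazeOnTarget: did this frame's gaze ray hit?
// dt: frame time in seconds.
function tickLock(state, gazeOnTarget, dt) {
  if (!gazeOnTarget) return { elapsed: 0, killed: false }; // gaze slipped off: reset
  const elapsed = state.elapsed + dt;
  return { elapsed, killed: elapsed >= LOCK_SECONDS };
}
```

In the app, `elapsed > 0` would drive the 'locked on' beeping, and the `killed` transition would play the crunch sound and respawn the target elsewhere in the world.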

Gradient spheres, with cute eyes, are less important than the ‘nose’ that extends out until it touches either the floor or another avatar. Initially a small nose competed with a laser, and then I just combined the two into some sort of strange Pinocchio. Since the experience is all about gaze direction, the nose enhances the user’s perception of orientation.

The floor is a flattened cylinder wire-frame with an undulating color ripple. The environment is a sphere with an 8-point gradient, plus a point cloud for better spatial reference in VR.
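An undulating ripple like the one on the floor can be driven by something as simple as a radial sine wave evaluated per vertex each frame. This is a hypothetical sketch, not the actual shader; the wavelength and speed values are arbitrary:

```javascript
// Brightness (0..1) of the floor ripple at distance `radius` from center,
// at time `t` seconds: a sine wave front traveling outward.
function rippleBrightness(radius, t, wavelength = 2.0, speed = 1.5) {
  return 0.5 + 0.5 * Math.sin(((radius - speed * t) * 2 * Math.PI) / wavelength);
}
```

Each frame, the floor's vertex colors would be scaled by `rippleBrightness` of that vertex's distance from the center.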

Tech implementation

We implemented entirely in ThreeJS, with socket.io, a Node server, and the webVR-polyfill.

NodeJS server

On a flat-screen system, the mouse drag turns the world. On the god-viewer version, the user can fly through the system using keyboard controls.
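The god-view fly-through amounts to integrating velocity from whichever keys are held each frame. A sketch under assumed bindings and speed (neither is specified in the post):

```javascript
// One frame of god-view movement: `keys` is the set of currently held keys,
// `dt` is the frame time in seconds, `speed` is in world units per second.
function flyStep(pos, keys, dt, speed = 5) {
  const p = { ...pos };
  if (keys.has('w')) p.z -= speed * dt; // forward
  if (keys.has('s')) p.z += speed * dt; // back
  if (keys.has('a')) p.x -= speed * dt; // strafe left
  if (keys.has('d')) p.x += speed * dt; // strafe right
  if (keys.has('q')) p.y += speed * dt; // rise
  if (keys.has('e')) p.y -= speed * dt; // descend
  return p;
}
```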

In our 10-minute demo to the judges and audience, we managed to get about 25 users simultaneously into (minimal) VR on their phones, connected to our tiny localhost server, which brought the audience into the presentation in a nice way.

Next steps

We’ll clean up this demo and post it live to AWS EC2. A cute eye-blink was prepared but barely implemented, and there is no clear indication of a kill. We’d also love to see the AR version finished to satisfy the multi-perspective goal. A VR scoreboard would be nice.

I’d like to create a master-slave ‘join/release’ mechanism that can apply to the data visualization projects Flow is working on.

As my friend Josh Carpenter tweeted, the project shows us “glimpses of #WebVR empires to come!” Perhaps multi-user VR on the web will lead us to world domination (as a benevolent distributed democracy, of course!).

Skyspheres Released: 360 Panos in WebVR on all platforms


I’ve been working on getting WebVR projects working across all platforms and in all modes, and I’m declaring cautious success!

You can see the result here: skysphere.flow.gl.

skysphere screenshot

This is a simple app to show 360 photos that I’ve taken around California. There are four aspects that I needed to figure out:

  1. 360 image capture and editing
  2. WebVR across all platforms
  3. Control panel interface
  4. Text labeling

360 Image capture and editing

Photography is a serious hobby of mine, and in the last year I’ve used my Samsung Note 4 to capture dozens of 360 photos on my various travels and hikes around California.

I used the built-in 360 Panorama feature of the Samsung Camera app (its photosphere extension) on my Note 4. I found this interface superior to the Google camera app because of the little sphere icon that fills in as you take the 40 or so images required. (Note that this is also possible on any Android phone with the built-in Android 360 capture.) The result is certainly not perfect, and even though I did some serious tweaking to the original files in Photoshop to fix some stitching errors, there are still issues. But overall, it is exciting to think that this is possible with any phone, today.

Sierra Buttes Ridge 2015-07-13x1024

Photoshop editing is pretty easy, except for the views looking directly down and up. Getting the stairs below me at Sierra Buttes to line up was impossible in straight Photoshop without jumping through complicated hoops… oh well, good enough!

Sierra Buttes Steps 2015-07-13x1024

WebVR across all platforms

I used Boris Smus’s WebVR boilerplate: https://github.com/borismus/webvr-boilerplate.

When I started the project a few months ago, getting it working across all platforms was a huge pain. It kept getting better, though, and just as I had everything working pretty well, Boris (at Google) released this boilerplate, so I adopted it. (What’s a little wasted effort when living on the bleeding edge!)

To enter stereo VR mode, just click the little “cardboard” icon in the lower corner.

Control panel interface

I spent way too much time thinking about and experimenting with this control panel.

The design expectations included:

  • no obstructions (control panel or text) of the 360 content
  • ability to navigate to the next photo without going ‘back’ to a menu screen
  • ability to preview what the next photo is, avoiding typical blind forward/back mechanisms. Since these photos are big and slower connections will take a while, the app should only pull in a full 360 photo with intention.

I liked the idea of finding the control panel by looking down, so I experimented with finding the right pitch of the head to have it appear.
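The look-down trigger reduces to a pitch threshold plus a latch, since the panel locks in place once shown. A sketch, where the threshold value is purely an assumption (the post describes tuning the right angle by experiment):

```javascript
// Camera pitch below which the panel appears; negative = looking down.
// -0.6 rad (~ -35 degrees) is an assumed value, found by experimentation.
const SHOW_PITCH = -0.6;

// Returns whether the control panel should be visible this frame.
// Once visible it stays up (latched), so further gaze movement drives
// the selection cursor instead of re-triggering the panel.
function shouldShowPanel(pitch, panelVisible) {
  return panelVisible || pitch < SHOW_PITCH;
}
```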

skysphere screenshot control panel

I wanted it to always appear in front of the camera, but I wanted to avoid doing complex math, so my trick here was to create an invisible object as a child of the camera so that it will always be in front of the camera at a certain z distance. Then I use that object’s location as the location to pop up the control panel. Easy, no math!
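Here is a toy stand-in for that trick. In the real app the invisible anchor is a `THREE.Object3D` parented to the camera; this miniature scene-graph node supports only yaw, which is enough to show that the graph does the math for you:

```javascript
// Minimal scene-graph node: position, yaw rotation, optional parent.
// (A stand-in for THREE.Object3D, for illustration only.)
class Node {
  constructor() {
    this.position = { x: 0, y: 0, z: 0 };
    this.yaw = 0; // rotation about the Y axis, radians
    this.parent = null;
  }
  add(child) { child.parent = this; }
  worldPosition() {
    if (!this.parent) return { ...this.position };
    const p = this.parent.worldPosition();
    const yaw = this.parent.yaw;
    const { x, z } = this.position;
    // rotate local offset by the parent's yaw, then translate
    return {
      x: p.x + x * Math.cos(yaw) + z * Math.sin(yaw),
      y: p.y + this.position.y,
      z: p.z - x * Math.sin(yaw) + z * Math.cos(yaw),
    };
  }
}

const camera = new Node();
const anchor = new Node();
anchor.position.z = -2; // always 2 units in front of the camera
camera.add(anchor);
```

Reading `anchor.worldPosition()` then gives the spot to pop up the panel: always a fixed distance in front of the camera, wherever it looks, with no explicit trigonometry in the application code.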

Once the control panel pops up, then it locks into that position, so that additional gaze interactions will move a cursor over the particular photo to be selected.

I haven’t seen a lot of similar control panel techniques, so I’m interested in getting your feedback on Twitter @jmarshworks.

Text labeling

The typical text labeling mechanism for 360 photos seems to lock the text front and center of the camera, or up in the center of the view, with the text always directly facing the camera. Although this is highly readable, it can also be distracting until it fades. So I adopted a different design, just to see what it felt like (VR interaction design requires experimentation at this point!). I’m obsessed with beautiful letter forms in 3D space, so maybe I’ve gone overboard.

skysphere text example

I actually put the text plane in space, and this plane responds as though it were a real 3D object, particularly in that it’s not locked into facing the camera. While still clearly an artificial addition to the content, I felt like the sense of presence was stronger because it moves naturally. Do you?

Note: I haven’t seen people do this! It seems to make so much sense to me, and I enjoy it much better.

Conclusion

I will next experiment with adding audio to the experience. For each of the photos taken, I had recorded several minutes of ambient audio in stereo on my phone. (Simple, I know, not binaural, nor with a stereo array to spatialize correctly.) My intent was to just see if I could enhance the experience with the simplest audio that any user could capture with the phone. But I lost the files due to a phone glitch!

So I’ll be more careful with the audio files, and next time will see if a streaming audio loop will provide a significant benefit in presence to still photos.