Architecture of Ideas: FLOW, by Jason Marsh and Iker Jamardo


We presented at the California Academy of Sciences as part of Mediate Art Group’s SoundWave Festival in July, 2016.

“Architecture of Ideas” is an experience of symbolic information in Virtual Reality. The source content is quotes from the book “Flow: The Psychology of Optimal Experience” by Mihaly Csikszentmihalyi. The quotes gain a new double meaning: his ideas about a subjective psychological experience take on a new perspective when applied to a technological immersive experience. In this short multi-user experience, we see ideas float around and through us, visually and sonically, on Virtual Reality headsets and computer screens linked together.

The multi-user experience included three Samsung GearVR headsets, synced with a server so that each user could see the avatars of two other users. The three users needed to work together to gaze in the same direction in order to proceed with each scene of the experience.
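The gaze-sync gate can be sketched as a simple agreement test: proceed only when every pair of users' gaze directions falls within an angular tolerance. This is an illustrative reconstruction, not the production code, and the 15° tolerance is a guess:

```javascript
// Illustrative sketch of the "gaze together to proceed" check (not the
// production code): the scene advances only when every pair of the three
// users' gaze directions agrees within a tolerance angle.
function dot(a, b) {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

// dirs: array of unit-length gaze vectors, one per user
function allGazingTogether(dirs, toleranceDeg = 15) {
  const cosTol = Math.cos(toleranceDeg * Math.PI / 180);
  for (let i = 0; i < dirs.length; i++) {
    for (let j = i + 1; j < dirs.length; j++) {
      if (dot(dirs[i], dirs[j]) < cosTol) return false;
    }
  }
  return true;
}
```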



The Hall of Africa, right next to live penguins, was a great space for the event, with plenty of room for the unfortunately long line. We ran approximately one hundred users, mostly first-time VR users, through the experience.



As users gazed at panels, the sea of letters would animate in sync with a narration track. Since the goal was to engage deeply with the concepts, the narration wasn’t a single voice but multiple takes, artistically timed and mixed, so that phrases repeated and overlapped, ranging in tone from serious to silly. This unusual technique was surprising at first, but users quickly understood that it was by design, and the repetition with variation enhanced the meaning of the content.

At the end, the letters swirl around the user and a giant “FLOW” fades in, larger than the field of view, so users can read it only by turning to see the letters two at a time.


I probably spoke to about 20 people coming out of the experience. Comments included:

“At first, the multiple voices were distracting, even sounded like a bug, but then I got used to it and liked it.”

“I liked the repetition in the voice so you could really think about the words.”

“I was really engaged with the words and towards the end I realized how I was taking it in so much that I wondered if this was what it would feel like to be brainwashed.”

A life coach/corporate motivational consultant had a lot to say about intrinsic motivation and has contacted me afterward.

“The big ‘FLOW’ at the end felt mind-expanding.”

“Enjoyed the flowing letters at the end.” Most of the positive comments were about this.

Some users were “amazed, unlike anything” they’d ever seen.


My ideas for future events:

Throughput. This is and will continue to be a huge challenge in live events. The lines are just going to be long, which might in itself serve as an attractor on a trade-show floor. Perhaps we can develop media so that all that time waiting in line could be used to familiarize users with a product or service. Some good conversations could/should be had at that point.

I would like to think more about how to build expectations without necessarily showing the whole presentation on the preview monitors. There is something to be said for having people experience the unexpected in VR.

Interaction with the presentation for VR virgins should be kept to an absolute minimum. Guided experiences are going to be most effective. But ways that users can pause and ask questions of the guide would be very valuable. It will be challenging to balance the competing interests of “the medium is the message” and “the message is the message.”

User testing follow-up. Get emails and phone numbers to conduct follow-up interviews. I want to see if people remember the content the next day.

Skyspheres Released: 360 Panos in WebVR on all platforms


I’ve been working on getting WebVR projects working across all platforms and in all modes, and I’m declaring cautious success!

You can see the result here: skysphere.flow.gl.

skysphere screenshot

This is a simple app to show 360 photos that I’ve taken around California. There are four aspects that I needed to figure out:

  1. 360 image capture and editing
  2. WebVR across all platforms
  3. Control panel interface
  4. Text labeling

360 Image capture and editing

Photography is a serious hobby of mine, and in the last year I’ve used my Samsung Note 4 to capture dozens of 360 photos on my various travels and hikes around California.

I used the built-in 360 Panorama feature of my Samsung Note 4, via the Samsung Camera photosphere extension. I found this interface superior to the Google camera app because of the little sphere icon that fills in as you take the 40 or so images required. (Note that this is possible with any Android phone, using the built-in Android 360 capture.) The result is certainly not perfect, and even though I did some serious tweaking to the original files in Photoshop to fix some stitching errors, there are still some issues. But overall, it is exciting to think that this is possible with any phone, today.

Sierra Buttes Ridge 2015-07-13x1024

Photoshop editing is pretty easy, except for the views looking directly down and up. Getting the stairs below me at Sierra Buttes to line up was impossible in straight Photoshop without jumping through complicated hoops… oh well, good enough!

Sierra Buttes Steps 2015-07-13x1024

WebVR across all platforms

I used Boris Smus’s WebVR boilerplate: https://github.com/borismus/webvr-boilerplate.

When I started the project a few months ago, getting it working across all platforms was a huge pain, but it kept getting better, and just as soon as I had it all working pretty well, Boris (at Google) released this boilerplate and I adopted his. (What’s a little bit of wasted effort when living life on the bleeding edge!)

To enter stereo VR mode, just click the little “cardboard” icon in the lower corner.

Control panel interface

I spent way too much time thinking about and experimenting with this control panel.

The design expectations included:

  • no obstructions (control panel or text) of the 360 content
  • ability to navigate to the next photo without going ‘back’ to a menu screen
  • ability to see what the next photo is, avoiding the typical blind forward/back mechanism. Since these photos are big, they take a while on slower connections, so the full 360 photos are pulled in only with intention.
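The “pull in only with intention” point can be sketched as a tiny promise cache: thumbnails drive the preview, and the full equirectangular image is fetched once, only when a photo is actually selected. The names here are illustrative assumptions, not the app’s actual code:

```javascript
// Illustrative sketch (not the app's actual code): fetch a full-size 360
// photo only when the user selects it, and never fetch the same URL twice.
const photoCache = new Map();

// fetchImage is whatever loader the app uses (e.g. it wraps an Image load);
// it should return a promise resolving to the decoded image.
function loadPhotoOnSelect(url, fetchImage) {
  if (!photoCache.has(url)) {
    photoCache.set(url, fetchImage(url)); // the big download starts here, once
  }
  return photoCache.get(url);
}
```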

I liked the idea of finding the control panel by looking down, so I experimented with finding the right pitch of the head to have it appear.
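Finding “the right pitch” reduces to one line of trigonometry on the gaze direction: the panel appears once the forward vector dips below a threshold angle. The −35° threshold below is an illustrative guess, not the value the app actually uses:

```javascript
// Illustrative sketch: reveal the control panel when the head pitches down
// past a threshold angle. The threshold here is a guess, tuned by feel.
const PANEL_PITCH_DEG = -35;

// forward: unit-length gaze direction { x, y, z }, with y pointing up.
// Pitch is the angle above (+) or below (-) the horizon.
function pitchDegrees(forward) {
  return Math.asin(forward.y) * 180 / Math.PI;
}

function shouldShowPanel(forward) {
  return pitchDegrees(forward) <= PANEL_PITCH_DEG;
}
```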

skysphere screenshot control panel

I wanted it to always appear in front of the camera, but I wanted to avoid doing complex math, so my trick was to create an invisible object as a child of the camera, so that it always sits at a fixed z distance in front of the camera. Then I use that object’s world location as the location to pop up the control panel. Easy, no math!

Once the control panel pops up, it locks into that position, so that additional gaze movement drives a cursor over the particular photo to be selected.
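The pop-up-then-lock behavior can be sketched as a tiny controller: while hidden, the panel tracks the invisible camera-child’s world position; once shown, it freezes in world space so further gaze movement drives the selection cursor instead. The names here are illustrative, not the project’s actual code:

```javascript
// Illustrative sketch of the pop-up-then-lock behavior. getAnchorWorldPos
// would return the world position of the invisible child of the camera.
function makePanelController(getAnchorWorldPos) {
  let lockedPos = null; // world position, set once the panel is up
  return {
    show() {
      if (!lockedPos) lockedPos = { ...getAnchorWorldPos() }; // freeze here
    },
    hide() {
      lockedPos = null; // the next show() re-anchors in front of the camera
    },
    position() {
      return lockedPos || getAnchorWorldPos(); // track until locked
    },
  };
}
```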

I haven’t seen a lot of similar control panel techniques, so I’m interested in getting your feedback on Twitter @jmarshworks.

Text labeling

The typical text labeling mechanisms for 360 photos seem to lock the text front and center of the camera, or up in the center of the view, with the text always directly facing the camera. Although this is highly readable, it can also be distracting until it fades. So I adopted a different design, just to see how it felt (VR interaction design requires experimentation at this point!). I’m obsessed with beautiful letterforms in 3D space, so maybe I’ve gone overboard.

skysphere text example

I actually put the text plane in space, and this plane responds as though it were a real 3D object, particularly in that it’s not locked into facing the camera. While still clearly an artificial addition to the content, I felt like the sense of presence was stronger because it moves naturally. Do you?

Note: I haven’t seen people do this! It seems to make so much sense to me, and I enjoy it much more.
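The difference from the usual approach comes down to one per-frame update that this design deliberately omits: a billboarded label recomputes its yaw every frame to face the viewer, while a world-fixed label keeps the orientation it was given and foreshortens naturally as you move. A minimal sketch, with illustrative names:

```javascript
// Illustrative sketch: billboarding vs. a world-fixed text plane.
// A billboard runs billboardYaw every frame; the world-fixed label used
// here sets its orientation once when placed and never updates it.
function billboardYaw(labelPos, cameraPos) {
  // yaw (radians) that turns the label's front (+z) toward the camera
  return Math.atan2(cameraPos.x - labelPos.x, cameraPos.z - labelPos.z);
}

const label = { yaw: 0.8 }; // world-fixed: chosen once at placement time
// A billboard would instead run, per frame:
//   label.yaw = billboardYaw(label.position, camera.position);
```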


I will next experiment with adding audio to the experience. For each of the photos taken, I had recorded several minutes of ambient audio in stereo on my phone. (Simple, I know, not binaural, nor with a stereo array to spatialize correctly.) My intent was to just see if I could enhance the experience with the simplest audio that any user could capture with the phone. But I lost the files due to a phone glitch!

So I’ll be more careful with the audio files, and next time I will see whether a streaming audio loop adds a significant sense of presence to still photos.