This week I had a chance to attend the 2016 Virtual Reality Kickoff at the Surf Incubator in downtown Seattle and finally demo my first virtual reality HMD (head-mounted display), the HTC Vive. The event was hosted by a group I joined through Meetup.com, made up of designers, engineers, developers and enthusiasts, and created by Ricardo Parker, the event sponsor and founder of ChronosVR, the first school in the world to offer a curriculum aimed specifically at teaching design and development for virtual reality using applications such as Maya, Unreal and ZBrush.
Throughout the evening I got the chance to meet some really fantastic people, among them someone creating an articulating VR control glove and the prototype developer from HTC. I spoke with a number of people about the open virtual world Second Life, my experiences there and the coming of Project Sansar, also from Linden Lab. I was pleasantly surprised that most of them had heard of Second Life or were familiar with it and were genuinely interested, though most did not really grasp quite what it was or that it was as diverse as I described. I spoke about the currency system and how it worked, the various open aspects and how people do what interests them, the social and marketing aspects and so on. I also mentioned attending a wedding and a funeral, both very real, inside a virtual world. Oddly, the thing which drew the most interest was the fact that Cheryl (who was at the event with me) and I met and fell in love in Second Life, were together there for several years, and have now come together in real life from a country apart. Most of them had no idea that such things could or did take place.
I came away with two distinct impressions from the conversations about this new phase of virtual reality. First, the HTC Vive is the hardware of choice for pretty much everyone there: it was what everyone seemed to be waiting for, regarded as the clear leader in terms of performance, certainly ahead of the Rift. Second, several people I spoke with believe that HMD units which use your existing phone will be the eventual market leader because of the entry price point, especially with the 4K phone screen resolutions which will hit the market this year.
After the initial opening discussion and Q&A session, a line quickly formed to demo the Vive, each person taking a few minutes to try one of perhaps 9 or 10 different “experiences” with the hardware, which consisted of the head-mounted display, a separate headphone unit (nice, I can keep my own!), two identical wireless hand controllers and several base station units called “lighthouses,” which are the most often misunderstood element in the setup. How they work is pretty amazing:
- Lighthouse stations are passive: they just need power to work. There is no radio signal between the lighthouse boxes and the Vive or PC (the stations do, however, communicate with each other via radio for synchronization purposes).
- They work literally just like lighthouses in maritime navigation: they send out infrared light signals which the Vive’s IR diodes can see. There is a GIF from Gizmodo showing an early prototype working: “Lighthouse: how it works.”
- Three different signals are sent from the lighthouse boxes:
1. First, both stations emit an omni-directional, synchronized flash that the Vive headset and the controllers treat as a “start the stopwatch now” command.
2. Then each station transmits two IR laser sweeps consecutively, much like a scanning line passing through the room: one sweep travels horizontally, and the one after it travels vertically.
3. The Vive’s IR diodes register each laser sweep at slightly different times, because the sweep’s angular motion reaches each diode at a different moment. From these (tiny) time differences between the flash and the sweeps, and because the positions of the IR diodes on the Vive’s case are fixed and known, the exact position and orientation can be calculated. A video on YouTube, “HTC Vive Lighthouse Chaperone tracking system Explained,” illustrates the process.
- Lastly, the calculated positions/orientations are sent to the PC along with other position-relevant sensor data.
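The timing-to-angle arithmetic behind step 3 can be sketched in a few lines of Python. The rotation rate and the axis conventions below are illustrative assumptions for the sake of the sketch, not the Vive’s actual firmware values:

```python
import math

# Assumed nominal rotor rate: each Lighthouse sweep motor completes one
# full rotation in 1/60 s. The real hardware timing is a firmware
# detail; treat this constant as a placeholder.
ROTATION_HZ = 60.0

def sweep_angle(t_hit, t_flash=0.0):
    """Angle (radians) the laser has swept between the sync flash and
    the instant the beam crosses a photodiode."""
    return 2.0 * math.pi * ROTATION_HZ * (t_hit - t_flash)

def ray_direction(t_horizontal, t_vertical):
    """Combine the two sweep angles into a unit direction vector from
    the base station toward the photodiode (azimuth/elevation form)."""
    az = sweep_angle(t_horizontal)  # horizontal sweep -> azimuth
    el = sweep_angle(t_vertical)    # vertical sweep -> elevation
    return (math.cos(el) * math.sin(az),
            math.sin(el),
            math.cos(el) * math.cos(az))
```

With many photodiodes at known positions on the headset’s shell, the per-diode rays from each base station over-constrain a pose solve (similar in spirit to the perspective-n-point problem in computer vision), which is how a single flash-plus-two-sweeps cycle yields both position and orientation.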
While in use, the user is tethered by a single, 3-strand cable about 15 feet long that connects from the PC to the back of the headset. The controllers are a pistol-grip style with a trigger and a directional thumb pad which accepts clicks and linear or circular swipes for a variety of input configurations.
As I stood in line, I had time to reflect on a few things: how long I had waited to do this and was finally going to, several years of research and study, the 1,001 discussions I have had over the last decade about the coming virtual revolution and how it will affect our culture, and how I might look back on this moment many years from now as one which helped define the direction my life took. I must admit that I went into this with high expectations, prepared to be pleasantly surprised at what I was about to see and eager to find out if my personal beliefs about the future were warranted. The irony that I had spent almost a decade inside a virtual world yet was totally unfamiliar with what the majority of tech-savvy people now consider “virtual reality” weighed heavily on my mind. I was nervous like a whore in church. It was my turn.
First I put on the headset, which was comfortable and not as heavy as I expected (though rather hot from use), and then the wireless headphones.
The application was started and the empty black void was broken by the appearance of a 3D model of the controller floating in front of me as it was held out for me to grasp by someone in the room. With one click it was chosen as the brush.
A second controller model appeared, which I took in my right hand. Clicking its trigger set it as the menu palette, which appeared as a cube of four holographic menus that you turn your wrist to access, each with icons for brush shapes, a color wheel/value slider, paint styles and so on. Pointing the brush at the palette produced a laser-pointer-ish line used to make selections, very accurate and responsive. Rotating my wrist or hand even slightly produced an equivalent motion on the models, accurate to the point of making me want an adjustment for the overall response curve, and a speed adjustment.
Aside from a very slight disorientation at first, more from the blank void/sensory deprivation than anything else, there was no feeling of motion sickness, dizziness or loss of balance. Once the visual anchor of the controllers defined the space for me, I became instantly comfortable within the virtual space. This may be partly because of my previous exposure to navigating in 3D space, but at no time did the frame rate or motion disrupt my perception to the point of being noticeable or uncomfortable. Other than an occasional flicker, the 90 frames per second were sharp, the colors saturated and the contrast level excellent. The roughly 110-degree field of view eliminated any portal/window-frame effect.
What happened next was well beyond what I was prepared for.
For my demo, I followed Ricardo’s advice and asked to try Google’s “Tilt Brush,” which lets you experience painting within a volume of space, not just on a 2D surface.
After I selected the paint color and the oil-paint brush, swiped my left thumb across the pad to enlarge the brush radius and squeezed the trigger, it became readily apparent that everything I understood about the process of painting had been redefined in the span of about ten seconds. In many ways it felt like a combination of painting and sculpting, and my brain kept switching back and forth between brush technique and working with the volume or direction the strokes occupied. Brush left, right, up, down – OK, got it. Then: whoa! Brush forward, back, under, in front of, behind, around…even around ME. Essentially, I was sculpting a painting, and it was everything I loved about traditional mediums and everything I love about digital ones. It let me use what I already understood in a completely new way.
Some of my first memories are of doing art. I did not know that is what it was, or wonder why; it was simply something that was always there, as much a part of me as my hands. I did not question it, and it eventually became my passion, my career and my way of life. At 47, I had assumed that creativity and the many processes whereby one shares ideas were soundly established, the boundaries clearly defined. Google and HTC changed that. What I experienced in a few minutes of walking around my paint strokes – through them – and sculpting with paint strokes in three dimensions blew me away. I was so moved by that brief glimpse into a new, dark universe that my brain did not shut off until 5 a.m. I lay awake, replaying the experience in my head, generating new ideas and thoughts about what I want to do with this, how badly I now must have this thing in my life. I would have punched my own face to get an hour inside that headset.
As I write this a day later, I recall how amazing the experience was and what it felt like. I was instantly comfortable inside the painting space; the newness of the controller very quickly got pushed to the background as my brain settled comfortably into the familiar processes and motions I already knew but was now applying in a very new way. It was like finding a hidden room inside your own house. I felt comfortable and wanted to sit down and paint for a week, a month, until they cut the power off or I have to pee, whichever comes first. In my head (and possibly out loud) I kept repeating the same phrases: “Oh my God. This is going to change everything. This is bigger than I thought. This is going to change our society. Are we ready for this? Holy shit, we aren’t prepared for this, and it’s here. The future is here, right now, on my face.”
The only thing which bothered me was that while I could move around and did take a few steps, the rather thick cord on the ground made me hesitant to venture more than a step or two from the starting point, concerned I would trip and yank the PC off the desk. With the cordless hand controllers, the thought that I might knock someone in the head or tip over the table lamp did not even register; I imagine that removing the cable from the headset would produce a similar fearlessness, which may actually be a good reason for leaving it there. The base stations would produce a planar grid if you approached the wall of the room too closely, a neat feature, but not knowing where the cord was stuck in my mind. I think I would rig something to keep the cord up at the ceiling somehow. I would also like to see a mic on the headphones, and perhaps some kind of small fan to keep the headset cooler. Everything else about my time inside that headset, even at this early stage of its development, exceeded what I had imagined. I am already trying to decide which room in my house is going to become the virtual den, devoid of any furnishings. Now if only Pixologic would make ZBrushVR for the Vive, my life would be complete.
This experience has made me redefine what I understood about both art and virtual reality, but not in the ways I thought it might. Comparing the two in terms of first reactions and immersion, my time in Second Life was like being able to see through an open door into a new world; tonight, for the first time, I was able to step into that world and shut the door behind me. I went inside. It is something I have wished for in Second Life since my first login – 3,381 days ago. For each of those days, I have spent most of my time as a 3D avatar – living, playing, working, socializing and even loving there – which has given me a somewhat…unusual perspective on extended exposure to virtual reality. If I could offer civilization one single bit of advice after my experience with the HTC Vive and what this type of technology is about to do to it, it would be: put your seat belts on. If this is any indicator of where we are going, it is going to be one hell of a ride.
To conclude, please consider these thoughts from the French poet and philosopher Paul Valéry in 1931, as they are perhaps more relevant now than ever:
“Our fine arts were developed, their types and uses were established, in times very different from the present, by men whose power of action upon things was insignificant in comparison with ours. But the amazing growth of our techniques, the adaptability and precision they have attained, the ideas and habits they are creating, make it a certainty that profound changes are impending in the ancient craft of the Beautiful. In all the arts there is a physical component which can no longer be considered or treated as it used to be, which cannot remain unaffected by our modern knowledge and power. For the last twenty years neither matter nor space nor time has been what it was from time immemorial. We must expect great innovations to transform the entire technique of the arts, thereby affecting artistic invention itself and perhaps even bringing about an amazing change in our very notion of art.”
Paul Valéry, “La Conquête de l’ubiquité,” in Pièces sur l’art, 1931
In 5 minutes, my notion of art has been changed. I have no remaining doubt that this technology will change us considerably, for good or ill.