The conference was in September 2018.
Oculus Connect 5 was the fifth conference hosted by Oculus, this time in San Jose, California. It was a two-day event with announcements, demos, and talks. Because I am part of Oculus Start (a developer program they run), I was able to attend the event. Thanks to Oculus and the Oculus Start team!
It’s a little funny to have an in-person conference for virtual reality technology. We aren’t at the point where virtual meetings approach the value of in-person ones, though. The first steps are being taken: the keynote presentations were broadcast in 180-degree videos via Oculus Venues, and you could sit in a virtual audience with other spectators while watching them.
The conference attendees were the most diverse (in gender, ethnicity, and age) I had ever seen at a tech conference. It was very heartening to see. The production values of the keynote talks were also very high, close to what I imagine Apple keynotes are like.
According to Oculus, the success of the Oculus Go has surprised them. They were hopeful about it, but so far it is outperforming their expectations. They do admit that the most common use is watching videos. I am happy to hear that it is doing well, since I am a great fan of the Go and this means it will receive continued support. I am developing Cave Defender for it and will have it out later this year.
The Oculus Quest was announced, which is their next headset due out in 2019. It is a standalone device like the Go (compute in the headset) but has Rift Touch controllers, and inside-out tracking (so no external tower sensors for positional tracking). I had the chance to try it in an arena-scale, multiplayer demo of Dead and Buried. The UX was great and is what I would expect from Oculus. The in-HMD compute means developers will have to get very creative to deliver high-quality, high-performance experiences.
John Carmack (CTO of Oculus and long-time game development hero of mine) was often found in the hallways, chatting with a growing crowd. When he wasn’t in a hallway, he was in dedicated sessions or giving his exclusive keynote speech.
Carmack loves to get into the details about things, and is known for being transparent about Oculus and his feelings on all related subjects. Despite being a CTO, he also likes to dive in and dig deeply into technical problems to get them solved. In some cases it is to fix a flaw (like coming up with chromatic aberration correction with no performance hit, just a little extra power usage on the Go); in others it is to provide what he describes as ‘lighthouse examples or experiences’: sample projects and code that demonstrate a solution to a technical or user experience problem.
What surprised me most is how much Carmack talked about user experience. You can tell that Oculus as a company cares about this: their headsets are comfortable, and the software ecosystem is fairly unified and mostly feels consistent. Carmack talked about internal reviews he did of the user interface metaphors used to communicate with users, and the resulting efforts to further unify and standardize them. He also emphasized consistent, high performance for apps and experiences, and went over the pros and cons of different approaches to getting that result. Some other VR user experience issues he spoke about included:
- Text readability
  - should be large enough to easily read without strain
  - should be rendered crisply, with anti-aliasing
  - should not be so large that you have to move your head as you read
  - the Oculus Store has minimum requirements for this
  - mobile hardware actually makes this not as costly as on PC, though it may be trouble for deferred rendering approaches
- If you really insist on locomotion, follow the best practices
- In general, follow VR best practices! For example, fading to 100% white is painful. (I am guilty of this in Cave Defender and changed it after learning about it.)
- Consistency and clarity in UI
  - use the same metaphors; there should not be many ways of communicating the same things
Carmack also briefly touched on Vulkan support on the Go – it is coming, but it hasn’t yielded as much benefit as they expected. The MultiView stereo optimization was much more impactful; it reduces the duplicate work performed when rendering a separate view per eye.
Both Unity and Oculus hosted office hours, which were available by appointment. The time slots filled up quickly, but I managed to get a spot with Unity. I met with Dan Miller who graciously answered all the questions I prepared.
If you are building for the Oculus Go with Unity, there are a few easy wins I learned about:
- 72 Hz mode, which can be turned on with a line of code
- The Multiview optimization (‘Single Pass Stereo Rendering’ in Unity)
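For the 72 Hz mode, here is a minimal sketch of the kind of line of code involved, assuming the Oculus Utilities for Unity package (`OVRManager`/`OVRDisplay`); exact property names may vary between SDK versions, so treat this as illustrative rather than definitive:

```csharp
using UnityEngine;

// Hypothetical setup script: attach to any GameObject in the scene.
public class GoPerformanceSetup : MonoBehaviour
{
    void Start()
    {
        // The Go defaults to 60 Hz; request 72 Hz if the device reports
        // it as an available refresh rate.
        foreach (float freq in OVRManager.display.displayFrequenciesAvailable)
        {
            if (freq >= 72.0f)
            {
                OVRManager.display.displayFrequency = freq;
                break;
            }
        }
    }
}
```

The Multiview optimization, by contrast, is not enabled in code: it is a project setting (the Stereo Rendering Method under Unity's XR/Player Settings), so it only needs to be switched on once.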
Overall it was a great conference: I learned a lot, met new people, caught up with former colleagues and even managed to run into some MMA legends in my hotel lobby. I’m looking forward to Oculus Connect 6!