I haven’t updated in a little while, but we’ve still been doing a thing a week - it just turns out that things like LARPing really interrupt the whole formality of the process.
So what have we been up to?
The weekend after my last post on this blog, I went to a VR game jam where I got a chance to check out Valve’s VR API, and it’s pretty awesome. It currently shines in a few spots where the Oculus VR API needs work. Specifically, you can actually use the default Unity skyboxes with Valve’s VR libraries, whereas Oculus has been struggling to fix that feature since day one and still hasn’t got it right despite multiple updates. Oculus just isn’t concerned with specific engine implementations the way that Valve is.
The coolest part for me was to get to use a VR HMD that actually had positional tracking for your head. We were challenged to create game play that took advantage of this new feature in interesting ways.
In our game (which I’m gonna call Giraffe FUNK, because we never really settled on a name), you play as a funky giraffe who lives outside a hotel, trying to stick your head in the window to eat people’s food without disturbing them. Here are some pictures (the giraffe’s head is posed; actual gameplay was from a first-person perspective):
The tricky parts were all solved by the honorable space Rev. Chris “Dinosaurs” Maire, and most of them revolved around the fact that a giraffe and a human don’t have the same length of neck. It was weird at first to stick your neck out and have it reach only as far as your real neck goes. On the one hand, that’s exactly what the device was designed for - immersing you, as a human, in a virtual (but equally human) avatar. On the other hand, the natural limits of my own neck broke immersion when the avatar had a neck as long as a football field. But Chris was a hero, and with a few expert tweaks to the way the game handled your head’s movement speed, he got it to actually FEEL like your neck was super long. It was fun, challenging gameplay that took advantage of the unique features of the Rift’s head tracking in a creative way!
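I don’t know exactly what Chris did under the hood, but one common way to get that long-neck feel is to amplify the tracked head offset: measure how far your real head has moved from a neutral anchor point and multiply that displacement by a gain before applying it to the avatar’s head. Here’s a minimal sketch of the idea in Python - the function name and gain value are my own illustration, not the actual game code:

```python
def amplify_head_offset(tracked_pos, anchor_pos, gain):
    """Scale the head's displacement from a neutral anchor by `gain`.

    A gain of 1.0 reproduces real head motion exactly; larger values
    turn small real movements into big virtual ones - handy when the
    avatar's neck is far longer than a human's.
    """
    return tuple(a + gain * (t - a) for t, a in zip(tracked_pos, anchor_pos))

# Leaning 10 cm forward with a gain of 8 moves the giraffe's head 80 cm.
head = amplify_head_offset((0.0, 0.0, 0.10), (0.0, 0.0, 0.0), 8.0)
```

The nice property of scaling a displacement (rather than adding a fixed offset) is that standing still feels normal and rotation is untouched, so the exaggeration only kicks in when you actually lean.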
I was also LARPing that weekend, so I didn’t get to work on the actual VR part of the game we made. Instead, I tried to learn as much as I could from my teammates about the challenges of implementing VR so I can keep applying that to my own projects. I felt bad about abandoning them for a whole day, but I have plenty of time to experiment at home on my own dev kit, and none of the other members of the team had a Rift dev kit of their own - they probably got a lot more out of doing it themselves than I would have.
Last week I took a break; I really needed to unwind. I’m kicking myself for it now, but I DID learn a lot about leatherworking for my LARP project - building my own spider carapace - and I’ve constructed a basic pattern for the garment. Draft 2 will happen this week, probably culminating with me cutting a first draft of the armor out of yoga mat and pleather so that I have an idea of how to assemble the garment before I try to make it for real with actual leather - a process that is both difficult and extremely costly. I’d rather waste the $20 yoga mat first.
Additionally, this week I have been working on rigging up a character to use both the Oculus Rift and the Hydra - which was my original goal for the jam. I HAVE already managed to set up the Rift and the Hydra on the same human avatar, but I’m having trouble getting the inverse kinematics to work correctly and… the hands don’t stay attached. It’s just a matter of time, but that’s my goal for this week in terms of game development: Get that character rigged.
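For anyone fighting the same “hands won’t stay attached” problem: a standard approach is to solve the arm as an analytic two-bone IK chain and pin the end effector to the tracked controller position every frame, clamping the target onto the arm’s reachable range when the controller strays too far, instead of letting the hand drift off the chain. Here’s a minimal 2D sketch of that general technique - my own illustration, not the actual rig code (a real Rift + Hydra rig works in 3D and also has to solve the elbow’s swivel):

```python
import math

def two_bone_ik(shoulder, target, upper_len, fore_len):
    """Analytic two-bone IK in 2D: returns (elbow, hand) positions.

    The hand lands exactly on the target whenever it is reachable;
    out-of-range targets are clamped onto the arm's reach so the hand
    never detaches from the chain.
    """
    sx, sy = shoulder
    dx, dy = target[0] - sx, target[1] - sy
    dist = math.hypot(dx, dy)
    min_reach = abs(upper_len - fore_len)
    max_reach = upper_len + fore_len
    if dist < 1e-9:
        # Degenerate target on the shoulder: fold the arm along +x.
        dist = max(min_reach, 1e-9)
        dx, dy = dist, 0.0
    else:
        reach = max(min(dist, max_reach), min_reach)
        dx, dy = dx / dist * reach, dy / dist * reach
        dist = reach

    # Law of cosines gives the angle between the upper arm and the
    # shoulder-to-target line.
    cos_a = (upper_len**2 + dist**2 - fore_len**2) / (2 * upper_len * dist)
    alpha = math.acos(max(-1.0, min(1.0, cos_a)))
    theta = math.atan2(dy, dx) + alpha  # negate alpha to bend the other way

    elbow = (sx + upper_len * math.cos(theta), sy + upper_len * math.sin(theta))
    hand = (sx + dx, sy + dy)  # exactly the (clamped) target
    return elbow, hand
```

Because the hand position is defined as the clamped target rather than the tip of an integrated chain, small solver errors can’t accumulate into a visibly detached hand.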
I think this is a good time for me to experiment with gameplay, without a specific agenda in mind. The game jam really inspired me in that way - nobody came into the event with a specific goal other than to do something with the positional tracking and the rift that they couldn’t have done without it. It rocked.
Here’s a pro tip though: it is really, really difficult to develop a game without constant access to all the hardware you need. Emulating the head tracking on my generation 1 Oculus Rift dev kit was not the same as having the real thing when we went upstairs to Valve’s demo area to test - not the same at all. When you’re building gameplay specifically to take advantage of those features, you really ought to have the hardware in front of you the whole time, preferably multiple units per team. Having said that, the setup they had worked fine for the number of teams we had there (fewer than ten teams of five, with two devices that had positional tracking). The only problem is that some teams wound up building games where the star of the show was the hand tracking from the Razer Hydra instead of the head tracking on Valve’s device. The teams that kept head tracking front and center really shone, though, and I was psyched to see the intense creativity showcased at that event - which, from what I understand, was the first jam ever to use HMDs with positional tracking. The previous jam I participated in used VR but didn’t allow anything other than mouse and keyboard / Xbox controller for input devices (no Razer Hydra = no positional tracking). It was such a cool thing to be a part of and has been a major inspiration for my continuing work!