Launch Pad Blog 7/30/17
Animating with the timeline!
We actually ended up working on Virtually Home for a while this week. It took us time to get our house tour video just right, but the version we have now is ready for review by our realtor friends. Rob is working on a ‘walking’ 360 video shot as well. We have three more shoots lined up, and I’ve been working on the admin side of things. I registered a DBA as ‘Virtually Home’ for my company, and went to the bank to ensure deposits would be accepted when they bear the new name.
So Adobe’s Mixamo product is going through some changes… and while they figure out what’s next, they’ve kindly opened up their Mixamo family of characters and animations to anyone with an Adobe ID for free! However, it’s unknown how long this vast treasure trove of resources will be around. So we spent a few hours going over the content and downloading everything we may need for Danciverse, as well as future projects. It was something like 2k files with a bunch of really nice mo-cap data. What’s nice is this animation data can work in Unity3d with any humanoid model if you know how to adjust a few settings. So BunnyGun’s archive of characters and animations has increased substantially. Thank you Adobe!
Along those lines, I proofed out some animation concepts with the collection. I don’t know if this is good practice, but I put all 15 gigs of characters and animations from Mixamo into a single Unity project. It took about 20 minutes to index and get started, but once it did start, performance was actually decent. The idea here is to create a project for setting up characters and animations from the collection into packages, which I can then put into actual projects. Well, it’s working so far! I was able to run a few tests, and it looks like I can indeed take any given character and apply any given animation (with a couple of exceptions). Above is a funny example of a dude named The Boss doing a shuffle to some of our music. This video, while somewhat amusing, shows the use of Timeline as a way to finely control the playback of animations on characters. If there were a way to right-click an animation clip on the Timeline and select ‘duplicate reverse’, it would be an amazing feature. I’ll have to find some program to reverse an FBX animation for me in the meantime. While some of the dance and other animations from the series do loop, several of them do not, so reversing a clip would be helpful for creating loops.
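To make the ‘duplicate reverse’ idea concrete: reversing a clip just means remapping every keyframe time t to (duration − t). This is a rough conceptual sketch in Python, not Unity or FBX code; the keyframe representation here is made up for illustration.

```python
# Conceptual sketch: reversing a keyframed animation clip.
# A clip is modeled as a list of (time, value) keyframes; reversing
# remaps each keyframe time t to (duration - t) and re-sorts.
# This mirrors what a hypothetical 'duplicate reverse' feature would do.

def reverse_clip(keyframes):
    """Return a reversed copy of a clip given as [(time, value), ...]."""
    if not keyframes:
        return []
    duration = max(t for t, _ in keyframes)
    reversed_keys = [(duration - t, v) for t, v in keyframes]
    return sorted(reversed_keys)  # keyframes back in ascending time order

# A non-looping clip played forward, then its reverse appended after it,
# ends where it started -- which is exactly what makes a seamless loop:
clip = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.3)]
loop = clip + [(1.0 + t, v) for t, v in reverse_clip(clip)]
```

The same time-remapping idea applies whether the reversal happens in a DCC tool, an import script, or (someday, hopefully) a right-click option on the Timeline itself.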
The tests were highly amusing, and I had fun mixing and matching characters and sequencing them on the animation timeline. I want so badly to bring Cinemachine into the mix and experiment with nicer camera shots (and I will for another project), but I need to stay focused on GearVR things for now. Side note: as far as my last experiment went, Cinemachine doesn’t do much for GearVR. I was thinking perhaps to use it for a camera path, but I didn’t have luck with that approach.
My mind goes back to optimization… while the motion capture data is usable on humanoid models, I do wonder how viable the characters are for a GearVR game in their current form. Some of them appear to be pretty high resolution. One of them had about 30 draw calls by default. That’s pretty expensive out of the box. So now I’m at the point where I’m spending a couple hours attempting to optimize them as much as possible, to get an idea of just how expensive they’re going to be. I left off at installing MeshBaker into the project.
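The reason mesh combining helps: each renderer roughly costs one draw call per material it uses, so a character split into many sub-meshes racks up calls fast. Here’s a back-of-the-envelope Python sketch of the before/after math (the character and material names are invented for illustration; the actual combining is what a tool like MeshBaker automates):

```python
# Rough sketch of why mesh combining cuts draw calls: each sub-mesh
# submission costs roughly one draw call, while combining meshes that
# share a material collapses them into one call per distinct material.

from collections import defaultdict

def draw_calls_before(sub_meshes):
    """One call per sub-mesh: sub_meshes is [(mesh_name, material), ...]."""
    return len(sub_meshes)

def draw_calls_after(sub_meshes):
    """One call per distinct material after combining shared-material meshes."""
    by_material = defaultdict(list)
    for mesh, material in sub_meshes:
        by_material[material].append(mesh)
    return len(by_material)

# A hypothetical character split into six sub-meshes over three materials:
character = [
    ("torso", "skin"), ("arms", "skin"), ("legs", "skin"),
    ("shirt", "cloth"), ("pants", "cloth"), ("eyes", "face"),
]
```

So a 30-draw-call character could, in the best case, drop to however many unique materials it has; atlasing textures to merge materials pushes the count down further.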
I spent some time learning about Unit Testing in Unity. It has changed a lot in the last few years as far as how Unity has handled it, but the long story short version is this: Unity 2017 has integrated unit testing support into the Editor. There’s a window now called ‘Test Runner’ that manages ‘play mode’ and ‘editor mode’ tests. I’m finding it easy to use, and so I’ve been writing some tests for practice. One issue I always have when developing for VR is that there’s a ton of little checkboxes and options to remember to check. I’ve been working on a unit test for Gear VR that verifies important settings such as ‘multi threading’ are in fact enabled. In the screenshot you can see a few. Discovering how to access the state of these options in code has been .. tedious? But educational. The product though, I really like so far. The idea is that by having this test file, it would be easy enough to share it with folks who want to run GearVR but are not 100% sure if they configured all the settings or not in the Editor. One click unit test, boom! Definitive answers. So I’m now on board with Unity’s new testing tools. I’ll be using them sparingly as I develop in the weeks ahead.
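The actual tests run as C# through Unity’s Test Runner against the project’s player settings, but the underlying idea is simple enough to sketch in plain Python: compare current settings against a required checklist and report every mismatch. The setting names below are illustrative stand-ins, not the real Unity PlayerSettings API.

```python
# Sketch of the 'one-click settings check' idea: compare the project's
# current settings against the values a GearVR build requires and
# report any that differ. Setting names here are hypothetical.

REQUIRED_GEARVR_SETTINGS = {
    "virtualRealitySupported": True,
    "multithreadedRendering": True,
    "use32BitDisplayBuffer": False,
}

def check_settings(current):
    """Return a list of (setting, expected, actual) mismatches."""
    return [
        (name, expected, current.get(name))
        for name, expected in REQUIRED_GEARVR_SETTINGS.items()
        if current.get(name) != expected
    ]

# A project with one misconfigured checkbox:
project = {
    "virtualRealitySupported": True,
    "multithreadedRendering": False,
    "use32BitDisplayBuffer": False,
}
```

An empty result means the project is configured; anything else tells you exactly which checkbox to go flip, which is the whole point of the one-click test.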
I spent a few hours today reading about the .. thrilling.. world of Unity resource management. When I saw the headline ‘5 part series’, I knew I was in for the long haul. Pretty much, I think I’ve walked away with a good overview and a strategy. The architecture aspect of the project is important to me, but at the same time, I need to keep in mind the clock is ticking. This is another area where a fine balance between best practices and over-optimization will need to be found. My approach here is to spend some more time on a simple solution that delivers the performance I need while not making it hard to test and develop. The underlying code of bundle management can always be rewritten after the demo deadline.. there are higher priority items to consider.
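For the ‘simple solution first’ strategy, the minimal viable shape is a load-once cache: load each bundle on first request, hand back the cached copy after that, and unload explicitly when done. This is a hypothetical Python sketch of that pattern, not Unity’s AssetBundle API; the loader callable stands in for whatever actually fetches the bundle.

```python
# Minimal sketch of a 'simple first' bundle manager: load each bundle
# once, serve the cached copy on repeated requests, and unload
# explicitly. The loader function and names are hypothetical.

class BundleCache:
    def __init__(self, loader):
        self._loader = loader  # callable: bundle name -> bundle data
        self._cache = {}
        self.loads = 0         # count real loads, handy for tests/profiling

    def get(self, name):
        """Load the bundle on first request; reuse the cached copy after."""
        if name not in self._cache:
            self._cache[name] = self._loader(name)
            self.loads += 1
        return self._cache[name]

    def unload(self, name):
        """Drop a bundle from the cache so memory can be reclaimed."""
        self._cache.pop(name, None)

cache = BundleCache(loader=lambda name: f"<bundle:{name}>")
```

Something this small is easy to test and easy to throw away, which fits the plan: the fancy reference-counted version can wait until after the demo deadline.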
As far as time goes, I expect to begin working on Danciverse full time after 8/4. I’ve been planning and saving for this opportunity for a while, and I’m looking forward to it!