Unfortunately I haven't been able to work on the game for a while now, and I will have less time to work on it because I started teaching game design!
Every day I go back and forth between a high school and a middle school to teach coding within the context of game design.
For the high schoolers it's a little easier because, well, they're high schoolers. For them I'm teaching GDevelop (a great no-code 2D game engine) and CopperCube (a great no-code 3D game engine), with Piskel and Tinkercad for asset creation. The point of that class is to explore game development as an art form, with less focus on the code aspect.
For the middle schoolers, I'm teaching Tinkercad for 3D modeling (skills that also apply to 3D printing), and then importing those models into CoSpaces.
CoSpaces is surprisingly underused in education, and it's essentially a step up from Scratch. It uses block-based code, but in a 3D environment that, when paired with a smartphone, can be viewed in VR and AR--my two favorite bits of tech.
At the high school, each student has their own laptop--all Macs, incidentally.
At the middle school, however, the students have Chromebooks. That limits me to web-based software (which CoSpaces and Tinkercad both are).
The only problem is that...
1) The wheel-in large monitor isn't available yet, so it's hard for the students to see what I'm explaining.
2) Oh, yeah, they don't even have the chromebooks yet...
So this week it's impossible for me to teach anything without computers! I gave a 15-minute introduction to what they will be learning, but aside from that, there isn't much else I can do.
Luckily today is my last day of the week (they don't have code on Fridays), and they should be getting the computers Monday.
On Fridays, however, there are electives they can choose from. This is super cool, except that in my coding elective they'd most likely end up doing the same things they do during the rest of the week. So I'm trying to replace the coding elective with magic instead. It would be a truly unique offering (no other school has one), and performing magic teaches kids to speak in front of crowds and builds their confidence. Hopefully that will work out...
Thanks for reading!
Building off of what I talked about two posts ago, I just wanted to give a little update on the VR Sword Combat.
I mentioned last time that my system, when paired with something like FinalIK, can be used on humanoid characters as well. This is absolutely true, but I found a better, simpler way of doing it. It's a simple script you apply to the hand bone; after setting some parameters, the arm behaves the way it should. Additionally, it still respects animations with no changes needed.
I originally thought I'd have to use an Avatar Mask to mask off the arms that are to be moved with IK (not animations), but luckily I don't even have to do that!
I am still working on making it the best it can be before release on the Asset Store (and I still have to resolve some dependencies on VRTK 3.x).
Thanks for reading!
As promised, here's another blog post about the music of the upcoming VR game!
This will be short and sweet, but it is important to note:
Each piece of music in the game is expertly crafted by my brother Eli Mantell to match the tone of every scene.
Every cutscene, trailer, interaction, and combat sequence has its own unique backing track that, hopefully, matches the mood. With the upcoming announcement/teaser trailer, I hope you will all get a sense of the great audio work to come...
Thanks for reading!
So after taking a little time off to work on other important matters (see future post!), I finally got back to working on the new game, only to see that a new version of Unity is out--cool!
But, long story short, when I tried updating the project, it broke many things that were important to my workflow.
Luckily, after the unfortunate Oh Captain incident, I had version control set up via the GitHub Desktop app, and it has served me well in the past, so I wasn't too worried.
BUT, for some reason it wouldn't let me revert to an earlier commit! Granted, at this point I still wasn't fully aware of how Git works.
Now everything is working fine except one thing: Unity's tool for recording cutscenes and exporting them as MP4 files--something I am using heavily in my quest to make the perfect teaser trailer. It's throwing a bunch of errors and only recording the first frame of the cutscene. I updated it to the newest version and haven't tested it yet. Hopefully it will work now.
On Friday I have another post about the really cool music in the game. Stay tuned for that!
Thanks for reading!
So, without any sort of consistency to this blog, here's another post!
Lately I've been obsessed with getting sword combat in VR just right. BUT, when I say "just right," I don't mean the most realistic. GORN's combat is great but it isn't realistic in any way. Blade and Sorcery's combat is also great, but it's too realistic for what I'm going for, so I have to work right between those two.
Any humans in my game use guns to fight, but can still be killed by a sword or a stealth kill. The ones with melee weapons, however, are the robots. They're sort of like the lightsaber-wielding robots in Vader Immortal (which I highly recommend!), but fit more into the world I am creating.
With some help from some inspiring articles on VR combat on the internet, I think I've found a good middle ground.
Essentially, enemy swords detect the player's sword within a specified range and position themselves in a way that effectively blocks the player from hitting or stabbing the enemy.
With each blocking movement, the enemy also tries to swipe its sword across the player to deal damage, and it's up to the player to block back.
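To make that concrete, here's a rough sketch of the blocking logic in plain Python (the real project is a Unity/C# codebase; the function names, ranges, and distances here are all illustrative, not actual code from the game):

```python
# A minimal, engine-agnostic sketch of the blocking idea, using plain
# tuples as 3D vectors. All names and values are illustrative.
import math

def length(v):
    return math.sqrt(sum(c * c for c in v))

def block_position(enemy_body, player_sword, detect_range, guard_distance):
    """Return where the enemy sword should move, or None if out of range."""
    to_sword = tuple(s - b for s, b in zip(player_sword, enemy_body))
    dist = length(to_sword)
    if dist > detect_range:
        return None  # player sword too far away; keep the idle pose
    # Place the blocking sword along the threat line, guard_distance away
    # from the enemy's body, so it intercepts a swing or a stab.
    direction = tuple(c / dist for c in to_sword)
    return tuple(b + d * guard_distance for b, d in zip(enemy_body, direction))
```

The enemy would run something like this every frame, easing its sword toward the returned position rather than snapping there.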
None of this would work without a good sword physics collision system, and with a bit of work, I've got a pretty good way of doing that too. I won't go into too much detail for reasons I'll get to,*** but the gist of it is that when the player picks up the hilt of the sword, the blade follows it with a configurable joint that is set up to collide with objects while still not moving away from the player's hand unless it really has to.
Right now the swords just kinda hover in front of the placeholder enemy bodies, but when I'm finished, the swords will be hooked up to an inverse kinematics arm system attached to the body. Because these are robots we're talking about, I can have the arms look like whatever I want, coming out of wherever I want, so this isn't a problem. What's so cool about my system is that, when paired with something like Final IK from the Asset Store, it can easily be applied to human enemies as well.
*** Now, the reason I don't want to get into it too much right now is that I'm considering taking my entire VR combat system--including the robot IK arm, the combat, the player sword, and a few other bonuses (like my collision sound system for VR and more)--and putting it up on the Asset Store for anyone to use, which is pretty exciting! :D
The only problem is that right now everything is deeply connected to VRTK 3.x. While this works great for me, it might not be so great for anyone else using a different VR interaction system. So if I want to publish it anywhere, I have to make it VR-Framework-Agnostic, which is a little more work. In any case, we will see!
One more thing: I'm considering turning all previous blog posts that show major features into a few video devlogs with clips of what they look like in game. So be on the lookout for that!
Thanks for reading!
Yes, I'm back!
Development has halted for a little bit due to technical difficulties with my WMR Headset, so after much tinkering, I've decided it was best to move on with newer tech.
After much research, I settled on the new Rift S headset, and I gotta say... this thing is awesome. The lack of external sensors is always great, I've had no issues with the tracking so far, and, despite what everyone else says, the audio is pretty good too!
The problem now is that I have to make a choice: do I stick with targeting SteamVR, or move on to the Oculus SDK? I tried just using SteamVR, this time in Oculus mode, but it caused odd delays when picking up and dropping objects, which basically made throwing impossible. I found no mention of this issue online... except for one random comment on one random YouTube video. Thanks, Google!
Since I've been using the VRTK Unity Package for my VR development, I can actually target BOTH SteamVR and Oculus. Down the line, I'm gonna have to get a tester to mess with a demo on Vive or WMR, but for now, the Oculus SDK + VRTK is working pretty great.
I'm having trouble getting the hand animations working, but everything else works fine. I have no doubt that I'll be able to fix it soon enough.
In the game there are certain objects that you can throw like boomerangs. Kinda like ninja stars--but not ninja stars.
Originally I had the player hold one, then press a button to launch it spinning, but after the switch to Oculus, and because the input mappings are a bit different, I found a really simple solution:
Add a multiplier to the force of your throw. This means that all the player has to do is flick their wrist and the object will spin out of their hand in a really satisfying way.
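In code form, the idea is tiny; here's a hedged Python sketch (the actual project uses Unity/C# and VRTK's grab-and-release events, and the multiplier value here is made up, not the game's real tuning):

```python
# Flick-to-throw sketch: on release, the object's velocity is the tracked
# hand velocity scaled by a tunable multiplier. In Unity this would set a
# Rigidbody's velocity; names and values here are illustrative.
THROW_MULTIPLIER = 3.0  # example value, tuned by feel

def release_velocity(hand_velocity, multiplier=THROW_MULTIPLIER):
    """Scale the hand's velocity so a small wrist flick becomes a big spin."""
    return tuple(c * multiplier for c in hand_velocity)
```

A quick flick gives a small hand velocity, and the multiplier turns it into a satisfyingly fast launch.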
More info to come!
Thanks for reading!
So I have no idea what happened with last week's post, and unfortunately it didn't save anywhere else for me to repost, so I'm just gonna continue with this one...
With my previous Unity projects, I've shied away from using character animations, purely because it seemed so intimidating to use premade animations on characters I would have to create from scratch...
This time, I knew I would have to learn how to do it.
For some reason Adobe Fuse doesn't play well with Unity 2019, so that was out. BUT, if I use the old (pre-Adobe) Fuse application through Steam, then upload that .obj file to Mixamo's auto-rigger website, I can import the generated .fbx file into Unity with all the materials intact. Perfect!
BUT, the trouble was finding animations that would work with what cutscenes we had in the game's script. After messing about for a little while with all the animations Mixamo offers, I looked into different ways of making custom animations, without having a panic attack over Blender's terrifying UI. (All joking aside though, to be fair, Blender 2.8 is pretty awesome!)
I had an Xbox 360 Kinect sensor sitting in a closet for literally years, and would never have imagined that I would dig it out for any reason, especially after the 360 itself stopped working. But then I found the CinemaMocap Unity Asset.
This asset is incredible. After picking up a USB adapter for the sensor, the asset allows me to record animations (raw files that I can apply to any rigged humanoid character in any software) using really affordable motion capture!
This is amazing because now I can have my voice actors record dialogue AND act at the same time, for super realistic motion--it matches the dialogue perfectly because the actors are essentially in the game! (For lack of a better explanation.)
So, it seems like this is the way we are going to be recording our cutscenes from now on. This super realistic human motion also adds to the immersion in VR--when characters with generic animations are "interacting" with the player in VR, it can be extremely unsettling.
In addition to all this, I'm still messing around with trying to get my Mixamo Fuse characters working with facial blendshapes for lip sync, but that's still up in the air.
More to come!
Thanks for reading!
Yes, I know it's been a while since I posted, but that's because I've been working hard on the new project.
To reiterate: Guns, robots, holograms, stealth kills, parkour, and so much more--all in VR!
Though story details--as well as a proper title--have yet to be unveiled, I can honestly say that the game is coming along quite well.
The trouble with having large worlds in VR is that the more geometry loaded and rendered at a time, the more expensive it is on performance. So, after I build as much as I can in Unity, I'm going to split the scene into multiple smaller scenes that load and unload depending on where the player is at any given time. If I do this correctly, the player, when in VR, won't have any idea this is happening.
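Here's roughly how that load/unload decision could look, sketched in Python (in Unity the actual loading would be done additively through the scene manager; the radii and names here are illustrative). Using a larger unload radius than load radius adds hysteresis, so a player standing right at a boundary doesn't trigger constant loads and unloads:

```python
# Distance-based streaming sketch: each sub-scene has a center point; load
# it when the player is near, unload it once they move well away. The gap
# between the two radii prevents flicker at the boundary. Values are made up.
import math

LOAD_RADIUS = 50.0    # start loading inside this distance
UNLOAD_RADIUS = 70.0  # unload only once the player is this far away

def update_scene(player_pos, scene_center, currently_loaded):
    """Return whether the sub-scene should be loaded after this check."""
    dist = math.dist(player_pos, scene_center)
    if not currently_loaded and dist < LOAD_RADIUS:
        return True   # load the sub-scene
    if currently_loaded and dist > UNLOAD_RADIUS:
        return False  # unload it to free memory
    return currently_loaded  # in the hysteresis band: no change
```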
I use Adobe Fuse and Mixamo for my animated characters. This works really well, except when Unity decides to just not import the textures. I found a fix: export the character model as an .obj with the textures packed into one file, then let Mixamo re-export the rigged character as an .fbx. Then it works fine. The only issue is that there isn't any facial animation to work with for voice acting. But I found a really great, in-world solution to this... which you will find out about when the game is released!
Thanks for reading!
For the past few weeks, I've been working hard on making a game environment that looks and feels like the final game should. I plan on putting the mechanics I finished in the blockout scene (mentioned in a previous post) into this scene and have a pretty cool vertical slice to release for free.
With this demo release, I, of course, will be announcing what the game is and overall just more info about it.
In putting together this environment I'm met with a few challenges:
The first being that in a greybox scene, it's very clear where you can and cannot go--that's the point, it's a basic prototype. But, with a finalized game environment, the design of the level itself has to educate the player on what to be doing, while still not making the player feel restricted.
This is especially difficult in a city map, where I only want a small section to be explorable, without making the player want to see other places that they can't access at that moment.
The other challenge is making the level design work hand in hand with the mechanics. Just because a level looks great, it doesn't mean it fits the way the game plays, and vice versa.
This is especially true in VR, where you can't stop the player from sticking their head into everything, AND in a game where you should be able to run and jump wherever.
With some trial and error (and lots of rebaking lights!), this area seems to be coming together nicely.
Thanks for reading!
So, along with the force grab and push mechanics, the actual objects need to work well too!
The past few days I've been focusing on the combat with enemies to make it as satisfying as possible. This, of course, starts with guns.
So far everything has been working out well: the guns function as similarly to real-world pistols as possible, with working slides, magazines, a safety, and bullets. Each magazine holds ten rounds, and when it's empty, it drops out of the gun. You can then find more in the world and either pop them into your trusty pistol or put them in your pocket for later.
Additionally, after you insert a magazine into the gun, you have to rack the slide before you can fire, just like in real life.
The reason I opted for instantiating actual bullet objects and applying a force to them, rather than just using a raycast (an invisible line that calculates the hit point), is that I don't have to worry about choosing which objects to apply a force to when they're hit. This way, static objects won't move, and non-static objects will! Easy enough!
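The "don't have to choose" point can be sketched like this (illustrative Python; in the real project the physics engine applies this rule automatically for spawned bullet rigidbodies, which is exactly why the approach is convenient):

```python
# Tiny stand-in for what a physics engine does on a bullet hit: static
# geometry ignores the impulse, non-static bodies get impulse / mass as a
# velocity change. Names and values are illustrative.
def apply_bullet_hit(target_is_static, target_mass, bullet_momentum):
    """Return the velocity change the hit object receives."""
    if target_is_static:
        return (0.0, 0.0, 0.0)  # static geometry never moves
    # Non-static object: delta-v = impulse / mass
    return tuple(p / target_mass for p in bullet_momentum)
```

With a raycast, this branching (and the force math) would have to be written by hand for every kind of hittable object.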
For the enemies, each of their body parts is connected with joints, so they react similarly to how a real body would react to a bullet.
Thanks for reading!
Short update today; I unfortunately got tendinitis in my left hand, so development has been a little slow--I have to have my brother do all the testing for now...
But, that aside: After finishing the great force grab mechanic, I've implemented a force push mechanic.
Now, when you're holding an object with the trigger, if you hold the grip button, then release the trigger, the object will forcefully shoot out of your hand. This is useful for distracting enemies quickly, and even for knocking other objects around to damage enemies... or people. Maybe.
Overall, once I get the combat and gun mechanics done, I should be migrating from my greyboxed scene to the actual game scenes with all the finished art. Then I will be announcing more concrete details about the final game.
Thanks for reading!
VR is supposed to be better than the real world! So why do I have to keep bending down to pick up objects I dropped (sometimes almost having the headset fall off my face!)? It's really annoying!
So, using some of the code from the failed grappling hook--just applied to interactable objects instead of the player--I now have a system where, while you're pressing the grip button, you can point at objects to highlight them. Then, if you press the trigger, they fly to your hand as if you were using the Force!
It works really well and is super satisfying to use. I am considering adding a pointer to indicate what you're pointing at to make it more clear, but I don't think that's necessary. It would be good for UI though. But I'm trying to avoid the usual point and click style UI, as it can break immersion in VR.
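For the curious, the object's flight toward the hand can be sketched as a simple per-frame pull (a Python stand-in for what would be a Unity/C# script in the project; the pull speed and snap distance are made-up tuning values):

```python
# Force-grab flight sketch: each frame the object moves toward the hand at
# a fixed speed, snapping into the hand when close enough. Illustrative only.
import math

PULL_SPEED = 8.0     # meters per second toward the hand
SNAP_DISTANCE = 0.1  # close enough to count as "in hand"

def pull_step(obj_pos, hand_pos, dt):
    """Return (new_position, grabbed) for one frame of the pull."""
    dist = math.dist(obj_pos, hand_pos)
    if dist <= max(SNAP_DISTANCE, PULL_SPEED * dt):
        return hand_pos, True  # close enough: snap into the hand
    t = PULL_SPEED * dt / dist
    new_pos = tuple(o + (h - o) * t for o, h in zip(obj_pos, hand_pos))
    return new_pos, False
```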
I am hesitant to announce anything concrete about the project because I am working on all the mechanics in a "greyboxed" scene and I have very few fully realized scenes done already for me to share, but I promise there is more to come!
Thanks for reading!
So far most of the simple object interactions work well in VR, but there's one problem: the player's hands don't collide with the environment.
Because I'm just using SteamVR's controllers as hands in their [Camera Rig] prefab, there isn't much I can do about this. The hands are colliding with objects in the scene, allowing you to knock objects around, but they don't collide with static geometry.
The catch is that if the hands collide with static geometry, there will be a difference in position between the player's hands in real life and in VR. Originally this was thought of as a no-no in VR--you don't separate the player's real body from their VR body. But after extensive testing, many developers are going against this rule, as having hands pass through objects is more immersion-breaking than a little space between the real and virtual hands.
Because VRTK's animated hand models rely on having the ControllerEvents script in a parent object in order to animate, I can't do this out of the box; for the hands to collide with static geometry, they need to be separate objects from the controller alias scripts while still tracking with them.
I ended up modifying the VRTK's animated hand script so I can manually drop the scripts in the inspector. No big deal. But I still needed a good way to have the hand rigidbody move with the controller but still collide. VRTK does have a good RigidbodyFollow script that seems to work exactly as I need with a cube, but not so much with the hand prefabs. I'm gonna continue working on it to get the desired effect.
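I don't know exactly how VRTK's RigidbodyFollow is implemented internally, but the usual velocity-based follow works roughly like this (a Python sketch of just the math): each physics step, set the rigidbody's velocity so it would reach the controller by the next step, and let the physics engine stop the hand when it presses against a wall.

```python
# Velocity-follow sketch: instead of teleporting the hand to the controller
# (which would tunnel through walls), compute the velocity that covers the
# gap in one timestep. The engine can then cancel that motion on collision.
def follow_velocity(hand_pos, controller_pos, dt):
    """Velocity that moves hand_pos onto controller_pos in one step of dt."""
    return tuple((c - h) / dt for h, c in zip(hand_pos, controller_pos))
```

Because the hand moves via velocity rather than direct position writes, collisions with static geometry "just work," which is exactly the behavior the hand prefabs need.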
Additionally, I really want the player to have arms that collide as well, so I've been looking for a good IK solution that will allow me to do this. I found a really cool script online that I can apply to any rigged body, then parent the hands to the controllers, and the player will have a full body. Should be pretty cool. Either that, or I'm going to mess with VRArmIK (I think that's what it's called), a great Unity asset that is essentially plug and play--though it's been a little more difficult because I'm using VRTK. Either way, I'll figure it out.
Thanks for reading!
After messing more with the grappling hook (solving some issues, causing more), I've decided to take a bit of a break from it. From this point I've started working on the weapon system. Melee weapons are going to be next, after I finish the ranged weapons.
The problem with VR weapons is that you can't just hold down left click to shoot and press R to reload. You need physical interactions. Yes, we could just add a gun object to the player's hand and have them shoot it, but I want more than that.
For ammunition, there will be reloadable magazines that you have to find and use to add ammo to the guns, and hopefully, if I can get this right, you'll be able to pull back the slide to cock the gun.
Instead of instantiating prefabs as bullets (this can be taxing on performance), I'm going to be using a raycast.
Because I'm using VRTK for my object interactions, there are multiple ways to handle shooting input.
I've messed with many of them, but I'm trying to find the one that has the best compatibility with existing functions.
I could extend their base InteractableObject class and just override the "OnUse" method, but this hasn't been working for me for some reason. I could also subscribe to the use event, but that proved tedious to add to each usable object and I got fed up with it. For now, I'm just checking whether "TriggerPressed" is true in Update(), but the problem is that this won't allow me to change the use button in the future. I'm gonna keep messing around to find the best way to do this.
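One way to keep the polling approach while still leaving the button remappable is to route every check through a single setting, sketched here in Python (to be clear, this is not VRTK API--just the general shape of the idea, with illustrative names):

```python
# Configurable use-button sketch: read the button name from one setting
# instead of hard-coding a specific "TriggerPressed" check everywhere.
USE_BUTTON = "trigger"  # could later be swapped to "grip", etc.

def is_using(controller_state, button=USE_BUTTON):
    """controller_state: dict mapping button name -> pressed (bool)."""
    return controller_state.get(button, False)
```

Changing the use button then means changing one constant (or one inspector field in Unity), not every usable object.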
Additionally, I've added a page on the site for my brother and composer Eli Mantell where you can listen to the entire Oh Captain VR soundtrack. If you want to use any of it, please contact him via the form on the page!
Thanks for reading!
Yes, once again there's more news about the grappling hook--who knew something so simple would generate so much blog content! Also, in the future, I hope to structure blog posts in mini sections like you see below:
Anyway, after messing with it some more, I struck a good balance between using the grappling hook, and climbing the environment.
Now only certain objects can be hooked onto. These include things you wouldn't be able to reach by just climbing.
The way I limited this is as follows: when holding the grappling hook, the hook part will be either red or green. When pointing at an object that can be grappled to, the hook turns green and the object highlights yellow. This makes it clear what can and cannot be grappled onto, and it works perfectly. Because the hook can no longer be used at any time (meaning the object isn't listening for a controller trigger press every frame), there are fewer bugs with the other inputs--though it still needs to be worked on.
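The red/green feedback boils down to one tiny decision function; here's a Python sketch (in Unity, "grappleable" would most likely be a tag, layer, or component check--the dict here just stands in for that, and all names are illustrative):

```python
# Aim-feedback sketch: given what the hook is pointing at, return the hook
# tint and whether to highlight the target. Illustrative names only.
def aim_feedback(target):
    """target: None, or a dict with a 'grappleable' flag."""
    if target is not None and target.get("grappleable", False):
        return {"hook_color": "green", "highlight_target": True}
    return {"hook_color": "red", "highlight_target": False}
```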
I also changed the jump controls from the grip buttons to the touchpad buttons. Now you can hold one of them to move (walk in place, to put it simply), and immediately push the other touchpad at the same time, and pull back to launch yourself in the air. This makes it much easier to run, jump, land, and continue running immediately after.
All of this, in combination with the enemy AI, works really well to get around the environment, and hide from the enemies' field of view. Depending on the enemy's range, you can run away from a melee attack (sword slashes), or try to dodge projectiles being fired at you.
The primary way of getting around in this game is by walk-in-place movement, but I'm also working on a mechanic that relies on teleportation as well: Being that it's a stealth game--Oh, did I not mention that it's a stealth game yet?--one of the objects you can have in your inventory are Flash Bangs. Kinda like Batman. Or ninjas. Or both. Whatever.
The way they work is as follows: you grab one from your inventory. When you do, time slows down to 0.25 speed and a teleportation cursor appears. You choose where you want to go, warp there in a cloud of smoke, and time goes back to normal speed. This is essentially a special ability that will allow you to get to places you otherwise wouldn't be able to reach.
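As a sketch, the whole flow is a two-step state change (a Python stand-in; in Unity the slow-down would be done by scaling game time, and all names and the exact values here are illustrative):

```python
# Flash-bang teleport sketch: grabbing slows time and shows the cursor;
# confirming warps the player and restores normal speed.
SLOW_SCALE = 0.25
NORMAL_SCALE = 1.0

def grab_flash_bang(state):
    """Player grabbed a flash bang: slow time, show the teleport cursor."""
    state["time_scale"] = SLOW_SCALE
    state["cursor_visible"] = True

def confirm_teleport(state, destination):
    """Player picked a spot: warp there (in a cloud of smoke) and resume."""
    state["player_pos"] = destination
    state["time_scale"] = NORMAL_SCALE
    state["cursor_visible"] = False
```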
. . .
Once all of these are fully prototyped and work the way I want them to, I will be moving onto the actual level design, moving all the mechanics into what will be the actual game.
P.S: The image in the last post is a screenshot from one of the test scenes I made. It's cool, but the final game will look even better. :D
Thanks for reading!
Okay, so, first of all, yes, I changed the site look. I got tired of the old one and found one that I like better.
But, on to the more important stuff:
I am almost finished with the enemy AI! I've been working on it for a while and am super proud of what I have at this point:
There are three types of enemies: melee, ranged, and "flier."
The melee and ranged enemies are self-explanatory (one fights up close with hand weapons, while the other shoots from a distance), but the "flier" is something different (excuse the spelling--I wasn't really sure what to call it internally!).
It moves around the waypoints in the same way the other ones do, but at set intervals, it will fly up to the highest point, survey the area, then fly down. In the game, the player can run and jump on objects of different heights, so the enemies will react to that and try to fly up and shoot the player down. In VR it is not only quite nerve-racking to suddenly see the flying enemy coming up beside you as you're climbing, but it's a ton of fun to shoot them down from the sky as well.
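The flier's survey schedule can be sketched as a simple repeating timer (illustrative Python only; the real AI also handles waypoints, line of sight, and shooting, and the durations here are made up):

```python
# Flier survey-cycle sketch: patrol for SURVEY_INTERVAL seconds, then spend
# SURVEY_DURATION seconds flying up, surveying, and descending. Repeat.
SURVEY_INTERVAL = 20.0  # seconds of normal waypoint patrol
SURVEY_DURATION = 5.0   # seconds spent on the fly-up/survey/descend cycle

def flier_state(elapsed):
    """Return 'patrol' or 'survey' based on time since spawn."""
    cycle = SURVEY_INTERVAL + SURVEY_DURATION
    return "survey" if elapsed % cycle >= SURVEY_INTERVAL else "patrol"
```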
I also reworked the interaction system a bit. Now, objects that can be "used" stay in your hand until you find an acceptable place to put them, freeing the trigger button to toggle actions, such as shooting a gun.
Along with this, I've reworked the grappling hook to play better with VRTK (Virtual Reality ToolKit), and now it works so much better!
Instead of allowing the player to grapple onto anything, I've decided that it can only be used on certain objects you couldn't just jump to, like cranes above you, etc.
One of the next things I will be working on is a collision sound framework.
In VR, it's extremely important to make the world as immersive as possible, and this includes sound.
Music (more about this later this week!) and ambience are easy to implement, but with so many objects moving around the scene, most of which are controlled by the player's hands, it's hard to have a single AudioSource controlling all of them.
So, I've come up with a way to mark different objects as being "made of" different materials (such as wood, metal, stone), and depending on what type of object is colliding with it (while taking into account how hard the object hits the other object), a certain sound will play. While it's all still conceptual right now, I have a feeling that when I write it up tomorrow it should work properly.
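Since it's all still conceptual, here's the shape I have in mind, sketched in Python (material names, clip names, and thresholds are all placeholders; in Unity this would pick a clip and volume inside a collision callback):

```python
# Material-pair sound sketch: each object is tagged with a material, a table
# maps unordered material pairs to a clip name, and impact speed scales the
# volume. Every entry here is an illustrative placeholder.
SOUNDS = {
    frozenset(["wood", "metal"]): "wood_metal_clank",
    frozenset(["metal"]): "metal_clang",          # metal-on-metal
    frozenset(["wood", "stone"]): "wood_stone_knock",
}
MAX_IMPACT_SPEED = 10.0  # speed at which volume reaches full (1.0)

def collision_sound(mat_a, mat_b, impact_speed):
    """Return (clip_name, volume) for this collision, or None for no sound."""
    clip = SOUNDS.get(frozenset([mat_a, mat_b]))
    if clip is None:
        return None  # no sound defined for this material pair
    volume = min(impact_speed / MAX_IMPACT_SPEED, 1.0)
    return clip, volume
```

Using an unordered pair as the key means wood-hits-metal and metal-hits-wood resolve to the same sound, which halves the size of the table.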
More info to come about this awesome game! :D
Thanks for reading,
So, I originally wanted to implement a feature from Oh Captain VR into this game: the grappling hook.
I thought it would add another stealth element to the game, but I ran into two problems:
1) It made playtesters only want to use the grappling hook to get around, not the locomotion system I talked about yesterday, which made the game too easy and not very fun.
2) While it did work, having a usable physics object in the player's hand made jumping wonky and inconsistent, which we can't have, obviously.
I'm considering just removing the grappling hook altogether because of these issues, and instead making the focus on stealth combat.
Thanks for reading,
As I've mentioned previously, locomotion in VR is always a challenge. For games with a smaller play area, you might not need any form of movement; games with larger worlds, however, have to use either teleportation or artificial locomotion (where the player's head camera is moved directly--similar to how FPS cameras work, just in VR). Teleportation can sometimes break immersion, and artificial locomotion (AL) can cause motion sickness, so what do we do?
Well, after much trial and error with different types of movement systems, we've decided on "walk in place" locomotion. While this might sound ridiculous (and the implementation in many games has been quite silly *ahem* LA Noire VR *ahem*), with some tweaking it can prove to be really cool. It's highly sensitive, so you don't need to swing your arms wildly to go anywhere; you can sorta just sway in place to move. Because your body is actually moving in the real world, there is less vestibular disconnect when the camera is moving. (That disconnect is what causes motion sickness.) As someone who is usually extremely sensitive to VR motion sickness, I must say that it works really well.
The other issue is jumping: I've noticed that very few VR games have a jumping mechanic. Doom VFR has a really good implementation, where if you move your hands up by your head quickly, you'll jump. This works well and feels great to do, but it's easy to confuse normal movements for jumping movements, and that gets annoying fast. What I've done in this game is that if you hold the grip buttons on both controllers at once, then essentially pull yourself into the air, you'll jump--almost as if you are propelling yourself with an invisible slingshot. The height you jump is directly proportional to how far you pull, so it's easy to get the hang of and is very accurate when jumping from building to building (*spoilers!*).
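The proportional jump boils down to one clamped multiply; here's a Python sketch (the factor and the cap are invented tuning values, not the game's actual numbers, and in the project this would live in the Unity/C# locomotion script):

```python
# Slingshot-jump sketch: while both grips are held, launch speed is
# proportional to how far the hands were pulled, clamped to a maximum.
JUMP_FACTOR = 6.0     # converts pull distance (m) into launch speed (m/s)
MAX_JUMP_SPEED = 6.0  # cap so a huge yank can't launch you into orbit

def jump_speed(pull_distance, both_grips_held):
    """Upward launch speed for this jump attempt."""
    if not both_grips_held or pull_distance <= 0.0:
        return 0.0  # no jump without both grips and an actual pull
    return min(pull_distance * JUMP_FACTOR, MAX_JUMP_SPEED)
```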
In terms of objects: unlike in Oh Captain VR, where objects just became a child of the controllers, allowing them to clip through other objects, the objects in this game track with the controller movement, meaning they move with physics and collide with each other and with the scene. This means you can duck behind cover, holding a gun against walls/corners/counters, and shoot blindly at enemies--it's a ton of fun.
I'm super excited to share more info with everyone as the development progresses!
Thanks for reading,
So, after the unfortunate debacle that was the demise of Oh Captain VR, I started working on my next project. Though I won't reveal what it is exactly just yet, it is something that I've wanted to do since I first started working with VR and I am super excited about it.
I do have a level mockup that I will show sometime, but I've been focusing on finalizing the core gameplay mechanics of the game in a "greyboxed" level.
It has a heavy focus on parkour in VR, with running and jumping across rooftops, along with shootouts, stealth elements...flash bangs, telekinesis, steampunk robots, and more... As someone with a surprisingly weak stomach for VR motion sickness, the way locomotion is handled is important to get right. I'll be explaining exactly how it works in the next post.
It's gonna be super cool, I promise...
PS: Yes, I know there's the annoying "Weebly" branding on the site. As of a few days ago, they changed something that breaks the custom code I added to remove it. I'm working on a fix currently.
Thanks for reading!
Today I ran into the biggest problem I have ever encountered while developing a game...
...The entire Unity project of Oh Captain VR is corrupted. I've tried everything and nothing seemed to fix the problem. Any previous backups of the project, locally or in the cloud, are either broken, or too outdated to be usable from this point on...
At this point, it seems like Oh Captain VR will have to be delayed indefinitely.
I've put a whole year of work into this project, and though I still am having a difficult time believing that all my work is gone, and as disappointed as I am, I am happy to say that over this past year I have learned a lot to become a better artist and programmer, and am super excited to move on to bigger and better things.
In addition to now finally using Unity 2019 after being stuck with 2017 for Oh Captain due to VRTK APIs, I have a few really cool ideas I can't wait to share. More info will be coming soon about my new project, and I promise it will be worth the wait.
Thank you for reading,
I just have no words.
The past few months have been ridiculously difficult, all due to Apple.
Yes, I have been using a PC for my VR development, but I've been using Macs since I was 7. Aside from accidental damage, I've never had any issue--until now.
As some of you may know, I got a grant to develop The Jeremiah VR Experience prototype. Part of this grant allowed me to purchase a MacBook Pro 13 with Touch Bar. It had 16GB of RAM and 256GB of storage.
This machine became my primary non-home/non-VR computer: I took it to school and used it for all my Photoshop and Illustrator work, plus some light video editing, 3D modeling, and programming. (I also used it to remotely access my VR rig when I wasn't at home.)
Then came the keyboard issue.
It was (and still is) mentioned all around the internet, but I never thought much of it--until it hit my machine.
Oh well. I took it to my local Genius Bar and had it fixed at no cost. Yay. After the repair was delayed twice, I finally got it back.
A week later, the Touch Bar was acting weird, and then it stopped working completely. Just my luck.
I took it in again, waited a week, picked it up, and brought it home--again at no cost. The second I got it home and turned it on, I noticed two things: a loud clicking sound whenever I opened or closed the lid, and a white splotch on the screen. Crap.
Okay, fine. I took it in again, waited another week, and finally got it back. The Genius handed me the laptop, and I immediately noticed that the lid was misaligned--yes, shifted to the right. That made three times I'd been left with a defective product that was genuinely their fault, and their policy is that after three botched repairs they replace the laptop. Okay, fine, if that's the best I can do, so be it. They told me I could either wait two weeks for a refurbished laptop of the same model number, or take what they had in stock at the store. After all that, I refused to risk accepting a refurbished computer, so I went with what they had in stock. The only comparable machine was a 13-inch MacBook Pro with 512GB of storage and 8GB of RAM--not the same as what I had originally. Tired of waiting, I gave up and took it. It's still acceptable for the Adobe suite, though the extra 8GB of RAM I had before would have been helpful... oh well. On the bright side, I now have enough storage to dump my entire movie library on it!
Please, Apple, don't let this happen again. People trust your (mostly) superior build quality, your great software, and, more importantly, your customer support, long considered some of the best in the business. But if incidents like these keep happening, that trust is hard to maintain.
As I've mentioned before, I am using the Dell Visor WMR headset, running on Microsoft's "Mixed Reality" software but developing for SteamVR. This has worked great and hasn't caused any issues before. (Except when Steam tells me my boundaries aren't set when they are--they're just set through the WMR home.)
Because of all of the issues plaguing the Windows October Update, my computer didn't get it until yesterday. The update includes a few new cool features for the WMR platform, and I figured I'd break them down and discuss what they mean for SteamVR Developers!
You can see the full list of updates at the link at the bottom of this post. With that said, let's see what's included!
First we have one of the coolest features, the "Mixed Reality Flashlight." This allows users to "Open a portal into the real world to find your keyboard, see someone nearby, or take a look at your surroundings without removing your headset!" It's super useful if you need to quickly change something on your computer screen... though for that, it makes more sense to just flip up the headset. For spotting people walking into the room, however, it's perfect. The question is: can we turn it on while in a SteamVR app?
...Well, yes you can, thanks to two other new features. First, you can launch SteamVR apps directly from the WMR home! While this isn't a huge benefit to developers, it's extremely useful to players, because launching SteamVR just to play a game is a little annoying. Second, the Windows button on the controllers now changes function depending on what you're doing! It can open the SteamVR menu, turn on the flashlight, or take you back to the home!
On the more technical side, developers can now use QR codes in their apps to scan outside objects. Additionally, there is now hardware DRM support for WMR immersive apps, and devs can put the Mixed Reality Capture UI directly in their apps, so users don't have to start it from the home.
As cool as these features are, the problem is that they only work for WMR apps and won't have any effect in normal SteamVR apps. In any case, we'll have to see what cool stuff people come up with!
You can see the full changelog here:
Thanks for reading!
You may have noticed that the past two posts used a color block behind the text instead of the usual all-black background with white text. Though I much preferred the white text on black, when the RSS feed gets republished elsewhere, the white text color carries over, making the text invisible on the white backgrounds that the majority of websites use...
I didn't realize this until yesterday, so from now on this will be the way it is. No big deal, just thought I'd mention it.
EDIT: Now, for some reason, the RSS feed is picking up the HTML code of the color block, so that's out. I can't change the site's theme to a white background either, because then older posts would be invisible unless I went back and changed the text color in every single one! So I guess I'll be sticking with this grey text on black. It translates well to any other background color, so that's good.
Additionally, my overall workflow has been set back for a bit. My setup usually consists of a gaming PC for VR and a MacBook Pro for the rest of the game's art creation. A few weeks ago, my Mac suffered from the same damn keyboard issue that has plagued many other machines. After being delayed twice by Apple, I finally got it repaired... BUT, since then the Touch Bar had been acting strange, until a few days ago, when it became completely unresponsive. So, once again, I am eagerly waiting for my laptop to come back from the Apple Repair Center. Very frustrating.
In any case, despite these issues, I have been working as hard as I can, implementing new features and fixing old bugs every day. Be on the lookout for a new post detailing exactly what I've been working on, and maybe even a rundown of my entire game dev setup!
Thanks for reading!
First of all, we now have an email newsletter! Be sure to subscribe via any email box on the site!
But okay, back to the main content...
So, in contemplating where to go next with Oh Captain VR--publishers, crowdfunding, etc.--I realized something:
In focusing on the bigger picture of the whole game, I've been overlooking my main priority at the moment: the demo.
I've been trying to work on the save system, the open world, modding, and other features that won't be in the demo, when I should be making the demo the best it can be. No, this post isn't some grand announcement about a change I'm making; it's merely an insight into how I work.
Now, technically speaking, what does this entail? I know there won't be a save system in the demo; I don't think that's necessary for something meant to be only a taste of the full game. All the mechanics, however, will be the same. That way, when someone plays the demo, they know the full game will be like this, just so much bigger and better in every way.
Another thing that's been bugging me is profiling--profiling in Unity3D, I mean. Playing other VR games, I've realized that mine doesn't run quite as smoothly. Everything works fine and is comfortable to play for long periods, but my VR tracking seems to have a little more latency than other games'. This is just an optimization issue I have to work on. Like any game, it's a balance between performance and visuals. While Oh Captain VR does have a low-poly aesthetic, there's so much stuff in the scene that there's definitely room for better optimization on my part. Since I only have one main realtime light (the sun/moon), lighting shouldn't be the problem, but I will continue to work on it.
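To give a concrete (if simplified) idea of what that profiling work looks like, here's a rough sketch using Unity's built-in profiler markers. This is not actual Oh Captain VR code--the component name and sample label are just placeholders--but it shows the general approach: wrap a suspect routine in a named sample so it shows up in the Profiler window, and log any frame that blows the VR frame budget.

```csharp
using UnityEngine;
using UnityEngine.Profiling;

// Hypothetical example component -- names are placeholders.
// Wrapping a routine in BeginSample/EndSample makes it appear by name
// in Unity's Profiler window, so you can see exactly where frame time goes.
public class FrameBudgetWatcher : MonoBehaviour
{
    // A 90Hz headset leaves roughly 11.1 ms per frame.
    const float BudgetMs = 11.1f;

    void Update()
    {
        Profiler.BeginSample("ExpensiveGameplayStep"); // custom marker name
        // ...the code being investigated would run here...
        Profiler.EndSample();

        // Flag frames that exceed the VR frame budget.
        float ms = Time.unscaledDeltaTime * 1000f;
        if (ms > BudgetMs)
            Debug.LogWarning("Slow frame: " + ms.ToString("F1") + " ms");
    }
}
```

Dropping a watcher like this on a scene object, then moving the marker around, is a quick way to narrow down which system is actually causing the latency before diving into the full Profiler capture.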
Any thoughts? Leave a comment down below!
Thanks for reading!