It has recently come to my attention that Graymount doesn't work on Valve Index.
Plain and simple. Because I don't have the Index hardware, there is no way for me to try to fix this problem. Additionally, I am using a version of SteamVR 1.x in Unity, with no way to upgrade to 2.x without breaking the entire project, so even then I couldn't generate the proper actions to work with the Index's input. I specifically omitted Index support from the Steam page, but of course this doesn't prevent Index users from purchasing/downloading the game. It runs well on Oculus, seeing as that's what it was developed on, so for the time being we will be doing two things:

1) Due to these problems, as well as a few others with our version of Unity and overall project direction, MidnightCoffee, Inc. has made the decision to distance itself (at least 6 feet!) from Graymount for a little while so work can begin on a newer project, built from the ground up with all SteamVR headsets in mind, using 2.x and the new Action input system.

2) When (when!) we decide to return to the world of Graymount, we will focus on the Oculus platform, for publication on that store, and possibly publish to Steam later if it's feasible.

We hope you all understand this decision, and thank you for your support along the way. I'm confident that our new project will blow Graymount out of the water in every way, while innovating on what VR can do as a whole.

Thank you, Judah

This post is accompanied by a YouTube video below.

So yes, I did get the demo published by the May 15th deadline, and was allowed to push updates to it after that. When I did upload the original demo, there was one thing missing that I wanted to showcase, something featured heavily in the full game: Telekinesis!

Previously the "force" mechanic was there so you could grab objects from afar--more of a quality-of-life feature than anything. But, in a game where you play the part of a magician, being able to use magic should be a bigger deal, right?!
So that's exactly what I added: a new section at the beginning of the demo level that teaches you how to use one of the abilities featured in the full game. In the past, when I was messing with the force-like powers in Graymount, they felt more robotic than magical, which wasn't what we wanted. So, to put it super simply (watch the video for more dev details!), I attached a physics joint to your "magic cursor," making levitated objects feel more bouncy, magical, and fully driven by physics--something that we all love in VR.

I'm super excited for everyone to play the demo, starting June 9th for the Steam Summer Games Festival, and to hear all your awesome feedback! Wishlist now to be notified when the full game comes out! https://store.steampowered.com/app/955090/Welcome_to_Graymount/

Thanks for reading! - Judah Mantell

Okay, so. I have about 5 days until the deadline (May 15) to submit my demo for the Steam Summer Games Festival.
This...is fine...? Basically, I realized that if the player throws an essential item (a battery needed to complete the level) into the water, the only way to get it back is going back to the main menu and restarting the level. So I gotta write a quick script to make the item reset its position when it falls into the water--not a huge deal... but when Unity crashes on you on startup, things seem a little scarier... Luckily it was a one-time thing and everything was good. Now all I gotta do is push the build through Steampipe and then I'll be set!

Once the demo is released, until the Steam Festival, I'll be working on a new project for a different festival (also on Steam) and I'm super excited to show everyone.

Thanks for reading! - Judah

Hey! While yes, the demo isn't publicly playable until June 9th, I got some valuable feedback on Reddit based on the gameplay video (below) and am now working it into the demo and uploading the build to Steam. Basically, as you can see in the video, if I miss a jump to a ledge or platform, there's no way to get back up unless you die/respawn. So I added some climbable ledges you can quickly grab to pull yourself back up.
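Going back to that water-reset script for a second--a minimal version might just be a trigger volume over the water. This is a sketch, not the project's actual code; the "Essential" tag and the `resetPoint` field are placeholder names of mine:

```csharp
using UnityEngine;

// Trigger volume placed over the water: when an essential item falls in,
// kill its momentum and teleport it back to a safe spot on dry land.
public class WaterResetZone : MonoBehaviour
{
    public Transform resetPoint; // somewhere safe to respawn the item

    void OnTriggerEnter(Collider other)
    {
        // Only reset items marked as essential (tag name is made up here).
        if (!other.CompareTag("Essential")) return;

        Rigidbody rb = other.attachedRigidbody;
        if (rb == null) return;

        rb.velocity = Vector3.zero;        // cancel the throw
        rb.angularVelocity = Vector3.zero;
        rb.position = resetPoint.position; // back to safety
    }
}
```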
Thanks for reading! - Judah Well.
It's been quite a while since I last posted about what I've been working on. My goal lately has been to have a playable demo showcasing the main mechanics of Welcome to Graymount (the official title)...and after working like crazy on it, that's exactly what I have! It'll be playable for the first time at the Steam Summer Games Festival and I can't wait for all the feedback. Yes, I'll admit parts of it could be a little more polished (specifically the menu and control screen), but that's what I will be focusing on as I work on the full game. The Steam page is now live, so I highly suggest wishlisting it to be notified on release!

While short, the demo lets the player use some of the main mechanics featured throughout the game (wall running, jumping, gunplay, etc.) all in one contained area, while providing bits of information about the story/city of Graymount.

Thanks for reading! Judah

Wow, it's been a while since I posted...
I've been extremely busy, both with this game specifically and with game design as a whole--I teach programming and video game design to grades 3-12 every day, and then get to work on my game when I get home in the evening, so needless to say, progress has been a bit slower than I would like. At the moment, I have most of the mechanics in place (listed below with explanations, just for the fun of it!), so most of my time lately has gone into putting the narrative together. So far so good, but I need voice actors for some key parts, which isn't easy to secure.

As I've mentioned in previous posts, while I'm not trying to recreate Blade and Sorcery or Boneworks' gameplay, I do want my combat to have satisfying physics elements. So, my player character is not physics based but can interact with other physics objects--doors, heavy objects that can be pushed, enemies that can be shoved around, etc. The main mechanics I've been working on perfecting are:
So yeah, that's what I've been working on lately. I hope to have some sort of video coming soon showcasing everything that will be coming with the new game, as well as an official announcement. Thanks for reading! - Judah Lately I've been obsessed with adding procedural generation (PG) to the game. While I know that, when shoehorned into a game, PG can have disastrous effects, I still think that it would be a fun challenge that would add a cool amount of replayability to the game.
What I'm working on now is heavily narrative based, but many of the mechanics are similar to a light survival stealth game. So, I'm thinking that if, aside from the scripted intro/tutorial sequences, the city the game takes place in is randomly generated each playthrough, it would make the game less linear. My only worry is that randomly generated levels would lack the polish of hand-laid-out ones.

The way I am trying to overcome this is by creating large 45x45 (Unity meters) tiles with multiple buildings and interiors, all hand designed. Then, I would instantiate them along a 45x45 grid, with random rotations in 90 degree increments. The city would be different every time, but each city block would look really good and wouldn't have that empty, computer-generated feel. This would also mean that, if I align everything correctly, I can essentially have an endless city with nice streets and even underground areas. BUT, because it's a VR game, performance is extremely important, so I'm probably only gonna have a 5x5 tile grid. Because each tile is 45x45, that's a lot larger than it sounds.

The other issue is lighting. Because I want the city generated at runtime, I can't rely on baked lighting to increase the framerate. Each tile would need one point light in the center to illuminate the area, and that's not including the directional (sun/moon) light over the entire map.

In any case, for now I'm working on perfecting the intro sequence and making sure the script and mechanics are solid; then I can move onto the rest of the game.

Thanks for reading! - Judah

Unfortunately I haven't been able to work on the game for a while now, and I will have even less time to work on it, because I started teaching game design!
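As a quick aside before the teaching news takes over: the tile-grid idea above could be sketched roughly like this. The prefab array, grid size, and the assumption that each tile's pivot is at its center (so 90-degree rotations keep the street edges aligned) are mine, not the project's actual code:

```csharp
using UnityEngine;

// Places hand-designed 45x45m city-block prefabs on an NxN grid,
// each rotated by a random multiple of 90 degrees.
public class CityGenerator : MonoBehaviour
{
    public GameObject[] tilePrefabs; // hand-made block prefabs, pivots centered
    public int gridSize = 5;
    public float tileSize = 45f;

    void Start()
    {
        for (int x = 0; x < gridSize; x++)
        for (int z = 0; z < gridSize; z++)
        {
            GameObject prefab = tilePrefabs[Random.Range(0, tilePrefabs.Length)];
            Vector3 pos = new Vector3(x * tileSize, 0f, z * tileSize);
            // Random 90-degree increment keeps square tiles on the grid.
            Quaternion rot = Quaternion.Euler(0f, 90f * Random.Range(0, 4), 0f);
            Instantiate(prefab, pos, rot, transform);
        }
    }
}
```

The rotation trick only produces seamless streets if every tile's roads meet its edges at the same offsets, so the tiles interlock no matter which way they face.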
Every day I go back and forth between a high school and an elementary school to teach coding within the context of game design. For the high schoolers it's a little easier because, well, they're high schoolers. For them I'm teaching GDevelop (a great no-code 2D game engine) and CopperCube (a great no-code 3D game engine), with Piskel and Tinkercad for asset creation. The point of that class is to explore game development as an art form, with less focus on the code aspect.

To the middle schoolers, I'm teaching Tinkercad for 3D modeling (which can also be applied to 3D printing), and then importing those models into CoSpaces. CoSpaces is surprisingly underused in education, and it's essentially a step up from Scratch. It uses block-based code, but in a 3D environment that, when paired with a smartphone, can be viewed in VR and AR--my two favorite bits of tech. What's great about CoSpaces is that for those students with a little bit of JavaScript knowledge, there is a really powerful API that they can use.

At the high school, each student has their own laptop--all Macs, incidentally. At the middle school, however, the students have Chromebooks. This forced me to use only web-based software (which CoSpaces and Tinkercad are). The only problem is that...

1) The wheel-in large monitor isn't available yet, so it's hard for the students to see what I'm explaining.

2) Oh, yeah, they don't even have the Chromebooks yet...

So for this week, it's impossible for me to teach anything without computers! I gave a 15-minute introduction to what they will be learning, but aside from that, there isn't much else I can do. Luckily today is my last day of the week (they don't have code on Fridays), and they should be getting the computers Monday. On Fridays, however, there are electives they can choose from. This is super cool, except in my coding elective they would most likely be doing the same things they do during the rest of the week.
So, I'm trying to replace the coding elective with magic instead. It's an extremely unique part of any school (no other school has it), and it teaches kids to perform/speak in front of crowds and build confidence in them. Hopefully that will work out... Thanks for reading! - Judah Building off of what I talked about two posts ago, I just wanted to give a little update on the VR Sword Combat.
I mentioned last time that my system, when paired with something like Final IK, can be used on humanoid characters as well. This is absolutely true, but I found a better, simpler way of doing it. It's a simple script you apply to the hand bone and, after setting some parameters, the arm will behave like it should. Additionally, it still respects animations with no changes. I originally thought I'd have to use an Avatar Mask to mask off the arms that are to be moved with IK (not animations), but luckily I don't even have to do that! I am still working on making it the best it can be before release on the Asset Store (and I still have to resolve some dependencies on VRTK 3.x).

Thanks for reading! - Judah

As promised, here's another blog post about the music of the upcoming VR game!
This will be short and sweet, but it is important to note: each piece of music in the game is expertly crafted by my brother Eli Mantell to match the tone of every scene. Every cutscene, trailer, interaction, and combat sequence has its own unique backing track that, hopefully, matches the mood. With the upcoming announcement/teaser trailer, I hope you will all get a sense of the great audio work to come...

Thanks for reading! - Judah

So after taking a little time off to work on other important matters (see future post!), I finally got back to working on the new game, only to see that a new version of Unity is out--Cool! But, long story short, when I tried updating the project, it broke many things that were important to my workflow. Oh well. Luckily, after the unfortunate Oh Captain incident, I had version control set up via the GitHub Desktop app, and it has served me well in the past, so I wasn't too worried. BUT, for some reason it wouldn't let me revert to an earlier commit! Granted, at this point I still wasn't fully aware of the way the Git system works.
Now, everything is working fine except one thing: Unity's tool for recording cutscenes and exporting them as MP4 files--something I am using heavily in my quest to make the perfect teaser trailer. It's throwing a bunch of errors and only recording the first frame of the cutscene. I updated it to the newest version and haven't tested it yet. Hopefully it will work now.
On Friday I have another post about the really cool music in the game. Stay tuned for that! Thanks for reading! - Judah So, without any sort of consistency to this blog, here's another post!
Lately I've been obsessed with getting sword combat in VR just right. BUT, when I say "just right," I don't mean the most realistic. GORN's combat is great, but it isn't realistic in any way. Blade and Sorcery's combat is also great, but it's too realistic for what I'm going for, so I'm aiming right between those two.

Any humans in my game use guns to fight, but can still be killed by a sword or a stealth kill. What will have melee weapons, however, are the robots. They are sort of like the lightsaber-wielding robots in Vader Immortal (which I highly recommend!), but fit more into the world I am creating. With some help from some inspiring articles on VR combat, I think I've found a good middle ground. Essentially, enemy swords detect the player's sword within a specified range and position themselves in a way that effectively blocks the player from hitting/stabbing the enemy. Between blocks, the enemy tries to swipe their sword across the player to deal damage, and it's up to the player to block back.

None of this would work without a good sword physics collision system, and with a bit of work, I've got a pretty good way of doing that too. I won't get into it in too much detail for reasons I will get into,*** but the gist of it is that when the player picks up the hilt of the sword, the blade follows it with a configurable joint that is set up to collide with objects while still not moving away from the player's hand unless it really has to. Right now the swords just kinda hover in front of the placeholder enemy bodies, but when I'm finished, the swords will be hooked up to an inverse kinematics arm system attached to the body.
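For the curious, the hilt-and-joint setup described above could look something like this. It's a sketch, not the project's actual code, and the drive values are placeholders you'd have to tune:

```csharp
using UnityEngine;

// Attaches the physical sword to a kinematic "hand target" that tracks the
// controller. Strong drives pull the blade toward the hand, but because the
// force is capped, walls and enemy blades can still deflect it.
public static class SwordJointSetup
{
    public static ConfigurableJoint Attach(Rigidbody sword, Rigidbody handTarget)
    {
        ConfigurableJoint joint = sword.gameObject.AddComponent<ConfigurableJoint>();
        joint.connectedBody = handTarget;

        JointDrive drive = new JointDrive
        {
            positionSpring = 5000f, // higher = stiffer, more "glued" to the hand
            positionDamper = 50f,   // damping kills jitter
            maximumForce = 500f     // cap so collisions can push the blade away
        };
        joint.xDrive = joint.yDrive = joint.zDrive = drive;
        joint.rotationDriveMode = RotationDriveMode.Slerp;
        joint.slerpDrive = drive;
        return joint;
    }
}
```

The interesting dial here is `maximumForce`: too high and the blade clips through everything like a parented object; too low and it lags behind the hand even in open air.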
Because these are robots we're talking about, I can have the arms look like whatever I want, coming out of whatever I want, so this isn't a problem. But what's so cool about my system is that when paired with something like Final IK from the Asset Store, it can easily be applied to human enemies as well.

*** Now, the reason I don't want to get into it too much right now is because I'm considering taking my entire VR combat system--including the robot IK arm, the combat, the player sword, and a few other bonuses (like my collision sound system for VR and more)--and putting it up on the Asset Store for anyone to use, which is pretty exciting! :D The only problem is that right now everything is deeply connected to VRTK 3.x. While this works great for me, it might not be so great for anyone else using a different VR interaction system. So if I want to publish it anywhere, I have to make it VR-framework-agnostic, which is a little more work. In any case, we will see!

One more thing: I'm considering turning all previous blog posts that show major features into a few video devlogs with clips of what they look like in game. So be on the lookout for that!

Thanks for reading! - Judah

Yes, I'm back!
Development has halted for a little bit due to technical difficulties with my WMR headset, so after much tinkering, I've decided it was best to move on with newer tech. After much research, I settled on the new Rift S headset, and I gotta say... this thing is awesome. The lack of external sensors is always great, I've had no issues with the tracking so far, and, despite what everyone else says, the audio is pretty good too!

The problem now is that I have to make a choice: do I stick with targeting SteamVR, or move on to using the Oculus SDK? I tried using SteamVR, but this time in Oculus mode, and it caused odd delays when picking up and dropping objects, which basically made throwing impossible. I found no mention of this issue online... except for one random comment on one random YouTube video. Thanks, Google! Since I've been using the VRTK Unity package for my VR development, I can actually target BOTH SteamVR and Oculus. Down the line, I'm gonna have to get a tester to mess with a demo on Vive or WMR, but for now, the Oculus SDK + VRTK is working pretty great. I'm having trouble getting the hand animations working, but everything else works fine. I have no doubt that I'll be able to fix it soon enough.

In the game there are certain objects that you can throw like boomerangs. Kinda like ninja stars--but not ninja stars. Originally I had the player hold one, then press a button to launch it spinning, but after the switch to Oculus, and because the input mappings are a bit different, I found a really simple solution: add a multiplier to the force of your throw. This means that all the player has to do is flick their wrist and the object will spin out of their hand in a really satisfying way. More info to come!

Thanks for reading! - Judah

So I have no idea what happened with last week's post, and unfortunately it didn't save anywhere else for me to repost, so I'm just gonna continue with this one...
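One last note on that throw multiplier before moving on. Stripped of the VRTK plumbing (which, if I remember right, exposes a similar multiplier on its grab script), the core idea is just scaling the tracked controller velocities at the moment of release. The numbers here are guesses to tune, not values from the project:

```csharp
using UnityEngine;

// On release, hand the object a scaled-up copy of the controller's
// velocities, so a small wrist flick becomes a big spinning throw.
public class ThrowBooster
{
    public float multiplier = 2.5f; // tune until a flick feels right

    public void Release(Rigidbody rb, Vector3 controllerVelocity,
                        Vector3 controllerAngularVelocity)
    {
        rb.velocity = controllerVelocity * multiplier;
        // Boosting angular velocity is what sells the boomerang spin.
        rb.angularVelocity = controllerAngularVelocity * multiplier;
    }
}
```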
With my previous Unity projects, I've shied away from using character animations, purely because it seemed so intimidating to use premade animations on characters I would have to create from scratch... This time, I knew I would have to learn how to do it. For some reason Adobe Fuse doesn't play well with Unity 2019, so that was out. BUT, if I use the old (pre-Adobe) Fuse application through Steam, then upload that .obj file to Mixamo's auto-rigger website, I can perfectly import the generated .fbx file into Unity, with all the materials intact. Perfect!

BUT, the trouble was finding animations that would work with the cutscenes in the game's script. After messing about for a little while with all the animations Mixamo offers, I looked into different ways of making custom animations, without having a panic attack over Blender's terrifying UI. (All joking aside, to be fair, Blender 2.8 is pretty awesome!)

I had an Xbox 360 Kinect sensor sitting in a closet for literally years, and would never have imagined that I would dig it out for any reason, especially after the 360 itself stopped working. But then I found the Cinema Mocap Unity asset. This asset is incredible. After picking up a USB adapter for the sensor, it allows me to record animations (raw files that I can apply to any rigged humanoid character in any software) using really affordable motion capture! This is amazing because now I can have my voice actors record dialogue AND act at the same time, for super realistic motion--it matches the dialogue perfectly because the actors are essentially in the game! (For lack of a better explanation.) So, it seems like this is the way we are going to be recording our cutscenes from now on. This super realistic human motion also adds to the immersion in VR--when characters with generic animations are "interacting" with the player in VR, it can be extremely unsettling.
In addition to all this, I'm still messing around with trying to get my Mixamo Fuse characters working with facial blendshapes for lip sync, but that's still up in the air. More to come! Thanks for reading! - Judah Yes, I know it's been a while since I posted, but that's because I've been working hard on the new project.
To reiterate: guns, robots, holograms, stealth kills, parkour, and so much more--all in VR! Though story details--as well as a proper title--have yet to be unveiled, I can honestly say that the game is coming along quite well.

The trouble with having large worlds in VR is that the more geometry being rendered at a time, the more expensive it is on performance. So, after I build as much as I can in Unity, I'm going to split the scene into multiple smaller scenes that load and unload depending on where the player is at any given time. If I do this correctly, the player, when in VR, won't have any idea it's happening.

I use Adobe Fuse and Mixamo for my animated characters. This works really well, except when Unity decides to just not import the textures. I found a fix: export the character model as an .obj with the textures packed into one file, then let Mixamo re-export the rigged character as an .fbx. Then it works fine. The only issue is that there isn't any facial animation to work with for voice acting. But I found a really great, in-world solution to this... which you will find out about when the game is released!

Thanks for reading! - Judah

For the past few weeks, I've been working hard on making a game environment that looks and feels like the final game should. I plan on putting the mechanics I finished in the blockout scene (mentioned in a previous post) into this scene and have a pretty cool vertical slice to release for free.
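To sketch the scene-splitting idea from a couple of paragraphs up: each chunk of the world lives in its own scene, loaded and unloaded additively by distance to the player. The scene name, radii, and hysteresis factor below are placeholders, not the game's actual values:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Streams one world chunk in/out based on how far the player is from it.
// One of these would exist per chunk (or a manager would loop over a list).
public class SceneStreamer : MonoBehaviour
{
    public Transform player;
    public string sceneName = "District_01"; // placeholder scene name
    public Vector3 chunkCenter;
    public float loadRadius = 120f;
    bool loaded;

    void Update()
    {
        float dist = Vector3.Distance(player.position, chunkCenter);
        if (!loaded && dist < loadRadius)
        {
            SceneManager.LoadSceneAsync(sceneName, LoadSceneMode.Additive);
            loaded = true;
        }
        else if (loaded && dist > loadRadius * 1.25f) // hysteresis avoids thrash
        {
            SceneManager.UnloadSceneAsync(sceneName);
            loaded = false;
        }
    }
}
```

The async calls matter in VR: a synchronous load stalls the frame, and a stalled frame in a headset is instantly noticeable.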
With this demo release, I, of course, will be announcing what the game is and overall just more info about it. In putting together this environment I'm met with a few challenges: The first being that in a greybox scene, it's very clear where you can and cannot go--that's the point, it's a basic prototype. But, with a finalized game environment, the design of the level itself has to educate the player on what to be doing, while still not making the player feel restricted. This is especially difficult in a city map, where I only want a small section to be explorable, without making the player want to see other places that they can't access at that moment. The other challenge is making the level design work hand in hand with the mechanics. Just because a level looks great, it doesn't mean it fits the way the game plays, and vice versa. Especially in VR, where you can't stop the player from sticking their head in everything, AND a game where you should be able to run and jump wherever. With some trial and error (and lots of rebaking lights!), this area seems to be coming together nicely. Thanks for reading! - Judah So, along with the force grab and push mechanics, the actual objects need to work well too!
The past few days I've been focusing on the combat with enemies to make it as satisfying as possible. This, of course, starts with guns. So far everything has been working out well: the guns function as similarly to real-world pistols as possible, with working slide racks, magazines, a safety, and bullets. Each magazine has ten rounds in it, and when the gun is empty, it will drop out. You can then find more in the world and either pop them into your trusty pistol, or put them in your pocket for later. Additionally, after you insert a magazine into the gun, in order to fire it, you have to rack the slide, just like in real life.

The reason I opted for instantiating actual bullet objects and applying a force to them, rather than just using a raycast (an invisible line that calculates the hit point), is that I won't have to worry about choosing which objects to apply a force to when they're hit. This way, static objects won't move, and non-static objects will! Easy enough! For the enemies, each of their body parts is connected with joints, so they react much like a real body would to a bullet.

Thanks for reading! - Judah

Short update today; I unfortunately got tendinitis in my left hand, so development has been a little slow--I have to have my brother do all the testing for now... But, that aside: after finishing the great force grab mechanic, I've implemented a force push mechanic. Now, when you're holding an object with the trigger, if you hold the grip button, then release the trigger, the object will forcefully shoot out of your hand. This is useful for distracting enemies quickly, and even knocking other objects around to damage enemies... or people. Maybe.

Overall, once I get the combat and gun mechanics done, I should be migrating from my greyboxed scene to the actual game scenes with all the finished art. Then I will be announcing more concrete details about the final game.

Thanks for reading!
- Judah

VR is supposed to be better than the real world! So, why do I have to keep bending down to pick up objects I dropped (and sometimes have the headset almost fall off my face!)? It's really annoying! So, using some of the code from the failed grappling hook--just applying it to interactable objects instead of the player--I now have a system where, if you're pressing the grip button, you can point at objects to highlight them. Then, if you press the trigger, they will fly to your hand as if you're using the force!

It works really well and is super satisfying to use. I am considering adding a pointer to indicate what you're pointing at to make it clearer, but I don't think that's necessary. It would be good for UI though. But I'm trying to avoid the usual point-and-click style UI, as it can break immersion in VR.
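For anyone curious how a system like this hangs together, here's a rough, simplified sketch. The layer mask, ranges, and pull math are illustrative guesses, not the real implementation:

```csharp
using UnityEngine;

// While the grip is held, raycast from the hand to find an interactable;
// on trigger, fling the highlighted object toward the hand.
public class ForceGrab : MonoBehaviour
{
    public LayerMask interactableMask; // layers eligible for force-grabbing
    public float range = 20f;
    public float pullSpeed = 10f;
    Rigidbody highlighted;

    // Called each frame with the current button states.
    public void Tick(bool gripHeld, bool triggerPressed)
    {
        if (!gripHeld) { highlighted = null; return; }

        if (Physics.Raycast(transform.position, transform.forward,
                            out RaycastHit hit, range, interactableMask))
        {
            highlighted = hit.rigidbody; // a real version would tint it here
        }

        if (triggerPressed && highlighted != null)
        {
            Vector3 toHand = (transform.position - highlighted.position).normalized;
            // A bit of upward bias gives the flight a satisfying arc.
            highlighted.velocity = toHand * pullSpeed + Vector3.up * 2f;
        }
    }
}
```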
I am hesitant to announce anything concrete about the project because I am working on all the mechanics in a "greyboxed" scene, and I have very few fully realized scenes to share just yet--but I promise there is more to come!
Thanks for reading! - Judah So far most of the simple object interactions work well in VR, but there's one problem: the player's hands don't collide with the environment.
Because I'm just using SteamVR's controllers as hands in their [CameraRig] prefab, there isn't much I can do about this. The hands collide with objects in the scene, letting you knock objects around, but they don't collide with static geometry. The catch is that if the hands do collide with static geometry, there will be a difference in position between the player's hands in real life and in VR. Originally this was thought of as a no-no in VR--you don't separate the player's real body from their VR body. But after extensive testing, many developers are going against this trend, as having hands pass through objects is more immersion breaking than a little gap between the real and virtual hands.

Because VRTK's animated hand models rely on having the ControllerEvents script in a parent object to animate, I wouldn't be able to do this out of the box; for the hands to collide with static geometry, they need to be separate objects from the controller alias scripts but still track with them. I ended up modifying VRTK's animated hand script so I can manually drop the scripts in via the inspector. No big deal. But I still needed a good way to have the hand rigidbody move with the controller while still colliding. VRTK does have a good RigidbodyFollow script that seems to work exactly as I need with a cube, but not so much with the hand prefabs. I'm gonna continue working on it to get the desired effect.

Additionally, I really want the player to have arms that collide as well, so I've been looking for a good IK solution that will allow me to do this. I found a really cool script online that I can apply to any rigged body, then parent the hands to the controllers, and the player will have a full body. Should be pretty cool. Either that, or I'm going to mess with VRArmIK (I think that's what it's called), a great Unity asset that is essentially plug and play--though it's been a little more difficult because I'm using VRTK. Either way, I'll figure it out.

Thanks for reading!
- Judah

After messing more with the grappling hook (solving some issues, causing more), I've decided to take a bit of a break from it. Instead, I've started working on the weapon system. Melee weapons are going to be next, after I finish the ranged weapons.
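Going back to the colliding-hands problem from the last post for a second: the follow-but-still-collide behavior is usually done by driving a non-kinematic hand rigidbody with velocities instead of parenting it to the controller. This is a generic sketch of that pattern, not VRTK's RigidbodyFollow:

```csharp
using UnityEngine;

// Each physics step, give the hand rigidbody exactly the velocity needed to
// reach the tracked controller. Walls then stop the visible hand, even
// though the real hand keeps moving.
public class PhysicsHand : MonoBehaviour
{
    public Transform controller; // the tracked controller transform
    Rigidbody rb;

    void Awake() { rb = GetComponent<Rigidbody>(); }

    void FixedUpdate()
    {
        // Velocity that closes the position gap in one physics step.
        rb.velocity = (controller.position - rb.position) / Time.fixedDeltaTime;

        // Same idea for rotation, via the delta quaternion's angle-axis form.
        Quaternion delta = controller.rotation * Quaternion.Inverse(rb.rotation);
        delta.ToAngleAxis(out float angle, out Vector3 axis);
        if (angle > 180f) angle -= 360f; // take the short way around
        rb.angularVelocity = axis * (angle * Mathf.Deg2Rad / Time.fixedDeltaTime);
    }
}
```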
The problem with VR weapons is that you can't just hold down left click to shoot and press R to reload. You need physical interactions. Yes, we could just add a gun object to the player's hand and have them shoot it, but I want more than that. For ammunition, there will be reloadable magazines that you have to find and use to add ammo to the guns, and hopefully, if I can get this right, you can slide back the top to cock the gun in place. Instead of instantiating prefabs as bullets (which can be taxing on performance), I'm going to be using a raycast.

Because I'm using VRTK for my object interactions, there are multiple ways to handle shooting input. I've messed with many of them, but I'm trying to find the one that has the best compatibility with existing functions. I can extend their base InteractableObject class and just override the "OnUse" method--but this hasn't been working for me for some reason. I can also just subscribe to the use event, but this proved tedious to add to each usable object and I got fed up with it. Now I'm just checking if "TriggerPressed" is true in Update(), but the problem is this won't allow me to change the use button in the future. I'm gonna continue messing around to find the best approach.

Additionally, I've added a page on the site for my brother and composer Eli Mantell, where you can listen to the entire Oh Captain VR soundtrack. If you want to use any of it, please contact him via the form on the page!

Thanks for reading! - Judah

Hi again!
Yes, once again there's more news about the grappling hook--who knew something so simple would generate so much blog content! Also, in the future, I hope to structure blog posts in mini sections like you see below. Enjoy!

Grappling Hook

Anyway, after messing with it some more, I struck a good balance between using the grappling hook and climbing the environment. Now only certain objects can be hooked onto--things you wouldn't be able to reach by just climbing. The way I limited this is as follows: when holding the grappling hook, the hook part will be either red or green. When pointing at an object that can be grappled to, the hook turns green and the object highlights yellow. This makes it clear what can and cannot be grappled onto, and it works perfectly. And because you can no longer use it at just any time (so the object isn't listening for a controller trigger press every frame), there are fewer bugs with the other inputs--though it still needs to be worked on.

Jumping

I also changed the jump controls from the grip buttons to the touchpad buttons. Now you can hold one of them to move (walk in place, to put it simply), then immediately push the other touchpad at the same time and pull back to launch yourself into the air. This makes it much easier to run, jump, land, and continue running immediately after.

Enemy AI

All of this, in combination with the enemy AI, works really well for getting around the environment and hiding from the enemies' field of view. Depending on the enemy's range, you can run away from a melee attack (sword slashes), or try to dodge projectiles being fired at you.

Teleportation

The primary way of getting around in this game is walk-in-place movement, but I'm also working on a mechanic that relies on teleportation as well: being that it's a stealth game--Oh, did I not mention that it's a stealth game yet?--one of the objects you can have in your inventory is the Flash Bang. Kinda like Batman. Or ninjas. Or both. Whatever.
The way they work is as follows: you grab one from your inventory. When you do, time slows down to 0.25 speed and a teleportation cursor appears. You choose where you want to go, warp there in a cloud of smoke, and time goes back to normal speed. This is essentially a special ability that will allow you to get to places you otherwise wouldn't be able to reach.

. . .

Once all of these are fully prototyped and work the way I want them to, I will be moving on to the actual level design, moving all the mechanics into what will be the actual game.

P.S.: The image in the last post is a screenshot from one of the test scenes I made. It's cool, but the final game will look even better. :D

Thanks for reading! - Judah

Okay, so, first of all, yes, I changed the site look. I got tired of the old one and found one that I like better.
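A quick sketch of how the smoke-bomb slow-mo could be wired up. The field names and the 0.02 base physics timestep are assumptions, not the project's actual code:

```csharp
using UnityEngine;

// Slow time while aiming the teleport, then warp the rig and restore.
public class SmokeTeleport : MonoBehaviour
{
    public Transform playerRig;   // the VR camera rig root
    public float slowScale = 0.25f;
    const float BasePhysicsStep = 0.02f; // Unity's default fixed timestep

    public void BeginAim()
    {
        Time.timeScale = slowScale;
        // Scale the physics step too, or physics turns choppy in slow-mo.
        Time.fixedDeltaTime = BasePhysicsStep * slowScale;
    }

    public void Commit(Vector3 destination)
    {
        playerRig.position = destination; // smoke VFX would trigger here
        Time.timeScale = 1f;
        Time.fixedDeltaTime = BasePhysicsStep;
    }
}
```

One VR-specific note: `Time.timeScale` slows the world but not head tracking, which is exactly what you want--the player keeps looking around at full speed while everything else crawls.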
But, on to the more important stuff: I am almost finished with the enemy AI! I've been working on it for a while and am super proud of what I have at this point.

There are three types of enemies: melee, ranged, and "flier." The melee and ranged enemies are self-explanatory (one fights close up with hand weapons, while the other shoots from a distance), but the "flier" is something different (excuse the spelling--I wasn't really sure what to call it internally!). It moves around the waypoints the same way the others do, but at set intervals it will fly up to the highest point, survey the area, then fly back down. In the game, the player can run and jump on objects of different heights, so the enemies will react to that and try to fly up and shoot the player down. In VR it is not only quite nerve-racking to suddenly see the flying enemy come up beside you as you're climbing, but it's a ton of fun to shoot them down from the sky as well.

I also improved the interaction system. Now, objects that can be "used" will stay in your hand until you find an acceptable place to put them, allowing the trigger button to toggle actions, such as shooting a gun. Along with this, I've reworked the grappling hook to play better with VRTK (Virtual Reality ToolKit), and now it works so much better! Instead of allowing the player to grapple onto anything, I've decided that it can only be used on certain objects you couldn't just jump to, like cranes above you, etc.

One of the next things I will be working on is a collision sound framework. In VR, it's extremely important to make the world as immersive as possible, and this includes sound. While music (more about this later this week!) and ambiance are easy to implement, with so many objects moving around the scene--most of which are controlled by the player's hands--it's hard to have a single AudioSource controlling all of them.
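A lookup keyed on a pair of material tags, with volume scaled by impact strength, is one common way to approach this problem. Here is a hypothetical sketch in Python; the material names, table entries, and thresholds are all illustrative, not taken from the project:

```python
# Map unordered material pairs to a sound clip name. frozenset makes the
# lookup order-independent: (wood, metal) and (metal, wood) hit the same key.
SOUND_TABLE = {
    frozenset(["wood", "metal"]): "wood_metal_clank",
    frozenset(["wood", "wood"]): "wood_thud",
    frozenset(["stone", "metal"]): "stone_metal_clang",
}

MIN_IMPACT = 0.5   # ignore very soft touches
MAX_IMPACT = 10.0  # impacts at or above this play at full volume


def collision_sound(material_a, material_b, impact_speed):
    """Return (clip_name, volume) for a collision, or None if too soft/unmapped."""
    if impact_speed < MIN_IMPACT:
        return None
    clip = SOUND_TABLE.get(frozenset([material_a, material_b]))
    if clip is None:
        return None
    volume = min(impact_speed / MAX_IMPACT, 1.0)
    return clip, volume
```

In Unity the impact speed would come from the collision's relative velocity, and the returned clip and volume would feed an AudioSource at the contact point.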
So, I've come up with a way to mark different objects as being "made of" different materials (such as wood, metal, or stone), and depending on what type of object is colliding with it (while taking into account how hard the one object hits the other), a certain sound will play. While it's all still conceptual right now, I have a feeling that when I write it up tomorrow it should work properly.

More info to come about this awesome game! :D

Thanks for reading,
Judah

So, I originally wanted to implement a feature from Oh Captain VR into this game: the grappling hook.
I thought it would add another stealth element to the game, but I ran into two problems:

1) It made playtesters only want to use the grappling hook to get around, rather than the locomotion system I talked about yesterday, and it made the game too easy and not very fun.

2) While it did work, having a usable physics object in the player's hand made jumping wonky and inconsistent, which we can't have, obviously.

I'm considering removing the grappling hook altogether because of these issues and instead putting the focus on stealth combat.

Thanks for reading,
Judah
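Circling back to the newest post's grapple-target check (red or green hook, yellow highlight on valid targets): stripped of the Unity raycasting, the decision is just a tag test. A hypothetical sketch, with every name illustrative:

```python
GRAPPLEABLE_TAG = "Grappleable"


class SceneObject:
    """Stand-in for a tagged object the player can point the hook at."""

    def __init__(self, name, tags=()):
        self.name = name
        self.tags = set(tags)
        self.highlighted = False


def update_hook(pointed_at):
    """Return the hook colour, highlighting the target only if it is valid."""
    if pointed_at is not None and GRAPPLEABLE_TAG in pointed_at.tags:
        pointed_at.highlighted = True  # yellow highlight in-game
        return "green"                 # safe to fire the hook
    if pointed_at is not None:
        pointed_at.highlighted = False
    return "red"
```

Called every frame with whatever the controller's pointer hits, this keeps the hook's colour and the target highlight in sync with one authoritative check.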
About Me

Aside from being a game developer and CEO of MidnightCoffee, Inc., Judah teaches game design to middle and high school students. He is also a professional magician and retro game enthusiast.