I guided a small team of developers to create a playable demo of Swords and Shovels, a game project by Pluralsight aimed at illustrating game development (and tying into their courses).
I contracted with Oculus (Facebook) for several months with the UX department. As part of this work I prototyped several VR-centric effects, including the Orb that showed up when you activated voice recognition. It was a fun mix of procedural mesh generation and voice-data-driven deformations.
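The general idea behind a voice-reactive surface is easy to sketch: displace procedurally generated vertices by a live audio amplitude. The Python below is purely illustrative (the ring shape, the `deform_ring` helper, and all its parameters are my own stand-ins, not the actual Orb code):

```python
import math

# Illustrative only: one cross-section ring of a voice-reactive "orb".
# In the real effect, 'amplitude' would come from the microphone signal.
def deform_ring(base_radius, n_points, amplitude, time):
    """Return (x, y) vertices of a circle, displaced outward by amplitude."""
    pts = []
    for i in range(n_points):
        theta = 2 * math.pi * i / n_points
        # Per-vertex wobble so the surface ripples instead of scaling uniformly.
        wobble = math.sin(4 * theta + time)
        r = base_radius * (1.0 + 0.3 * amplitude * wobble)
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts
```

With amplitude at zero the ring stays a perfect circle; as the voice gets louder the ripples grow, which is the basic feedback loop such an effect needs.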
This was a project for a sci-fi adventure game, obviously inspired by Blade Runner. Although the project didn't end up getting greenlit, it was a fun time working on an adventure-type game, with interesting camera systems, scripted scenes, and a non-linear story.
Bomb Squad Academy is my current game project, inspired by my love of electronics. It is currently greenlit and will be on sale in the first quarter of 2017.
"Save the world one wire at a time with Bomb Squad Academy, a puzzle game where you have to defuse bombs under a time constraint."
This game is a brand new adventure for me. The underlying tech for it is extremely simple. Compared to planets and pathfinding systems, simulating small digital logic circuits is pretty trivial. In fact, my friend Olivier and I prototyped the mechanics in less than a week. But in so many ways, making this game, and more importantly, being intent on finishing it, is forcing me to become a better all-around developer.
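To give a sense of just how simple that underlying tech is, here is a minimal sketch of combinational gate evaluation. This is illustrative Python, not the game's actual code; the gate set and circuit format are my own:

```python
# Illustrative gate-level simulation. A circuit maps each output wire to
# (gate, input_a, input_b); the inputs dict holds the player's switches.
GATES = {
    "AND":  lambda a, b: a & b,
    "OR":   lambda a, b: a | b,
    "XOR":  lambda a, b: a ^ b,
    "NAND": lambda a, b: 1 - (a & b),
}

def evaluate(circuit, inputs):
    """Evaluate every wire in the circuit via memoized recursion."""
    values = dict(inputs)

    def value(wire):
        if wire not in values:
            gate, a, b = circuit[wire]
            values[wire] = GATES[gate](value(a), value(b))
        return values[wire]

    return {wire: value(wire) for wire in circuit}

# A wire that only goes high when exactly one of two switches is on:
xor_bomb = {"detonate": ("XOR", "sw1", "sw2")}
```

A whole puzzle reduces to a handful of dictionary entries, which is exactly why the prototype took less than a week.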
Because I can't hide behind technical challenges, because there isn't some incredibly difficult programming task that I must complete before I can work on the rest of the game, I have had to constantly step out of my comfort zone.
So far I have done everything from level design to sound design to even concepting and then editing a trailer, something that, even on Fireborne, I always imagined tasking someone else with. But out of this effort to make everything in the game, I've come to realize something as well.
An indie game can generally only shine through its design and identity, not through technical or artistic achievement. That is something so often underestimated, especially by ex-AAA developers like me: no indie game will have better dragons than Skyrim, or better animations than Uncharted.
But while any single part of Bomb Squad Academy is unremarkable on its own, the sum is completely unique, and unique to me. Regardless of its future success, there is no game like it out there, and that's what makes it special!
Fireborne was my first attempt at making independent games after leaving the AAA world. A couple of friends and I spent many, many months crafting a hybrid game reminiscent of Populous and Diablo at the same time...
"Fireborne is a God game with RTS elements set in a fantasy universe. Guide the Fireborne and her people, the Yin, in a journey to free their home-worlds from the gruesome lifeforms that have destroyed them."
While unfortunately Fireborne was never finished (the game was over-scoped and lacked a singular vision), it was an amazing learning experience for all of us.
I was able to create many cool systems for the game: a fully deformable planet, an automata-based water simulation, behavior-graph AI for the characters, optimized pathfinding, and lots and lots of shaders for all the FX and unique visuals of the game. All this development allowed me to become completely fluent in Unity. In fact, I think this is when I fell in love with C# as well: I was able to create systems quickly, while still maintaining enough control over the clarity and performance of the code to do what I needed. It really clicked: C# is an amazing language for games if you are okay with trading off a little bit of performance for a huge gain in productivity.
The main lesson I learned while working on Fireborne, however, had nothing to do with coding. It had to do with the importance of having a strong Vision, and the need for that vision to drive the rest of development. It was a hard-learned lesson, of course, as very little of the work done on Fireborne will ever make it into any future product of mine, but it needed to be learned.
I only worked on Fallout 4 for about a year (out of its 3-year development cycle), but I still like to point to it as an example of my work.
For Fallout 4, I prototyped a new Behavior Graph-based AI system and continued to maintain and improve the Navigation and Pathfinding system that I had developed for Bethesda's previous titles.
Skyrim is probably the game I enjoyed working on the most at Bethesda. Not only because it was the first Elder Scrolls game I worked on from beginning to end, but because it was the one where I had the most opportunity to add my own touch.
Along with significantly improving the Pathfinding system I had begun work on for Fallout 3 (including a complete overhaul of how it interacted with the animation system and how it handled quadrupeds), I was able to add a completely new aspect to the game itself: the critter system.
Butterflies, dragonflies, and fish weren't supposed to be in the game; or if they were going to be, they would simply be animated meshes, because the actor system of the game couldn't support that many new interactable entities in the world.
I created a small actor system that relied almost entirely on scripting for lightweight processing. This way, butterflies would have their own (extremely simple) AI and interact with the world: landing on objects, running away from the player, etc., all under the control of designers and artists. By removing the hurdle of needing programmer time to add new behaviors, the critter system spread like wildfire, and pretty quickly every world artist in the company was adding small animals to the world.
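The kind of scripted, lightweight critter brain described above can be sketched in a few lines. This is purely illustrative (the real system ran on Bethesda's script VM; this 1-D `Butterfly` class is a made-up stand-in):

```python
# Illustrative stand-in for a scripted critter brain.
# Positions are 1-D to keep the sketch tiny.
class Butterfly:
    FLEE_RADIUS = 3.0

    def __init__(self, pos):
        self.pos = pos
        self.state = "wander"

    def update(self, player_pos):
        """One tick of extremely simple AI: flee the player if close."""
        if abs(self.pos - player_pos) < self.FLEE_RADIUS:
            self.state = "flee"
            # Step directly away from the player.
            self.pos += 1.0 if self.pos >= player_pos else -1.0
        else:
            self.state = "wander"
        return self.state
```

The point is that the whole behavior fits in a script a designer can read and tweak, with no heavyweight actor machinery behind it.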
I have a love-hate relationship with Liberty Prime!
As a new, ambitious programmer, I somehow convinced the team that I should re-write Bethesda's pathfinding system for Fallout 3. Navmeshes were all the rage, so of course that's what I started to implement.
Of course, with a large, connected open world, data size was an issue, and I had to come up with a multi-level pathing solution system so that actors could traverse the world even when the player wasn't around. That part of the system was pretty complicated to put together, but it went over fairly well.
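The general shape of such a multi-level solution can be sketched like this. This is an illustrative Python sketch, not Bethesda code; here the coarse pass is a plain BFS over a hypothetical region-connectivity graph:

```python
from collections import deque

def coarse_path(region_graph, start, goal):
    """BFS over a region-connectivity graph: the cheap, coarse pass.
    Off-screen actors can follow the region sequence alone; a detailed
    navmesh path only needs to be solved near the player."""
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for neighbor in region_graph[path[-1]]:
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append(path + [neighbor])
    return None  # goal unreachable from start
```

The coarse graph is tiny compared to the navmesh itself, which is what keeps the data size and the off-screen simulation cost manageable in a large connected world.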
What was more of a challenge, unfortunately, was that simple navmesh-based pathfinding doesn't work very well with agents of varying sizes. Navmeshes are very much a precompute-once, use-forever kind of structure.
For the most part that wasn't too much of an issue: most characters in Fallout were, if not the same size, at least of the same order of magnitude! Most actors needed to get through doorways, be they super mutants or dogs.
Liberty Prime, however, was another story. Tall as a building and as wide as a bus, he broke everything! All the clever tricks I had used until then didn't work. And Liberty Prime was crucial to the game: he HAD to walk a pretty long distance while blowing stuff up, throwing nukes, and spreading capitalist propaganda.
So I spent several months rewriting portions of the pathfinding system to accommodate him and all the havoc he wreaked. In the process, I made the system able to handle any character size properly, using a clever two-pass pathing solution.
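The actual system isn't public, but one common way to make a single precomputed structure serve agents of any size is a clearance map that a second, per-agent pass filters against. A grid-based Python sketch of that idea (illustrative only):

```python
from collections import deque

def clearance_map(grid):
    """For every walkable cell, the distance (in cells) to the nearest
    obstacle (1 = obstacle, 0 = walkable). Computed once as a first pass;
    in the second, per-agent pass, an agent of radius r only considers
    cells whose clearance is >= r, so one map serves every agent size."""
    h, w = len(grid), len(grid[0])
    INF = h * w
    dist = [[0 if grid[y][x] else INF for x in range(w)] for y in range(h)]
    queue = deque((x, y) for y in range(h) for x in range(w) if grid[y][x])
    while queue:  # multi-source BFS outward from all obstacle cells
        x, y = queue.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and dist[ny][nx] > dist[y][x] + 1:
                dist[ny][nx] = dist[y][x] + 1
                queue.append((nx, ny))
    return dist
```

With this, a dog and a Liberty-Prime-sized agent query the same data; only the clearance threshold changes between them.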
And it was all worth it! Liberty Prime is awesome! I think one day I'll build a real articulated 12" version of him...
I joined Bethesda about two thirds of the way through the development of Oblivion. Not long enough to have an entire system to my name, but enough to have had a small, visible impact.
I implemented various minor AI features for the game, but the one thing that really stood out was that I gave the characters 'shifty eyes'!
The characters in Oblivion would already look at you when you talked to them, and sometimes blink, but they would never look away. It was creepy. I was asked to add a little bit of randomness to the eyes, and I took the opportunity to tie their eye movement to their disposition.
If a character was happy, I would animate their eyes up slightly and of course down when sad. When angry I would give them really fast saccades. When scared, they would look left and right. All very simple stuff, especially by today's standards, but it added a tremendous amount of life to the NPCs.
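The idea is easy to sketch. In this illustrative Python snippet (the parameter table and `next_gaze_target` are made up, not Oblivion code), each disposition maps to a vertical bias and a scan width for picking the next gaze target:

```python
import random

# Made-up parameter table: vertical_bias moves the gaze up (happy) or
# down (sad); spread widens the left/right scan; saccades_per_sec would
# drive how often a new target is picked.
EYE_PARAMS = {
    "happy":  {"vertical_bias":  0.2, "spread": 0.1, "saccades_per_sec": 0.5},
    "sad":    {"vertical_bias": -0.2, "spread": 0.1, "saccades_per_sec": 0.5},
    "angry":  {"vertical_bias":  0.0, "spread": 0.2, "saccades_per_sec": 4.0},
    "scared": {"vertical_bias":  0.0, "spread": 0.6, "saccades_per_sec": 2.0},
}

def next_gaze_target(disposition, rng=random):
    """Pick the next (x, y) eye offset for an NPC in this mood."""
    p = EYE_PARAMS[disposition]
    x = rng.uniform(-p["spread"], p["spread"])
    y = p["vertical_bias"] + rng.uniform(-0.05, 0.05)
    return x, y
```

A tiny table like this is the whole trick: the per-frame animation code stays identical, and only the sampled targets change with mood.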
Aeon Flux was the second game I worked on at Terminal Reality. It was a very short schedule (9 months) for a movie tie-in.
For me, this mostly coincided with the development of a new Particle Engine and Editor for the Infernal Engine. I called it Ephemera, after one of my favorite characters from the previous game I worked on: BloodRayne 2.
This was my first real experience building not just a feature or a system, but a complete tool for artists to use.
BloodRayne 2 was the very first professional game I worked on!
I very quickly became in charge of all things blood-related on the game. From blood splats to blood geysers to blood rage and blood tornadoes (yes, tornadoes!), I worked on all things gross and amazing.
Blood pools spread, dried, left footprints: all sorts of things that were both hard to do (due to Xbox and PS2 limitations) and extremely satisfying! I also worked on several special effects for the game, including freeze-time and a character with animated body tattoos: Ephemera.
Overall, I had an amazing time and will forever have fond memories of working on this game.
Unleash the power of your mouse with Back Button - a full-featured history and favorite system for Unity:
Get it from the Asset Store here: https://www.assetstore.unity3d.com/#!...
- Use your mouse's Back and Forward buttons to navigate through your selection history. That's right, just like you would in Explorer or your favorite browser.
- Easily highlight-and-lock previous selection items while drag-and-dropping a selection.
- Powerful but simple per-scene and per-project favorites for quick access. You can favorite anything you
can select, much more than with the standard favorite system.
Low Poly Terrain is a simple-to-use asset for generating faceted, hard-edge, low-poly terrains in Unity.
Grab it here: https://www.assetstore.unity3d.com/#!...
- Import height and color maps from your favorite generation/painting tool.
- Full collision support.
- Achieve a properly faceted look that the default terrain system can't provide.
- Generate and manage LOD levels dynamically, keeping a low poly count.
- Low CPU and GPU usage, taking advantage of dynamic batching.
- Place trees or other objects procedurally on your terrains.
- Ideal for high performance environments such as VR.
- Works with PBR or legacy/custom lighting models.
- Full source and shader code included.
A demo of the UV Remap shader I wrote for our game Fireborne.
The main effect doesn’t use particles or animation sheets (the embers and glow ball do), just a custom pixel shader.
Faceted Flight is a VR-first game designed by my friend Matt Scott of The Department of Silly Stuff! I helped him in a few different areas of the game, mainly creating a hard-edged, faceted-looking terrain system that would run on VR devices and prototyping multiplayer. Sadly, the multiplayer version of the game was never finished.
A demonstration of the Planetary Fog System I put together for our game Fireborne. The fog deforms semi-procedurally, and uses custom shaders to achieve the end result. The atmosphere also uses its own shader to complement the effect.
Textures by Jonah Lobe (http://jonahlobe.com).
Another project related to our game Fireborne, procedural mountain generation was a very interesting exercise. Good art helps a ton, of course, but coming up with an algorithm that satisfied our requirements (creating mountain walls between areas while not looking repetitive) was really fun!
A short demonstration of the Planetary Deformation System I implemented for our game Fireborne. Most of the processing happens on the GPU.
I wrote P4Connect back in 2013 (back then I called it UnityP4). It has since been graciously taken over by Perforce themselves, and you can now download it for free!
A few examples of Procedural Terrain Generation and Object Placement.
A donut-shaped pyroclastic effect, derived from the effect you can find here: https://www.assetstore.unity3d.com/en/#!/content/10580
Most of what I did was modify the raycasting equations to work with a shape other than a sphere. Yes, that's right: the Pyroclastic effect is in fact ray-traced by your GPU!!! :)
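Ray-tracing a torus on the GPU is usually done in a pixel shader; as an illustration of the math involved, here is the same idea, sphere-tracing a torus signed-distance field, written in plain Python (my own sketch, not the asset's shader code):

```python
import math

def torus_sdf(p, major_radius=1.0, tube_radius=0.3):
    """Signed distance from p = (x, y, z) to a torus lying in the XZ plane."""
    x, y, z = p
    ring_dist = math.hypot(x, z) - major_radius
    return math.hypot(ring_dist, y) - tube_radius

def raymarch(origin, direction, max_steps=128, eps=1e-4, max_dist=20.0):
    """Sphere-trace along a (normalized) ray; return hit distance or None."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = torus_sdf(p)
        if d < eps:
            return t  # hit the donut's surface
        t += d  # the distance-field value is always a safe step size
        if t > max_dist:
            return None
    return None
```

Swapping the sphere's distance function for the torus one is essentially the "different shape" change described above; everything else about the marching loop stays the same.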