On the latest Nightshift Galaxy weekly dev stream I demonstrated the specialized level-editing tool I’m building inside the Unreal Level Editor, which I’m calling Scaffold.
I gave an impromptu introduction to my motivations and inspirations live, but on reflection the topic deserves a written deep-dive.
As I transition from developing systems to developing content, a larger share of my programming time is devoted to AI programming. Note: I’m discussing classical game AI – how NPCs make decisions – not the overhyped, odious tech fad of the same name.
Most recently, I coded the logic that pilots enemy jets: how to fly, when to turn, how to avoid obstacles, when to react to the player, and how to line up strafing attacks.
Music by EMMIE (Chiisana Kiseki) from the Project A-KO Greyside OST
Let’s break down the choices I made in the high-level technical design, as well as the syntax tricks I employ in low-level C++ code.
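To give a flavor of the decision logic, here’s a minimal sketch (hypothetical helper names and types, not the shipping code) of the priority-ordered steering choice at the heart of the jet brain:

```cpp
// A minimal sketch of priority-ordered steering. FJetState and the
// helper functions are hypothetical stand-ins, not the shipping code.
FVector ChooseHeading(const FJetState& Jet)
{
    FVector AvoidDir;
    // Survival first: veer away from terrain or obstacles dead ahead.
    if (SweepForObstacle(Jet, AvoidDir))
    {
        return AvoidDir;
    }
    // React to the player once they enter the threat radius.
    if (PlayerInRange(Jet))
    {
        return LineUpStrafingRun(Jet);
    }
    // Otherwise keep flying the patrol route.
    return NextPatrolDirection(Jet);
}
```

The appeal of this shape is that urgent behaviors (not crashing) always preempt opportunistic ones (attacking).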
Unreal Engine holds the Horizontal Field of View fixed and derives the vertical extent from the viewport’s aspect ratio. Therefore, switching from standard 16:9 to ultrawide crops the top and bottom of the view and zooms in, rather than revealing more to the left and right as you’d expect, and shrinking it to 4:3 creates an extreme fisheye effect.
Unreal Default: content on the sides of the viewport is fixed, causing weird cropping and zooming.
Our Camera: content at the top and bottom of the viewport is fixed, like we expect.
The default is basically never what we want, but it’s hard-coded and there’s no way to override it with just configuration, so we have to patch it. 😖
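The math of the fix is simple trigonometry: pick a fixed vertical FOV and derive the horizontal FOV from the aspect ratio each frame. A minimal sketch (the engine patch itself is where the real work lives):

```cpp
#include <cmath>

// Derive the horizontal FOV from a fixed vertical FOV, so widening the
// viewport reveals more on the sides instead of cropping top and bottom.
// Angles are in degrees; Aspect is width / height.
float HorizontalFovFromVertical(float VerticalFovDeg, float Aspect)
{
    constexpr float DegToRad = 3.14159265f / 180.0f;
    const float HalfVertRad = VerticalFovDeg * 0.5f * DegToRad;
    const float HalfHorizRad = std::atan(std::tan(HalfVertRad) * Aspect);
    return HalfHorizRad * 2.0f / DegToRad;
}
```

At 16:9 with a 60° vertical FOV this yields roughly 91° horizontal; at 21:9 it widens to about 107° instead of cropping.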
I can’t believe how long it’s been since I wrote for the blog! My time has been fully committed to an ambitious project to get funding to launch a new game studio.
Spoiler alert: 2024 was a bad year to try and start a studio. 😩
What follows is a post-mortem – including receipts – to pull back the curtain on this shadowy part of game development, as a service to enthusiasts and other indie developers. I also recorded a short video essay version if you just want the headlines and not all the nitty-gritty details.
Here I’ll discuss:
How game funding works
Our experience seeking funding
The state of the games industry
The future for my game, Nightshift Galaxy (TL;DR – Wishlist on Steam!)
Hey everyone, sorry for the posting delay, but I’ve been focused on my game for the last two months, and now I have a demo build, which I made for Glitch City LA x GUMBO NYC!
In Part 1 we took care of the preliminaries, setting up our assets and getting a base layer of root and hip motion going on which to start grooving. In this part we’ll take the next step and start stepping.
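As a preview of where we’re headed (a sketch under my own simplifying assumptions, with hypothetical names, developed properly in the article), the core stepping rule is: keep each foot planted until it drifts too far from its rest position under the hips, then swing it to a new plant point ahead of the motion.

```cpp
// Sketch of a step trigger: the foot stays planted until its anchor
// drifts too far from its rest position under the hip, then it swings
// toward a new plant point ahead of the motion. FFoot is hypothetical.
void UpdateStepTrigger(FFoot& Foot, const FVector& HipPos, const FVector& Velocity)
{
    const FVector IdealPos = HipPos + Foot.RestOffset;
    const float Drift = FVector::Dist(Foot.PlantedPos, IdealPos);
    if (!Foot.bSwinging && Drift > Foot.StepThreshold)
    {
        Foot.bSwinging = true;
        // Overshoot in the direction of travel so the body walks into the step.
        Foot.SwingTarget = IdealPos + Velocity * Foot.LeadTime;
    }
}
```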
This is the first article in a series on implementing procedural locomotion. I’m going to walk step-by-step through generating walk cycles for a pair of robo-pants. Part 1 will mostly be setup and preliminaries, but we’ll at least go far enough for the gestures to start exhibiting personality.
This is my second article on procedural animation. This time I’ll break down a simple “trailing” effect for joints that “hang loose” and trail behind the primary movement. It’s a subtle effect that can create a physical feel without much actual physics.
Wings trailing behind the character to create secondary motion.
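The gist (a minimal sketch, not the full anim node from the article) is exponential smoothing: each frame the trailing joint chases its target with a framerate-independent lag.

```cpp
// Sketch: a trailing joint chases its target with framerate-independent
// lag. Higher Stiffness means a tighter follow; lower means a floppier trail.
FVector UpdateTrail(const FVector& TrailPos, const FVector& TargetPos,
                    float Stiffness, float DeltaTime)
{
    const float Alpha = 1.0f - FMath::Exp(-Stiffness * DeltaTime);
    return FMath::Lerp(TrailPos, TargetPos, Alpha);
}
```

Because the blend factor is derived from DeltaTime, the trail behaves the same at 30 and 120 fps.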
Procedural animation is a priority on Tiny Starpilot. As a designer, I like how it improves the responsiveness and “juiciness” of interactive characters. As a programmer, it’s my area of professional expertise. Finally, as a solo dev, it lets me create more “modular” assets which can be shared between characters with different skeletons.
The biggest tool in the procedural-animation toolkit is called Inverse Kinematics. With these algorithms, designers can specify “targets” (often called effectors) for the animation system, and joints are internally rotated to satisfy those goals.
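For a taste of what’s under the hood (a textbook two-bone solve, not necessarily my exact node), the classic closed-form case uses the law of cosines to get the elbow or knee bend from the distance to the effector target:

```cpp
#include <algorithm>
#include <cmath>

// Textbook two-bone IK: given both bone lengths and the distance to the
// effector target, the law of cosines yields the interior joint angle.
// Pi radians means the limb is fully extended; smaller means more bent.
float SolveBendAngle(float UpperLen, float LowerLen, float TargetDist)
{
    // Clamp so unreachable targets extend the limb instead of breaking acos.
    const float Dist = std::clamp(TargetDist,
                                  std::abs(UpperLen - LowerLen),
                                  UpperLen + LowerLen);
    const float CosBend = (UpperLen * UpperLen + LowerLen * LowerLen - Dist * Dist)
                        / (2.0f * UpperLen * LowerLen);
    return std::acos(CosBend);
}
```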
An example of my modular design: anim nodes automatically discover and coordinate with each other, with very little explicit setup.
I’ve begun consolidating my various demos and prototypes into a single Unreal Engine project.
A big feature to resolve with this port is how collision is handled. Let’s discuss collision handling in general, and how to start implementing it in Unreal in particular. This will be a code-heavy post (C++).
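As a preview, the heart of it in Unreal is the sweep test: cast the moving object’s collision shape along its intended motion and stop at the first blocking hit. A minimal sketch, assuming a sphere shape and the default world-static channel:

```cpp
// Sketch: sweep a sphere along the intended move and stop at the first
// blocking hit. Hit.Location is where the sphere's center comes to rest.
FVector SweepMove(UWorld* World, const FVector& Start,
                  const FVector& Delta, float Radius)
{
    FHitResult Hit;
    const bool bBlocked = World->SweepSingleByChannel(
        Hit, Start, Start + Delta, FQuat::Identity,
        ECC_WorldStatic, FCollisionShape::MakeSphere(Radius));
    return bBlocked ? Hit.Location : Start + Delta;
}
```

A real character controller then has to decide what to do with the leftover motion (slide, bounce, or stop), which is where the interesting design choices live.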
I’ve planned for three playable pilots since my earliest prototype.
The design goal is straightforward: approaching the same mission with pilots who handle differently increases replay variety. It also acts as a difficulty handicap that casual or hardcore players can opt into by choosing particularly easy or challenging pilots for particular maps.
This week I was feeling ambitious and decided to flex my illustration muscles, making a nice set of pilot-portrait key art, as well as sketch concepts for their fighter craft. I really want to push the 80s mecha-anime space-opera creative direction.
Experience has made me opinionated about implementing 3rd person cameras. People naturally, but naively, think about the camera as a second actor in the world, following the player around (like Lakitu in Super Mario 64).
Let’s discuss an alternative perspective, where you instead consider the player’s position on the 2D picture plane (with code!).
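The gist, as a sketch (hypothetical names and a simplified framing model; the post develops the idea properly): rather than steering a camera actor through 3D space, measure where the player sits on the picture plane and pan only by the amount they escape a framing window.

```cpp
// Sketch: frame the player on the 2D picture plane instead of chasing
// them in 3D. Project the player into camera space, clamp against a
// framing window (in world units at the player's depth), and pan the
// camera by only the excess. Unreal convention: X forward, Y right, Z up.
void FramePlayer(FTransform& Camera, const FVector& PlayerPos, const FVector2D& Window)
{
    const FVector Local = Camera.InverseTransformPosition(PlayerPos);
    const float ExcessY = FMath::Sign(Local.Y) *
        FMath::Max(FMath::Abs(Local.Y) - Window.X, 0.0f);
    const float ExcessZ = FMath::Sign(Local.Z) *
        FMath::Max(FMath::Abs(Local.Z) - Window.Y, 0.0f);
    // Pan (never rotate) so the player slips back inside the window.
    Camera.AddToTranslation(Camera.TransformVector(FVector(0.0f, ExcessY, ExcessZ)));
}
```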
I juggle a dozen little side-projects which I dust off from time to time.
Tiny Starpilot was a minigame I conceived of years ago, when I was freelancing between jobs and considering the possibility of becoming a one-man-band mobile game developer.
The design was simple: hold the phone sideways and slide your thumbs along the sides to act as tank-tread controls. Your ship would autofire on a regular beat.
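The control mapping is the classic differential-drive formula (a sketch from memory, not the prototype’s exact code): the average of the two thumb values drives thrust, and their difference drives turning.

```cpp
// Sketch: classic differential-drive ("tank tread") mapping.
// Each thumb slider reports a value in [-1, 1].
struct ShipInput { float Thrust; float Turn; };

ShipInput TankTreads(float LeftThumb, float RightThumb)
{
    ShipInput Out;
    Out.Thrust = 0.5f * (LeftThumb + RightThumb);  // both forward: full speed
    Out.Turn   = 0.5f * (RightThumb - LeftThumb);  // opposed: spin in place
    return Out;
}
```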