On the latest Nightshift Galaxy weekly dev stream I demonstrated the specialized level-editing tool I’m building inside the Unreal Level Editor, which I’m calling Scaffold.
I gave an impromptu introduction to my motivations and inspirations live, but on reflection the topic deserves a written deep-dive.
As I transition from developing systems to developing content, a larger ratio of my programming time is devoted to AI programming. Note: I’m discussing classical game AI – how NPCs make decisions – not the overhyped odious tech-fad of the same name.
Most recently, I coded the logic that pilots enemy jets: how to fly, when to turn, how to avoid obstacles, when to react to the player, and how to line up strafing attacks.
Music by EMMIE (Chiisana Kiseki) from the Project A-KO Greyside OST
Let’s break down the choices I made in the high-level technical design, as well as the syntax tricks I employ in low-level C++ code.
Unreal Engine applies the aspect ratio of the viewport to a fixed Horizontal Field of View. Therefore, switching from standard 16:9 to ultrawide will crop the top and bottom of the view and zoom in, rather than revealing more to the left and right as you’d expect, while shrinking it to 4:3 creates an extreme fisheye effect.
Unreal Default: content on the sides of the viewport is fixed, causing weird cropping and zooming.
Our Camera: fixing content on the top and bottom of the viewport, like we expect.
The default is basically never what we want, but it’s hard-coded and there’s no way to override it with just configuration, so we have to patch it. 😖
In Part 1 we took care of the preliminaries, setting up our assets, and getting a base layer root and hip motion going on which to start grooving. In this part we’ll take the next step and start stepping.
This is the first article in a series on implementing procedural locomotion. I’ll walk step-by-step through generating walk cycles for a pair of robo-pants. Part 1 will mostly be setup and preliminaries, but we’ll at least go far enough for the gestures to start exhibiting personality.
This is my second article on procedural animation. This time I’ll break down a simple “trailing” effect, for joints which “hang loose” and trail behind the primary movement. This is a subtle effect which can create a physical feel without too much actual physics.
Wings trailing behind the character to create secondary motion.
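The core of a trailing effect like this can be sketched in a few lines of plain C++ (my simplified illustration, not the article’s actual code): each frame the trailing joint eases toward where its parent says it should be, and the leftover lag reads as drag and secondary motion.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// Exponential smoothing toward a target: higher stiffness = snappier,
// lower = floppier. The exp() form keeps the lag frame-rate independent,
// unlike a raw per-frame lerp.
Vec3 TrailTowards(const Vec3& current, const Vec3& target, double stiffness, double dt)
{
    const double t = 1.0 - std::exp(-stiffness * dt);
    return { current.x + (target.x - current.x) * t,
             current.y + (target.y - current.y) * t,
             current.z + (target.z - current.z) * t };
}
```

Run once per frame on each trailing joint, this gives the “hang loose” look: the joint never quite catches up while the character is moving, then settles when they stop.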
Procedural animation is a priority on Tiny Starpilot. As a designer, I like how it improves the responsiveness and “juiciness” of interactive characters. As a programmer, it’s my area of professional expertise. Finally, as a solo dev, it lets me create more “modular” assets which can be shared between characters with different skeletons.
The biggest tool in the procedural-animation toolkit is called Inverse Kinematics. With these algorithms, designers can specify “targets” (often called effectors) for the animation system, and joints are internally rotated to satisfy those goals.
An example of my modular design: anim nodes automatically discover and coordinate with very little explicit setup.
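For a flavor of what an IK solver actually does, here’s the textbook two-bone case in 2D as a standalone sketch (illustrative names, not Unreal’s API): given two bone lengths and an effector target, the law of cosines yields the joint rotations that satisfy the goal.

```cpp
#include <cmath>

struct Angles { double shoulder; double elbow; }; // radians

// Two-bone IK: bones of length a and b, target at (tx, ty) relative to the
// root joint. Returns the root rotation and the bend at the middle joint.
Angles SolveTwoBoneIK(double a, double b, double tx, double ty)
{
    const double kPi = 3.14159265358979323846;
    double d = std::sqrt(tx * tx + ty * ty);
    // Clamp distance so the target is reachable (between |a-b| and a+b).
    d = std::fmax(std::fabs(a - b) + 1e-6, std::fmin(a + b - 1e-6, d));
    // Law of cosines gives the interior angle at the middle joint...
    const double elbowInterior = std::acos((a * a + b * b - d * d) / (2.0 * a * b));
    // ...and the offset needed to aim the first bone at the target.
    const double shoulderOffset = std::acos((a * a + d * d - b * b) / (2.0 * a * d));
    return { std::atan2(ty, tx) - shoulderOffset, kPi - elbowInterior };
}
```

Full-body solvers (FABRIK, CCD, etc.) generalize this idea to longer chains, but two-bone is the workhorse for arms and legs.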
I’ve begun consolidating my various demos and prototypes into a single Unreal Engine project.
A big feature to resolve with this port is how collision is handled. Let’s discuss collision handling in general, and how to start implementing it in Unreal in particular. This will be a code-heavy post (C++).
Experience has made me opinionated about implementing 3rd person cameras. People naturally, but naively, think about the camera as a second actor in the world, following the player around (like Lakitu in Super Mario 64).
Let’s discuss an alternative perspective, where you instead consider the player’s position on the 2D picture plane (with code!).
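As a taste of the picture-plane framing idea (my own minimal 1D sketch of the general approach, not the article’s code): rather than steering a camera actor through the world, measure where the player lands on screen and pan only when they leave a dead zone.

```cpp
// Camera with a single horizontal world coordinate, for illustration.
struct Cam { double x; };

// Where the player appears on the picture plane, relative to screen center.
double ScreenX(const Cam& cam, double playerX) { return playerX - cam.x; }

// Pan the minimum amount needed to pull the player back inside a dead zone
// of +/- halfWidth around screen center; otherwise leave the camera alone.
void KeepInDeadzone(Cam& cam, double playerX, double halfWidth)
{
    const double sx = ScreenX(cam, playerX);
    if (sx >  halfWidth) cam.x += sx - halfWidth; // drifted off to the right
    if (sx < -halfWidth) cam.x += sx + halfWidth; // drifted off to the left
}
```

Reasoning in screen space like this makes the framing rules explicit, instead of emerging indirectly from a follower actor’s movement logic.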