We're just a bunch of developers trying to do something - creating experiences that stand out, creating fun. Sometimes we succeed, sometimes we fail, but for us it's not always about whether you win or lose - it's about how you make the game.
You can read all about the latest stuff in the blog or check out these featured articles:
These two weeks involved lots of different activities, including content creation (crouch animations, particle effects), code cleanup and the development of new systems.
Some of the work can be seen in this animation:
Design upgrades & implementation.
The design was simplified to reduce some coding issues and make it easier to understand. It now uses more graphical representations (strikethrough lines and question marks) instead of words and clever groupings. Dual readability of everything (words + symbols) seems like a good rule for accessible UI.
The implementation required refactoring some of the systems made for the previous (mini-)game to include things like radio buttons and control themes. This is more or less the current state of the implementation:
I expect to finish this up in the coming weeks, as the UI is ~50% done.
Interactive entities defined fully in SGScript code.
I needed some way to easily produce destructible/dynamic items, which was the reason behind the design of scripted items. Currently they support:
Most items will probably work according to these few rules (which are fully supported now):
However, there are still a few things that might be necessary to add for proper effect coverage:
Some future wishes:
Minor upgrades that are still quite important.
In the end, I'd just like to mention this one idea that popped up recently - producing a storyless, arcadey, gameplay-test shooter that would be released before TACStrike and contain pretty much just the shooting aspect of the game. Full alert mode, minimal stealth, no exploration, just fighting against AI in different environments, trying to survive various encounters.
The reason behind the idea is that I've been struggling with the shooting aspect of the game due to its chosen perspective (aiming is somewhat unintuitive at the moment), and I'd rather not bet on too many moving parts with a big game. Exploration takes time to be fun; fights require tweaking. I'd like to tweak first and take the time later (preferably with some funding gained in the process).
The work done in these two weeks was so varied that it's hard to describe in a few words. There's also nothing new in terms of character motion and effects, so a video won't help much this time. So, let's just dive right into it.
But I do have one gif "video" of the item pickup feature:
The map update consists of showing enemy positions and cones of vision.
As for the item UI - just drawing a circle, line, transparent rectangle and its outline. Or is it just that simple?
Turns out, it isn't so simple if antialiasing has to be used for the UI, even if the 3D scene doesn't require it.
The way this is achieved is by first generating the polygon that needs to be rendered, then adding an outline to it at render time. There are some differences between line and polygon rendering, and there are limitations (corners of up to 90 degrees, lines/polygons rendered in a single color only, polygons must be convex). But that's exactly enough for most designs, and as the designer, I can keep the system's limitations in mind.
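As a rough sketch of the outline step - one simple way to grow an outline around a convex polygon is to push each vertex outward along the averaged unit normals of its two adjacent edges. The function name and the 2D lists here are my own simplification, not the engine's actual code:

```python
import math

def offset_outline(points, width):
    """Offset each vertex of a convex, counter-clockwise polygon outward
    along the averaged unit normals of its two adjacent edges.
    A sketch of the idea only - not the engine's real routine."""
    result = []
    count = len(points)
    for i in range(count):
        px, py = points[i - 1]            # previous vertex
        cx, cy = points[i]                # current vertex
        nx, ny = points[(i + 1) % count]  # next vertex
        # outward edge normal for a CCW polygon: rotate edge dir by -90
        n1x, n1y = (cy - py), -(cx - px)
        n2x, n2y = (ny - cy), -(nx - cx)
        l1 = math.hypot(n1x, n1y)
        l2 = math.hypot(n2x, n2y)
        ax, ay = n1x / l1 + n2x / l2, n1y / l1 + n2y / l2
        al = math.hypot(ax, ay)
        result.append((cx + ax / al * width, cy + ay / al * width))
    return result
```

Averaging unit normals keeps corners simple but undershoots the true miter for sharp angles, which fits the "up to 90 degree corners" limitation mentioned above.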
First I attached the UI to items that can be picked up.
Then I created a new "actionable" entity. I couldn't think of a more appropriate noun for the type name at the time. Also, I forgot to put it into a separate screenshot, so it'll just show up later on in this post.
Turns out the previous method was highly "combustible" (it often didn't work), even after some redesigns, so I finally rebuilt it along more mathematically stable lines.
A mathematically (and universally) stable method generally requires no intersection computations of any kind. If a method isn't analytical, or it returns a strict boolean value, it isn't stable: numbers tend to be imprecise on computers, and thus these methods are as well. Protip: any method suggested by crazy mathematicians, such as Delaunay triangulation, will most likely waste a lot of time on the way to some stability, and very few actually stable implementations exist (if any).
Rant aside, my approach this time around was rather simple: use a BVH (over AABBs) to gather sample points, then combine the light values with inverse squared distance weighting. If anything goes wrong, the sampling radius can be increased (it's always adjustable) and that's it. No more complications.
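The weighting step might look something like this - a simplified sketch where candidates are gathered with a linear scan instead of the BVH query, and `sample_light` is a hypothetical name:

```python
def sample_light(point, samples, radius):
    """Combine the light values of nearby samples with inverse squared
    distance weighting. `samples` is a list of (position, light) pairs;
    only samples within `radius` of `point` contribute."""
    weighted_sum = 0.0
    weight_total = 0.0
    for pos, light in samples:
        dist_sq = sum((a - b) ** 2 for a, b in zip(point, pos))
        if dist_sq > radius * radius:
            continue                 # outside the sampling radius
        if dist_sq < 1e-12:
            return light             # exactly on a sample point
        weight = 1.0 / dist_sq       # inverse squared distance weight
        weighted_sum += weight * light
        weight_total += weight
    return weighted_sum / weight_total if weight_total else 0.0
```

The method is stable in the sense described above: no intersection tests, just distances and a division guarded against zero.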
Also, I removed the need to place samples manually by generating them (as seen in the above picture). This should save me some time in the future in case I want to make visibility dependent on lighting.
When they finally get fixed, they should be a nice addition to the scene.
Two reasons for the brokenness: 1) there's no lighting, which is why they're black, and 2) for some reason they're generated off character meshes as well (which not only looks wrong but is a performance issue too). As soon as those get fixed, we get some cool blood on the walls from shootouts.
Just another entity in the game.
This was the model I created in Blender. Nothing too exciting but then again, it'll be rather small from the player's point of view.
When it was time to bring it to life, I turned to the character editor to make it happen.
The specifics of the entity required changes in the editor (character selection), so that had to come as well. But since it looks just like mesh selection...
... it's not in the screenshot, and what we see is just a preview of the entity.
After that I started implementing the entity in game.
At this point it's moving from side to side with configurable timeouts and has a flare to indicate its state of alertness. I also added the position and view cone in the map, but there's no more behavior implemented for it at this point. Oh and by the way - here's that "actionable" entity - the lamp with some UI connected to it.
Just something I've been slightly needing for some time.
I can now apply fitting which allows me to avoid messing up texture coordinates while resizing the block.
This screen shows all mission objectives and all available info about them.
I would like to say that gameplay is coming soon, but it's hard to tell if/when I'll get there. There have been growing worries about a possible lack of variety in the available content, which was one of the reasons I started working on the camera. But it does seem that it's close.
AI is finally getting some design time, which is very good. Soon I should figure out how to build it in a way that makes everything work right.
The character editor was the focus of these two weeks. Gameplay's not quite there yet, but I'm getting quite close, as you can see in the video (P.S. I'm aware of the "fire" coming out of the gun - it's just a placeholder, so it's expected to look strange at this point):
Let's get through all the rest first, as the character editor is a bigger topic.
Material lookup from texture names.
This was necessary to apply a normal map to the water, for example. It's not easy to see at the moment, but there's going to be an environment map and scrolling there as well, and it will look more watery soon enough.
What was done to make the video possible was rather simple - I had to fix aiming, add a weapon particle system, retrieve the gun barrel attachment matrix, add basic bullet filtering (whoever shot a bullet shouldn't get hit by it) and reconfigure shooting (faster bullets; different trigger, origin and direction).
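The bullet filtering rule is simple enough to sketch - skip the firing entity during hit tests. Class and function names here are made up for illustration:

```python
class Bullet:
    """Minimal stand-in for a projectile; field names are illustrative."""
    def __init__(self, owner, position, velocity):
        self.owner = owner        # the entity that fired this bullet
        self.position = position
        self.velocity = velocity

def can_hit(bullet, entity):
    """Basic bullet filtering: whoever shot a bullet shouldn't get hit
    by it, so the owner is excluded from hit tests."""
    return entity is not bullet.owner
```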
To get there, however, some effort was required to build the...
Editing/generating metadata for character meshes with realtime preview.
Character data consists of 5 major parts: mesh, bones, attachments, layers and masks.
Mesh must be selected from the "Edit character" tab, all very simple so far.
Bones are where it gets a bit more exciting. Apart from joints, which aren't implemented yet, there are bodies - physical representations of bones - and hitboxes - raycast-friendly representations. A body can be a box, a sphere or a capsule, with size/position/rotation relative to the bone. A hitbox is about the same thing, just limited to the box. Both can be generated automatically. Joints will become important when I get to ragdolls, which leaves bodies unused for now as well.
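A minimal sketch of what such body/hitbox records could look like - the field names are my guesses, not the editor's actual data format:

```python
from dataclasses import dataclass

@dataclass
class Body:
    """Physical representation of a bone. `shape` is one of "box",
    "sphere" or "capsule"; size/position/rotation are relative to
    the bone. All names are illustrative."""
    bone: str
    shape: str
    size: tuple
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0)

def make_hitbox(bone, size, position=(0.0, 0.0, 0.0),
                rotation=(0.0, 0.0, 0.0)):
    """A hitbox is about the same thing, just limited to the box shape."""
    return Body(bone, "box", size, position, rotation)
```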
Attachments are the next best thing. They define a transform (position+rotation) that is relative to some bone. This makes it easy to position anything relative to any bone of the character, and to retrieve that position by a name, thus making positioning portable across multiple characters. This is how I place weapon particle effects at the right spot - I get the attachment matrix by name and pass it to the particle system.
But there's a plot twist - currently the weapon is welded into the character mesh, so how could I possibly do the same thing once I make it possible to swap weapons? The answer is once again very simple - weapons will be characters. While they contain no bones, their transforms will still be relative to them.
This also allows me to define weapon "rest position" (on back for larger weapons, on the side of any leg for a sidearm) and placement of hats that could fall off while fighting.
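The attachment lookup described above could be sketched like this - a toy model where only translation matrices are composed, and `Character`, `attachment_matrix` and the "muzzle" name are assumptions for illustration:

```python
def mat_mul(a, b):
    """Multiply two 4x4 row-major matrices (plain nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Build a row-major 4x4 translation matrix."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

class Character:
    def __init__(self):
        self.bone_world = {}   # bone name -> world matrix
        self.attachments = {}  # attachment name -> (bone name, local matrix)

    def attachment_matrix(self, name):
        """Resolve an attachment's world matrix by name, so callers stay
        portable across characters with different skeletons."""
        bone, local = self.attachments[name]
        return mat_mul(self.bone_world[bone], local)
```

A particle system can then be fed the result of `attachment_matrix("muzzle")` without knowing anything about the skeleton underneath.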
Now, while attachments determine where things should be put, layers do the opposite - they produce basic motion (movement and rotation) in response to basic input (a number). They can be used to turn a character's upper and lower body separately, to pull its arms in when near a wall, to turn its head toward a target, etc.
So, basically, layers will be used to implement basic, responsive pseudo-IK without constraint resolution.
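A toy model of such a layer, assuming a single scalar input driving a yaw rotation on a set of bones (all names here are hypothetical, not the engine's interface):

```python
import math

class Layer:
    """Maps one scalar input to a rotation applied to a set of bones,
    e.g. turning the upper body toward the aim direction."""
    def __init__(self, bones, max_angle):
        self.bones = set(bones)      # bones this layer drives
        self.max_angle = max_angle   # rotation applied at input = 1.0
        self.input = 0.0             # set by gameplay code every frame

    def contribution(self, bone):
        """Extra rotation (radians) this layer adds to `bone`."""
        return self.input * self.max_angle if bone in self.bones else 0.0
```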
And finally, there are masks. They're the simplest part - a mask specifies a named set of weights for bones, used to limit animation players to those bones (letting them affect only the specified bones by the given amounts). Just like layers, this helps with upper/lower body separation, allowing partial animations to be blended onto the character (reload animations rarely need to move the legs, for example).
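The masking idea can be sketched as a per-bone weighted blend. Poses are reduced to one scalar per bone here, and all names are assumptions for illustration:

```python
def blend_pose(base_pose, anim_pose, mask):
    """Blend an animation pose over a base pose, weighting each bone by a
    mask (bone name -> weight in [0, 1], default 0). Poses are simplified
    to one scalar value per bone for this sketch."""
    blended = {}
    for bone, base in base_pose.items():
        weight = mask.get(bone, 0.0)
        blended[bone] = base + (anim_pose.get(bone, base) - base) * weight
    return blended

# an "upper body" mask lets a reload animation move the arms, not the legs
UPPER_BODY = {"spine": 1.0, "arm_l": 1.0, "arm_r": 1.0}
```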
As I'm sure you know, many of these things can easily be done in code, but there's one issue - not all characters can have the same bone configuration. Even if the names can be matched (though, as I found out, that's a bit of a pain to do in Blender, as bones lose their animation tracks and restoring them takes time), the transforms often can't be. A slightly turned bone can make all the difference in the world, making weapons hover or appear to be sucked into something.
Since I can make enemies shoot bullets now, I would like to take the time to develop a fully animated character and work on the AI. If I can make all that work, the rest is not likely to cause much trouble.