Devlog: How My NPC System Works

Hello. In this devlog I want to talk about my NPC system. Before I jump into the code I want to establish some context. In my game “Traitors on the Train”, NPCs are integral to the gameplay loop. The player acts as a ticket inspector who is secretly a spy, and must check each NPC’s ticket.

Some NPCs are traitors that must be caught. The player is given a notepad of profiles describing each traitor’s behaviours and appearance. Based on those profiles, the player must check the suspected traitor’s ticket and mark on the matching profile the station they’ll get off at. I hope that makes sense.

Currently I have 5 NPC types: Tourist, Glasses Lady, Tradie, Businessman and Necklace Lady. There are quite a few things established about how NPCs behave and the sort of data that I need to generate for the player.

Behaviour
    [Flags] public enum Behaviours
    {
        Nothing = 0,
        Frequent_smoker = 1 << 0,
        Takes_naps = 1 << 2,
        Always_hungry = 1 << 3,
        Listens_to_music = 1 << 4,
        Lots_of_phone_calls = 1 << 5,
        Enjoys_reading = 1 << 6,
    }

First is behaviour. Each NPC type acts out a mix of unique and generic behaviours. As an example, almost all NPC types sleep, but only the businessman and tradie take phone calls. I made a firm rule that each NPC instance only acts out two of the however many behaviours designated to its NPC type. This prevents the situation where an NPC who isn’t a traitor (which I dub a “bystander”) ends up with the same behaviours and appearance as a traitor, which I can see being unfair for the player. The two selected behaviours are randomised upon instantiation of the NPC. I also ensure the behaviour combinations are unique to each NPC by selecting from a pool of structs that I dub “NPCProfile”.
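As a rough sketch of the idea (simplified, not the exact in-game code; the `allowed` array here is just an example set for one NPC type):

```csharp
// Sketch: build every unique two-flag pair from the behaviours allowed for
// one NPC type, then hand each new NPC instance a pair no other NPC has
// taken yet, so bystanders can't mirror a traitor's exact combination.
Behaviours[] allowed =
{
    Behaviours.Takes_naps,
    Behaviours.Always_hungry,
    Behaviours.Enjoys_reading,
};

List<Behaviours> pairPool = new List<Behaviours>();
for (int i = 0; i < allowed.Length; i++)
    for (int j = i + 1; j < allowed.Length; j++)
        pairPool.Add(allowed[i] | allowed[j]);

// On instantiation: pull a random pair out of the pool so it can't repeat.
int pick = UnityEngine.Random.Range(0, pairPool.Count);
Behaviours chosen = pairPool[pick];
pairPool.RemoveAt(pick);
```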

    [Serializable] public struct NPCProfile
    {
        public string fullName;

        public int startStationIndex;
        public int exitStationIndex;

        public int npcPrefabIndex;

        public Behaviours behaviours;
        public Appearence appearence;
    }
Appearance
    [Flags] public enum Appearence
    {
        Nothing = 0,
        White_hair = 1 << 0,
        Blue_collar_worker = 1 << 1,
        Has_a_cane = 1 << 2,
        Near_sighted = 1 << 3,
        Suit_and_tie = 1 << 4,
        Is_bald = 1 << 5,
        Big_boned = 1 << 6,
        Wearing_a_dress = 1 << 7,
        Wears_shorts = 1 << 8,
        Carries_a_bag = 1 << 9,
        Wears_a_hat = 1 << 10,
        Wears_a_necklace = 1 << 11,
    }

Second is appearance. This one is quite simple. Essentially, I just need to designate a combination of appearance descriptions for each NPC prefab, as they already have predetermined appearances based on how they are drawn. Meaning, if I draw an NPC with glasses, one of the appearances selected in the inspector would be “Near_sighted” (wears glasses). If the NPC is a traitor, one of those appearance descriptions will appear in the notepad.
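For example, since the appearance flags are just enum names, one quick way to turn a ticked flag back into readable notepad text is simple string formatting (a sketch of the idea; the actual notepad copy may well be authored by hand):

```csharp
// Example combination as it might be ticked on a prefab in the inspector.
Appearence appearence = Appearence.Near_sighted | Appearence.Wears_a_dress;

// [Flags] ToString() gives "Near_sighted, Wears_a_dress"; take the first
// flag and swap underscores for spaces to get a readable line.
string notepadLine = appearence.ToString().Split(',')[0].Trim().Replace('_', ' ');
// e.g. "Near sighted"
```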

Name
    [Serializable] public class NameData
    {
        public FirstName[] firstNames;
        public LastName[] lastNames;
    }
    [Serializable] public struct FirstName
    {
        public string gender;
        public string ethnicity;
        public string name;
    }
    [Serializable] public struct LastName
    {
        public string ethnicity;
        public string name;
    }

Third is name generation, which could be a topic on its own. The world I’m building is based around the public transport of Melbourne. Gender and ethnicity are heavy influencers of what sort of name I generate for each NPC. The challenge is avoiding stereotypes and respecting ethnic backgrounds. For the system, however, I just need to make sure I can pair these generated names to the appropriate NPC instance. I do so with the function below, where I match names from a JSON file to the gender and ethnicity inputs.

private void Awake()
{
    nameData = JsonUtility.FromJson<NameData>(namesJSON.text);
}

public string GenerateName(Gender gender, Ethnicity ethnicity)
{
    string genderString = gender.ToString();
    string ethnicityString = ethnicity.ToString();
    List<FirstName> firstNamesList = new List<FirstName>();

    for(int i = 0; i < nameData.firstNames.Length; i++)
    {
        FirstName fn = nameData.firstNames[i];
        if (fn.gender.Equals(genderString, StringComparison.OrdinalIgnoreCase) &&
            fn.ethnicity.Equals(ethnicityString, StringComparison.OrdinalIgnoreCase))
        {
            firstNamesList.Add(fn);
        }
    }

    if (firstNamesList.Count == 0) return "NoFirstName";

    int firstNameIndex = UnityEngine.Random.Range(0, firstNamesList.Count);
    string firstName = firstNamesList[firstNameIndex].name;
        
    List<LastName> lastNameList = new List<LastName>();
    for(int i = 0; i < nameData.lastNames.Length; i++)
    {
        LastName ln = nameData.lastNames[i];
        if (ln.ethnicity.Equals(ethnicityString, StringComparison.OrdinalIgnoreCase))
        {
            lastNameList.Add(ln);
        }
    }
    if (lastNameList.Count == 0) return firstName;

    int lastNameIndex = UnityEngine.Random.Range(0, lastNameList.Count);
    string lastName = lastNameList[lastNameIndex].name;

    return firstName + " " + lastName;
}
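Calling it is straightforward. The specific enum members below are illustrative (the Gender and Ethnicity enums live elsewhere in the project):

```csharp
// Example usage; the enum values here are stand-ins, not a real list.
NPCProfile profile = new NPCProfile();
profile.fullName = GenerateName(Gender.Female, Ethnicity.Greek);
```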
Stations
public class StationSO : ScriptableObject
{
    public Station station_prefab;
    public int targetTrainSpeed = 100;
    public int metersPosition = 0;

    [Range(0, 1)]public float busynessFactor = 0.2f;
    public int traitorSpawnAmount = 2;

    public bool isFrontOfTrain;
    [Header("Generated")]
    public bool hadSpawned;
    public List<NPCProfile> bystanderProfiles;
    public List<NPCProfile> traitorProfiles;
}

Finally, the stations the NPCs start and end at are huge influencers of difficulty. If a traitor is only on the train for 1 or 2 stations, the likelihood of the player catching them is quite low. So, like most procedural generation systems, fairness and variety need to be balanced. I treat each station as an NPC spawner so I can influence the distribution of NPCs. Some stations can be busier than others, which reflects real world patterns.
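One way to enforce that fairness is to constrain how the start and exit station indices are rolled, so a traitor is always aboard for some minimum number of stations. This is a sketch of the idea only; `minTraitorRide` and `totalStations` are placeholder values, not my shipped tuning:

```csharp
// Sketch: keep a traitor on the train for at least minTraitorRide stations
// so the player has a fair window to catch them.
const int minTraitorRide = 3;
int totalStations = 10; // example line length

// Random.Range(int, int) excludes the max, so start leaves room for the ride
// and exit always lands at least minTraitorRide stations later.
int start = UnityEngine.Random.Range(0, totalStations - minTraitorRide);
int exit = UnityEngine.Random.Range(start + minTraitorRide, totalStations);

traitorProfile.startStationIndex = start;
traitorProfile.exitStationIndex = exit;
```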

public class Station : MonoBehaviour
{
    public StationSO station;
    ...
    private void SpawnNPCs()
    {
        for (int i = 0; i < station.bystanderProfiles.Count; i++)
        {
            NPCProfile bystanderProfile = station.bystanderProfiles[i];

            float randXPos = Random.Range(
            platformRenderer.renderInput.bounds.min.x + SPAWN_BUFFER,
            platformRenderer.renderInput.bounds.max.x - SPAWN_BUFFER);

            Vector3 spawnPos = new Vector3(
            randXPos,
            transform.position.y + 0.1f,
            platformRenderer.transform.position.z);

            NPCBrain bystander = Instantiate(
            trip.npc_prefabsArray[bystanderProfile.npcPrefabIndex],
            spawnPos,
            Quaternion.identity,
            platformRenderer.transform);

            bystander.profile = bystanderProfile;
            bystander.role = Role.Bystander;
        }

        for (int i = 0; i < station.traitorProfiles.Count; i++)
        {
            NPCProfile traitorProfile = station.traitorProfiles[i];
            float randXPos = Random.Range(
            platformRenderer.renderInput.bounds.min.x + SPAWN_BUFFER,
            platformRenderer.renderInput.bounds.max.x - SPAWN_BUFFER);

            Vector3 spawnPos = new Vector3(randXPos,
            transform.position.y + 0.1f,
            platformRenderer.transform.position.z);

            NPCBrain traitor = Instantiate(
            trip.npc_prefabsArray[traitorProfile.npcPrefabIndex],
            spawnPos,
            Quaternion.identity,
            platformRenderer.transform);

            traitor.profile = traitorProfile;
            traitor.role = Role.Traitor;
        }
    }
    ...
}

My approach to code is very data oriented. I rely on serialized data, static APIs and limited classes to avoid dependencies and cross-references. I have a video explaining a bit more in depth as to why. Here I want to talk more about the how.

NPCBrain Class

I start off with the NPC component I call “NPCBrain”. Just for context, I call my state machines “Brains”, hence the name. My goal here as a programmer is to keep all the NPC logic in one script. Following the video I linked above, I create a basic switch-case state machine. Despite the NPCBrain having many states, the logic to get to each state is very simple. Since the “Behaviours” enum is a bitmask, I just need to match the current behaviour against one of the behaviour flags using an AND operator.

 ... else if ((curBehaviour & Behaviours.Frequent_smoker) != 0)
 {
     SetState(NPCState.Smoking);
 }
 else if ((curBehaviour & Behaviours.Takes_naps) != 0)
 {
     SetState(NPCState.Sleeping);
 }...

When the bitwise result isn’t zero, I set the state. An advantage I found, different to how state machines are regularly taught, is decoupling the animations from the states. I can see that, over time, the states and animations will grow. To reduce the number of states needed, I can merge animations like “sitting eating” and “standing eating” into one “Eating” state. Each state already has access to the first frame, every frame, every fixed frame and the last frame of being in the state, so the further logic deciding which animation to play can live in those scopes. I’ll also take advantage of the simplification that comes with being on a train. NPCs are either standing or sitting for most behaviours, so a simple check on whether the NPC is designated a seat will determine which animation to play.

        switch (curState)
        {
            ...
            case NPCState.Eating:
            {
                stateDuration = UnityEngine.Random.Range(
                npc.pickBehaviourDurationRange.x, 
                npc.pickBehaviourDurationRange.y);

                if (chairPosIndex != int.MaxValue)
                {
                    curClip = atlas.clipDict[(int)NPCMotion.SittingEating];
                }
                else
                {
                    curClip = atlas.clipDict[(int)NPCMotion.StandingEating];
                }
            }
            break; ...
        }

On to the behaviour selection logic. I want the NPCs to randomly choose their next behaviour on a timer. The timer value is also randomly selected from a range each time the NPC enters a new random behaviour state. I have to consider that this logic influences two things: difficulty and naturalness. If the NPC takes a long time to change behaviours, the player gets less information. If the range of the timer value is too small, the more robotic the NPC will appear. It is too soon to tell where the sweet spots are for these values, but I can tell I will need control here when level designing.

private Behaviours[] behaviourFlags;
private Behaviours curBehaviour;
private void UpdateStates()
    {
        switch (curState)
        {
            ...
            case NPCState.Eating:
            {
                if (behaviourClock > stateDuration)
                {
                    curBehaviour = PickBehaviour();
                }
            }
            break;...
        }
    }

    private Behaviours PickBehaviour()
    {
        return behaviourFlags[UnityEngine.Random.Range(0, behaviourFlags.Length)];
    }
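The `behaviourFlags` array itself can be filled by decomposing the profile’s two-flag mask back into individual flags. Roughly like this (a sketch, simplified from the real setup):

```csharp
// Sketch: break the combined bitmask from the NPCProfile back into the
// individual flags that PickBehaviour() samples from.
private void CacheBehaviourFlags(Behaviours mask)
{
    List<Behaviours> flags = new List<Behaviours>();
    foreach (Behaviours flag in Enum.GetValues(typeof(Behaviours)))
    {
        // Skip Nothing (0) and keep any single flag present in the mask.
        if (flag != Behaviours.Nothing && (mask & flag) == flag)
        {
            flags.Add(flag);
        }
    }
    behaviourFlags = flags.ToArray();
}
```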
   

There is also logic outside of behaviours that every NPC needs. They all need to be able to find a slide door, board the train, find a seat, find a standing position and exit the train. The challenge here is CPU optimisation: I want a lot of NPCs existing at once. A pattern here is that each function calculates a position for the NPC to move towards. Since the train doesn’t move in world space, these world positions can be cached beforehand to minimise spikes in performance. Let’s take the FindSlideDoor() method for example.

    private void FindSlideDoor()
    {
        float shortestDist = float.MaxValue;
        float selectedSlideDoorPos = float.MaxValue;
        for (int i = 0; i < trainStats.slideDoorPositions.Length; i++)
        {
            float dist = Mathf.Abs(trainStats.slideDoorPositions[i] - transform.position.x);

            if (dist < shortestDist)
            {
                shortestDist = dist;
                selectedSlideDoorPos = trainStats.slideDoorPositions[i];
            }
        }
        targetXPos = selectedSlideDoorPos;
        curPath = Path.ToSlideDoor;
    }

The goal here is to find the closest SlideDoor object. Instead of querying each slide door object individually, I cache every slide door’s x position into a float array stored in a ScriptableObject (trainStats.slideDoorPositions) and loop through those values instead.

    private void SetSlideDoorPositions()
    {
        int slideDoorsPerCarriage = carriages[0].exteriorSlideDoors.Length;
        int totalSlideDoors = carriages.Length * slideDoorsPerCarriage;

        stats.slideDoorPositions = new float[totalSlideDoors];

        for (int i = 0; i < carriages.Length; i++)
        {
            Carriage carriage = carriages[i];

            for (int j = 0; j < carriage.exteriorSlideDoors.Length; j++)
            {
                int curIndex = i * slideDoorsPerCarriage + j;
                stats.slideDoorPositions[curIndex] = carriage.exteriorSlideDoors[j].transform.position.x;
            }
        }
    }

This is cheaper because it avoids the overhead of interacting with scene objects at runtime, which are scattered in memory. Float arrays are contiguous, making the iteration to find the closest position much faster. I use the same technique for the other methods too. The con is obviously that there are more arrays to create, paying an upfront cost either at edit time or in the first frame. At runtime, I’m set up to maximise my NPC count at a lower performance cost in comparison.

The final framework allows me to scale in a specific way that serves the overall game loop. NPCs hold a duality of purpose: giving hints to the player and world building. Later on, the plan is to create specific NPCs that hold more of a narrative purpose. I plan to have a sports event happen, so the train would be full of sports fans. The framework should support this idea: sports fans would have sports fan behaviours such as chanting, drinking etc. NPCs interacting with each other is another avenue I want to explore, which I hope will further build a convincing world. The idea would be to set the target position to another NPC, then once close enough, enter the conversation state and talk until the behaviour timer stops. The framework should hold… I hope. We’ll see how we go.

Thank you for reading.

Tutorial: How to Make a Realtime Signal Graph Shader in Unity

I wanted to make a quick tutorial on using render textures in Unity and why they might be useful.

When I first started making this shader, I started where most would: an old fashioned Unity shader graph. It doesn’t seem too complex. Just sample a noise texture that scrolls, remap the values to the range -1 to 1, then add the result to the G channel of the UV and take the absolute value. Add some parameters for the amplitude, frequency and value. Here’s the shader graph:

Great!

oh no

See how the line is thinner when there’s a sudden change? That sorta sucks. The specific area of issue is this part here:

What is happening is that the gradient I’m outputting isn’t equal on all sides of the line. The greater the amplitude, the smaller the range between 0-1. The fix is… I don’t know… probably some crazy math that measures the line thickness relative to the line angle. Alarm bells were ringing and I knew there was a better approach.

When I see this graph in action I imagine one of those lie detector machines that print these sorts of graphs.

Enter render textures. I wrote about these in my Creating a Fog of War Shader devlog, but I’ll give a simple run down. Render textures are read-write textures that are built to be updated at runtime. GPUs normally only keep results for the length of a frame, so the only way you can persist data on the GPU between frames is with a render texture.
So why would this help me?
Well, like the lie detector graph, I can use a circle shape like a brush and scroll the render texture UV backwards. I like to call this the “brush stroke” method. Because I use a circle shape, the line thickness stays consistent no matter the direction of the line. I can use the “temporal accumulation” method here, where I sample the previous frame’s result to affect the next frame’s result, which creates the brush stroke like effect. I mention this in my fog of war devlog too.

Unlike how I set up a render texture in the fog of war devlog, I wanted to try out the shader graph workflow instead of the code based workflow. The more I develop as a tech artist, the more I understand the importance of learning both workflows.

To set up, you need to create a “Material”, a “Custom Render Texture” in the “Rendering” tab and a “Custom Render Texture” in the “Shader Graph” tab.

No idea why Unity decided to name them the exact same thing, but at least you get different icons. I’ll call the shader version the “Render Texture Shader Graph”. From here you set up your custom render texture the same as you would a normal texture until you get to the material. I set Initialization Mode to “OnLoad” and Update Mode to “Realtime”. This just made sense to me and I don’t see a use case where I’d pick anything different. Color is the starting value of the render texture. This is important, and it does depend on how your render texture shader graph works. The last important note is the “Double Buffered” option. This allows you to use the previous frame’s result of the render texture, which is what I mentioned earlier.

Great. Now you can assign the render texture shader graph to the material and we can get started. Oh, and this render texture will also need to be sampled in a regular material shader graph so you can actually see it in the game.

To get things started, I like to separate everything into small easy steps and check the output in game at each step. The first step is to make the brush shape and put it in the correct position.

From there I scroll a noise texture to randomly offset the vertical position over time and add in some public parameters.

A little technique that I use everywhere to remap a value cheaply is what I do with the “Amplitude” parameter. If you know the original range is 0-1 and you want to extend the range while keeping 0 in the middle, you can multiply the 0-1 texture by twice the parameter, then subtract the parameter. Say the amplitude value is 3: the result would be a range between -3 and 3. Everything else is basic offset, scale and rounding functions for better control.
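Written out as plain code instead of nodes, the remap is just this arithmetic (an equivalent snippet for clarity, not an actual shader graph node):

```csharp
// t is the 0-1 noise sample; amplitude extends the range while keeping
// 0 centred. With amplitude = 3: t = 0 gives -3, t = 0.5 gives 0, t = 1 gives 3.
float Remap(float t, float amplitude)
{
    return t * (amplitude * 2f) - amplitude;
}
```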

Now for the temporal accumulation.

The new node, for me at least, is the “Custom Render Texture Self” node, which is the previous frame’s render texture; you treat it like any other texture. It defaults to bilinear filtering even though I set the custom render texture’s filtering to point, so I have to override that using a Sampler State node. The main focus is how I set the UV. I add a very small constant value on the x axis. If I keep that a hard coded number, the graph works: every frame this shader compares the newly calculated value to the previous result and picks the smaller of the two per texel, drawing a line. However, a very crucial and often forgotten component is frame-rate independence. In other words, I don’t want the FPS to dictate the speed of the graph. Unity supplies the delta time value already. If I use that instead of the constant, it no longer matters what your FPS is, and the graph runs at a constant speed.
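Per texel, the node graph boils down to something like this (C#-flavoured pseudocode only; `SamplePrevious`, `SampleBrush` and `scrollSpeed` are stand-ins for the shader graph nodes and parameters):

```csharp
// Pseudocode for one texel of the accumulation pass: scroll the previous
// frame slightly along x, then keep the smaller of the scrolled history
// and the freshly drawn brush dot, as the graph does.
float scrolledX = uv.x + Time.deltaTime * scrollSpeed; // frame-rate independent
float history = SamplePrevious(scrolledX, uv.y);       // "Custom Render Texture Self"
float brush = SampleBrush(uv.x, uv.y);                 // the dot drawn this frame
float result = Mathf.Min(history, brush);
```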

Nice.

Now this is much better, but pushed to the brink, the shader still breaks.

I still get this amplitude problem. As a game dev I come to the very natural choice between fixing it or letting it be.

I had one idea. What if instead of drawing dots, I draw lines? I already know the coordinate of each point, so just connect the dots.

I have another very handy technique that does exactly this. This is the sub graph I take to almost every project, which I call “Distance Line”.

I got this math from a Shadertoy shader and, I’ll admit, I still don’t quite understand it. I’ll link the shader below. The inputs are two UVs. The output is a straight line between the two origin points of the UVs.

So to connect the dots, I need more information from the previous frame’s render texture, or more specifically, the last y value that was calculated. This is the 0-1 value that makes the brush wobble up and down randomly. I store that value directly in the R channel to use later. I also use this value to set the y origin of the next UV.

Then I use that same y value to recalculate the previous UV. These two UVs are the inputs to the Distance Line node I showed earlier.

I also use the delta time value as the horizontal offset for the second UV, because that is the exact x position of the tail of the line.

Then it’s just the same temporal accumulation pattern as before, where I reuse the G channel.

Looks like it passes the stress test.

It’s weird how often seemingly simple shaders turn into something much more complex, and how they uncover hardware limitations. This is such a tiny part of the game, and the game I’m making is relatively small. Yet what lies underneath is a whole system: a custom render texture graph using temporal accumulation and distance lines, sampled again through a material shader to output to the screen. I find this sort of stuff so fascinating, because the player will never know.

Anyways, I hope to continue these blogs as I continue my development as a tech artist. See you guys later.

Distance Line Shader: https://www.shadertoy.com/view/WddGzr