The CRT Shader in Cool Basketball Game

This shader was all about simulating CRT technology, with the challenge of being as accurate as possible. It turns out a CRT look is just a combination of smaller effects, which fit the scope of a four-week project well.

Warped Screen Coordinates and Vignette

The electron beam of a CRT doesn't naturally travel in a straight line, which is why the screen shape is warped. CRT screens usually have a vignette around the edges to minimise the noticeable curve of the screen. Here's how I did it:

float3 CRTCoordsAndSDF(float2 p, float2 screenParams, float vignetteWidth, float warpOffset)
{
    // Remap UVs from 0-1 to -1 to 1 so the warp bends around the screen centre
    float2 centredUV = p * 2 - 1;
    float2 warpUV = centredUV.yx / warpOffset;
    // Squaring the swizzled term pushes the corners outward quadratically
    float2 uv = centredUV + centredUV * warpUV * warpUV;
    uv = uv * 0.5 + 0.5;
    
    // Cull everything outside the 0-1 range to carve out the curved screen shape
    if (uv.x <= 0.0f || 1.0f <= uv.x || uv.y <= 0.0f || 1.0f <= uv.y) uv = 0;
    
    // Vignette width in UV space, fading in from each edge
    float2 vignette = vignetteWidth / screenParams.xy;
    vignette = smoothstep(0, vignette, 1 - abs(uv * 2.0f - 1.0f));
    
    float vignetteSDF = saturate(vignette.x * vignette.y);
    
    return float3(uv, vignetteSDF);
}

To get a curve, I squared the warped UV and added it back to the centred UV. I combined the vignette and the warped UV in one function because the vignette needs the calculated UV; it was just cheaper that way. I cull out anything outside the 0-1 range to get the shape. The UV ends up looking like this:
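
For context, here is a minimal sketch of how the function might be consumed in the blit pass. The property names _vignetteWidth and _warpOffset are placeholders, not necessarily the shader's actual ones:

float4 frag(Varyings input) : SV_Target
{
    // Warp the UVs and fetch the vignette factor in one call
    float3 crt = CRTCoordsAndSDF(input.texcoord, _ScreenParams.xy, _vignetteWidth, _warpOffset);
    float2 crtUV = crt.xy;   // warped UV, zeroed outside the 0-1 range
    float vignette = crt.z;  // 1 in the centre, fading to 0 at the edges

    float4 col = SAMPLE_TEXTURE2D_X(_BlitTexture, point_clamp_sampler, crtUV);
    return col * vignette;   // darken toward the curved edges
}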

Scanlines

Scanlines are caused by the horizontal raster scanning of the CRT's electron beam. The beam rapidly scans one line at a time, and the area between each line remains darker, causing the scanline effect. I wanted control over the scanline count and falloff. The function I use here is quite short and simple, but it does the job.

float ScanLines(float rawScale, float colValue, float scanLineFallOff)
{
    // frac repeats the pattern once per line; the falloff power controls
    // how sharply the gaps between lines darken
    return colValue * pow(distance(frac(rawScale), 0.5), scanLineFallOff);
}

The input colValue will be the greyscale of the blit texture. This makes brightness a factor in the line width: brighter colours yield thinner lines, due to the bloom and colour bleeding they cause.
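
As a sketch of how the inputs might be built in the fragment pass (_scanLineCount and _scanLineFallOff are assumed parameter names), the greyscale comes from the usual Rec. 601 luma weights, which match the Y row of the YIQ matrix used later:

// greyscale of the blit texture via luma weights
float grey = dot(blit.rgb, float3(0.299, 0.587, 0.114));
// one period of the pattern per scanline down the screen
float rawScale = crtUV.y * _scanLineCount;
col.rgb *= ScanLines(rawScale, grey, _scanLineFallOff);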

YIQ Colour Space

During the 1980s-1990s, broadcast television tended to use YIQ as a colour space. Y is the luminance and I/Q are the chrominance. Using this colour space constrains the colours I have available to what was achievable during this period.

float3 YIQColorGrading(float3 color, float cyanToRed, float magentaToGreen, float brightness)
{
    // Both matrices are stored transposed relative to the textbook form,
    // because mul() below treats the colour as a row vector
    float3x3 RGBToYIQ = float3x3(0.299, 0.596, 0.211,
                                 0.587, -0.274, -0.523,
                                 0.114, -0.322, 0.312);
    
    float3x3 YIQToRGB = float3x3(1, 1, 1,
                                 0.956, -0.272, -1.106,
                                 0.621, -0.647, 1.703);
    
    // Per-channel offsets: brightness on Y, colour shifts on I and Q
    float3 colParams = float3(brightness, cyanToRed, magentaToGreen);
    // Fixed gains nudging the signal toward a CRT-like response
    float3 crtConversion = float3(0.9, 1.1, 1.5);
    
    float3 final = mul(color, RGBToYIQ);
    final = colParams + (crtConversion * final);
    
    return mul(final, YIQToRGB);
}

I found the calculations to get to YIQ on the Wikipedia page. This function allows me to colour correct in the Unity scene and still output RGB values. The YIQ parameters are also exposed publicly so I can tweak them in the Unity scene.
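
One detail worth noting: because mul() in HLSL treats a float3 on the left as a row vector, the matrices above are the transpose of the ones on Wikipedia. A quick conceptual sanity check (using the matrices from the function above) shows the round trip behaves:

// Pure white should give full luma, zero chroma, then convert straight back
float3 yiq = mul(float3(1, 1, 1), RGBToYIQ); // ~(1, 0, 0)
float3 rgb = mul(yiq, YIQToRGB);             // ~(1, 1, 1)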

Chromatic Aberration and Bloom

Chromatic aberration occurs as the electron guns age. Each gun of a CRT drives either red, green or blue, and all three beams must hit the same physical point on the screen. As that accuracy wears off, chromatic aberration starts to occur. These beams were also prone to bloom, where lighter colours bled into neighbouring phosphor dots (the CRT equivalent of pixels). Unity's default bloom shader sufficed, so I won't go into much detail about it; one thing I'll note is that to replicate the CRT's sensitivity, I lowered the bloom threshold. To achieve the chromatic aberration effect, I sampled the blit texture three times, each with an offset controlled in the Unity scene. One channel of each sample was added to a new float3, creating a similar camera texture, but with each RGB channel slightly offset from the centre of the screen.

// Aberration grows toward the screen edges
float chromABias = length(crtUV * 2 - 1) * _chromaticAberationIntensity;
float3 chromCol = float3(0, 0, 0);
// With offset = 1 this takes exactly three samples: red from one side,
// green from the centre and blue from the other
for (int i = -offset; i <= offset; i++)
{
    float o = chromABias * i;
    float2 uv = crtUV.xy + o;

    float4 blit = SAMPLE_TEXTURE2D_X(_BlitTexture, point_clamp_sampler, uv);
    float3 col = YIQColorGrading(blit.rgb, _cyanToRed, _magentaToGreen, _brightness);
    
    if (i == -offset)
    {
        chromCol.x += col.x;
    }
    else if (i == 0)
    {
        chromCol.y += col.y;
    }
    else
    {
        chromCol.z += col.z;
    }
}

Signal Static

Broadcast signals were prone to being unstable at times, causing signal static. I use a vertically scrolling, high-frequency sine wave whose amplitude changes randomly.

float SignalSDF(float2 p, float time, float range, float freq, float mag, float bias)
{
    // Vertical band that scrolls down the screen over time
    float mask = 1 - saturate(range * distance(p.g, 1 - frac(time * 0.5)));
    
    float sinIn = freq * p.g;

    // High-frequency sine wave, only visible inside the scrolling band
    float b = 1 + (mask * mag * sin(sinIn));

    float wave = 1 - saturate(distance(p.r, b));
    
    // Flooring time snaps the randomness so the static flickers in and out
    float flooredTime = floor(time * 10);
    float normRandRange = Hash21(flooredTime.xx);
    float flicker = round(bias * normRandRange);
    
    float t = mask * wave * flicker;
    
    float sdf = lerp(1, 0.9, t);
    return sdf;
}

I want it to snap in and out, so I floor time and put it through a Hash21 function I have, which outputs a random value between 0 and 1. The bias controls how frequently the static shows on the screen.
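
The Hash21 implementation isn't shown here, but it is the usual kind of float2-to-float hash; a common version (an assumption, not necessarily the one in the project) looks like this:

float Hash21(float2 p)
{
    // Scramble the input and fold it down to a single 0-1 value
    p = frac(p * float2(123.34, 456.21));
    p += dot(p, p + 45.32);
    return frac(p.x * p.y);
}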

This is the final product. These are all fairly simple techniques, with research as the driving force behind the quality.

Sonar Ping Post Processing Shader

The idea behind this game is to simulate the deep sea in an abstract fashion. 80% to 90% of the screen is pitch black, and the colours all sit on a large grid that can move with player abilities. The challenge was to keep the game legible while working within these quite extreme constraints.

This is how the scene is set up. I use the 3D render pipeline and an orthographic camera. Every material uses a flat colour and a simplified custom lighting model, where all I use is the distance attenuation and light direction to render light. This is all done in Shader Graph using a custom function.

void BasicAdditionalLights_half(float3 WorldPosition, float3 WorldNormal, out float TotalAttenuation, out float3 LightColor)
{
    TotalAttenuation = 0.0;
    LightColor = float3(0, 0, 0);

// The URP light loop isn't available in the Shader Graph preview
#ifndef SHADERGRAPH_PREVIEW
    uint pixelLightCount = GetAdditionalLightsCount();
    LIGHT_LOOP_BEGIN(pixelLightCount)

    uint perObjectLightIndex = GetPerObjectLightIndex(lightIndex);
    Light light = GetAdditionalPerObjectLight(perObjectLightIndex, WorldPosition);

    float atten = light.distanceAttenuation;

    // Simple Lambert term; no speculars or shadows
    float NdotL = saturate(dot(WorldNormal, light.direction));

    float diffuse = atten * NdotL;
    TotalAttenuation += diffuse;
    LightColor += light.color;
    LIGHT_LOOP_END

#endif
}

To simplify it even further, I use basic cel shading to get as few colours as possible for the post-processing filter. Legibility is my number one priority here.
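
The quantisation itself lives in Shader Graph nodes, but as a sketch (with celSteps standing in for whatever step count the graph uses), the idea reduces to flooring the attenuation into flat bands:

float CelShade(float totalAttenuation, float celSteps)
{
    // Quantise the 0-1 attenuation into a handful of flat bands
    return floor(saturate(totalAttenuation) * celSteps) / celSteps;
}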

Onto the post-processing: I use a custom render feature, because the shader I create isn't possible in Shader Graph, and optimisation is a huge factor, particularly for this game.

float4 calc(Varyings input) : SV_Target
{
    float2 screenPos = _ScreenParams.xy;
    // Number of grid cells that fit on screen along each axis
    float2 scaledAspectRatioUV = screenPos / _gridScale;

    float2 scaledTexCoord = input.texcoord * scaledAspectRatioUV;
    float2 id = round(scaledTexCoord);
    float2 gridTexCoord = id / scaledAspectRatioUV; // quantised UV
    float4 blit = SAMPLE_TEXTURE2D_X(_BlitTexture, point_clamp_sampler, gridTexCoord);
    return blit;
}

I start by quantising the camera colour, which produces this output:

Already looking tough to see. The reason the cells need to be so big is that a black grid will be rendered over the top of them. Making a grid is fairly simple, and I followed a video tutorial for it here. The real challenge is moving each grid cell without it losing its square shape.
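
For reference, a static grid mask on its own is only a few lines. This is a sketch of the tutorial-style approach, not the final shader:

float GridMask(float2 scaledTexCoord, float thickness)
{
    // Distance to the nearest cell edge: 0 on the border, 0.5 at the centre
    float2 cellUV = abs(frac(scaledTexCoord) - 0.5);
    float edgeDist = 0.5 - max(cellUV.x, cellUV.y);
    return step(thickness, edgeDist); // 0 on grid lines, 1 inside cells
}

The hard part is keeping that square edge intact once cells start moving independently.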

I use a compute shader to write into a render texture that my post-processing shader can sample.

[numthreads(8, 8, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    float2 aspScreen = aspectRatioPentile(_Resolution);
    float2 uv = (float2(id.xy) + 0.5) / _Resolution;

    // Quantise to the same grid the post-processing pass uses
    float2 scaledAspectRatioUV = _Resolution / _gridScale;
    float2 scaledTexCoord = uv * scaledAspectRatioUV;

    float2 idCoord = round(scaledTexCoord);
    float2 gridTexCoord = idCoord / scaledAspectRatioUV;

    float2 gridSpacePlayerPos = (_playerPos * _Resolution) / _gridScale;

    // Ability SDFs evaluated at this cell, combined into one mask below
    float centerLight = circleSDF(gridTexCoord, aspScreen);
    float sonarPing = sonarSDF(gridTexCoord, aspScreen, 0, 0.5);
    float flare = flareSDF(gridTexCoord, aspScreen, 20, 2) * 50;
    float radialScan = radialScanSDF(gridTexCoord, aspScreen, 1, 1) * 10;
    float col = 0;
    // Check neighbouring cells so a displaced cell can still cover this pixel
    int neighbourRange = 4;
    for (int x = -neighbourRange; x <= neighbourRange; x++)
    {
        for (int y = -neighbourRange; y <= neighbourRange; y++)
        {
            float2 offset = float2(x, y);
            float2 localTexCoord = scaledTexCoord - offset;
            // xy = position within the neighbour cell, zw = the neighbour's cell ID
            float4 currSC = float4(frac(localTexCoord), floor(localTexCoord));

            float2 localUV = (currSC.zw) / scaledAspectRatioUV;
            float sonar = sonarSDF(localUV, aspScreen, 0, 2);
            float flare = flareSDF(localUV, aspScreen, 20, 2) * 4;
            float radialScan = radialScanSDF(localUV, aspScreen, 1, 5);
            
            float totalMask = sonar + flare + radialScan;
            
            // Push the cell away from the player, scaled by the ability masks
            float2 displacementDir = normalize(gridSpacePlayerPos - currSC.zw);
            float2 displacedPos = currSC.xy + offset + (displacementDir * totalMask);
            // Square SDF: positive inside the displaced cell, zero outside it
            float currDistFromSquare = max(-(max(abs(displacedPos.x) - 0.5, abs(displacedPos.y) - 0.5)), 0);
            col += currDistFromSquare;
        }
    }
    float totalMask = saturate(centerLight + sonarPing + flare + radialScan);
    col = smoothstep(_gridThickness, 1 - _gridThickness, col);
    col *= totalMask;
    col = step(0.01, col);
    // R = displaced grid, G = combined ability mask
    Result[id.xy] = float4(col, totalMask, 0, 0);
}

The way this compute shader works: I quantise the UVs to the same cell size as the post-processing pass and create a mask for each type of ability, which I will show later. For context, I have the “Flare”, “Radial Scan” and “Sonar Ping” abilities. I build each of the SDFs and combine them into one single mask. Then I loop over a 2D neighbourhood, with “offset” as each index, and use it to calculate the direction each grid cell should move. The ability SDFs I generate determine the distance the cell moves. The larger the neighbourhood, the further a grid cell can travel before being culled.

It works as intended, but at a great cost. The red flag for me is the nested for loop. Unfortunately, I couldn't find any way of avoiding it, since I rely on a 2D neighbourhood to control the direction; that, along with keeping both the grid and the square cells intact, made for a huge challenge. Keeping the neighbour range around 4 was a good balance between performance and effectiveness. Here are the other two abilities in render texture form:

Now that the mask is done, I can use the render texture as a multiplying factor on the UVs for the blit texture.
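
A minimal sketch of that combination inside the post-processing fragment, with _MaskTexture as an assumed name for the texture written by CSMain (red carries the displaced grid, green the combined ability mask); the exact wiring may differ:

float4 mask = SAMPLE_TEXTURE2D_X(_MaskTexture, point_clamp_sampler, input.texcoord);
// Collapse unlit UVs toward the corner and black out everything outside the grid cells
float4 blit = SAMPLE_TEXTURE2D_X(_BlitTexture, point_clamp_sampler, gridTexCoord * mask.g);
return blit * mask.r;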

Here's the final result with all three abilities:

In the end, this was a quick solo project that tested my current capabilities as a tech artist. I feel like I stretched what can be done within these aesthetic and hardware limitations. For what looks to be a simplistic game, there's a lot happening underneath!