Tutorial: How to Make a Realtime Signal Graph Shader in Unity

I wanted to make a quick tutorial on using render textures in Unity and why they might be useful.

When I first started making this shader, I began where most would: an old-fashioned Unity shader graph. It doesn't seem too complex. Just sample a noise texture that scrolls, remap the values so they range from -1 to 1, then add the result to the G channel of the UV and take the absolute value. Add some parameters for the amplitude, frequency, and value. Here's the shader graph:
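For anyone who prefers code to node screenshots, here's roughly what that graph computes, sketched as HLSL. The function and parameter names are my own stand-ins (including "thickness"), and I've centered the line at V = 0.5:

```hlsl
// Naive version: scroll a noise texture, remap to -1..1, offset the V
// coordinate, and turn the distance from the line into a gradient.
float NaiveGraph(float2 uv, Texture2D noiseTex, SamplerState ss,
                 float amplitude, float frequency, float thickness, float time)
{
    // Sample scrolling noise.
    float n = noiseTex.Sample(ss, float2(uv.x * frequency + time, 0.5)).r;

    // Remap the 0..1 noise to -1..1 and scale by amplitude.
    float offset = (n * 2.0 - 1.0) * amplitude;

    // Add the offset to the V (green) channel and take the absolute value:
    // this is the vertical distance to the wobbling line.
    float d = abs(uv.y - 0.5 + offset);

    // Fade out by distance. On steep sections this gradient gets squashed,
    // which is the thinning artifact shown below.
    return 1.0 - smoothstep(0.0, thickness, d);
}
```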

Great!

oh no

See how the line is thinner when there's a sudden change? That sorta sucks. The specific area of issue is this part here:

What's happening is that the gradient I'm outputting isn't equal on all sides of the line. The greater the amplitude, the more that 0-1 gradient gets squashed on the steep sections, so the line thins out. The fix is… I don't know… probably some crazy math that measures the line thickness relative to the line angle. Alarm bells were ringing, and I knew there was a better approach.

When I see this graph in action I imagine one of those lie detector machines that print these sorts of graphs.

Enter render textures. I wrote about these in my Creating a Fog of War Shader devlog, but I'll give a quick rundown. Render textures are read/write textures built to be updated at runtime. GPUs only hold onto their work for the length of a frame, so the only way to persist data on the GPU is with a render texture.
So why would this help me?
Well, like the lie detector graph, I can use a circle shape as a brush and scroll the render texture UVs backwards. I like to call this the "brush stroke" method. The idea is that because I use a circle shape, the line thickness stays consistent no matter the direction of the line. I can use the "temporal accumulation" method here, where I sample the previous frame's result to affect the next frame's result, which creates the brush-stroke-like effect. I mention this in my fog of war devlog too.

Unlike how I set up a render texture in the fog of war devlog, I wanted to try out the shader graph workflow instead of the code-based workflow. The more I develop as a tech artist, the more I understand the importance of learning both workflows.

To set up, you need to create a "Material", a "Custom Render Texture" from the "Rendering" tab, and a "Custom Render Texture" from the "Shader Graph" tab.

No idea why Unity decided to give them the exact same name, but at least you get different icons. I'll call the shader version the "render texture shader graph". From here, you set up your custom render texture the same as you would a normal texture until you get to the material settings. I set Initialization Mode to "OnLoad" and Update Mode to "Realtime". This just made sense to me, and I don't see a use case where I would pick anything different. Color is the starting value of the render texture. This is important, and it depends on how your render texture shader graph works. The last important note is the "Double Buffered" option. This allows you to use the previous frame's result of the render texture, which is what I mentioned earlier.

Great. Now you can assign the render texture shader graph to the material and we can get started. Oh, and this render texture will also need to be sampled in a regular material shader graph so you can actually see it in game.

To get things started, I like to separate everything into small, easy steps and check the output in game at each step. The first step is to create the brush shape and put it in the correct position.
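As a hedged HLSL sketch, the brush is just a distance check against the brush position. I've written it dark-on-light (0 inside the circle) so the accumulation step later on can keep the line with a min():

```hlsl
// A circular "brush": returns 0 inside the circle and 1 outside.
// brushPos is where the dot should be drawn this frame, in UV space.
float CircleBrush(float2 uv, float2 brushPos, float radius)
{
    return step(radius, distance(uv, brushPos));
}
```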

From there I scroll a noise texture to randomly offset the vertical position over time and add in some public parameters.

A little technique that I use everywhere for remapping a value cheaply is what I do with the "Amplitude" parameter. If you know the original range is 0-1 and you want to extend the range while keeping 0 in the middle, you can multiply the parameter by the 0-1 texture, then subtract half the parameter. Say the amplitude value is 3: the result is a range between -1.5 and 1.5. Everything else is basic offset, scale, and rounding functions for better control.
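In HLSL terms the remap is a one-liner (assuming, as above, that the noise sample is 0..1; headX below is a placeholder for wherever new points are drawn):

```hlsl
// Cheap centered remap: t is in 0..1, output is in -amplitude/2..amplitude/2.
// e.g. amplitude = 3 maps 0..1 to -1.5..1.5, keeping 0 in the middle.
float RemapCentered(float t, float amplitude)
{
    return t * amplitude - amplitude * 0.5;
}

// Used to place the brush vertically from the scrolled noise, e.g.:
// float2 brushPos = float2(headX, 0.5 + RemapCentered(n, amplitude));
```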

Now for the temporal accumulation.

The new node, for me at least, is the "Custom Render Texture Self" node, which is the previous frame's render texture; you treat it like any other texture. It defaults to bilinear filtering even though I set the custom render texture's filtering to point, so I have to override that using a Sampler State node. The main focus is how I set the UV: I add a very small constant value on the x axis. If I keep that a hard-coded number, the graph works. Every frame, this shader compares the calculated brush to the previous result and picks the smallest texel, drawing a line. However, a very crucial and often forgotten component is frame rate independence. In other words, I don't want the FPS to dictate the speed of the graph. Unity already supplies a delta time value. If I use that updating value instead of the constant, it no longer matters what your FPS is, and the graph runs at a constant speed.
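Here's the accumulation step as an HLSL sketch. In a code-based custom render texture the previous frame comes in as _SelfTexture2D; in the graph it's the Custom Render Texture Self node. The scroll speed parameter and the white initial color are my assumptions:

```hlsl
// Temporal accumulation: shift the previous frame sideways a tiny bit,
// then keep whichever value is "more line" (smaller) at each texel.
// Assumes the texture was initialized to white (the Color setting above).
float4 Accumulate(float2 uv, Texture2D selfTex, SamplerState pointSS,
                  float brush, float scrollSpeed, float deltaTime)
{
    // Frame rate independence: scroll by speed * deltaTime rather than a
    // hard-coded constant, so the graph moves the same distance per second
    // no matter the FPS. deltaTime can come from unity_DeltaTime.x.
    float2 prevUV = uv + float2(scrollSpeed * deltaTime, 0.0);

    // Previous frame, sampled with point filtering.
    float prev = selfTex.Sample(pointSS, prevUV).g;

    // Pick the smallest texel: keeps the dark line from either frame.
    return float4(0.0, min(prev, brush), 0.0, 1.0);
}
```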

Nice.

Now this is much better, but pushed to the brink, the shader still breaks.

I still get this amplitude problem. As a game dev, I come to the very natural choice between fixing it or letting it be.

I had one idea: what if instead of drawing dots, I draw lines? I already know the coordinates of each point, so just connect the dots.

I have another very handy technique that does exactly this. This is the sub graph I take to almost every project, which I call "Distance Line".

I got this math from a Shadertoy shader, and I'll admit, I still don't quite understand it. I'll link the shader below. The inputs are two UVs; the output is a straight line between the two origin points of those UVs.
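I haven't dissected the linked shader line by line, but most Shadertoy line drawing reduces to the classic point-to-segment distance (Inigo Quilez's formulation). Here it is as a hedged HLSL sketch, taking the two endpoints directly instead of two offset UVs:

```hlsl
// Distance from point p to the line segment between a and b.
// h is how far along the segment the closest point lies, clamped to 0..1
// so the ends get rounded caps.
float DistanceLine(float2 p, float2 a, float2 b)
{
    float2 pa = p - a;
    float2 ba = b - a;
    float h = saturate(dot(pa, ba) / dot(ba, ba));
    return length(pa - ba * h);
}
```

Threshold that distance against half the line thickness and you get a capsule shape: a segment that stays the same thickness in every direction, which is exactly the property the dots were missing.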

So to connect the dots, I need more information from the previous frame's render texture; more specifically, the last y value that was calculated. This is the 0-1 value that makes the brush wobble up and down randomly. I store that value directly in the R channel to use later. I also use this value to set the y origin of the next UV.

Then I read that same y value back to recalculate the previous UV. These two UVs are the inputs to the Distance Line sub graph I showed earlier.

I also use the delta time value as the horizontal offset for the second UV, because that is the exact x position of the tail of the line.

Then it's just the same temporal accumulation pattern as before, where I reuse the G channel.
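Putting all the pieces together, one frame of the render texture shader ends up looking roughly like this, reusing the DistanceLine function from above. It's a sketch with my own names (headX in particular is a placeholder), not a drop-in replacement for the graph:

```hlsl
// One frame of the signal graph:
//  R channel: this frame's 0..1 wobble value, stored for the next frame.
//  G channel: the accumulated line history.
float4 SignalGraphFrame(float2 uv, Texture2D selfTex, SamplerState pointSS,
                        Texture2D noiseTex, SamplerState noiseSS,
                        float amplitude, float frequency, float thickness,
                        float time, float deltaTime, float scrollSpeed)
{
    float headX = 0.97; // x position where new points are drawn (placeholder)

    // Scrolled history from the previous frame.
    float2 prevUV = uv + float2(scrollSpeed * deltaTime, 0.0);
    float prevG = selfTex.Sample(pointSS, prevUV).g;

    // Last frame's wobble value, read back out of the R channel
    // (it was written across the whole texture, so any UV works).
    float prevY = selfTex.Sample(pointSS, float2(headX, 0.5)).r;

    // This frame's wobble value from the scrolled noise.
    float y = noiseTex.Sample(noiseSS, float2(time * frequency, 0.5)).r;

    // Segment endpoints: the head is this frame's point, and the tail uses
    // the delta time offset because that's the exact x position of the
    // previous point after the scroll.
    float2 head = float2(headX, 0.5 + (y * amplitude - amplitude * 0.5));
    float2 tail = float2(headX - scrollSpeed * deltaTime,
                         0.5 + (prevY * amplitude - amplitude * 0.5));

    // Dark-on-light segment, then the same min() accumulation as before.
    float seg = step(thickness * 0.5, DistanceLine(uv, head, tail));
    return float4(y, min(prevG, seg), 0.0, 1.0);
}
```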

Looks like it passes the stress test.

It's weird how often seemingly simple shaders turn into something much more complex, and how they uncover hardware limitations. This is such a tiny part of the game, and the game I'm making is relatively small. What lies underneath, however, is a whole system: a custom render texture graph using temporal accumulation and distance lines, which is then sampled again through a material shader to output to the screen. I find this sort of stuff so fascinating because the player will never know it's there.

Anyways, I hope to continue these blogs as I continue my development as a tech artist. See you guys later.

Distance Line Shader: https://www.shadertoy.com/view/WddGzr