Tips for Debugging Unity Games

When facing issues during game development, it is crucial to be familiar with techniques you can use to debug and find the cause of bugs. Some bugs are harder to debug than others. For instance, shader issues are hard to debug because you can't print log statements to the console like you can in normal code. This post presents tips for debugging different types of bugs.

Printing to Console

The easiest way to debug is to print messages or variables to the Unity console. For example, if you have a vector v and want to check its values, there are three ways you can output a message to the console:

using UnityEngine;

public class Logging : MonoBehaviour
{
    void Start()
    {
        Vector3 v = new Vector3(10, 20, 30);
        Debug.Log("Normal: " + v);
        Debug.LogWarning("Warning: " + v);
        Debug.LogError("Error: " + v);
    }
}

If you attach this code to a GameObject in your scene, it will output the following:

Logging console.

As you can see, LogWarning formats your output as a warning and LogError formats it as an error, which helps convey the severity of each message.

Note

If you use Debug.Log(), Debug.LogWarning(), or Debug.LogError() inside your Update() function, make sure to remove or comment them out when you're done debugging. These calls can hurt the performance of your game and are purely for debugging purposes; it's easy to forget to remove them from Update().
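One way to avoid shipping leftover log calls is to route them through a small wrapper that is stripped outside the editor. Here's a minimal sketch using C#'s Conditional attribute; the DebugUtil class name is an illustrative choice, not part of Unity's API:

```csharp
using System.Diagnostics;

public static class DebugUtil
{
    // The Conditional attribute makes the compiler remove every call to
    // this method (including the evaluation of its arguments) whenever
    // UNITY_EDITOR is not defined, so player builds pay no logging cost.
    [Conditional("UNITY_EDITOR")]
    public static void Log(string message)
    {
        UnityEngine.Debug.Log(message);
    }
}
```

With this in place you can call DebugUtil.Log() from Update() without worrying about forgetting to remove it, since release builds compile it away entirely.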

Setting Code Breakpoints

If you need to step through your code or inspect an object in detail, you can set a breakpoint in Visual Studio at the line you want to investigate. Inside Visual Studio, click in the margin to the left of the line number and a red circular marker will appear:

Logging code.

Now press the "Attach to Unity" button on the top menu bar inside Visual Studio and run the game from the Unity Editor. Execution will pause at this line, allowing you to step through the code or inspect your objects.

Visual Debugging With Gizmos

Sometimes it's easier to debug an issue visually than with print statements or breakpoints. For example, if you have two squares and want to check whether your square-to-square collision detection code is working, you can move the squares around in the scene and draw an indicator that changes color when they collide.

There’s a function you can define in your Unity code called OnDrawGizmos() which allows you to draw shapes in the scene without running the game. With this function you can draw visual aids in the Scene view to help with debugging and level design. Here’s an example of how you can use it to check if your square-to-square collision code is working:

using UnityEngine;

public class SquareCollision : MonoBehaviour
{
    [SerializeField]
    private Collider2D squareCollider1;

    [SerializeField]
    private Collider2D squareCollider2; 

    private bool CheckCollision()
    {
        return squareCollider1.bounds.Intersects(squareCollider2.bounds);
    }

#if UNITY_EDITOR

    private void OnDrawGizmos()
    {
        if(squareCollider1 != null && squareCollider2 != null)
        {
            Gizmos.color = CheckCollision() ? Color.green : Color.red;

            Gizmos.DrawSphere(squareCollider1.transform.position, 0.5f);
            Gizmos.DrawSphere(squareCollider2.transform.position, 0.5f);
        }
    }
#endif
}

To test this code, attach it to an empty GameObject in your scene. Then create two square sprites, add a Box Collider 2D component to each, and assign them to squareCollider1 and squareCollider2 in the Inspector. Here OnDrawGizmos() checks whether the two colliders intersect: if they do, it draws green spheres at the colliders' positions; if not, it draws red ones. To see this in action, move one of the squares around and overlap it with the other:

Square to square collision debugging.

You can also draw rays. For instance, you might want to visualize your object's local axes while doing level design:

using UnityEngine;

public class LocalCoordinates : MonoBehaviour
{
#if UNITY_EDITOR
    private void OnDrawGizmos()
    {
        const float length = 2.0f;
        Gizmos.color = Color.green;
        Gizmos.DrawRay(transform.position, transform.up * length);

        Gizmos.color = Color.red;
        Gizmos.DrawRay(transform.position, transform.right * length);

        Gizmos.color = Color.blue;
        Gizmos.DrawRay(transform.position, transform.forward * length);
    }
#endif
}

Attach this script to any object and it will display the object's local axes in the Scene view. Here's an example with a cube:

Local coordinates.

You can also display the collider bounds of your objects using this code:

using UnityEngine;

public class VisualCollider : MonoBehaviour
{
#if UNITY_EDITOR
    private void OnDrawGizmos()
    {
        Collider collider = GetComponent<Collider>();
        if (collider != null)
        {
            Gizmos.color = Color.green;
            Gizmos.DrawWireCube(collider.bounds.center, collider.bounds.size);
        }
    }
#endif
}

Attaching this script to an object that has a collider will display the collider's bounds. Here's an example of a sphere with a sphere collider:

Collider bounds.

If you have a point light, it’s useful to display the range of the light when designing your scene. You can do it with this code:

using UnityEngine;

public class VisualLightRange : MonoBehaviour
{
#if UNITY_EDITOR
    void OnDrawGizmos()
    {
        Light pointLight = GetComponent<Light>();
        if (pointLight != null)
        {
            Gizmos.color = Color.blue;
            Gizmos.DrawWireSphere(pointLight.transform.position, pointLight.range);
        }
    }
#endif
}

Here's how it looks:

Point light range.

Last but not least, you can also draw text using the Handles class. For instance, you can display an object's position with this code:

#if UNITY_EDITOR
using UnityEditor;
#endif
using UnityEngine;

public class VisualPosition : MonoBehaviour
{
#if UNITY_EDITOR
    void OnDrawGizmos()
    {
        Vector3 position = transform.position;
        string text = $"({position.x}, {position.y}, {position.z})";

        GUIStyle style = new GUIStyle();
        style.normal.textColor = Color.red; 
        style.fontSize = 42;
        style.alignment = TextAnchor.MiddleCenter;

        Handles.Label(transform.position + Vector3.up * 1.5f, text, style);
    }
#endif
}

If you attach this script to an object and move it around, it will display the object’s position above it. Adjust the position of the text to fit your object’s height. Here’s an example with a capsule object:

Drawing object's position text.

These are just a few examples of what you can do with OnDrawGizmos(). Depending on your game, you can also draw pathfinding routes, trigger areas, spawn points, the camera frustum, and more.
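As one more sketch of that idea, here's a hypothetical script that marks a set of spawn points in the Scene view; the class and field names are illustrative, not a standard Unity API:

```csharp
using UnityEngine;

public class SpawnPointGizmos : MonoBehaviour
{
    // Hypothetical list of spawn locations assigned in the Inspector.
    [SerializeField]
    private Transform[] spawnPoints;

#if UNITY_EDITOR
    private void OnDrawGizmos()
    {
        if (spawnPoints == null) return;

        // Draw a wire sphere at every assigned spawn point so they
        // are visible while designing the level.
        Gizmos.color = Color.magenta;
        foreach (Transform point in spawnPoints)
        {
            if (point != null)
            {
                Gizmos.DrawWireSphere(point.position, 0.3f);
            }
        }
    }
#endif
}
```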

Visual Debugging During Play Mode

When drawing shapes inside OnDrawGizmos(), the shapes are always visible, whether you are in play mode or not. Sometimes you don't want to clutter your Scene view and only want to debug while the game is running. To do that, you can use Debug.DrawLine() or Debug.DrawRay() inside your Update() function. These also draw in the Scene view, but only while the game is running, and each call lasts a single frame unless you pass a duration argument.
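A minimal sketch of both calls inside Update(); the script name and the specific vectors are illustrative:

```csharp
using UnityEngine;

public class PlayModeDebug : MonoBehaviour
{
    void Update()
    {
        // Drawn fresh every frame while the game runs; visible in the
        // Scene view (and in the Game view if Gizmos are enabled there).
        Debug.DrawRay(transform.position, transform.forward * 2.0f, Color.yellow);

        // DrawLine takes explicit start and end points; the optional
        // duration argument keeps the line visible for that many seconds.
        Debug.DrawLine(transform.position, Vector3.zero, Color.cyan, 0.5f);
    }
}
```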

Note

If you use Debug.DrawLine() or Debug.DrawRay() inside your Update() function, make sure to remove or comment them out when you're done debugging. These calls can hurt the performance of your game and are purely for debugging purposes; it's easy to forget to remove them from Update().

Debugging Shaders

Debugging shaders can be challenging because you cannot print statements to the Unity console from shader code: shaders run on the GPU and cannot interact with Unity's logging system. You can, however, use colors: output one color when a condition is met and a different color otherwise. For instance, if you want to check which parts of your texture have a red component greater than 0, you could do this in your fragment shader:

fixed4 frag (v2f i) : SV_Target
{
    fixed4 col = tex2D(_MainTex, i.uv);

    if(col.r > 0){
        return fixed4(1,0,0,1);
    }else{
        return fixed4(0,0,0,1);
    }
}

You can apply this trick in many different situations. For example, you might check whether a lighting calculation exceeds a certain value.
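A sketch of that lighting check, assuming your v2f struct passes a world-space normal to the fragment shader and that the 0.5 threshold is an arbitrary example value:

```hlsl
fixed4 frag (v2f i) : SV_Target
{
    // Standard Lambert diffuse term against the main directional light
    float nDotL = max(0, dot(normalize(i.normal), _WorldSpaceLightPos0.xyz));

    // White where the diffuse term exceeds the threshold, black elsewhere
    if (nDotL > 0.5)
    {
        return fixed4(1, 1, 1, 1);
    }
    return fixed4(0, 0, 0, 1);
}
```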

Maybe you want to check if the normal vectors in your object are correct. You can output your normal vector as a color:

Shader "Custom/NormalShader"
{
    Properties
    {
        _Color("Base Color", Color) = (1, 1, 1, 1)
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        LOD 200

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            struct appdata_t
            {
                float4 vertex : POSITION;
                float3 normal : NORMAL;
            };

            struct v2f
            {
                float4 pos : SV_POSITION;
                float3 normal : TEXCOORD0;
            };

            v2f vert(appdata_t v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);

                // Convert normal to world coordinates 
                o.normal = normalize(UnityObjectToWorldNormal(v.normal));
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                // Convert color from -1 to 1 to color range from 0 to 1
                fixed3 normalColor = (i.normal * 0.5) + 0.5;
                return fixed4(normalColor, 1.0); 
            }
            ENDCG
        }
    }
    FallBack "Diffuse"
}

The vertex shader converts the normal vector to world coordinates so that if you rotate the object you will see the colors changing, reflecting the change in the normal vector’s direction. Here’s what this material would output for a cube:

Drawing normals.

This confirms that the normal vectors for this cube are correct. The face with normal (1,0,0) is remapped to the reddish color (1, 0.5, 0.5), the face with normal (0,1,0) to the greenish (0.5, 1, 0.5), and the face with normal (0,0,1) to the bluish (0.5, 0.5, 1), so each face is tinted by the axis its normal points along.

You can do the same for UV coordinates. Here's a shader that outputs the UV coordinates of a plane:

Shader "Custom/UVDebugShader"
{
    Properties
    {
        _Color("Base Color", Color) = (1, 1, 1, 1)
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        LOD 200

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            struct appdata_t
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0; 
            };

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0; 
            };

            v2f vert(appdata_t v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv; 
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                fixed3 color = float3(i.uv, 0.0); 
                return fixed4(color, 1.0); 
            }
            ENDCG
        }
    }
    FallBack "Diffuse"
}

Here’s what it looks like when adding this material to a plane:

Drawing uvs.

This confirms that the UV coordinates are correct. At the bottom-left corner the UVs are (0,0), which corresponds to black, since the color is (0,0,0). Moving right along the bottom edge, the color becomes increasingly red: the first UV component increases while the second stays 0, ending at the bottom-right corner with UV (1,0) and the red color (1,0,0). Moving up along the left edge, the color becomes increasingly green: the first component stays 0 while the second increases, ending at the top-left corner with UV (0,1) and the green color (0,1,0). Moving diagonally from the bottom-left corner, both components increase together and the color shifts toward yellow, ending at the top-right corner with UV (1,1) and the yellow color (1,1,0).

Conclusion

Debugging video games can be quite different from regular software debugging. Sometimes you have to rely on creative techniques, as we saw in the shader debugging section. Writing reusable code specifically for debugging might seem like a waste of time, but it will save you time later, since you will most likely run into similar bugs again. Finally, always double-check that your debugging code does not run in your release builds, especially if you plan to release your game to the public.


