Building a level editor in Godot
Hey everybody and welcome back to another Gravity Ace devlog!
I missed last week because I was FREAKING OUT about the whole pandemic situation. I’m still freaking out but life goes on. I hope you’re doing well and staying safe out there.
So a lot of people have been asking me about the level editor in the game. I built the level editor for two reasons. ONE: I needed a way to make levels. And TWO: I wanted to give players a tool to create and SHARE their own levels from within the game. So let’s jump in and first of all just give you a tour of the level editor itself. Then I’ll walk through some of the code to explain how it does some of the things it does.
I’ve spent the last few days cleaning up and improving the UI and adding a lot of little quality-of-life improvements. It was a chore I knew I needed to do and I’d been putting it off. The level editor was already FUNCTIONAL but I needed to take it across the finish line. I revamped the user interface. If you haven’t seen the old version then TRUST ME this version is MUCH better. All of the panels are visible at all times and the toolbar now has every function and explains clearly what each button does. I’m writing documentation for the level editor now and that’ll connect to the Help button here. Another thing I did was improve the UI for connecting signals between objects. The old way WORKED but it was pretty manual and required typing in IDs. The new method lets you do the whole thing with the mouse and it’s much faster and more intuitive.
I also added some juice with an animation and new sound effect when placing objects. And I added this cool hologram shader for object previews. When you select an object, a hologram attaches itself to the cursor so you can see exactly where it will be placed. The hologram also snaps to nearby walls to make placement faster. Before you had to guess and make adjustments. Now you can see exactly where the object will be placed so the guesswork is removed.
So let’s look at how all of this works. First, let’s look at the main game scene. This is the scene where the game is actually played. It has some UI but it’s mostly empty because the level data is loaded dynamically. Level data is loaded into the Level node. Everything else you see here is UI like the score, the speedrun timer, out of fuel warning, and the level editor scene.
You can see the Editor scene here. In Gravity Ace the entities are all loaded into the Level node. Entering the editor doesn’t pause the game. What happens is a global mode variable is changed and all of the entities in the game check that to see what they should be doing. If the mode is PLAY then they behave normally. If the mode is EDIT then they don’t move, they don’t shoot, and they don’t die. Here’s an example. This turret has code for aiming and shooting at the player. But when the game is in EDIT mode it doesn’t shoot and it can’t die.
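The pattern above can be sketched in a few lines of GDScript. This is my reconstruction, not the actual Gravity Ace code: the autoload name `Globals`, the enum, and the turret method names are all assumptions.

```gdscript
# globals.gd -- registered as an autoload singleton named "Globals"
extends Node

enum Mode { PLAY, EDIT }
var mode = Mode.PLAY


# turret.gd -- gameplay logic bails out early while editing
extends Area2D

func _physics_process(delta):
    if Globals.mode == Globals.Mode.EDIT:
        return  # don't aim, shoot, or take damage in the editor
    _aim_at_player(delta)
    _maybe_shoot()
```

Because every entity checks the same global, flipping one variable switches the whole level between playing and editing without pausing or reloading anything.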
One approach to building levels would be to use placeholders for all of the entities. Then when the level is loaded I’d replace all of the entities with actual gameplay entities. But I didn’t want to do that because I thought it would be fun to have everything look just like it does in game. I was RIGHT! But it does complicate things a little. For example take a look at how I select and move things around in the editor.
Every entity in the game is a physics body. Either a RigidBody2D, Area2D, or StaticBody2D. Moving physics bodies isn’t so straightforward because they’re part of a physics simulation AND they don’t have any UI controls for drag and drop. So I wrote my own selection and drag and drop code that works with physics bodies. I use the _unhandled_input() callback so that buttons and other GUI controls get to consume inputs first. Any unhandled inputs land here. I check which objects the mouse cursor is touching or near by doing queries directly against the physics server. The physics server has a lot of useful methods for querying it based on points, shapes, and raycasts. Here I’m returning the object that is touching the mouse pointer.
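A minimal sketch of that point query, assuming Godot 3.x: `_unhandled_input()` receives whatever the GUI didn't consume, and `Physics2DDirectSpaceState.intersect_point()` asks the physics server which bodies overlap the cursor. The helper name is mine.

```gdscript
extends Node2D

func _unhandled_input(event):
    if event is InputEventMouseButton and event.pressed:
        var obj = _object_under_cursor(get_global_mouse_position())
        if obj:
            print("clicked: ", obj.name)

func _object_under_cursor(point):
    var space = get_world_2d().direct_space_state
    # Query the physics server directly: bodies AND areas at this point.
    var results = space.intersect_point(point, 1, [], 0x7FFFFFFF, true, true)
    if results.size() > 0:
        return results[0].collider
    return null
```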
Then I do some more queries using circle shapes to find the nearest wall. If I find a wall nearby I can then find the normal and the intersection point of the normal and the wall. This is how I get objects to snap to walls in the correct orientation. Then I have a ton of code here for handling different interactions based on mouse inputs and the object the cursor is touching. This is how objects are selected with a single click, drag and drop, rotating, modifying polygons, and more.
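The circle query plus normal lookup might look roughly like this sketch. The `"walls"` group, the snap radius, and the idea of raycasting toward the wall body are my assumptions about how to recover the contact point and normal; real wall geometry would need a smarter ray target than the body's origin.

```gdscript
func _find_snap(point, radius):
    var space = get_world_2d().direct_space_state
    var shape = CircleShape2D.new()
    shape.radius = radius
    var params = Physics2DShapeQueryParameters.new()
    params.set_shape(shape)
    params.transform = Transform2D(0.0, point)
    for hit in space.intersect_shape(params):
        if hit.collider.is_in_group("walls"):
            # A ray from the cursor toward the wall yields the intersection
            # point and the surface normal at that point.
            var ray = space.intersect_ray(point, hit.collider.global_position)
            if ray:
                return {position = ray.position, normal = ray.normal}
    return null
```

With the normal in hand, the held object can be moved to `position` and rotated from `normal.angle()` so it sits flush against the wall.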
Next let’s look at the properties system. Godot has a lot of really great code introspection features that allow the code to read itself. That is, Godot Engine has methods that let you see what variables you’ve defined for an object. For example,
get_property_list() is a method that returns a list of custom properties that have been defined for any object!
has_method() is another one I use a lot to tell if an object has a particular method so I know if I should call it or not.
Let’s look at the Turret again. Here’s the code. Notice how it has these export variables. See how each one has a type and some of them have a range? The editor scans those variables to build the property inspector. This code iterates through the properties of the selected object. First the code clears the property inspector. Then it calls a method for each of the exported variables. The add_field() method takes that property and builds the UI dynamically for each variable. Then when the UI controls are manipulated, the resulting values are copied back to the original object.
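Here's a hedged sketch of that loop, assuming Godot 3.x. `get_property_list()` and the `PROPERTY_USAGE_*` flags are real engine API; the inspector container, the `_on_field_changed` callback, and handling only floats are my simplifications.

```gdscript
extends VBoxContainer  # the inspector panel

func rebuild_inspector(obj):
    for child in get_children():
        child.queue_free()  # clear the old inspector rows
    for prop in obj.get_property_list():
        # Exported script variables carry both of these usage flags.
        if prop.usage & PROPERTY_USAGE_SCRIPT_VARIABLE \
                and prop.usage & PROPERTY_USAGE_EDITOR:
            add_field(obj, prop)

func add_field(obj, prop):
    match prop.type:
        TYPE_REAL:
            var spin = SpinBox.new()
            spin.value = obj.get(prop.name)
            spin.connect("value_changed", self, "_on_field_changed",
                    [obj, prop.name])
            add_child(spin)
        # ...other types (bool -> CheckBox, etc.) omitted

func _on_field_changed(value, obj, prop_name):
    # Copy the edited value straight back onto the entity.
    obj.set(prop_name, value)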
Signals are just a slightly more complicated version of the same idea. Each signal is stored in an array in the originating object. When a signal is connected to another entity the code stores the target entity and the action to call on that entity. Each entity that can be targeted in this way has a special method called get_trigger_code_list() that returns a list of actions that can be called on this entity. And it has another method called trigger() that actually invokes the action. So when I connect a signal in the editor, I can choose a target entity and trigger code. Those are saved in an array. Later, when the entity is instantiated, an actual signal is connected from that entity to the target’s trigger() method and passes the action code as a parameter.
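A sketch of that trigger system, using the method names from the video but with hypothetical bodies. The door entity, its action names, and the `stored_connections`/`find_entity()` helpers are my inventions for illustration.

```gdscript
# door.gd -- a hypothetical targetable entity
signal activated

func get_trigger_code_list():
    return ["open", "close", "toggle"]

func trigger(code):
    match code:
        "open": _open()
        "close": _close()
        "toggle": _toggle()


# At instantiation time, each stored (target, code) pair becomes a real
# signal connection, with the action code bound as an extra argument.
func connect_triggers(source):
    for conn in source.stored_connections:
        var target = find_entity(conn.target_id)
        source.connect("activated", target, "trigger", [conn.code])
```

Binding the code as a connect argument means one generic `trigger()` method can serve every action the target exposes.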
OK. Saving levels. This is actually pretty simple. Godot has a class called ConfigFile that handles the file format, loading, and saving. It’s basically a class for making INI files… key/value pairs organized into sections. I create an instance of ConfigFile. I set global level properties first like gravity strength. Then I iterate through each of the entities in the game. For each entity I iterate through its properties and store the ones I need in the file. Position, rotation, polygon for walls, and all of the public exported variables that appeared in the inspector. They’re all stored in the file and written to disk.
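The save loop might look roughly like this, assuming Godot 3.x. `ConfigFile` is real engine API; the section layout, key names, and `gravity_strength` variable are assumptions.

```gdscript
func save_level(path):
    var cfg = ConfigFile.new()
    cfg.set_value("level", "gravity", gravity_strength)
    var i = 0
    for entity in $Level.get_children():
        var section = "entity_%d" % i
        cfg.set_value(section, "scene", entity.filename)  # source .tscn path
        cfg.set_value(section, "position", entity.position)
        cfg.set_value(section, "rotation", entity.rotation)
        # Store every exported script variable, same set the inspector shows.
        for prop in entity.get_property_list():
            if prop.usage & PROPERTY_USAGE_SCRIPT_VARIABLE:
                cfg.set_value(section, prop.name, entity.get(prop.name))
        i += 1
    cfg.save(path)
```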
Loading levels is a little more complicated but it’s basically saving in reverse. First I load in all of the global level properties. Then I iterate through each of the entities in the file and instantiate each one. After each entity is created I assign all of its properties back to it. Finally, I connect all of the signal properties by creating actual signals to link the source entity to the target’s trigger() method.
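And a matching load sketch, under the same assumptions about section and key names (none of which come from the actual game files):

```gdscript
func load_level(path):
    var cfg = ConfigFile.new()
    if cfg.load(path) != OK:
        return
    gravity_strength = cfg.get_value("level", "gravity", 1.0)
    for section in cfg.get_sections():
        if not section.begins_with("entity_"):
            continue
        var scene = load(cfg.get_value(section, "scene"))
        var entity = scene.instance()
        # Assign every saved key back; "position" and "rotation" set the
        # Node2D properties directly, the rest hit exported variables.
        for key in cfg.get_section_keys(section):
            if key != "scene":
                entity.set(key, cfg.get_value(section, key))
        $Level.add_child(entity)
```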
The editor and all of its components add up to somewhere around 3,000 lines of code. I can’t talk through it all but I hope this overview has been fun and useful. It went by pretty fast so please let me know if you have questions. YouTube comments aren’t a great place for lengthy discussions so go to GravityAce.com and click the link to join the Discord server.
That’s it for now. Please tell everyone you know that Gravity Ace is launching this Spring. You can find a link to wishlist Gravity Ace on GRAVITY ACE DOT COM.
Thanks for watching and see you next time! And stay safe out there.
Published March 22, 2020