Refactor the Old Before Creating the New

🎶
This is part of an ongoing series called Building A Music Engine

It’s basically me documenting the process of smashing my head against the keyboard as I build a game prototype called LETSGO

It’s gotten long enough to break into several sections:

So here I was, all gung-ho to implement the spec I defined in Designing The Core Gameplay Loop.

I quickly added skeleton classes for PhaseManager, PhaseController, and SetTonic phases.

Committed the empty, but compiling, classes to Git:

As soon as I started wanting to hook these things up, I collided head-first with my janky old code.

The Joys of Janky Old Code

Here’s what I want to do: I want something called a SetTonic object to receive a PlayerSetNote event from an AudioPlatform.

This is an Audio Platform:

I need to get those Audio Platforms to fire a “Player Set Note” event every time one gets stepped on.

The problem: AudioPlatforms are currently overly complex Unreal Blueprints:

image

There are ~5 functions here, handling two separate logical domains:

  • The top part, which visualizes the musical note of the platform.
  • The bottom part, which plays the sound cue of that musical note.

From an I-am-so-smart programmer POV, this object is violating the single responsibility principle.

When I was just trying to get the Audio Platform to work in the first place, this was fine: all the logic was in the same place, making it easier to reason about.

However, now I want to add a third responsibility to this Platform: sending events to be consumed by Phase objects.

To me, this feels like the tipping point into real spaghetti territory.

My code, visualized

I was gung-ho to start building the core loop, but I must refactor the old before creating the new.

Refactoring the Displayed Music Note

The very first feature I built for the game was a system that converts PNGs into Unreal's Niagara Particle System.

🔑
This is all captured in Floating Number Particle System

Basically it takes an array of characters like [ 'F', '#' ], converts each character into particles, and displays those particles above the platform.

Unnecessarily complicated? Yes.

But it’s cool, and it technically works.

Literally what more could you ask for?

The first bit of logic converts an enum F# to [ 'F', '#' ] :

The comment box should say convert enum to string and loop. Comments can be dangerous.
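
For reference, the C++ shape of that conversion would be something like the sketch below. This is a sketch only; the real logic lives in the Blueprint above, and the enum value name, the DisplayName metadata, and the SpawnCharacterParticle helper are all assumptions.

// Rough C++ equivalent of the Blueprint above: enum -> string -> loop the characters.
// Assumes the note enum is ELetsGoMusicNotes and its entries have DisplayNames like "F#".
const FString NoteString = UEnum::GetDisplayValueAsText(ELetsGoMusicNotes::FSharp).ToString();
for (int32 i = 0; i < NoteString.Len(); i++)
{
	const TCHAR Character = NoteString[i];
	SpawnCharacterParticle(Character); // hypothetical helper: spawns the Niagara particles for one character
}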

Is there a better way to do this? Probably.

But this is not the code I’m looking to refactor.

My goal is to isolate this logic away from the Audio Platform itself.

The goal is not to improve the logic itself.

Don’t get bogged down in “recursive refactoring.”

All code is terrible, especially mine.

If I start fixing every damn problem I see?

Well, Hal demonstrates this perfectly:

Anyways, that character array is then fed into the particle create/update methods.

image

The particles are spawned on top of the platform, and are updated every tick to point towards the player.
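
In code terms, that per-tick update boils down to a look-at rotation. Here’s a rough fragment of what the Blueprint is doing, running inside the actor’s Tick; NoteParticles is a stand-in name for the platform’s Niagara component, not the real property.

// Sketch of the per-tick update the Blueprint performs: rotate the note particles toward the player.
// (NoteParticles is a stand-in name for the platform's Niagara component.)
if (const ACharacter* Player = UGameplayStatics::GetPlayerCharacter(GetWorld(), 0))
{
	const FVector ToPlayer = Player->GetActorLocation() - NoteParticles->GetComponentLocation();
	NoteParticles->SetWorldRotation(ToPlayer.Rotation()); // face the player
}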

Isolating the logic into a ParticleController

If we look at the scope of the logic, we see that the Audio Platform is primarily concerned with managing the lifetime of the particle systems.

In Designing The Core Gameplay Loop, I defined a pattern where anything concerned with lifetime management is called a Controller.

Following this pattern, I created a new Blueprint called ParticleController that will manage the lifetime of the particle system.

I simply cut/pasted all that Blueprint logic from above into the new Blueprint.

🔑
I considered porting this into code, but managing the particle system in code felt like it would be too much work. Another case of recursive refactoring that I decided to avoid.

Then I have AudioPlatform initializing the ParticleController:

Audio Platform now has a particle controller; it no longer is a particle controller

The only sticking point here is I need to know when to destroy the Particle Systems.

Thankfully I already have the solution, using the events defined in Designing The Core Gameplay Loop.

Blueprints have the concept of an EventDispatcher:

image

When the Player Character overlaps with the platform, it will trigger an On Player Triggered Platform event.

The Particle Controller “binds” to this event on initialization, essentially listening for the event to trigger:

That red wire leads to the destruction logic of the particle systems.

This work is captured here:

The commit itself isn’t much to look at, since these have all been .uasset changes.

That’s why I’ve snapped so many screenshots.

Moving the Event Dispatcher to Code

Here’s the thing: I want that Call On Player Triggered Platform event to be set up in code.

The reason: this call is going to be used by both code and Blueprint actors.

So I want to pull this section of the blueprint into code:

image

What I want is an AudioPlatform.h that acts as the corresponding code for this blueprint.

I created the file, then took way too long figuring out how to make the new class the parent of the existing AudioPlatform blueprint.

💡
For future reference, you can set the parent class of a blueprint very easily by going to Class Settings → Class Options → Parent Class

Neat. Anyways, the event dispatcher now lives in code:

// AudioPlatform.h
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
// ...plus the header that defines FLetsGoMusicNotes
#include "AudioPlatform.generated.h"

// Funny macro thing to define an Event Dispatcher
DECLARE_DYNAMIC_MULTICAST_DELEGATE_OneParam(FAudioPlatformTriggerDelegate, FLetsGoMusicNotes, Note);

// Macro to define this as an Unreal class that can be made into a blueprint
UCLASS(Blueprintable)
class LETSGO_API AAudioPlatform : public AActor
{
	GENERATED_BODY() // Unreal magic reflection stuff, don't worry about it

public:
	AAudioPlatform();

	// The musical note this platform represents (referenced in the Broadcast below)
	UPROPERTY(EditAnywhere, BlueprintReadWrite)
	FLetsGoMusicNotes Note;

	// Notice this matches the name of the funny macro thing defined at top
	UPROPERTY(BlueprintCallable, BlueprintAssignable)
	FAudioPlatformTriggerDelegate OnAudioPlatformTriggered;

protected:
	virtual void NotifyActorBeginOverlap(AActor* OtherActor) override;
};

// AudioPlatform.cpp
#include "AudioPlatform.h"
#include "GameFramework/Character.h"

void AAudioPlatform::NotifyActorBeginOverlap(AActor* OtherActor)
{
	// Only broadcast if it's a player character
	if (Cast<ACharacter>(OtherActor))
	{
		OnAudioPlatformTriggered.Broadcast(Note);
	}
}

The funny macro thing tells Unreal there is an EventDispatcher using an Event Delegate.

Specifically, a delegate type called FAudioPlatformTriggerDelegate is defined, which fires with the musical note Note as its payload.

I declare a property of that delegate type called OnAudioPlatformTriggered.

That delegate object gives OnAudioPlatformTriggered a Broadcast method which fires the actual event.

This is only done if a Character Actor overlaps with the Audio Platform.

(NotifyActorBeginOverlap is an overridable event itself, defined in Unreal’s AActor class)

This means when a Character Actor overlaps with the Audio Platform, an OnAudioPlatformTriggered event is Broadcast.
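
Blueprints can bind to this with a Bind Event node (shown below); a C++ listener would do the equivalent with AddDynamic. A minimal sketch, assuming the listener holds a pointer to the platform and a matching UFUNCTION handler; AMyListener and the pointer name are stand-ins, not real classes in the project:

// In the listening class's header: the handler must be a UFUNCTION to bind dynamically.
UFUNCTION()
void HandlePlatformTriggered(FLetsGoMusicNotes Note);

// Wherever the listener gets its platform pointer (e.g. BeginPlay).
// AMyListener is a stand-in name for whatever class is listening.
AudioPlatform->OnAudioPlatformTriggered.AddDynamic(this, &AMyListener::HandlePlatformTriggered);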

With this compiling, we can see the event is available in the ParticleController BP:

Blueprints automatically add whitespaces to OnAudioPlatformTriggered

So now we’ve:

  • Isolated the ParticleController logic
  • Added code to fire an event when the platform is stepped on

Now we can move on to the last piece of ugly logic:

Refactor Playing the Musical Note of the Platform

The Audio Platform plays a corresponding audio wav file when it is stepped on.

Currently that's achieved through this ugly blueprint code:

image

Basically all this is doing is mapping F# to an FSharp2_Music_Note_Cue audio file, then playing that sound cue.

The cues themselves are attached as components to the Audio Platform.

image

Now, I never liked this implementation, mainly because I had to use the mouse too much when setting up all those wires.

But it was a straightforward implementation that worked.

Literally what more can you ask for?

Well a few things, actually:

  • This represents a single scale: 12 notes of a 66-note synth.
  • This represents a single instrument: an Arturia AnalogLab synth called “CheeseKeys” I exported out of Ableton.
  • Each of those notes is a whole note at a specific BPM.

So if I want all the keys, and all the instruments, with variable timings, I need a real solution.

I'm talking a proper solution.

Not this jank prototype shenan to see if I could get things working at all.

One problem.

Building out a “proper” solution for this is its own task… and I’ve already deviated quite a bit from the current task: building out the core Gameplay Loop.

The urge to enter the dreaded recursive refactoring is really strong with this one.

But, as much as it pains me, I’m going to keep this prototype implementation.

But there is one thing I’m going to add.

Right now the sound just plays when the platform is stepped on.

I want to play the sound in time with the beat.

Getting computers to play sounds in time with a beat is actually surprisingly difficult.

Quantized Sounds: Cooking With Quartz

Casually, I need to convert the following into C++ code:

image

This is from the Instrument_BP blueprint I made when hooking up a kick drum on 🥁Building the Drum Machine.

Basically, our ears are incredibly sensitive, able to discern the timing of sounds with millisecond accuracy.

That accuracy is the problem: at 60 FPS a game tick is roughly 16 ms apart, which is already enough slop to hear, so you need a specialized audio engine to schedule samples precisely on the beat.

Thankfully, Unreal has such an audio subsystem called Quartz:

LETSGO can only exist because of this subsystem. I wouldn’t have the chutzpah to build it myself. Just building a game is hard enough.

The Quartz docs have awesome overviews for hooking up Blueprints to Quartz.

Not so much for hooking up C++ to Quartz.

🔥
There are a few hard-to-find guides.

Mainly because websearch is hopelessly broken in the year-of-our-lord 2024:

Hopefully, one day, when FAANG has bankrupted itself on the false promise of generative AI, a company will come along that actually allows you to search the web for the best result (as opposed to the most profitable ad).

Perhaps, on that day, this article will be served to devs trying to hook up Quartz in C++.

We use it because it has the PlayQuantized() function, which plays a SoundCue exactly on the beat.

Getting off my soapbox, here’s some code:

// AAudioCuePlayer.h
UCLASS(Blueprintable, ClassGroup=(LETSGO), meta=(BlueprintSpawnableComponent))
class LETSGO_API AAudioCuePlayer : public AActor
{
	GENERATED_BODY()

public:
	AAudioCuePlayer() {};

	// This is messy until I find a better solution
	// Set references to the note cues
	// The actual values are attached in AudioPlatform_BP in the Add Audio Cue Player node
	UPROPERTY(EditDefaultsOnly)
	USoundCue* A2_Music_Note;

	UPROPERTY(EditDefaultsOnly)
	USoundCue* AFlat2_Music_Note;
	
	//... repeat for all 12 notes.

The implementation I decided to go with was to replicate that prototype Blueprint logic in code, in an AAudioCuePlayer class.

I created the class, then created a Blueprint off of it:

image

Then, with that UPROPERTY(EditDefaultsOnly) macro, I can attach each sound cue to the reference in the Blueprint editor:

image

In the original AudioPlatform blueprint, I do a switch/case on the incoming Note to map to the sound cue.

I do the same thing in code now:

// This is bad but requires a real solution to be figured out and implemented
USoundCue* AAudioCuePlayer::GetSoundCue(TEnumAsByte<ELetsGoMusicNotes> ENote) const
{
	switch (ENote)
	{
	case ELetsGoMusicNotes::A:
		return A2_Music_Note;
	case ELetsGoMusicNotes::AFlat:
		return AFlat2_Music_Note;
	case ELetsGoMusicNotes::B:
		return B_Music_Note;
	//... repeat for all 12 notes
I love how my comments communicate how I feel about this code, rather than what this code does

Again, not thrilled with this implementation, but it technically works.

Literally what more can I ask for?

We then get into the Quartz implementation:

// AAudioCuePlayer.h 
public:
	UPROPERTY(VisibleAnywhere)  
	UQuartzClockHandle* Clock;
	
	UPROPERTY(BlueprintReadWrite, meta=(ExposeOnSpawn=true))
	FQuartzQuantizationBoundary QuartzQuantizationBoundary;

	UPROPERTY()
	UAudioComponent* AttachedAudioComponent;

The reference to QuartzClockHandle is the metronome that actually keeps time in the game.

It’s where we set the BPM, start the clock, etc.

I stored the main clock in a custom GameState, which we retrieve from a custom GameMode:

// AAudioCuePlayer.cpp
// Called when the game starts
void AAudioCuePlayer::BeginPlay()
{
	Super::BeginPlay();

	// Get Main Clock from the custom GameMode
	const ALetsGoGameMode* GameMode = Cast<ALetsGoGameMode>(GetWorld()->GetAuthGameMode());
	Clock = GameMode->GetMainClock();
	// ... (rest of setup omitted)
}
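
I haven’t shown how that main clock gets created in the first place. For reference, here’s a rough sketch of one way the GameMode could do it on startup; the clock name, the 120 BPM value, doing this in BeginPlay, and the MainClock member are assumptions, while CreateNewClock, SetBeatsPerMinute, and StartClock are the same Quartz calls the Blueprint nodes wrap (their C++ signatures are clunky because they are written for Blueprint).

// Sketch: create and start the main Quartz clock when the GameMode starts.
// (MainClock here is assumed to be a UQuartzClockHandle* UPROPERTY on the GameMode.)
void ALetsGoGameMode::BeginPlay()
{
	Super::BeginPlay();

	UQuartzSubsystem* Quartz = GetWorld()->GetSubsystem<UQuartzSubsystem>();
	MainClock = Quartz->CreateNewClock(this, FName("MainClock"), FQuartzClockSettings());
	MainClock->SetBeatsPerMinute(this, FQuartzQuantizationBoundary(), FOnQuartzCommandEventBP(), MainClock, 120.f);
	MainClock->StartClock(this, MainClock);
}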

The QuartzQuantizationBoundary defines if you want something done on a beat, or the beginning of a bar, or on a half-step, whatever. Basically this object defines where on the “beat grid” something should happen.
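
In C++ that boundary is just a struct you fill in. A minimal sketch; the field names come from FQuartzQuantizationBoundary, but the specific values here are assumptions rather than what the Blueprint actually passes:

// "Fire this on the next beat" - roughly what the Audio Platform's boundary amounts to.
FQuartzQuantizationBoundary Boundary;
Boundary.Quantization = EQuartzCommandQuantization::Beat; // could also be ::Bar, ::HalfNote, ::EighthNote, ...
Boundary.Multiplier = 1.0f;                               // every 1 beat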

The meta=(ExposeOnSpawn=true) exposes this property as a pin you can set when spawning the actor from a Blueprint.

This is exactly what the Audio Platform does now, spawning the Audio Cue Player and creating a Quantization Boundary in the blueprint.

I like this approach because changing the Quantization from Beat→Bar only means recompiling the Blueprint, which is faster than recompiling code.

image

The AudioComponent is a component attached to the Audio Cue Player that essentially wraps the SoundCue.

// AAudioCuePlayer.cpp
AAudioCuePlayer::AAudioCuePlayer()
{
	/**
	 * Creates an audio component.
	 *  Audio component needed in order to play the sound quantized
	 */
	AttachedAudioComponent = CreateDefaultSubobject<UAudioComponent>(TEXT("Attached Audio Component"));
	AttachedAudioComponent->SetAutoActivate(false); // Don't play immediately
	AttachedAudioComponent->bAllowSpatialization = false; // Don't play in world

}

// Called when OnAudioPlatformTriggered event is fired from the AudioPlatform
void AAudioCuePlayer::OnAudioPlatformTriggered(const FLetsGoMusicNotes IncomingNote)
{
	USoundCue* ThisSoundCue = GetSoundCue(IncomingNote.Note);
	AttachedAudioComponent->SetSound(ThisSoundCue);

	const FOnQuartzCommandEventBP EmptyOnQuartzCommandEventBP; 
	AttachedAudioComponent->PlayQuantized(GetWorld(), Clock, QuartzQuantizationBoundary, EmptyOnQuartzCommandEventBP);
}
I don’t know what the FOnQuartzCommandEventBP is for. My guess is it’s for receiving events back from the Clock itself.

Final Result

Our original Blueprint looked like this:

image

Through isolation of domain logic, as well as porting logic to code, the BP is much simpler:

image

With code captured here:

With this change we’ve also added two additional features:

  • An OnAudioPlatformTriggered event fires when the platform is stepped on.
  • The musical note of each platform plays in time with the beat.

And this is how it plays now:

This was a super rewarding task.

While I was kind of bummed that I wasn’t able to jump straight into building the core gameplay loop, the amount I achieved by refactoring the Audio Platform is massive.

This task represents the most code I’ve actually written for the game thus far.

It’s validated the event-driven design I defined in Designing The Core Gameplay Loop.

It’s definitively improved the Audio Platform, the core component of the prototype thus far.

And most importantly, it’s paved the way to actually start implementing the Core Gameplay loop.

I can finally start to:

🔨Building The Core Gameplay Loop