Building A Better Drum Machine
This is part of an ongoing series called Building A Music Engine.

It documents the process of me smashing my head against the keyboard to build a game called LETSGO.

It’s gotten long enough to break into several sections.

In the last episode, I successfully completed 🔨Building The Core Gameplay Loop. With that basic gameplay loop structure set, I wanted to add a couple of simple gameplay Phases to help prove out the functionality.

I decided to rebuild the drum machine I had originally built out in Blueprints in 🥁Building the Drum Machine, this time in code, forcing me to start building out the concept of Instruments.

From Single Kick to Full Kit

One of the first features I prototyped in LETSGO was a Blueprint that played the Kick of a drum.

Now that I’ve completed 🔨Building The Core Gameplay Loop, the next thing I wanted to do was +1 this Kick drum to play a variety of notes.

The intent of this document is to figure out how to structure le code to manage this complexity. And I think building to that exact [ 1-3, 1-3, 1-3, 1234 ] kick pattern is exactly the use case I want to pursue.

In fact, I ended up building out over a dozen separate drum patterns.

In Designing The Core Gameplay Loop I had this goal of separating the “Composer” from the “Conductor”. That is, I want to abstract out the kick pattern, and the decision making for which kick pattern to use, from the actual triggering of each kick within that pattern. I think this is good design.

So I need to define some kind of data structure representing the kick patterns.

I then need to figure out what creates those data structures at runtime.

I then need to figure out how to pass that structure to the Drums Actor.

I then need to map that structure to QuartzBoundaries.

And that’s just for Kick. But at that point adding the snare, hats, crashes, etc. is relatively straightforward. Just more work for the Composer entity arranging these patterns at runtime.

Yeah this seems like the next thing to work on.

Unreal Quartz Audio

Unreal’s Quartz Audio subsystem creates sample-accurate audio threads, allowing for low-latency audio triggers.

One of the features of Quartz is the concept of Quantization Boundaries: objects that schedule an action relative to the beats per minute of a metronome.

This is a Quartz Quantization Boundary:

	UPROPERTY(BlueprintReadWrite, Category="LETSGO")
	FQuartzQuantizationBoundary QuartzQuantizationBoundary = {
		EQuartzCommandQuantization::Bar,
		1.0f,
		EQuarztQuantizationReference::BarRelative,
		true
	};

One use is on UQuartzClockHandle subscription:

Clock->SubscribeToQuantizationEvent(GetWorld(), QuartzQuantizationBoundary.Quantization, PlayQuantizationDelegate, Clock);

Clock is the metronome object that pulses at a set BPM.

I have a helper method CreateNewClock that does the following:

	UQuartzSubsystem* Quartz = GetWorld()->GetSubsystem<UQuartzSubsystem>();

	FQuartzClockSettings NewClockSettings = ClockSettings;
	NewClockSettings.TimeSignature = TimeSignature;

	UQuartzClockHandle* Clock = Quartz->CreateNewClock(this, ClockName, NewClockSettings, true);

	Clock->SetBeatsPerMinute(this, QuartzQuantizationBoundary, FOnQuartzCommandEventBP(), Clock, BeatsPerMinute);

	return Clock; 

My Instrument class takes that clock and subscribes to a Quantization Event:

void AInstrument::StartPlaying()
{
	Clock->StartClock(GetWorld(), Clock);
	Clock->SubscribeToQuantizationEvent(GetWorld(), QuartzQuantizationBoundary.Quantization, PlayQuantizationDelegate, Clock);
}

A Quantization Event is an event delegate that fires based on our Quantization Boundary- in other words, the delegate fires the event, and the Boundary defines whether that event fires every bar, beat, whole note, etc.

In the constructor of my Instrument class, the delegate is bound to the function that runs each time the bar event is broadcast:

PlayQuantizationDelegate.BindUFunction(this, "OnQuantizationBoundaryTriggered");
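
PlayQuantizationDelegate itself isn’t declared anywhere above- given the handler’s parameter list, it’s presumably Quartz’s Blueprint-facing metronome event delegate, declared as a member on the Instrument:

	// Assumed member declaration - FOnQuartzMetronomeEventBP matches the
	// (ClockName, QuantizationType, NumBars, Beat, BeatFraction) signature Quartz broadcasts.
	// The bound handler also needs to be a UFUNCTION for BindUFunction to find it.
	FOnQuartzMetronomeEventBP PlayQuantizationDelegate;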

In OnQuantizationBoundaryTriggered, I create another Quant Boundary, and play an 808 Kick:

void ADrums::OnQuantizationBoundaryTriggered(FName DrumClockName, EQuartzCommandQuantization QuantizationType,
	int32 NumBars, int32 Beat, float BeatFraction)
{
	// AudioComponent.PlayQuantized() takes in a Quant Boundary, allowing you to schedule sound relative to the current Boundary
	// ie. If OnQuantTriggered fired per bar, this boundary can schedule something for next bar
	// We want to play right now
	const FQuartzQuantizationBoundary RelativeQuartzBoundary = {
		EQuartzCommandQuantization::None,
		1.0f, // multiplier
		EQuarztQuantizationReference::BarRelative,
		true
	};

	// Play the kick drum sound
	// This creates an Actor wrapper around a new AudioComponent we want to play on this beat
	// This is so we can destroy the Actor after use
	ADrumsAudioCuePlayer* AudioCuePlayer = GetWorld()->SpawnActor<ADrumsAudioCuePlayer>();
	AudioCuePlayer->Initialize(InstrumentMetaSoundSource, Clock, RelativeQuartzBoundary);
	AudioCuePlayer->PlayAndDestroy();
}

This is the second use of Quantization Boundary. It schedules when to play relative to the current beat/bar/whatever was set in the first Quant Boundary.

In the code above, it doesn’t wait- it just plays immediately. 1.0 is the first beat of a four-beat bar.

If the outer Boundary is triggering every beat, the inner one will not wait, and the Kick808 plays every beat. 4 on the floor achieved.

But we need something a bit more complex if we want to Kick on the first and third beat.

The outer Boundary needs to trigger on each bar, not each beat, and we need to schedule two AudioCuePlayers to fire during each Bar event.

The code above is scheduling a single event. Using the multiplier value in the Boundary, we can update that to define which beat in the bar will be triggered:

	const FQuartzQuantizationBoundary RelativeQuartzBoundary = {
		EQuartzCommandQuantization::None,
		3.0f, // Play on the third beat 
		EQuarztQuantizationReference::BarRelative,
		true
	};

We’d need another AudioCuePlayer with its own QuartzBoundary with 1.0f as the multiplier, and this code would play a 1-3 on the kick.
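
Spelled out, that hard-coded 1-3 would mean spawning one AudioCuePlayer per hit inside the bar event, each with its own Boundary (a sketch reusing the same Initialize/PlayAndDestroy flow as above):

	// Hard-coded 1-3: one cue player per kick, each scheduled by its own Boundary
	for (const float Beat : {1.0f, 3.0f})
	{
		const FQuartzQuantizationBoundary KickBoundary = {
			EQuartzCommandQuantization::None,
			Beat, // which beat of the bar to fire on
			EQuarztQuantizationReference::BarRelative,
			true
		};

		ADrumsAudioCuePlayer* CuePlayer = GetWorld()->SpawnActor<ADrumsAudioCuePlayer>();
		CuePlayer->Initialize(InstrumentMetaSoundSource, Clock, KickBoundary);
		CuePlayer->PlayAndDestroy();
	}

This is essentially the shape the generalized version later in this post takes.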

You can imagine the complexity starts to sprawl if you wanted to switch this up for the 4th bar: Go 4 on the floor on every fourth bar, otherwise 1-3.

An Instrument and Its Data

Representing Beats As Data

So this is what I ended up building:

	BossaNova = FDrumPattern();
	
	// Bossa Nova
	BossaNova.Kick = FInstrumentSchedule(
		// Declare time division of each bar
		EQuartzCommandQuantization::EighthNote,  
		{
			FPerBarSchedule({1.0f, 4.0f, 5.0f, 8.0f}),
		}
	);
	BossaNova.Snare = FInstrumentSchedule(
		EQuartzCommandQuantization::EighthNote, 
		// Declare an array of beats to play in each bar
		{
			FPerBarSchedule({1.0f, 4.0f, 7.0f}),
			FPerBarSchedule({3.0f, 5.0f}),
		}
	);

We have a DrumPattern that defines a number of InstrumentSchedules, each of which contains a set of PerBarSchedules.

For the kick, we’re defining that it should be played on the first, fourth, fifth and eighth eighth-note. It will repeat this pattern for every bar.

For the snare, it defines two bars of snare hits. On bar one, hit the first, fourth, and seventh eighth-note. On the second bar, the third and fifth- then the pattern repeats from the beginning.
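
The FDrumPattern struct itself isn’t shown here- going by how it’s used, it’s essentially just a named bundle of InstrumentSchedules, something like this (member names beyond Kick and Snare are my guess, based on the sounds listed later):

USTRUCT()
struct FDrumPattern
{
	GENERATED_BODY()

	UPROPERTY()
	FInstrumentSchedule Kick;

	UPROPERTY()
	FInstrumentSchedule Snare;

	UPROPERTY()
	FInstrumentSchedule HiHatClosed;

	UPROPERTY()
	FInstrumentSchedule HiHatOpen;

	UPROPERTY()
	FInstrumentSchedule Clap;
};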

Here’s the definition of the PerBarSchedule and InstrumentSchedule:

USTRUCT()
struct FPerBarSchedule
{
	GENERATED_BODY()
	
	UPROPERTY()
	TArray<float> BeatsInBar; // [1, 3] plays sound on first and third beat

	FPerBarSchedule() {}
	explicit FPerBarSchedule(const TArray<float>& Beats) : BeatsInBar(Beats) {}
};

USTRUCT()
struct FInstrumentSchedule
{
	GENERATED_BODY()
	
	UPROPERTY()
	EQuartzCommandQuantization QuantizationDivision;
	
	UPROPERTY()
	TArray<FPerBarSchedule> BeatSchedule; // [[1,3],[1,3],[1,3],[1,2,3,4]] play 1-3, on 4th bar 4onfloor

	FInstrumentSchedule() : QuantizationDivision() {}
	explicit FInstrumentSchedule(const EQuartzCommandQuantization Quantization, const TArray<FPerBarSchedule>& Pattern);

};
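
The explicit constructor is only declared there- its definition presumably just copies the parameters across:

FInstrumentSchedule::FInstrumentSchedule(const EQuartzCommandQuantization Quantization, const TArray<FPerBarSchedule>& Pattern) :
	QuantizationDivision(Quantization),
	BeatSchedule(Pattern)
{
}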

Playing the Instrument Schedule

To play a drum pattern, we need two things- the functionality that plays the pattern, and the data to play it.

With the structure of the data above, we can define the functionality that consumes it.

Our Boundary Triggered function is now updated to trigger on every bar:

void AInstrument::OnQuantizationBoundaryTriggered(FName DrumClockName, EQuartzCommandQuantization QuantizationType,
	int32 NumBars, int32 Beat, float BeatFraction)
{
	// Get this bar of the Instrument schedule, ex. BossaNova.Snare[1]
	FPerBarSchedule ThisBar = InstrumentSchedule.BeatSchedule[CurrentBar];

	// BossaNova.Snare[1] == { 3.0f, 5.0f }; Loop this array
	for (int i = 0; i < ThisBar.BeatsInBar.Num(); i++)
	{
		const FQuartzQuantizationBoundary RelativeQuartzBoundary = {
			RelativeQuantizationResolution,
			ThisBar.BeatsInBar[i], // 3.0f or 5.0f - third/fifth eighth note
			EQuarztQuantizationReference::BarRelative,
			true
		};

		// Play the instrument's sound
		// This creates an Actor wrapper around a new AudioComponent we want to play on this beat
		// This is so we can destroy the Actor after use
		AAudioCuePlayer* AudioCuePlayer = GetWorld()->SpawnActor<AAudioCuePlayer>();
		AudioCuePlayer->Initialize(InstrumentMetaSoundSource, Clock, RelativeQuartzBoundary);
		AudioCuePlayer->PlayAndDestroy();
	}

	// Reset CurrentBar if at last PerBarSchedule
	if (CurrentBar == InstrumentSchedule.BeatSchedule.Num() - 1)
	{
		CurrentBar = 0;
	}
	else
	{
		CurrentBar++;
	}
}

We store an int called CurrentBar that tracks which PerBarSchedule we’re on:

	{
		FPerBarSchedule({1.0f, 4.0f, 7.0f}),
		FPerBarSchedule({3.0f, 5.0f}), // Matches CurrentBar == 1
	}

Then we create an Actor that will play the appropriate Audio Cue, and gracefully destroy itself once complete.
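
The cue player itself isn’t shown, but the play-then-cleanup flow only needs stock UAudioComponent machinery: PlayQuantized() schedules the sound on the Quartz clock, and the OnAudioFinished delegate gives us a hook to destroy the actor. A minimal sketch (member names here are assumptions):

void AAudioCuePlayer::PlayAndDestroy()
{
	// Destroy this actor once the sound has finished playing
	// (HandleAudioFinished must be a UFUNCTION to bind dynamically)
	AudioComponent->OnAudioFinished.AddDynamic(this, &AAudioCuePlayer::HandleAudioFinished);

	// Schedule playback on the Quartz clock at the boundary passed into Initialize()
	AudioComponent->PlayQuantized(GetWorld(), Clock, QuartzBoundary, FOnQuartzCommandEventBP());
}

void AAudioCuePlayer::HandleAudioFinished()
{
	Destroy();
}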

Finally, CurrentBar is incremented, or reset if we’re on the last PerBarSchedule.
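
(The increment-or-reset pair could be collapsed into CurrentBar = (CurrentBar + 1) % InstrumentSchedule.BeatSchedule.Num(); but the explicit branch reads clearly enough.)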

This whole function is subscribed to a Quartz Clock quantization event when the Instrument’s StartPlaying() is called:

void AInstrument::StartPlaying()
{
	Clock->StartClock(GetWorld(), Clock);
	Clock->SubscribeToQuantizationEvent(GetWorld(), QuartzQuantizationBoundary.Quantization, PlayQuantizationDelegate, Clock);
}

So, on StartPlaying, a Quartz Clock starts ticking, triggering the OnQuantizationBoundaryTriggered event every bar. That event loops through each beat listed in the InstrumentSchedule and triggers an audio cue at the correct time.

Merging Data And Functionality

One very intentional design decision I made while working on this feature was the separation of the data needed to play an instrument from the functionality of that instrument.

I imagine an instrument as something you just connect a black box of data to, and it starts firing off events based on what’s in the box.

A result of that design is that I need somewhere to smush an Instrument and its data together.

Thankfully, the concept of Phases I built in 🔨Building The Core Gameplay Loop gives me a clear way of doing this: a Phase called StartDrums.

The StartDrums phase initializes each Instrument with the required data:

void AStartDrums::Initialize()
{
	// ex. returns FDrumPattern "BossaNova" described above
	const EDrumPatterns DrumPattern = GetRandomDrumPattern(); 
	const FDrumPattern Pattern = GetDrumData(DrumPattern);
  
  // Get the "Kick" sound from a Blueprint 
	const ADrumSoundCueMapping* SoundCueMapping = GetWorld()->SpawnActor<ADrumSoundCueMapping>(ADrumSoundCueMappingClass);

	// Initialize Kick
	// Deferred spawning allows you to set an actor up before BeginPlay starts
	Kick = GetWorld()->SpawnActorDeferred<AInstrument>(AInstrument::StaticClass(), FTransform());
	Kick->Initialize(Pattern.Kick, SoundCueMapping->Kick);
	Kick->FinishSpawning(FTransform());

	// Repeat for snare, etc. 
	//...
}
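
Initialize() on the Instrument isn’t shown above- conceptually it just stores the black box of data and stands up a clock to play it against. A rough sketch (the real body may differ):

void AInstrument::Initialize(const FInstrumentSchedule& Schedule, UMetaSoundSource* Sound)
{
	// Store the data the quantization handler will consume
	InstrumentSchedule = Schedule;
	InstrumentMetaSoundSource = Sound;

	// Stand up this Instrument's Quartz clock (the CreateNewClock helper from earlier;
	// parameters omitted here since its signature isn't shown)
	Clock = CreateNewClock();
}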

Multi-Instrument Drums

Before this, I had a single kick drum playing every beat. It worked, but was limited to a single instrument playing a very simple beat.

By defining a more complex data structure, I now support multiple instruments: a Kick, a Snare, closed and open Hi-hats, and a Clap.

Each of these sounds is played as an individual Instrument with its own Beat Schedule.

This allows a number of drum patterns to be played, from a basic 1-3 rock, to more complex Bossa Nova patterns, to jazz swing.
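
For instance, the basic 1-3 rock beat is just another FDrumPattern- kick on one and three, snare on the backbeat. My best reconstruction (not necessarily the exact data in the project):

	Rock = FDrumPattern();
	Rock.Kick = FInstrumentSchedule(
		EQuartzCommandQuantization::QuarterNote,
		{
			FPerBarSchedule({1.0f, 3.0f}),
		}
	);
	Rock.Snare = FInstrumentSchedule(
		EQuartzCommandQuantization::QuarterNote,
		{
			FPerBarSchedule({2.0f, 4.0f}),
		}
	);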

But there’s a problem with the current implementation of these Instruments: they assume one, and only one, note.

That’s OK for something like a drum kick, where you can get away with a single sound cue.

But the majority of instruments out there (and even a drum kick if I wanted to support multi-sampled instruments) have multiple notes available.

Initializing 48 unique instruments to represent 4 octaves of a 12 note instrument would be kind of ridiculous.

Next episode, I start building in support for multi-note instruments:

Multi-Note Instruments