Developing Audio Applications With JUCE (part 3)

[Image: waveform of me singing badly.]

Okay — so far, we have an in-progress Scumbler application that can interface with audio hardware and route audio signals through itself (part 1), and also load third-party audio effects plugins into that audio stream (part 2). This time, we’ll add code that processes the audio to create the gradually fading loop that is the heart of the whole system. JUCE provides a few base classes that will once again simplify our work here greatly — all of the common behavior that we want to support is abstracted away cleanly into the framework, and we just add the code that makes our app unique.

AudioProcessor

http://www.rawmaterialsoftware.com/juce/api/classAudioProcessor.html

Classes derived from the AudioProcessor base class inherit a ton of useful functionality — the AudioProcessorGraph that we already saw in part 1 of this series is designed to hold all of the audio processors used in your app and connect them together to implement your desired signal flow (and is actually an AudioProcessor itself). JUCE also includes additional classes that you can use to ‘wrap’ your signal processing code so that it can be distributed as a plugin and hosted in other audio applications. For our purposes, the interesting member function is processBlock(). This function is called repeatedly by the high-priority thread that handles all audio I/O for the application. You’re given references to two objects: an AudioSampleBuffer containing new input samples and a MidiBuffer containing time-stamped MIDI messages. When your function returns, those two buffers must contain the modified values produced by your effect algorithm. Because this function is called as part of audio I/O, it’s important that we take no longer to process samples than we absolutely need to. And because we’ve now realized that our app has to be multi-threaded, we need to protect any member variables that will be accessed from both the audio and user interface threads.
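To make that contract concrete, here’s a minimal sketch of the “replace the buffer contents in place” pattern that processBlock() follows. This is plain C++, not JUCE — the buffer is just a vector of floats standing in for one channel of an AudioSampleBuffer, and the function name is mine:

```cpp
#include <vector>

// Hypothetical stand-in for one channel of an AudioSampleBuffer.
using SampleBlock = std::vector<float>;

// Mirrors the processBlock() contract: the host hands us a block of
// fresh input samples, and we must overwrite it in place with our
// processed output before returning.
void processBlockSketch(SampleBlock& buffer, float gain)
{
    for (float& sample : buffer)
        sample *= gain;   // replace input with processed output
}
```

In the real API the signature is `processBlock(AudioSampleBuffer&, MidiBuffer&)`, the buffer is multi-channel, and the same in-place rule applies to the MIDI buffer.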

AudioSampleBuffer

http://www.rawmaterialsoftware.com/juce/api/classAudioSampleBuffer.html

The AudioSampleBuffer class is (even considering how great the rest of JUCE is) an incredibly useful and utilitarian class. Rather than cobbling together containers that can deal with the lists of floating point numbers that we use to represent audio samples, this class cleanly represents a block of audio samples — it understands that we probably need multiple synchronized channels of audio, and that we may want to efficiently change the gain applied to a block of samples (and probably want to change that gain over time to implement fades in and out). We want to be able to copy blocks of samples into and out of sample buffers. We want to be able to quickly find the highest/lowest samples in a region of the buffer (useful for displaying a waveform, where a single pixel has to represent more than one sample’s worth of data) or the root mean squared level of a region of samples (useful for metering-style displays).
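Those peak and RMS queries are easy to model. Here’s a plain-C++ sketch over a single channel of samples — the function names are mine; in JUCE the real calls are AudioSampleBuffer::getMagnitude() and AudioSampleBuffer::getRMSLevel():

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Peak absolute level over [start, start + count) -- roughly what
// getMagnitude() reports for a region of one channel.
float peakLevel(const std::vector<float>& samples, int start, int count)
{
    float peak = 0.0f;
    for (int i = start; i < start + count; ++i)
        peak = std::max(peak, std::abs(samples[i]));
    return peak;
}

// Root mean squared level over the same region -- roughly what
// getRMSLevel() reports; useful for meter-style displays.
float rmsLevel(const std::vector<float>& samples, int start, int count)
{
    float sum = 0.0f;
    for (int i = start; i < start + count; ++i)
        sum += samples[i] * samples[i];
    return std::sqrt(sum / count);
}
```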

CriticalSection & ScopedLock

http://www.rawmaterialsoftware.com/juce/api/classCriticalSection.html

Since we’re working with multiple threads, it’s important that we be able to guarantee that a chunk of code will run to completion without another thread interrupting it and behaving incorrectly because our member variables were left in an incoherent state. The CriticalSection class is a simple re-entrant mutex that we can use. It’s especially useful with the associated ScopedLock class, which uses C++’s RAII-style design to claim the CriticalSection’s lock when created, and guarantees that the lock is released when the ScopedLock is destroyed — whether that destruction happens automatically when we leave the scope where it was declared, or because an exception occurred. Programming with threads can be obnoxious and tricky, but classes like these make it much less so. (Well, still tricky, but less obnoxious.)
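The same RAII pattern exists in the standard library; this sketch uses std::mutex and std::lock_guard as stand-ins for CriticalSection and ScopedLock (the class and function names here are mine) to show why scope-based lock release matters:

```cpp
#include <mutex>
#include <thread>
#include <vector>

struct Counter
{
    int value = 0;
    std::mutex mutex;   // plays the role of the CriticalSection

    void increment()
    {
        // The lock is claimed here and released when 'lock' is
        // destroyed at the end of the scope -- on normal return or
        // if an exception is thrown. ScopedLock gives exactly this
        // guarantee for a CriticalSection.
        std::lock_guard<std::mutex> lock(mutex);
        ++value;
    }
};

// Four threads each increment 1000 times; with the lock in place the
// final count is always exactly 4000.
int runThreads()
{
    Counter counter;
    std::vector<std::thread> threads;
    for (int i = 0; i < 4; ++i)
        threads.emplace_back([&counter]
        {
            for (int j = 0; j < 1000; ++j)
                counter.increment();
        });
    for (auto& t : threads)
        t.join();
    return counter.value;
}
```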

The LoopProcessor Class

With those three things, we’re ready to create the processor class that loops audio for us. Requirements for this class include:

  1. We can either be playing or not. If we’re not playing, any samples that arrive at our processBlock() function are passed through without modification. 
  2. The duration of the loop must be changeable by the user. We’ll default to a duration of 4 seconds.
  3. There’s a variable loop feedback gain between 0 and -96 dB. Each time through the loop, we apply that gain to the current contents of the loop samples.
  4. New samples being passed into the loop are mixed into the current contents of the loop, and stored into the loop buffer for the next pass. Those mixed samples are also the output values for this call to processBlock().

The only tricky part of the processBlock() member function is remembering that the size of our loop sample buffer is almost certainly not an integer multiple of the size of the buffers being passed to that function, so we need to handle the cases where we assemble the output from some number of samples at the end of our loop plus some samples from the beginning of the loop, when we wrap around in time. The processBlock() member function looks like this — with the comments, I hope that it’s easy to puzzle out what’s going on.

[sourcecode language="cpp"]
void LoopProcessor::processBlock(AudioSampleBuffer& buffer, MidiBuffer& midiMessages)
{
   if (fTrack->IsPlaying())
   {
      // Lock down all of the protected code sections.
      ScopedLock sl(fMutex);
      int sampleCount = buffer.getNumSamples();
      int loopSampleCount = fLoopBuffer->getNumSamples();
      float feedbackGain = fFeedback;
      for (int channel = 0; channel < fChannelCount; ++channel)
      {
         // this is easy if we don't need to wrap around the loop
         // buffer when processing this block
         if (fLoopPosition + sampleCount < loopSampleCount)
         {
            // Add samples from 1 loop ago, multiplying them by
            // the feedback gain.
            buffer.addFrom(channel, 0, *fLoopBuffer, channel,
               fLoopPosition, sampleCount, feedbackGain);
            // ...and copy the mixed samples back into the loop buffer
            // so we can play them back out in one loop's time.
            fLoopBuffer->copyFrom(channel, fLoopPosition, buffer,
               channel, 0, sampleCount);
         }
         else
         {
            // first, process as many samples as we can fit in at the
            // end of the loop buffer.
            int roomAtEnd = loopSampleCount - fLoopPosition;
            // and we need to put this many samples back at the
            // beginning.
            int wrapped = sampleCount - roomAtEnd;

            // add samples from a loop ago, adjusting feedback gain.
            // part 1:
            buffer.addFrom(channel, 0, *fLoopBuffer, channel,
               fLoopPosition, roomAtEnd, feedbackGain);
            // part 2:
            buffer.addFrom(channel, roomAtEnd, *fLoopBuffer, channel,
               0, wrapped, feedbackGain);

            // and now copy the mixed samples back into the loop buffer:
            // part 1:
            fLoopBuffer->copyFrom(channel, fLoopPosition, buffer,
               channel, 0, roomAtEnd);
            // part 2:
            fLoopBuffer->copyFrom(channel, 0, buffer, channel,
               roomAtEnd, wrapped);
         }
      }

      // set the loop position for the next block of data.
      fLoopPosition += sampleCount;
      if (fLoopPosition >= loopSampleCount)
      {
         fLoopPosition -= loopSampleCount;
         ++fLoopCount;
      }
      // Notify anyone who's observing this processor that we've
      // gotten new sample data.
      this->sendChangeMessage();
   }
}
[/sourcecode]

Brett g Porter

Lead Engineer, Audio+Music at Art+Logic. Always looking for excuses to write code. Tweets at both @artandlogic and @bgporter.



This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

11 Comments

  1. David Jameson

    Are you going to make the source available? I have a very similar need to build a custom VST host that I can control remotely (from Max) and have run into the same issues in trying to extract the core parts of the JUCE Plugin Host sample that I need.
    I looked at your github ref but didn’t see anything there.

    • bgporter

      Yeah, I’ve been dragging my feet there — banging my head against a combination of a UI thing I’ve been working on and restrictions on the night/weekend time I have available in the recent past. Let me try to put some effort into this project again…

  2. David Jameson

    Even if you just have the basic system working, it would be cool to have access to that — it’s quite painful trying to disentangle the JUCE sample

      • David Jameson

        Terrific — I have to go out for a meeting but I’ll take a look at it later tonight.

  3. David Jameson

    Do you have a direct email address through which I can contact you? Apart from one compilation error due to a misspelling in the MainAppWindow::ViewPlugins member function, it compiles fine. However, rather than using your older included JUCE, I’d rather use the latest JUCE libraries and there seems to be some unimplemented pure virtual methods….presumably due to changes in JUCE.

  4. Abee

    Thanks I really appreciate you making this available, learning a great deal picking through your articles and source code.

    • Brett g Porter

      Excellent, thanks! Hoping to sit down and put another one together on this topic soon…

      • David Jameson

        Is there a way to direct different MIDI events to different audio plugins? For example, suppose you’re trying to build a MIDI sequencer and you have multiple tracks, each of whose MIDI data needs to go to a different plugin in a graph. It seems like the only way to get MIDI into JUCE is through the MidiMessageCollector which is owned by an AudioProcessorPlayer instance. But there’s only one instance for an entire graph of audioprocessors.

        • Brett g Porter

          This is something that’s on my list to write up — first, see the ‘audio plugin host’ example that comes with JUCE for a demonstration of how to create an instance of the built-in AudioProcessorGraph::AudioGraphIOProcessor::midiInputNode object. That device has an output pin that produces MidiBuffer objects. Once you have that source of data, you should be able to e.g. create a class derived from AudioProcessor that has 1 midi in, 1 midi out, and can be configured to filter out MIDI data as you need. For example, instantiate one of these new classes to only pass along data on channel 1, and another to only pass along data on channel two, connecting the output pins of each of these processor nodes to the input of a VST instrument or other plugin that accepts MIDI input.

          • David Jameson

            Thanks for responding — however, I’m not convinced that’s the best approach. Imagine you want to play 300 MIDI tracks, going to lots of different plugins. I don’t think one wants to have a single filter somewhere trying to figure out to which plugin some MIDI should be sent. Surely there needs to be a way to direct a specific source of MIDI data (whether from a track, calculated on the fly, or whatever) to a specific plugin without having to have a guard seeing (unnecessarily) all MIDI.
            Since I posted the question, we came up with a hack to target specific MIDI data to specific nodes, but it seems to me that JUCE should support this concept formally — we couldn’t find anything.
