MIDI 2.0 Specs Released!

After over a decade of work, the final specification documents for MIDI 2.0 have been released to the public!

There’s a fantastic article on the MIDI.org site that explains what all the hubbub is about; rather than rewriting it, I’ll just point you at it. Check it here: Details about MIDI 2.0

When MIDI 1.0 was released in 1983, the complete document that detailed all you needed to know about it was eight pages long. Expect to read a bit more than that in 2020: the full MIDI 2.0 spec comprises five separate documents, each covering a single part of the system:

M2-100: Overview of the specifications
M2-101: Specification of MIDI-CI, the Capability Inquiry portion of MIDI 2 that lets devices query each other and determine how they can work together.
M2-102: Common Rules for MIDI-CI Profiles, explaining how to define and work with MIDI 2 profiles, which define controllers and other configuration data so that devices can automatically adapt to the capabilities of the currently connected instruments.
M2-103: Rules for Property Exchange, the new provisions for querying the current settings and capabilities of connected devices.
M2-104: Definition of the new Universal MIDI Packet data structure and the high-resolution MIDI 2 message protocol (a rough sketch of the packet layout follows this list).

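To give a flavor of what that last document describes, here’s a minimal sketch, based on my reading of the published spec, of packing a MIDI 2.0 Note On into the two 32-bit words of a Universal MIDI Packet. Note the 16-bit velocity field, one of the headline high-resolution upgrades over MIDI 1.0’s 7 bits. The helper function and its names are mine, so treat this as illustrative rather than normative:

```cpp
// Illustrative sketch (not from the spec itself): a MIDI 2.0 Note On packed
// into a Universal MIDI Packet. MIDI 2.0 channel voice messages occupy two
// 32-bit words.
#include <array>
#include <cstdint>

std::array<std::uint32_t, 2> makeNoteOn (std::uint8_t group, std::uint8_t channel,
                                         std::uint8_t note, std::uint16_t velocity)
{
    constexpr std::uint32_t messageType = 0x4;  // MIDI 2.0 channel voice message
    constexpr std::uint32_t status      = 0x9;  // Note On opcode

    const std::uint32_t word0 = (messageType         << 28)
                              | ((group    & 0x0Fu)  << 24)
                              | (status              << 20)
                              | ((channel  & 0x0Fu)  << 16)
                              | ((note     & 0x7Fu)  << 8);  // low byte: attribute type (0 == none)

    // High 16 bits: velocity; low 16 bits: attribute data (unused here).
    const std::uint32_t word1 = static_cast<std::uint32_t> (velocity) << 16;

    return { word0, word1 };
}
```
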
Before you can access these documents, you’ll need to create a (free!) account with The MIDI Association, which is an organization of MIDI users. If you’re not already a member, the link to access the docs will redirect you first to the login/account creation page.

Download everything here and then go make cool stuff with it.

Making Spectrograms in JUCE

[Image: Spectrogram of a swelling trumpet sound]

Art+Logic’s Incubator project has made a lot of progress. In a previous post I mentioned that Dr. Scott Hawley’s technique for classifying audio involves converting the audio into an image and using a Convolutional Neural Network (CNN) to classify it based on that image. That image is a spectrogram. I’m going to go into some detail about how we create one and, to the best of my ability, why.
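
The details are in the full post, but as a rough idea of the mechanism, here’s a minimal sketch (not the Incubator code itself) of slicing audio into overlapping windowed frames and running each through juce::dsp::FFT; every frame’s magnitudes become one column of the spectrogram image. The FFT order, hop size, and Hann window here are assumptions for illustration:

```cpp
// Minimal sketch: turn a mono buffer of samples into spectrogram columns.
// Assumes JUCE's dsp module is available; FFT order and hop size are
// arbitrary choices for this example.
#include <juce_dsp/juce_dsp.h>
#include <vector>

std::vector<std::vector<float>> makeSpectrogram (const float* samples, int numSamples)
{
    constexpr int fftOrder = 10;            // 2^10 == 1024-point FFT
    constexpr int fftSize  = 1 << fftOrder;
    constexpr int hopSize  = fftSize / 2;   // 50% overlap between frames

    juce::dsp::FFT fft (fftOrder);
    juce::dsp::WindowingFunction<float> window (fftSize,
        juce::dsp::WindowingFunction<float>::hann);

    std::vector<std::vector<float>> columns;

    for (int start = 0; start + fftSize <= numSamples; start += hopSize)
    {
        // The FFT wants a buffer of 2 * fftSize floats; the magnitudes land
        // in the first fftSize slots after the transform.
        std::vector<float> frame (fftSize * 2, 0.0f);
        std::copy (samples + start, samples + start + fftSize, frame.begin());

        window.multiplyWithWindowingTable (frame.data(), fftSize);
        fft.performFrequencyOnlyForwardTransform (frame.data());

        // Keep the non-negative frequency bins; each one of these vectors
        // becomes a vertical strip of pixels in the final image.
        columns.emplace_back (frame.begin(), frame.begin() + fftSize / 2);
    }

    return columns;
}
```

In practice the magnitudes are usually mapped onto a log (dB) scale before being rendered as pixel intensities, which is what makes the quieter harmonics visible in an image like the one above.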


Art+Logic at ADC

Next week (18-20 November) I’ll be attending the annual Audio Developer Conference in London. On Tuesday November 19th at 16:00 I’ll be part of a team providing the first public details about the forthcoming MIDI 2.0 standard.

The ADC is usually live-streamed on YouTube as it happens, but an unfortunate series of events has endangered that this year. You can learn more about the situation and consider contributing to the fund that will pay for the recording and livestreaming of conference sessions. I frequently return to the archived videos and point other developers to them for reference.

Check the JUCE YouTube channel for the streams during the event (and come back later for archived recordings, or watch sessions from earlier years).

The full schedule for the event is here.

If you’re attending the event, please do track me down and say ‘hey’.

friz and the Illusion of Life

friz—a Flexible Animation Controller for JUCE

As is often the case, I found myself working on a personal project and had some UI elements that really wanted to have some life to them on the screen.

I tucked into the JUCE documentation expecting to see something I could use to easily add some personality to the interface, and…didn’t find what I was looking for. There’s a ComponentAnimator class that supports moving a single component linearly from one set of coordinates/bounds to another, and it can also modify the component’s alpha value to fade it in or out.
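
For reference, using that built-in class looks roughly like this (the function and the numbers are mine, just to illustrate the extent of what it offers): one linear move plus an alpha fade.

```cpp
// Roughly what the stock ComponentAnimator gives you: slide a component
// 100 px to the right over 250 ms while fading it to 50% alpha.
#include <juce_gui_basics/juce_gui_basics.h>

void nudgeAndFade (juce::Component& comp)
{
    juce::Desktop::getInstance().getAnimator().animateComponent (
        &comp,
        comp.getBounds().translated (100, 0),  // final bounds
        0.5f,                                  // final alpha
        250,                                   // duration, in milliseconds
        false,                                 // don't animate a proxy component
        1.0, 1.0);                             // start/end speeds (1.0 == linear)
}
```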

I was looking for something…more expressive.
