Spectrogram of swelling trumpet sound
Art+Logic’s Incubator project has made a lot of progress. In a previous post I mentioned that Dr. Scott Hawley’s technique for classifying audio involves converting the audio to an image and using a Convolutional Neural Network (CNN) to classify it based on that image. That image is a spectrogram. I’m going to go into some detail about what we do to create one, and why, to the best of my ability.
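To make the idea concrete, here is a minimal sketch of building a magnitude spectrogram with NumPy alone: slice the signal into overlapping windowed frames, take the FFT of each frame, and stack the magnitudes into a 2-D array that can be rendered as an image. This is a generic short-time Fourier transform, not necessarily the exact pipeline used in the Incubator project; the frame and hop sizes are arbitrary choices for illustration.

```python
import numpy as np

def spectrogram(signal, frame_size=1024, hop_size=512):
    """Magnitude spectrogram: overlapping Hann-windowed frames -> FFT -> dB."""
    window = np.hanning(frame_size)
    n_frames = 1 + (len(signal) - frame_size) // hop_size
    frames = np.stack([
        signal[i * hop_size : i * hop_size + frame_size] * window
        for i in range(n_frames)
    ])
    # rfft yields frame_size // 2 + 1 frequency bins per frame
    magnitudes = np.abs(np.fft.rfft(frames, axis=1))
    # convert to decibels so quiet detail remains visible in the image
    return 20 * np.log10(magnitudes + 1e-10)

# one second of a 440 Hz test tone at a 22050 Hz sample rate
sr = 22050
t = np.linspace(0, 1, sr, endpoint=False)
spec = spectrogram(np.sin(2 * np.pi * 440 * t))
print(spec.shape)  # (frames, frequency bins)
```

Each row of the result is one moment in time and each column one frequency band, which is exactly the 2-D structure a CNN expects from an image.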
Next week (18-20 November) I’ll be attending the annual Audio Developer Conference in London. On Tuesday November 19th at 16:00 I’ll be part of a team providing the first public details about the forthcoming MIDI 2.0 standard.
The ADC is usually live-streamed on YouTube as it happens, but an unfortunate series of events has endangered that this year. You can learn more about the situation and consider contributing to the fund that will pay for recording and livestreaming the conference sessions; I frequently return to the archived videos and point other developers to them for reference.
Check the JUCE YouTube channel for the streams during the event (and come back later for archived recordings, or watch sessions from earlier years).
The full schedule for the event is here.
If you’re attending the event, please do track me down and say ‘hey’.
There are a few events coming up in the next few weeks where A+L will have people in attendance. If you’re going to be there or nearby, please get in touch and we’ll meet up.
Computers have been around for less than 100 years. In that short period of time, some incredible things have happened: they’ve been adopted so universally and so quickly that we have them in our houses. In our cars. Even in our pockets. The last 40 years alone have seen many significant developments in computing:
- Continuous decrease in size and increase in power.
- Access to computing at home and at work.
- Networking, the spread of the internet, and acceptance of the web.
- Computers in our hands (cell phones).
Like those earlier developments, machine learning and artificial neural networks are an advance in computer science with the potential to significantly change the way we develop applications.
The RESTful API has a funny place in the software development world: it’s widely regarded as the best general-purpose pattern for building web application APIs, and yet it’s also nebulous enough of a concept to cause endless disagreements within teams over exactly how to implement one.
Do I make my endpoint /companies/123/? How about /locations/?company=123? How do I handle versioning the API? Why shouldn’t I send a POST request to trigger an action on the server? If a backend task can take many seconds to process, how do I represent that in the API?
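That last question has a common RESTful answer: respond immediately with 202 Accepted and a status resource the client can poll, rather than blocking the request until the work finishes. Here is a framework-free sketch of that pattern; the /reports paths, the in-memory jobs dict, and the (status, body) return shape are all hypothetical stand-ins for a real web framework and task queue.

```python
import uuid

# in-memory job store standing in for a real task queue (illustrative only)
jobs = {}

def create_report(payload):
    """POST /reports — start a long-running task and return 202 Accepted
    with a status URL, instead of blocking until the report is ready."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"state": "pending", "result": None}
    return 202, {"status_url": f"/reports/{job_id}/status"}

def get_report_status(job_id):
    """GET /reports/<id>/status — 200 while the job is running, 303 See
    Other pointing at the finished resource, 404 for unknown jobs."""
    job = jobs.get(job_id)
    if job is None:
        return 404, {"error": "no such job"}
    if job["state"] == "done":
        return 303, {"location": f"/reports/{job_id}"}
    return 200, {"state": job["state"]}
```

The appeal of this shape is that every step stays resource-oriented: the POST creates a job resource, and the client learns about progress by GETting it, with standard status codes doing the signaling.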
friz—a Flexible Animation Controller for JUCE
As is often the case, I found myself working on a personal project and had some UI elements that really wanted to have some life to them on the screen.
I tucked into the JUCE documentation expecting to find something I could use to easily add some personality to the interface, and…didn’t find what I was looking for. There’s a ComponentAnimator class that supports moving a single component linearly from one set of coordinates/bounds to another, and can also modify the component’s alpha value to have it fade in or out.
I was looking for something…more expressive.