
[Image: guitar and Apple Watch]

Bringing an Idea to Life: WatchWah Proof of Concept


"Wouldn’t it be cool if…"

We’ve all had that thought at some point. How do you get from there to a product you can use and share with others? When I found myself playing with an idea that seemed exciting and new, I decided to capture some notes on the process, from the perspective of an Engineering Manager at a company that’s all about implementing people’s creative visions.

The Idea

In addition to working as Art+Logic’s Director of Engineering, I’ve been a performing musician for much of my life. A year ago I bought an Apple Watch, and it occurred to me that this small computer, packed with sensors monitoring all sorts of interesting things, could be used to enhance some element of musical expression.

As a professional software developer I’ve learned that a good idea includes these qualities:

  • It needs to spark your passion, or at least get you curious. If it’s something you’re personally neutral about, but think might make you money, it’s much harder to stay with it over the long haul. My idea was fairly simple, and I felt curious enough to spend some time on it.
  • It’s framed generally enough to be resilient as challenges arise. I’ve seen clients cling to specific details of their projects, at the expense of the product, when a looser conception of their goals might have allowed them to pivot more gracefully. In my case, “Can I use my watch to enhance my musical performance?” seemed sufficiently general.
  • It’s been researched enough to know what possible challenges or competitors exist. I found an existing system that allowed the user to send messages from the Apple Watch to audio applications, but it didn’t use the sensors. I was pleased that there wasn’t anything quite like my idea that was already available, but also wondered if it might be unexpectedly difficult to implement, since nobody seemed to have done this before.

Interestingly, I’d say “Is the idea possible?” is not always an important question to ask. If it’s not possible now, it may be in the future, and indeed it’s hard to say whether or not something can be done until you try to do it.

Development

I love the idea of iterative development, and it’s something we do a lot of at Art+Logic. With a new and unproven idea it’s extremely valuable; how quickly (and cheaply) can you learn if your idea will work, and how can it be refined, step-wise, through experimentation?

My idea seemed to have a few parts: getting sensor data from the watch, bringing that data to a guitar effect to control its behavior, and the effect itself. I had a vision of using my watch to control a physical "stomp-box" style guitar effects pedal, so I wouldn’t need to be tethered to a computer, but building that seemed excessive for testing and refining the basic, big-picture idea, so I opted for something simpler.

Not only is a fast proof-of-concept a more reasonable business proposition, but it can be hard to sustain the passion for an untested idea over the months it takes to build out a complete solution. Iterative development allows for a whole bunch of successes along the way, which keeps excitement and engagement high. Failures happen too, admittedly, but they’re usually cheaper and less disappointing than seeing a lengthy project abandoned. Spoiler alert – there are no major failures in this article!

The Watch

A quick search turned up SensorLog for the iPhone and Apple Watch. It’s an app that can write sensor data to file, or stream it via a network socket. I even approached testing out this app iteratively – I first logged my data to file to see what the sensor data looked like, so I could make a guess about which sensors might meet my needs, before investing time in reading a live stream of data.

I had an initial misstep as I learned how to use the app. I was using my iPhone to log what I believed to be watch sensor data. I wiggled my wrist around to generate interesting watch information for the log, but the app was actually monitoring the sensors on the phone! I spent an hour trying to convince myself that I saw meaningful patterns in the accelerometer and other sensor data, when it was all being created by a device sitting motionless upon my table. After discovering my mistake and switching the system over to monitoring the watch I was delighted to find dramatically varying data that clearly correlated with my previous movements. I graphed data in a spreadsheet program from a bunch of different possible sensors, and saw lots of possibilities.

Documenting the process is really valuable. When you’re not completely sure what you’re going for, having notes about the journey can provide additional options later on. I decided to start with accelerometer data, but added notes to a "Future" document, indicating that it might be worth trying the gyro sensor, or other CoreMotion data.

The Effect

It appeared that the natural movements of my wrist and body were indeed enough for the sensors to output interesting information. That had been my hope, since it meant that a player could add some nuance without having to learn a whole new approach to playing (as with a foot pedal). I decided that the first thing I would try would be controlling a wah effect. This is a movable band-pass filter that adds a lot of character and expressiveness to a sound, but can also be relatively subtle and flexible as to when it’s applied. In other words, even a somewhat random movement of the filter might be kind of interesting. I decided to use my existing recording software (Logic Pro) to process the guitar sound. I set up a Logic project that featured Apple’s Fuzz-Wah effect on my guitar channel. While my ultimate goal was to have a standalone effects processor, possibly based on an effect I would write myself, staying focused on testing the basic idea was the priority, and using Logic made this part of the test trivially easy.
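Conceptually, the wah’s "pedal position" just sweeps the filter’s center frequency up and down. A minimal sketch of that mapping in JUCE C++ might look like the following; the 400 Hz to 2 kHz bounds are illustrative guesses, not the Fuzz-Wah plugin’s actual sweep range:

```cpp
#include <juce_core/juce_core.h>

// Sketch: map a normalised "pedal position" (0.0 to 1.0) onto a
// band-pass center frequency. The bounds here are assumptions.
float wahCenterFrequency (float pedalPosition)
{
    // A logarithmic sweep feels more natural than a linear one,
    // since we perceive pitch logarithmically.
    return juce::mapToLog10 (pedalPosition, 400.0f, 2000.0f);
}
```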

The App

The missing link between the watch and the wah effect was some piece of software that could listen for the incoming sensor data and convert it to a form that the wah could understand. MIDI seemed the obvious answer, since Logic has extensive support for remote control via MIDI, and standardized formats often provide the greatest flexibility. There was no obvious way to convert socket-based, comma-separated numbers to MIDI (aside from possibly using a fairly expensive third-party product like Max/MSP), so I decided it was time to write some code.

I’ve come to really appreciate the JUCE C++ cross-platform development framework, particularly for audio and MIDI work. I consulted a colleague to learn what I might encounter as I set out to write my JUCE MIDI app. I was grateful for the support; it turns out that it’s trivially easy to write a "virtual MIDI port" app with JUCE on macOS, and extremely difficult on Windows. Knowing that in advance was great, as it confirmed my initial decision to work on macOS. And again, as this was a proof-of-concept rather than a commercial product, one platform would be enough for now.

I started with a skeleton MIDI / audio app, adding a slider to send MIDI Continuous Controller messages through a virtual MIDI port, so I could test the app without needing the watch. Within about an hour I was able to open my Logic project, launch my new app, and control the wah effect’s "pedal position" parameter with the slider in my app.
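A minimal sketch of that wiring might look something like this, assuming a recent JUCE version (MidiOutput::createNewDevice is exactly the call that exists on macOS and Linux but not Windows). CC number 11 is an assumption; any controller that Logic maps to the Fuzz-Wah’s pedal position would do:

```cpp
#include <juce_audio_devices/juce_audio_devices.h>
#include <juce_gui_basics/juce_gui_basics.h>

// Sketch: a slider that sends MIDI Continuous Controller messages
// through a virtual MIDI port, so the app can be tested without the watch.
class WahSliderComponent : public juce::Component
{
public:
    WahSliderComponent()
        : midiOut (juce::MidiOutput::createNewDevice ("WatchWah")) // virtual port; macOS/Linux only
    {
        addAndMakeVisible (slider);
        slider.setRange (0.0, 127.0, 1.0);

        slider.onValueChange = [this]
        {
            if (midiOut != nullptr)
                midiOut->sendMessageNow (juce::MidiMessage::controllerEvent (
                    1, 11, (int) slider.getValue())); // channel 1, CC 11 (assumed)
        };
    }

    void resized() override { slider.setBounds (getLocalBounds()); }

private:
    std::unique_ptr<juce::MidiOutput> midiOut;
    juce::Slider slider;
};
```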

Adding JUCE’s StreamingSocket class allowed me to monitor the port to which the watch would be sending data, wait for a connection, and start reading in values. This happens in a separate thread, and getting the thread performance to acceptable levels took a little fiddling.
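For the curious, the shape of that reader thread, using JUCE’s Thread and StreamingSocket classes, might be roughly as follows; the port number is a placeholder:

```cpp
#include <juce_core/juce_core.h>

// Sketch: listen on a port, wait for SensorLog on the watch to connect,
// then read raw bytes until the thread is asked to stop.
class SensorSocketThread : public juce::Thread
{
public:
    SensorSocketThread() : juce::Thread ("sensor socket") {}

    void run() override
    {
        if (! listener.createListener (56204)) // placeholder port
            return;

        std::unique_ptr<juce::StreamingSocket> connection (listener.waitForNextConnection());

        char buffer[512];

        while (! threadShouldExit() && connection != nullptr)
        {
            auto numRead = connection->read (buffer, (int) sizeof (buffer) - 1, false);

            if (numRead <= 0)
                break;

            buffer[numRead] = 0;
            // ...accumulate the text, split it on line feeds, and hand each
            // complete line off for CSV parsing (see the next section)...
        }
    }

private:
    juce::StreamingSocket listener;
};
```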

The most labor-intensive portion came next: parsing the incoming sensor data and converting it to MIDI messages. SensorLog streams CSV values – comma-separated and further delimited by line feed characters. I broke up the lines of data and was able to convert the accelerometer readings to float values. Here too the learning was iterative: I was pleased to discover JUCE’s NormalisableRange class, which appeared to let me map my accelerometer values to the 0-127 range MIDI requires, then startled when vigorous arm movements caused my program to crash!
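That first mapping looked something like the sketch below. The column index is hypothetical (SensorLog streams many fields), and the -1.0 to 1.0 range reflects my initial, faulty assumption about the data:

```cpp
#include <juce_core/juce_core.h>

constexpr int kAccelerometerColumn = 3; // hypothetical; depends on SensorLog's field order

// Sketch: split one line of CSV, pull out an accelerometer field,
// and map it into the 0-127 range MIDI requires.
int accelerationToMidi (const juce::String& csvLine)
{
    auto fields = juce::StringArray::fromTokens (csvLine, ",", "");
    auto accel  = fields[kAccelerometerColumn].getFloatValue();

    // First (buggy) assumption: values always fall within -1.0 to 1.0.
    juce::NormalisableRange<float> range (-1.0f, 1.0f);
    return juce::roundToInt (range.convertTo0to1 (accel) * 127.0f);
}
```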

The problem was that I hadn’t fully understood how the accelerometer values worked. I’d thought they might range from -1.0 to 1.0, based on my cursory observations, but they’re actually unbounded; 1.0 simply means acceleration equal to gravity, so quick movements can easily generate three or four times that value. I considered those movements to be outliers, and simply filtered them out. It was hard to imagine I’d be wildly swinging my fretting hand. If Pete Townshend wants this for his right arm, I can expand my range a bit! Accelerometer values also seem to depend upon the orientation within the gravitational field, so the watch at rest reports a non-zero value, and when upside down, a different value, roughly the inverse of the first.
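Whether "filtering" means discarding those samples or pinning them to the edge of the range, the practical effect is similar; a clamping version, using JUCE’s jlimit, could be as simple as:

```cpp
// Treat anything beyond +/- 1 g as an outlier, pinning it to the
// nearest bound before the value reaches the NormalisableRange mapping.
float clampAcceleration (float accel)
{
    return juce::jlimit (-1.0f, 1.0f, accel);
}
```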

At this point I was ready to test – my iterative process had allowed me to check various components of the project along the way, so I had some confidence that it would work pretty well.

Results

Performance

The results can be seen in the linked videos, and are really exciting to me. After just a few evenings of effort I have a surprisingly musical tool. I see many possible next steps; trying different sensors (possibly simultaneously), using other effects, scaling the controller data in different ways, and simply jamming with it for a while to see how my relationship with the tool evolves. The next steps will also be iterative, and perhaps documented here. I can already see that one challenge will be balancing my urge to keep tweaking against the goal of actually building something that others can use.

Here’s a quick video of the sensor data changing the sound of a solo guitar.

Here’s a longer performance demonstration, featuring a song I wrote and recorded once before without the wah effect.

Photos courtesy of the author.
