So I got an iRig for Christmas! This allows me to answer all the questions everybody had about using oScope on the iPad with the iRig to plot DotCom waves. And the answer is… it doesn’t work.
I know that I’d said that it would. A lot of other people seemed to report that it did also, and I went by that. But I’m not certain that the fault lies with the iRig.
Why? Well, because my Plantronics DSP v4 adapter + Camera Kit + headphone adapter solution doesn’t seem to work anymore either.
(On the other hand, the iRig + AmpliTube combination does seem to work. The signal from the DotCom is pretty hot for it, so there’s a lot of distortion if I don’t turn it way down, and a lot of noise if I do and then amplify it back up in the iPad, but the result isn’t all bad. I wish the free version included more pedals.)
After fighting with it in frustration for quite a while, I decided to just go chat on Skype and relax. While I was doing that, my home-made shock mount (depicted in the site banner) self-destructed, and it proved to be one of those situations where it kept *almost* being fixable and then all the elastics would leap off at once.
It was frustrating to say the least. (I’ll need to go to Michael’s for new craft loops soon.)
Looking back on 2010, the biggest and most obvious thing is that I only posted two tracks this year. So that’s going to be the big thing for the new year — to get back into the swing of actually writing, rather than just tinkering. The dotcom has been a lovely beast, but it’s also proven to not be conducive to getting music made, and I need to find a way to tame it and get it in line, or I need to figure out when it’s useful and look elsewhere for everything else.
Unlike many old-school synth people, I wasn’t really in the market when fully-implemented, fully-knobby user interfaces were the norm. When I got involved in synths, very clean, simple physical interfaces driving small displays and layer after layer of menus were popular. A big part of that was digital technology, but I think as big a part of it was that the market became strongly competitive, and these simplified physical interfaces were an easy way to keep costs way down.
Eventually, the idea of putting the user interface on a computer screen hit, and that’s still something that’s being explored extensively, with ever-blurring lines between plug-in software and hardware instruments. The advantage of this was that it brought us back to being able to see and visualize all of the parameters and functions at once, which was enlightening compared to having them all buried in menus and sub-menus. Also, instruments had become much more complex in the meantime, and computer interfaces offered the richness and versatility to display those architectures in an accessible and meaningful way.
However, the mouse continued to be an impediment, because while you could see all of your parameters at once, you could only really modify one at any given time.
I’m not a musical historian, so take all of this with a grain of salt, but there seemed to be two major responses to this situation. The first seemed to find its best example in the Alesis Andromeda — a return to synthesizers with all or most of their controls broken out into individual knobs, sliders, buttons, etc. on the panel, and with architectures designed to make that possible. (It also began a move toward bringing back analog components and/or a nostalgic spike in the popularity of classic all-analog instruments.) The Andromeda is a well-loved instrument, but it was spectacularly expensive and continues to be, even as a used item.
The other was the control surface — a box that just packed in a whole pile of knobs, sliders, buttons, a keyboard, whatever you might need, which you could then map to the various functions of your software. These were much cheaper than dedicated hardware, with its carefully laid-out knobs, sliders and buttons specifically chosen to match the exact functionality of your synth. However, because they were generic guesses at the functionality most people would need, they rarely provided exactly the set of controls you really wanted, and they certainly weren’t typically organized to expose or complement the system architecture. Also, few computer musicians use just one synthesizer.
Many people are probably expecting me to say, “And then Apple came along!” And don’t get me wrong; I love my Mac. However, Apple is also the company that recently announced that, coming in 2011, they’ve invented the Maximize button.
In fact, the first mass-produced commercial solution of the type I was arrowing toward in the title came from a company called JazzMutant, and was called the Lemur. It’s a multi-touch tablet specifically geared toward musicians. What it does is allow you to build a multi-touchable interface by dragging and dropping interface elements and then use that to control your software. Because you can build the interface you want, and because you can save and recall different interfaces to correspond with different software, it’s a best-of-both-worlds situation. You get a single interface tool that can be coupled with any piece of software you own and customized to suit your exact needs, and you also get the immediacy of having lots of parameters displayed in such a way that you can modify more than one of them simultaneously. (And yes, that is generally something you want to do quite often.)
On the downside, the JazzMutant tablets are very, very expensive. For years this was sort of a “pick two” situation — you could get cheap and multi-touch but not customizable (control surfaces), or cheap and customizable but not multi-touch (mouse-based on-screen UIs), or multi-touch and customizable but not cheap (JazzMutant Lemur) / multi-touch and customized but not cheap (dedicated hardware).
And this is where Apple does step in, by providing a whole family of affordable multi-touch devices that many people already own. And here I’m talking about my iPad, but this is equally true of iPhones and iPod Touches. (It’s also presumably equally true of Android devices, but I’m not familiar with their marketplace.)
This has been true for quite a while now, and I’ve even had the software for quite a while, but I started toying with it last night. There are a few components. On the iPad, there’s TouchOSC, which is an app that displays customized user interfaces, accepts multi-touch interaction with them, and sends Open Sound Control messages over your WiFi connection to any device or piece of software that can accept OSC. Also on the same page linked above is the TouchOSC editor — an app that runs on a Mac OS X, Windows or Linux machine to allow you to create those user interfaces and transfer them to the iPad app.
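For the curious, the receiving end of this is easy to sketch in code. Here’s a minimal example of what “accepting OSC” looks like on the computer side — a sketch only, assuming Python with the python-osc package; the `/1/fader1` address is a stand-in for whatever you name the control in your own TouchOSC layout:

```python
# Minimal OSC listener sketch (assumes: pip install python-osc).
# "/1/fader1" is a hypothetical control address; substitute the
# names from your own TouchOSC layout.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_fader(address, value):
    # TouchOSC sends control values as floats between 0.0 and 1.0.
    print(f"{address} -> {value:.3f}")

dispatcher = Dispatcher()
dispatcher.map("/1/fader1", on_fader)

# Listen on all interfaces, on the port configured in TouchOSC.
server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
server.serve_forever()
```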
That’s actually all you need if you want to control something that accepts OSC, such as Reaktor or Logic or Live. However, a lot of apps don’t support OSC. For those apps, you need something to act as a go-between. The most obvious option is OSCulator, a Mac app that acts as an OSC target and translates the OSC messages into MIDI messages that can be read by any MIDI app or device (presuming the device is connected to a MIDI interface of some kind hooked up to your computer, or by a USB connection that emulates same).
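OSCulator is its own sealed box, but the translation it performs is conceptually simple. As a rough sketch of the gist — not OSCulator’s actual implementation — here’s how an OSC fader message might become a MIDI control change, assuming the python-osc and mido packages and an available MIDI output port:

```python
# Sketch of an OSC-to-MIDI bridge in the spirit of OSCulator.
# Assumes: pip install python-osc mido python-rtmidi, plus a MIDI
# output port for mido to open.
import mido
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

outport = mido.open_output()  # opens the default MIDI output port

def fader_to_cc(address, value):
    # Scale a 0.0-1.0 OSC float onto the 0-127 MIDI CC range.
    cc_value = max(0, min(127, int(value * 127)))
    outport.send(mido.Message("control_change", control=74, value=cc_value))

dispatcher = Dispatcher()
dispatcher.map("/1/fader1", fader_to_cc)  # hypothetical layout address

BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
```

CC 74 here is arbitrary; in practice you’d map each widget to whatever CC number the target synth has learned.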
All of this is pretty straightforward, in theory. In practice, there is a lot of manual programming and configuration. I decided to configure a reproduction of the Easy/Morph page in Native Instruments FM8. (When I have a good implementation going, I’ll post the templates for download.) Once I got the hang of how it all came together, and the fact that your controls simply won’t work if there’s a space in their name (that took a while to figure out), it was mostly rote creation of widgets and assigning them to parameters, then MIDI-learning those parameters in FM8. However, there are really a lot of parameters even on a relatively simple page like that. And there were a few things that I never quite got working. For example, I wound up using an XY input area for playing notes — the horizontal axis for the pitch, like normal, and the vertical axis for velocity. This actually worked really well, except that I couldn’t figure out how to trigger a note whenever you touch the XY grid, so instead you have to press a separate button, which ties up so many of your fingers and so much of your attention that you can’t really do anything else, defeating the purpose.
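For what it’s worth, if you were rolling your own bridge rather than using OSCulator, one way around that trigger problem would be to fire the note-on from the computer side whenever a new XY message arrives. A sketch, again assuming python-osc and mido, with a hypothetical `/1/xy1` address (TouchOSC XY pads send two 0.0–1.0 floats per message):

```python
# Sketch: trigger notes directly from XY-pad messages, so no
# separate trigger button is needed. The address and scaling are
# assumptions, not anything TouchOSC- or FM8-specific.
import mido
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

outport = mido.open_output()
last_note = None

def on_xy(address, x, y):
    global last_note
    note = 36 + int(x * 48)           # horizontal axis -> pitch (4 octaves up from C2)
    velocity = max(1, int(y * 127))   # vertical axis -> velocity
    if note != last_note:
        if last_note is not None:
            outport.send(mido.Message("note_off", note=last_note))
        outport.send(mido.Message("note_on", note=note, velocity=velocity))
        last_note = note

dispatcher = Dispatcher()
dispatcher.map("/1/xy1", on_xy)
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
```

That’s more than TouchOSC plus FM8’s MIDI learn will do on their own, though.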
Even so, it was a pretty compelling experience, especially using the morph square, which doesn’t feel all that intuitive with a mouse but is wonderful when played with the fingers. Being untethered was also fantastic — being able to kick back on the couch with the iPad and both play and fully control the synth. And this was simply reproducing the interface exactly as it appears on-screen. The next step would be to customize it for my needs.
And while most of the press about this sort of thing, including this article, currently focuses on multi-touch, another aspect is that it allows you to build custom interfaces for your software, putting the controls you actually use on a single surface (even if they normally appear on separate panes). The value of that is hard to overestimate.
It’s sometimes difficult to get people really excited about things that just improve the user experience rather than providing new functionality, and in this case, where the process of using these tools is still very rough around the edges and definitely not accessible to the technically uncourageous, it’s likely an even harder sell. However, in my few hours of playing with these, I see them making an enormous difference in the accessibility of my synths, especially the softsynths. This is actually one of the tools that prompted me to buy the iPad originally, and I’m a little sad that it took me this long to get around to playing with it. However, I can see myself doing a lot more of that as time passes.
I’ve wanted to get an inexpensive oscilloscope for ages, as a way to help visualize what’s happening in the synthesizers.com modular system. I do have an inexpensive digital storage oscilloscope, but I’ve never been able to get decent plots from the dotcom system with it, although it’s been very useful for other projects. Also, I want something that reacts very fluidly in realtime. This has mostly meant scouring eBay, but even old scopes from the ’70s with bits missing, remaindered from workshops, can run into the hundreds of dollars.
However, I recently discovered oScope for the iPad. (Obviously if you don’t already own an iPad, this is no longer an inexpensive solution, but I already had one.) It’s an oscilloscope that will plot audio signals coming in from the built-in mic (to be honest, I didn’t even know the iPad had a built-in mic) or from an audio source. The “Lite” version is very workable and free, and there’s a $9.99 version that adds triggering and a simple frequency spectrum analyzer.
But wait, you say — from an audio source? The iPad doesn’t have a line in! Well, it turns out that there is a great and easy way to work around that, and that workaround is something else you might already have if you own an iPad: The iPad camera connection kit. The kit, which runs about $35, comes with two small dongles, one of which has a slot for SD card media and one of which has a USB jack. This allows you to download photos and video directly to your iPad from digital still and video cameras. (In fact, I used the SD reader to transfer and upload the video later in this post, since conversion and uploading via the iPad is so simple and painless.) However, the dongle with the USB jack lets you connect all kinds of other things, from keyboards to audio devices. I discovered this a long time ago when I managed to get Skype working on my iPad over WiFi using a Plantronics USB headset.
That same headset can have the cables to the headset portion disconnected, leaving the DSP adapter to act as a standalone monophonic class-compliant USB audio interface. By plugging a 1/4″-to-1/8″ adapter into it, plugging it into the camera connection kit, and plugging that into the dock connector on the iPad, then downloading oScope, I was ready to rock.
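(If you’re curious what an app like oScope is conceptually doing with those incoming samples, it boils down to plotting amplitude against time. Here’s a toy sketch in Python, assuming the sounddevice and matplotlib packages; it has nothing to do with the app’s actual code.)

```python
# Toy oscilloscope sketch: record a short buffer from the default
# audio input and plot amplitude vs. time.
# Assumes: pip install sounddevice matplotlib numpy
import numpy as np
import sounddevice as sd
import matplotlib.pyplot as plt

RATE = 44100        # the iPad audio path is limited to 44.1 kHz anyway
DURATION = 0.02     # 20 ms is a few cycles of a typical bass note

buffer = sd.rec(int(RATE * DURATION), samplerate=RATE,
                channels=1, dtype="float32")
sd.wait()  # block until the recording finishes

t = np.arange(len(buffer)) / RATE * 1000.0  # time axis in milliseconds
plt.plot(t, buffer[:, 0])
plt.xlabel("Time (ms)")
plt.ylabel("Amplitude")
plt.show()
```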
A lot of this won’t be news to anybody in theory, but what really surprised me was how clear and fluid the results were, and how responsive oScope is at this kind of realtime waveform display. I’ll embed a video here for you to see. It’s a bit rambling and covers some of the same ground again. Also, apologies for the shakycam — my tripod is currently broken, so I went handheld, which proved to be “interesting” while juggling cables and jacks and so on.
You can’t hear the audio from the modular in the video, as you get a cleaner plot if you don’t split the signal. However, if you jack the audio into a synthesizers.com Q111 Pan-and-Fade and then pan it to the middle, you can listen to the audio while running it to the oscilloscope. Because this greatly reduces the amplitude and forces you to zoom in much further, it does introduce a bit more noise into the plot.
As a fun note, just playing with this for two seconds found a problem in one of my Q106 oscillators, where the Saw and Ramp jacks were reversed. Very hard to detect by ear, but it showed up clear as day on the plot.
I should also note that I sent the developer a note with some questions and comments, and he responded less than 30 minutes later. Two thumbs up for customer service!
Anyway, here’s the video:
ERRATA: I say in it that any class-compliant USB audio device will work, but in fact, it seems that there are limitations. The iPad only provides a little power, so if your audio device requires a lot of power, it will have to be powered externally or via a powered hub. Also, I have no idea what it will make of multi-I/O devices. And it only does 44.1 kHz/16-bit, so if you try to use something that works above that by default, then it won’t work unless you can manually switch it down.
ERRATA 2: I also say “oscillator” instead of “oscilloscope” at one point. Whoops!
ERRATA 3: When I say that it doesn’t really provide measurement, so you can’t use it for calibration, I should explain what I mean, because after some use I realize that it’s unclear. While the grid is drawn at a fixed distance on the display, the grid spacing is meaningful because the unit readouts in the top left corner adjust accordingly. Where the problem lies is that the grid lines aren’t value-labelled, and the display seems to center the signal vertically or do some other sort of compensation that I may simply be misunderstanding. So although you can say things like, “the waveform has a 5V spread”, you can’t necessarily say things like, “the waveform goes between -2V and +3V” if it’s offset. This impedes certain kinds of measurements that I think are common in performing calibrations.
ERRATA 4: In the video I say that I used the tool to discover a wiring fault in one of my oscillators. After comparing the results against my other oscillators, I’m not so sure if it’s a wiring fault in the oscillators or a display fault in the app. I’ll have to investigate further.