Nov 12, 2010 Gear
If you own a modular analog system, or even just a synthesizer or sequencer that communicates using control voltage, at some point you’ll resort to measurement, and once you’ve started down that road there’s no turning back. Many modular owners have a great deal of fancy test equipment, but the basic starting point is a good digital voltmeter (or multimeter).
However, if you’re like me, you’re fairly new to all of this and you don’t have a collection of different probes with various connectors, etc. You may (again, like me) have just picked up your first nice meter, and all you have are the probes that came with it — the ones that look like you could stab someone with them. If, again, you’re like me, you’ve tried to live with them: alligator-clipping leads between cables and probes, weighing probes against plugs, coming up with all manner of rigged solutions. Well, there’s an easy solution. I imagine most people have something that covers this, but it only occurred to me today, and I was able to whip up this quick fix for just a few bucks (bearing in mind that I already had most of the tools) and maybe an hour or less of work. I think it’ll have saved me at least that much effort by the end of the week.
I feel like this leadup is making a bigger deal of it than it deserves, and I felt pretty silly writing super-detailed instructions, so instead I’ll just show you the mod:
As you can see, it was simply a matter of cutting the probe leads in half, connecting the end closer to the meter to a 1/4″ female monophonic unbalanced (2-conductor) jack, and connecting the end closer to the probes to a 1/4″ male monophonic unbalanced (2-conductor) plug.
This way, when I’m measuring the modular, I can plug patch cords directly into the multimeter, which is hugely easier than what I was doing before. (Let’s not even go there.) However, if I want to use the meter with the probes as originally designed, I can just jack them back in and away I go.
My test measurements before and after check out, and it seems to work great. This is going to be a huge time and headache saver for me. All I had to buy (other than the various tools and such, which fortunately I already had) was the jack, the plug, and a few inches of heat-shrink tubing. I made about a billion mistakes, because I’m klutzy, but maybe an hour of total time and only two burned fingertips later, my dreams had come true. :)
The biggest lessons learned in this project:
- Before you begin, measure reference voltages that you can use to verify everything still works afterward. It’s a lot harder to obtain that data after you’ve sliced your probe leads in half.
- Make sure to slide all the tubing and barrels and anything the cables have to go through into place before you start connecting and soldering. (I forgot this one maybe six thousand times.)
- Metal components that have been recently soldered tend to be very, very hot. (Ow!)
(Note: If you’re very new to soldering and need detailed instructions, please contact me! Seriously, I don’t mind writing them out for you.)
Nov 11, 2010 Uncategorized
I’ll be honest right out of the gate on this one: I have a lot of hobbies, and I tend toward the scattered rather than the focused. I do work hard on music, but probably not as hard or for as many hours as many people working on it in their spare time do. And that’s nowhere near what the people who are both focused and doing music full-time can put in.
So sometimes when I read about techniques people use that have a very high work-to-impact ratio on the sound, I find them intellectually interesting, but I don’t think of them as things I’m likely to do myself anytime soon. And yet I could probably find ways to employ them, with a little effort at being efficient about it.
One of the techniques I’ve read about from a lot of sources is the gentle art of directly editing the attack portion of all the drum hits in your songs.
Many producers, I’ve read, will assign each individual drum to a separate track (I do this on some of my more recent tracks, but haven’t in the past) and will then bounce each track out to audio and sit down and edit the wave manually, hit by hit for the whole song. The key things being done are creating very steep, sharp, clean attacks and moving them precisely to the beat. This kind of detailed hand editing is something I’ve done in sound design, but never as a polish activity at the track level, because it’s always seemed very time-consuming.
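For the curious, here is roughly what the destructive half of that process amounts to, sketched in Python with numpy. This is just my own toy illustration of the technique, not anything I know those producers to actually use, and it assumes you have already detected the onset position of each hit:

```python
import numpy as np

def sharpen_attacks(audio, hits, sr=44100, pre_ms=5.0, fade_ms=0.5):
    """Clean up drum onsets: silence the bleed just before each hit,
    then fade in over a fraction of a millisecond, so the attack is
    effectively instant but click-free.

    audio: mono float array; hits: sample indices of the drum onsets.
    """
    out = audio.copy()
    pre = int(sr * pre_ms / 1000)            # region of pre-hit smear to silence
    fade = max(int(sr * fade_ms / 1000), 1)  # ~22 samples at 44.1 kHz
    ramp = np.linspace(0.0, 1.0, fade)
    for h in hits:
        out[max(h - pre, 0):h] = 0.0         # kill whatever leads into the transient
        seg = out[h:h + fade]                # numpy view, so this edits in place
        seg *= ramp[:len(seg)]               # razor-steep, click-free fade-in
    return out
```

Snapping the hits precisely to the beat is the other half of the job; in an editor you would nudge each sliced region onto the nearest grid line before cleaning up its attack like this.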
But then… I have no real idea if that’s what’s going on with Trifonic’s track, “Broken” (it’s likely not), but when I listen to it, they’re doing something wonderful with the percussion. The song starts out soft and languid, and the music mostly retains that, but there’s a point (around 1:15 or so) where the percussion snaps into a hyper-realistic sharpness that creates an incredible textural gradient against the rest of the music, and it makes me think that perhaps I should spend the time to master that, at least sparingly. And that’s when I start to fantasize about where I could take tracks if I had eight or ten or sixteen hours a day to lose myself in them, every day.
But then I remember that when I have, say, four hours, like tonight, I spend half of that installing Xcode and reading the introduction of my “Learning Cocoa” book, and I remember who we’re talking about.
The real question I find myself wondering about more and more, after many failed attempts at cultivating true focus, is whether I can make things that stand on their own despite the way my mind flits and jumps, or if there’s a way perhaps even to harness that for great justice.
In the meantime, though, here’s the song, complete with some totally random fan “video” that you should probably not worry about too much (I’m mostly just using YouTube as a handy method of referencing the song):
Nov 9, 2010 Gear
I’m currently building a performance sequencer module in Reaktor. Once I’m done, I’ll probably release it for download and use via Native Instruments’ free Reaktor Player. (ETA: As pointed out in the comments, you can’t load user content in Reaktor Player, so this part of the plan will have to be adjusted accordingly.) My goal is also, as per my last post, to make a TouchOSC template for it so that it can be fully controlled wirelessly via the iPad. (I dunno if I’ll do iPhone templates — I guess it would depend on whether anyone wants them or not.) The initial impetus for this was oohing and ahhing over the M185 Sequencer for the Roland System 100M. It won’t be a clone of it when I’m done, mind you, but that’s what got me thinking about it. I’ve never built a sequencer in Reaktor before, and it’s taking some learning, but I’m making progress already.
So my question to the readers — are there any features you’d love in a computer-based performance MIDI sequencer? (Note that my ultimate goal is to use it with my Synthesizers.com modular and for that it’ll be interfacing via the Q104 MIDI Interface module, so I’m likely restricted to note values and velocities in terms of what parameters I’ll be sequencing.) If they seem like cool features, I’ll try to squeeze them in.
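To make that restriction concrete, the core loop I have in mind is tiny. Here is a hypothetical sketch in Python with the mido library, rather than Reaktor; the step values, tempo handling, and gate scheme are all just placeholders:

```python
import time
import mido

# Eight steps of (note, velocity); a velocity of 0 marks a rest.
steps = [(36, 110), (0, 0), (38, 90), (36, 70),
         (36, 120), (0, 0), (38, 100), (42, 60)]

bpm = 120
step_len = 60.0 / bpm / 4   # sixteenth notes
gate = 0.5                  # fraction of each step the note is held

with mido.open_output() as port:  # default MIDI output; name one explicitly if needed
    while True:                   # stop with Ctrl-C
        for note, vel in steps:
            if vel:
                port.send(mido.Message('note_on', note=note, velocity=vel))
                time.sleep(step_len * gate)
                port.send(mido.Message('note_off', note=note))
                time.sleep(step_len * (1 - gate))
            else:
                time.sleep(step_len)
```

Everything that makes a sequencer performable, of course, lives in the features layered on top of a loop like that, which is exactly why I’m asking.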
(While I’m at it, are there any other tools in general that you’d like to see me take a stab at? I’m not likely to learn any plugin programming languages anytime soon, so I’m limited to what I can do in Reaktor, but that’s still fairly broad.)
Note that this’ll all be available for free, when I’m done. (Or, any of it that winds up seeming worth releasing.)
Unlike many old-school synth people, I wasn’t really in the market when fully-implemented, fully-knobby user interfaces were the norm. When I got involved in synths, very clean, simple physical interfaces driving small displays and layer after layer of menus were popular. A big part of that was digital technology, but I think as big a part of it was that the market had become strongly competitive, and these simplified physical interfaces were an easy way to keep costs way down.
Eventually, the idea of putting the user interface on a computer screen hit, and that’s still something being explored extensively, with ever-blurring lines between plug-in software and hardware instruments. The advantage was that it brought us back to being able to see and visualize all of the parameters and functions at once, which was enlightening compared to having them all buried in menus and sub-menus. Also, instruments had become much more complex in the meantime, and computer interfaces had the richness and versatility to display those architectures in an accessible and meaningful way.
However, the mouse continued to be an impediment, because while you could see all of your parameters at once, you could only really modify one at any given time.
I’m not a musical historian, so take all of this with a grain of salt, but there seemed to be two major responses to this situation. The first found its best example in the Alesis Andromeda — a return to synthesizers with all or most of their controls broken out into individual knobs, sliders, buttons, etc. on the panel, and with architectures designed to make that possible. (It also began a move toward bringing back analog components and/or a nostalgic spike in the popularity of classic all-analog instruments.) The Andromeda is a well-loved instrument, but it was spectacularly expensive and continues to be, even as a used item.
The other was the control surface — a box that packed in a whole pile of knobs, sliders, buttons, maybe a keyboard, whatever you might need, which you could then map to the various functions of your software. These were much cheaper than dedicated hardware with carefully laid-out knobs, sliders, and buttons specifically chosen to match the exact functionality of your synth. However, because they were generic guesses at the functionality most people would need, they rarely provided exactly the set of controls you wanted, and they certainly weren’t typically organized to expose or complement the system architecture. Also, few computer musicians use just one synthesizer.
Many people are probably expecting me to say, “And then Apple came along!” And don’t get me wrong; I love my Mac. However, Apple is also the company that recently announced that, coming in 2011, they’ve invented the Maximize button.
In fact, the first mass-produced commercial solution of the type I was arrowing toward in the title came from a company called JazzMutant, and was called the Lemur. It’s a multi-touch tablet specifically geared toward musicians. It allows you to build a multi-touch interface by dragging and dropping interface elements, then use that interface to control your software. Because you can build the interface you want, and because you can save and recall different interfaces to correspond with different software, it’s a best-of-both-worlds situation. You get a single interface tool that can be coupled with any piece of software you own and customized to suit your exact needs, and you also get the immediacy of having lots of parameters displayed in such a way that you can modify more than one of them simultaneously. (And yes, that is generally something you want to do quite often.)
On the downside, the JazzMutant tablets are very, very expensive. For years this was sort of a “pick two” situation — you could get cheap and multi-touch but not customizable (control surfaces), or cheap and customizable but not multi-touch (mouse-based on-screen UIs), or multi-touch and customizable but not cheap (JazzMutant Lemur) / multi-touch and customized but not cheap (dedicated hardware).
And this is where Apple does step in, by providing a whole family of affordable multi-touch devices that many people already own. And here I’m talking about my iPad, but this is equally true of iPhones and iPod Touches. (It’s also presumably equally true of Android devices, but I’m not familiar with their marketplace.)
This has been true for quite a while now, and I’ve had the software nearly as long, but I only started toying with it last night. There are a few components. On the iPad, there’s TouchOSC, an app that displays customized user interfaces, accepts multi-touch interaction with them, and sends Open Sound Control messages over your WiFi connection to any device or piece of software that accepts OSC. Also on the same page linked above is the TouchOSC editor — an app that runs on a Mac OS X, Windows, or Linux machine and lets you create those user interfaces and transfer them to the iPad app.
That’s actually all you need if you want to control something that accepts OSC, such as Reaktor or Logic or Live. However, a lot of apps don’t support OSC. For those apps, you need something to act as a go-between. The most obvious option is OSCulator, an app that acts as an OSC target that runs on your Mac/Win/Lin computer and translates the OSC messages into MIDI messages that can be read by any MIDI app or device (presuming the device is connected to a MIDI interface of some kind hooked up to your computer, or connected by a USB connection that emulates same).
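To give a feel for what that translation amounts to, here is a hypothetical sketch of an OSC-to-MIDI bridge in Python, using the python-osc and mido libraries. This is not how OSCulator actually works internally, just the shape of the idea, and the /1/fader1 address and the CC number are my own assumptions:

```python
import mido
from pythonosc.dispatcher import Dispatcher
from pythonosc import osc_server

midi_out = mido.open_output()  # default MIDI output port

def fader_to_cc(address, value):
    # TouchOSC faders send floats from 0.0 to 1.0; scale to a 7-bit CC value.
    midi_out.send(mido.Message('control_change', control=74,
                               value=int(value * 127)))

dispatcher = Dispatcher()
dispatcher.map("/1/fader1", fader_to_cc)  # must match the control's OSC address

# Listen on the port your TouchOSC layout is configured to send to.
server = osc_server.ThreadingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
server.serve_forever()
```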
All of this is pretty straightforward, in theory. In practice, there’s a lot of manual programming and configuration. I decided to configure a reproduction of the Easy/Morph page in Native Instruments’ FM8. (When I have a good implementation going, I’ll post the templates for download.) Once I got the hang of how it all fits together, including the fact that your controls simply won’t work if there’s a space in their name (that took a while to figure out), it was mostly rote creation of widgets, assigning them to parameters, then learning those parameters in FM8. There are really a lot of parameters, though, even on a relatively simple page like that. And there were a few things I never quite got working. For example, I wound up using an XY input area for playing notes — the horizontal axis for pitch, like normal, and the vertical axis for velocity. This actually worked really well, except that I couldn’t figure out how to trigger a note when you touch the XY grid, so instead you have to push a separate button, which keeps so many of your fingers and so much of your attention occupied that you can’t really do anything else, defeating the purpose.
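The mapping itself is the trivial part; something like this hypothetical helper, where the note range is my own arbitrary choice, covers it:

```python
def xy_to_note(x, y, low=36, high=84):
    """Map XY-pad output (both axes 0.0 to 1.0) to a MIDI note and velocity:
    horizontal position picks the pitch, vertical position the velocity."""
    note = low + int(round(x * (high - low)))         # quantize to semitones
    velocity = max(1, min(127, int(round(y * 127))))  # keep it a valid note-on
    return note, velocity
```

It was the note triggering, not the mapping, that I couldn’t crack.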
Even so, it was a pretty compelling experience, especially using the morph square, which doesn’t feel that intuitive with a mouse but is wonderful played with the fingers. Being untethered was also fantastic — being able to kick back on the couch with the iPad and both play and fully control the synth. And this was simply reproducing the interface exactly as it appears on-screen; the next step would be to customize it for my needs.
And while at the moment most of the press about this sort of thing (including this article) focuses on the multi-touch aspect, another side of it is that it lets you build custom interfaces to your software, putting the controls you actually use on the surface (even if they live on separate panes in the original UI). It’s hard to overestimate the value of that.
It’s sometimes difficult to get people really excited about things that just improve the user experience rather than providing new functionality, and in this case, where the process of using these tools is still very rough around the edges and definitely not accessible to the technically uncourageous, it’s likely an even harder sell. However, in my few hours of playing with these, I see them making an enormous difference in the accessibility of my synths, especially the softsynths. It’s actually one of the tools that prompted me to buy the iPad originally, and I’m a little sad that it took me this long to get around to playing with it. However, I can see myself doing a lot more of that as time passes.