So far we’ve looked at iOS devices as digital audio workstations, notation readers and scorers, and as musical synthesizers. This time we’re going to take a look at the devices as effects processors.
The phrase “effects processor” is a catch-all term that refers to just about any way sound is manipulated before it’s amplified, recorded, etc. This could be as basic as adding reverberation to make it sound like you’re in a large auditorium instead of a small recording studio, or as complex as auto-tuning, looping, or otherwise radically altering the sound.
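To make the idea concrete, here’s a toy sketch of one classic effect, a feedback delay (echo), in Python. Everything here (the function name, the parameters, the sample values) is illustrative, not taken from any particular app or plugin:

```python
# A toy feedback delay: each sample gets mixed with a decayed copy of the
# signal from delay_samples earlier, producing repeating echoes.

def feedback_delay(samples, delay_samples, feedback=0.5):
    """Return a new list with echoes added at intervals of delay_samples."""
    out = list(samples)
    for i in range(delay_samples, len(out)):
        out[i] += feedback * out[i - delay_samples]
    return out

dry = [1.0] + [0.0] * 7          # a single impulse, then silence
wet = feedback_delay(dry, 4)     # echo appears 4 samples later, half as loud
```

A real reverb is far more elaborate (many delays, filtering, diffusion), but at its core it is built from exactly this kind of delayed, decaying copy of the input.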
Effects devices typically took two forms. There were rack-mounted units that handled EQ, compression, reverb, delay, etc. Then there were performance devices. These were usually geared toward guitarists, and included fuzz, wah-wah, flanger, and distortion pedals. Now a whole range of effects pedals can be found. Rack-mounted effects are still important in studio work, but most of those effects can now be found on performance devices themselves, such as keyboards.
Effects apps for iOS tend to look more like performance-level devices, and this makes sense. The portability of the device makes it a great alternative if you need some quick effects and don’t want to lug all your gear with you. If you’re doing a jam session or just practicing, these are great. I’m not sure how it would work in a studio setting, though.
OK, if you haven’t figured it out by now, I’m a synthesizer geek. In high school my favorite groups were synth-heavy bands like Yes, Kansas, and Styx, and my hero was Rick Wakeman with his banks of keyboards and flowing robes. I was even privileged to meet and interview Dr. Robert Moog when I was in college, and I tried to learn all I could about music synthesis.
Back in the 1970s I would drop by Pecknel Music and drool over the Minimoogs and ARP 2600s they occasionally had on display. And drool was all I could do, for at prices ranging from $2,000 to $3,000 these were far beyond my reach. By the time I finally got a teaching job and could purchase my own synthesizers, we were well into the digital FM synthesis days of the 1980s. Musicians were already reminiscing about the fat analog sounds of the old monophonic synths of the 1970s, something they still do.
…And that nostalgia seems to be paying off. Many of the old synths, or at least the concepts behind them, have been recreated as VSTis, and now these are being ported over to the iPad and iPhone. Technology that used to cost thousands of dollars is now available as a $4.99 app. But there’s a trap: with prices that cheap you may find yourself wanting ALL of the available synths, and wind up sinking a ton of money into apps you’ll use a couple of times and forget. Still cheaper than a vintage synth, but pricey all the same.
A couple of years ago I purchased a little Akai LPK25 keyboard. I was exploring MIDI sequencing and notation input on mobile devices such as my netbook, and was looking for a quick input device. I was sorely disappointed in what was available inexpensively, and I never seemed to be able to get the keyboard to work with either my netbook or my laptop. The keyboard sat on the edge of my desk for months, unused.
When I got the iPad, one of the first apps I found was GarageBand. It was cool, but the virtual keyboard on the device just didn’t seem natural for playing. I missed the raised black keys.
I had purchased the camera connection kit for the iPad, and found out from several online forums that the USB connector in the camera kit would connect the LPK25 to the iPad. This started my first serious delving into using the iPad as a digital audio workstation, or DAW. I’ve now had a chance to work with several DAW apps. Here’s a quick rundown with my impressions.
I figured that before I dive into the musical capabilities of the iPad, it might not hurt to define some of these terms and acronyms that I’m tossing around. As with any field, electronic music has its own jargon that can be quite confusing. These are roughly in order of how frequently I’ll be using the terms over the next several posts. I don’t pretend to be an expert, and will probably get some of this wrong, but here goes…
Musical Instrument Digital Interface (MIDI) – MIDI was developed as a communications protocol in the early 1980s. It allows musical keyboards to control other keyboards and devices such as computers, sequencers, and sound modules.
Even though it was developed thirty years ago, MIDI is still very much in use. Back in the day when computers didn’t have much memory, MIDI was also a very efficient way to create complex compositions. A computer or external sequencer only had to record note-on/note-off, pitch, and duration data; the actual sounds were produced by the external keyboards or sound modules. Other capabilities were added to the protocol over time, such as velocity sensitivity (how hard a key is pressed), control of instrument settings such as sustain, and the ability to trigger events like lighting or patch changes. You could also play multiple keyboards from one controller keyboard, creating thick sounds and tonalities from multiple instruments.
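That efficiency is easy to see when you look at the messages themselves: a note-on or note-off is just three bytes, a status byte, the pitch, and the velocity. This Python sketch is illustrative only; the function names are mine, not part of the MIDI standard or any library:

```python
# Build raw MIDI note-on / note-off messages (three bytes each).
# Illustrative sketch; the helper names are mine, not from the MIDI spec.

def note_on(channel, pitch, velocity):
    # Status byte: 0x90 (note-on) combined with the channel number (0-15),
    # followed by pitch and velocity, each limited to 0-127.
    return bytes([0x90 | (channel & 0x0F), pitch & 0x7F, velocity & 0x7F])

def note_off(channel, pitch):
    # Status byte 0x80 (note-off); the velocity byte is commonly sent as 0.
    return bytes([0x80 | (channel & 0x0F), pitch & 0x7F, 0])

msg = note_on(0, 60, 100)  # middle C (pitch 60) on channel 1, struck fairly hard
```

Three bytes per event is why a floppy disk’s worth of MIDI data could hold an entire arrangement while the same audio would have filled a hard drive.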
MIDI has 16 different channels, and different instruments can be assigned to various channels. On most keyboards you will find a MIDI In, Out, and Thru port. On many modern keyboards the MIDI signal is now transmitted through a USB port.
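The channel rides along inside the status byte itself: the high four bits say what kind of message it is, and the low four bits select one of the 16 channels. Again a small illustrative Python fragment, not any library’s API:

```python
# Pull the message type and channel out of a MIDI status byte.
# The high nibble is the message type; the low nibble is the channel
# (0-15 on the wire, conventionally displayed as channels 1-16).

def parse_status(status):
    message_type = status & 0xF0   # e.g. 0x90 = note-on, 0x80 = note-off
    channel = (status & 0x0F) + 1  # display convention: channels 1-16
    return message_type, channel

kind, chan = parse_status(0x93)   # a note-on addressed to channel 4
```

This is how one controller keyboard can drive several instruments over a single cable: each instrument simply ignores messages addressed to channels other than its own.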