Playing an instrument is an analog experience: a tangible act grounded in physical reality. When instrument-simulating apps (soft synths, or virtual instruments) take advantage of the touch paradigm available on iPads, musicians can have a more analog experience with that technology. This matters because music creation has traditionally begun as an analog process: playing a string, wind, or percussion instrument, for example. Touch technology allows the process of playing digital instruments to become far more seamless.
Executing movements on the iPad that are analogous to movements on actual instruments likely activates some of a musician’s existing, analog-trained neural pathways: for example, percussively striking the screen to trigger a simulated percussive interaction with warped synthesis of those strikes (in Impaktor), or blowing into the iPad’s mic for a wind-instrument interaction (in Blowfinger). These types of interaction further expand the possibilities for musical expression.
From experience, I can say that this level of tactile engagement translates directly into musical expression, because spontaneity increases. The iPad becomes something more as the fourth wall that has traditionally separated musicians from digital instruments begins to break down; you can feel the immediacy of an instrument, literally at your fingertips.
The iPad also enables new musical interactions and connections that have no equivalent on an analog instrument. For instance, in MorphWiz I can move my fingers in circles and make sounds that follow the velocity of my movement. Visual elements such as color and shape interpret the sound as well. This can be both interesting and potentially inspiring for those who think of sound visually or are open to new ways of perceiving it.
In the app Fourier Touch, the accelerometer provides high and low frequencies as you move your arm up and down. It looks as if you are playing a theremin of sorts (and the sound is similar), but the sound changes are accomplished by moving the iPad itself rather than by waving your hands near a theremin’s antennas. The smallest tilt of the hand translates into a pitch or dynamic shift. The entire body can participate, suddenly blurring the line between dance and music-making.
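To make this mapping concrete, here is a minimal Python sketch of one way a normalized tilt value from an accelerometer might be converted to pitch. This is my own illustration, not Fourier Touch’s actual implementation, and the frequency bounds are arbitrary choices for the example. The mapping is exponential because pitch perception is logarithmic, so equal tilt steps feel like equal musical intervals:

```python
def tilt_to_frequency(tilt, f_low=110.0, f_high=1760.0):
    """Map a normalized tilt value (0.0 = arm lowered, 1.0 = arm raised)
    to a frequency in Hz, using an exponential curve so equal tilt steps
    correspond to equal musical intervals."""
    tilt = max(0.0, min(1.0, tilt))  # clamp to the valid range
    return f_low * (f_high / f_low) ** tilt

print(tilt_to_frequency(0.0))  # 110.0 (A2, arm level)
print(tilt_to_frequency(0.5))  # 440.0 (A4, halfway up)
print(tilt_to_frequency(1.0))  # 1760.0 (A6, fully raised)
```

With these hypothetical bounds, the midpoint of the arm’s travel lands exactly on concert A (440 Hz), which is the kind of musically meaningful mapping a well-designed app can bake in.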
In general, effective apps combine an attractive appearance, a UI built around the touch paradigm, and developers who make knowledgeable use of both the tools Apple provides in its development guides and the iPad’s processing capabilities. The following music apps represent some of this forward thinking and further illustrate my thoughts on the possibilities of the iPad as a new musical instrument.
The Animoog, in my opinion, does more than any other app to make the iPad a viable performing and recording instrument. Moog, the seminal creators of modular synths and leaders of the East Coast school of modular synthesis, have really stepped into the modern age over the last five years. I’ve spoken with engineers at the company, and they seem truly committed to this forward movement. I’m excited to see what comes from them in the years ahead.
The Animoog uses Moog’s unique Anisotropic Synth Engine (ASE) to allow complex sound creation on the iPad. What does “anisotropic” mean, you may ask? Honestly, I had to look that one up:
Anisotropic: exhibiting properties with different values when measured in different directions. (“Anisotropic.” Merriam-Webster’s Collegiate Dictionary, 11th ed., Merriam-Webster, Inc., 2003, http://www.merriam-webster.com/dictionary/anisotropic. Accessed 3 Apr. 2015.)
The visual representation of the sound is shown through the oscilloscope (called the X/Y pad), and it is both informative and mesmerizing. It takes a standard oscilloscope view and allows multiple oscillators to be morphed through (the different values) by the “orbit” ball (the different directions), which triggers the sound as it moves. You can create custom paths for the orbit ball as it travels through the different oscillators, thereby creating customized sounds. In addition, you can change the orbit ball’s rate and its relation to the X/Y coordinates. Then, by applying classic subtractive synthesis techniques such as filters and envelopes to modulate the sound, the possibilities for sonic manipulation become almost endless. This is a stellar app that can be controlled by an external MIDI trigger device (such as a keyboard), or it can send continuous controller (CC) messages to another MIDI device or to the DAW you’re working in to modulate parameters.
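For readers curious what that subtractive chain actually looks like, here is a minimal Python sketch of the general technique: a harmonically rich oscillator, a low-pass filter that subtracts the upper harmonics, and an amplitude envelope that shapes the result. This is my own illustration of textbook subtractive synthesis, not Animoog’s ASE engine, and all parameter values are arbitrary choices for the example:

```python
import math

SAMPLE_RATE = 44100

def sawtooth(freq, n_samples):
    """Naive sawtooth oscillator: a harmonically rich source to subtract from."""
    return [2.0 * ((i * freq / SAMPLE_RATE) % 1.0) - 1.0 for i in range(n_samples)]

def one_pole_lowpass(samples, cutoff):
    """Simple one-pole low-pass filter: attenuates harmonics above the cutoff."""
    x = math.exp(-2.0 * math.pi * cutoff / SAMPLE_RATE)  # smoothing coefficient
    out, prev = [], 0.0
    for s in samples:
        prev = (1.0 - x) * s + x * prev
        out.append(prev)
    return out

def linear_envelope(samples, attack, release):
    """Apply a linear attack/release amplitude envelope (times in seconds)."""
    n = len(samples)
    a, r = int(attack * SAMPLE_RATE), int(release * SAMPLE_RATE)
    out = []
    for i, s in enumerate(samples):
        gain = 1.0
        if i < a:
            gain = i / a          # fade in
        elif i >= n - r:
            gain = (n - i) / r    # fade out
        out.append(s * gain)
    return out

# One note: a rich 220 Hz oscillator, filtered at 800 Hz, shaped by an envelope.
note = linear_envelope(one_pole_lowpass(sawtooth(220.0, SAMPLE_RATE), 800.0), 0.05, 0.2)
```

Animoog layers its orbit-ball morphing on top of exactly this kind of chain, which is why a single preset can range from mellow to aggressive just by sweeping the filter cutoff.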
Another excellent app is called iDensity. iDensity started as PC software and then moved to the iPad; when it made that move it got much better, becoming a truly interactive instrument. It allows you to sample live sound (and/or use recorded sound) and manipulate it in real time with its granular synth engine. Granular synths excel at creating atmospheres and transforming sound: essentially, they sample sound on a microsound time scale. You can then layer (or not) those “grains” of sound on top of one another and affect their speed, duration, frequency, and phase, among other parameters. This can lead to interesting and abstract sound environments.
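To make the microsound idea concrete, here is a minimal Python sketch of granular processing. It is my own illustration of the general technique, not iDensity’s engine: a buffer is chopped into short windowed grains that are layered back together, with an optional per-grain pitch ratio that changes the frequency without changing the overall duration:

```python
import math

def granulate(source, grain_size, hop, pitch=1.0):
    """Chop a sample buffer into short overlapping 'grains', window each
    one, and overlap-add them back into an output buffer. A `hop` smaller
    than `grain_size` layers grains on top of each other; `pitch` reads
    each grain at a different rate to shift its frequency."""
    out = [0.0] * len(source)
    for start in range(0, len(source) - grain_size, hop):
        for i in range(grain_size):
            src = start + int(i * pitch)  # pitch-shifted read position
            if src >= len(source):
                break
            # A Hann window avoids clicks at the grain boundaries.
            w = 0.5 - 0.5 * math.cos(2.0 * math.pi * i / grain_size)
            out[start + i] += source[src] * w
    return out

# A one-second 220 Hz sine at an 8 kHz sample rate, granulated into
# 40 ms grains with 50% overlap and the pitch shifted up a fifth (1.5x).
rate = 8000
tone = [math.sin(2 * math.pi * 220 * t / rate) for t in range(rate)]
cloud = granulate(tone, grain_size=rate // 25, hop=rate // 50, pitch=1.5)
```

Varying the grain size, overlap, and pitch ratio per grain (or randomizing them) is what turns a plain tone into the shifting “clouds” of sound these instruments are known for.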
In iDensity, you can also capture the parameters you’ve created and have up to four presets occur simultaneously in a “snapshot,” which is then loaded for use in real-time performance. It is an interesting idea from the developers that works quite well and is worth exploring further.
Apple’s very own GarageBand does something a bit different. It doesn’t create a new instrument; instead, it mimics analog reality for a similar virtual experience. You can “strum” or “bend” strings on a virtual guitar, for instance. I was pleased when Apple took the GarageBand interface and turned it into a remote MIDI controller that connects wirelessly to Logic Pro X. Similar to what I discussed earlier, this is another case of the iPad enabling a tactile exchange with synth instruments; this time, instead of the iPad being the sound source, the source is Logic Pro X.
Other musical applications of the iPad include reading and annotating sheet music, as well as audio and soft-synth recording (both as roughs and in a finished state). For carrying sheet music, one of many choices is forScore. Recommended apps for audio production include MultiTrack DAW (24-bit, 96 kHz capability) and GarageBand (soft synth generation in addition to analog-generated audio). Most of these apps are on the PARC iPad image. I hope to see you soon and answer any questions you may have!